BSIMM6: Software Security Maturity Model

The document introduces the Building Security In Maturity Model (BSIMM), which quantifies the practices of existing software security initiatives. The BSIMM aims to help organizations plan, carry out, and measure their own software security initiatives. It reflects the current state of software security based on data from 202 measurements of 78 organizations. The BSIMM is organized into 112 activities across four domains: governance, intelligence, secure software development lifecycle touchpoints, and deployment. It provides a common framework for comparing different software security initiatives.



Authors
Gary McGraw, Ph.D., Sammy Migues, and Jacob West
PART ONE
The Building Security In Maturity Model (BSIMM, pronounced “bee simm”) is a study of existing software security
initiatives. By quantifying the practices of many different organizations, we can describe the common ground
shared by many as well as the variation that makes each unique. Our aim is to help the wider software security
community plan, carry out, and measure initiatives of their own. The BSIMM is not a “how to” guide, nor is it a one-size-fits-all prescription. Instead, the BSIMM is a reflection of the current state of software security.

We begin with a brief description of the function and importance of a software security initiative. We then explain
our model and the method we use for quantifying the state of an initiative. Since the BSIMM study began in
2008, we have studied 104 initiatives, which comprise 235 distinct measurements—some firms use the BSIMM to
measure each of their business units and some have been measured more than once. To ensure the continued
relevance of the data we report, we excluded from BSIMM6 measurements older than 42 months. The current data
set comprises 202 distinct measurements collected from 78 firms. Thanks to repeat measurements, not only do we
report on current practices, but also on the ways in which
some initiatives have evolved over a period of years.

We devote the later portion of the document to a detailed explanation of the key roles in a software security initiative, the 112 activities that now comprise our model, and a summary of the raw data we have collected. We have reviewed the description of each activity for BSIMM6.

Our work with the BSIMM model shows that measuring a firm’s software security initiative is both possible and extremely useful. BSIMM measurements can be used to plan, structure, and execute the evolution of a software security initiative. Over time, firms participating in the BSIMM show measurable improvement in their software security initiatives.

Introduction
History
In the late 1990s, software security began to flourish as a discipline separate from computer and network security.
Researchers began to put more emphasis on studying the ways a programmer can contribute to or unintentionally
undermine the security of a computer system: What kinds of bugs and flaws lead to security problems? How can we
identify problems systematically?



By the middle of the following decade, there was an emerging consensus that creating secure software required
more than just smart individuals toiling away. Getting security right means being involved in the software
development process.

Since then, practitioners have come to know that process alone is insufficient. Software security encompasses
business, social, and organizational aspects as well. We use the term software security initiative (SSI) to refer to all of
the activities undertaken for the purpose of building secure software.

BSIMM6
The purpose of the BSIMM is to quantify the activities carried out by real software security initiatives. Because these initiatives use different methodologies and different terminology, the BSIMM requires a framework that allows us to describe all of the initiatives in a uniform way. Our software security framework (SSF) and activity descriptions provide a common vocabulary for explaining the salient elements of a software security initiative, thereby allowing us to compare initiatives that use different terms, operate at different scales, exist in different vertical markets, or create different work products.

We classify our work as a maturity model because improving software security almost always means changing the way an organization works, something that doesn’t happen overnight. We understand that not all organizations need to achieve the same security goals, but we believe all organizations can benefit from using the same measuring stick.

BSIMM6 is the sixth major version of the BSIMM model. It includes updated activity descriptions, data from 78 firms
in multiple vertical markets, and a longitudinal study.

Audience
The BSIMM is meant for use by anyone responsible for creating and executing an SSI. We have observed that
successful software security initiatives are typically run by a senior executive who reports to the highest levels in an
organization. These executives lead an internal group that we call the software security group (SSG), charged with
directly executing or facilitating the activities described in the BSIMM. The BSIMM is written with the SSG and SSG
leadership in mind.

We expect readers to be familiar with the software security literature. You can become familiar with many concepts
by reading Software Security: Building Security In. The BSIMM does not attempt to explain software security basics,
describe its history, or provide references to the ever-expanding literature. Succeeding with the BSIMM without
becoming familiar with the literature is unlikely.



BSIMM Terminology
Nomenclature has always been a problem in computer security and software security is no
exception. Several terms used in the BSIMM have particular meaning for us. Here are some of the
most important terms used throughout the document:

Activity: Actions carried out or facilitated by the SSG as part of a practice. Activities are divided
into three levels in the BSIMM.

Domain: The domains are: governance, intelligence, secure software development lifecycle (SSDL)
touchpoints, and deployment. See the SSF section on page 10.

Practice: One of the 12 categories of BSIMM activities. Each domain in the Software Security
Framework has three practices. Activities in each practice are divided into three levels. See the SSF
section on page 10.

Satellite: A group of interested and engaged developers, architects, software managers, testers,
and similar roles who have a natural affinity for software security and are organized and leveraged
by a Software Security Initiative.

Secure Software Development Lifecycle (SSDL): Any SDLC with integrated software security
checkpoints and activities.

Security Development Lifecycle (SDL): A term used by Microsoft to describe their Secure
Software Development Lifecycle.

Software Security Framework (SSF): The basic structure underlying the BSIMM, comprising 12
practices divided into four domains. See the SSF section on page 10.

Software Security Group (SSG): The internal group charged with carrying out and facilitating
software security. According to our observations, the first step of a Software Security Initiative is
forming an SSG.

Software Security Initiative: An organization-wide program to instill, measure, manage, and evolve software security activities in a coordinated fashion. Also known in the literature as an Enterprise Software Security Program (see chapter 10 of Software Security: Building Security In).



BSIMM6 Structure
The BSIMM is organized as a set of 112 activities in a framework.

The Software Security Framework


The table below shows the software security framework (SSF) used to organize the 112 BSIMM activities. There are
12 practices organized into four domains.

The four domains are:

Governance: Practices that help organize, manage, and measure a software security initiative.
Staff development is also a central governance practice.

Intelligence: Practices that result in collections of corporate knowledge used in carrying out
software security activities throughout the organization. Collections include both proactive
security guidance and organizational threat modeling.

SSDL Touchpoints: Practices associated with analysis and assurance of particular software
development artifacts and processes. All software security methodologies include these practices.

Deployment: Practices that interface with traditional network security and software
maintenance organizations. Software configuration, maintenance, and other environment issues
have a direct impact on software security.

The 12 practices are:

Governance
1. Strategy & Metrics (SM)
2. Compliance & Policy (CP)
3. Training (T)

Intelligence
4. Attack Models (AM)
5. Security Features & Design (SFD)
6. Standards & Requirements (SR)

SSDL Touchpoints
7. Architecture Analysis (AA)
8. Code Review (CR)
9. Security Testing (ST)

Deployment
10. Penetration Testing (PT)
11. Software Environment (SE)
12. Configuration Management & Vulnerability Management (CMVM)



The BSIMM6 Skeleton
The BSIMM skeleton provides a way to view the model at a glance and is useful when assessing a software security initiative. The skeleton is shown below, organized by practice and level, along with the percentage of the 78 participating firms observed performing each activity in their SSI. More complete descriptions of the activities, examples, and term definitions can be found in Part Two of this document.

Governance
STRATEGY & METRICS (SM)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Publish process (roles, responsibilities, plan), evolve as necessary. SM1.1 53%
Create evangelism role and perform internal marketing. SM1.2 51%
Educate executives. SM1.3 46%
Identify gate locations, gather necessary artifacts. SM1.4 85%

LEVEL 2
Publish data about software security internally. SM2.1 46%
Enforce gates with measurements and track exceptions. SM2.2 37%
Create or grow a satellite. SM2.3 38%
Identify metrics and use them to drive budgets. SM2.5 22%
Require security sign-off. SM2.6 37%

LEVEL 3
Use an internal tracking application with portfolio view. SM3.1 19%
Run an external marketing program. SM3.2 9%

COMPLIANCE & POLICY (CP)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Unify regulatory pressures. CP1.1 58%
Identify PII obligations. CP1.2 78%
Create policy. CP1.3 53%



LEVEL 2
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Identify PII data inventory. CP2.1 24%
Require security sign-off for compliance-related risk. CP2.2 29%
Implement and track controls for compliance. CP2.3 32%
Paper all vendor contracts with software security SLAs. CP2.4 37%
Ensure executive awareness of compliance and privacy obligations. CP2.5 42%

LEVEL 3
Create regulator eye-candy. CP3.1 23%
Impose policy on vendors. CP3.2 14%
Drive feedback from SSDL data back to policy. CP3.3 8%

TRAINING (T)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Provide awareness training. T1.1 76%
Deliver role-specific advanced curriculum (tools, technology stacks, bug parade). T1.5 33%
Create and use material specific to company history. T1.6 22%
Deliver on-demand individual training. T1.7 46%

LEVEL 2
Enhance satellite through training and events. T2.5 13%
Include security resources in onboarding. T2.6 19%
Identify satellite through training. T2.7 8%

LEVEL 3
Reward progression through curriculum (certification or HR). T3.1 4%
Provide training for vendors or outsourced workers. T3.2 4%
Host external software security events. T3.3 4%
Require an annual refresher. T3.4 10%
Establish SSG office hours. T3.5 5%



Intelligence
ATTACK MODELS (AM)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Build and maintain a top N possible attacks list. AM1.1 22%
Create a data classification scheme and inventory. AM1.2 65%
Identify potential attackers. AM1.3 40%
Collect and publish attack stories. AM1.4 10%
Gather and use attack intelligence. AM1.5 59%
Build an internal forum to discuss attacks. AM1.6 14%

LEVEL 2
Build attack patterns and abuse cases tied to potential attackers. AM2.1 8%
Create technology-specific attack patterns. AM2.2 10%

LEVEL 3
Have a science team that develops new attack methods. AM3.1 5%
Create and use automation to do what attackers will do. AM3.2 3%

SECURITY FEATURES & DESIGN (SFD)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Build and publish security features. SFD1.1 78%
Engage SSG with architecture. SFD1.2 76%

LEVEL 2
Build secure-by-design middleware frameworks and common libraries. SFD2.1 31%
Create SSG capability to solve difficult design problems. SFD2.2 50%

LEVEL 3
Form a review board or central committee to approve and maintain secure design patterns. SFD3.1 10%
Require use of approved security features and frameworks. SFD3.2 14%
Find and publish mature design patterns from the organization. SFD3.3 3%



STANDARDS & REQUIREMENTS (SR)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Create security standards. SR1.1 73%
Create a security portal. SR1.2 64%
Translate compliance constraints to requirements. SR1.3 67%

LEVEL 2
Create a standards review board. SR2.2 35%
Create standards for technology stacks. SR2.3 27%
Identify open source. SR2.4 24%
Create SLA boilerplate. SR2.5 26%
Use secure coding standards. SR2.6 29%

LEVEL 3
Control open source risk. SR3.1 8%
Communicate standards to vendors. SR3.2 14%



SSDL Touchpoints

ARCHITECTURE ANALYSIS (AA)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Perform security feature review. AA1.1 86%
Perform design review for high-risk applications. AA1.2 37%
Have SSG lead design review efforts. AA1.3 28%
Use a risk questionnaire to rank applications. AA1.4 59%

LEVEL 2
Define and use AA process. AA2.1 15%
Standardize architectural descriptions (including data flow). AA2.2 12%
Make SSG available as AA resource or mentor. AA2.3 17%

LEVEL 3
Have software architects lead design review efforts. AA3.1 8%
Drive analysis results into standard architecture patterns. AA3.2 1%

CODE REVIEW (CR)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Use a top N bugs list (real data preferred). CR1.1 23%
Have SSG perform ad hoc review. CR1.2 68%
Use automated tools along with manual review. CR1.4 71%
Make code review mandatory for all projects. CR1.5 31%
Use centralized reporting to close the knowledge loop and drive training. CR1.6 35%

LEVEL 2
Enforce coding standards. CR2.2 9%
Assign tool mentors. CR2.5 26%
Use automated tools with tailored rules. CR2.6 21%

LEVEL 3
Build a factory. CR3.2 4%
Build a capability for eradicating specific bugs from the entire codebase. CR3.3 6%
Automate malicious code detection. CR3.4 4%




SECURITY TESTING (ST)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Ensure QA supports edge/boundary value condition testing. ST1.1 78%
Drive tests with security requirements and security features. ST1.3 85%

LEVEL 2
Integrate black box security tools into the QA process. ST2.1 31%
Share security results with QA. ST2.4 10%
Include security tests in QA automation. ST2.5 13%
Perform fuzz testing customized to application APIs. ST2.6 14%

LEVEL 3
Drive tests with risk analysis results. ST3.3 5%
Leverage coverage analysis. ST3.4 5%
Begin to build and apply adversarial security tests (abuse cases). ST3.5 6%



Deployment
PENETRATION TESTING (PT)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Use external penetration testers to find problems. PT1.1 88%
Feed results to the defect management and mitigation system. PT1.2 60%
Use penetration testing tools internally. PT1.3 60%

LEVEL 2
Provide penetration testers with all available information. PT2.2 26%
Schedule periodic penetration tests for application coverage. PT2.3 22%

LEVEL 3
Use external penetration testers to perform deep-dive analysis. PT3.1 13%
Have the SSG customize penetration testing tools and scripts. PT3.2 10%

SOFTWARE ENVIRONMENT (SE)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Use application input monitoring. SE1.1 47%
Ensure host and network security basics are in place. SE1.2 88%

LEVEL 2
Publish installation guides. SE2.2 40%
Use code signing. SE2.4 32%

LEVEL 3
Use code protection. SE3.2 13%
Use application behavior monitoring and diagnostics. SE3.3 6%



CONFIGURATION MANAGEMENT & VULNERABILITY MANAGEMENT (CMVM)

LEVEL 1
ACTIVITY DESCRIPTION ACTIVITY # PARTICIPANT %
Create or interface with incident response. CMVM1.1 91%

Identify software defects found in operations monitoring and feed them back to development. CMVM1.2 94%

LEVEL 2
Have emergency codebase response. CMVM2.1 82%
Track software bugs found in operations through the fix process. CMVM2.2 78%
Develop an operations inventory of applications. CMVM2.3 40%

LEVEL 3
Fix all occurrences of software bugs found in operations. CMVM3.1 5%
Enhance the SSDL to prevent software bugs found in operations. CMVM3.2 8%
Simulate software crisis. CMVM3.3 8%
Operate a bug bounty program. CMVM3.4 4%

Putting BSIMM6 to Use


The BSIMM describes 112 activities that any organization can put into practice. The activities are structured in terms
of the SSF, which identifies 12 practices grouped into four domains.
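
For readers who want to work with the model programmatically, the SSF maps naturally onto a small data structure. The following Python sketch is purely illustrative (it is not part of the BSIMM itself) and simply mirrors the domain/practice table shown earlier.

```python
# One way to represent the software security framework (SSF):
# four domains, each containing three practices.
SSF = {
    "Governance": ["Strategy & Metrics (SM)", "Compliance & Policy (CP)", "Training (T)"],
    "Intelligence": ["Attack Models (AM)", "Security Features & Design (SFD)",
                     "Standards & Requirements (SR)"],
    "SSDL Touchpoints": ["Architecture Analysis (AA)", "Code Review (CR)",
                         "Security Testing (ST)"],
    "Deployment": ["Penetration Testing (PT)", "Software Environment (SE)",
                   "Configuration Management & Vulnerability Management (CMVM)"],
}

# Sanity check: 12 practices across 4 domains.
assert sum(len(practices) for practices in SSF.values()) == 12
```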

What BSIMM6 Tells Us


The BSIMM data yield very interesting analytical results. The BSIMM6 scorecard on the following page shows the number of times each of the 112 activities briefly outlined in the BSIMM skeleton was observed in the BSIMM6 data. This is the highest-resolution BSIMM data that is published.

“I’m so glad to see this important body of work continuing to grow and evolve. BSIMM remains one of the best yardsticks available to practitioners today for measuring how their secure software development stacks up against the rest of the industry.”
– Kenneth R. van Wyk, President, KRvW



TWELVE CORE ACTIVITIES “EVERYBODY” DOES

ACTIVITY DESCRIPTION
[SM1.4] Identify gate locations and gather necessary artifacts
[CP1.2] Identify PII obligations
[T1.1] Provide awareness training
[AM1.2] Create a data classification scheme and inventory
[SFD1.1] Build and publish security features
[SR1.1] Create security standards
[AA1.1] Perform security feature review
[CR1.4] Use automated tools along with manual review
[ST1.3] Drive tests with security requirements and security features
[PT1.1] Use external penetration testers to find problems
[SE1.2] Ensure host and network security basics are in place
[CMVM1.2] Identify software bugs found in operations monitoring and feed them back to development

Spider charts are created by noting the highest-level activity for each practice per BSIMM firm (a “high-water mark”) and averaging these scores over a group of firms to produce 12 numbers (one for each practice). The resulting spider chart plots these values on 12 spokes, one per practice. Note that level 3 (the outside edge) is considered more mature than level 0 (the center point). Other, more sophisticated analyses are possible, of course. We continue to experiment with weightings by level, normalization by number of activities, and other schemes.

EARTH SPIDER CHART: average high-water mark per practice across all 78 firms (Earth), plotted on 12 spokes (one per practice) on a 0.0 to 3.0 scale.
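
For illustration only, the high-water-mark calculation described above can be sketched in a few lines of Python. The activity labels and firm data in this example are hypothetical placeholders, not BSIMM observations.

```python
import re
from collections import defaultdict

PRACTICES = ["SM", "CP", "T", "AM", "SFD", "SR",
             "AA", "CR", "ST", "PT", "SE", "CMVM"]

LABEL = re.compile(r"([A-Z]+)(\d)\.\d+")  # e.g. "CMVM3.2" -> practice "CMVM", level 3

def high_water_marks(observed):
    """Highest activity level observed in each practice for one firm (0 if none)."""
    marks = {p: 0 for p in PRACTICES}
    for label in observed:
        practice, level = LABEL.fullmatch(label).groups()
        marks[practice] = max(marks[practice], int(level))
    return marks

def spider_values(firms):
    """Average the per-practice high-water marks over a group of firms."""
    totals = defaultdict(float)
    for observed in firms:
        for practice, level in high_water_marks(observed).items():
            totals[practice] += level
    return {p: totals[p] / len(firms) for p in PRACTICES}

# Hypothetical example: two firms, each with a handful of observed activities.
firms = [
    ["SM1.4", "CP1.2", "T1.1", "CR1.4", "CR2.6", "PT1.1"],
    ["SM2.6", "CP1.2", "AA1.1", "ST1.3", "SE1.2", "CMVM3.2"],
]
print(spider_values(firms))  # 12 values on a 0-3 scale, one per practice
```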



By computing a score for each firm in the study, we can also compare relative and average maturity for one firm
against the others. The range of observed scores in the current data pool is [10, 85].

The graph below shows the distribution of scores among the population of 78 participating firms. To make this
graph, we divided the scores into six bins. As you can see, the scores represent a slightly skewed bell curve. We
also plotted the average age of the firms in each bin as the orange line on the graph. In general, firms where more
BSIMM activities have been observed have older software security initiatives.

SCORE DISTRIBUTION: number of firms in each score bucket (0-15, 16-30, 31-45, 46-60, 61-75, 76-115) for Earth (78), with the average SSG age (in years) per bucket overlaid; the per-bucket averages shown range from 0.9 to 8.9 years.
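
As a minimal sketch of the binning just described, assume each firm’s score is simply its count of observed activities; the (score, age) pairs below are hypothetical, not the study data.

```python
# Hypothetical (score, SSG age in years) pairs; the real data come from the 78 firms.
firms = [(12, 0.5), (28, 1.2), (34, 2.8), (52, 4.1), (63, 6.0), (70, 7.2), (90, 9.5)]
buckets = [(0, 15), (16, 30), (31, 45), (46, 60), (61, 75), (76, 115)]

for lo, hi in buckets:
    ages = [age for score, age in firms if lo <= score <= hi]
    avg = sum(ages) / len(ages) if ages else 0.0
    print(f"{lo:>3}-{hi:<3}  firms={len(ages)}  avg SSG age={avg:.1f} yrs")
```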

We are pleased that the BSIMM study continues to grow year after year (the data set we report on here is more
than 26 times the size it was for the original publication). Note that once we exceeded a sample size of 30 firms, we
began to apply statistical analysis, yielding statistically significant results.

Measuring Your Firm with BSIMM6


The most important use of the BSIMM is as a measuring stick to determine where your approach currently stands
relative to other firms. Note which activities you already have in place, find them in the skeleton, then determine
their levels and build a scorecard. A direct comparison of all 112 activities is perhaps the most obvious use of the
BSIMM. This can be accomplished by building a scorecard using the data above.

The scorecard you see on the next page depicts a fake firm that performs 37 BSIMM activities (noted as 1s in the
FIRM columns), including eight activities that are the most common in their respective practices (purple boxes).
On the other hand, the firm does not perform the most commonly observed activities in the other four practices
(red boxes) and should take some time to determine whether these are necessary or useful to its overall software
security initiative. The BSIMM6 FIRM columns show the number of observations (currently out of 78) for each
activity, allowing the firm to understand the general popularity of an activity among the 78 BSIMM firms.
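
As an illustration of assembling such a scorecard, the Python sketch below uses hypothetical observation counts and a hypothetical firm; a real scorecard would cover all 112 activities and the published BSIMM6 observation data.

```python
# Hypothetical observation counts (out of 78 firms) and one firm's observed activities.
observations = {
    "SM1.4": 66, "SM1.1": 41, "CP1.2": 61, "CP1.1": 45,
    "CR1.4": 55, "CR1.2": 53, "PT1.1": 69, "PT1.3": 47,
}
firm = {"SM1.4", "CP1.1", "CR1.2"}

def practice(label):
    """Extract the practice code from an activity label, e.g. 'SM1.4' -> 'SM'."""
    return "".join(ch for ch in label if ch.isalpha())

# Most commonly observed activity within each practice (the purple/red box logic above).
most_common = {}
for label, count in observations.items():
    p = practice(label)
    if p not in most_common or count > observations[most_common[p]]:
        most_common[p] = label

for label in sorted(observations):
    performed = label in firm
    flag = ""
    if label == most_common[practice(label)]:
        flag = "purple" if performed else "red"
    print(f"{label:>6}  FIRM={int(performed)}  observed={observations[label]}/78  {flag}")
```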



PART TWO
This part of the document provides detail behind the people and activities involved in a software security initiative
and measured by the BSIMM. We begin by describing the software security group (SSG) and other key roles. We
then provide descriptions for each of the 112 activities that comprise BSIMM6. Throughout, we annotate our
discussion with relevant observation data from the 78 participating firms.

Roles in a Software Security Initiative


Determining who is supposed to carry out the activities described in the BSIMM is an important part of making any
software security initiative work.

Executive Leadership
Of primary interest is identifying and empowering a senior executive to manage operations, garner resources, and
provide political cover for a software security initiative. Grassroots approaches to software security sparked and
led solely by developers and their direct managers have a poor track record in the real world. Likewise, initiatives
spearheaded by resources from an existing network security group often run into serious trouble when it comes
time to interface with development groups. By identifying a senior executive and putting him or her in charge of
software security directly, you address two management 101 concerns—accountability and empowerment. You also
create a place in the organization where software security can take root and begin to thrive.

The executives in charge of the software security initiatives we studied have a variety of titles, including: Information
Security Manager, Security Director, Director of Application Security Strategy, Director of Enterprise Information
Security, VP of Security Architecture, Chief Product Security and Privacy Officer, CISO, Head of Information
Governance, Global Head of Security Maturation and
Certification, Technical Director, Director of IT Security,
and Manager of Enterprise Information Protection.
We observed a fairly wide spread in exactly where the SSG is situated in the firms we studied. In particular, 28 exist in the CIO’s organization, 16 exist in the CTO’s organization, 10 report to the COO, four report to the Chief Assurance Officer, two report to the CSO, and one SSG reports to each of the General Counsel, the CFO, and the founder. Twelve SSGs report up through technology or product operations groups (as opposed to governance organizations). Three report up through the business unit where they were originally formed. Several companies we studied did not specify where their SSG fits in the larger organization.



Software Security Group (SSG)
The second most important role in a software security initiative after the senior executive is that of the Software
Security Group. Every single one of the 78 initiatives we describe in BSIMM6 has an SSG. Carrying out the activities
in the BSIMM successfully without an SSG is very unlikely (and has never been observed in the field to date), so
create an SSG before you start working to adopt the BSIMM activities. The best SSG members are software security
people, but software security people are often impossible to find. If you must create software security types from
scratch, start with developers and teach them about security. Starting with network security people and attempting
to teach them about software, compilers, SDLCs, bug tracking, and everything else in the software universe usually
fails to produce the desired results. Unfortunately, no amount of traditional security knowledge can overcome a
lack of experience building software.

SSGs come in a variety of shapes and sizes. All good SSGs appear to include both people with deep coding
experience and people with architectural chops. As you will see below, software security can’t only be about finding
specific bugs such as the OWASP Top Ten. Code review is a very important best practice and to perform code
review you must actually understand code (not to mention the huge piles of security bugs). However, the best code
reviewers sometimes make very poor software architects and asking them to perform an architecture risk analysis
will only result in blank stares. Make sure you cover architectural capabilities in your SSG as well as you do code.
Finally, the SSG is often asked to mentor, train, and work directly with hundreds of developers. Communications
skills, teaching capability, and good consulting horse sense are must-haves for at least a portion of the SSG staff.
For more about this issue, see our SearchSecurity article based on SSG structure data gathered at the 2014 BSIMM
Community Conference: How to Build a Team for Software Security Management.

Though no two of the 78 firms we examined had exactly the same SSG structure (suggesting that there is no
one set way to structure an SSG), we did observe some commonalities that are worth mentioning. At the highest
level of organization, SSGs come in five major flavors: those that are 1) organized to provide software security
services, 2) organized around setting policy, 3) mirroring business unit organizations, 4) organized with a hybrid
policy and services approach, and 5) structured around managing a distributed network of others doing software
security work. Some SSGs are highly distributed across a firm and others are very centralized. If we look across
all of the SSGs in our study, there are several common “subgroups” that are often observed: people dedicated to
policy, strategy, and metrics; internal “services” groups that (often separately) cover tools, penetration testing, and
middleware development plus shepherding; incident response groups; groups responsible for training development
and delivery; externally-facing marketing and communications groups; and vendor-control groups.

In the statistics reported above, we noted an average ratio of SSG to development of 1.51% across the entire group
of 78 organizations we studied. That means one SSG member for every 75 developers when we average the ratios
for each participating firm. The SSG with the largest ratio was 16.7% and the smallest was 0.03%. To remind you
of the particulars in terms of actual bodies, SSG size on average among the 78 firms was 13.9 people (smallest 1,
largest 130, median 6).

Satellite
In addition to the SSG, many software security programs have identified a number of individuals (often developers,
testers, and architects) who share a basic interest in software security, but are not directly employed in the SSG.
When people like this carry out software security activities, we call this group the satellite.

Sometimes the satellite is widely distributed, with one or two members in each product group. Sometimes the satellite is more focused and gets together regularly to compare notes, learn new technologies, and expand the
understanding of software security in an organization. Identifying and fostering a strong satellite is important to
the success of many software security initiatives (but not all of them). Some BSIMM activities target the satellite
explicitly.

Of particular interest, all 10 firms with the highest BSIMM scores have a satellite (100%), with an average satellite
size of 131 people. Outside of the top 10, 30 of the remaining 68 firms have a satellite (44.1%). Of the 10 firms with
the lowest BSIMM scores, none have a satellite. This suggests that as a software security initiative matures, its
activities become distributed and institutionalized into the organizational structure. Among our population of 78
firms, initiatives tend to evolve from centralized and specialized in the beginning to decentralized and distributed
(with an SSG at the core orchestrating things).

Everybody Else
Our survey participants have engaged everyone involved in the software development lifecycle as a means of addressing software security.

• Builders, including developers, architects, and their managers must practice security engineering, ensuring that the systems that we build are defensible and not riddled with holes. The SSG will interact directly with builders when they carry out the activities described in the BSIMM. Generally speaking, as an organization matures, the SSG attempts to empower builders so they can carry out most of the BSIMM activities themselves with the SSG helping in special cases and providing oversight. In this version of the BSIMM, we often don’t explicitly point out whether a given activity is to be carried out by the SSG, developers, or testers. You should come up with an approach that makes sense for your organization and accounts for your workload and your software lifecycle.

• Testers concerned with routine testing and verification should do what they can to keep a weather eye out for
security problems. Some of the BSIMM activities in the Security Testing practice (see page 52) can be carried out
directly by QA.

• Operations people must continue to design reasonable networks, defend them, and keep them up. As you will
see in the Deployment domain of the SSF, software security doesn’t end when software is “shipped.”

• Administrators must understand the distributed nature of modern systems and begin to practice the principle
of least privilege, especially when it comes to the applications they host or attach to as services in the cloud.

• Executives and middle management, including line of business owners and product managers, must
understand how early investment in security design and security analysis affects the degree to which users will
trust their products. Business requirements should explicitly address security needs. Any sizeable business
today depends on software to work. Software security is a business necessity.

• Vendors, including those who supply COTS, custom software and software-as-a-service, are increasingly
subjected to SLAs and reviews (such as vBSIMM) that help ensure products are the result of a secure SDLC.



APPENDIX
Adjusting BSIMM-V for BSIMM6
Because the BSIMM is a data-driven model, we have chosen to make adjustments to the model based on the data
observed between BSIMM-V and BSIMM6.

We have added, deleted, and adjusted the level of various activities based on the data observed as the project
continues. To preserve backwards compatibility, all changes are made by adding new activity labels to the model,
even when an activity has simply changed levels. We make changes by considering outliers both in the model itself
and in the levels we assigned to various activities in the 12 practices. We use the results of an intra-level standard
deviation analysis to determine which “outlier” activities to move between levels. We focus on changes that
minimize standard deviation in the average number of observed activities at each level.
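
As one plausible reading of that analysis (a sketch, not the actual BSIMM methodology), the snippet below flags activities whose observation counts sit far from the mean of their level; the counts shown are hypothetical.

```python
from statistics import mean, stdev

# Hypothetical observation counts per activity, grouped by (practice, level).
level_counts = {
    ("SM", 1): [41, 40, 36, 66],
    ("SM", 2): [36, 29, 30, 17, 29],
    ("ST", 3): [4, 4, 5, 24],
}

def level_outliers(level_counts, threshold=1.0):
    """Flag activities whose count deviates from the level mean by > threshold * stdev."""
    flagged = []
    for (practice, level), counts in level_counts.items():
        if len(counts) < 2:
            continue  # stdev needs at least two data points
        mu, sigma = mean(counts), stdev(counts)
        for count in counts:
            if sigma and abs(count - mu) > threshold * sigma:
                flagged.append((practice, level, count))
    return flagged

print(level_outliers(level_counts))  # candidate activities to consider moving between levels
```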

Here are the four changes we made according to that paradigm:

1. [SM1.6] became [SM2.6]; [SM1.6] was removed (level promotion)

2. [SR1.4] became [SR2.6]; [SR1.4] was removed (level promotion)

3. [ST3.1] became [ST2.5]; [ST3.1] was removed (level demotion)

4. [ST3.2] became [ST2.6]; [ST3.2] was removed (level demotion)

We also carefully considered, but did not adjust, the following activities: [SM2.1], [AM1.4], [CR1.1], [CR2.2], and [ST2.4]. We are keeping a close eye on the Attack Models and Code Review practices, both of which were impacted by culling aged data.



112 BSIMM Activities at a Glance
(Red indicates most observed BSIMM activity in that practice)

Level 1 Activities

Governance
Strategy & Metrics (SM)
• Publish process (roles, responsibilities, plan), evolve as necessary. [SM1.1]
• Create evangelism role and perform internal marketing. [SM1.2]
• Educate executives. [SM1.3]
• Identify gate locations, gather necessary artifacts. [SM1.4]

Compliance & Policy (CP)


• Unify regulatory pressures. [CP1.1]
• Identify PII obligations. [CP1.2]
• Create policy. [CP1.3]

Training (T)
• Provide awareness training. [T1.1]
• Deliver role-specific advanced curriculum (tools, technology stacks and bug parade). [T1.5]
• Create and use material specific to company history. [T1.6]
• Deliver on-demand individual training. [T1.7]

Intelligence
Attack Models (AM)
• Build and maintain a top N possible attacks list. [AM1.1]
• Create a data classification scheme and inventory. [AM1.2]
• Identify potential attackers. [AM1.3]
• Collect and publish attack stories. [AM1.4]
• Gather and use attack intelligence. [AM1.5]
• Build an internal forum to discuss attacks. [AM1.6]

Security Features & Design (SFD)


• Build and publish security features. [SFD1.1]
• Engage SSG with architecture. [SFD1.2]

Standards & Requirements (SR)


• Create security standards. [SR1.1]
• Create a security portal. [SR1.2]
• Translate compliance constraints to requirements. [SR1.3]



SSDL Touchpoints
Architecture Analysis (AA)
• Perform security feature review. [AA1.1]
• Perform design review for high-risk applications. [AA1.2]
• Have SSG lead design review efforts. [AA1.3]
• Use a risk questionnaire to rank applications. [AA1.4]

Code Review (CR)


• Use a top N bugs list (real data preferred). [CR1.1]
• Have SSG perform ad hoc review. [CR1.2]
• Use automated tools along with manual review. [CR1.4]
• Make code review mandatory for all projects. [CR1.5]
• Use centralized reporting to close the knowledge loop and drive training. [CR1.6]

Security Testing (ST)


• Ensure QA supports edge/boundary value condition testing. [ST1.1]
• Drive tests with security requirements and security features. [ST1.3]

Deployment
Penetration Testing (PT)
• Use external penetration testers to find problems. [PT1.1]
• Feed results to the defect management and mitigation system. [PT1.2]
• Use penetration testing tools internally. [PT1.3]

Software Environment (SE)


• Use application input monitoring. [SE1.1]
• Ensure host and network security basics are in place. [SE1.2]

Configuration Management & Vulnerability Management (CMVM)


• Create or interface with incident response. [CMVM1.1]
• Identify software defects found in operations monitoring and feed them back to development. [CMVM1.2]

Level 2 Activities

Governance
Strategy & Metrics (SM)
• Publish data about software security internally. [SM2.1]
• Enforce gates with measurements and track exceptions. [SM2.2]
• Create or grow a satellite. [SM2.3]
• Identify metrics and use them to drive budgets. [SM2.5]
• Require security sign-off. [SM2.6]



Compliance & Policy (CP)
• Identify PII data inventory. [CP2.1]
• Require security sign-off for compliance-related risk. [CP2.2]
• Implement and track controls for compliance. [CP2.3]
• Paper all vendor contracts with software security SLAs. [CP2.4]
• Ensure executive awareness of compliance and privacy obligations. [CP2.5]

Training (T)
• Enhance satellite through training and events. [T2.5]
• Include security resources in onboarding. [T2.6]
• Identify satellite through training. [T2.7]

Intelligence
Attack Models (AM)
• Build attack patterns and abuse cases tied to potential attackers. [AM2.1]
• Create technology-specific attack patterns. [AM2.2]

Security Features & Design (SFD)


• Build secure-by-design middleware frameworks and common libraries. [SFD2.1]
• Create SSG capability to solve difficult design problems. [SFD2.2]

Standards & Requirements (SR)


• Create a standards review board. [SR2.2]
• Create standards for technology stacks. [SR2.3]
• Identify open source. [SR2.4]
• Create SLA boilerplate. [SR2.5]
• Use secure coding standards. [SR2.6]

SSDL Touchpoints


Architecture Analysis (AA)
• Define and use AA process. [AA2.1]
• Standardize architectural descriptions (including data flow). [AA2.2]
• Make SSG available as AA resource or mentor. [AA2.3]

Code Review (CR)


• Enforce coding standards. [CR2.2]
• Assign tool mentors. [CR2.5]
• Use automated tools with tailored rules. [CR2.6]

Security Testing (ST)


• Integrate black box security tools into the QA process. [ST2.1]
• Share security results with QA. [ST2.4]
• Include security tests in QA automation. [ST2.5]
• Perform fuzz testing customized to application APIs. [ST2.6]



Deployment
Penetration Testing (PT)
• Provide penetration testers with all available information. [PT2.2]
• Schedule periodic penetration tests for application coverage. [PT2.3]

Software Environment (SE)


• Publish installation guides. [SE2.2]
• Use code signing. [SE2.4]

Configuration Management & Vulnerability Management (CMVM)


• Have emergency codebase response. [CMVM2.1]
• Track software bugs found in operations through the fix process. [CMVM2.2]
• Develop an operations inventory of applications. [CMVM2.3]

Level 3 Activities

Governance
Strategy & Metrics (SM)
• Use an internal tracking application with portfolio view. [SM3.1]
• Run an external marketing program. [SM3.2]

Compliance & Policy (CP)


• Create regulator eye candy. [CP3.1]
• Impose policy on vendors. [CP3.2]
• Drive feedback from SSDL data back to policy. [CP3.3]

Training (T)
• Reward progression through curriculum (certification or HR). [T3.1]
• Provide training for vendors or outsourced workers. [T3.2]
• Host external software security events. [T3.3]
• Require an annual refresher. [T3.4]
• Establish SSG office hours. [T3.5]

Intelligence
Attack Models (AM)
• Have a science team that develops new attack methods. [AM3.1]
• Create and use automation to do what attackers will do. [AM3.2]

Security Features & Design (SFD)


• Form a review board or central committee to approve and maintain secure design patterns. [SFD3.1]
• Require use of approved security features and frameworks. [SFD3.2]
• Find and publish mature design patterns from the organization. [SFD3.3]



Standards & Requirements (SR)
• Control open source risk. [SR3.1]
• Communicate standards to vendors. [SR3.2]

SSDL Touchpoints


Architecture Analysis (AA)
• Have software architects lead design review efforts. [AA3.1]
• Drive analysis results into standard architecture patterns. [AA3.2]

Security Testing (ST)


• Drive tests with risk analysis results. [ST3.3]
• Leverage coverage analysis. [ST3.4]
• Begin to build and apply adversarial security tests (abuse cases). [ST3.5]

Code Review (CR)


• Build a factory. [CR3.2]
• Build a capability for eradicating specific bugs from the entire codebase. [CR3.3]
• Automate malicious code detection. [CR3.4]

Deployment
Penetration Testing (PT)
• Use external penetration testers to perform deep-dive analysis. [PT3.1]
• Have the SSG customize penetration testing tools and scripts. [PT3.2]

Software Environment (SE)


• Use code protection. [SE3.2]
• Use application behavior monitoring and diagnostics. [SE3.3]

Configuration Management & Vulnerability Management (CMVM)


• Fix all occurrences of software bugs found in operations. [CMVM3.1]
• Enhance the SSDL to prevent software bugs found in operations. [CMVM3.2]
• Simulate software crisis. [CMVM3.3]
• Operate a bug bounty program. [CMVM3.4]

