NIST Measurement Guide for Information Security
Katherine Schroeder
Hung Trinh
Victoria Yan Pillitteri
Computer Security Division
Information Technology Laboratory
January 2024
Certain equipment, instruments, software, or materials, commercial or non-commercial, are identified in this
paper in order to specify the experimental procedure adequately. Such identification does not imply
recommendation or endorsement of any product or service by NIST, nor does it imply that the materials or
equipment identified are necessarily the best available for the purpose.
There may be references in this publication to other publications currently under development by NIST in
accordance with its assigned statutory responsibilities. The information in this publication, including concepts and
methodologies, may be used by federal agencies even before the completion of such companion publications.
Thus, until each publication is completed, current requirements, guidelines, and procedures, where they exist,
remain operative. For planning and transition purposes, federal agencies may wish to closely follow the
development of these new publications by NIST.
Organizations are encouraged to review all draft publications during public comment periods and provide feedback
to NIST. Many NIST cybersecurity publications, other than the ones noted above, are available at
https://csrc.nist.gov/publications.
Authority
This publication has been developed by NIST in accordance with its statutory responsibilities under the Federal
Information Security Modernization Act (FISMA) of 2014, 44 U.S.C. § 3551 et seq., Public Law (P.L.) 113-283. NIST is
responsible for developing information security standards and guidelines, including minimum requirements for
federal information systems, but such standards and guidelines shall not apply to national security systems
without the express approval of appropriate federal officials exercising policy authority over such systems. This
guideline is consistent with the requirements of the Office of Management and Budget (OMB) Circular A-130.
Nothing in this publication should be taken to contradict the standards and guidelines made mandatory and
binding on federal agencies by the Secretary of Commerce under statutory authority. Nor should these guidelines
be interpreted as altering or superseding the existing authorities of the Secretary of Commerce, Director of the
OMB, or any other federal official. This publication may be used by nongovernmental organizations on a voluntary
basis and is not subject to copyright in the United States. Attribution would, however, be appreciated by NIST.
Publication History
Approved by the NIST Editorial Review Board on YYYY-MM-DD [Will be added in the final publication]
Supersedes NIST Series XXX (Month Year) DOI [Will be added in the final publication]
Submit Comments
[email protected]
All comments are subject to release under the Freedom of Information Act (FOIA).
Abstract
This document provides guidance on how an organization can develop information security measures to identify the adequacy of in-place security policies, procedures, and controls. It explains the measures prioritization process and how to evaluate measures.

Keywords
assessment; information security; measurement; measures; metrics; performance; qualitative; quantitative; reports; security controls.

Audience
This guide is written primarily for users with responsibilities or interest in information security measurement and assessment. Government and industry can use the concepts, processes, and candidate measures presented in this guide.
Note to Reviewers
The initial public drafts (ipd) of NIST Special Publication (SP) 800-55, Measurement Guide for Information Security, Volume 1 – Identifying and Selecting Measures and Volume 2 – Developing an Information Security Measurement Program are available for comment after extensive research, development, and customer engagement.
In response to feedback from the pre-draft call for comments and the initial working draft (annotated outline), NIST continued to refine the publications by organizing the guidance into two volumes and developing more actionable and focused guidance in each.
• Volume 1 – Identifying and Selecting Measures – provides a flexible approach to the development, selection, and prioritization of information security measures. This volume explores both quantitative and qualitative assessment and provides basic guidance on data analysis techniques as well as impact and likelihood modeling.
• Volume 2 – Developing an Information Security Measurement Program – provides a methodology for developing and implementing a structure for an information security measurement program.
Reviewers are encouraged to comment on all or parts of draft NIST SP 800-55, Measurement Guide for Information Security, Volume 1 – Identifying and Selecting Measures and Volume 2 – Developing an Information Security Measurement Program. NIST requests that comments be submitted to [email protected] by 11:59 PM Eastern Time (ET) on March 18, 2024. Commenters are encouraged to use the comment template provided with the document announcement.
Table of Contents
1. Introduction
2. Fundamentals
   2.4.1. Measures Documentation
   2.4.2. Data Management
   2.4.3. Data Quality
   2.4.4. Uncertainty and Errors
3. Selecting and Prioritizing Measures
   3.2.1. Implementation Measures
   3.2.2. Effectiveness Measures
   3.2.3. Efficiency Measures
   3.2.4. Impact Measures
   3.2.5. Comparing Measures and Assessment Results
   3.3.1. Likelihood and Impact Modeling
   3.3.2. Weighing Scale
References
Appendix A. Glossary
Appendix B. Data Analysis Dictionary
Appendix C. Modeling Impact and Likelihood
Appendix D. Change Log
1. Introduction
Information security measurement enables organizations to describe and quantify information security, allocate finite resources, and make informed and data-driven decisions. However, organizations first need to know what policies, procedures, and controls they have in place at any given time; whether those countermeasures are working effectively and efficiently; and how the organization and its risks are impacted. By developing and monitoring measurements that evaluate what an organization has in place for information security risk management and how well those efforts are working, an organization can better address its goals and direct resources.
1 This document uses the term controls to broadly describe identified countermeasures for managing information security risks. It is intended to be framework- and standard-agnostic and can also apply to other existing models or frameworks.
2 SP 800-55 uses the terms quantitative assessment and measurement synonymously. Refer to Sec. 1.4, Document Terminology, for additional information.
• NIST Internal Report (IR) 8286, Integrating Cybersecurity and Enterprise Risk Management (ERM) [4]
This document discusses concepts that are similar to the Stevens Scale of Measurement,
as shown in Table 1, but takes a different view on what is and is not a measurement. For
the purposes of this document, a nominal scale is considered a form of data gathering,
and an ordinal scale is considered a ranking system. Both interval and ratio scales use
variables that represent true numbers and can be used in a quantitative assessment, so
they are considered measurement [19].
Table 1. Stevens Scale of Measurement
2. Fundamentals
The terms measurement and assessment are often used interchangeably in the information security field. This document provides a lexicon for key terminology and an overview of foundational concepts for those looking to measure and assess information security risk, and it clarifies the distinction between measurement and assessment. As described in Sec. 1.4, assessment refers to the process of evaluating, estimating, or judging against defined criteria, and measurement is the process of obtaining quantitative values. Hence, assessment is a broader concept that also includes measurement.
Organizations perform multiple kinds of assessment when evaluating information security risk, such as risk assessments, program assessments, and control assessments. Risk assessments are used to identify the risks that an organization faces and can support decision-making [9]. Program-level assessments are used for decision-making about the strategies, policies, procedures, and operations that determine the security posture of an information security program. In control assessments, organizations evaluate whether specific controls are performing the way they were intended and achieving the desired results. Program assessments and control assessments are themselves forms of risk assessment and provide different lenses for viewing information security risk. SP 800-55 is intentionally agnostic on specific risk assessment models. However, many identify threat, likelihood, vulnerability, and impact as areas to assess. 4
may be more commonly used and easier to conduct, but their results can also be subjective and require everyone to have an equal understanding of the scale used.
Organizations will first consider their motivations for measuring information security risks before determining whether a quantitative or qualitative assessment is appropriate. For example, an organization motivated primarily by compliance with an industry certification or international standard has different measurement needs than an organization motivated by cost reduction. An organization could have multiple, competing motivations that drive the identification and selection of measures.
Some organizational motivations may benefit from quantitative assessments, such as trying to determine whether the organization is patching known vulnerabilities in an acceptable amount of time. Knowing the mean time to remediate a vulnerability provides more precise insight into patching efficiency than simply knowing the number of vulnerabilities patched in a year. Because the question of mean time to remediate a vulnerability deals in real, obtainable numbers, a measurement can be taken, and a mathematically derived answer can be given.
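For illustration only, the following Python sketch shows one way a mean time to remediate could be computed from vulnerability records. The record fields, values, and the 14-day target are assumptions for the example and are not part of this guidance.

    from datetime import datetime

    # Hypothetical vulnerability records with detection and remediation dates.
    vulnerabilities = [
        {"id": "VULN-1", "detected": datetime(2023, 5, 1), "patched": datetime(2023, 5, 9)},
        {"id": "VULN-2", "detected": datetime(2023, 6, 3), "patched": datetime(2023, 6, 20)},
        {"id": "VULN-3", "detected": datetime(2023, 7, 11), "patched": datetime(2023, 7, 15)},
    ]

    # Mean time to remediate (days): sum of remediation intervals divided by record count.
    days_to_patch = [(v["patched"] - v["detected"]).days for v in vulnerabilities]
    mttr_days = sum(days_to_patch) / len(days_to_patch)

    # Compare the measurement against an organization-defined target (assumed here).
    TARGET_DAYS = 14
    print(f"Mean time to remediate: {mttr_days:.1f} days (target: {TARGET_DAYS} days)")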
When real and attainable numbers based on gathered data can be found and analyzed, a quantitative assessment may be the appropriate action. If there are proposed questions that do not have measurable numbers attached to them but still need to be addressed, a qualitative assessment may be the best option.
Commonly used qualitative methods include color scales that represent risk levels or number scales that show rankings. For the purposes of this document, qualitative and semi-quantitative assessments are not considered measurement, and the values produced by these types of assessments are not considered measures. Most organizations will use a mixture of quantitative, semi-quantitative, and qualitative assessments. Ultimately, some or all of the assessment results will be used to determine success.
In addition to measurement, organizations also utilize metrics to track progress, facilitate decision-making, and improve performance. Information gained from measurement may be used to identify and define new metrics. Metrics can be applied at the system level, program level, 5 and organization level. System-level metrics, such as the frequency of third-party access to a system or the number of communication ports open, can facilitate tactical decision-making and support program-level metrics. Program-level metrics, such as the number of security incidents in a year or the cost per incident, may be helpful when making organizational strategic decisions. Both system- and program-level metrics can also support risk management-informed decision-making.
5 SP 800-39, Managing Information Security Risk: Organization, Mission, and Information System View, includes a model of multi-level risk management for the integration of risk management across the organization. In this model, three levels are identified to address risk: (i) the organization level, (ii) the mission/business process level, and (iii) the system level. For the purpose of this document, the program level can be synonymous with the mission/business process level and/or the organization level.
6 As described in Sec. 1.4, measures and quantitative assessment results can be used synonymously, as can the terms measurement and quantitative assessment.
internal security testing where organizations send fake phishing emails to determine which users respond to them. The rates of success are then judged against set criteria.
• Observational data refers to data captured through the observation of an activity or behavior without the direct involvement of the subject. Observational data is often gathered as part of routine information security operations, such as log management tools that are used to collect and analyze network activities. Data from these logs are observational and can be used for further analysis.
• Sampling is the process of taking samples of something for the purpose of analysis. Sampling may be used when continuous observation and passive data collection are not an option or when random, stratified, or systematic sampling may be preferred. Random sampling is a method of sampling where each sample has an equal chance of selection in hopes of gathering an unbiased representation. Stratified sampling is the process of segmenting a population across levels of some factor to minimize variability within those segments (e.g., taking a sample from a terminal in each department of an organization). Systematic sampling is a method of sampling where samples are taken at a regular interval (e.g., once an hour or from every tenth user). A brief sketch of these three sampling approaches follows this list.
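The following Python sketch is a minimal illustration of the three sampling approaches described above, applied to a hypothetical workstation inventory. The asset names and departments are assumptions for the example.

    import random

    # Hypothetical asset inventory: (hostname, department) pairs.
    assets = [(f"ws-{i:03d}", dept)
              for i, dept in enumerate(["finance", "hr", "engineering"] * 10)]

    # Random sampling: every asset has an equal chance of selection.
    random_sample = random.sample(assets, k=5)

    # Stratified sampling: group assets by department (the strata) and draw one from each.
    by_department = {}
    for host, dept in assets:
        by_department.setdefault(dept, []).append((host, dept))
    stratified_sample = [random.choice(group) for group in by_department.values()]

    # Systematic sampling: take every tenth asset in inventory order.
    systematic_sample = assets[::10]

    print(random_sample, stratified_sample, systematic_sample, sep="\n")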
Once the data from measurement is procured, the outputs of quantitative analysis can be used in a quantitative assessment to determine whether the organization is meeting its information security goals and to support risk-based decision-making. Data analysis methods 7 are largely based on the types of questions that the organization is asking about its information security risks, program, and controls. The NIST Engineering Statistics Handbook [18] identifies three popular approaches to data analysis (an illustrative sketch contrasting the classical and Bayesian approaches appears after this list):
1. Classical — The classical data analysis approach is when data collection is directly followed by modeling, and the analysis, estimation, and testing that come after focus on the parameters of that model. Classical data analysis includes deterministic and probabilistic models, such as regression and the analysis of variance (ANOVA).
2. Exploratory — Exploratory data analysis begins by inferring what model would be appropriate before trying different analytic models. Identifying patterns in the data may give insight as to which models would produce the most useful information. Some common exploratory data analysis graphical techniques include standard deviation plots and histograms.
3. Bayesian — Bayesian methodology consists of formally combining both the prior distribution of the parameters and the collected data to jointly make inferences and/or test assumptions about the model parameters. Bayesian methods can be used for expected range setting and predictive models.
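As a brief illustrative sketch (not prescribed by this guidance), the following Python example contrasts a classical point estimate with a Bayesian Beta-Binomial update for a hypothetical phishing-test click rate. The prior parameters and observed counts are assumptions.

    # Observed data from a hypothetical phishing test.
    clicks, emails_sent = 12, 400

    # Classical approach: estimate the click rate directly from the collected data.
    classical_rate = clicks / emails_sent

    # Bayesian approach: combine a Beta prior with the observed data to obtain a
    # posterior distribution for the click rate (Beta-Binomial model).
    prior_alpha, prior_beta = 2, 50      # assumed prior belief about the click rate
    posterior_alpha = prior_alpha + clicks
    posterior_beta = prior_beta + (emails_sent - clicks)
    posterior_mean = posterior_alpha / (posterior_alpha + posterior_beta)

    print(f"Classical estimate: {classical_rate:.3f}")
    print(f"Bayesian posterior mean: {posterior_mean:.3f}")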
Table 2 shows examples of quantitative analysis across risk assessment, program-level assessment, and control-level assessment.
Organizations that are early in the process of assessing their information security risks,
program, or systems may rely heavily on qualitative assessments that present non-
numerical information in place of measurement. These non-numerical methods can help
show context, examine labels, and look at behavior. A prominent example of qualitative
assessment featured in many information security measurement programs is the risk
matrix — a table that uses colored rating scales to show the impact and likelihood of
various risks. As organizations gain the ability to record and track information security
data, they are able to move away from the subjectivity of qualitative assessments and
toward the increased precision and reduced bias of quantitative assessments.
Measurements can be used to monitor organizational information security activities at the program and organization levels. These measurements may be derived by aggregating multiple system-level measures or developed by using the entire enterprise as the scope. Organization-level measurements require that the processes on which the measures depend are consistent and repeatable and that the necessary data is available across the organization.
Perfectly measuring information security is challenging due to the gap between mathematical models and practical implementations [21]. Instead, experimenting where possible with relative metrics, models, and approaches over time is the best way to identify the most effective performance indicators.
• Implementation evidence: Evidence used to compute the measure, validate that the activity is performed, and identify probable causes of unsatisfactory results for a specific measure.
  o For manual data collection, identify questions and data elements that would provide the data inputs necessary to calculate the measure’s formula, qualify the measure for acceptance, and validate the information provided.
  o For automated data collection, identify data elements that would be required for the formula, qualify the measure for acceptance, and validate the information provided.
• Frequency: How often the data is collected, analyzed, and reported. Select the frequency of data collection based on the rate of change being evaluated. Select the frequency of data reporting based on external reporting requirements and internal customer preferences.
• Responsible parties: Key stakeholders, such as:
  o Information owner — Identify the organizational component and the individual who owns the required information.
  o Information collector — Identify the organizational component and the individual responsible for collecting the data. 8
  o Information customer — Identify the organizational component and the individual who will receive the data.
• Data source: Location of the data to be used in calculating the measure, including databases, tracking tools, logs, organizations, and specific roles within the organization that can provide the required information.
• Reporting format: Indication of how the measure will be reported, such as a pie chart, line chart, bar graph, or other format. It may also be beneficial to include a sample. A sketch of how these fields might be captured follows below.
8 When possible, the information collector will be a different individual or even a representative of a different organizational unit than the
information owner to avoid the possibility of a conflict of interest and ensure separation of duties, though this may not be feasible for smaller
organizations.
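The following sketch (illustrative only; its structure and sample values are assumptions, not a prescribed template) shows one way the documentation fields listed above might be captured as a structured record.

    from dataclasses import dataclass

    @dataclass
    class MeasureRecord:
        """Hypothetical structure for documenting a single information security measure."""
        name: str
        implementation_evidence: list[str]   # data elements that feed the measure's formula
        collection_frequency: str            # how often the data is collected
        reporting_frequency: str             # how often results are reported
        information_owner: str
        information_collector: str
        information_customer: str
        data_source: str
        reporting_format: str

    # Illustrative entry; every value below is an assumption.
    patch_measure = MeasureRecord(
        name="Percentage of systems with up-to-date patches",
        implementation_evidence=["total systems", "systems at current patch level"],
        collection_frequency="weekly",
        reporting_frequency="quarterly",
        information_owner="System owner",
        information_collector="Vulnerability management team",
        information_customer="Chief information security officer",
        data_source="Patch management database",
        reporting_format="Line chart of the percentage over time",
    )
    print(patch_measure.name)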
performance measurement data repositories are protected in accordance with applicable laws, regulations, policies, and procedures.
In addition to making the data itself more useable, data analysis methods (e.g., sensitivity analysis and Monte Carlo analysis) can address uncertainty within the data. Organizations often make quantitative projections using statistical methods, such as regression, time series analysis, and machine learning methods. When looking at projections, it is helpful to consider that future events and other unknown factors can cause unforeseen changes.
Fig. 1. Notional process for the definition, collection, and analysis of metrics
When selecting measurements and metrics to focus on, it is helpful to know why the measurements are being taken and what purpose they serve. It is important that the chosen metrics tell a meaningful story about organization-, program-, or system-level information security. To do so, metrics are designed to be unambiguous so that their purpose and output can be more easily understood. For example, when evaluating cybersecurity awareness training, consider completion rates and the results of review quizzes instead of marking participation as “low, medium, or high.”
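For example, the following Python sketch (illustrative only, with an assumed training roster) computes a completion rate and a mean quiz score rather than a low/medium/high label.

    # Hypothetical awareness-training roster.
    roster = [
        {"user": "alice", "completed": True,  "quiz_score": 90},
        {"user": "bob",   "completed": True,  "quiz_score": 75},
        {"user": "carol", "completed": False, "quiz_score": None},
        {"user": "dave",  "completed": True,  "quiz_score": 60},
    ]

    completed = [r for r in roster if r["completed"]]
    completion_rate = len(completed) / len(roster)
    mean_quiz_score = sum(r["quiz_score"] for r in completed) / len(completed)

    print(f"Training completion rate: {completion_rate:.0%}")
    print(f"Mean quiz score among completers: {mean_quiz_score:.1f}")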
By keeping metrics consistent over time, organizations can evaluate long-term trends and expected ranges. A new metric may provide important insight, but tracking the measurements related to metrics over a continuous period (e.g., quarter to quarter, year to year) will give more information about the success of organization-, program-, and system-level information security plans, policies, procedures, and goals. Some metrics may be gathered because of outside guidance or regulations.
Key risk indicators (KRIs) and key performance indicators (KPIs) are examples of metrics, though not all metrics fall into these categories. Organizations may find a wide variety of metrics fit their needs. For example, appropriate measures at the organization level may include the cost per security incident as part of the budget allocation process, whereas measurements at the system level may include the frequency of virus scans across individual systems.
4. Impact
10 Records of these essential implementation assessment results are foundational to information security measurement and are addressed in SP
800-55 Volume 2.
Assessment Types | Examples of Qualitative or Semi-Quantitative Assessment Results | Examples of Measures
Implementation: Examine the progress of specific controls. | Determine whether identified controls are in place. | The percentage of systems with up-to-date patches (i.e., implementation of a specific control or capability)
Effectiveness: Examine how well controls are working. | Use a color-coded risk matrix to demonstrate the potential risks involved with improperly configured access controls. | A chart that shows changes in the percentage of information security incidents caused by improperly configured access controls over a 5-year period
Efficiency: Examine the timeliness of controls. | Use a 1–5 scale to determine whether the organization is at an acceptable level of responsiveness in case of an information security incident. | Data that compares the mean time of response to information security incidents versus the cost of the incident
Impact: Examine the impact of information security on an organization’s mission. | Rank risks on a color-coded scale to evaluate financial impacts to an organization. | Data on the known costs of breaches to industry peers
11 More information on risk assessments can be found in SP 800-30, Guide for Conducting Risk Assessments.
12 More information on risk registers can be found in [4].
13 Recency bias is the tendency to favor recent events or experiences over historical ones.
14 More information on cyber resiliency can be found in SP 800-160 Volume 2.
when prioritizing and selecting measures, even those that are unrelated to information security. Measures that are ultimately selected are useful for:
• Identifying causes of unsatisfactory performance
• Pinpointing areas for improvement
• Facilitating consistent policy implementation
• Redefining goals and objectives
• Modifying security policies
Access to average outputs, acceptable ranges, and long-term data makes effectiveness and efficiency measures more accurate and beneficial by enabling organizations to track changes over time. Even if processes are not yet consistent, average outputs and acceptable ranges help organizations set metrics. Some metrics are directly related to established averages, while others are set by other sources, and established ranges may not have any effect on organizational goals. While inconsistent processes will not provide meaningful data, measurements may still be used to establish average outputs and acceptable ranges for future analysis. Data analysis for finding average outputs and acceptable ranges will typically include historical data and a forecast of how the trend would continue if all variables stay the same.
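The following Python sketch illustrates one way an average output, an acceptable range, and a simple trend forecast could be derived from historical data. The monthly values, the two-standard-deviation range, and the least-squares trend are assumptions chosen for the example.

    import statistics

    # Hypothetical historical data: monthly mean time to respond to incidents (hours).
    monthly_response_hours = [30, 28, 31, 27, 26, 29, 25, 24, 26, 23, 24, 22]
    n = len(monthly_response_hours)

    # Average output and an acceptable range of two standard deviations around it.
    average_output = statistics.mean(monthly_response_hours)
    spread = statistics.stdev(monthly_response_hours)
    acceptable_range = (average_output - 2 * spread, average_output + 2 * spread)

    # Simple least-squares trend, extended three months assuming all variables stay the same.
    xs = list(range(n))
    x_mean = sum(xs) / n
    slope = (sum((x - x_mean) * (y - average_output) for x, y in zip(xs, monthly_response_hours))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = average_output - slope * x_mean
    forecast = [round(intercept + slope * (n + i), 1) for i in range(3)]

    print(f"Average output: {average_output:.1f} h")
    print(f"Acceptable range: {acceptable_range[0]:.1f} to {acceptable_range[1]:.1f} h")
    print(f"Forecast for the next three months: {forecast}")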
It is important to remember that some measures have the potential to give misleading information. Inputs such as phishing test success rates or the number of known vulnerabilities depend heavily on the quality of the work behind them. A poorly designed phishing test might show a better success rate while giving less information about the preparedness of the workforce to recognize a well-designed phishing email. This does not mean that organizations need to avoid these measures altogether, but numbers alone may not always tell the whole story.
References
[1] National Institute of Standards and Technology (2018) Framework for Improving Critical Infrastructure Cybersecurity, Version 1.1. (National Institute of Standards and Technology, Gaithersburg, MD), NIST Cybersecurity White Paper (CSWP) NIST CSWP 6. https://doi.org/10.6028/NIST.CSWP.6
[2] National Institute of Standards and Technology (2006) Minimum Security Requirements for Federal Information and Information Systems. (U.S. Department of Commerce, Washington, DC), Federal Information Processing Standards Publication (FIPS) 200. https://doi.org/10.6028/NIST.FIPS.200
[3] Bowen P, Kissel RL (2007) Program Review for Information Security Management Assistance (PRISMA). (National Institute of Standards and Technology, Gaithersburg, MD), NIST Interagency or Internal Report (IR) 7358. https://doi.org/10.6028/NIST.IR.7358
[4] Stine KM, Quinn SD, Witte GA, Gardner RK (2020) Integrating Cybersecurity and Enterprise Risk Management (ERM). (National Institute of Standards and Technology, Gaithersburg, MD), NIST Interagency or Internal Report (IR) 8286. https://doi.org/10.6028/NIST.IR.8286
[5] Taylor BN (2011) The current SI seen from the perspective of the proposed new SI. Journal of Research of the National Institute of Standards and Technology 116(6):797. https://doi.org/10.6028/jres.116.022
[6] Software Quality Group (2021) Metrics and Measures. (National Institute of Standards and Technology, Gaithersburg, MD). Available at https://www.nist.gov/itl/ssd/software-quality-group/metrics-and-measures
[7] Thomas D (2019) Monte Carlo Tool. (National Institute of Standards and Technology, Gaithersburg, MD). Available at https://www.nist.gov/services-resources/software/monte-carlo-tool
[8] ASTM International (2018) ASTM C1012/C1012M-18a – Standard Test Method for Length Change of Hydraulic-Cement Mortars Exposed to a Sulfate Solution (ASTM International, West Conshohocken, PA). https://doi.org/10.1520/C1012_C1012M-18A
[9] Joint Task Force Transformation Initiative (2012) Guide for Conducting Risk Assessments. (National Institute of Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-30, Rev. 1. https://doi.org/10.6028/NIST.SP.800-30r1
[10] Joint Task Force (2018) Risk Management Framework for Information Systems and Organizations: A System Life Cycle Approach for Security and Privacy. (National Institute of Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-37, Rev. 2. https://doi.org/10.6028/NIST.SP.800-37r2
[11] Chew E, Swanson MA, Stine KM, Bartol N, Brown A, Robinson W (2008) Performance Measurement Guide for Information Security. (National Institute of Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-55, Rev. 1. https://doi.org/10.6028/NIST.SP.800-55r1
[12] Grance T, Nolan T, Burke K, Dudley R, White G, Good T (2006) Guide to Test, Training, and Exercise Programs for IT Plans and Capabilities. (National Institute of Standards and Technology, Gaithersburg, MD), NIST Special Publication (SP) 800-84. https://doi.org/10.6028/NIST.SP.800-84
experimentation
A systematic approach to the process of testing new ideas, methods, or activities that applies principles and techniques at the data collection stage to ensure the generation of valid, defensible, and supportable conclusions.

imputation
The replacement of unknown, unmeasured, or missing data with a particular value. The simplest form of imputation is to replace all missing values with the average of that variable. [18, adapted]
mean
The sum of the data points divided by the number of data points. Commonly referred to as the average. [18, adapted]

measurement
The process of obtaining quantitative values using quantitative methods.

measures
Quantifiable and objective values that result from measurement.

median
The value of the point that has half the data smaller than that point and half the data larger than that point. [18]

metrics
Measures and assessment results designed to track progress, facilitate decision-making, and improve performance with respect to a set target.

mode
The value of the random sample that occurs with the greatest frequency. This value is not necessarily unique. [18]

normalization
The conversion of information into consistent representations and categorization. [4]

outliers
An observation that lies an abnormal distance from other values in a random sample from a population. [18]
regression
A statistical technique used to predict the value of a variable based on the relationship between explanatory variables.

sampling
The process of taking samples of something for the purpose of analysis.

transformation
The conversion of one state or format into another state or format.