INTERNATIONAL STANDARD ISO/IEC 27041
First edition
2015-06-15
Information technology — Security techniques — Guidance on assuring suitability and adequacy of incident investigative method
Reference number: ISO/IEC 27041:2015(E)
© ISO/IEC 2015
Contents
Foreword
Introduction
1 Scope
2 Normative references
3 Terms and definitions
4 Symbols and abbreviated terms
5 Method development and assurance
  5.1 Overview
  5.2 General principles
  5.3 General development and deployment model
  5.4 Assurance stages
  5.5 Requirements capture and analysis
    5.5.1 General principles of requirements
    5.5.2 Functional Requirements
    5.5.3 Verification of requirements
  5.6 Process Design
    5.6.1 Overview
    5.6.2 Tool Selection
    5.6.3 Uncertainty and risk evaluation
  5.7 Process Implementation
    5.7.1 Overview
    5.7.2 Tool choice — guidance for deployment
  5.8 Process Verification
    5.8.1 General principles of verification
    5.8.2 Verification of processes
    5.8.3 Verification of tools
  5.9 Process Validation
    5.9.1 General principles of validation
    5.9.2 Comprehensive validation
    5.9.3 Sufficient validation
    5.9.4 Fully validated processes
    5.9.5 Failed validation
  5.10 Confirmation
  5.11 Deployment
    5.11.1 Tool choice
  5.12 Review and Maintenance
6 Assurance Models
  6.1 Overview
  6.2 In-house assurance
  6.3 External assurance
  6.4 Mixed assurance
7 Production of evidence for assurance
  7.1 Overview
  7.2 Pre-validation preparation
  7.3 Producing Evidence of Validation
  7.4 Maintenance of Validation
  7.5 Validation of Examinations
  7.6 Validation of Investigations
Annex A (informative) Examples
Bibliography
Foreword
ISO (the International Organization for Standardization) and IEC (the International Electrotechnical
Commission) form the specialized system for worldwide standardization. National bodies that are
members of ISO or IEC participate in the development of International Standards through technical
committees established by the respective organization to deal with particular fields of technical
activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international
organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the
work. In the field of information technology, ISO and IEC have established a joint technical committee,
ISO/IEC JTC 1.
The procedures used to develop this document and those intended for its further maintenance are
described in the ISO/IEC Directives, Part 1. In particular the different approval criteria needed for
the different types of document should be noted. This document was drafted in accordance with the
editorial rules of the ISO/IEC Directives, Part 2 (see www.iso.org/directives).
Attention is drawn to the possibility that some of the elements of this document may be the subject
of patent rights. ISO and IEC shall not be held responsible for identifying any or all such patent rights.
Details of any patent rights identified during the development of the document will be in the Introduction
and/or on the ISO list of patent declarations received (see www.iso.org/patents).
Any trade name used in this document is information given for the convenience of users and does not
constitute an endorsement.
For an explanation on the meaning of ISO specific terms and expressions related to conformity assessment, as well as information about ISO's adherence to the WTO principles in the Technical Barriers to Trade (TBT), see the ISO Foreword — Supplementary information page at www.iso.org/iso/foreword.html.
The committee responsible for this document is ISO/IEC JTC 1, Information technology, SC 27, IT Security
techniques.
Introduction
About this International Standard
This International Standard is concerned with providing assurance that the investigative process used
is appropriate for the incident under investigation and the results which are required. It also describes,
at an abstract level, the concept of breaking seemingly complex processes into a series of smaller atomic
parts, which should aid in the development of simple, yet robust, investigation methods. It should be
considered by any person authorising, giving instruction for, managing, or conducting an investigation.
It should be applied prior to any investigation, in the context of principles and processes (defined in
ISO/IEC 27043:2015) and sound preparation and planning (defined in ISO/IEC 27035-2, to be published) to ensure the
suitability of methods to be applied in the investigative processes described in ISO/IEC 27037:2012 and
ISO/IEC 27042:2015.
Relationship to other standards
This International Standard is intended to complement other standards and documents which give
guidance on the investigation of, and preparation to investigate, information security incidents. It is not
a comprehensive guide, but lays down certain fundamental principles which are intended to ensure that
tools, techniques, and methods can be selected appropriately and shown to be fit for purpose should the
need arise.
This International Standard is also intended to inform decision-makers who need to determine the
reliability of digital evidence presented to them. It is applicable to organizations needing to protect,
analyse, and present potential digital evidence. It is relevant to policy-making bodies that create and
evaluate procedures relating to digital evidence, often as part of a larger body of evidence.
This International Standard describes part of a comprehensive investigative process which includes, but
is not limited to, the following topic areas:
— incident management, including preparation and planning for investigations;
— handling of digital evidence;
— use of, and issues caused by, redaction;
— intrusion prevention and detection systems, including information which can be obtained from
these systems;
— security of storage, including sanitization of storage;
— ensuring that investigative methods are fit for purpose;
— carrying out analysis and interpretation of digital evidence;
— understanding principles and processes of digital evidence investigations;
— security incident event management, including derivation of evidence from systems involved in
security incident event management;
— relationship between electronic discovery and other investigative methods, as well as the use of
electronic discovery techniques in other investigations;
— governance of investigations, including forensic investigations.
These topic areas are addressed, in part, by the following ISO/IEC standards:
— ISO/IEC 27037:2012
This International Standard describes the means by which those involved in the early stages of an
investigation, including initial response, can ensure that sufficient potential digital evidence is captured
to allow the investigation to proceed appropriately.
— ISO/IEC 27038:2014
Some documents can contain information that must not be disclosed to some communities. Modified
documents can be released to these communities after an appropriate processing of the original
document. The process of removing information that is not to be disclosed is called “redaction”.
The digital redaction of documents is a relatively new area of document management practice, raising
unique issues and potential risks. Where digital documents are redacted, removed information must
not be recoverable. Hence, care needs to be taken so that redacted information is permanently removed
from the digital document (e.g. it must not be simply hidden within non-displayable portions of the
document).
ISO/IEC 27038:2014 specifies methods for digital redaction of digital documents. It also specifies
requirements for software that can be used for redaction.
— ISO/IEC 27040:2015
This International Standard provides detailed technical guidance on how organizations can define
an appropriate level of risk mitigation by employing a well-proven and consistent approach to the
planning, design, documentation, and implementation of data storage security. Storage security applies
to the protection (security) of information where it is stored and to the security of the information
being transferred across the communication links associated with storage. Storage security includes
the security of devices and media, the security of management activities related to the devices and
media, the security of applications and services, and security relevant to end-users during the lifetime
of devices and media and after end of use.
Security mechanisms like encryption and sanitization can affect one’s ability to investigate by
introducing obfuscation mechanisms. They have to be considered prior to and during the conduct of
an investigation. They can also be important in ensuring that storage of evidential material during and
after an investigation is adequately prepared and secured.
— ISO/IEC 27042:2015
This International Standard describes how methods and processes to be used during an investigation
can be designed and implemented in order to allow correct evaluation of potential digital evidence,
interpretation of digital evidence, and effective reporting of findings.
— ISO/IEC 27043:2015
This International Standard defines the key common principles and processes underlying the
investigation of incidents and provides a framework model for all stages of investigations.
The following ISO/IEC projects also address, in part, the topic areas identified above and can lead to the
publication of relevant standards at some time after the publications of this International Standard.
— ISO/IEC 27035 (all parts), to be published
This is a three-part standard that provides organizations with a structured and planned approach to the management of information security incidents. It is composed of the following parts:
— ISO/IEC 27035-1 (to be published)
This part presents basic concepts and phases of information security incident management. It combines
these concepts with principles in a structured approach to detecting, reporting, assessing, responding,
and applying lessons learned.
— ISO/IEC 27035-2 (to be published)
This part presents the concepts to plan and prepare for incident response. The concepts, including
incident management policy and plan, incident response team establishment, and awareness briefing
and training, are based on the plan and prepare phase of the model presented in ISO/IEC 27035-1. This
part also covers the “Lessons Learned” phase of the model.
— ISO/IEC 27035-3 (to be published)
This part includes staff responsibilities and practical incident response activities across the organization.
Particular focus is given to incident response team activities, including monitoring, detection, analysis, and response activities for the collected data or security events.
— ISO/IEC 27050 (all parts) (new project)
This addresses activities in electronic discovery, including, but not limited to identification, preservation,
collection, processing, review, analysis, and production of electronically stored information (ESI).
In addition, it provides guidance on measures, spanning from initial creation of ESI through its final
disposition, which an organization can undertake to mitigate risk and expense should electronic
discovery become an issue. It is relevant to both non-technical and technical personnel involved in some
or all of the electronic discovery activities. It is important to note that this guidance is not intended to
contradict or supersede local jurisdictional laws and regulations.
Electronic discovery often serves as a driver for investigations, as well as evidence acquisition and
handling activities. In addition, the sensitivity and criticality of the data sometimes necessitate
protections like storage security to guard against data breaches.
— ISO/IEC 30121:2015
This International Standard provides a framework for governing bodies of organizations (including
owners, board members, directors, partners, senior executives, or similar) on the best way to prepare
an organization for digital investigations before they occur. This International Standard applies to the
development of strategic processes (and decisions) relating to the retention, availability, access, and
cost effectiveness of digital evidence disclosure. This International Standard is applicable to all types
and sizes of organizations. The International Standard is about the prudent strategic preparation for
digital investigation of an organization. Forensic readiness ensures that an organization has made the
appropriate and relevant strategic preparation for accepting potential events of an evidential nature.
Actions may occur as the result of inevitable security breaches, fraud, and reputation assertion. In every
situation, information technology (IT) has to be strategically deployed to maximize the effectiveness of
evidential availability, accessibility, and cost efficiency.
Figure 1 shows typical activities surrounding an incident and its investigation. The numbers shown in
this diagram (e.g. 27037) indicate the International Standards listed above and the shaded bars show
where each is most likely to be directly applicable or has some influence over the investigative process
(e.g. by setting policy or creating constraints). It is recommended, however, that all should be consulted
prior to, and during, the planning and preparation phases. The process classes shown are defined fully
in this International Standard and the activities identified match those discussed in more detail in
ISO/IEC 27035‑2, ISO/IEC 27037:2012, and ISO/IEC 27042:2015.
1 Scope
This International Standard provides guidance on mechanisms for ensuring that methods and processes
used in the investigation of information security incidents are “fit for purpose”. It encapsulates best
practice on defining requirements, describing methods, and providing evidence that implementations of
methods can be shown to satisfy requirements. It includes consideration of how vendor and third-party
testing can be used to assist this assurance process.
This document aims to
— provide guidance on the capture and analysis of functional and non-functional requirements
relating to an Information Security (IS) incident investigation,
— give guidance on the use of validation as a means of assuring suitability of processes involved in the
investigation,
— provide guidance on assessing the levels of validation required and the evidence required from a
validation exercise,
— give guidance on how external testing and documentation can be incorporated in the validation
process.
2 Normative references
The following documents, in whole or in part, are normatively referenced in this document and are
indispensable for its application. For dated references, only the edition cited applies. For undated
references, the latest edition of the referenced document (including any amendments) applies.
ISO/IEC 27000:2013, Information technology — Security techniques — Information security management
systems — Overview and vocabulary
3 Terms and definitions

3.2
black box testing
examining a process by using it to process known inputs and comparing the results against predicted
outputs which reflect the requirements for the process
3.3
client
person or organisation on whose behalf the investigation is to be undertaken
[SOURCE: ISO/IEC 27042:2015, 3.2]
3.4
confirmation
formal assessment of existing objective evidence that a process is fit (or remains fit) for a specified
purpose
3.5
contemporaneous notes
contemporaneous record
written record of actions taken and decisions made, produced at the same time as, or as soon as practically possible after, the actions and decisions it records
Note 1 to entry: In many jurisdictions, it is necessary for contemporaneous notes to be handwritten in non-
erasable ink in a tamper-evident notebook to assist with issues of authenticity and admissibility.
3.12
process
operational implementation of a method (3.11)
3.13
producer
creator or provider of a tool (3.17), including anyone who modifies or customises a tool
Note 1 to entry: The person(s) or organization(s) responsible for the creation or maintenance of a tool or
customisation of a tool is the producer.
Note 2 to entry: Providing scripts to automate common functions constitutes modifying or customising a tool.
3.14
requirements
statement which translates or expresses a need and its associated constraints and conditions
Note 1 to entry: Requirements exist at different tiers, ranging from high-level expressions of need to detailed statements (e.g. a software component requirement).
[SOURCE: ISO/IEC 27004:2009, 3.17, Modified – Note 1 to entry has been added]
3.19
validation set
series of objective tests with clearly defined goals, inputs, and outputs, directly related to the agreed
requirements (3.14) for the process (3.12) under validation (3.18)
3.20
verification
confirmation (3.4), through the provision of objective evidence, that specified requirements have been
fulfilled
Note 1 to entry: Verification only provides assurance that a product conforms to its specification.
[SOURCE: ISO/IEC 27004:2009, 3.18, Modified – Original note was removed, Note 1 to entry has been
added]
3.21
verification function
function which is used to verify that two sets of data are identical
[SOURCE: ISO/IEC 27037:2012, 3.25, Modified – Notes were removed]
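By way of illustration only (not part of the definition above), a verification function of this kind is commonly realised by comparing cryptographic digests of the two data sets. The following minimal Python sketch assumes SHA-256 as the digest algorithm; the function names are illustrative.

```python
# Illustrative sketch of a verification function (3.21): confirms that two
# sets of data are identical by comparing cryptographic digests.
# The choice of SHA-256 is an assumption for illustration only.
import hashlib
from pathlib import Path

def digest(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_identical(original: Path, copy: Path) -> bool:
    # True only if both data sets produce the same digest.
    return digest(original) == digest(copy)
```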
3.22
white box testing
testing which includes inspection of the implementation detail
3.23
work instruction
detailed description of how to perform and record a process (3.12)
[SOURCE: ISO/TR 10013:2001, 3.1, Modified – changed from plural to singular, task changed to process]
5 Method development and assurance

5.1 Overview
Assurance of suitability and adequacy of incident investigation methods can be required in order to demonstrate clearly that the investigator used methods which were fit for the purpose(s) of the investigation and which were not subject to unacceptable errors or uncertainty. Digital evidence resulting from the application of unassured methods can be considered inherently flawed and subject to challenge, which can result in it being rendered useless for the purposes of the investigation.
This standard presents an assurance model which includes all stages of development of the activities
which make up an investigation, as described in ISO/IEC 27042:2015, from initial specification through
to deployment and maintenance.
The stages of the model are:
— Requirements capture and analysis
— Process design
— Process implementation
— Process verification
— Process validation
— Confirmation
— Deployment
— Review and maintenance
Each of these stages is discussed in more detail below.
[Figure: general development and deployment model, relating the stages above to subclauses 5.5 to 5.12]
5.5 Requirements capture and analysis

5.5.1 General principles of requirements

Prior to designing a process for use in an examination, a proper set of requirements should be produced,
accepted by the client and recorded in accordance with good practice. This set of requirements should be
derived from the requirements identified for the complete investigation and may include both functional
and non-functional requirements.
Each requirement defines an essential capability, characteristic or quality factor. Each individual
requirement statement should be necessary, implementation-free (i.e. it states only what is required,
not how the requirement should be met), unambiguous, complete, singular and consistent with the
remainder of the requirements in the set.
Requirements vary in intent and in the kinds of properties they represent. They can be grouped together
into similar types to aid in analysis and verification. Examples of types of requirements include:
— Functional – describe the functions or tasks to be performed, including such considerations as expected inputs and outputs;
— Performance – define the extent, how well, and under what conditions a function or task is to be performed;
— Interface – define how the solution interacts with external systems, or how elements within the solution (including human elements) interact with each other;
— Process – include compliance with local laws and processes or administrative requirements;
— Non-functional – define qualities the solution is expected to exhibit, including quality requirements such as portability, reliability, maintainability and security, or human factors requirements such as safety, efficiency or health and wellbeing.
In addition to all essential requirements, the lists of requirements produced should also include clear
definitions of the boundaries of operation associated with the anticipated potential digital evidence
and related investigative processes (e.g. maximum file sizes, maximum and minimum number of input
values).
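As an illustrative sketch only, the following Python fragment shows one possible way of recording a single requirement together with its boundaries of operation, as discussed above. The field names and example values are assumptions, not part of this International Standard.

```python
# Minimal, illustrative record for a single requirement (see 5.5.1).
# Field names and the example values are assumptions for illustration only.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    req_id: str    # unique identifier within the requirement set
    req_type: str  # e.g. "functional", "performance", "interface",
                   # "process", "non-functional"
    statement: str # implementation-free, unambiguous, singular
    boundaries: dict = field(default_factory=dict)  # limits of operation

example = Requirement(
    req_id="REQ-001",
    req_type="functional",
    statement="The process shall extract allocated files from NTFS volumes.",
    boundaries={"max_image_size_GB": 2048, "filesystems": ["NTFS"]},
)
```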
A new list of requirements may need to be formulated for each investigation undertaken, to ensure that the examination correctly fulfils the specific case requirements. A monolithic approach to design would require a significant validation overhead, so the user should, where practically possible, select predefined atomic stages which accept dynamic, user-definable input parameters.
In that way, the unique changes to the requirements will typically be limited to the case-specific input parameters. Case-specific validation is then predominantly limited to the parameters supplied for the case under investigation, rather than to the underlying function or process, which should have been designed during the readiness phase.
EXAMPLE While specific keyword searches will be directly dependent on the case being investigated, the keyword filter process should, if designed correctly, be an atomic process which is independent of the keywords used. The area which requires unique, case-specific validation will be the correctness of the keyword(s) applied (i.e. the undefined uncertainty will lie in the user's design of the specific search terms used; for instance, searching only for "Joe Blogs" would miss references to "Joe Bloggs", "Mr Blogs", "J. Blogs", "Joe", "Joey", etc.).
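The separation described in the EXAMPLE above can be sketched as follows: the filter logic is an atomic process that can be validated once, independently of the keywords, while the keyword list remains a case-specific input parameter. This Python fragment is illustrative only and is not a specification of any particular tool.

```python
# Illustrative atomic keyword-filter process (see the EXAMPLE above).
# The filter logic is validated once, independently of the keywords;
# the keyword list is the case-specific parameter supplied per investigation.
from typing import Iterable, List

def keyword_filter(lines: Iterable[str], keywords: List[str]) -> List[str]:
    """Return every line containing at least one of the supplied keywords."""
    lowered = [k.lower() for k in keywords]
    return [line for line in lines if any(k in line.lower() for k in lowered)]

# Case-specific parameters: the correctness of the terms themselves
# (e.g. "Joe Blogs" vs. "Joe Bloggs") remains the user's responsibility.
hits = keyword_filter(["Mr Blogs attended", "nothing relevant"], ["blogs"])
```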
The incident under investigation should be clearly identified and defined, including limitations to the
scope of the investigation. Sources of potential digital evidence and questions to be answered should be
identified. Sources of risk and their potential effects on the investigation, personnel and systems should
also be identified.
Once the requirements for the investigation have been identified, the investigative team should then
develop requirements for the examinations, analyses and processes which will make up the investigation
(see ISO/IEC 27042:2015).
5.5.2 Functional Requirements

Functional requirements are those stemming directly from investigative needs and which are expected
by the users of the process. They do not define how the process should operate but will include such
considerations as expected inputs and outputs. All functional requirements should be satisfied by the
investigation.
EXAMPLE The need to process a particular type of filesystem is a functional requirement as it is derived
directly from a source of potential digital evidence.
5.5.3 Verification of requirements

Undertaking an exercise to verify the requirements will ensure that the specified requirements are well
formed and that the needs of the investigative method have been adequately expressed. It involves an
analysis of the recorded requirements to identify problems such as conflicting, missing, incomplete,
ambiguous, inconsistent or incongruous requirements. Any identified problems should be resolved
before moving on to subsequent assurance stages.
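Simple automated checks can support, but not replace, this analysis. The following illustrative Python sketch assumes requirement records shaped like the earlier example and applies a few heuristic rules (unique identifiers, non-empty statements, possibly compound statements); the rules are assumptions and expert review remains necessary to detect conflicting, ambiguous or incomplete requirements.

```python
# Illustrative, partial checks supporting verification of requirements (5.5.3).
# Assumes records with req_id and statement attributes, as sketched in 5.5.1.
def check_requirements(reqs):
    problems = []
    seen_ids = set()
    for r in reqs:
        if not r.statement.strip():
            problems.append(f"{r.req_id}: empty statement")
        if r.req_id in seen_ids:
            problems.append(f"{r.req_id}: duplicate identifier")
        seen_ids.add(r.req_id)
        if " and " in r.statement.lower():
            # Possible compound requirement: statements should be singular.
            problems.append(f"{r.req_id}: possibly not singular")
    return problems
```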
5.6 Process Design

5.6.1 Overview
The design of a process should take account of all requirements identified as a result of the requirement
capture and analysis stages. It should give detail of how the method will be implemented, taking account of accepted non-functional requirements; it is also the point at which tool selection should be carried out.
Design need not specify the exact detail of each element of the process, but should clearly identify the
flow of activity and evidential material from one step to the next.
5.6.2 Tool Selection

During the design phase, any tools which may participate in the process should be identified, along with their role(s) in the resulting process. Where several tools can perform the same function in the process, it may be useful to identify some or all of these tools in order to cope with variation in operating environments (e.g. write blockers may offer different interfaces such as ATA, SATA, USB, etc.). Care should be taken, however, to ensure that allowing for variable requirements in this way does not adversely affect the atomic nature of the process.
The documented process should define the group of tools which should be considered for use, along with
the identified and, where possible, quantified risk to specific atomic functions of each of the tools listed.
5.6.3 Uncertainty and risk evaluation

All tools, be they hardware or software based, will be prone to a level of error. This is an unavoidable reality caused by the fact that they are physically manufactured components (software resides on a physical computer and so is affected by both hardware-based and software-based errors), designed and implemented to within a predefined tolerance of an ideal which can never be guaranteed to be 100 % perfect. The error may range from relatively high to insignificantly minute, but either way it will exist and can never be completely eliminated, only controlled and accounted for.
The investigator’s familiarity with the proposed tool or process should also be taken into account, as the
less familiar a user is with a tool or process, the greater the chance that additional uncontrolled errors
will occur. Effective training and routine proficiency testing are widely accepted techniques for helping
to minimize this specific type of error.
These errors are collectively known as the uncertainty characteristics of each element or component
of a process, and can be, in simple terms, characterized as the strengths or weaknesses of the process.
The uncertainty characteristics will typically be additive in nature in a linear system, such as the model described, and so will normally increase in proportion to the number of processes used. To compensate, robust, proportionally overlapping processes should be designed to ensure that they strengthen the provenance of all the digital evidence found.
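This International Standard does not prescribe a formula for combining uncertainties. Purely as a hedged illustration, if each atomic process in a linear sequence is assigned an independent estimated error probability, the chance that at least one process errs grows with the number of processes used and, for small individual rates, is approximately the sum of those rates.

```python
# Hedged illustration only: the standard does not prescribe a formula.
# If each atomic process has an independent estimated error probability e_i,
# the probability that at least one process in a linear sequence errs is
# 1 - prod(1 - e_i), which grows with the number of processes used and,
# for small e_i, is approximately the sum of the individual rates.
from math import prod

def combined_uncertainty(error_rates):
    return 1 - prod(1 - e for e in error_rates)

# Three atomic processes with small individual uncertainties still accumulate:
# roughly 0.6 % combined for individual rates of 0.1 %, 0.2 % and 0.3 %.
print(combined_uncertainty([0.001, 0.002, 0.003]))  # ~0.005989
```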
Before using a preferred tool or method to conduct an investigation, the investigators should consider the likely effects of all the weaknesses of the complete process sequence selected. A formal understanding of the weaknesses of a process enables these errors to be effectively controlled through correct process selection and robust, documented risk analyses or assessments.
The use of validated atomic elements within a process sequence can also be a significant aid in understanding and controlling these uncertainties, and this is the primary reason why it is central to the assurance methods detailed in this document.
Finally, it should also be understood that even though a specific process may exhibit unknown or significant weaknesses, this does not necessarily exclude it from consideration as appropriate. Indeed, in some cases, where it is the only process available to complete a required task, it will likely be considered invaluable.
5.7 Process Implementation

5.7.1 Overview
Once the design has been completed, it should be implemented in the form of a documented, detailed work instruction which gives step-by-step instructions for the correct operation of each step of the process.
During this stage, final decisions about tool selection (e.g. choosing between alternative versions of the
same tool type) may be taken in order to improve the process.
5.7.2 Tool choice — guidance for deployment

Where the design of the process includes a list of tools which may be used to perform the same, or
similar, functions, the work instruction should provide guidance (see also 5.6.2) on how the investigator
should choose the appropriate tool for the conditions encountered during the examination. Care should
be taken, however, to ensure that allowing for variation, in this way, does not adversely affect the atomic
nature of the process.
Maintenance of a risk register, which includes assessments of risk and uncertainty for available tools, can
assist with tool choice. A tool can start with a blank entry in this register, but this should be considered a risk, as it is likely to indicate that the tool is new and has not yet been evaluated in any detail.
A tool which is not the most suitable for the particular circumstance might still be chosen for use, as long as its evaluation and its usage within the defined process are clearly documented, along with its associated risks.
NOTE A tool may be chosen because it is the best available from a set of generally poor solutions, or because
it is the only tool available which can produce any form of usable result.
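One possible shape for an entry in such a risk register is sketched below. The field names, the tool name and the values are purely illustrative assumptions; a blank evaluation is itself recorded so that it can be treated as a risk, as noted above.

```python
# Illustrative shape of a tool risk register entry (see 5.7.2).
# Field names and values are assumptions, not requirements of this standard.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ToolRiskEntry:
    tool: str                              # name and exact version of the tool
    function: str                          # atomic function the tool performs
    known_limitations: list                # documented weaknesses or boundary issues
    estimated_error_rate: Optional[float]  # None = not yet evaluated (a risk)
    last_evaluated: Optional[str]          # date of last evaluation, if any

entry = ToolRiskEntry(
    tool="imager-x 2.4.1",                 # hypothetical tool name
    function="bit-for-bit acquisition of SATA media",
    known_limitations=["does not handle drives with native 4K sectors"],
    estimated_error_rate=None,             # blank entry: treat as an unevaluated risk
    last_evaluated=None,
)
```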
5.8 Process Verification

5.8.1 General principles of verification

Verification provides a level of assurance that a process or tool conforms to its specification. This does
not guarantee that it will operate in the desired way in the context of a complete investigation or process.
Evidence of verification against requirements which are similar to those for the intended use should be
treated as an initial indicator that the tool or process may be suitable for deployment in the context of
an investigation, but not a complete assurance that it will satisfy the requirements for the intended use.
Verification should be considered an optional, but potentially useful, part of assurance.
5.8.2 Verification of processes

Following development of the work instruction, it should be compared with the design, and evidence should be produced to show how the work instruction complies with the design. The design may be amended to reflect implementational changes made during production of the work instruction (e.g. as a result of unexpected or new tool behaviours). Verification will usually be carried out using “white box testing” in
order to allow comparison with the design.
5.8.3 Verification of tools

Tools can be verified by the user, the producer, or a third party. Where a tool is verified by the producer
or a third party, the verification will normally be based on the design requirements for the tool. A third
party or producer verification is only useful, as part of the validation, if complete information about the
verification is provided, including the design requirements against which the tool was verified. If these
design requirements can be mapped onto the requirements for the tool’s participation in the process
under consideration, the matching verification data can provide partial evidence of validation for those
stages of the process in which the tool participates. Verification is not sufficient, in its own right, to
achieve validation of a process as verification does not consider the way in which the user of the tool
intends to use the tool in the process.
5.9 Process Validation

5.9.1 General principles of validation

Validation demonstrates that the process defined in the work instruction fulfils the requirements
agreed with the client. It does not directly consider the implementation defined by the work instruction,
but provides evidence that the process produces correct outputs for the defined set of inputs. Where
possible, the validation process should also determine boundary conditions and error rates. Typically,
the validation process will be conducted through “black box testing” to ensure that knowledge of
implementational detail does not affect the conduct of the testing or influence the results.
A validation plan and associated data should be produced independently of the design and implementation
phases and should be based solely on the agreed requirements.
NOTE Processes which use unverified tools or methods can be validated if they provide consistent results (cf.
requirement for repeatability and reproducibility described in ISO/IEC 27037:2012).
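A minimal black-box validation harness might look like the following sketch, in which the validation set (see 3.19) pairs known inputs with outputs predicted solely from the agreed requirements. The function signature and data shapes are illustrative assumptions.

```python
# Minimal black-box validation sketch (see 5.9.1 and 3.19).
# Each test case in the validation set pairs known inputs with the output
# predicted from the agreed requirements; implementation detail is not used.
def run_validation(process, validation_set):
    """process: callable under validation; validation_set: list of
    (test_id, inputs, expected_output) tuples derived from requirements."""
    results = []
    for test_id, inputs, expected in validation_set:
        try:
            actual = process(*inputs)
            results.append((test_id, actual == expected, actual))
        except Exception as exc:          # record failures, do not hide them
            results.append((test_id, False, repr(exc)))
    passed = all(ok for _, ok, _ in results)
    return passed, results                # evidence to retain for confirmation
```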
5.9.2 Comprehensive validation

Comprehensive validation refers to a validation which tests a process under all possible conditions (e.g.
on all possible hardware configurations for all possible inputs). This is not considered essential and is
likely to be prohibitively expensive in terms of time and resources required. Where a process forms a
key part of several analyses, and is likely to be deployed regularly, comprehensive validation can be
essential, but this may only apply to processes which are used in multiple different types of investigation
by multiple different investigative teams.
In other circumstances (e.g. for a “one-off” process intended to solve an immediate problem but not likely to be re-used), a sufficient validation may be adequate. Deferring validation until after deployment should be avoided unless absolutely necessary; where this is unavoidable, some simple pre-deployment validation (based on a limited set of requirements) should always be attempted, and more thorough post-deployment validation should be applied as soon as practically possible, especially if the process is to be used in the future.
EXAMPLE A process for recovering data from magnetic stripe cards may be comprehensively validated as
there are relatively few formats for storing data on such cards.
5.9.3 Sufficient validation

Sufficient validation refers to validation against agreed functional and non-functional requirements
for the conditions pertaining at the time of the investigation. It is not necessary to validate a method
against software and hardware configurations which will not be relevant in the investigation, nor is it
necessary to consider validation for data which will not be processed.
A sufficient validation is one which shows that the process produces correct results for the type of inputs
encountered in the investigation under consideration, i.e. a sufficient validation shows that a process is
fit for a particular intended use as defined by the identified requirements.
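The difference between comprehensive and sufficient validation can be illustrated, under assumed conditions, as the difference between exercising every combination of relevant conditions and exercising only those combinations pertaining to the investigation at hand. The condition lists below are invented for illustration.

```python
# Hedged illustration of the difference between comprehensive and sufficient
# validation (5.9.2, 5.9.3). The condition lists are assumptions.
from itertools import product

filesystems = ["NTFS", "FAT32", "exFAT", "ext4"]
interfaces = ["SATA", "USB", "NVMe"]
inputs = ["empty volume", "typical volume", "maximum supported size"]

# Comprehensive validation: every combination of conditions (often impractical).
comprehensive = list(product(filesystems, interfaces, inputs))   # 36 cases

# Sufficient validation: only the conditions pertaining to this investigation.
sufficient = list(product(["NTFS"], ["USB"], inputs))            # 3 cases
```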
5.9.4 Fully validated processes

A process which has passed validation may be described as fully validated for the intended use defined
in the validation plan. A process should, normally, not be employed until it has been fully validated.
5.9.5 Failed validation

If a process does not pass validation, the requirements, design and implementation should be reviewed
and amended appropriately. Once this is complete, the process should be subjected to validation again.
5.10 Confirmation
The final step before deployment of the process is confirmation. This formally assesses whether the
process meets the agreed requirements and provides formal evidence that the process is fit for use in
the investigation.
For confirmation to be carried out the evidence of validation for the process should be checked against
the agreed requirements for the intended use of the process. A process may only be confirmed if it is
fully validated.
A process which has previously been validated may be confirmed without further validation if it is fully
validated for the current investigation. This includes processes which have been subject to external
validation.
Confirmation can be the final step of a validation or re-validation, or can constitute a step, in its own
right, which provides a formal record that previously produced validation evidence, from a prior
investigation, is adequate for the current investigation.
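As an illustration only, confirmation can be thought of as a coverage check of retained validation evidence against the agreed requirements for the intended use. The data shapes in the following sketch are assumptions.

```python
# Illustrative confirmation check (5.10): existing validation evidence is
# formally assessed against the agreed requirements for the intended use.
# The data shapes are assumptions for illustration only.
def confirm(agreed_requirement_ids, validation_evidence):
    """validation_evidence: dict mapping requirement id -> True/False
    (whether retained validation results cover that requirement)."""
    uncovered = [r for r in agreed_requirement_ids
                 if not validation_evidence.get(r, False)]
    # A process may only be confirmed if it is fully validated for this use.
    return len(uncovered) == 0, uncovered
```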
5.11 Deployment
Once the process has been accepted it can be deployed for use in the examinations which make up
the investigation. Any and all deviations from expected results or behaviours should be recorded and
remedial actions taken. Where remedial actions involve a change to the process, or where the deviation from expected behaviour conflicts with previous validation results, revalidation may be required.
5.11.1 Tool choice

During deployment of a process, the investigator may have to choose amongst several tools which
provide the same, or similar, functionality. Although the work instruction should contain guidance on
how this choice should be made, the investigator should also maintain a contemporaneous record of the
tool choices made and the factors which influenced these choices.
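One illustrative way of keeping such a contemporaneous record of tool choices is sketched below; the field names and example values are assumptions, and any equivalent record satisfying 3.5 would serve.

```python
# Illustrative contemporaneous record of a tool choice during deployment
# (5.11.1 and 3.5). Field names and example values are assumptions.
from datetime import datetime, timezone

def record_tool_choice(log, investigator, tool, alternatives, factors):
    log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "investigator": investigator,
        "tool_chosen": tool,
        "alternatives_considered": alternatives,
        "influencing_factors": factors,   # e.g. interface available, known risks
    })

log = []
record_tool_choice(log, "examiner A", "write-blocker model W (USB)",
                   ["write-blocker model S (SATA)"],
                   ["exhibit presented a USB interface only"])
```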
6 Assurance Models
6.1 Overview
Individual assurance stages should, as far as practically possible, be carried out independently of the development of the processes (i.e. carried out by someone other than the developers), in order to provide an additional level of confidence that processes are truly adequate for their intended use. To achieve this, steps should be taken to ensure that the assurance stages are not unduly influenced by design and implementation considerations.
This clause discusses assurance models which can be used to assist in ensuring that undue influence
is not introduced. In general, assurance may be carried out within the organization which will use the
processes (see 6.2), by another organization (see 6.3) or using a combination of the two mechanisms
(see 6.4). Whichever model is used, proper evidence of assurance (see Clause 7) should be produced and
maintained.
7 Production of evidence for assurance

7.1 Overview
Digital evidence can be challenged on the grounds that methods used to produce it were not fit for
purpose. For this reason, it is important that evidence of fitness for purpose can be produced. This can
be achieved by applying the assurance stages, described in Clause 5, and maintaining thorough records
of the assurance process. A method for producing this evidence of assurance is described below.
Annex A
(informative)
Examples
Bibliography
[1] ISO/IEC 17024:2012, Conformity assessment — General requirements for bodies operating
certification of persons
[2] ISO/IEC 17043:2010, Conformity assessment — General requirements for proficiency testing
[3] ISO/IEC/IEEE 29148:2011, Systems and software engineering — Life cycle processes — Requirements
engineering
[4] ISO/IEC 17025:2005, General requirements for the competence of testing and calibration laboratories
[5] ISO/IEC 27004:2009, Information technology — Security techniques — Information security
management — Measurement