EDIT CHECKS CREATION,
VALIDATION,
PROGRAMMING AND UAT
DR. ANKITA SUMAN
OBJECTIVES
The primary objective of CDM is to ensure timely delivery of
high-quality data that satisfies both GCP requirements and
the statistical analysis and reporting requirements.
CREATING EDIT CHECK SPECIFICATIONS
⦿ Edit check specifications are crucial to identify invalid
data, missing data, inconsistent data, and out-of-range
values.
⦿ Edit check specification planning requires information
from a number of sources and should be performed with a
comprehensive strategy for specification development in
place prior to creating the initial draft.
SOURCES OF INFORMATION FOR EDIT CHECK
SPECIFICATIONS
• Study Protocol
• Data Management Plan
• Annotated CRFs and database design
documentation.
• Standard edit check macros
• Biostatisticians
• Study personnel.
SAMPLE EDIT CHECK SPECIFICATION TABLE
HIERARCHICAL VIEW OF EDIT CHECKS
• General clinical data checks
• End-point checks
• Safety checks
• Protocol compliance checks
• Programmed checks
• Manual checks
• Listing checks
• External checks
TYPES OF CHECKS
⦿ Missing values
⦿ Missing CRF pages
⦿ Range checks
⦿ Checks for duplicates
⦿ Logical inconsistencies across single CRF
⦿ Inconsistencies across CRF pages or modules
⦿ Checks of external data
⦿ Protocol violations.
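Several of the check types above can be sketched as small functions run over entered records. The following is a minimal illustration in Python, not a production implementation; the field names (`subject_id`, `SEX`, `AGE`, `VISIT`) are hypothetical:

```python
def missing_value_check(records, field):
    """Flag subjects where a required field is blank or absent."""
    return [r["subject_id"] for r in records if not r.get(field)]

def range_check(records, field, low, high):
    """Flag subjects whose value falls outside the expected range."""
    return [r["subject_id"] for r in records
            if r.get(field) is not None and not (low <= r[field] <= high)]

def duplicate_check(records, key_fields):
    """Flag subject/visit combinations entered more than once."""
    seen, dupes = set(), []
    for r in records:
        key = tuple(r[f] for f in key_fields)
        if key in seen:
            dupes.append(key)
        seen.add(key)
    return dupes
```

Logical-inconsistency and cross-page checks follow the same pattern but compare fields within or across CRF modules.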
⦿ FRONT-END CHECKS
Edit checks that are triggered upon data entry are often
referred to as front-end edit checks.
Front-end edit checks are typically limited to a single field or
CRF page. Example: a flag or warning that appears when an
entry operator attempts to enter an impossible visit date, such
as February 30, or a date in the future.
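A front-end date check of this kind might look like the following sketch, assuming visit dates arrive as ISO-formatted strings; the function name and messages are illustrative:

```python
from datetime import date

def check_visit_date(visit_date_str):
    """Front-end style check: reject impossible or future visit dates.
    Returns a warning message, or None if the date is acceptable."""
    try:
        # An impossible date such as "2023-02-30" fails to parse
        visit = date.fromisoformat(visit_date_str)
    except ValueError:
        return "Invalid date: this day/month combination does not exist"
    if visit > date.today():
        return "Visit date cannot be in the future"
    return None
```

In a real EDC tool this logic fires as the operator tabs out of the field, so the error is caught before the page is saved.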
⦿ BACK-END CHECKS
Edit checks across multiple forms are often known as
back-end edit checks.
Back-end edit checks are typically more complicated and
therefore more difficult to program.
Example: a check that notifies CDM personnel that a BMI
(body mass index) entry is not consistent with the
subject's reported height and weight.
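The BMI consistency check described above could be sketched as follows, using the standard formula (weight in kg divided by height in metres squared); the function name and tolerance are illustrative assumptions:

```python
def check_bmi_consistency(height_cm, weight_kg, reported_bmi, tolerance=0.5):
    """Back-end style check: flag a BMI entry that disagrees with
    the BMI derived from the subject's height and weight.
    Returns a finding message, or None if consistent."""
    derived = weight_kg / (height_cm / 100) ** 2
    if abs(derived - reported_bmi) > tolerance:
        return (f"Reported BMI {reported_bmi} inconsistent with "
                f"derived BMI {derived:.1f}")
    return None
```

Because it pulls height, weight, and BMI from different fields (possibly on different CRF pages), this check runs in a back-end batch rather than at data entry.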
POINTS TO REMEMBER
⦿ The edit check specifications document should be consistent
in its wording and conventions.
⦿ The following are some examples of areas that should be
reviewed for consistency within an edit checks
specifications document.
✔ Use generic terms, such as "Subject" rather than "Patient,"
although a global change to “Patient” may need to be made for
some studies.
✔ Note field names exactly as they are provided on the
corresponding CRF (e.g., "Date of Birth" rather than "Birth
Date," if “Date of Birth” is how the field is identified on the
corresponding CRF).
✔ All descriptions in the edit check specifications document should
be stated in complete sentences, using consistent terms such as
"Visit Date must be present," or "If Not Done is marked, Result
must be blank."
✔ Use consistent formatting conventions such as capitalizing all
field names, or adding brackets only when a sentence is not
clear without them (e.g., “A response must be marked for [Were
any Adverse Events experienced?]”).
✔ Note any exceptions or special instructions for the reviewer
(e.g., “NOTE: Do not query if page is lined through.”).
CREATING TEST DATA
• Testing edit checks with test data
• Testing feedback-loop process
• Documentation
• Quality control
USER ACCEPTANCE TESTING (UAT)
User acceptance testing (UAT) is the decisive step before
study start-up can be initiated in a Clinical Data Management
system.
It can be categorized as:
i) Data entry application/EDC tool UAT
ii) Edit check UAT
EDC TOOL UAT
The data entry application/EDC tool should be designed:
to satisfy the processes assigned to these systems for use
in the specific study protocol (e.g., record data in metric
units, blind the study),
to prevent errors in data creation, modification, maintenance,
archiving, retrieval, or transmission (e.g., inadvertently
unblinding a study).
USER-ACCEPTANCE TESTING
⦿ User acceptance testing of the EDC tool is performed after
completion of the EDC tool build and prior to release into
the production environment.
⦿ User requirement specifications should be consulted
before the initiation of application testing.
⦿ The UAT Test Plan should be devised to identify the objective
of each test, required input data, tab rules, expected
outputs, documentation to be kept, and required signatures.
USER-ACCEPTANCE TESTING
⦿ Test data/dummy data should be entered in the application
as per the test plan.
Test data should include:
Missing values
Out of range values
Representative data
Incorrect values etc.
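The test-data categories above can be captured as a small table of UAT cases run against a hypothetical range check; the field name `SYSBP` (systolic blood pressure) and its limits are illustrative assumptions, not from any specific study:

```python
# One UAT case per test-data category from the plan
test_cases = [
    {"value": None,  "kind": "missing",        "expect_query": True},
    {"value": 400,   "kind": "out-of-range",   "expect_query": True},
    {"value": 120,   "kind": "representative", "expect_query": False},
    {"value": "abc", "kind": "incorrect",      "expect_query": True},
]

def run_sysbp_check(value, low=60, high=250):
    """Hypothetical range check for SYSBP.
    Returns True when a query should fire."""
    if value is None:
        return True          # missing value -> query
    if not isinstance(value, (int, float)):
        return True          # non-numeric entry -> query
    return not (low <= value <= high)
```

During UAT each case is entered and the actual query behaviour is compared against `expect_query`; any mismatch goes into the test findings log.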
USER-ACCEPTANCE TESTING
(PROCESS)
⦿ Tests should be run/reviewed as per the UAT Test Plan.
Any deviation/finding should be captured in the test
findings log.
⦿ All screens, logs and outputs should be checked for errors
against expected outputs.
⦿ If the test is performed as per the plan and yields the expected
results, the test can be signed off by the tester and programmer.
⦿ If the test fails, findings should be submitted to the
respective programmer so that the necessary
corrections/modifications are incorporated in the
application.
EDIT CHECK UAT
• Overall, the process flow consists of reviewing edit check
outputs run against test/dummy data, which may
contain invalid data.
• By examining the outputs, it can be determined whether the
programmed edit checks are triggered as per the program
specifications.
• If not, findings are routed to the respective programmer,
who modifies/corrects the programmed edit
checks.
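The comparison of triggered checks against the test plan can be sketched as a set difference; the query identifiers and function name here are hypothetical:

```python
def uat_compare(triggered_queries, expected_queries):
    """Compare edit-check firings against the UAT test plan.
    Returns findings to route back to the programmer."""
    missed = expected_queries - triggered_queries    # check did not fire
    spurious = triggered_queries - expected_queries  # check fired wrongly
    findings = []
    for q in sorted(missed):
        findings.append(f"MISSED: {q} expected but not triggered")
    for q in sorted(spurious):
        findings.append(f"SPURIOUS: {q} triggered unexpectedly")
    return findings
```

An empty findings list means every programmed check fired exactly as specified; anything else is logged and sent to the programmer for correction.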
EDIT CHECK UAT
Edit check UAT is performed mainly to check for the listed
types of data issues.