
105 FOUNDATIONS OF ACL

CONCEPTS AND PRACTICES

PIFRA Training Wing

Muhammad Yousuf
Azmatullah Khan


Edition 4.1.2 January 26, 2004


Introductions

 Name / Job Title


 Department name
 ACL Experience/Computer knowledge
 Is ACL already being used at your office?
 What are your reasons for attending this class?
 Do you have any specific objectives for using ACL?
Class Administration

 Class Hours
 Breaks
 Lunch
 Telephones
 Sign-in Sheet
 Training Materials
 Comprehensive test
 Course Evaluation
Training Objectives

By the end of this training you will be able to:

 Explain the fundamentals of data concepts, ACL basics, and data analysis cycles
 Describe the three stages of data access
 Create tables to access data
 Display and filter data
 Verify data integrity, create expressions, compare data from different files/systems, profile data, and produce data reports
 Employ ACL to add value to audits
CAATs – General Orientation

 DAGP’s Field Standards


 What are CAATs
 Why use CAATs
 Factors for using CAATs
 Benefits
 Strategies for CAATs
 Types of CAATs
 Common tools for performing CAATs
 CAATs Tools Comparison
DAGP’s Field Standards
• The auditor, in determining the extent and scope of the
audits, should study and evaluate the reliability of the
internal control
• Where accounting or other information systems are
computerized, the auditor should determine whether internal
controls are functioning properly to ensure the integrity,
reliability and completeness of the data, and the information
system.
• Competent, reliable, relevant and reasonable evidence
should be obtained to support the auditors’ judgment and
conclusions regarding the organization, program, activity
or function under audit
• When computer-based system data are an important
part of the audit and the data reliability is crucial to
accomplishing the audit objective, auditors need to
satisfy themselves that the data are reliable and
relevant.
What are CAATs?

 Computer-based tools and techniques that give auditors the ability to maximize their efficiency and effectiveness in the audit function
 CAATs bring together electronic data, people, and software tools to achieve the goal of creating value-added information
Why use CAATs?

 Digital information age


 Ability to collect evidence
 Absence of input documents
 Absence of visible audit trail
 Support for audit findings
Factors for using CAATs?
 Increased audit coverage
 Dependence on IS Function
 Records only in electronic format
 Computer knowledge, expertise and experience
of the auditor
 Availability of CAATs and suitable computer
facilities
 Impracticability of manual tests
 Effectiveness and efficiency
 Timing
Benefits

 Increase Audit Economy and Efficiency


 Process huge volumes of transactions
 Better coverage and assurance obtained
 Improves audit effectiveness
 Join and compare different files to quantify
findings
 Enhances image of auditing
 Move focus to areas where risks were noted
 Back up findings with detail
Strategies for CAATs?

 Identify goals and objectives


 Identify information requirements to address
 Understand the data
 Understand the systems generating the data
 Develop working knowledge of CAATs
Types of CAATs?

 Word processing
 Spreadsheet
 Database
 Integrated audit software
 Custom reporting software
 Real time testing programs
Common tools for performing CAATs?

 Excel
 Access
 Audit Command Language (ACL)
 Interactive Data Extraction and Analysis (IDEA)
 SQL
CAATs Tools Comparison

Tool         Capacity                       Ease of use                   Analytic capabilities
Excel        65,536 rows by 256 columns;    Standard, easy-to-use         Data analysis toolkit;
             255 chars per field            Office application            built-in functions
Access       2 GB database;                 Training is required          Built-in functions;
             255 fields (columns)                                         great for joining tables
ACL / IDEA   Unlimited                      Requires basic training;      Complete set of
                                            menu based                    pre-programmed analyses
SQL          1,048,516 terabytes;           Advanced training required    Built-in functions;
             1,024 columns                                                great for joining tables
Training Modules

1. Fundamentals
2. Data Access
3. Expressions
4. Data Integrity Verification
5. Data Analysis
6. Reporting Results
105 FOUNDATIONS OF ACL
CONCEPTS AND PRACTICES

Fundamentals
Module 1
In this Module

 What is ACL?
 Data concepts
 The ACL Interface
 ACL basics
 Best Practice – File Organization
 Elements of Data Analysis
 The data analysis cycle
What is ACL?

 Software used to read and analyze data


 Software used for
 External / Internal Auditing
 Fraud Detection
 Reporting
 Data-mining
 Continuous Monitoring
 Operational Analysis
 Data Migration
 Data Harmonization
What can I do with ACL?

 Gather information for decision-making


 Retain data integrity – READ ONLY
 Process data from different systems
 Process large files rapidly
 Test 100% of data, rather than sample
 Automate analytical procedures
 Maintain records of your work
Data Concepts

 Letters, numbers, or symbols representing information, suitable for processing, communicating, or interpreting.

 What does this symbol represent?

        V
Data Concepts

 Data is letters, numbers, or symbols representing information,


suitable for processing, communicating, or interpreting.

 What does this string of data represent?

04092003
 This string could be interpreted as
 An Amount (NUMERIC) $4,092,003 or $40,920.03
 An Invoice Date (DATE) April 9, 2003 or September 4, 2003
 An Account Number (CHARACTER) 04092003
Data Structure

 Files
 Named collection of records
stored or processed as an
individual entity
 Records
 Subset of a file
 Collection of related fields
containing data items
grouped for processing
 Fields
 Subset of a record
 A specified area of a record
used for storing a particular
class of data
File and Fields

Example record:

Cust No.   Name              Date         Trans_Amount
308250     Lawrence O'Mara   10/05/2000   367.12

Field layout (Record_Length 45):

Field          Data Type   Start   Length   Decimals
Cust No.       TEXT        1       9        -
Name           TEXT        10      17       -
Date           DATE        27      10       -
Trans_Amount   NUMERIC     37      9        2
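
In ACL, these start/length positions become field definitions in the table layout. A minimal ACLScript sketch of the layout above (type keywords and the date-format clause vary by ACL version; names are illustrative, not the wizard's exact output):

  COMMENT Define the four fields of the 45-byte record shown above
  DEFINE FIELD CustNo ASCII 1 9
  DEFINE FIELD Name ASCII 10 17
  DEFINE FIELD Trans_Date DATETIME 27 10
  DEFINE FIELD Trans_Amount NUMERIC 37 9 2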
The ACL Interface

 Welcome tab
 Project Navigator
 Status bar
ACL Basics

 A project gives a user the means to organize the


data analysis activities and document the results
of the analysis
 The elements associated with a project are
 Tables
 View
 Scripts
 Logs
 Workspaces
 Folders
ACL Basics

 Tables
 Data Source
 Table Layout
 View
 Scripts Data Source
 Logs Collection of related records
stored as a unit with a single
 Workspaces
name. Data source does not
 Folders reside inside an ACL project.
ACL Basics

 Tables
 Table layout
 Data source
 View
 Scripts Table Layout
 Logs Contains everything needed
to read a data source.
 Workspaces
 Folders
ACL Basics

 Tables
 Table layout
 Data Source
 View
 Scripts Views
 Logs The visual presentation of a
file. A table can have one
 Workspaces
or more views.
 Folders
ACL Basics

 Tables
 Table layout
 Data Source
 View
 Scripts Table
 Logs A table layout is linked to a
data source. Contents of a
 Workspaces
file are displayed in views.
 Folders
ACL Basics

 Tables
 Table layout
 Data Source
 View Scripts
 Scripts A macro or batch file – a
 Logs series of ACL commands
that can be executed
 Workspaces without user interaction.
 Folders
ACL Basics

 Tables
 Table layout
 Data Source
 View
 Scripts Logs
 Logs History of commands and
results from your analysis.
 Workspaces
 Folders
ACL Basics

 Tables
 Table layout
 Data Source
 View
 Scripts Workspaces
 Logs Field definitions that are
saved for reuse with other
 Workspaces
tables.
 Folders
ACL Basics

 Tables
 Table layout
 Data Source
 View
 Scripts Folders
 Logs Used to organize your work
in an ACL project.
 Workspaces
 Folders
Best Practice – File Organization

 Organize your hard drive by creating a new Windows folder for each ACL Project
 The location of your ACL Project is your Default Working Folder
 All new files created by ACL as a by-product of your analysis will be stored automatically in the Default Working Folder

Example folder structure:

Local Disk (C:)
  ACL Audit Data
    AR 2003 North America
      AR 2003 North America.ACL      (ACL Project, 9 KB)
      AR 2003 North America.LOG      (Text Document, 46 KB)
      AR 2003.dat                    (DAT File, 129 KB)
    Payroll 2003 All Regions
      Empmast.dat                    (DAT File, 56 KB)
      Payroll 2003 All Regions.ACL   (ACL Project, 12 KB)
      Payroll 2003 All Regions.LOG   (Text Document, 5 KB)
      Payroll.dat                    (DAT File, 135 KB)
Elements of Data Analysis

 Commands
 Expressions
 Filters
 Computed fields
 Functions
 Variables
Elements of Data Analysis

 Commands
 Expressions
Commands
 Filters
 Computed fields
Predefined routines that can
be used for Verification and/or
 Functions Analytical purposes.
 Variables Some parameters are optional.
Most Commands are located
under Data & Analyze on the
menu bar.
Results can be sent to file,
screen, print, or graph.
Elements of Data Analysis

 Commands
 Expressions
 Filters
 Computed fields
 Functions Expressions
 Variables Statements used to create
filters and computed fields.
Filters create logical
conditions (True/False).
Computed fields create
values that don’t exist.
Elements of Data Analysis

 Commands
 Expressions
 Filters
 Computed fields
 Functions Functions
 Variables Predefined routines that are
incorporated into expressions.
There are over 80 functions
that can be used to achieve
either simple or complex
objectives.
Elements of Data Analysis

 Commands
 Expressions
 Filters
 Computed fields
 Functions Variables
 Variables Named memory space that
stores data. Can be
character, numeric, date, or
logical type.
The Data Analysis Cycle

 Planning
 Data access
 Data integrity
verification
 Data analysis
 Reporting results
The Data Analysis Cycle

 Planning
 Data access
 Data integrity
verification Planning
 Data analysis Plan your work before you
 Reporting results start.
Consider the steps required
to achieve the objectives of
the remaining 4 phases.
Formulate clear objectives,
develop concise strategies,
and budget the right
amount of time.
The Data Analysis Cycle

 Planning
 Data access
 Data integrity
verification
 Data analysis
Data access
 Reporting results Access the data outlined in
your strategic plan.
Includes locating, requesting,
and transferring the data.
Data access will be discussed
in detail in module 2.
The Data Analysis Cycle

 Planning
 Data access
 Data integrity
verification Data integrity verification
 Data analysis Test the integrity of the data you
 Reporting results receive. If you don’t, results may
be incomplete or incorrect.
ACL provides many tools,
including commands and
expressions, which make it easy
to identify data integrity errors.
Data integrity verification will be
discussed in detail in module 4.
The Data Analysis Cycle

 Planning
 Data access
 Data integrity
verification
 Data analysis
Data analysis
 Reporting results
Perform the tests necessary
to achieve your objectives.
Use commands, filters,
computed fields, etc.
Data analysis will be
discussed in detail in
module 5.
The Data Analysis Cycle

 Planning
 Data access
 Data integrity
verification Reporting results
 Data analysis
Create reports from your results
 Reporting results and document your work.
Reports can be printed or sent
to a file. File types include text,
HTML and Excel. Some reports
can be sent to a graph.
Reporting results will be
discussed in detail in module 6.
105 FOUNDATIONS OF ACL
CONCEPTS AND PRACTICES

Data Access
Module 2
In This Module

 Locating data
 Acquiring data
 Accessing/importing data
 Creating tables to access data
 Reusing table layouts
Locating Data

 Partner with IS
 Educate yourself and your staff
 Identify available data:
   Meet with IS and auditee
   Review data dictionary
   Obtain reports from area being analyzed
   Meet with data entry personnel
 Determine available data formats:
   Excel, Access, XML
   ODBC-compliant data sources
   dBASE files
   Flat files
   Report (print-image) files
   Delimited files
Acquiring Data

 Determine your objectives
 Transferring the data:
   Access the production database
   Access to a copy of the data
   User access to source data
 Request data:
   Data request letter
   Summary report
   Record layout
   Control totals
Accessing Data

Create an ACL Project to Direct access


hold your table layouts.  Automatic layout
Create a table layout  Manual layout
within an ACL Project to  External definition
access your data. Import and copy
 Automatic layout
 Manual layout
Creating tables to access data

Before you can analyze data with ACL, you must


create tables to access the data. Slides to follow
will describe how to use the Data Definition
Wizard to access data from different data
sources.
Flat File – Direct Access, Manual Layout

 Direct access
 Automatic layout
 Manual layout
 External definition
 Import and copy
 Automatic layout
 Manual layout
Flat Files – Direct Access, Manual Layout

Obtain a copy of record


layout from data provider
Review data layout
Define flat file – Data
Definition Wizard (DDW)
Confirm file properties and
fields
Name the fields and select
data types
Create and save the table
layout
Define overlapping fields (if
applicable)
MS Excel file – Import and Copy, Automatic Layout

 Direct access
 Automatic layout
 Manual layout
 External definition
 Import and copy
 Automatic layout
 Manual layout
MS Excel file – Import and Copy, Automatic Layout

Confirm first row contains field


names
Confirm properties of each
column; format if necessary*
Select Excel file and define
with DDW
Select worksheet or named
range
Create and save the table
* Note: if a data field contains
both numeric and alphanumeric
data, only one type will be
available after import
ODBC – Import and Copy, Automatic Layout

 Direct access
 Automatic layout
 Manual layout
 External definition
 Import and copy
 Automatic layout
 Manual layout
ODBC – Import and Copy, Automatic Layout

Obtain read access to


database
Confirm appropriate ODBC
driver is available and installed
Launch ODBC Wizard and
select the applicable driver
Select the table or view to
define
Select the fields and records
Filter data if necessary – SQL
query
Create and save the table
Refresh your data (as needed)
– remember to back up the file
dBASE file – Direct Access, Automatic Layout

 Direct access
 Automatic layout
 Manual layout
 External definition
 Import and copy
 Automatic layout
 Manual layout
dBASE file – Direct Access, Automatic Layout

1. Select the dBASE file and define with DDW


2. Create and save the table
 If the file came from an application that uses DBF-type files, check the record_deleted column for any records marked for deletion
Report File - Import and Copy, Manual Layout

 Direct access
 Automatic layout
 Manual layout
 External definition
 Import and copy
 Automatic layout
 Manual layout
Report File – Import and Copy, Manual Layout

Print first 3 pages of report


Plan – identify
records/fields to define
Select the report file and
define with Data Definition
Wizard
Define header information
(if applicable)
Define detail information
Define footer information
(if applicable)
Create and save the table
Report File – Import and Copy, Plan Phase
Header Record 1 contains 1 field
 - Field name: rptdte
 - Identifier used to select the field: "As At"

Header Record 2 contains 1 field
 - Field name: classdesc
 - Identifier used to select the field: "Product Class"

Detail Record contains 5 fields
 - Field names: prodno, proddesc, qty, unitcost, totcost
 - Identifier used to select lines on which these fields appear: the prodno area – numeric, nine digits long, format "999999999"

Footer Record contains 2 fields
 - Field names: subqty and subcost
 - Identifier used to select lines on which these fields appear: "Class Totals:"
Delimited file – Import and Copy, Automatic Layout

 Direct access
 Automatic layout
 Manual layout
 External definition
 Import and copy
 Automatic layout
 Manual layout
Delimited file – Import and Copy, Automatic Layout

Confirm if column headers


appear on first row
Identify field separator and
text qualifier – if applicable
Use DDW to define table
layout
Confirm field properties
and edit if applicable
Save settings and create
table layout
External Definition – Direct Access

 Direct access
 Automatic layout
 Manual layout
 External definition
 Import and copy
 Automatic layout
 Manual layout
External Definition – Direct Access

 AS/400 data extract
   FDF file – record layout
 Extract produced through COBOL
   Copy book – record layout
 Extract produced through PL/1
   Copy book – record layout
 Why? You can import the record layout and build the table layout from it
 Great to use for extracts with many fields – 50 or more
Reusing Table Layouts

 Reuse table layouts to save time
 Useful when you:
   Have multiple files with the same structure
   Regularly receive files with the same structure
 To reuse a table layout, you can:
   Duplicate and rename a table layout
   Link a table layout to a new source
   Import a table layout from another ACL project
   Export a table layout
105 FOUNDATIONS OF ACL
CONCEPTS AND PRACTICES

Expressions
Module 3
In This Module

 Understanding expressions
 Filters
 Computed fields
Expressions

 A set of operators and values used to:
   Perform calculations
   Specify logical conditions
   Create values that don't exist in the data
 Can be a combination of:
   Data fields
   Operators
   Constants
   Functions
   Variables
Expressions – Filters & Computed Fields

DECISION: When creating an expression, will the output be a filter (True/False) or a computed field?

 Filters (named or unnamed):
   Global filters (view filters) – created via Edit View Filter
   Command filters (affect a single command's result) – found on every command dialog
 Computed fields (named only) – created via Edit Table Layout > Add a New Expression:
   Unconditional – the same expression on every record
   Conditional – a different expression for each condition
Filters

 A logical expression – true (T) or false (F)


 Lets you select the data you work with
 Similar to a query
 Two kinds of filters:
 Global
 Command
 Global filters can also be activated using:
 Quick Filter
Filter elements

 Fields
 Operators
 Values
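
A minimal ACLScript sketch of a global filter built from these three elements (field and value names are hypothetical; syntax can vary slightly by ACL version):

  COMMENT Keep only large transactions for a single product
  SET FILTER TO Trans_Amount > 300.00 AND Product = "Toner"
  COMMENT Subsequent commands see only records where the expression is True
  COUNT
  COMMENT Remove the filter when done
  SET FILTER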
Computed Fields

 A named expression that calculates results to create additional data
 A virtual field that lets you perform calculations
 Does not affect or change the original data
 Can be character, numeric, date, or logical
 Four main uses for computed fields:
   Performing mathematical computations
   Converting fields from one data type to another
   Making word substitutions
   Creating logical tests (filters)
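
For instance, a mathematical computation expressed as a computed field might look like this in ACLScript (field names are hypothetical):

  COMMENT Virtual field: recalculated on the fly, original data untouched
  DEFINE FIELD Total_Cost COMPUTED Qty * Unit_Cost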
105 FOUNDATIONS OF ACL
CONCEPTS AND PRACTICES

Data Integrity Verification


Module 4
In This Module

 About data integrity


 Verifying data integrity
 Checking validity
 Confirming control totals
 Checking for correct bounds
 Identifying missing items
 Identifying duplicates
 Testing for reliability
 Other data integrity verification tests
What is Data Integrity?

 Data integrity means that your table contains:


 No corrupt data or erroneous field definitions
 Only valid dates
 Numeric totals that reconcile to controls
 All the data and only the data you requested
 Only unique records
 Calculated fields that contain correct values
 Data in fields where it is expected
 Field relationships that are consistent and logical
 Data that looks reasonable
Sources of Error

 Input
 Processing
 Extraction
 Conversion
 Transmission
 Definition
Sources of Error

 Input
 Processing
 Extraction
 Conversion
 Transmission Input
 Definition Includes incorrect data,
omitted items, unwanted
items, or invalid data. Sign
of faulty input validation
controls.
Sources of Error

 Input
 Processing
 Extraction
 Conversion
 Transmission Processing
 Definition Undetected flaws in
programming can cause
validity errors.
Sources of Error

 Input
 Processing
 Extraction
 Conversion
 Transmission Extraction
 Definition The party responsible for
extracting requested data
may misinterpret your
request, mistakenly or
deliberately extract the
wrong data.
Sources of Error

 Input
 Processing
 Extraction
 Conversion
 Transmission Conversion
 Definition Programmers often convert
data from EBCDIC to ASCII.
This can corrupt fields
native to the mainframe
environment.
 ACL can read and process EBCDIC.
Sources of Error

 Input
 Processing
 Extraction
 Conversion
 Transmission Transmission
 Definition The process of transferring
data can corrupt data.
Sources of Error

 Input
 Processing
 Extraction
 Conversion
 Transmission Definition
 Definition Mistakes in your table
layout can cause validity
errors.
Checking Validity

 Ensure your table is valid:
   Character fields contain only printable characters
   Numeric fields contain only numeric digits, decimals, minus signs, and currency symbols
   Date fields contain valid dates
 To check validity, use the Verify command
 If errors are found:
   Determine if the errors are in the table layout or in the data itself
   If errors are in the table layout, fix it and check validity again
   If errors are in the data, either have the data re-sent or try to fix them
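
A minimal Verify sketch (field names are hypothetical):

  COMMENT Stop after 10 validity errors and report them on screen
  VERIFY FIELDS CustNo Name Trans_Date Trans_Amount ERRORLIMIT 10 TO SCREEN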
Confirming Control Totals

 Ensure data meets the specifications in the summary report
 Compare control totals generated in ACL to those provided in the summary report
 To confirm control totals, use:
   The Count command
   The Total command
   The Statistics command
 If control totals don't match, it usually means data was not extracted properly from the source data:
   If there are too many records, use a filter to extract the required data
   If there are too few records, have the file re-sent
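
A sketch of the three commands against a transactions table (field name hypothetical):

  COMMENT Record count to compare with the summary report
  COUNT
  COMMENT Net total of the amount field
  TOTAL FIELDS Trans_Amount
  COMMENT Count, totals, min/max, and more in one pass
  STATISTICS ON Trans_Amount TO SCREEN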
Checking for Correct Bounds

 Ensure upper and lower bounds of data match your request:
   Numeric bounds
   Date bounds
 To check for correct bounds, use:
   The Statistics command
   The BETWEEN( ) function
 If your records are not within specified bounds:
   If your table contains extraneous records, extract the valid records into a new table
   If there are insufficient records, have the data re-sent
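
A sketch combining the two (the requested range here is hypothetical):

  COMMENT Min and Max from Statistics reveal the actual bounds
  STATISTICS ON Trans_Amount TO SCREEN
  COMMENT Isolate records falling outside the requested range
  SET FILTER TO NOT BETWEEN(Trans_Amount, 0.01, 100000.00)
  COUNT
  SET FILTER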
Identifying Missing Items

 Identify possible  If missing items are


records or values detected:
omitted on the source  Determine if missing
system items are critical to your
analysis
 To identify missing
 Inform the data provider
items, use: of your findings
 The Gaps command
 To identify unpopulated
character fields, use:
 The ISBLANK( ) function
 Filters – date fields and
numeric fields
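
A sketch of both techniques (field names hypothetical):

  COMMENT Find breaks in a numeric sequence such as cheque numbers
  GAPS ON Cheque_No PRESORT TO SCREEN
  COMMENT Find character fields that should never be empty
  SET FILTER TO ISBLANK(Name)
  COUNT
  SET FILTER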
Identifying Duplicates

 Duplicates occur often and can be valid
 Check for duplicate records and for duplicates in fields that shouldn't contain duplicates
 To identify duplicates, use the Duplicates command
 If you find duplicate entries:
   Examine findings in context to determine their validity
   Contact the data provider
 You can create a table that does not contain duplicates by using the Summarize command
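
A sketch of both commands (table and field names hypothetical):

  COMMENT Report records sharing the same invoice number
  DUPLICATES ON Inv_No OTHER CustNo Inv_Amount PRESORT TO SCREEN
  COMMENT Build a de-duplicated table: one record per invoice number
  SUMMARIZE ON Inv_No TO "Unique_Invoices" PRESORT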
Testing for Reliability

 Always test values derived from calculations
 Ensure that fields based on calculations are free of errors
 To test for reliability, use a filter or computed fields
 If your filter or computed fields reveal errors, i.e. they don't match the original field:
   Contact the data provider
   Use the computed field in your analysis rather than the original field
   If errors are pervasive or significant, have the data re-sent
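
A sketch of a reliability test: recompute the stored total, then filter for mismatches (field names hypothetical):

  DEFINE FIELD Recalc_Total COMPUTED Qty * Unit_Cost
  COMMENT Any record where the stored total disagrees with the recalculation
  SET FILTER TO Recalc_Total <> Tot_Cost
  COUNT
  SET FILTER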
Other DIV Tests

 Many other ways of testing for data integrity:


 Test for reasonableness
 Test common sense assumptions about your data
 Use various commands
 Test for relationships
 Ensure that relationships between fields are consistent.
 Use filters or computed fields
Summary

To Check          Use                  To Ensure
Validity          VERIFY               Data and table are valid
Control totals    COUNT, TOTAL,        Record counts and numeric fields
                  STATISTICS           agree to control totals
Bounds            STATISTICS,          Dates within bounds;
                  BETWEEN()            filter data within bounds
Missing items     GAPS,                Data is not missing;
                  ISBLANK()            test for blanks where data is expected
Duplicates        DUPLICATES           Unique transactions
Reliability       Computed fields      Valid processing
Reasonableness    Various commands     Data meets expectations
Relationships     Various commands     Data is consistent
105 FOUNDATIONS OF ACL
CONCEPTS AND PRACTICES

Data Analysis
Module 5
In This Module

 Profiling data
 Isolating data
 Reordering tables
 Combining tables
Profiling Data

 Overview of data
 Can help identify trends and anomalies
 Create summary results for files of any size
 Five commands to profile data:
   Classify
   Summarize
   Cross-tabulate
   Stratify
   Age
The Classify Command

 Groups unique character


field values
 With Classify, you:
 Specify a character field
 Specify fields to subtotal
(optional)
 Output to screen, print,
graph, or file
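
A minimal Classify sketch (field names hypothetical):

  COMMENT Group by product and subtotal the invoice amounts
  CLASSIFY ON Product SUBTOTAL Inv_Amount TO SCREEN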
The Summarize Command

 Groups unique character


or date field values
 Can summarize on one or
more fields
 With Summarize, you:
 Specify fields to summarize
 Specify fields to subtotal
(optional)
 Select other fields
(optional)
 Output to screen, print, or
file
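
A minimal Summarize sketch (table and field names hypothetical):

  COMMENT Group by customer and date, subtotal amounts, carry the name along
  SUMMARIZE ON CustNo Inv_Date SUBTOTAL Inv_Amount OTHER Name TO "Cust_Summary" PRESORT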
Classify vs Summarize

Functional Specification                            Classify   Summarize
Calculates # of times each key field value
appears in the table                                Yes        Yes
Sequences records on key field before
producing result                                    Yes        No
Can produce a graph                                 Yes        No
Primary location for command processing             RAM        Hard disk
Number of key fields                                One        One or more
Allows output of other fields                       No         Yes
Profiles on character fields                        Yes        Yes
Profiles on date fields                             No         Yes
Computes numeric subtotals                          Yes        Yes
The Cross-tabulate Command

 Logical extension of the Classify command
 Produces a report of two or more character fields
 Arranges fields into columns and rows
 With Cross-tabulate, you:
   Specify a character column
   Specify one or more character rows
   Specify fields to subtotal (optional)
   Include count (optional)
   Output to screen, graph, print, or file
The Stratify Command

 Groups records into rangesbased


With on
Stratify,
numericyou: field
values  Specify a numeric field

 Before you Stratify:  Specify the number of intervals or


enter free intervals
 Find the range of values in the numeric field
 Specify (Statistics)
fields to subtotal (optional)
 Select the minimum and maximum
 values
Output of
to the range
screen, to print, or
graph,
stratify on file
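
A sketch (the bounds here are hypothetical; take them from the Statistics output):

  COMMENT Find the actual range first
  STATISTICS ON Inv_Amount TO SCREEN
  COMMENT Ten equal bands between the chosen bounds
  STRATIFY ON Inv_Amount MINIMUM 0 MAXIMUM 1000 INTERVALS 10 TO SCREEN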
The Age Command

 Groups records into specified ranges or aging periods
 Commonly used to age receivables data
 With Age, you:
   Specify a date field
   Specify a cutoff date
   Specify the aging periods
   Specify fields to subtotal (optional)
   Output to screen, graph, print, or file
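
A sketch (date field, cutoff, and periods are hypothetical):

  COMMENT Age open invoices as at 31 Mar 2005 into 0/30/60/90+ day buckets
  AGE ON Inv_Date CUTOFF 20050331 INTERVAL 0,30,60,90 SUBTOTAL Inv_Amount TO SCREEN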
Isolating Data

 Create subsets of data  Three options to


 Avoid data that is isolate data:
irrelevant to your  Filters
analysis  The Extract command
 The Export command
View (Global) Filters

 Isolate records in a file without physically


removing them or creating new files
 When a Command is issued with a View
Filter active the Command will only return the
result for the filtered (true) records but the
entire file still has to be processed
 Can be inefficient when returning few records
from a large table
The Extract Command

 Use Extract to create a new table from the records and fields in an existing table
 Isolate just the records and fields you need to work with
 Future processing will be faster on a subset of the original file
 Two options with Extract:
   Record option
     New table has an identical table structure, including undefined areas of the input record
   Fields option
     New table contains only selected fields
     Computed fields are resolved and written to the data file
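
A sketch of both options (table and field names hypothetical):

  COMMENT Record option: whole records that satisfy a condition
  EXTRACT RECORD IF Inv_Amount > 1000.00 TO "Large_Invoices"
  COMMENT Fields option: only the selected fields
  EXTRACT FIELDS CustNo Inv_Date Inv_Amount TO "Invoice_Slim"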
The Export Command

 Use ACL as a data conversion tool
 Create files readable by other applications
 Two options to export:
   Specify fields to export
   Export fields from the current view
 Export data to:
   Microsoft Access
   Clipboard
   dBASE III Plus
   Delimited file
   Microsoft Excel
   Lotus 1-2-3
   Plain text
   Microsoft Word merge file
   WordPerfect merge file
   XML
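
A sketch (field and file names hypothetical; the format keyword list varies by ACL version):

  COMMENT Hand the selected fields to Excel for a colleague
  EXPORT FIELDS CustNo Product Inv_Amount EXCEL TO "AP_Trans_Export"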
Reordering Tables

 Reorder tables to:  Three options when


 Clarify the meaning of reordering tables:
data  The Sort command
 Prepare for requirements  The Index command
of subsequent
 Quick Sort
commands

 The Sequence command tests that the selected fields are in sequence.
The Sequence Command

 Test if a table is in sequence by specified fields
 Default error limit of 10
 Sequence does not modify the order of the original table
 With Sequence, you:
   Specify the fields to test for sequence
   Specify whether to test for ascending or descending order
   Output to screen, print, or file
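
A minimal sketch (field name hypothetical):

  COMMENT Report up to 10 out-of-sequence records without altering the table
  SEQUENCE ON CustNo ERRORLIMIT 10 TO SCREEN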
The Sort Command

 Creates a new table in which records are


ordered by specified key fields
 Output table has a record structure identical to
original table
The Index Command

 Allows you to work in a sequential table without


creating a new sorted table
 Reordering is logical, not physical
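
Sketches of both commands (table, field, and index names hypothetical):

  COMMENT Physical reorder: writes a new table
  SORT ON CustNo TO "AP_Trans_Sorted" OPEN
  COMMENT Logical reorder: builds and activates an index instead
  INDEX ON CustNo TO "CustNo_Idx"
  SET INDEX TO "CustNo_Idx"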
Sort vs Index

Condition                                       Sort          Index
Execution speed                                 Slower        Faster
Resulting file size                             Larger        Smaller
Required disk space                             More          Less
Subsequent processing of an entire table        Much faster   Much slower
Subsequent processing when searching
for a few records                               Much slower   Much faster
Quick Sort

 Useful for temporarily sorting a file on a single


field
 Does not create a new sorted file
 Use for visual reference only
 Most Commands issued when a Quick Sort is
active will read the file in the original sequence
of the file
Combining Tables

 Sometimes you need to  Three options to


compare data from two combine tables:
or more tables  The Extract command
with the Append option
 The Join command
 The Relations command
Extract with the Append Option

 Create a new
table that:
 Combines multiple tables
that have the same
structure
 Contains the same type
of information
Extract with the Append Option

AP_Trans_January
Custno  Product   Inv_Date   Inv_Amount
01542   Printer   5-Jan-05   257.89
04723   Toner     18-Jan-05  39.99
29452   Printer   19-Jan-05  294.32
03914   Paper     25-Jan-05  15.86
46778   Scanner   30-Jan-05  125.99

AP_Trans_February
Custno  Product   Inv_Date   Inv_Amount
04723   Paper     1-Feb-05   15.86
33397   Paper     15-Feb-05  84.33
46778   Toner     28-Feb-05  79.98

AP_Trans_March
Custno  Product      Inv_Date   Inv_Amount
01542   Flash Drive  2-Mar-05   99.99
88754   Toner        3-May-05   39.99
12679   Scanner      25-Mar-05  134.99
46778   Scanner      31-Mar-05  85.00

Extract January, then Extract / Append February and March:

AP_Trans_Quarter_1
Custno  Product      Inv_Date   Inv_Amount
01542   Printer      5-Jan-05   257.89
04723   Toner        18-Jan-05  39.99
29452   Printer      19-Jan-05  294.32
03914   Paper        25-Jan-05  15.86
46778   Scanner      30-Jan-05  125.99
04723   Paper        1-Feb-05   15.86
33397   Paper        15-Feb-05  84.33
46778   Toner        28-Feb-05  79.98
01542   Flash Drive  2-Mar-05   99.99
88754   Toner        3-May-05   39.99
12679   Scanner      25-Mar-05  134.99
46778   Scanner      31-Mar-05  85.00
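
A sketch of the equivalent ACLScript, using the table names from the example above and assuming identical structures:

  OPEN AP_Trans_January
  EXTRACT RECORD TO "AP_Trans_Quarter_1"
  OPEN AP_Trans_February
  EXTRACT RECORD TO "AP_Trans_Quarter_1" APPEND
  OPEN AP_Trans_March
  EXTRACT RECORD TO "AP_Trans_Quarter_1" APPEND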
The Join Command

 Create a new table that contains matched or unmatched records from two tables
 Based on common key fields in the two tables
 Six types of joins:
   Matched
   Unmatched
   Matched, All Primary
   Matched, All Secondary
   Matched, All Primary & Secondary
   Many-to-Many (covered in advanced courses only)
 The first 5 join types use a Many-to-One relationship
Rules & Guidelines

 Rules  Guidelines
 Primary and secondary  Data in key fields should
tables must be in the be in the same case:
same project UPPER(), LOWER(), or
 Both tables must share PROPER() functions
at least one common  Key fields should have
key field same justification:
 Key fields must be LTRIM() function
defined as the same data
type in both files
 Key fields must be
the same length:
SUBSTRING() function
 Files must be sequenced
in the same order –
presort option
Planning

 Before performing a  If blanks are found:


join:  What do blanks mean?
 Understand both files  Are the blanks supposed
 Harmonize key fields to exist?
 Identify any missing  What data will blank key
data (blanks) in key fields be matched to?
fields
 Identify any records
 If duplicates are found:
with duplicate key fields  Will duplicate records
 Determine objectives for affect the join results?
comparing the two tables  Remove duplicates
Join Examples

 Illustrations for the first  Employee Records


five join options  Contains a list of valid
employees and the amount
 Payroll Ledger
that they should be paid
 Contains payroll  One employee (002)
disbursements for is missing
a single pay period
 One employee (003)
was paid twice
Matched Primary Records

 One output record for every match between primary and secondary
tables
 Shows employees who have been paid and who are also listed in
the employees table
Unmatched Primary Records

 One output record for every record in the primary table that doesn’t
have a match
 Shows all employees paid, but not listed in the employees table
Matched Primary Records, include All Primary records

 One output record for every match between primary and secondary
tables, plus one record for every unmatched primary record
 Results equivalent to combined results of Matched and Unmatched
Matched Primary Records, include All Secondary records

 One output record for every match between primary and secondary
tables, plus one record for every unmatched secondary record
 Lets you account for all employees in the employee table
Matched Primary Records, include All Primary & All
Secondary records

 One output record for every match between primary and secondary
tables, plus one record for every unmatched record
 Shows all payroll checks issued and all employees listed in the
employees table
How to Join Tables

 Plan the join:


1. Determine the objective of the Join
2. Identify the two tables to be joined
3. Identify key field(s) from each table
4. Examine the key field(s) to ensure that they:
 Are the same data type
 Are the same length
 Have the same case and justification
5. Determine which table will be primary, secondary
6. Determine which type of join to perform
How to Join Tables…

 Test and prepare the tables:


1. Test the key fields for duplicates
2. Test the key fields for blanks
3. Determine if tables are in sequence by key fields
 Physically sort tables prior to joining, or
 Use the Presort option
How to Join Tables…

 Use the Join command:


1. Open the primary table
2. Open the Join dialog
3. Select the secondary table
4. Select the key fields
5. Select fields to include in the output table
6. Select presort (optional)
7. Apply a filter (optional)
8. Select the type of join to perform
9. Name the output table – Click OK
10. Check the Log
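
Putting the steps together, a minimal ACLScript sketch of a matched join (table, field, and output names are hypothetical; add keywords such as UNMATCHED to change the join type):

  OPEN Payroll_Ledger
  OPEN Employee_Records SECONDARY
  COMMENT Match on employee number; keep pay details plus the master-file name
  JOIN PKEY EmpNo FIELDS EmpNo Cheque_No Gross_Pay SKEY EmpNo WITH Name Salary PRESORT SECSORT TO "Paid_And_Listed" OPEN
  COMMENT Step 10: review the command and record counts in the Log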
The Relations Command

 Easily access data in  Results are virtual and


multiple tables do not result in a new
simultaneously table
 Create reports with  Use fields from up to 17
data from different relationships (18 tables)
systems  Relationships can be
 Recreate environment direct or indirect
of a relational database
 Logical equivalent to
Join Matched Records
include All Primary
record option
The Relations Command …

 Direct relationship:

  AR (parent)          Customer
  invoice number       customer number
  invoice date         name
  customer number      address
  contract number      city
                       state

 Indirect relationship:

  AR (parent)          Contract               Salesperson
  invoice number       contract number        sales rep num
  invoice date         date                   name
  customer number      type                   address
  contract number      sales rep num
Rules & Guidelines

 Rules:
   All tables should be in the same project
   Both tables must share at least one common key field
   Key fields must be defined as the same data type in both files
   Relationships are defined and stored in the parent table
   Delete or modify existing relationships from the parent table
   Child table must be indexed by key field
 Guidelines:
   Key fields must be the same length: SUBSTRING() function
   Data in key fields should be in the same case: UPPER(), LOWER(), or PROPER() functions
   Key fields should have the same justification: LTRIM() function
How to Relate Tables

 Plan the relation:


1. Determine the objective
2. Identify the two tables to be related
3. Identify key field(s) from each table
4. Examine the key field(s) to ensure that they:
 Are the same data type
 Are the same length
 Have the same case and justification
5. Determine which table will be parent and child
How to Relate Tables…

 Test and prepare the tables:


 Test the key fields for duplicates. If they exist, consider
removing them with the Classify or Summarize commands
prior to relating the tables.
 Test the key fields for blanks: ISBLANK() function
 Relate the tables:
 Open the parent table
 Open the Relations dialog
 Add the child table to the Relations dialog
 Click and drag the key field from the parent table to the key
field from the child table
 Add a field to the View from the child table to ensure the
relationship was successful
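
The dialog steps above correspond roughly to this ACLScript sketch (table, field, and index names are hypothetical; the Relations dialog is the usual route, and the scripted form may differ by ACL version):

  COMMENT Child table must be indexed on the key field
  OPEN Customer
  INDEX ON CustNo TO "Cust_Idx"
  COMMENT Define the relation from the parent table
  OPEN AR
  DEFINE RELATION CustNo WITH Customer INDEX Cust_Idx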
Join vs Relations

Requirement             Join                                Relations
Disk space required     Creates a third table that may be   The only disk space required is to
for results             larger than both original tables    create an index for the child
                        combined.                           table(s).
Time to process         Can vary depending on whether the   Considerably less. No record
command                 primary table is sorted and the     matching is done when Relations
                        complexity of the join.             is run.
Time to process         Results in a flat file can be       Record matching is done at command
results                 processed very quickly.             processing and takes longer than
                                                            Join.
Presort or Index        Sort or Presort required for        Index is required for child tables.
                        secondary tables.
Number of key fields    One or more key fields              Limited to one key field
105 FOUNDATIONS OF ACL
CONCEPTS AND PRACTICES

Reporting Results
Module 6
In This Module

 Reporting with ACL


 Creating a report from a view
 Advanced reporting
 Generating reports with Crystal Reports
 Graphing results
 Using the log
 Documenting your analysis
Reporting with ACL

 Views
 Graphs
 Results tab
 Logs
 Project contents
Creating a Report from a View

1. Create a new view 3. Design the report


2. Format the view
layout
 Add columns
 Header, Footer
 Delete columns  Filter
 Rearrange columns  Presort, Summarize,
Suppress
 Format columns
 Fit to page
 Change the font
4. Print the report
 Page setup
 Page margins
 Output
 Print preview
 Print
Advanced Reporting

 Sorted reports
 Reports with subtotals
 Multiline reports
Reporting with Crystal Reports

1. Create a report template


 Create a blank Crystal Reports template
 Use Crystal Reports to complete the template
2. Generate reports using the custom template
 Update the template from ACL
 View the report
Graphing Results

 Add visual appeal
 Provide an easily accessible presentation of results
 Commands that create graphs:
   Stratify
   Classify
   Histogram
   Age
   Cross-tabulate
   Benford
 Create a graph from a view:
   Select a range of data
   Right-click, select Graph Selected Data
Graph Options

 Formatting graphs  Incorporating a graph


 Graph Type into a report
 Graph Properties  Print Graph
 Legend Properties
 Save Graph as Bitmap
 Axis Properties
 Copy Graph to Clipboard
 Format Data
 Label Properties  Exploring the data
 Show/Hide Legend in a graph
 Show/Hide Axis  Edit Command
 Drill-down
Using the Log

 Log sessions:
   A new session is created when you open a project
   You can start a new, named session, view a session, and add comments
   Comments help you and others understand your log entries
 Export from the log:
   Export your log entries to HTML, new log files, scripts, WordPad, or text files
 Copy and paste results:
   Copy and paste log entries into Word, Excel, and most other text/spreadsheet programs
 Search the log:
   In a large log, use a keyword search to find specific results
Documenting your analysis

 Create project notes
   Helpful for projects that you don't work on consistently
 View and print a table history
   Created whenever a new table is created as the result of an ACL command
   Useful to find how a table was created or what tables and fields were used to create it
 Record notes
   Attach a note to a single record
 Print project contents
   Good for archiving work done on a project
   Print out:
     Table layouts
     View, Script, Index, and Workspace definitions
     Preferences, notes, and the log
105 FOUNDATIONS OF ACL
CONCEPTS AND PRACTICES

Thank You
