Security Assessment and Testing
System Security Control Testing
Software Security Control Testing
Security Process Data Collection
Audits
Security Assessment
Test both administrative and technical controls.
Examine entire security posture:
Policies.
Organization's security culture.
Management attitude towards security.
Risks of conducting a security assessment:
Testers might focus only on technical controls, ignoring administrative controls, policy, and culture.
The testing process may disrupt normal operations.
The resulting data may not be properly interpreted.
Recommendations may be ignored, or improperly or insufficiently implemented.
Security Test Strategies
1. Create a security assessment policy.
2. Create a security assessment methodology.
3. Assign testing roles and responsibilities.
4. Determine which systems you will test.
5. Determine how you will approach the testing, addressing:
Logistical issues.
Legal regulations.
Policy considerations.
6. Carry out the test, addressing any incidents that arise during or because of the test.
7. Maintain the CIA principles while handling the data through all phases:
Collection.
Storage.
Transmission.
Destruction.
8. Analyze data and create a report that will turn technical findings into risk
mitigation actions to improve the organization's security posture.
Administrative Assessment Test Output
Responses by management and users to security-related questions.
A list of existing or non-existing procedures or documentation.
Recorded observation of user/management activities.
Recorded observation of adherence to existing procedures/policies.
Security Questionnaire
Technical Assessment Test Output
Current firewall configuration of each system.
Antivirus patch level of each system.
List of known or potential vulnerabilities found on each system.
List of default configurations found on each system.
List of unused user accounts found on each system.
List of user privilege levels on each resource or system.
Vulnerability Assessments
Process: Collect → Store → Organize → Analyze → Report
Perform when:
First deploy new/updated systems.
New vulnerabilities have been identified.
A security breach occurs.
Need to document security state of systems.
Vulnerability Scanning
Port scanner.
Protocol analyzer.
Packet analyzer.
Network enumerator.
Intelligence gathering.
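The core loop of a port scanner can be sketched in a few lines. This is an illustrative, standard-library-only sketch (the host and port range are placeholders), not a substitute for a dedicated scanning tool.

```python
# Minimal TCP port scanner sketch using only the Python standard library.
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 when the connection succeeds (port open).
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Placeholder target: scan the well-known ports on the local machine.
    print(scan_ports("127.0.0.1", range(1, 1025)))
```

Only ever point a scanner at systems you have explicit permission to test.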
Penetration Testing
Evaluate security by simulating an attack on a system.
Verify a threat exists.
Actively test and bypass security controls.
Exploit system vulnerabilities.
When compared to vulnerability assessment, it is:
Less common.
More intrusive.
An objective measurement.
A combination of multiple vulnerabilities, exploited together to provide a holistic understanding of exposure.
Follow real attacker’s methodology, including target preparation/research stages.
Difference between pen test and real attack is intent.
Need explicit permission of target organization.
Make sure the organization knows the test will not stop until the attack is fully carried out.
Report should include:
Steps undertaken.
Weaknesses identified.
Recommendations.
Penetration Test Preparation
Who will commission the test?
Who will conduct the test?
How will the test be conducted?
What are the test’s limitations?
What tools will be used in the test?
The Penetration Test Process
Process: Reconnaissance → Scanning → Exploitation → Maintaining Access → Reporting
Penetration Test Approaches
Black Box
Most effective at real-world evaluation.
Most time and effort.
Need to carefully consider who should know about the test.
White Box
More comprehensive evaluation because of broad perspective of organizational systems.
Might be too simulated – not able to account for attackers’ out of the box thinking.
Grey Box
Complex parameters needed to strike the perfect balance.
Amount of reconnaissance required: Black Box Test (full) → Grey Box Test (partial) → White Box Test (none)
Penetration Test Components
Component Description
Network scanning • Uses a port scanner to identify devices attached to target network and to enumerate
the applications hosted on the devices. This function is known as fingerprinting.
Social engineering • Attempts to get information from users to gain access to a system.
• Tests for adequate user training.
• Stay mindful of ethical implications of deceiving people.
• Don't want to undermine your employees' trust in you or their coworkers.
War dialing • Uses a modem and software to dial a range of phone numbers to locate computer
systems, PBX devices, and HVAC systems.
War driving • Locates/attempts to penetrate wireless systems from public property, like a sidewalk.
Vulnerability scanning • Exploits known weaknesses in operating systems and applications identified through
reconnaissance and enumeration.
Blind testing • Occurs when the target organization is not aware of penetration testing activities.
Targeted testing • Target organization is informed of the test.
• Less disruption to organization due to a more controlled climate.
Event Log Review
Event logs contain detailed information.
Often used to troubleshoot performance issues.
Should also review as part of security control test process.
Use an automated tool to help identify security events from mass of data.
May need to configure network devices to capture desired level of detail in a log.
Common logged activities include:
Authentication requests, both successful and unsuccessful.
New user or group creation.
Group membership changes.
User privilege level changes.
Resource access, such as opening, changing, and deleting files and folders.
Client requests for server services.
The number of transactions per hour of a particular service.
Application or service shutdowns and restarts.
System shutdowns and restarts.
Service or system component errors and failures.
System policy changes.
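As a sketch of the automated-tool idea above, the snippet below pulls failed-authentication events out of a mass of log lines and flags suspicious sources. The "Failed password for" line format is an invented example, not tied to any specific product.

```python
import re
from collections import Counter

# Hypothetical syslog-style pattern: "Failed password for <user> from <ip>".
FAILED_LOGIN = re.compile(r"Failed password for (\S+) from (\S+)")

def failed_logins_by_source(log_lines, threshold=3):
    """Count failed logins per source address and return only the sources
    that meet or exceed `threshold` (possible brute-force attempts)."""
    counts = Counter()
    for line in log_lines:
        match = FAILED_LOGIN.search(line)
        if match:
            counts[match.group(2)] += 1
    return {ip: n for ip, n in counts.items() if n >= threshold}
```

A real deployment would feed this kind of filter from a centralized log repository rather than a flat file.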
Log Management Infrastructure
Centralize log collection from multiple devices.
Provides an intuitive dashboard:
Presents summary information.
Allows for easy further investigation.
Includes an automated tool that will:
Read/analyze logs.
Summarize findings.
Provide actionable next steps.
Back up/archive/purge older log entries to clear disk space.
Provide easy to read reports with trends analysis.
Log Management Policies
Prioritize requirements/goals based on the perceived risk reduction versus the resources needed to perform log management functions.
Define log management roles and responsibilities for key personnel.
Create and maintain the log management infrastructure.
Specify resources and management support for the log management system.
Log Management Procedures
Monitor the status of all log sources.
Monitor log rotation, backup, and archiving processes.
Check for upgrades/patches for logging software.
Test/deploy upgrades.
Maintain clock synchronization between devices in the logging console.
Regularly review logs after any policy or technology changes.
Reconfigure logging as needed.
Document and report any anomalies in log settings or processes.
Ensure logs are consolidated to a central repository.
Synthetic Transaction
Used to investigate system response to specific activities or “what-if” scenarios:
A client request to a server.
A VoIP phone call.
A video conference call.
A security or performance "event."
An outside connection attempt.
The injection of "malicious" traffic into the network.
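A minimal harness for such what-if scenarios might wrap any scripted action, time it, and flag slow or failed runs. The threshold and the action itself are illustrative assumptions; a real synthetic-transaction tool would script actual client requests or calls.

```python
import time

def synthetic_transaction(action, warn_after=2.0):
    """Run a scripted action, measure its latency, and report whether it
    completed within the expected threshold (in seconds)."""
    start = time.perf_counter()
    try:
        result = action()
        ok = True
    except Exception as exc:   # a failed transaction is itself a finding
        result, ok = exc, False
    elapsed = time.perf_counter() - start
    return {"ok": ok, "seconds": elapsed, "slow": elapsed > warn_after, "result": result}
```

For example, `synthetic_transaction(lambda: client.fetch_page())` (with `client` being whatever scripted client you use) would exercise a client-to-server request on a schedule.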
Guidelines for Implementing System Security
Control Testing
General Security Testing:
Test every one of your controls, whether administrative, logical, or physical.
Understand the risks of conducting a security assessment, especially the risk that it may disrupt business operations.
Adopt a security test strategy that can comprehensively support all of your organization's security needs.
Make sure your test output is robust and truly useful in evaluating effectiveness of security controls.
Vulnerability Assessment:
Follow a process of collecting, storing, organizing, analyzing, and reporting assessments.
Perform vulnerability assessments routinely and in response to security incidents when they arise.
Implement a vulnerability scan to detect weaknesses or misconfigurations in systems.
Choose scanning tools (e.g., port scanners/network enumerators) to help identify system issues.
Guidelines for Implementing System Security
Control Testing (Cont.)
Penetration Testing:
Remember purpose is to deliver a report on state of systems and how they can be improved.
In preparing for a pen test:
Outline who will conduct the test.
How they will do so.
Any limitations that need to be imposed on the test.
Follow a process of recon, scanning, exploitation, maintaining access, and reporting.
Understand advantages/disadvantages of white, black, and grey box testing approaches.
Use combination of components, like war driving and vulnerability scanning, in your pen test.
Log Reviews:
Review system event logs to identify security incidents.
Configure event logs to capture a feasible level of detail that won't overwhelm reviewer.
Track common security-related activities including failed login attempts or system policy changes.
Institute a log management infrastructure that can centralize and streamline logs from many
systems.
Draft log management policy to outline the how, what, and why of capturing logs.
Code Review
Systematic, manual review of code by someone other than the developer.
Attempt to find/fix problems developer overlooked.
Reviewer:
Peer developer.
Team lead.
Third party outside organization.
Common Software Vulnerabilities and Exploits
Error and exception handling.
Improper storage of sensitive data.
Buffer overflow.
Integer overflow.
Memory leaks.
SQL injection.
Session fixation.
Session prediction.
Cross-site scripting (XSS):
Stored attack.
Reflected attack.
DOM-based attack.
Cross-site request forgery (XSRF/CSRF).
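To make the SQL injection entry concrete, the sketch below contrasts a vulnerable string-concatenated query with a parameterized one, using an in-memory SQLite database; the table and data are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # VULNERABLE: attacker-controlled input is concatenated into the query.
    return conn.execute(f"SELECT secret FROM users WHERE name = '{name}'").fetchall()

def lookup_safe(name):
    # Parameterized query: the driver treats `name` strictly as data.
    return conn.execute("SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

payload = "x' OR '1'='1"
print(lookup_unsafe(payload))  # the always-true clause leaks every row
print(lookup_safe(payload))    # returns nothing
```

Code review should flag any query built by string concatenation, regardless of language or database.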
Software Test Techniques
Knowledge of inner workings: Black Box vs. White Box.
Who is running the test? Manual vs. Automated.
Software execution state: Static (examining the source code itself) vs. Dynamic (examining the program as it executes and produces output).
Software Test Types
Test Type Description
Unit test • Simple "pass/no pass" test for a small piece of code.
• Done to ensure that code block:
• Performs exact action intended.
• Provides exact output expected.
• Ideally, there should be one unit test for every complete line or block of code.
• Most effective way to minimize software bugs.
Integration test • Individual components of a system are tested together to see if they interact as expected.
Interface test • Type of integration test that focuses on the interface between two systems/applications.
Functional test • Simulates a specific user interaction with system to see system's response and performance.
System test • Tests a complete, integrated system to verify that it satisfies the original specified requirements.
Acceptance test • End users try completed software to see if:
• They like it.
• They can easily use it.
• It satisfies business requirements.
Regression test • Runs same set of tests after some component of the application has been changed or updated.
• Done to make sure any updates or "fixes" don't "break" other functionality.
Misuse test • Identifies vulnerabilities and weaknesses in applications by validating input that app accepts, as
well as any other ways that an attacker could exploit app's behavior.
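A unit test in the sense of the table above might look like the following; the `clamp` function and its expected outputs are invented purely for illustration.

```python
import unittest

def clamp(value, low, high):
    """Restrict `value` to the inclusive range [low, high]."""
    return max(low, min(value, high))

class ClampUnitTest(unittest.TestCase):
    # Pass/no-pass checks: exact action performed, exact output expected.
    def test_within_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_below_range(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_above_range(self):
        self.assertEqual(clamp(99, 0, 10), 10)
```

Re-running this same suite after a later change to `clamp` is exactly the regression testing described above.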
Test Cases
Set of conditions (scenario) in which a test will be performed.
Test checks to make sure the application works as expected.
Create at least one test case for every business requirement.
TEST CASE
Test case name: _____________
Description: ______________________________________
Business requirement: ____________________
Pre-conditions: __________________________
Step | Step Description | Test Data | Expected Results
1 | | |
Post-conditions: _________________________
Code-Based Testing
Identifies and makes up for gaps in positive testing process.
Test every reachable, functional element of code.
Within practical time/cost constraints.
Unit tests for every code statement.
Wherever code has not been tested, create additional test cases to cover gaps.
Challenges logic decisions made by application, looking for “dead” code.
Interface Testing
Confirms that information passed between software components is secure.
Crucial for connection-based software like web apps.
App needs secure way to recover from interrupted/terminated connection.
Need to test app functionality in all major browsers.
Additional scenarios that require interface testing:
How your app handles general errors and exceptions.
How your app handles copy and paste functionality.
How your app handles resuming paused downloads.
How your app handles third-party extensions, like browser plugins.
Whether or not your app supports encryption in all necessary contexts.
Whether or not your app supports cross-platform interfaces.
Misuse Testing
Determines if app can handle invalid input or “illegal” user activities.
No expected result.
Encourages developers to:
Consider what might happen to code in less-than-ideal world.
Embed security during development process.
Write negative test cases to see if you can crash/break into application.
Good way to enhance threat modeling process.
Examples:
Input validation techniques, such as:
Entering illegal values or characters in an input field.
Deliberately leaving required fields empty.
Attempting to exceed the allowed number of characters in an input field.
Entering data that exceeds the limits of the data type defined for that field.
Pressing unusual or unexpected key combinations while the software is executing.
Fuzzing, or inputting massive amounts of random data in an attempt to make the system crash.
Repeatedly opening and closing the soft keyboard in a mobile phone app.
Editing client side JavaScript in a web page and then submitting that page to the web server.
Attempting to open a web page without first performing a required login.
Can be applied to computer systems and hardware.
Should include end users in testing.
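A toy fuzzing harness in the spirit of the list above: it throws random printable strings at a function and records anything other than the documented rejection path as a finding. The `parse_age` function and its validation rules are invented examples.

```python
import random
import string

def parse_age(text):
    """Function under test: expects a small positive integer as a string."""
    value = int(text)           # raises ValueError on garbage input
    if not 0 < value < 150:
        raise ValueError("age out of range")
    return value

def fuzz(target, runs=200, seed=0):
    """Feed random printable strings to `target`; anything other than a
    clean ValueError (the documented rejection path) counts as a finding."""
    rng = random.Random(seed)
    findings = []
    for _ in range(runs):
        junk = "".join(rng.choice(string.printable)
                       for _ in range(rng.randint(0, 30)))
        try:
            target(junk)
        except ValueError:
            pass                # expected rejection of invalid input
        except Exception as exc:
            findings.append((junk, exc))
    return findings
```

Real fuzzers run millions of mutated inputs and instrument the target for coverage, but the "no expected result, look for crashes" idea is the same.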
Test Coverage Analysis
Not practical to test every single line of code.
Test coverage = how much of code is actually tested.
Most software development environments include code testing suites.
Most organizations target a specific percentage.
Effective test coverage focuses on how much of logic/functionality of code has
been tested, not just percentage of overall code.
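As an illustration of what "how much of the code is tested" means, the sketch below uses Python's tracing hook to record which lines of a function a given test actually executes. Real projects would use a dedicated coverage tool; this toy only demonstrates the idea.

```python
import sys

def measure_line_coverage(func, *args):
    """Record which line numbers inside `func` execute for a given call."""
    executed = set()
    code = func.__code__

    def tracer(frame, event, arg):
        if event == "line" and frame.f_code is code:
            executed.add(frame.f_lineno)
        return tracer

    sys.settrace(tracer)
    try:
        func(*args)
    finally:
        sys.settrace(None)
    return executed

def grade(score):
    if score >= 90:
        return "A"
    return "B"

# grade(95) never executes the `return "B"` line: a second test case
# (e.g. grade(50)) is needed to cover the remaining logic.
print(sorted(measure_line_coverage(grade, 95)))
```

Comparing the executed-line sets across test cases shows exactly which logic the current tests miss, which is the point made above about covering logic rather than a raw percentage.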
Guidelines for Conducting Software Security
Control Testing
Train developers on importance of incorporating security into development process.
Implement code reviews for all software organization writes or integrates with.
Become familiar with common app software vulnerabilities.
Consider advantages/disadvantages of various software testing techniques:
How each technique might be best used in your organization.
Consider different test types and how they might help evaluate software security.
Outline thorough test cases that address business requirements.
Use interface testing to ensure that information flows securely between components.
Use input validation and similar techniques to test for misuse cases.
Create a baseline for test coverage analysis:
Ensure testing processes cover most important elements of source code.
Ensure that any software acquired from a third-party vendor is up-to-date.
Don't be overly verbose in error messages displayed to users:
Don’t want to give an attacker a vector to exploit your software.
Separate systems testing software from those running production software.
Information Security Continuous Monitoring
Ongoing process identifying controls, vulnerabilities, and threats as they relate
to risk management policies.
Comprehensive/hierarchical system of data gathering, reporting, and responses.
Purpose:
It maps risk to tolerance.
It adapts to ongoing needs.
It actively involves management.
Automated data collection tools:
IDS.
OS system event logs.
Inventory/asset control systems.
Firewalls.
Configuration/change management systems.
Manual data collection processes:
Training exercise results.
BCP/DR test results.
Management/staff analyses.
ISCM Data Collection
Physical asset location.
Logical asset location (IP addresses and subnets).
Numbers of identified MAC addresses.
Violations of network policy.
Number and severity of vulnerabilities discovered.
Number and severity of vulnerabilities that have been remediated.
Number of unauthorized access attempts.
Variances in configuration baselines.
DR/BCP plan testing dates and results.
Number of staff that have successfully completed security awareness training.
Risk tolerance thresholds that have been exceeded.
Risk scores for specific systems.
ISCM Tiers
Tier 1 – Organization: risk tolerance, governance, policy, strategy.
Tier 2 – Mission/Business Processes: data resources; collection, correlation, analysis, reporting.
Tier 3 – Information Systems: data resources; collection, correlation, analysis, reporting.
Security-related information flows up through the tiers, and alerts trigger responses; data feeds flow between the tiers.
Based on Figure 2-1 in NIST SP 800-137.
ISCM Implementation
Goals
Have visibility into all your assets.
Detect anomalies and changes in your operating environment and information systems.
Be aware of your vulnerabilities.
Update:
Your knowledge of threats.
Effectiveness of your security controls.
Your overall security status.
Your overall compliance status.
Rollout must include:
Active involvement by upper management.
Configuration management and change control processes throughout your SLC/SDLC.
Security impact analyses on your information systems, monitoring for any changes.
Continuous assessment of your security controls.
Assess controls on the higher-impact and more volatile systems more frequently.
Accurate and up-to-date security status communications and alerts to management.
Management response.
NIST SP 800-137
ISCM strategy that includes:
Understanding organization's tolerance for risk.
Defining and implementing meaningful metrics for measuring security status at all organizational tiers.
Continuous assessment of effectiveness of your security controls.
Verification of compliance.
IT asset management.
Effective change control.
Continually updated awareness of threats and vulnerabilities.
See Figure D-2 in NIST SP 800-137.
Key Performance Indicators
Metric used to gauge performance.
Measure something management considers vital to company success.
KPIs often presented visually in a dashboard.
Helpful for:
Viewing trends.
Evaluating levels of risk.
Analyze data by criteria such as time of day or location, as long as the data contains those details.
Most software allows user to drill down into details.
Security KPIs
The number of vulnerabilities, by service level, that have been discovered and
remediated.
The number of failed logins or unauthorized access attempts.
The number of systems currently out of compliance with security requirements.
The number of security incidents reported within the last month.
The average response time for a security incident.
The average time required to resolve a help desk call.
The current number of outstanding or unresolved technical issues in a project or
system.
The number of employees who have completed security training.
Percentage of test coverage on applications being developed in-house.
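Two of the KPIs listed above (incidents reported in the last month, average response time) can be computed from incident records like so; the record format, a simple (reported, resolved) timestamp pair, is a hypothetical assumption.

```python
from datetime import datetime, timedelta
from statistics import mean

def incident_kpis(incidents, window_days=30, now=None):
    """Compute KPIs from hypothetical (reported_at, resolved_at) records:
    count of recent incidents and average resolution time in hours."""
    now = now or datetime.now()
    recent = [i for i in incidents
              if now - i[0] <= timedelta(days=window_days)]
    durations = [(resolved - reported).total_seconds() / 3600
                 for reported, resolved in recent]
    return {
        "incidents_last_month": len(recent),
        "avg_response_hours": round(mean(durations), 1) if durations else 0.0,
    }
```

Figures like these are what a KPI dashboard would summarize visually and let management drill into.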
Security Awareness and Training
Tracking staff training levels is a critical KPI.
Helps discover latent risks.
Establish training programs with clear objectives to assist in data collection.
Awareness is built through communication and education.
Data Collection Process Components
Component Description
Account Management • Monitoring user account data including active users, login attempts, group memberships,
and privilege levels.
• Monitoring and managing which accounts have right to collect and report on user data.
DR/BCP • You should include DR response/resolution time as part of your normal KPI dashboard.
Backup Verification • Any issues related to backups and restores should be tracked as a KPI.
Management Review • Security process data collection system needs to be reviewed regularly by management
for its effectiveness in shaping security policy and making wise security decisions.
Guidelines for Collecting Security Process Data
Recognize you must help the organization regularly fine-tune its security processes.
Implement an ISCM process that takes on an ongoing stance in identifying:
Controls.
Vulnerabilities.
Threats.
Ensure that your ISCM program has involvement of upper management.
Use configuration management/change control processes through systems lifecycle.
Incorporate NIST SP 800-137 strategies for continuous monitoring.
Use KPIs to measure effectiveness of security processes.
Use KPIs that reflect user awareness/training to measure programs’ effectiveness.
Establish awareness programs with clear objectives to facilitate data collection.
Internal and External Audits
Audit Description
Internal • Audit department is an important ally in the detection of computer crimes.
• Internal audit departments review processes, logs, and transactions to ensure compliance with
generally accepted principles of operational and regulatory requirements.
External • Some enterprises are also required to submit to an audit by an external auditing organization.
• Purpose is to provide additional oversight while validating/verifying compliance.
Audit Preparation
Define scope of audit and expected time period.
Identify any security controls in place.
Review organization's audit readiness.
Offer recommendations to address any issues found while reviewing audit readiness.
Discuss preferred approach in performing the audit and receiving results.
Foundation Documents
All audits include documents with federal/state requirements.
Often in checklist form.
Example – PCI DSS v3.
Audit Best Practices
Have an overall plan for the auditing process.
Collect data in advance of on-site auditing to save time.
Meet with key data handlers and owners involved in the auditing process.
Conduct tests on the necessary systems.
Analyze the on-site information off site.
Issue regular reports, usually weekly, about the status of the auditing process.
Have management review a draft of your report before finalizing it.
In report to management, include recommendations for fixing any outstanding issues.
Service Organization Control (SOC)
SOC 1 • Service provider describes objectives/controls that impact financial information.
• Reports limited in scope to financial reporting.
SOC 2 • Reports extend beyond just financial interests.
• Focus on the CIA of information as well as general security and privacy concerns.
SOC 3 • Short version of SOC 2.
• Omits detailed descriptions of test controls and related results.
SOC 2/3 Trust Services Principles
Principle Sample Criteria
Security • The organization documents security policy.
• The organization communicates its policy to relevant parties.
• The organization implements procedures that support policy.
Confidentiality • The organization documents policy for protecting the confidentiality of information.
• The organization communicates breaches of confidentiality to affected clients.
• There are procedures in place to ensure information is only disclosed to authorized parties.
Processing Integrity • The organization has a policy of identifying and responding to violations of integrity.
• The organization communicates the expected state of goods and services to its clients.
• The organization provides warranty information for clients.
Availability • The organization assesses risk to availability periodically.
• The organization provides business continuity services.
• Changes to availability are communicated to management.
Privacy • The organization has a management framework for its policies and procedures.
• The organization provides privacy policy notices to clients, including how personal information
will be used.
• The organization informs clients of their choices and obtains consent to handle their personal
information.
Guidelines for Conducting Audits
Use internal audits to verify compliance with:
Organization policy.
Established industry principles of security.
Understand you may be subject to external audits, especially when it comes to legal
and regulatory compliance.
Take steps to prepare for an audit of your systems.
In internal audits, use a foundation document as a checklist:
Compare a list of requirements against current state of organization.
Incorporate best practices into auditing process.
Consider using SOC 1 if your organization primarily deals with financial reporting.
Consider SOC 2 for more comprehensive/detailed accounting of organizational
security.
Consider SOC 3 for a more general audience that doesn't require a lot of detail.
Review the principles and criteria of SOC 2 and SOC 3.
Reflective Questions
1. Do you conduct penetration tests in your organization? If so, what type of tests
do you perform? If not, what other testing do you do on systems and security?
2. What sort of KPIs do you use in your organization to measure security
performance?