SP 800-115 gives the following explanation of what we are endeavoring to accomplish with the testing, evaluations, and assessments: “An information security
assessment is the process of determining how effectively an entity being assessed (e.g., host, system, network, procedure, person—known as the assessment
object) meets specific security objectives. Three types of assessment methods can be used to accomplish this—testing, examination, and interviewing.
Testing is the process of exercising one or more assessment objects under specified conditions to compare actual and expected behaviors.
Examination is the process of checking, inspecting, reviewing, observing, studying, or analyzing one or more assessment objects to facilitate understanding, achieve clarification, or obtain evidence.
Interviewing is the process of conducting discussions with individuals or groups within an organization to facilitate understanding, achieve clarification, or identify the location of evidence. Assessment results are used to support the determination of security control effectiveness over time.”
For each of the methods below, I will first provide the NIST guidance, and then supplement the process flows with additional suggested techniques and procedures that I have used during assessments over the years.
Please note: The bold text used in each of these three method descriptions is taken directly from the NIST documents.
Interviews
Interviewing is the process of conducting discussions with individuals or groups within an organization to facilitate understanding, achieve clarification, or identify the location of evidence. This process for the assessment is often the initial step accomplished once the assessment team is on-site and ready to start the actual evaluation.
Method: Interview.
Assessment objects: Individuals or groups of individuals.
Definition: The process of conducting discussions with individuals or groups within an organization to facilitate understanding, achieve clarification, or lead to the location of evidence, the results of which are used to support the determination of security and privacy control existence, functionality, correctness, completeness, and potential for improvement over time.
Guidance: Typical assessor actions may include, for example, interviewing agency heads, chief information officers, senior agency information security officers, authorizing officials, information owners, information system and mission owners, information system security officers, information system security managers, personnel officers, human resource managers, facilities managers, training officers, information system operators, network and system administrators, site managers, physical security officers, and users.
Security Content Automation Protocol (SCAP)-validated tools that support the Open Checklist Interactive Language (OCIL) component specification may be used to automate the interview process for specific individuals or groups of individuals. The resulting information can then be examined by assessors during the security and privacy control assessments.
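The OCIL-style automation described above can be approximated, at a much smaller scale, as a structured questionnaire whose recorded answers are scored against expected responses. This is a hypothetical sketch in Python, not actual OCIL (which is an XML-based SCAP component specification far richer than this); the question set, control IDs, and expected answers are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Question:
    control: str   # control ID this question supports, e.g., "AC-2"
    text: str      # the question posed to the interviewee
    expected: str  # the answer indicating the control is in place

# Hypothetical question set for illustration only.
QUESTIONS = [
    Question("AC-2", "Are user accounts reviewed at least quarterly?", "yes"),
    Question("IR-6", "Do staff know how to report a security incident?", "yes"),
]

def score_responses(responses: dict) -> dict:
    """Map each control ID to True if the recorded answer matches the expected one."""
    return {q.control: responses.get(q.control, "").strip().lower() == q.expected
            for q in QUESTIONS}

results = score_responses({"AC-2": "Yes", "IR-6": "no"})
```

The assessor would then examine the scored results, as the paragraph above notes, rather than treating them as a final determination.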
Attributes: Depth and coverage:
• The depth attribute addresses the rigor of and level of detail in the interview process. There are three possible values for the depth attribute:
• Basic:
- Basic interview: Interview that consists of broad-based, high-level discussions with individuals or groups of individuals. This type of interview is conducted using a set of generalized, high-level questions. Basic interviews provide a level of understanding of the security and privacy controls necessary for determining whether the controls are implemented and free of obvious errors.
• Focused:
- Focused interview: Interview that consists of broad-based, high-level discussions and more in-depth discussions in specific areas with individuals or groups of individuals. This type of interview is conducted using a set of generalized, high-level questions and more in-depth questions in specific areas where responses indicate a need for more in-depth investigation. Focused interviews provide a level of understanding of the security and privacy controls necessary for determining whether the controls are implemented and free of obvious errors and whether there are increased grounds for confidence that the controls are implemented correctly and operating as intended.
• Comprehensive:
- Comprehensive interview: Interview that consists of broad-based, high-level discussions and more in-depth, probing discussions in specific areas with individuals or groups of individuals. This type of interview is conducted using a set of generalized, high-level questions and more in-depth, probing questions in specific areas where responses indicate a need for more in-depth investigation. Comprehensive interviews provide a level of understanding of the security and privacy controls necessary for determining whether the controls are implemented and free of obvious errors and whether there are further increased grounds for confidence that the controls are implemented correctly and operating as intended on an ongoing and consistent basis, and that there is support for continuous improvement in the effectiveness of the controls.
• The coverage attribute addresses the scope or breadth of the interview process and includes the types of individuals to be interviewed (by organizational role and associated responsibility), the number of individuals to be interviewed (by type), and specific individuals to be interviewed. (The organization, considering a variety of factors (e.g., available resources, importance of the assessment, the organization’s overall assessment goals and objectives), confers with assessors and provides direction on the type, number, and specific individuals to be interviewed for the particular attribute value described.) There are three possible values for the coverage attribute:
• Basic:
- Basic interview: Interview that uses a representative sample of individuals in key organizational roles to provide a level of coverage necessary for determining whether the security and privacy controls are implemented and free of obvious errors.
• Focused:
- Focused interview: Interview that uses a representative sample of individuals in key organizational roles and other specific individuals deemed particularly important to achieving the assessment objective to provide a level of coverage necessary for determining whether the security and privacy controls are implemented and free of obvious errors and whether there are increased grounds for confidence that the controls are implemented correctly and operating as intended.
• Comprehensive:
- Comprehensive interview: Interview that uses a sufficiently large sample of individuals in key organizational roles and other specific individuals deemed particularly important to achieving the assessment objective to provide a level of coverage necessary for determining whether the security and privacy controls are implemented and free of obvious errors and whether there are further increased grounds for confidence that the controls are implemented correctly and operating as intended on an ongoing and consistent basis, and that there is support for continuous improvement in the effectiveness of the controls.
Examinations
Examinations primarily involve the review of documents such as policies, procedures, security plans, security requirements, standard operating procedures, architecture diagrams, engineering documentation, asset inventories, system configurations, rule-sets, and system logs. They are conducted to determine whether a system is properly documented, and to gain insight on aspects of security that are only available through documentation. This documentation identifies the intended design, installation, configuration, operation, and maintenance of the systems and network, and its review and cross-referencing ensures conformance and consistency. For example, an environment’s security requirements should drive documentation such as system security plans and standard operating procedures—so assessors should ensure that all plans, procedures, architectures, and configurations are compliant with stated security requirements and applicable policies. Another example is reviewing a firewall’s rule-set to ensure its compliance with the organization’s security policies regarding Internet usage, such as the use of instant messaging, peer-to-peer (P2P) file sharing, and other prohibited activities.
Examinations typically have no impact on the actual systems or networks in the target environment aside from accessing necessary documentation, logs, or rule-sets. One passive testing technique that can potentially impact networks is network sniffing, which involves connecting a sniffer to a hub, tap, or span port on the network. In some cases, the connection process requires reconfiguring a network device, which could disrupt operations. However, if system configuration files or logs are to be retrieved from a given system such as a router or firewall, only system administrators and similarly trained individuals should undertake this work to ensure that settings are not inadvertently modified or deleted.5
Method: Examine.
Assessment objects: Specifications (e.g., policies, plans, procedures, system requirements, designs), mechanisms (e.g., functionality implemented in hardware, software, firmware), and activities (e.g., system operations, administration, management, exercises).
Definition: The process of checking, inspecting, reviewing, observing, studying, or analyzing one or more assessment objects to facilitate understanding, achieve clarification, or obtain evidence, the results of which are used to support the determination of security and privacy control existence, functionality, correctness, completeness, and potential for improvement over time.
Supplemental guidance: Typical assessor actions may include, for example, reviewing information security policies, plans, and procedures; analyzing system design documentation and interface specifications; observing system backup operations; reviewing the results of contingency plan exercises; observing incident response activities; studying technical manuals and user/administrator guides; checking, studying, or observing the operation of an information technology mechanism in the information system hardware/software; or checking, studying, or observing physical security measures related to the operation of an information system.
SCAP-validated tools that support the OCIL component specification may be used to automate the collection of assessment objects from specific, responsible individuals within an organization. The resulting information can then be examined by assessors during the security and privacy control assessments.
Attributes: Depth and coverage:
• The depth attribute addresses the rigor of and level of detail in the examination process. There are three possible values for the depth attribute:
• Basic:
- Basic examination: Examination that consists of high-level reviews, checks, observations, or inspections of the assessment object. This type of examination is conducted using a limited body of evidence or documentation (e.g., functional-level descriptions for mechanisms; high-level process descriptions for activities; actual documents for specifications). Basic examinations provide a level of understanding of the security and privacy controls necessary for determining whether the controls are implemented and free of obvious errors.
• Focused:
- Focused examination: Examination that consists of high-level reviews, checks, observations, or inspections and more in-depth studies/analyses of the assessment object. This type of examination is conducted using a substantial body of evidence or documentation (e.g., functional-level descriptions and, where appropriate and available, high-level design information for mechanisms; high-level process descriptions and implementation procedures for activities; the actual documents and related documents for specifications). Focused examinations provide a level of understanding of the security and privacy controls necessary for determining whether the controls are implemented and free of obvious errors and whether there are increased grounds for confidence that the controls are implemented correctly and operating as intended.
• Comprehensive:
- Comprehensive examination: Examination that consists of high-level reviews, checks, observations, or inspections and more in-depth, detailed, and thorough studies/analyses of the assessment object. This type of examination is conducted using an extensive body of evidence or documentation (e.g., functional-level descriptions and, where appropriate and available, high-level design information, low-level design information, and implementation information for mechanisms; high-level process descriptions and detailed implementation procedures for activities; the actual documents and related documents for specifications (while additional documentation is likely for mechanisms when moving from basic to focused to comprehensive examinations, the documentation associated with specifications and activities may be the same or similar for focused and comprehensive examinations, with the rigor of the examinations of these documents being increased at the comprehensive level)). Comprehensive examinations provide a level of understanding of the security and privacy controls necessary for determining whether the controls are implemented and free of obvious errors and whether there are further increased grounds for confidence that the controls are implemented correctly and operating as intended on an ongoing and consistent basis, and that there is support for continuous improvement in the effectiveness of the controls.
• The coverage attribute addresses the scope or breadth of the examination process and includes the types of assessment objects to be examined, the number of objects to be examined (by type), and specific objects to be examined. (The organization, considering a variety of factors (e.g., available resources, importance of the assessment, the organization’s overall assessment goals and objectives), confers with assessors and provides direction on the type, number, and specific objects to be examined for the particular attribute value described).
There are three possible values for the coverage attribute:
(i) Basic
- Basic examination: Examination that uses a representative sample of assessment objects (by type and number within type) to provide a level of coverage necessary for determining whether the security and privacy controls are implemented and free of obvious errors.
(ii) Focused
- Focused examination: Examination that uses a representative sample of assessment objects (by type and number within type) and other specific assessment objects deemed particularly important to achieving the assessment objective to provide a level of coverage necessary for determining whether the security and privacy controls are implemented and free of obvious errors and whether there are increased grounds for confidence that the controls are implemented correctly and operating as intended.
(iii) Comprehensive
- Comprehensive examination: Examination that uses a sufficiently large sample of assessment objects (by type and number within type) and other specific assessment objects deemed particularly important to achieving the assessment objective to provide a level of coverage necessary for determining whether the security and privacy controls are implemented and free of obvious errors and whether there are further increased grounds for confidence that the controls are implemented correctly and operating as intended on an ongoing and consistent basis, and that there is support for continuous improvement in the effectiveness of the controls.6
Observations
Considerable data can be collected simply by observing. Direct observation is an underused and valuable method for collecting evaluation information. “Seeing” and “listening” are key to observation. Observation provides the opportunity to document activities, behavior, and physical aspects without having to depend on people's willingness and ability to respond to interview questions, or on isolating the exact document relevant to a particular control or risk. As described above for the examination phase of testing, observing practices and the implementation of procedures is often a good evaluation method for reviewing control implementation, especially in the operational control families. Often we are observing system backup operations, incident response activities, the operation of an information technology mechanism in the information system hardware/software, or physical security measures related to the operation of an information system.
The two primary methods of observation that I employ are system-level demonstrations and what I term “security walk-throughs.” System demonstrations are a great way to observe the actions of the system under review, especially for systems that are new or recently modified. The demonstration should include log-in actions, standard system processing, and system-level interactions and reporting. Many areas of security control implementation can be observed during demonstrations. Access controls and identification and authorization actions to log into a system, password protection, and methods of system and communications protection are all areas of significance and focus during an observation of a system demonstration of functionality and processing.
A “security walk-through inspection” consists of the observer walking through a facility and looking for the methods of physical security employed within it. Identification of physical access controls for sections of the facility, methods of intrusion detection, deployed fire detection and suppression components, and the HVAC controls used in the facility are all “observed” and accounted for during the walk-through and recorded in the assessment report as examined through observation. I have employed this technique to conduct focused examinations.
Document Reviews
Documentation review determines if the technical aspects of policies and procedures are current and comprehensive. These documents provide the foundation for an organization’s security posture, but are often overlooked during technical assessments. Security groups within the organization should provide assessors with appropriate documentation to ensure a comprehensive review. Documents reviewed for technical accuracy and completeness include security policies, architectures, and requirements; standard operating procedures; system security plans and authorization agreements; memoranda of understanding and agreement for system interconnections; and incident response plans.
Documentation review can discover gaps and weaknesses that could lead to missing or improperly implemented security controls. Assessors typically verify that the organization’s documentation is compliant with standards and regulations such as FISMA, and look for policies that are deficient or outdated. Common documentation weaknesses include OS security procedures or protocols that are no longer used, and failure to include a new OS and its protocols. Documentation review does not ensure that security controls are implemented properly – only that the direction and guidance exist to support security infrastructure.
Results of documentation review can be used to fine-tune other testing and examination techniques. For example, if a password management policy has specific requirements for minimum password length and complexity, this information can be used to configure password-cracking tools for more efficient performance.
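As a rough illustration of that example, a candidate wordlist can be pruned to only policy-compliant passwords before feeding it to a cracking tool, so the tool does not waste time on candidates the policy forbids. The policy values below (minimum length 8, at least one digit and one uppercase letter) are hypothetical:

```python
import re

# Hypothetical requirements drawn from a password management policy.
MIN_LEN = 8
RULES = [re.compile(r"\d"), re.compile(r"[A-Z]")]

def meets_policy(candidate: str) -> bool:
    """True if the candidate satisfies every requirement of the stated policy."""
    return len(candidate) >= MIN_LEN and all(r.search(candidate) for r in RULES)

def prune_wordlist(words):
    """Keep only candidates a compliant password could be; shrinks the search space."""
    return [w for w in words if meets_policy(w)]

print(prune_wordlist(["password", "Passw0rd!", "S3curePass", "abc"]))
```

Real cracking tools express such constraints through their own rule or mask syntax; this sketch only shows the underlying idea of letting the documented policy narrow the test.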
1. “Gap” analysis process: The initial document reviews that start the assessment often give the assessor and the assessment team data for focusing the effort and pinpointing areas of potential concern to review, examine, evaluate, and test in the system or application under review. These reviews allow the assessment team to identify recently repaired controls, areas of volatility in controls and protection, and areas where protection may have been overlooked or weakened. This leads to a “gap analysis,” which determines what requirements, operational criteria, security objectives, and compliance needs are and are not being met by the system under review. I have often used this process to discover areas of weakness in policies, procedures, and reporting for systems and applications. The gap analysis process I typically follow is described below:
(a) Review each authorization package using the step-by-step methodology outlined below. Using this defined process for review promotes consistency and quality of package analysis.
- Review current documents for completeness and accuracy, based on the established security baseline.
- Review current documents for System Security Classification and level determination.
- Catalog current documents into security areas.
- Develop mapping for current documents to FISMA, DOD IA regulations (if applicable), NIST guidance, US governmental agency regulations (if applicable), and FIPS standards (DODI 8510.01, SP 800-37, SP 800-53, SP 800-53A, FIPS-199, etc.).
- Review current documents for mapping status.
- Identify preliminary documents, policies, plans, or procedures with questionable mapping status.
- Research any missing or unclear policies, procedures, plans, or guidelines in support documentation.
- Develop questions and issue report for customer remediation, answers, and identification.
- Identify agency standards and guidelines for document creation and development.
- Develop missing and required policies, plans, or procedures, as required such as:
(i) System of Record Notice (SORN) to register the system
(ii) Residual risk assessment
(iii) Plan of Action and Milestone (POAM)
(iv) Any additional Assessment and Authorization (A&A)-related artifacts as part of the submission package, such as:
• Security Concept of Operations (CONOPS)
• Security policies
• Security architecture drawings and documents
• User Security Manual and Standing Operating Procedures (USM/SOP)
• Continuity of Operations (COOP)
• Incident Response Plan
• Contingency Plan
• Configuration Management Plan
- Submit these developed documents for review, comment, revision, and approval.
- Once all documents and questions are answered, review vulnerability scans for actual technical controls implemented versus controls documented.
- Develop report on controls assessment.
- Complete required RMF certification and accreditation worksheets, documents, and forms.
- Develop SCA ATO Recommendation and Risk Assessment Reports, in accordance with (IAW) the agency requirements.
- The completed review is then submitted for an internal quality assurance review of consistency, completeness, and correctness (the “3C” review).
(i) If the consistency, completeness, and correctness of the documentation meet quality standards, the documentation is passed on to the final submittal phase.
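The scan-versus-documentation comparison in the steps above amounts to a set difference between the controls claimed in the authorization package and the controls confirmed by technical evidence. A minimal sketch, with invented control identifiers:

```python
# Hypothetical inputs: control IDs claimed in the security plan versus
# control IDs confirmed by vulnerability-scan and configuration evidence.
documented = {"AC-2", "AU-6", "CM-6", "IA-5"}
implemented = {"AC-2", "CM-6", "SC-7"}

gaps = sorted(documented - implemented)          # documented but not observed
undocumented = sorted(implemented - documented)  # observed but not in the plan

print("Gaps:", gaps)
print("Undocumented:", undocumented)
```

Both lists feed the controls assessment report: gaps become findings, while undocumented controls indicate the package itself needs updating.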
Testing
Testing involves hands-on work with systems and networks to identify security vulnerabilities, and can be executed across an entire enterprise or on selected systems. The use of scanning and penetration techniques can provide valuable information on potential vulnerabilities and predict the likelihood that an adversary or intruder will be able to exploit them. Testing also allows organizations to measure levels of compliance in areas such as patch management, password policy, and configuration management.
Although testing can provide a more accurate picture of an organization’s security posture than what is gained through examinations, it is more intrusive and can impact systems or networks in the target environment. The level of potential impact depends on the specific types of testing techniques used, which can interact with the target systems and networks in various ways—such as sending normal network packets to determine open and closed ports, or sending specially crafted packets to test for vulnerabilities. Any time that a test or tester directly interacts with a system or network, the potential exists for unexpected system halts and other denial of service conditions. Organizations should determine their acceptable levels of intrusiveness when deciding which techniques to use. Excluding tests known to create denial of service conditions and other disruptions can help reduce these negative impacts.
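The “normal network packets” case mentioned above, determining whether a port is open or closed, can be sketched as an ordinary TCP connection attempt. This is a simplified illustration, not a full scanner, and should only be run against hosts the assessor is authorized to test:

```python
import socket

def check_port(host: str, port: int, timeout: float = 1.0) -> bool:
    """Attempt an ordinary TCP connection; True means the port accepted it.
    This is the normal-packet case, not crafted-packet vulnerability testing."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

# Example (run only with authorization):
# check_port("127.0.0.1", 22)
```

Even this benign probe interacts directly with the target, which is why the rules of engagement should state which hosts and ports are in scope.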
Testing does not provide a comprehensive evaluation of the security posture of an organization, and often has a narrow scope because of resource limitations—particularly in the area of time. Malicious attackers, on the other hand, can take whatever time they need to exploit and penetrate a system or network. Also, while organizations tend to avoid using testing techniques that impact systems or networks, attackers are not bound by this constraint and use whatever techniques they feel necessary. As a result, testing is less likely than examinations to identify weaknesses related to security policy and configuration. In many cases, combining testing and examination techniques can provide a more accurate view of security.7
Method: Test.
Assessment objects: Mechanisms (e.g., hardware, software, and firmware) and activities (e.g., system operations, administration, management; exercises).
Definition: The process of exercising one or more assessment objects under specified conditions to compare actual with expected behavior, the results of which are used to support the determination of security and privacy control existence, functionality, correctness, completeness, and potential for improvement over time. (Testing is typically used to determine if mechanisms or activities meet a set of predefined specifications. It can also be performed to determine characteristics of a security or privacy control that are not commonly associated with predefined specifications, with an example of such testing being penetration testing.)
Supplemental guidance: Typical assessor actions may include, for example, testing access control, identification and authentication, and audit mechanisms; testing security configuration settings; testing physical access control devices; conducting penetration testing of key information system components; testing information system backup operations; testing incident response capability; and exercising contingency planning capability.
SCAP-validated tools can be used to automate the collection of assessment objects and evaluate these objects against expected behavior. The use of SCAP is specifically relevant to the testing of mechanisms that involve assessment of actual machine state. The National Checklist Program catalogs a number of SCAP-enabled checklists that are suitable for assessing the configuration posture of specific operating systems and applications. SCAP-validated tools can use these checklists to determine the aggregate compliance of a system against all of the configuration settings in the checklist (e.g., CM-6) or specific configurations that are relevant to a security or privacy control that pertains to one or more configuration settings. SCAP-validated tools can also determine the absence of a patch or the presence of a vulnerable condition. The results produced by the SCAP tools can then be examined by assessors as part of the security and privacy control assessments.
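Aggregate compliance as described above reduces to checking that every applicable rule in the checklist passed. A minimal sketch with hypothetical rule identifiers and SCAP-style result values:

```python
# Hypothetical per-rule results, as an SCAP-style tool might report them
# for a CM-6 configuration-settings checklist.
results = {
    "xccdf_rule_password_max_age": "pass",
    "xccdf_rule_audit_enabled": "pass",
    "xccdf_rule_telnet_disabled": "fail",
    "xccdf_rule_fips_crypto": "notapplicable",
}

def aggregate(rule_results: dict):
    """Overall compliance (every applicable rule passes) plus the pass ratio."""
    applicable = {k: v for k, v in rule_results.items() if v != "notapplicable"}
    if not applicable:
        return True, 1.0  # vacuously compliant if nothing applies
    passed = sum(1 for v in applicable.values() if v == "pass")
    return passed == len(applicable), passed / len(applicable)

compliant, ratio = aggregate(results)
```

Real SCAP result formats (XCCDF/ARF) are richer than this dictionary, but the roll-up logic assessors examine is essentially this.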
Attributes: Depth and coverage:
• The depth attribute addresses the types of testing to be conducted. There are three possible values for the depth attribute:
• Basic testing:
- Basic testing: Test methodology (also known as black box testing) that assumes no knowledge of the internal structure and implementation detail of the assessment object. This type of testing is conducted using a functional specification for mechanisms and a high-level process description for activities. Basic testing provides a level of understanding of the security and privacy controls necessary for determining whether the controls are implemented and free of obvious errors.
• Focused testing:
- Focused testing: Test methodology (also known as gray box testing) that assumes some knowledge of the internal structure and implementation detail of the assessment object. This type of testing is conducted using a functional specification and limited system architectural information (e.g., high-level design) for mechanisms and a high-level process description and high-level description of integration into the operational environment for activities. Focused testing provides a level of understanding of the security and privacy controls necessary for determining whether the controls are implemented and free of obvious errors and whether there are increased grounds for confidence that the controls are implemented correctly and operating as intended.
• Comprehensive testing:
- Comprehensive testing: Test methodology (also known as white box testing) that assumes explicit and substantial knowledge of the internal structure and implementation detail of the assessment object. This type of testing is conducted using a functional specification, extensive system architectural information (e.g., high-level design, low-level design) and implementation representation (e.g., source code, schematics) for mechanisms, and a high-level process description and detailed description of integration into the operational environment for activities. Comprehensive testing provides a level of understanding of the security and privacy controls necessary for determining whether the controls are implemented and free of obvious errors and whether there are further increased grounds for confidence that the controls are implemented correctly and operating as intended on an ongoing and consistent basis, and that there is support for continuous improvement in the effectiveness of the controls.
• The coverage attribute addresses the scope or breadth of the testing process and includes the types of assessment objects to be tested, the number of objects to be tested (by type), and specific objects to be tested. (The organization, considering a variety of factors (e.g., available resources, importance of the assessment, the organization’s overall assessment goals and objectives), confers with assessors and provides direction on the type, number, and specific objects to be tested for the particular attribute value described. For mechanism-related testing, the coverage attribute also addresses the extent of the testing conducted (e.g., for software, the number of test cases and modules tested; for hardware, the range of inputs, number of components tested, and range of environmental factors over which the testing is conducted.)
There are three possible values for the coverage attribute:
• Basic:
- Basic testing: Testing that uses a representative sample of assessment objects (by type and number within type) to provide a level of coverage necessary for determining whether the security and privacy controls are implemented and free of obvious errors.
• Focused:
- Focused testing: Testing that uses a representative sample of assessment objects (by type and number within type) and other specific assessment objects deemed particularly important to achieving the assessment objective to provide a level of coverage necessary for determining whether the security and privacy controls are implemented and free of obvious errors and whether there are increased grounds for confidence that the controls are implemented correctly and operating as intended.
• Comprehensive:
- Comprehensive testing: Testing that uses a sufficiently large sample of assessment objects (by type and number within type) and other specific assessment objects deemed particularly important to achieving the assessment objective to provide a level of coverage necessary for determining whether the security and privacy controls are implemented and free of obvious errors and whether there are further increased grounds for confidence that the controls are implemented correctly and operating as intended on an ongoing and consistent basis, and that there is support for continuous improvement in the effectiveness of the controls.
Automated
NIST SP 800-115 has a large discussion of automated testing techniques and tools available to run against systems to verify their compliance and programmatic actions against known baselines. These test mechanisms include vulnerability scanning, file and directory integrity checking, penetration testing, and others. We will discuss these tools and techniques in Chapter 10.
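One of the mechanisms named above, file and directory integrity checking, reduces to comparing cryptographic digests of files against a recorded baseline. A minimal sketch, assuming SHA-256 digests and a simple in-memory baseline; production tools add signed baselines, exclusion lists, and metadata checks:

```python
import hashlib
from pathlib import Path

def snapshot(root: str) -> dict:
    """Record a SHA-256 digest for every file under root (the baseline)."""
    return {str(p): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in Path(root).rglob("*") if p.is_file()}

def diff(baseline: dict, current: dict) -> dict:
    """Classify changes relative to the baseline."""
    return {
        "added":    sorted(set(current) - set(baseline)),
        "removed":  sorted(set(baseline) - set(current)),
        "modified": sorted(p for p in baseline
                           if p in current and baseline[p] != current[p]),
    }
```

An assessor would run `snapshot` at baseline time, store the result securely, and rerun `diff` against a fresh snapshot to surface unauthorized changes.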
Manual
Manual testing is often used by assessors to review processing and confirm output business rules by applying known test data inputs and either manually walking through the process a step at a time or allowing the system to run with the manually developed test “scripts” and reading the results. Once the results are produced, a comparison is performed to ensure there are no anomalies between the expected results and the actual results. These manual processes have a long history within the auditing community, wherein auditors would apply test data to a system, run Computer-Aided Audit Tools (CAAT) or manually generated code snippets known as “scripts” against the system with the test data loaded, and compare the output with the expected results to verify processing, business rules, and coded logic patterns within the system under test.
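The expected-versus-actual comparison described above can be sketched as a small test harness. The business rule, the test cases, and the expected values below are invented for illustration:

```python
# Hypothetical business rule under test: orders over $100 get a 10% discount.
def apply_discount(total: float) -> float:
    return round(total * 0.9, 2) if total > 100 else total

# Manually developed test "script": known inputs paired with expected outputs.
CASES = [(50.00, 50.00), (100.00, 100.00), (150.00, 135.00)]

def run_cases(cases):
    """Return (input, expected, actual) for every case where the two disagree."""
    return [(inp, exp, apply_discount(inp))
            for inp, exp in cases if apply_discount(inp) != exp]

anomalies = run_cases(CASES)  # an empty list means every case matched
```

In an actual assessment the function under test is the deployed system, not local code, but the harness shape, known inputs, expected outputs, and an anomaly report, is the same.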