Glossary and Acronyms

Abuse: Malicious misuse, with the objective of intentional denial, alteration, or destruction. See misuse.

Acceptance testing: A type of testing used to determine whether software is acceptable to its actual users.

Access: A specific type of interaction between a subject and an object that results in the flow of information from one to the other.

Access control mechanisms: Hardware or software features, operating procedures, management procedures, and various combinations thereof that are designed to detect and prevent unauthorized access and to permit authorized access in an automated system.

Access control: The process of limiting access to system or software resources only to authorized users, programs, processes, or other systems (on a network). This term is synonymous with controlled access and limited access.

Access list: A list of users, programs, and/or processes and the specifications of access categories to which each is assigned; a list denoting which users, programs, and/or processes have what access privileges to a particular resource.
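
An access list can be sketched as a mapping from each resource to the access types granted to each subject. This is a minimal illustration only; the resource and subject names are invented for the example:

```python
# Hypothetical sketch of an access list: resource -> subject -> access types.
# All names here are illustrative, not from any real system.
access_list = {
    "payroll.db": {"alice": {"read", "write"}, "audit_svc": {"read"}},
    "hmi_config": {"operator1": {"read"}, "engineer1": {"read", "write"}},
}

def is_authorized(resource, subject, access_type):
    """Return True if the subject holds the given access type on the resource."""
    return access_type in access_list.get(resource, {}).get(subject, set())

assert is_authorized("payroll.db", "alice", "write")
assert not is_authorized("payroll.db", "audit_svc", "write")  # read-only subject
```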

Access point (AP): A wireless LAN transceiver interface between a wireless network and a wired network. Access points forward frames between wireless devices and hosts on the LAN.

Access type: The nature of an access right to a particular device, program, or file (e.g., read, write, execute, append, modify, delete, or create).

Accountability: A property that allows the activities of an IT system to be audited and traced to persons or processes that may then be held responsible for their actions. Accountability includes authenticity and nonrepudiation.

Accreditation: A formal declaration by the designated approving authority (DAA) that the automated information system (AIS) is approved to operate in a particular security mode by using a prescribed set of safeguards. Accreditation is the official management authorization for operation of an AIS and is based on the certification process as well as other management considerations. The accreditation statement affixes security responsibility with the DAA and shows that due care has been taken for security.

Accreditation authority: Synonymous with designated approving authority.

Acquisition: The act of procurement or purchase of a product or service under license, subscription, or contract.

Advanced Encryption Standard (AES) (originally, Rijndael): A symmetric block cipher with a block size of 128 bits, in which the key can be 128, 192, or 256 bits. The Advanced Encryption Standard replaces the Data Encryption Standard (DES) and was announced on November 26, 2001 as Federal Information Processing Standard Publication 197 (FIPS PUB 197).

AIS: Automated information system.

API: Application programming interface.

Application layer: The top layer of the Open Systems Interconnection (OSI) model, which is concerned with application programs. It provides services such as file transfer and email to the network’s end users.

Application process: An entity, either human or software, that uses the services offered by the application layer of the OSI reference model.

Application programming interface (API): An interface to a library of software functions. An API is designed to allow software developers to call functions from the library and to make requests of an operating system or another software component.

Application software: Software that accomplishes functions such as database access, electronic mail, and menu prompts.

Architecture: When referring to a computer system, architecture describes the type of components, interfaces, and protocols the system uses and how they fit together. It is the configuration of any equipment or interconnected system or subsystems of equipment that is used in the acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of data or information; includes computers, ancillary equipment, and services, including support services and related resources.

Asset: An object of value. In the context of this book, an asset could be a computer system, a network, a PLC, plant equipment, a controller, a data acquisition system, and so on.

Assurance: A measure of confidence that the security features and architecture of an AIS accurately mediate and enforce the security policy; the grounds for confidence that an IT product or system meets its security objectives.

Asymmetric (public) key encryption: Cryptographic system that employs two keys, a public key and a private key. The public key is made available to anyone wishing to send an encrypted message to an individual holding the corresponding private key of the public-private key pair. Any message encrypted with one of these keys can be decrypted with the other. The private key is always kept private. It should not be possible to derive the private key from the public key.
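
The public/private key relationship can be demonstrated with textbook-sized RSA numbers. This is a toy sketch only; real systems use 2048-bit or larger moduli together with padding schemes such as OAEP:

```python
# Toy RSA with classic textbook parameters, purely to show that a message
# encrypted with one key of the pair decrypts with the other.
p, q = 61, 53
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent
d = pow(e, -1, phi)        # private exponent (modular inverse of e)

m = 65                     # plaintext, must be < n
c = pow(m, e, n)           # encrypt with the public key
assert pow(c, d, n) == m   # decrypt with the private key

# The keys are interchangeable: a signature "encrypts" with the private key
# and anyone can verify with the public key.
s = pow(m, d, n)
assert pow(s, e, n) == m
```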

Asynchronous transmission: The transmission of data at a variable bit rate without the use of an external clock signal; the timing required to decode the data is contained within the communication stream itself.

Attack: The act of trying to bypass security controls on a system. An attack can be active, resulting (or potentially resulting) in data modification, or passive, resulting (or potentially resulting) in the release of data. Note: The fact that an attack is made does not necessarily mean that it will succeed. The degree of success depends on the vulnerability of the system or activity and the effectiveness of existing countermeasures.

Audit trail: A chronological record of system activities that is sufficient to enable the reconstruction, reviewing, and examination of the sequence of environments and activities surrounding or leading to an operation, a procedure, or an event in a transaction from its inception to its final result.

Authenticate: (1) To verify the identity of a user, device, or other entity in a computer system, often as a prerequisite to allowing access to system resources; (2) To verify the integrity of data that have been stored, transmitted, or otherwise exposed to possible unauthorized modification.

Authentication: The process of verifying “who” is at the other end of a transmission.

Authentication device: A device whose identity has been verified during the lifetime of the current link based on the authentication procedure.

Authenticity: A property that allows the claimed identity of a system entity to be validated.

Authorization: The granting of access rights to a user, program, or process.

Automated information system (AIS): An assembly of computer hardware, software, and/or firmware that is configured to collect, create, communicate, compute, disseminate, process, store, and/or control data or information.

Automated information system security: Measures and controls that protect an AIS against denial of service (DoS) and unauthorized (accidental or intentional) disclosure, modification, or destruction of an AIS and data. AIS security includes consideration of all hardware and/or software functions, characteristics, and/or features; operational procedures, accountability procedures, and access controls at the central computer facility, remote computers and terminal facilities; management constraints; physical structures and devices; and personnel and communication controls that are needed to provide an acceptable level of risk for the AIS and for the data and information contained in the AIS. It includes the totality of security safeguards needed to provide an acceptable protection level for an AIS and for data handled by an AIS.

Automated security monitoring: The use of automated procedures to ensure that security controls are not circumvented.

Availability: Refers to the ability of an authorized person or process to have prompt and reliable access to computational resources, when needed.

Availability of data: The condition in which data is in the place needed by the user, at the time the user needs it, and in the form needed by the user.

Backbone network: A network that interconnects other networks.

Back door: Synonymous with trap door.

Backup plan: Synonymous with contingency plan.

Bandwidth: Specification of the amount of the frequency spectrum that is usable for data transfer. In other words, bandwidth is the maximum data rate a signal can attain on the medium without encountering significant attenuation (loss of power). Also, the amount of information that may be sent through a connection.

Bell-LaPadula model: A formal state transition model of computer security policy that describes a set of access control rules. In this formal model, the entities in a computer system are divided into abstract sets of subjects and objects. The notion of a secure state is defined, and each state transition preserves security by moving from secure state to secure state, thereby inductively proving that the system is secure. A system state is defined to be secure only if the permitted access modes of subjects to objects are in accordance with a specific security policy. In order to determine whether a specific access mode is allowed, the clearance of a subject is compared to the classification of the object, and a determination is made as to whether the subject is authorized for the specific access mode. Also see * property (or star property) and simple security property.
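
The two Bell-LaPadula rules can be sketched as simple comparisons over a linear ordering of sensitivity levels. The level labels and function names below are illustrative, and real implementations also handle categories/compartments, which this sketch omits:

```python
# Minimal sketch of Bell-LaPadula access checks over linearly ordered levels.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def can_read(subject_clearance, object_classification):
    # Simple security property: no read up.
    return LEVELS[subject_clearance] >= LEVELS[object_classification]

def can_write(subject_clearance, object_classification):
    # * (star) property: no write down.
    return LEVELS[subject_clearance] <= LEVELS[object_classification]

assert can_read("secret", "confidential")      # reading down is allowed
assert not can_read("confidential", "secret")  # reading up is denied
assert not can_write("secret", "confidential") # writing down is denied
```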

Between-the-lines entry: Unauthorized access obtained by tapping the temporarily inactive terminal of a legitimate user. See piggyback.

Binary digit: See bit.

Biometrics: Access control method in which an individual’s physiological or behavioral characteristics are used to determine that individual’s access to a particular resource.

BIOS: Basic input/output system. The BIOS is the first program to run when the computer is turned on. BIOS initializes and tests the computer hardware, starts the loading of the operating system, and provides a setup utility for changing the computer's hardware configuration.

Bit: Short for binary digit. A single digit number in binary (0 or 1).

Bit decay: The gradual loss of information stored in bits on storage media over time.

Bit splitting: Splitting bits into groups of bits and processing the groups such that after they are recombined the correct result is obtained. A countermeasure to a differential power analysis attack that does not expose the target bits internally to the processor, so the power trace is not affected.
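
A common form of this idea can be illustrated with two XOR shares: the sensitive value is never handled directly, each share on its own is statistically independent of the secret, and XORing the shares back together recovers the original. This is an illustrative sketch of the splitting/recombining step only, not of a full masked implementation:

```python
import secrets

def split(value):
    """Split a byte into two XOR shares; neither share reveals the value."""
    r = secrets.randbits(8)
    return r, value ^ r

def recombine(share1, share2):
    """XOR the shares to restore the original byte."""
    return share1 ^ share2

secret_byte = 0b10110010
s1, s2 = split(secret_byte)
assert recombine(s1, s2) == secret_byte
```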

Black box test: A test of system security in which an ethical hacking team has no knowledge of the target network.

Black hat hacker: A hacker who conducts unethical and illegal attacks against information systems to gain unauthorized access to sensitive information.

Block cipher: A symmetric key algorithm that operates on a fixed-length block of plaintext and transforms it into a fixed-length block of ciphertext. A block cipher is obtained by segregating plaintext into blocks of n characters or bits and applying the same encryption algorithm and key to each block.
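
The segregate-into-blocks structure can be sketched as follows. The per-block "cipher" here is just an XOR with the key and is not secure; real block ciphers such as AES or DES apply many rounds of substitution and permutation to each block:

```python
# Toy illustration of block segmentation; the block transform is a
# placeholder, not a real cipher.
BLOCK = 8  # block size in bytes

def encrypt_block(block, key):
    return bytes(b ^ k for b, k in zip(block, key))

def encrypt(plaintext, key):
    # Pad to a whole number of blocks, then apply the same transform
    # and key to each block.
    pad = BLOCK - len(plaintext) % BLOCK
    plaintext += bytes([pad]) * pad
    return b"".join(encrypt_block(plaintext[i:i + BLOCK], key)
                    for i in range(0, len(plaintext), BLOCK))

ct = encrypt(b"attack at dawn", bytes(range(BLOCK)))
assert len(ct) % BLOCK == 0   # ciphertext is a whole number of blocks
```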

Browsing: The act of searching through storage to locate or acquire information without necessarily knowing the existence or the format of the information being sought.

BSI ISO/IEC 17799:2000 (BS 7799-1:2000), “Information Technology—Code of Practice for Information Security Management,” British Standards Institution, London, UK: A standard intended to “provide a comprehensive set of controls comprising best practices in information security.” ISO refers to the International Organization for Standardization, and IEC is the International Electrotechnical Commission.

Buffer overflow: A condition in which more input is placed into a buffer or data holding area than the allowed or allocated capacity, overwriting other information. Such a condition is exploited by attackers to crash or gain control of a system.
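
Python itself is bounds-checked, but the memory layout behind an overflow can be simulated: a fixed-size buffer sits next to other data, and a copy routine with no length check spills past the buffer's end. The layout and names below are a pedagogical model of an unchecked copy in a lower-level language, not real stack memory:

```python
# Simulated memory: bytes 0-7 are an 8-byte buffer; bytes 8-11 model
# adjacent data (e.g., a saved return address on a stack).
memory = bytearray(12)
RETURN_ADDR = slice(8, 12)
memory[RETURN_ADDR] = b"\xde\xad\xbe\xef"

def unchecked_copy(dest_offset, data):
    # No check of len(data) against the 8-byte buffer: extra bytes
    # overwrite whatever lies beyond it.
    memory[dest_offset:dest_offset + len(data)] = data

unchecked_copy(0, b"A" * 12)              # 12 bytes into an 8-byte buffer
assert memory[RETURN_ADDR] == b"AAAA"     # adjacent data was clobbered
```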

Bug: An error, defect, mistake, vulnerability, failure, or fault in a computer system or in software.

Byte: A set of bits, usually eight, that represent a single character.

C&A: Certification and accreditation.

CA: Certificate authority/agent. See certificate authority.

Calls: The operations performed by an application to perform a task.

Call graph: A representation of the calling relationships among a program’s functions or procedures, showing which calls which.

Capability: A protected identifier that identifies an object and specifies the access rights allowed to the accessor who possesses the capability. In a capability-based system, access to protected objects (such as files) is granted if the would-be accessor possesses a capability for the object.

CAPI: See cryptographic application programming interface.

Category: A restrictive label that has been applied to classified or unclassified data as a means of increasing the protection of the data and further restricting its access.

CBC: See cipher-block chaining.

CC: See Common Criteria.

CDDI: See copper data distributed interface.

Cell: See lines, units, cells.

Central processing unit (CPU): The microprocessor unit or units responsible for interpreting and executing instructions in a computer system.

Computer Emergency Response Team Coordination Center (CERT/CC): A unit of the Carnegie Mellon University Software Engineering Institute (SEI). SEI is a federally funded R&D center. CERT’s mission is to alert the Internet community to vulnerabilities and attacks and to conduct research and training in the areas of computer security, including incident response.

Certification: The comprehensive evaluation of the technical and nontechnical security features of an AIS and other safeguards, made in support of the accreditation process, that establishes the extent to which a particular design and implementation meet a specified set of security requirements.

Certificate authority (CA): A trusted entity that issues, manages, and revokes digital certificates, vouching for the binding between a public key and the identity of its holder. (Distinct from the certification agent or authority: the official responsible for performing the comprehensive evaluation of the technical and nontechnical security features of an IT system, made in support of the accreditation process.)

Certificate revocation list (CRL): A list of certificates that have been revoked or are no longer valid and should not be relied upon.

Channel: Specific communication link established within a communication conduit. See conduit.

Cipher-block chaining (CBC): A block cipher mode of operation, used, for example, with the Data Encryption Standard (DES) on plaintext blocks 64 bits in length. Each block of plaintext is Exclusive ORed (XORed) with the previous ciphertext block (or with an initialization vector for the first block) before being encrypted. See Exclusive OR.
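
The chaining structure can be sketched with a toy XOR "cipher" standing in for the real block cipher; only the chaining, not the stand-in cipher, is realistic here. The key property it shows: identical plaintext blocks produce different ciphertext blocks:

```python
BLOCK = 8

def E(block, key):
    # Stand-in for a real block cipher such as DES or AES.
    return bytes(b ^ k for b, k in zip(block, key))

def cbc_encrypt(blocks, key, iv):
    ct, prev = [], iv
    for p in blocks:
        # XOR each plaintext block with the previous ciphertext block
        # (the IV for the first block) before encrypting it.
        c = E(bytes(a ^ b for a, b in zip(p, prev)), key)
        ct.append(c)
        prev = c
    return ct

key, iv = bytes(8), bytes([0x55] * 8)
c1, c2 = cbc_encrypt([b"BLOCK--1", b"BLOCK--1"], key, iv)
assert c1 != c2   # identical plaintext blocks, different ciphertext blocks
```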

Cipher: A cryptographic transformation that operates on characters or bits.

Ciphertext or cryptogram: An unintelligible encrypted message.

Client: A computer that accesses a server’s resources.

Client/server architecture: A network system design in which a processor or computer designated as a file server or database server provides services to other client processors or computers. Applications are distributed between a host server and a remote client.

Cluster: A group of computers linked together over a fast local area network or other means. Clustered computers work closely together such that they act and appear like a single large computer. Clusters are typically created to improve availability, performance, or redundancy beyond that provided by a single computer.

Collision: An event in which simultaneous transmissions on a communications medium interfere with one another or “collide.”

Common Criteria (CC): A standard for specifying and evaluating the features of computer products and systems.

Common Object Request Broker Architecture (CORBA): A standard that uses the object request broker (ORB) to implement exchanges among objects in a heterogeneous, distributed environment.

Communications security (COMSEC): Measures and controls taken to deny unauthorized persons information derived from telecommunications and to ensure the authenticity of such telecommunications. Communications security includes cryptosecurity, transmission security, emission security, and physical security of COMSEC material and information.

Component: A constituent element of a larger system. In software, a component is a functional part of a larger program.

Component Object Model (COM): A Microsoft technology that enables software components to communicate with each other.

Compromise: A violation of a system’s security policy such that unauthorized disclosure of sensitive information might have occurred.

Compromising emanations: Unintentional data-related or intelligence-bearing signals that, when intercepted and analyzed, disclose the information transmitted, received, handled, or otherwise processed by any information processing equipment. See TEMPEST.

Computer abuse: The misuse, alteration, disruption, or destruction of data-processing resources. The key is that computer abuse is intentional and improper.

Computer cryptography: The use of a crypto-algorithm in a computer, microprocessor, or microcomputer to perform encryption or decryption in order to protect information or to authenticate users, sources, or information.

Computer facility: The physical structure housing data processing operations.

Computer forensics: Collection of information from and about computer systems that is admissible in a court of law.

Computer fraud: Computer-related crimes involving deliberate misrepresentation, alteration, or disclosure of data in order to obtain something of value (usually monetary gain). A computer system must have been involved in the perpetration or cover-up of the act or series of acts. A computer system might have been involved through improper manipulation of input data, output or results, application programs, data files, computer operations, communications, computer hardware, systems software, or firmware.

Computer security (COMPUSEC): Synonymous with automated information system security.

Computing environment: The total environment in which an automated information system, network, or component operates. The environment includes physical, administrative, and personnel procedures as well as communication and networking relationships with other information systems.

COMSEC: See communications security.

Concealment: Keeping a secret attribute of a program hidden to prevent it from being discovered by an attacker.

Conduit: The means by which something is transmitted or conveyed, such as information.

Confidentiality: Assurance that information is not disclosed to unauthorized persons, processes, or devices. The concept of holding sensitive data in confidence, limited to an appropriate set of individuals or organizations.

Configuration control: The process of controlling modifications to the system’s hardware, firmware, software, and documentation in a way that provides sufficient assurance that the system is protected against the introduction of improper modifications prior to, during, and after system implementation. Compare with configuration management.

Configuration management: The management of security features and assurances through control of changes made to a system’s hardware, software, firmware, documentation, test, test fixtures, and test documentation throughout the development and operational life of the system. Compare with configuration control.

Confinement: The prevention of the leaking of sensitive data from a program.

Confinement property: Synonymous with star property (* property).

Conformance testing: Planned activities to ensure that software processes and products conform to applicable requirements, standards, and procedures.

Connection-oriented service: Service that establishes a logical connection that provides flow control and error control between two stations that need to exchange data.

Connectivity: The condition of establishing and maintaining a connection between two or more points in a telecommunications system.

Containment: Preventing a successful attack on a software system from spreading to other parts of an organization’s computing resources.

Contamination: The intermixing of data at different sensitivity and need-to-know levels. The lower-level data is said to be contaminated by the higher-level data, with the result that the contaminating (higher-level) data might not receive the required level of protection.

Contingency management: Establishing actions to be taken before, during, and after a threatening incident.

Contingency plan: A plan for emergency response, backup operations, and post-disaster recovery maintained by an activity as a part of its security program; this plan ensures the availability of critical resources and facilitates the continuity of operations in an emergency situation. Synonymous with backup plan, disaster plan, emergency plan.

Control center: Central location used to operate a set of physical assets. Note: Infrastructure industries typically use one or more control centers to supervise or coordinate their operations. If there are multiple control centers (e.g., a backup center at a separate site), they are typically connected via a wide area network. The control center in such industries typically contains the SCADA host computers and associated operator display devices plus ancillary information systems such as a historian.

Control equipment: Class that includes distributed control systems, programmable logic controllers, SCADA systems, associated operator interface consoles, and field sensing and control devices used to manage and control the process. Note: The term also includes fieldbus networks where control logic and algorithms are executed on intelligent electronic devices that coordinate actions with each other, as well as systems used to monitor the process and the systems used to maintain the process.

Control network: Time-critical network that is typically connected to equipment that controls physical processes. Note: The control network can be subdivided into zones, and there can be multiple separate control networks within one company or site.

Continuity of operations: Maintenance of essential functions and services under a broad range of circumstances.

Controlled access: Synonymous with access control.

Controlled sharing: The condition that exists when access control is applied to all users and components of a system. See access control.

Copper Data Distributed Interface (CDDI): A version of FDDI specifying the use of unshielded twisted pair wiring. See FDDI.

CORBA: See Common Object Request Broker Architecture.

Correctness: If software performs all of its intended functions as specified, it is said to be correct, and exhibits the property of correctness.

Cost-risk analysis: The assessment of the cost of providing data protection for a system versus the cost of losing or compromising the data.

COTS: Commercial off-the-shelf. Refers to commercial items that are sold in substantial quantities in the commercial marketplace.

Countermeasure: Any action, device, procedure, technique, or other reactive measure that reduces the vulnerability of or threat to a system.

Covert channel: A communications channel that enables two cooperating processes to transfer information in a manner that violates the system’s security policy.

Covert storage channel: A covert channel that comprises the direct or indirect writing of a storage location by one process and the direct or indirect reading of the storage location by another process. Covert storage channels typically involve a finite resource (e.g., sectors on a disk) shared by two subjects at different security levels.

Covert timing channel: A covert channel in which one process signals information to another by modulating its own use of system resources (e.g., CPU time) in such a way that this manipulation affects the real response time observed by the second process.

CPU: See central processing unit.

CRC: See cyclic redundancy check.

Criteria: Standards by which something can be judged or decided. Also, see DoD Trusted Computer System Evaluation Criteria (TCSEC).

CRL: See certificate revocation list.

Cryptanalysis: The formal term for attacking cryptographic systems. In cryptanalysis, an attacker attempts to obtain the original, unencrypted message (plaintext) from the encrypted message (ciphertext).

Crypto-algorithm: A well-defined procedure, sequence of rules, or steps used to produce a keystream or ciphertext from plaintext, and vice versa. A step-by-step procedure that is used to encipher plaintext and decipher ciphertext. Also called a cryptographic algorithm.

Cryptographic application programming interface (CAPI): An interface to a library of software functions that provide security and cryptography services. CAPI is designed for software developers to call functions from the library, which makes it easier to implement security services.

Cryptography: The principles, means, and methods for rendering information unintelligible and for restoring encrypted information to intelligible form. The word cryptography comes from the Greek kryptos, meaning “hidden,” and graphein, “to write.”

Cryptosystem: A set of transformations from a message space to a ciphertext space. This system includes all cryptovariables (keys), plaintexts, and ciphertexts associated with the transformation algorithm (crypto-algorithm).

Cryptovariable: See key.

CSS: See DeCSS.

Cyclic redundancy check (CRC): A common error-detection process used in control systems. A mathematical operation is applied to the data when it is transmitted, and the result is appended to the frame. Upon receipt, the same operation is performed on the data and the result is compared with the received CRC. A mismatch indicates a very high probability that an error occurred during transmission.
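
The compute/append/verify cycle can be sketched with the standard CRC-32 from Python's standard library (the payload contents are invented for the example):

```python
import zlib
import struct

# Transmitter side: compute the CRC over the payload and append it.
payload = b"setpoint=42.0"
frame = payload + struct.pack(">I", zlib.crc32(payload))  # 4-byte CRC

def check(frame):
    """Receiver side: recompute the CRC and compare with the received one."""
    data, (received_crc,) = frame[:-4], struct.unpack(">I", frame[-4:])
    return zlib.crc32(data) == received_crc

assert check(frame)                            # intact frame passes
corrupted = b"setpoint=43.0" + frame[-4:]      # one changed byte in transit
assert not check(corrupted)                    # mismatch detected
```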

DAA: See designated approving authority.

Damage potential: The level of harm the attacker can cause to a system in conducting an attack.

DAC: See discretionary access control.

Data dictionary: A centralized repository of information about data (its meaning, relationships, origin, usage, and format) that supports the analysis, design, and development of software and good software engineering practices.

Data Encryption Standard (DES): A cryptographic algorithm for the protection of unclassified data, published in Federal Information Processing Standard (FIPS) 46. The DES, which was approved by the National Institute of Standards and Technology (NIST), is intended for public and government use.

Data flow control: See information flow control.

Data integrity: The attribute of data that is related to the preservation of its meaning and completeness, the consistency of its representation(s), and its correspondence to what it represents. Data is said to have integrity when it meets a prior expectation of quality.

Data link layer: The OSI layer that performs the assembly and transmission of data frames between adjacent network nodes, including error control.

Data security: The protection of data from unauthorized (whether accidental or intentional) modification, destruction, or disclosure.

Database: A persistent, organized collection of interrelated data items.

Datagram service: A connectionless form of packet switching whereby the source does not need to establish a connection with the destination before sending data packets.
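
The connectionless nature of datagram service can be shown with UDP sockets on the loopback interface; the sender simply transmits, with no connection setup. The message content below is illustrative:

```python
import socket

# Receiver: bind a UDP socket; the OS picks a free port.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
addr = receiver.getsockname()

# Sender: no connect() is needed before sending a datagram.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"reading: 17.3", addr)

data, _ = receiver.recvfrom(1024)
assert data == b"reading: 17.3"
sender.close()
receiver.close()
```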

DCOM: See Distributed Component Object Model.

Decipher: To reverse the encipherment process in order to make an encrypted message human readable.

Declassification of AIS storage media: An administrative decision or procedure to remove or reduce the security classification of the subject media.

DeCSS: A program that bypasses the Content Scrambling System (CSS) software used to prevent the viewing of DVD movie disks on unlicensed platforms.

Default: A value or option that is automatically chosen when no other value or option is specified.

Degauss: To degauss a magnetic storage medium is to remove all the data stored on it by demagnetization. A degausser is a device used for this purpose.

Degausser products list (DPL): A list of commercially produced degaussers that meet National Security Agency specifications. This list is included in the NSA Information Systems Security Products and Services Catalogue and is available through the Government Printing Office.

Demilitarized zone (DMZ): A host or network segment inserted as a “neutral zone” between an organization’s private network and the Internet.

Denial of service (DoS): Any action (or series of actions) that prevents any part of a system from functioning in accordance with its intended purpose. This includes any action that causes unauthorized destruction or modification of data or delay of service.

Dependability: A property of software such that justifiable confidence can be placed in its functioning as intended.

DES: See data encryption standard.

Designated approving authority (DAA): The official who has the authority to decide whether to accept the security safeguards prescribed for an AIS, or the official who is responsible for issuing an accreditation statement that records the decision to accept those safeguards.

Developer: The organization or individual that develops the information system.

DHCP: See dynamic host configuration protocol.

Dial-up: A service whereby a computer terminal can use a telephone to initiate and effect communication with a computer.

Diffusion: A method of obscuring redundancy in plaintext by dissipating the statistical structure of plaintext over the bulk of ciphertext.

Digital Millennium Copyright Act (DMCA) of 1998: Addresses copyright licensing and ownership and prohibits trading, manufacturing, or selling in any way that is intended to bypass copyright protection mechanisms.

Direct-sequence spread spectrum (DSSS): A method of spread spectrum modulation for digital signal transmission over the airwaves. The available band is divided into channels (up to 14 in the 2.4 GHz band, depending on the regulatory domain), and the data signal is combined with a chipping sequence. The chipping sequence, or chipping code as it is sometimes known, divides the data according to a spreading ratio. The redundant chipping code helps the signal resist interference and also enables the original data to be recovered if data bits are damaged during transmission. Data rates of 1, 2, 5.5, and 11 Mbps are obtainable.
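
The spreading and recovery steps can be sketched with the 11-chip Barker sequence used by 802.11 DSSS: each data bit is spread over 11 chips, and a majority correlation against the code recovers the bit even when some chips are corrupted. The corruption pattern below is illustrative:

```python
# 11-chip Barker sequence (as 0/1 chips).
BARKER11 = [1, 1, 1, 0, 0, 0, 1, 0, 0, 1, 0]

def spread(bits):
    """Spread each data bit across the 11-chip code (XOR modulation)."""
    return [chip ^ bit for bit in bits for chip in BARKER11]

def despread(chips):
    """Recover each bit by majority correlation against the code."""
    bits = []
    for i in range(0, len(chips), 11):
        disagree = sum(c ^ k for c, k in zip(chips[i:i + 11], BARKER11))
        bits.append(1 if disagree > 5 else 0)
    return bits

tx = spread([1, 0, 1])
tx[0] ^= 1      # corrupt a chip of the first bit in transit
tx[15] ^= 1     # corrupt a chip of the second bit
assert despread(tx) == [1, 0, 1]   # the redundancy recovers the data
```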

Disaster: A sudden, unexpected, calamitous event that produces great damage or loss; any event that creates an inability on an organization’s part to provide critical business functions for some undetermined period of time.

Disaster plan: Synonymous with contingency plan.

Disaster recovery plan: Procedure for emergency response, extended backup operations, and post-disaster recovery when an organization suffers a loss of computer resources and physical facilities.

Discretionary access control (DAC): A means of restricting access to objects based on the identity and need-to-know of the user, process, and/or groups to which they belong. The controls are discretionary in the sense that a subject that has certain access permissions is capable of passing that permission (perhaps indirectly) on to any other subject. Compare with mandatory access control.

Disk image backup: The process of conducting a bit-level copy of a disk, sector by sector, which provides the capability to examine slack space, undeleted clusters, and (possibly) deleted files.

Distributed control system: A type of control system in which the system elements are dispersed but are operated in a coupled manner. Note: Distributed control systems may have shorter coupling time constants than those typically found in SCADA systems. Note: Distributed control systems are commonly associated with continuous processes such as electric power generation, oil and gas refining, chemical, and paper manufacturing as well as discrete processes such as automobile and other goods manufacturing, packaging, and warehousing.

Distributed Component Object Model (DCOM): A distributed object model that is similar to the Common Object Request Broker Architecture (CORBA). DCOM is the distributed version of COM that supports remote objects as if the objects reside in the client’s address space. A COM client can access a COM object through the use of a pointer to one of the object’s interfaces and then invoke methods through that pointer.

DITSCAP: See DoD Information Technology Security Certification and Accreditation Process.

DMCA: See Digital Millennium Copyright Act of 1998.

DNS enumeration: Gathering information on domain name service (DNS) servers.

DoD: U.S. Department of Defense.

DoD Information Technology Security Certification and Accreditation Process (DITSCAP): Establishes for the defense entities a standard process, set of activities, general task descriptions, and management structure to certify and accredit IT systems that will maintain the required security posture.

DoD Trusted Computer System Evaluation Criteria (TCSEC): A document published by the National Computer Security Center containing a uniform set of basic requirements and evaluation classes for assessing degrees of assurance in the effectiveness of hardware and software security controls built into systems. These criteria are intended for use in the design and evaluation of systems that process and/or store sensitive or classified data. This document is Government Standard DoD 5200.28-STD and is frequently referred to as “the Criteria” (see criteria) or “the Orange Book.”

Domain: The unique context (e.g., access control parameters) in which a program is operating; in effect, the set of objects that a subject has the ability to access. See process and subject.

DoS: See denial of service.

DoS attack: Denial-of-service attack.

DPL: See degausser products list.

DSSS: See direct-sequence spread spectrum.

Due care: The care which an ordinary prudent person would have exercised under the same or similar circumstances. The terms due care and reasonable care are used interchangeably.

Dynamic host configuration protocol (DHCP): A protocol that issues Internet Protocol (IP) addresses automatically within a specified range to devices such as PCs when they are first powered on. The device retains the use of the IP address for a specific license period that the system administrator can define.

EAL: See evaluation assurance level.

Electronics Industry Association (EIA): A U.S. standards organization that represents a large number of electronics firms.

Emanations: See compromising emanations.

Embedded software: Software that is an element of a larger system and performs some of the requirements of that system, such as controlling, measuring, or monitoring the actions of the system’s physical components.

Embedded system: A system that performs or controls a function, either in whole or in part, as an integral element of a larger system or subsystem.

Emergency plan: Synonymous with contingency plan.

Emission(s) security (EMSEC): The protection resulting from all measures taken to deny unauthorized persons information of value derived from the intercept and analysis of compromising emanations from crypto-equipment or an IT system.

Encipher: To make a message unintelligible to all but the intended recipients.

End-to-end encryption: Encrypted information sent from the point of origin to the final destination. In symmetric key encryption, this process requires the sender and the receiver to have identical keys for the session.

Enumeration: Gathering detailed information about a target information system.

Environment: The aggregate of external procedures, conditions, and objects that affect the development, operation, and maintenance of a system.

Equipment under control: Equipment, machinery, apparatus, or plant used for manufacturing, process, transportation, medical, or other activities.

Erasure: A process by which a signal recorded on magnetic media is removed. Erasure is accomplished in two ways: (1) by alternating current erasure, by which the information is destroyed when an alternating high and low magnetic field is applied to the media; (2) by direct current erasure, in which the media is saturated by applying a unidirectional magnetic field. See degauss.

Error: An error occurs (1) when an individual interacting with a software system introduces a fault, for example, a coding error or an operational mistake; (2) when the value actually produced by the software differs from the correct value; (3) when one of the software’s states changes from correct to incorrect.

Ethernet switch: Ethernet switches establish a data link in which a circuit or a channel is connected to an Ethernet network. Switches are used to interconnect different LANs. A switch operates in the data link layer of the ISO/OSI Reference model.

Ethical hacker: Trusted individual who performs penetration tests without malicious intent.

Evaluation: Assessment of an IT or IACS against defined security functional and assurance criteria, performed by a combination of testing and analytic techniques.

Event: Something that occurs, such as a specific situation or an activity. Within a software system, an event handler is a subroutine that handles input received from the software.

Execution environment: The entities in software’s operational environment such as servers, middleware, and network devices that support, affect or influence its execution.

Exploit: To exploit means to take advantage of a security weakness in software in order to compromise the software to, for example, gain control of a system. An exploit also refers to the portion of code, data, or sequence of commands used to conduct an attack.

Evaluation assurance level (EAL): In the Common Criteria, the degree of examination of the product to be tested. EALs range from EAL1 (functional testing) to EAL7 (detailed testing and formal design verification). Each numbered package represents a point on the CC’s predefined assurance scale. An EAL can be considered a level of confidence in the security functions of an IT product or system.

Exclusive OR: A Boolean function where the output is a logical 0 when two input bits are identical (both 1’s or both 0’s) and the output is a logical 1 when the two input bits are not identical (one input is a 0 and the other input is a 1).
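The truth table above can be expressed directly in code; a minimal Python sketch covering all four input combinations:

```python
def xor(a: int, b: int) -> int:
    """Exclusive OR of two bits: 0 when the inputs match, 1 when they differ."""
    return a ^ b

# Identical inputs produce a logical 0.
assert xor(0, 0) == 0 and xor(1, 1) == 0
# Differing inputs produce a logical 1.
assert xor(0, 1) == 1 and xor(1, 0) == 1
```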

Executive state: One of several states in which a system can operate and the only one in which certain privileged instructions can be executed. Such instructions cannot be executed when the system is operating in other (e.g., user) states. Synonymous with supervisor state.

Exploitable channel: Any information channel that is usable or detectable by subjects that are external to the trusted computing base, whose purpose is to violate the security policy of the system. See covert channel.

Exposure: An instance of being exposed to losses from a threat.

External dependencies: The network and architectural components that a network interacts with but does not control.

Fail-over: A condition in which operations automatically switch over to a backup system when one system/application fails.

Fail safe: If a failure occurs, the system should fail in a secure manner, i.e., security controls and settings remain in effect and are enforced. It is usually better to lose functionality rather than security.

Fail secure: A system design principle that preserves a secure state during and after the occurrence of failures.

Failure: A condition that occurs when software or hardware are unable to perform their intended functions within the operational parameters specified for those functions.

False negative: A condition that happens when a security tool does not report a weakness where one is present.

False positive: A condition that happens when a security tool reports a weakness where no weakness is present.

False positive rate: The number of false positives divided by the sum of the number of false positives and the number of true positives.
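Using the definition above, the computation is direct; a small illustrative sketch (the counts are made up):

```python
def false_positive_rate(false_positives: int, true_positives: int) -> float:
    """False positive rate per the definition above: FP / (FP + TP)."""
    return false_positives / (false_positives + true_positives)

# A tool that flags 5 spurious weaknesses alongside 20 real ones.
assert false_positive_rate(5, 20) == 0.2
```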

Fault: A condition that causes a device or system component to fail to perform in the required manner.

FCC: Federal Communications Commission.

FDMA: Frequency division multiple access. A spectrum-sharing technique whereby the available spectrum is divided into a number of individual radio channels.

FDX: See full-duplex.

Foreign Intelligence Surveillance Act (FISA) of 1978: An act that limited wiretapping for national security purposes as a result of the Nixon administration’s history of using illegal wiretaps.

Fetch protection: A system-provided restriction to prevent a program from accessing data in another user’s segment of storage.

Fiber-Distributed Data Interface (FDDI): A set of ANSI and ISO standards for data transmission on fiber optic lines in a local area network (LAN) that can extend in range up to 200 km (124 miles). The FDDI protocol is based on the token ring protocol. In addition to being large geographically, an FDDI local area network can support thousands of users. FDDI is frequently used on the backbone for a wide area network (WAN).

Feistel cipher: An iterated block cipher that encrypts by breaking a plaintext block into two halves and, with a subkey, applying a “round” transformation to one of the halves. The output of this transformation is then XORed with the remaining half. The round is completed by swapping the two halves.
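The split, transform, XOR, and swap steps can be sketched as a toy 32-bit network in Python. The round function and subkeys below are illustrative placeholders, not those of any real cipher; the point is the structure, and the property that decryption is the same process with the subkeys applied in reverse order.

```python
def round_function(half: int, subkey: int) -> int:
    # Placeholder transformation; real Feistel ciphers (e.g., DES)
    # use substitutions and permutations here.
    return (half * 31 + subkey) & 0xFFFF

def feistel_encrypt(block: int, subkeys: list) -> int:
    """Split a 32-bit block into halves, then per round: transform the
    right half with the subkey, XOR into the left half, and swap."""
    left, right = block >> 16, block & 0xFFFF
    for k in subkeys:
        left, right = right, left ^ round_function(right, k)
    return (left << 16) | right

def feistel_decrypt(block: int, subkeys: list) -> int:
    """Same structure as encryption, with the subkeys in reverse order."""
    left, right = block >> 16, block & 0xFFFF
    for k in reversed(subkeys):
        left, right = right ^ round_function(left, k), left
    return (left << 16) | right

subkeys = [0x1234, 0x5678, 0x9ABC]
ciphertext = feistel_encrypt(0xDEADBEEF, subkeys)
assert feistel_decrypt(ciphertext, subkeys) == 0xDEADBEEF
```

Note that the round function never needs to be invertible: the XOR-and-swap structure guarantees that encryption can be undone regardless.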

FIFO: See first in, first out.

File server: A computer that provides network stations with controlled access to sharable resources. The network operating system (NOS) is loaded on the file server, and most sharable devices, including disk subsystems and printers, are attached to it.

File protection: The aggregate of all processes and procedures in a system designed to inhibit unauthorized access, contamination, or elimination of a file.

File security: The means by which access to computer files is limited to authorized users only.

File Transfer Protocol (FTP): A TCP/IP protocol for file transfer.

FIPS: Federal Information Processing Standards. A set of standards that describe document processing, encryption algorithms and other information technology standards for use within nonmilitary government agencies and by government contractors and vendors who work with the agencies.

Firewall: A network device that shields a trusted network from unauthorized users in an untrusted network by blocking specific types of traffic. Many types of firewalls exist, including packet filtering and stateful inspection.

Firmware: Executable programs stored in nonvolatile memory.

First in, first out (FIFO): In computing, a method of processing elements in a queue on a first-come, first-served basis.
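Python’s standard collections.deque gives an idiomatic FIFO queue:

```python
from collections import deque

queue = deque()
for job in ("first", "second", "third"):
    queue.append(job)  # enqueue at the tail

served = []
while queue:
    served.append(queue.popleft())  # dequeue from the head

# Elements leave in arrival order: first come, first served.
assert served == ["first", "second", "third"]
```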

Flaw: A shortcoming in software’s requirements, architecture, or design specification that results in a weak design or errors in the implementation. A flaw may or may not be a vulnerability.

Flow-sensitive analysis: Analysis of a computer program that takes into account the flow of control.

Flow control: See information flow control.

Footprinting: Gathering information in both active and passive modes.

Formal method: A technique used to verify, through the use of mathematical proofs, that software is consistent with its specified requirements, architecture, design, or security policy.

Formal proof: A complete and convincing mathematical argument presenting the full logical justification for each proof step for the truth of a theorem or set of theorems.

Formal security policy model: A mathematically precise statement of a security policy. To be adequately precise, such a model must represent the initial state of a system, the way in which the system progresses from one state to another, and a definition of a secure state of the system. To be acceptable as a basis for a TCB (see TCB), the model must be supported by a formal proof that if the initial state of the system satisfies the definition of a secure state and if all assumptions required by the model hold, then all future states of the system will be secure. Some formal modeling techniques include state transition models, denotational semantics models, and algebraic specification models. See Bell-LaPadula model.

Formal verification: The process of using formal proofs to demonstrate the consistency between a formal specification of a system and a formal security policy model (design verification) or between the formal specification and its high-level program implementation (implementation verification).

Frequency division multiple access: See FDMA.

FTP: See File Transfer Protocol.

Full-duplex (FDX): Communications method where information can be transmitted in both directions simultaneously.

Functional testing: The segment of security testing in which the advertised security mechanisms of the system are tested, under operational conditions, for correct operation.

Gateway: A network component that provides interconnectivity at higher network layers.

Gigabyte (GB, Gbyte): A unit of measure for memory or disk storage capacity; usually 1,073,741,824 bytes.

Gigahertz (GHz): A measure of frequency; one billion hertz.

GOTS: Government off-the-shelf software.

Gramm-Leach-Bliley (GLB) Act of November 1999: An act that removed Depression-era restrictions on banks that limited certain business activities, mergers, and affiliations. It repealed the restrictions on banks affiliating with securities firms contained in sections 20 and 32 of the Glass-Steagall Act. GLB became effective on November 13, 2001. GLB also requires health plans and insurers to protect member and subscriber data in electronic and other formats. These health plans and insurers will fall under new state laws and regulations that are being passed to implement GLB because GLB explicitly assigns enforcement of the health plan and insurer regulations to state insurance authorities (15 U.S.C. §6805). Some of the privacy and security requirements of Gramm-Leach-Bliley are similar to those of HIPAA (Health Insurance Portability and Accountability Act).

Granularity: An expression of the relative size of a data object; for example, protection at the file level is considered coarse granularity, whereas protection at the field level is considered to be of a finer granularity.

Gray box test: A test in which the ethical hacking team has partial knowledge of the target information system.

Gray hat hacker: A hacker who normally performs ethical hacking but sometimes reverts to malicious, black hat hacking.

Guard: A processor that provides a filter between two disparate systems operating at different security levels or between a user terminal and a database in order to filter out data that the user is not authorized to access.

Hamming weight: The Hamming weight of a string is the number of symbols that are different from the zero-symbol of the alphabet used.
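A sketch of the definition, counting the symbols that differ from the zero-symbol:

```python
def hamming_weight(string: str, zero_symbol: str = "0") -> int:
    """Number of symbols in the string that differ from the zero-symbol."""
    return sum(1 for symbol in string if symbol != zero_symbol)

assert hamming_weight("10110") == 3
# For a binary integer, counting set bits gives the same quantity.
assert bin(0b10110).count("1") == 3
```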

Handshaking procedure: A dialogue between two entities (e.g., a user and a computer, a computer and another computer, or a program and another program) for the purpose of identifying and authenticating the entities to one another.

Hertz (Hz): A unit of frequency measurement; one cycle of a periodic event per second.

HIPAA (Health Insurance Portability and Accountability Act): See Kennedy-Kassebaum Act of 1996.

Host: A time-sharing computer accessed via terminals or terminal emulation; a computer to which an expansion device attaches.

HTTP: Hypertext Transfer Protocol. An application layer protocol designed within the framework of the Internet Protocol suite that is the fundamental protocol of the World Wide Web.

Hypertext markup language (HTML): A standard used on the Internet for defining hypertext links between documents.

I&A: Identification and authentication. Identification is the process of presenting the identity of a user, process, or device, usually as a prerequisite for granting access to resources in an IT system. Authentication is verifying the identity of a user, process, or device, often as a prerequisite to allowing access to resources in an information system.

IA: Information assurance. Measures that protect and defend information and information systems by ensuring their availability, integrity, authentication, confidentiality, and nonrepudiation. These measures include providing for restoration of information systems by incorporating protection, detection, and reaction capabilities.

IAC: Inquiry access code; used in inquiry procedures. The IAC can be one of two types: a dedicated IAC for specific devices or a generic IAC for all devices.

ICV: Integrity check value. In Wired Equivalent Privacy (WEP) encryption, the frame is run through an integrity algorithm, and the generated ICV is placed at the end of the encrypted data in the frame. Then the receiving station runs the data through its integrity algorithm and compares it to the ICV received in the frame. If it matches, the unencrypted frame is passed to the higher layers. If it does not match, the frame is discarded.
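The generate-append-verify sequence can be sketched with CRC-32, the integrity algorithm WEP uses for its ICV; the framing and byte order here are simplified for illustration.

```python
import zlib

def append_icv(data: bytes) -> bytes:
    """Run the data through the integrity algorithm (CRC-32 in WEP)
    and place the 4-byte ICV at the end."""
    icv = zlib.crc32(data).to_bytes(4, "little")
    return data + icv

def verify_icv(frame: bytes) -> bool:
    """Recompute the ICV over the data and compare it with the received one."""
    data, received = frame[:-4], frame[-4:]
    return zlib.crc32(data).to_bytes(4, "little") == received

frame = append_icv(b"frame payload")
assert verify_icv(frame)

# A single flipped bit makes the comparison fail, so the receiving
# station would discard the frame.
tampered = bytes([frame[0] ^ 0x01]) + frame[1:]
assert not verify_icv(tampered)
```

Note that CRC-32 detects accidental corruption but, being linear and unkeyed, does not protect against deliberate modification; this is one of WEP’s well-known weaknesses.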

ID: Common abbreviation for “identifier” or “identity.”

Identification: The process that enables a system to recognize an entity, generally by the use of unique machine-readable user names.

Identity: The label applied to the type of service provided by a software component, for example, NETWORK_SERVICE.

IDS: Intrusion detection system. Hardware or software product that gathers and analyzes information from various areas within a computer or a network to identify possible security breaches, which include both intrusions (attacks from outside the organizations) and misuse (attacks from within the organizations).

IEEE: See Institute of Electrical and Electronic Engineers.

IETF: Internet Engineering Task Force. A volunteer body that develops and promotes Internet standards. The mission of the IETF is to improve the Internet by producing high-quality, relevant technical documents that influence the way the Internet is designed, used, and managed.

IKE: Internet key exchange. The protocol used to set up a security association (SA) in the IPsec protocol suite in order to establish secure communications.

Impersonating: Synonymous with spoofing.

Implementation: The phase in the software life cycle when the software is coded from specifications and/or integrated from components.

Incomplete parameter checking: A system design flaw that results when all parameters have not been fully examined for accuracy and consistency, thus making the system vulnerable to penetration.

Individual accountability: The ability to identify an individual and to hold that individual responsible for his or her actions.

Industrial automation and control system: Collection of personnel, hardware, and software that can affect or influence the safe, secure, and reliable operation of an industrial process. Note: These systems include, but are not limited to:

                   a.    industrial control systems, including distributed control systems (DCSs), programmable logic controllers (PLCs), remote terminal units (RTUs), intelligent electronic devices, supervisory control and data acquisition (SCADA), networked electronic sensing and control, and monitoring and diagnostic systems. (In this context, process control systems include basic process control system and safety instrumented system [SIS] functions, whether they are physically separate or integrated.)

                   b.    associated information systems such as advanced or multivariable control, online optimizers, dedicated equipment monitors, graphical interfaces, process historians, manufacturing execution systems, and plant information management systems.

                   c.    associated internal, human, network, or machine interfaces used to provide control, safety, and manufacturing operations functionality to continuous, batch, discrete, and other processes.

Industrial, scientific, and medicine (ISM) bands: Radio frequency bands authorized by the Federal Communications Commission (FCC) for wireless LANs used for industrial, scientific, and medicine-related purposes. The ISM bands are located at 902 MHz, 2.400 GHz, and 5.7 GHz. The transmitted power is commonly less than 600 mW, and no FCC license is required.

Information flow control: A procedure undertaken to ensure that information transfers within a system are not made from a higher security level object to a lower security level object. See covert channel, simple security property, and star property (* property). Synonymous with data flow control and flow control.

Information security policy: The aggregate of public law, directives, regulations, and rules that regulate how an organization manages, protects, and distributes information. For example, the information security policy for financial data processed on DoD systems may be in U.S. Code (USC), Executive Order (EO), and DoD directives and in local regulations. The information security policy lists all the security requirements applicable to specific information.

Information system (IS): Any telecommunications or computer-related equipment or interconnected systems or subsystems of equipment used in the acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of voice and/or data; includes software, firmware, and hardware.

Information system security engineering/engineer (ISSE): Process of capturing and refining information protection requirements to ensure their integration into information systems acquisition and information systems development through purposeful security design or configuration. Individual assigned responsibility for conducting information system security engineering activities.

Information system security officer (ISSO): (1) The person who is responsible to the designated approving authority (DAA) for ensuring that security is provided for and implemented throughout the life cycle of an AIS, from the beginning of the concept development plan through its design, development, operation, maintenance, and secure disposal; (2) In certification and accreditation (C&A), the person responsible to the DAA for ensuring that the security of an IT system is approved, operated, and maintained throughout its life cycle in accordance with the SSAA.

Information technology (IT): The hardware, firmware, and software used as part of the information system to perform information-related functions. This definition includes computers, telecommunications, automated information systems, and automatic data processing equipment. IT includes any assembly of computer hardware, software, and/or firmware configured to collect, create, communicate, compute, disseminate, process, store, and/or control information.

Information technology security (ITSEC): (1) Protection of information technology against unauthorized access to or modification of information, whether in storage, processing, or transit, and against the denial of service to authorized users, including those measures necessary to detect, document, and counter such threats; (2) Protection and maintenance of confidentiality, integrity, availability, and accountability.

Infrared (IR) light: Light waves that range in length from about 0.75 to 1,000 microns; IR light operates at a lower frequency than the spectral colors but at a higher frequency than radio waves.

Inheritance (in object-oriented programming): When all the methods of one class, called a superclass, are inherited by a subclass. Thus, all messages understood by the superclass are understood by the subclass.

Input validation: Checking the data that is input to a software application for acceptable parameters such as data type, length, and range.
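A minimal sketch of the idea, checking a user-supplied port number for acceptable type, length, and range (the parameter choices are illustrative):

```python
def validate_port(value: str) -> int:
    """Validate a user-supplied string before treating it as a port number."""
    if not value.isdigit():                # data type: digits only
        raise ValueError("port must be numeric")
    if len(value) > 5:                     # length: at most five digits
        raise ValueError("port is too long")
    port = int(value)
    if not 1 <= port <= 65535:             # range: valid TCP/UDP port
        raise ValueError("port out of range")
    return port

assert validate_port("8080") == 8080
# Each of these fails a different check and is rejected.
for bad in ("abc", "123456", "99999"):
    try:
        validate_port(bad)
        raise AssertionError("should have been rejected")
    except ValueError:
        pass
```

Rejecting bad input at the boundary, before it reaches application logic, is the core defense against injection-style attacks.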

Inquiry access code: See IAC.

Institute of Electrical and Electronic Engineers (IEEE): A U.S.-based standards organization participating in the development of standards for data transmission systems. The IEEE has made significant progress in the establishment of standards for LANs, namely the IEEE 802 series.

Integration testing: A testing process used to verify the interfaces among components, such as hardware or software modules, as the components are installed. Components should be integrated into the network one at a time, with integration testing performed at each step, to ensure proper gradual integration.

Integrator: An organization or individual that unites, combines, or otherwise incorporates information system components with another system(s).

Integrity: (1) The quality of being in sound, unimpaired, or perfect condition; (2) A quality of an IT system reflecting the logical correctness and reliability of the operating system, the logical completeness of the hardware and software implementing the protection mechanisms, and the consistency of the data structures and occurrence of the stored data. It is composed of data integrity and system integrity; (3) Integrity is concerned with unauthorized modification of data, whether by an authorized or an unauthorized individual or process. If a message is intercepted, its contents are changed, and it is then forwarded to a receiver, integrity has been violated.

Integrity check value: See ICV.

Interdiction: See denial of service.

Inter-file analysis: Analysis of code resulting in different files that have procedural, data, or other interdependencies.

Internal security controls: Hardware, firmware, and software features within a system that restrict access to resources (hardware, software, and data) to authorized subjects only (persons, programs, or devices).

International Standards Organization (ISO): A nontreaty standards organization (formally, the International Organization for Standardization) active in the development of international standards, such as the Open System Interconnection (OSI) network architecture.

International Telecommunications Union (ITU): An intergovernmental agency of the United Nations, responsible for making recommendations and developing standards regarding telephone and data communications systems for public and private telecommunication organizations and for providing coordination for the development of international standards.

Internet: The largest network in the world. The successor to the Advanced Research Projects Agency Network (ARPANET), the Internet includes other large subnetworks. The Internet uses the TCP/IP protocol suite and connects universities, government agencies, business, industry, and individuals around the world.

Internet Protocol (IP): The Internet standard protocol that defines the Internet datagram as the information unit passed across the Internet. IP provides the basis of a best-effort packet delivery service. The Internet Protocol suite is often referred to as TCP/IP because IP is one of the two fundamental protocols, the other being the Transmission Control Protocol.

Internetwork packet exchange (IPX): Novell NetWare operating system protocol for the exchange of message packets on an internetwork. IPX passes application requests for network services to the network drives and then to other workstations, servers, or devices on the internetwork.

Intrusion detection system: See IDS.

Ionization: The act of converting an atom or molecule into an ion by adding or removing charged particles such as electrons or other ions.

Ionize: To convert an atom or molecule into an ion by adding or removing charged particles such as electrons or other ions.

IP: See Internet Protocol.

IPX: See internetwork packet exchange.

IS: See information system.

ISM bands: See industrial, scientific, and medicine bands.

Isolation: The containment of subjects and objects in a system in such a way that they are separated from one another as well as from the protection controls of the operating system.

ISO: See International Standards Organization.

ISP: Internet service provider. A commercial organization that provides user connections to the Internet for a fee.

ISSE: See information system security engineering/engineer.

ISSO: See information system security officer.

IT: See information technology.

ITSEC: See information technology security.

ITU: See International Telecommunications Union.

IV: An initialization vector (IV) is a random or pseudorandom character sequence that is used with a secret key for data encryption. The use of an IV makes it more difficult for an attacker to decipher the encrypted message.
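Generating an IV in Python with the standard secrets module; 16 bytes is used here because it matches the AES block size, a common choice:

```python
import secrets

def make_iv(size: int = 16) -> bytes:
    """Return a fresh random IV; a new one should be used for each message."""
    return secrets.token_bytes(size)

iv = make_iv()
assert len(iv) == 16
# Independently generated IVs differ (with overwhelming probability),
# so encrypting the same plaintext twice yields different ciphertexts.
assert make_iv() != make_iv()
```

The IV need not be secret and is typically transmitted alongside the ciphertext; what matters is that it is unpredictable and never reused with the same key.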

Justifiable confidence: A high level of certainty, achieved with actions, arguments, and evidence.

Kennedy-Kassebaum Health Insurance Portability and Accountability Act (HIPAA) of 1996: A set of regulations that mandates the use of standards in health care record keeping and electronic transactions. The act requires that health care plans, providers, insurers, and clearinghouses do the following:

                Provide for restricted access by the patient to personal health care information

                Implement administrative simplification standards

                Enable the portability of health insurance

                Establish strong penalties for health care fraud

Key: Information or sequence that controls the enciphering and deciphering of messages. Also known as a cryptovariable. Used with a particular algorithm to encipher the plaintext message or decipher the encrypted message.

Keystream: A sequence of random or pseudorandom characters that are combined with a plaintext message to produce an encrypted message (the ciphertext).
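Combining a keystream with a plaintext is typically done with XOR; a sketch using a fixed, non-random keystream purely for illustration:

```python
def apply_keystream(message: bytes, keystream: bytes) -> bytes:
    """XOR each message byte with the corresponding keystream byte."""
    return bytes(m ^ k for m, k in zip(message, keystream))

plaintext = b"ATTACK AT DAWN"
keystream = bytes(range(1, len(plaintext) + 1))  # illustrative, NOT random

ciphertext = apply_keystream(plaintext, keystream)
assert ciphertext != plaintext
# Applying the same keystream again recovers the plaintext,
# since XOR is its own inverse.
assert apply_keystream(ciphertext, keystream) == plaintext
```

In a real stream cipher the keystream is generated pseudorandomly from a secret key, and reusing a keystream across two messages is a fatal error.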

Kilobyte (KB, Kbyte): A unit of measurement of memory or disk storage capacity; a data unit of 2^10 (1,024) bytes.

Kilohertz (kHz): A unit of frequency measurement; equivalent to 1,000 Hz.

Knowledge base: The rules and facts of the particular problem domain in an expert system. An expert system is a type of artificial intelligence element that makes decisions based on a knowledge base and rules derived from interviews with domain experts.

LAN: See local area network.

Least privilege: In the principle of least privilege, a user or process is given the minimum amount of privileges, authorization, and so on for the least amount of time that will permit him or her to accomplish assigned tasks and complete a mission. Least privilege reduces the prospect of an individual or process having unauthorized access to sensitive information.

Light-emitting diode (LED): Used in conjunction with optical fiber, an LED emits light when current is passed through it. Its advantages include low cost and long life, and it is capable of transmitting data in the million bits per second (Mbps) range.

Limited access: Synonymous with access control.

Lines, units, cells: Lower-level control system elements (entities) that perform manufacturing, field device control, or vehicle functions. Note: Entities at this level may be connected together by an area control network and may contain information systems related to the operations performed in that entity.

Local area network (LAN): A network that interconnects devices in the same office, floor, or building, or in adjacent or nearby buildings.

MAC: See mandatory access control if used in the context of a type of access control. MAC also refers to the media access control address assigned to a network interface card on an Ethernet network.

Magnetic remanence: A measure of the magnetic flux density that remains after removal of an applied magnetic force. Refers to any data remaining on magnetic storage media after removal of the power.

Mail gateway: A type of gateway that interconnects dissimilar email systems.

Maintainer: The organization or individual that maintains the information system.

Maintenance organization: The organization that keeps an IT or industrial automation and control system operating in accordance with prescribed requirements, laws, policies, procedures, and regulations.

Malicious code: Software or firmware that is intentionally included in a system for an unauthorized purpose (e.g., a Trojan horse).

Malicious logic: Synonymous with malicious code.

Malware: A merging of the words malicious and software. Malware is inserted into a system, usually covertly, with the intention of compromising the confidentiality, availability, or integrity of the system’s data, applications, and operating system. It may come to the attention of a user by inhibiting the operational abilities of the system. Often referred to as malicious code.

Mandatory access control (MAC): A means of restricting access to objects based on the sensitivity (as represented by a label) of the information contained in the objects and the formal authorization (in other words, clearance) of subjects to access information of such sensitivity. Compare with discretionary access control.
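
A hedged illustration of the idea, not any standard API: the sketch below models a Bell-LaPadula-style "no read up" check, where a subject may read an object only if the subject's clearance dominates the object's sensitivity label. The label names and their ordering are illustrative assumptions.

```python
# Illustrative sketch of a mandatory access control check.
# Label names and the lattice ordering are assumptions for the example.

LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def may_read(subject_clearance: str, object_label: str) -> bool:
    """'No read up': the subject's clearance must dominate the object's label."""
    return LEVELS[subject_clearance] >= LEVELS[object_label]

print(may_read("SECRET", "CONFIDENTIAL"))  # True: reading down is permitted
print(may_read("CONFIDENTIAL", "SECRET"))  # False: reading up is denied
```

Unlike discretionary access control, the decision here depends only on labels and clearances, not on the wishes of the object's owner.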

Manufacturing operations: Collection of production, maintenance, and quality assurance operations and their relationship to other activities of a production facility. Manufacturing operations include:

                   a.    manufacturing or processing facility activities that coordinate the personnel, equipment, and material involved in the conversion of raw materials or parts into products

                   b.    functions that may be performed by physical equipment, human effort, and information systems

                   c.    managing information about the schedules, use, capability, definition, history, and status of all resources (personnel, equipment, and material) within the manufacturing facility

Masquerading: Synonymous with spoofing.

Media access control (MAC): An IEEE 802 standards sublayer that controls access to a network medium, such as a wireless LAN, and handles functions such as collision detection. Each network interface has its own unique MAC address.

Megabits per second (Mbps): One million bits per second, typically referring to a data transmission rate.

Megabyte (MB, Mbyte): A unit of measurement for memory or disk storage capacity, usually 1,048,576 bytes.

Megahertz (MHz): A measure of frequency equivalent to one million cycles per second.

Middleware: An intermediate software component located on a wired network between a wireless appliance and the application or data residing on the wired network. Middleware provides appropriate interfaces between the appliance and the host application or server database.

Mimicking: Synonymous with spoofing.

Mission: The assigned duties to be performed by a resource.

Misuse: Use of software that deviates from what is expected based on the software’s specifications. If the misuse is malicious in nature, it is typically referred to as abuse.

Mobile code: Software obtained from remote computer systems across a network, and downloaded and executed on a local computer system without explicit installation or execution by the recipient. Examples include JavaScript, Java applets, ActiveX controls, and Flash animations. Mobile code can be used for valid applications or can contain a virus that could do harm to a computer system.

Modes of operation: A description of the conditions under which an AIS functions, based on the sensitivity of the data processed and the clearance levels and authorizations of the users. Four modes of operation are authorized:

            1.    Dedicated mode: An AIS is operating in the dedicated mode when each user who has direct or indirect individual access to the AIS, its peripherals, remote terminals, or remote hosts has all of the following:

                   a.    A valid personnel clearance for all information on the system

                   b.    Formal access approval; furthermore, the user has signed nondisclosure agreements for all the information stored and/or processed (including all compartments, subcompartments, and/or special access programs)

                   c.    A valid need-to-know for all information contained within the system

            2.    System-high mode: An AIS is operating in the system-high mode when each user who has direct or indirect access to the AIS, its peripherals, remote terminals, or remote hosts has all of the following:

                   a.    A valid personnel clearance for all information on the AIS

                   b.    Formal access approval, and signed nondisclosure agreements, for all the information stored and/or processed (including all compartments, subcompartments, and/or special access programs)

                   c.    A valid need-to-know for some of the information contained within the AIS

            3.    Compartmented mode: An AIS is operating in the compartmented mode when each user who has direct or indirect access to the AIS, its peripherals, remote terminals, or remote hosts has all of the following:

                   a.    A valid personnel clearance for the most restricted information processed in the AIS

                   b.    Formal access approval, and signed nondisclosure agreements, for that information which he or she will be able to access

                   c.    A valid need-to-know for that information which he or she will be able to access

            4.    Multilevel mode: An AIS is operating in the multilevel mode when all of the following statements are satisfied concerning the users who have direct or indirect access to the AIS, its peripherals, remote terminals, or remote hosts:

                   a.    Some do not have a valid personnel clearance for all the information processed in the AIS.

                   b.    All have the proper clearance and the appropriate formal access approval for that information to which they are to have access.

                   c.    All have a valid need-to-know for that information to which they are to have access.

Multilevel device: A device that is used in a manner that permits it to simultaneously process data of two or more security levels without risk of compromise. To accomplish this, sensitivity labels are normally stored on the same physical medium and in the same form (e.g., machine-readable or human-readable) as the data being processed.

Multilevel secure: A class of system containing information with different sensitivities that simultaneously permits access by users with different security clearances and needs-to-know but that prevents users from obtaining access to information for which they lack authorization.

Multilevel security mode: See modes of operation.

Multiple inheritance: In object-oriented programming, a situation where a subclass inherits the behavior of multiple superclasses.
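
A minimal sketch of the concept in Python, where a class may list several superclasses; the class names are invented for the example.

```python
# Illustrative multiple inheritance: the subclass inherits behavior
# from two superclasses. All class names are hypothetical.

class Reader:
    def read(self):
        return "reading"

class Writer:
    def write(self):
        return "writing"

class ReadWriter(Reader, Writer):
    """Inherits behavior from both Reader and Writer."""
    pass

rw = ReadWriter()
print(rw.read())   # behavior inherited from Reader
print(rw.write())  # behavior inherited from Writer
```

Languages differ in how they resolve conflicts when the superclasses define the same method; Python, for instance, applies a deterministic method resolution order.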

Multiuser mode of operation: A mode of operation designed for systems that process sensitive, unclassified information in which users might not have a need-to-know for all information processed in the system. This mode is also used for microcomputers processing sensitive unclassified information that cannot meet the requirements of the stand-alone mode of operation.

Mutually suspicious: A state that exists between interacting processes (subsystems or programs) in which neither process can expect the other process to function securely with respect to some property.

National Computer Security Center (NCSC): Originally named the DoD Computer Security Center, the NCSC is responsible for encouraging the widespread availability of trusted computer systems throughout the federal government. It is a branch of the National Security Agency (NSA) that also initiates research and develops and publishes standards and criteria for trusted information systems.

National Information Assurance Certification and Accreditation Process (NIACAP): Provides a standard set of activities, general tasks, and a management structure to certify and accredit systems that will maintain the information assurance and security posture of a system or site. The NIACAP is designed to certify that the information system meets documented accreditation requirements and continues to maintain the accredited security posture throughout the system life cycle. This document is issued and maintained by the National Security Telecommunications and Information Systems Security Committee (NSTISSC).

National Institute of Standards and Technology (NIST): NIST is a nonregulatory federal agency within the U.S. Department of Commerce. NIST's mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life.

National Security Agency (NSA): The National Security Agency (NSA) is a cryptologic intelligence agency of the United States Department of Defense responsible for the collection and analysis of foreign communications and foreign signals intelligence, as well as protecting U.S. government communications and information systems.

NCSC: See National Computer Security Center.

Need-to-know: The necessity for access to, knowledge of, or possession of specific information that is required to carry out assigned duties in industrial automation and control systems and systems critical to national security.

Network basic input/output system (NetBIOS): A standard interface between networks and PCs that enables applications on different computers to communicate within a LAN. NetBIOS was created by IBM for its early PC network, was adopted by Microsoft, and has since become a de facto industry standard. It is not routable across a wide area network (WAN).

Network interface card (NIC): A network adapter inserted into a computer that enables the computer to be connected to a network.

Network monitoring: A form of operational support enabling network management to view the network’s inner workings. Most network-monitoring equipment is nonobtrusive and can be used to determine the network’s utilization and to locate faults.

NIACAP: See National Information Assurance Certification and Accreditation Process.

NIAP: See National Information Assurance Partnership.

NIST: See National Institute of Standards and Technology.

Node: Any network-addressable device on a network, such as a router or network interface card. Any network station.

Nonrepudiation: Nonrepudiation in digital systems refers to ensuring that the sender of a message or contract cannot later deny sending the message or contract and that, conversely, a receiver to whom the message was sent cannot deny receiving the message. Digital signatures are used to implement nonrepudiation.

NSA: See National Security Agency.

Object: A passive entity that contains or receives information. Access to an object implies access to the information that it contains. Examples of objects include records, blocks, pages, segments, files, directories, directory trees, and programs, as well as bits, bytes, words, fields, processors, video displays, keyboards, clocks, printers, and network nodes.

Object request broker (ORB): The fundamental building block of the object request architecture (ORA), which manages the communications among the ORA entities. The purpose of the ORB is to support the interaction of objects in heterogeneous, distributed environments. The objects may be on different types of computing platforms.

Object reuse: The reassignment and reuse of a storage medium (e.g., page frame, disk sector, and magnetic tape) that once contained one or more objects. To be securely reused and assigned to a new subject, storage media must contain no residual data (data remanence) from the object(s) that were previously contained in the media.

Object services: Services that support the ORB in creating and tracking objects as well as performing access control functions.

OMB: Office of Management and Budget. The largest component of the Executive Office of the President, OMB serves the president of the United States by implementing the president's vision across the executive branch, helping a wide range of federal departments and agencies carry out the administration's commitments and priorities.

OPC: A set of specifications for the exchange of information in a process control environment. Note: The abbreviation “OPC” originally came from “OLE for process control,” where “OLE” was short for “object linking and embedding.”

Open security environment: An environment that includes those systems in which at least one of the following conditions holds true: (1) application developers (including maintainers) do not have sufficient clearance or authorization to provide an acceptable presumption that they have not introduced malicious logic, and (2) configuration control does not provide sufficient assurance that applications are protected against the introduction of malicious logic prior to and during the operation of system applications.

Open source software (OSS): Software that is publicly available and is provided under a license that is less restrictive than a typical commercial license. A typical license may permit users to change the software and redistribute the software in modified or unmodified form.

Open System Interconnection (OSI): An ISO standard specifying an open system capable of enabling communications between diverse systems. OSI has the following seven layers of distinction: physical, data link, network, transport, session, presentation, and application. These layers provide the functions that enable standardized communications between two application processes.

Operating system (OS): The collection of software that directs a computer's operations, controlling and scheduling the execution of other programs, and managing storage, input/output, and communication resources.

Operations security: Controls over hardware, media, and operators who have access to computer systems. Operations security and controls safeguard information assets while the data is resident in the computer or otherwise directly associated with the computing environment. The controls address both software and hardware as well as such processes as change control and problem management.

Operator: An individual who supports system operations from the operator’s console, monitors execution of the system, and controls the flow of jobs.

Orange Book: Alternate name for the DoD Trusted Computer System Evaluation Criteria.

Original equipment manufacturer (OEM): A manufacturer of products for integration into other products or systems.

ORA: Object request architecture. See object request broker.

ORB: See object request broker.

OS: Commonly used abbreviation for operating system.

OSI: See Open System Interconnection.

OSS: See open source software.

Overt channel: A path within a computer system or network that is designed for the authorized transfer of data. Compare with covert channel.

Overwrite procedure: A software process that replaces data previously stored on storage media with a predetermined set of meaningless data or random patterns. See magnetic remanence.
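
A hedged single-pass sketch of the idea: replace a file's contents with random bytes of the same length before deletion. Real media sanitization involves considerably more (multiple passes, verification, handling of device-level remapping); the file name here is illustrative.

```python
# Illustrative overwrite procedure: one pass of random data over a file.
# This is a sketch only; it does not address wear-leveled or remapped media.

import os

def overwrite_file(path: str) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(os.urandom(size))   # random pattern over the old data
        f.flush()
        os.fsync(f.fileno())        # push the new bytes to the device

# Usage sketch with a hypothetical file
with open("secret.tmp", "wb") as f:
    f.write(b"sensitive data")
overwrite_file("secret.tmp")
os.remove("secret.tmp")
```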

Packet: A basic message unit for communication across a network. A packet usually includes routing information, data, and (sometimes) error-detection information.

Packet-switched: (1) A type of network that routes data packets based on an address contained in the data packet. Multiple data packets can share the same network resources. (2) A type of communications network that uses shared facilities to route data packets from and to different users. Unlike a circuit-switched network, a packet-switched network does not set up dedicated circuits for each session.

Passivation layer: A protective layer that covers an integrated circuit.

Password: A protected/private character string that is used to authenticate an identity.
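
A sketch of how a password can authenticate an identity without the system ever storing the cleartext: only a salt and an iterated hash are kept (PBKDF2 from the Python standard library). The iteration count and the sample passwords are illustrative, not a policy recommendation.

```python
# Illustrative password verification with a salted, iterated hash.
# Parameters (salt size, iteration count) are assumptions for the example.

import hashlib, hmac, os

def enroll(password: str):
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest            # store these; never store the password

def authenticate(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = enroll("correct horse")
print(authenticate("correct horse", salt, digest))  # True
print(authenticate("wrong guess", salt, digest))    # False
```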

Penetration: The act of successfully bypassing a system’s security mechanisms.

Penetration testing: The portion of security testing in which the evaluators attempt to circumvent the security features of a system. The evaluators might be assumed to use all system design and implementation documentation, which can include listings of system source code, manuals, and circuit diagrams. The evaluators work under the same constraints that are applied to ordinary users.

Performance modeling: The use of simulation software to predict network behavior, allowing developers to perform capacity planning. Simulation makes it possible to model the network and impose varying levels of utilization to observe the effects.

Permissions: A description of the type of authorized interactions that a subject can have with an object. Examples of permission types include read, write, execute, add, modify, and delete.

Personnel security: (1) Procedures established to ensure that all personnel who have access to sensitive information possess the required authority as well as appropriate clearances; (2) Procedures established to evaluate a person’s background; provides assurance of necessary trustworthiness.

PHI: See protected health information.

Physical security: The application of physical barriers and control procedures as preventive measures or countermeasures against threats to resources and sensitive information.

Piggyback: Gaining unauthorized access to a system via another user’s legitimate connection. See between-the-lines entry.

PKI: See public key infrastructure.

Plaintext: Message text in clear, human-readable form.

Platform for Privacy Preferences (P3P): Proposed standards developed by the World Wide Web Consortium (W3C) to implement privacy practices on Web sites.

Portability: The property of network connectivity that can be easily established, used, and then dismantled.

Port scanning: Connecting to Transmission Control Protocol (TCP) and User Datagram Protocol (UDP) ports in order to determine the services and applications running on the target host.
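
A minimal TCP "connect" scan sketch: attempt a connection to each port and report those that accept. The host address and port list are illustrative; such probing should only ever be run against systems one is authorized to test.

```python
# Illustrative TCP connect scan. Host and ports below are examples only.

import socket

def scan(host: str, ports, timeout: float = 0.5):
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Usage sketch (hypothetical, authorized target):
# print(scan("192.0.2.10", [22, 80, 443]))
```

UDP scanning works differently (open UDP ports typically do not answer), which is why dedicated tools treat the two protocols separately.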

Presentation layer: The layer of the OSI model that negotiates data transfer syntax for the application layer and performs translations between different data types, if necessary.

Private key encryption: See symmetric (private) key encryption.

Privileged instructions: A set of instructions (e.g., interrupt handling or special computer instructions) that control features such as storage protection and that are generally executable only when the automated system is operating in the executive state.

PRNG: Pseudorandom number generator. An algorithm that produces a sequence of bits that are uniquely determined from an initial value called a seed. The output of the PRNG “appears” to be random, i.e., the output is statistically indistinguishable from random values. A cryptographic PRNG has the additional property that the output is unpredictable, given that the seed is not known.
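
The defining property can be shown in a few lines: the same seed deterministically reproduces the same "random-looking" sequence. Note that Python's default generator (a Mersenne Twister) is statistically strong but not cryptographically unpredictable; the seed value here is arbitrary.

```python
# Illustrative PRNG determinism: identical seed, identical output sequence.
# random.Random is not a cryptographic PRNG; its output is predictable
# to anyone who learns the seed or enough of the stream.

import random

SEED = 42  # arbitrary example seed
a = random.Random(SEED)
b = random.Random(SEED)
seq1 = [a.randint(0, 9) for _ in range(5)]
seq2 = [b.randint(0, 9) for _ in range(5)]
print(seq1 == seq2)  # True: the seed uniquely determines the sequence
```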

Procedural language: A language based on the sequential execution of instructions, following the von Neumann architecture of a CPU, memory, and an input/output device. Variables are embedded in the sets of instructions used to solve a particular problem, so the data is not separate from the instruction statements.

Process: (1) A program in execution. See domain and subject. (2) A series of operations performed in the making, treatment, or transportation of a product or material.

Program: A plan of action aimed at accomplishing a clear technical, production, or business objective, with details on what work is to be done, by whom, when, and what means or resources will be used. In computing, a set of coded instructions that a computer executes to solve a problem or produce a desired result.

Program manager: The person ultimately responsible for the overall procurement, development, integration, modification, operation, and maintenance of an IT system.

Protected health information (PHI): Individually identifiable health information that is:

                Transmitted by electronic media

                Maintained in any medium described in the definition of electronic media (under HIPAA)

                Transmitted or maintained in any other form or medium

Protection profile (PP): In the Common Criteria, an implementation-independent specification of the security requirements and protections of a product that could be built.

Protection-critical portions of the TCB: Those portions of the TCB whose normal function is to deal with access control between subjects and objects. Their correct operation is essential to the protection of the data on the system.

Protocols: A set of rules and formats used to specify interactions between communicating entities.

Prototyping: A method of determining or verifying requirements and design specifications. The prototype normally consists of network hardware and software that support a proposed solution. The approach to prototyping is typically a trial-and-error experimental process.

Public key cryptography: See asymmetric key encryption.

Public key infrastructure (PKI): A PKI binds public keys to entities, enables other entities to verify public key bindings, and provides the services needed for ongoing management of keys in a distributed system. The goal of the PKI security architecture is to protect and distribute information that is needed in a widely distributed environment, where the users, resources, and stakeholders are in different places at different times. A PKI allows business to be conducted with the justifiable confidence that:

                The person or process sending the transaction is the actual originator.

                The person or process receiving the transaction is the intended recipient.

                Data integrity has not been compromised.

Purge: The removal of sensitive data from an automation and control system, IT system, storage device, or peripheral device with storage capacity at the end of a processing period. This action is performed in such a way that there is assurance proportional to the sensitivity of the data that the data cannot be reconstructed.

Quality: The degree to which software or hardware meets its specifications and satisfies its intended purpose.

RADIUS: See remote authentication dial-in user service.

RC4: Rivest Cipher 4, a stream cipher designed by Ron Rivest of RSA Security.

Read: A fundamental operation that results only in the flow of information from an object to a subject.

Read access: Permission to read information.

Read-only memory (ROM): Computer memory on which data has been written and stored. Once data has been written onto a ROM chip, it cannot be removed and can only be read.

Recovery planning: The advance planning and preparations that are necessary to minimize loss and to ensure the availability of the critical information systems of an organization in the event of a system failure or outage/disruption.

Recovery procedures: The actions that are necessary to restore a system’s computational capability and data files after a system failure or outage/disruption.

Reference-monitor concept: An access-control concept that refers to the use of an abstract machine (mechanism) that mediates all access to objects by subjects.

Reference-validation mechanism: An implementation of the reference monitor concept. A security kernel is a type of reference-validation mechanism.

Regression testing: Testing process used to ensure that existing software functions of the product have not been accidentally damaged as an unintended by-product of adding additional software features.
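
A minimal sketch with Python's standard unittest module: the expected outputs of an existing feature are pinned down, so a later change that accidentally breaks them is caught. The discount() function and its expected values are invented for the example.

```python
# Illustrative regression test: existing behavior is recorded as assertions
# so that future changes cannot silently alter it. discount() is hypothetical.

import unittest

def discount(price: float, percent: float) -> float:
    """Existing feature whose behavior the regression suite protects."""
    return round(price * (1 - percent / 100), 2)

class DiscountRegressionTests(unittest.TestCase):
    def test_existing_behavior_unchanged(self):
        # Expectations captured before any new features were added.
        self.assertEqual(discount(100.0, 10), 90.0)
        self.assertEqual(discount(19.99, 0), 19.99)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountRegressionTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True while existing behavior is intact
```

In practice such suites are rerun automatically after every change, so a failing regression test points directly at the change that damaged an existing function.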

Relevancies: Attributes of a software component that are applicable to its attack surface. A system’s attack surface is the set of ways in which an adversary can enter the system and potentially cause damage.

Reliability: The probability of a given system performing its mission adequately for a specified period of time under expected operating conditions.

Remote authentication dial-in user service (RADIUS): A server for remote user authentication and accounting. Its primary use is by Internet service providers, though it may also be used on any network that needs a centralized authentication and/or accounting service for its workstations.

Requirement: A description of a functional or nonfunctional (security) behavior that software or hardware must meet.

Residual risk: The portion of risk that remains after security measures have been applied.

Resilience: The ability to quickly adapt and recover from any known or unknown changes to the environment through holistic implementation of risk management, contingency, and continuity planning. In software, a measure of how effectively software can recover after being compromised. If highly resilient software is compromised, damage to the software will be minimized and it will recover quickly to an acceptable level of service.

RFC: Request for comment. Documents that are the working notes of the Internet research and development community published by the Internet Engineering Task Force (IETF). An RFC document may be on essentially any topic related to computer communication, and can describe methods, protocols, behaviors, and so on pertaining to the Internet.

RFP: Request for proposal. A document issued at an early stage of a procurement process that invites organizations to submit proposals to provide goods or services.

RIP: See routing information protocol.

Risk: A measure of the extent to which an entity is threatened by a potential circumstance or event, and typically a function of: (i) the adverse impacts that would arise if the circumstance or event occurs; and (ii) the likelihood of occurrence.

Risk analysis: The process of identifying security risks, determining their magnitude, and identifying areas needing safeguards. Risk analysis is a part of risk management.

Risk assessment: The process that systematically identifies potential vulnerabilities to valuable system resources and threats to those resources, quantifies loss exposures and consequences based on probability of occurrence, and (optionally) recommends how to allocate resources to countermeasures to minimize total exposure.

Risk management: Process of identifying and applying countermeasures commensurate with the value of the assets protected based on a risk assessment.

Robustness: The degree to which a software component or system can function correctly in the presence of invalid or unexpected inputs and unexpected or stressful environmental conditions, including input and conditions that are intentional and malicious. Robust systems are designed to operate under a variety of conditions within a certain range and will fail gracefully outside of that range. Resilient systems have the ability to recover and adapt to any known environmental changes.

Role: In role-based access control, permitted actions on resources are identified with roles rather than with individual subject identities.

ROM: See read-only memory.

Router: A network component that provides internetworking at the network layer of a network’s architecture by allowing individual networks to become part of a wide area network (WAN). A router works by using logical and physical addresses to connect two or more separate networks. It determines the best path by which to send a packet of information.

Routing information protocol (RIP): A common type of routing protocol. RIP bases its routing path on the distance (number of hops) to the destination. RIP maintains optimum routing paths by sending out routing update messages if the network topology changes.

RSA: Rivest, Shamir, and Adleman.

Safeguards: See security safeguards.

Safety: The property that life-critical software or hardware for industrial automation and control systems and IT systems should exhibit; that is, such systems should behave as necessary and required even if components of the system fail.

Safety instrumented system (SIS): System used to implement one or more safety instrumented functions. Note: A safety instrumented system is composed of any combination of sensors, logic solvers, and actuators.

Safety integrity level: Discrete level (one of four) for specifying the safety integrity requirements of the safety instrumented functions to be allocated to a safety instrumented system. Note: Safety integrity level 4 has the highest level of safety integrity; safety integrity level 1 has the lowest.

Safety network: Network that connects safety instrumented systems for the communication of safety-related information.

Sandbox: An access-control-based protection mechanism, commonly applied to restrict the access rights of mobile code that is downloaded from a Web site as an applet. The code is set up to run in a “sandbox” that blocks its access to the local workstation’s hard disk, thus preventing malicious activity by the code. The sandbox is usually enforced by a virtual machine such as the Java Virtual Machine (JVM).

Scalar: A computer language variable or field that can hold only one value at a time.

Scanning: Sending packets or requests to another system to gain information to be used in a subsequent attack.

SDLC: See system development life cycle.

Secure coding: The use of software programming practices that reduce or eliminate software defects or programming errors so that software can be built with a higher level of security and quality assurance.

Secure configuration management: The set of procedures that are appropriate for controlling changes to a system’s hardware and software structure for the purpose of ensuring that changes will not lead to violations of the system’s security policy.

Secure sockets layer (SSL): A protocol used for protecting private information during transmission via the Internet.

Secure state: A condition in which no subject can access any object in an unauthorized manner.

Security: Measures and controls that ensure the confidentiality, integrity, availability, and accountability of the information processed, transmitted, and stored by industrial automation and control systems or IT systems.

Security evaluation: An evaluation that is performed to assess the degree of trust that can be placed in a system for the secure handling of sensitive information. One type, a product evaluation, is an evaluation performed on the hardware and software features and assurances of a computer product from a perspective that excludes the application environment. The other type, a system evaluation, is made for the purpose of assessing a system’s security safeguards with respect to a specific operational mission; it is a major step in the certification and accreditation process.

Security failure: An event that is a violation of a particular system’s explicit or implicit security policy.

Security fault analysis: A security analysis, usually performed on hardware at the gate level, to determine the security properties of a device when a hardware fault is encountered.

Security features: The security-relevant functions, mechanisms, and characteristics of system hardware and software. Security features are a subset of system security safeguards.

Security flaw: An error of commission or omission in a system that might enable protection mechanisms to be bypassed.

Security flow analysis: A security analysis performed on a formal system specification that locates the potential flows of information within the system.

Security functional requirements: Requirements, preferably from the Common Criteria, Part 2, that when taken together specify the security behavior of an industrial automation and control system or IT product or system.

Security kernel: The hardware, firmware, and software elements of a trusted computing base (TCB) that implement the reference monitor concept. The security kernel must mediate all accesses, must be protected from modification, and must be verifiable as correct.

Security level: The combination of a hierarchical classification and a set of nonhierarchical categories that represents the sensitivity of information.

Security measures: Elements of software, firmware, hardware, or procedures that are included in a system for the satisfaction of security specifications.

Security objective: A statement of intent to counter specified threats and/or satisfy specified organizational security policies and assumptions.

Security perimeter: The boundary where security controls are in effect to protect assets.

Security policy: The set of laws, rules, and practices that regulates how an organization manages, protects, and distributes sensitive information.

Security policy model: A formal presentation of the security policy enforced by the system. It must identify the set of rules and practices that regulate how a system manages, protects, and distributes sensitive information. See Bell-LaPadula model and formal security policy model.

Security process: The series of activities that monitor, evaluate, test, certify, accredit, and maintain the system accreditation throughout the system life cycle.

Security requirements: The types and levels of protection that are necessary for equipment, data, information, applications, and facilities to meet the security policy.

Security requirements baseline: A description of the minimum requirements necessary for a system to maintain an acceptable level of security.

Security safeguards: The proactive protective measures and controls that are prescribed to meet the security requirements specified for a system. Those safeguards can include (but are not necessarily limited to) hardware and software security features, operating procedures, accountability procedures, access and distribution controls, management constraints, personnel security, and physical structures, areas, and devices. Also called safeguards.

Security specifications: A detailed description of the safeguards required to protect a system.

Security target (ST): (1) In the Common Criteria, a listing of the security claims for a particular IT security product; (2) A set of security functional and assurance requirements and specifications to be used as the basis for evaluating an identified product or system.

Security test and evaluation (ST&E): Examination and analysis of the safeguards required to protect an IT system, as they have been applied in an operational environment, to determine the security posture of that system. Security posture is the security status of an industrial control and automation system or an enterprise’s networks, information, and systems based on information assurance resources (e.g., people, hardware, software, policies) and capabilities in place to manage the defense of the automation system or enterprise and to react as the situation changes.

Security testing: A process that is used to determine that the security features of a system are implemented as designed. This process includes hands-on functional testing, penetration testing, and verification.

Sensitive information: Information that if lost, misused, modified, or accessed by unauthorized individuals could affect the national interest, industrial automation and control system safety and operations, or IT systems.

Sensitivity label: A piece of information that represents the security level of an object. Sensitivity labels are used by the TCB as the basis for mandatory access control decisions.

Sensors and actuators: Measuring elements (sensors) and actuating elements (actuators) connected to process equipment and to the control system.

Service roles: The context in which a software component operates, consisting of type of service, authentication mechanism, and identities.

Session layer: One of the seven OSI model layers. Establishes, manages, and terminates sessions between applications.

Shared key authentication: A type of authentication that assumes each station has received a secret shared key through a secure channel, independent from an 802.11 network. Stations authenticate through shared knowledge of the secret key. Use of shared key authentication requires implementation of the 802.11 Wired Equivalent Privacy (WEP) algorithm.

Simple mail transfer protocol (SMTP): The Internet email protocol.

Simple Network Management Protocol (SNMP): Internet standard protocol for managing devices on IP computer networks.

Simple security condition: See simple security property.

Simple security property: A Bell-LaPadula security model rule enabling a subject read access to an object only if the security level of the subject dominates the security level of the object. Synonymous with simple security condition.

SLC: See software life cycle.

Social engineering: Attacks targeting an organization’s employees through the use of social skills to obtain sensitive information.

Software assurance: The level of confidence that software is free from vulnerabilities, either intentionally designed into the software or accidentally inserted at any time during its life cycle, and that the software functions in the intended manner.

System development life cycle (SDLC): The scope of activities associated with a system, encompassing the system’s initiation, development and acquisition, implementation, operation and maintenance, and ultimately its disposal that instigates another system initiation.

Software development life cycle process: The process using a model to translate user needs into a software product.

Software engineering: The science and art of specifying, designing, implementing, and evolving programs, documentation, and operating procedures whereby computers can be made useful to humans.

Software life cycle (SLC): Comprises software concept, requirements analysis, architectural design, detailed design, code construction and unit testing, system testing, release, and code distribution and maintenance.

Software process: A set of activities, methods, and practices that are used to develop and maintain software and associated products.

Software process capability: The range of expected results that can be achieved by following a software process.

Software process maturity: The extent to which a software process is defined, managed, measured, controlled, and effective.

Software process performance: The result achieved by following a software process.

Software security: General-purpose executive, utility, or software development tools and application programs or routines used to protect data that are handled by a system.

Software system test and evaluation process: A process that plans, develops, and documents the quantitative demonstration of the fulfillment of all baseline functional performance and operational and interface requirements of a software system.

Software weakness: A portion of code that may lead to a vulnerability.

Software-intensive system: A system in which the majority of components and functionalities are implemented in software.

SONET: See synchronous optical networking.

Source code: A series of statements written in a human-readable computer programming language.

Spoofing: Attempting to gain access to a system by posing as an authorized user. Synonymous with impersonating, masquerading, or mimicking.

SQL: See structured query language.

SQL injection: The process of an attacker inserting SQL statements into a query by exploiting a vulnerability for the purpose of sending commands to a Web server database.
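
The contrast between injectable and safe query construction can be sketched with Python's standard `sqlite3` module; the table, credentials, and attack string below are hypothetical illustrations:

```python
import sqlite3

# Hypothetical single-table in-memory database for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

malicious = "' OR '1'='1"

# Vulnerable: attacker input concatenated directly into the query text,
# so the injected OR clause matches every row regardless of password.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE password = '" + malicious + "'"
).fetchall()

# Safe: a parameterized query treats the attacker's input as a literal
# value, never as SQL syntax.
safe = conn.execute(
    "SELECT name FROM users WHERE password = ?", (malicious,)
).fetchall()

print(vulnerable)  # [('alice',)] -- injection succeeded
print(safe)        # []           -- injection defeated
```

The same parameterization principle applies to any database driver, not just SQLite.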

SSL: See Secure Sockets Layer.

SSO: See system security officer.

ST: See security target.

ST&E: See security test and evaluation.

Standalone (shared system): A system that is physically and electrically isolated from all other systems and intended to be used by more than one person, either simultaneously (e.g., a system that has multiple terminals) or serially, with data belonging to one user remaining available to the system while another user uses the system (e.g., a personal computer that has nonremovable storage media, such as a hard disk).

Standalone (single-user system): A system that is physically and electrically isolated from all other systems and is intended to be used by one person at a time, with no data belonging to other users remaining in the system (e.g., a personal computer that has removable storage media, such as a flash drive).

Star property (* property): A Bell-LaPadula security model rule giving a subject write access to an object only if the security level of the object dominates the security level of the subject. Also called the confinement property.
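
The star property, together with the simple security property defined earlier, can be sketched as a pair of dominance checks. This is an illustrative simplification using plain integer levels; full Bell-LaPadula dominance also compares nonhierarchical category sets:

```python
# Minimal sketch (not a reference monitor): higher number dominates lower.
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_level, object_level):
    # Simple security property: read allowed only if the subject's level
    # dominates the object's level -- "no read up".
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level, object_level):
    # Star (*) property: write allowed only if the object's level
    # dominates the subject's level -- "no write down".
    return LEVELS[object_level] >= LEVELS[subject_level]

print(can_read("secret", "confidential"))   # True: reading down is allowed
print(can_read("confidential", "secret"))   # False: no read up
print(can_write("secret", "confidential"))  # False: no write down
print(can_write("confidential", "secret"))  # True: writing up is allowed
```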

State variable: A variable that represents either the state of the system or the state of some system resource.

Structured query language (SQL): An international standard for defining and accessing relational databases.

Subject: An active entity, generally in the form of a person, process, or device, that causes information to flow among objects or that changes the system state.

Subversion: An intentional violation of software’s integrity or its security controls in order to compromise the software or the using system.

Supervisor state: Synonymous with executive state.

Survivability: The capability of software or hardware to fulfill its objectives in the presence of attacks, failures, or accidents.

Symmetric (private) key encryption: Cryptographic system in which the sender and receiver both know a secret key that is used to encrypt and decrypt a message.
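
The defining property, that one shared secret key both encrypts and decrypts, can be shown with a toy XOR keystream. This is illustration only: a repeating XOR key is not a secure cipher, and real systems use vetted algorithms such as AES:

```python
import itertools

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same function and the same key
    # perform both encryption and decryption.
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

shared_key = b"secret-key"            # known to both sender and receiver
ciphertext = xor_crypt(b"meter reading: 42", shared_key)
plaintext = xor_crypt(ciphertext, shared_key)
print(plaintext)  # b'meter reading: 42'
```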

Synchronous optical networking (SONET): A fiber-optic transmission system for high-speed digital traffic. SONET is part of the Broadband Integrated Services Digital Network (BISDN) standard.

Synchronous transmission: A type of communications data synchronization whereby frames are sent within defined time periods. It uses a clock to control the timing of bits being sent in a frame. See asynchronous transmission.

System: Any organized assembly of resources and procedures united and regulated by interaction or interdependence to accomplish a set of specific functions.

System development methodologies: Methodologies developed through software engineering to manage the complexity of system development. Development methodologies include software engineering aids and high-level design analysis tools.

System entity: A system subject (user or process) or object.

System integrity: A characteristic of a system that performs its intended function in an unimpaired manner, free from deliberate or inadvertent unauthorized manipulation of the system. Also see integrity.

System low: The lowest security level supported by a system at a particular time or in a particular environment.

System security officer (SSO): Individual having the responsibility to ensure the security of an industrial automation and control system or IT system, excluding, for example, guards, physical security personnel, law enforcement officials, and disaster recovery officials.

System testing: A type of testing that verifies the installation of the entire network. Testers normally complete system testing in a simulated production environment, simulating actual users in order to ensure that the network meets all stated requirements.

Tainted input: Input data that has not been examined or sanitized prior to use by an application.
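
One common way to keep tainted input out of an application is an allowlist check before use; the pattern and field name below are hypothetical:

```python
import re

# Hypothetical allowlist: accept only short alphanumeric/underscore tags.
TAG_PATTERN = re.compile(r"[A-Za-z0-9_]{1,16}")

def sanitize_tag(raw: str) -> str:
    # Reject anything outside the allowlist before the value reaches
    # queries, file paths, or shell commands.
    if TAG_PATTERN.fullmatch(raw) is None:
        raise ValueError("tainted input rejected: " + repr(raw))
    return raw

print(sanitize_tag("pump_07"))          # accepted
try:
    sanitize_tag("pump'; DROP TABLE")   # rejected before it can do harm
except ValueError as exc:
    print(exc)
```

Allowlisting (defining what is valid) is generally preferred over blocklisting (enumerating known-bad patterns), which is easy to bypass.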

Tampering: Any unauthorized modification that alters the proper functioning of equipment or a system in a manner that degrades the security or functionality that it provides.

Target of evaluation (TOE): In the Common Criteria, TOE refers to the product to be tested.

TCB: See trusted computing base.

TCP: See Transmission Control Protocol.

Technical attack: An attack that can be perpetrated by circumventing or nullifying hardware and software protection mechanisms, rather than by subverting system personnel or other users through social engineering.

Technical vulnerability: A hardware, firmware, communication, or software flaw that leaves a computer processing system open for potential exploitation, either externally or internally, thereby resulting in a risk to the owner, user, or manager of the system.

TELNET (not an acronym): A virtual terminal protocol used in the Internet, enabling users to log in to a remote host. TELNET is defined as part of the TCP/IP protocol suite.

TEMPEST: Transient ElectroMagnetic Pulse Emanations Standard, the U.S. government standard for control of spurious compromising emanations emitted by electrical equipment; also used to refer to the investigation, study, and control of such emanations.

Test case: An executable test with a specific set of input values and a corresponding expected result.
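
A minimal sketch of such a test case, with a hypothetical unit under test, pairing a specific set of input values with the expected result:

```python
def clamp(value, low, high):
    """Constrain value to the inclusive range [low, high]."""
    return max(low, min(value, high))

def test_clamp_above_range():
    # Input values: 150 clamped to [0, 100]; expected result: 100.
    assert clamp(150, 0, 100) == 100

test_clamp_above_range()
print("test passed")
```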

Threat: Any circumstance or event with the potential to cause harm to an industrial automation and control system or IT system in the form of destruction, disclosure, adverse modification of data, and/or denial of service.

Threat agent: A means of exploiting a vulnerability in a system, operation, or facility.

Threat analysis: The examination of all actions and events that might adversely affect a system or operation.

Threat factor: The characterization of a threat based on the violation of confidentiality, integrity, or availability (CIA); the asset being targeted; the role of the user; and the type of accessibility. It helps refine the application context to aid in analyzing the threat.

Threat monitoring: The analysis, assessment, and review of audit trails and other data that are collected for the purpose of searching for system events that might constitute violations or attempted violations of system security.

TLS: See transport layer security.

TOE: See target of evaluation.

Top-level specification: A nonprocedural description of system behavior at the most abstract level; typically, a functional specification that omits all implementation details.

Topology: A description of a network’s geographical layout of nodes and links.

Traceroute: A software utility used to determine the path to a target computer.

Transmission Control Protocol (TCP): A commonly used protocol for establishing and maintaining communications between applications on different computers. TCP provides full-duplex, acknowledged, and flow-controlled service to upper-layer protocols and applications.

Transmission Control Protocol / Internet Protocol (TCP/IP): A de facto industry-standard protocol for interconnecting disparate networks. TCP and IP are standard protocols that define both the reliable full-duplex transport level and the connectionless, best-effort unit of information passed across a network such as the Internet.

Transport layer: OSI model layer that provides mechanisms for the establishment, maintenance, and orderly termination of virtual circuits while shielding the higher layers from the network implementation details.

Transport layer security (TLS): An authentication and security protocol widely implemented in browsers and Web servers.

Trap door: A hidden software or hardware mechanism that can be triggered to permit system protection mechanisms to be circumvented. It is activated in a manner that appears innocent—for example, a special “random” key sequence at a terminal. Software developers often introduce trap doors in their code to enable them to re-enter the system and perform certain functions.

Trojan horse: A computer program that has an apparently or actually useful function but contains additional (hidden) functions that surreptitiously exploit the legitimate authorizations of the invoking process to the detriment of security or integrity.

True positive: When a security tool reports a weakness that is actually present. See false positive, false negative.

Trust: In a relationship between entities, the confidence that each entity will behave as expected, dependably, securely, and reliably within a specified context.

Trusted computer system: A system that employs sufficient hardware and software security assurance measures to enable its use for the simultaneous processing of a range of sensitive or classified information.

Trusted computing base (TCB): The totality of protection mechanisms within a computer system, including hardware, firmware, and software, the combination of which is responsible for enforcing a security policy. A TCB consists of one or more components that together enforce a unified security policy over a product or system. The ability of a TCB to correctly enforce a unified security policy depends solely on the mechanisms within the TCB and on the correct input of parameters by system administrative personnel (e.g., a user’s clearance level) related to the security policy.

Trusted distribution: A method for distributing the TCB hardware, software, and firmware components, both originals and updates, that provides means for protecting the TCB from modification during distribution and for the detection of any changes to the TCB that might occur.

Trusted path: A mechanism by which a person at a terminal can communicate directly with the TCB. This mechanism can be activated only by the person or by the TCB and cannot be imitated by untrusted software.

Trusted process: A process whose incorrect or malicious execution is capable of violating system security policy.

Trusted software: The software portion of the TCB.

Trustworthiness: The attribute of a person or organization that provides confidence to others of the qualifications, capabilities, and reliability of that entity to perform specific tasks and fulfill assigned responsibilities.

UDP: See User Datagram Protocol.

Unit: See lines, units, cells.

Untrusted process: A process that has not been evaluated or examined for adherence to the security policy. It might include incorrect or malicious code that attempts to circumvent the security mechanisms.

Use case: A technique for capturing potential functional requirements that employs the use of one or more scenarios that convey how the system should interact with the end user or another system to achieve a specific goal.

Note: Use cases typically treat the system as a black box, and the interactions with the system, including system responses, are as perceived from outside the system. Use cases are popular because they simplify the description of requirements and avoid the problem of making assumptions about how this functionality will be accomplished.

User: (1) A person or process that is accessing an industrial automation and control system or IT system either by direct connection (e.g., via terminals) or by indirect connection (in other words, preparing input data or receiving output that is not reviewed for content or classification by a responsible individual); (2) A person or process authorized to access an industrial automation and control system or IT system.

User Datagram Protocol (UDP): UDP uses the underlying Internet Protocol (IP) to transport a message. This is an unreliable, connectionless delivery scheme. It does not use acknowledgments to ensure that messages arrive and does not provide feedback to control the rate of information flow. UDP messages can be lost, duplicated, or arrive out of order. It is useful in sending high volumes of data such as streaming video.
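
UDP's fire-and-forget character can be sketched with Python's standard `socket` module: one datagram is sent with no handshake, no acknowledgment, and no flow control (loopback delivery happens to be reliable, which is what makes this self-contained sketch work):

```python
import socket

# Receiver: bind a datagram socket; the OS picks a free port.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))
port = receiver.getsockname()[1]

# Sender: no connection setup -- just address the datagram and send.
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"sensor frame 1", ("127.0.0.1", port))

data, addr = receiver.recvfrom(1024)
print(data)  # b'sensor frame 1'
sender.close()
receiver.close()
```

Over a real network, an application needing delivery guarantees on top of UDP must supply its own sequencing and retransmission.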

User ID: A unique symbol or character string that is used by a system to identify a specific user.

Validation: A process for evaluating whether software will satisfy the specified requirements (functional and nonfunctional) for its intended purpose.

Verification: A process for evaluating whether software conforms to the specifications, regulations, or other conditions imposed on it while it was being developed.

Verification and validation (V&V): All of the verification and validation activities that are undertaken to ensure that software will function according to its specification.

Virus: A self-replicating or self-reproducing program that spreads by inserting copies of itself into other executable code or documents.

Vulnerability: A weakness in an information system, system security procedures, internal controls, or implementation that could be exploited or triggered by a threat source.

Vulnerability analysis: A method of determining vulnerability that includes the susceptibility of a particular system to a specific attack and the opportunities that are available to a threat agent to mount that attack.

Vulnerability assessment: Systematic examination of an industrial automation and control system or IT system or product to determine the adequacy of security measures, identify security deficiencies, provide data from which to predict the effectiveness of proposed security measures, and confirm the adequacy of such measures after implementation.

Weakness: A bug in software that has the potential to be exploited as a vulnerability when the software is operational.

White box test: A test in which the ethical hacking team has full knowledge of the target information system.

White hat hacker: An individual who conducts ethical hacking to help secure and protect an organization’s information systems.

Wide area network (WAN): A network that interconnects users over a wide area, usually encompassing different metropolitan areas.

Wired Equivalent Privacy (WEP): The algorithm of the IEEE 802.11 wireless LAN standard that is used to protect transmitted information from disclosure. WEP is designed to prevent the violation of the confidentiality of data transmitted over a wireless LAN. WEP generates secret shared encryption keys that both source and destination stations use to alter frame bits to avoid disclosure to eavesdroppers. WEP is no longer considered a viable encryption mechanism due to known weaknesses.

Wired Equivalent Privacy (WEP) encryption: An encryption method specified in the IEEE 802.11 standard for wireless security. See Wired Equivalent Privacy (WEP).

Wireless: A property of a computing device whereby it can access a network without a wired connection.

WLAN: Wireless local area network.

Work factor: An estimate of the effort or time needed by a potential intruder who has specified expertise and resources to overcome a protective measure.

Work function: The difficulty of recovering plaintext from ciphertext, as measured by cost and/or time. The security of the system is directly proportional to the value of the work function. The work function need only be large enough to suffice for the intended application. If the message to be protected loses its value after a short period of time, the work function need only be large enough to ensure that the decryption would be highly infeasible in that period of time.
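
A back-of-the-envelope work function estimate for exhaustive key search, under assumed attacker capabilities (the key length and guess rate below are illustrative, not claims about any specific system):

```python
# Expected brute-force time is roughly half the key space divided by
# the attacker's guess rate.
key_bits = 56                 # e.g., a legacy 56-bit key
guesses_per_second = 1e9      # assumed attacker capability

expected_seconds = (2 ** key_bits / 2) / guesses_per_second
expected_days = expected_seconds / 86_400
print(round(expected_days, 1))  # 417.0
```

If the protected message loses its value in, say, a week, this work function is sufficient for that application; if the data must stay secret for years, it is not.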

Write: A fundamental operation that results only in the flow of information from a subject to an object.

Write access: Permission to write to an object.

Zeroizing: The act of erasing sensitive parameters from a cryptographic module.

Zeroization: The practice of erasing sensitive parameters (especially keys) from a cryptographic module to prevent their disclosure. The keys are typically zeroized (erased) in response to an attempt to tamper with the module.
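
Zeroization can be sketched in Python by overwriting a key held in a mutable buffer; the key bytes below are hypothetical. A `bytearray` is used because it can be overwritten in place, whereas immutable `bytes` cannot (and real cryptographic modules zeroize at the hardware or memory-management level, beyond what a garbage-collected language guarantees):

```python
# Hypothetical sensitive parameter held in a mutable buffer.
key = bytearray(b"\x13\x37\xca\xfe\xba\xbe\xde\xad")

def zeroize(buf: bytearray) -> None:
    # Overwrite every byte of the sensitive parameter in place.
    for i in range(len(buf)):
        buf[i] = 0

zeroize(key)
print(key)  # bytearray(b'\x00\x00\x00\x00\x00\x00\x00\x00')
```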