Chattanooga Software Center
 

Glossary of Computer Software Development Terminology

 

The terms are defined, as much as possible, using available standards. The source of such definitions appears immediately following the term or phrase in parentheses, e.g., (NIST).

 

The source documents are listed at the bottom of this page.

 

 
 

TB. terabyte.

 

TCP/IP. transmission control protocol/Internet protocol.

 

tape. Linear magnetic storage hardware, rolled onto a reel or cassette.

 

telecommunication system. The devices and functions relating to transmission of data between the central processing system and remotely located users.

 

terabyte. Approximately one trillion bytes; precisely 2^40 or 1,099,511,627,776 bytes. See: kilobyte, megabyte, gigabyte.

 

terminal. A device, usually equipped with a CRT display and keyboard, used to send and receive information to and from a computer via a communication channel.

 

test. (IEEE) An activity in which a system or component is executed under specified conditions, the results are observed or recorded and an evaluation is made of some aspect of the system or component.

 

testability. (IEEE) (1) The degree to which a system or component facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met. (2) The degree to which a requirement is stated in terms that permit establishment of test criteria and performance of tests to determine whether those criteria have been met. See: measurable.

 

test case. (IEEE) Documentation specifying inputs, predicted results, and a set of execution conditions for a test item. Syn: test case specification. See: test procedure.

 

test case generator. (IEEE) A software tool that accepts as input source code, test criteria, specifications, or data structure definitions; uses these inputs to generate test input data; and, sometimes, determines expected results. Syn: test data generator, test generator.

 

test design. (IEEE) Documentation specifying the details of the test approach for a software feature or combination of software features and identifying the associated tests. See: testing functional; cause effect graphing; boundary value analysis; equivalence class partitioning; error guessing; testing, structural; branch analysis; path analysis; statement coverage; condition coverage; decision coverage; multiple-condition coverage.

 

test documentation. (IEEE) Documentation describing plans for, or results of, the testing of a system or component. Types include test case specification, test incident report, test log, test plan, test procedure, and test report.

 

test driver. (IEEE) A software module used to invoke a module under test and, often, provide test inputs, control and monitor execution, and report test results. Syn: test harness.
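
As a purely illustrative sketch (not taken from the IEEE standard), the following Python snippet plays the role of a test driver: it invokes a hypothetical module under test (add), supplies test inputs, and reports the results.

```python
# Minimal test driver sketch: supplies inputs to a hypothetical unit under
# test, checks the outputs, and reports pass/fail results.

def add(a, b):            # stand-in for the module under test (hypothetical)
    return a + b

def run_tests():
    cases = [((2, 3), 5), ((0, 0), 0), ((-1, 1), 0)]   # (inputs, expected result)
    failures = 0
    for inputs, expected in cases:
        actual = add(*inputs)
        status = "PASS" if actual == expected else "FAIL"
        if status == "FAIL":
            failures += 1
        print(f"{status}: add{inputs} -> {actual} (expected {expected})")
    print(f"{len(cases) - failures}/{len(cases)} test cases passed")

if __name__ == "__main__":
    run_tests()
```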

 

test harness. See: test driver.

 

test incident report. (IEEE) A document reporting on any event that occurs during testing that requires further investigation. See: failure analysis.

 

test item. (IEEE) A software item which is the object of testing.

 

test log. (IEEE) A chronological record of all relevant details about the execution of a test.

 

test phase. (IEEE) The period of time in the software life cycle in which the components of a software product are evaluated and integrated, and the software product is evaluated to determine whether or not requirements have been satisfied.

 

test plan. (IEEE) Documentation specifying the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, responsibilities, required resources, and any risks requiring contingency planning. See: test design, validation protocol.

 

test procedure. (NIST) A formal document developed from a test plan that presents detailed instructions for the setup, operation, and evaluation of the results for each defined test. See: test case.

 

test readiness review. (IEEE) (1) A review conducted to evaluate preliminary test results for one or more configuration items; to verify that the test procedures for each configuration item are complete, comply with test plans and descriptions, and satisfy test requirements; and to verify that a project is prepared to proceed to formal testing of the configuration items. (2) A review as in (1) for any hardware or software component. Contrast with code review, design review, formal qualification review, requirements review.

 

test report. (IEEE) A document describing the conduct and results of the testing carried out for a system or system component.

 

test result analyzer. A software tool used for test output data reduction, formatting, and printing.

 

testing. (IEEE) (1) The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component. (2) The process of analyzing a software item to detect the differences between existing and required conditions, i.e. bugs, and to evaluate the features of the software items. See: dynamic analysis, static analysis, software engineering.

 

testing, 100%. See: testing, exhaustive.

 

testing, acceptance. (IEEE) Testing conducted to determine whether or not a system satisfies its acceptance criteria and to enable the customer to determine whether or not to accept the system. Contrast with testing, development; testing, operational. See: testing, qualification.

 

testing, alpha. (Pressman) Acceptance testing performed by the customer in a controlled environment at the developer's site. The software is used by the customer in a setting approximating the target environment with the developer observing and recording errors and usage problems.

 

testing, assertion. (NBS) A dynamic analysis technique which inserts assertions about the relationship between program variables into the program code. The truth of the assertions is determined as the program executes. See: assertion checking, instrumentation.
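
A minimal sketch of the idea (the function and its invariants are hypothetical, not drawn from the NBS document): assertions about the relationship between program variables are placed in the code and evaluated as it executes.

```python
# Assertion testing sketch: inserted assertions state relationships between
# program variables and are checked while the program runs.

def running_total(values):
    total = 0
    count = 0
    for v in values:
        total += v
        count += 1
        # inserted assertion: relationship that must hold at this point
        assert count <= len(values), "count must never exceed the input length"
    assert count == len(values), "every input value must be processed exactly once"
    return total

print(running_total([3, 1, 4]))   # executes with the assertions evaluated
```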

 

testing, beta. (1) (Pressman) Acceptance testing performed by the customer in a live application of the software, at one or more end user sites, in an environment not controlled by the developer. (2) For medical device software such use may require an Investigational Device Exemption [IDE] or Institutional Review Board [IRB] approval.

 

testing, boundary value. A testing technique using input values at, just below, and just above, the defined limits of an input domain; and with input values causing outputs to be at, just below, and just above, the defined limits of an output domain. See: boundary value analysis; testing, stress.
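
For illustration only, assuming a hypothetical input domain of 1 to 100, a boundary value test set might exercise values at, just below, and just above each limit:

```python
# Boundary value testing sketch for a hypothetical input domain of 1..100.

def accept_quantity(q):           # hypothetical unit under test
    return 1 <= q <= 100

boundary_values = [0, 1, 2, 99, 100, 101]   # around the lower and upper limits
for q in boundary_values:
    print(f"quantity={q:3d} accepted={accept_quantity(q)}")
```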

 

testing, branch. (NBS) Testing technique to satisfy coverage criteria which require that for each decision point, each possible branch [outcome] be executed at least once. Contrast with testing, path; testing, statement. See: branch coverage.
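
A small illustrative sketch (the function and threshold are hypothetical): the decision point below has two possible outcomes, so a branch-adequate test set must execute both.

```python
# Branch testing sketch: one decision point, two branches, and a test set
# that executes each branch at least once.

def classify(temp_c):
    if temp_c > 40.0:      # decision point
        return "alarm"     # branch 1
    return "normal"        # branch 2

assert classify(41.0) == "alarm"    # exercises branch 1
assert classify(39.0) == "normal"   # exercises branch 2
print("both branches executed at least once")
```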

 

testing, compatibility. The process of determining the ability of two or more systems to exchange information. In a situation where the developed software replaces an already working program, an investigation should be conducted to assess possible compatibility problems between the new software and other programs or systems. See: different software system analysis; testing, integration; testing, interface.

 

testing, component. See: testing, unit.

 

testing, design based functional. (NBS) The application of test data derived through functional analysis extended to include design functions as well as requirement functions. See: testing, functional.

 

testing, development. (IEEE) Testing conducted during the development of a system or component, usually in the development environment by the developer. Contrast with testing, acceptance; testing, operational.

 

testing, exhaustive. (NBS) Executing the program with all possible combinations of values for program variables. Feasible only for small, simple programs.
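
As an illustration, a three-input Boolean function has only 2^3 = 8 possible input combinations, so every one can be executed; the function below is a hypothetical example.

```python
# Exhaustive testing sketch: all 8 input combinations of a small Boolean
# function are executed and checked against an independent expected result.
from itertools import product

def majority(a, b, c):             # unit under test: true when 2 or more inputs are true
    return (a and b) or (a and c) or (b and c)

for a, b, c in product([False, True], repeat=3):
    expected = sum([a, b, c]) >= 2         # independent statement of the expected result
    assert majority(a, b, c) == expected
print("all 8 input combinations tested")
```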

 

testing, formal. (IEEE) Testing conducted in accordance with test plans and procedures that have been reviewed and approved by a customer, user, or designated level of management. Antonym: informal testing.

 

testing, functional. (IEEE) (1) Testing that ignores the internal mechanism or structure of a system or component and focuses on the outputs generated in response to selected inputs and execution conditions. (2) Testing conducted to evaluate the compliance of a system or component with specified functional requirements and corresponding predicted results. Syn: black-box testing, input/output driven testing. Contrast with testing, structural.

 

testing, integration. (IEEE) An orderly progression of testing in which software elements, hardware elements, or both are combined and tested, to evaluate their interactions, until the entire system has been integrated.

 

testing, interface. (IEEE) Testing conducted to evaluate whether systems or components pass data and control correctly to one another. Contrast with testing, unit; testing, system. See: testing, integration.

 

testing, interphase. See: testing, interface.

 

testing, invalid case. A testing technique using erroneous [invalid, abnormal, or unexpected] input values or conditions. See: equivalence class partitioning.

 

testing, mutation. (IEEE) A testing methodology in which two or more program mutations are executed using the same test cases to evaluate the ability of the test cases to detect differences in the mutations.
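
A minimal illustrative sketch (the program and mutant are hypothetical): the same test cases are run against the original program and a mutant with one operator changed; a test set that cannot distinguish them is too weak.

```python
# Mutation testing sketch: evaluate a test set by checking whether it can
# detect (kill) a deliberately introduced mutation.

def original(x):
    return x * 2

def mutant(x):                  # mutation: '*' replaced by '+'
    return x + 2

weak_tests = [(2, 4)]                   # (input, expected): x = 2 cannot tell x*2 from x+2
strong_tests = [(2, 4), (3, 6)]         # x = 3 exposes the difference

assert all(original(x) == expected for x, expected in strong_tests)

for name, tests in [("weak", weak_tests), ("strong", strong_tests)]:
    killed = any(mutant(x) != expected for x, expected in tests)
    print(f"{name} test set: mutant {'killed' if killed else 'survived'}")
```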

 

testing, operational. (IEEE) Testing conducted to evaluate a system or component in its operational environment. Contrast with testing, development; testing, acceptance. See: testing, system.

 

testing, parallel. (ISO) Testing a new or an altered data processing system with the same source data that is used in another system. The other system is considered as the standard of comparison. Syn: parallel run.

 

testing, path. (NBS) Testing to satisfy coverage criteria that each logical path through the program be tested. Often paths through the program are grouped into a finite set of classes. One path from each class is then tested. Syn: path coverage. Contrast with testing, branch; testing, statement; branch coverage; condition coverage; decision coverage; multiple condition coverage; statement coverage.

 

testing, performance. (IEEE) Functional testing conducted to evaluate the compliance of a system or component with specified performance requirements.

 

testing, qualification. (IEEE) Formal testing, usually conducted by the developer for the consumer, to demonstrate that the software meets its specified requirements. See: testing, acceptance; testing, system.

 

testing, regression. (NIST) Rerunning test cases which a program has previously executed correctly in order to detect errors spawned by changes or corrections made during software development and maintenance.
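
For illustration, a hypothetical sketch of the idea: test cases the program previously passed are stored and rerun after each change to detect newly introduced errors.

```python
# Regression testing sketch: rerun a stored suite of previously passing
# cases against the current version of the unit under maintenance.

regression_suite = [((5, 3), 8), ((0, 0), 0), ((-2, 2), 0)]   # (inputs, expected)

def add(a, b):                  # current version of the unit under maintenance
    return a + b

failures = [(inp, exp, add(*inp)) for inp, exp in regression_suite if add(*inp) != exp]
if failures:
    print("regression detected:", failures)
else:
    print(f"all {len(regression_suite)} previously passing cases still pass")
```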

 

testing, special case. A testing technique using input values that seem likely to cause program errors; e.g., "0", "1", NULL, empty string. See: error guessing.
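
An illustrative sketch using a hypothetical parsing routine: the error-prone inputs named in the definition are tried deliberately and the outcome of each is recorded.

```python
# Special case testing (error guessing) sketch: deliberately try inputs
# that experience suggests are likely to cause errors.

def parse_positive_int(text):            # hypothetical unit under test
    if text is None or text == "":
        raise ValueError("missing value")
    value = int(text)
    if value <= 0:
        raise ValueError("value must be positive")
    return value

special_cases = ["0", "1", None, ""]     # classic error-prone inputs
for case in special_cases:
    try:
        print(f"{case!r:6} -> {parse_positive_int(case)}")
    except ValueError as err:
        print(f"{case!r:6} -> rejected ({err})")
```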

 

testing, statement. (NIST) Testing to satisfy the criterion that each statement in a program be executed at least once during program testing. Syn: statement coverage. Contrast with testing, branch; testing, path; branch coverage; condition coverage; decision coverage; multiple condition coverage; path coverage.

 

testing, storage. This is a determination of whether or not certain processing conditions use more storage [memory] than estimated.

 

testing, stress. (IEEE) Testing conducted to evaluate a system or component at or beyond the limits of its specified requirements. Syn: testing, boundary value.

 

testing, structural. (1) (IEEE) Testing that takes into account the internal mechanism [structure] of a system or component. Types include branch testing, path testing, statement testing. (2) Testing to ensure each program statement is made to execute during testing and that each program statement performs its intended function. Contrast with functional testing. Syn: white-box testing, glass-box testing, logic driven testing.

 

testing, system. (IEEE) The process of testing an integrated hardware and software system to verify that the system meets its specified requirements. Such testing may be conducted in both the development environment and the target environment.

 

testing, unit. (1) (NIST) Testing of a module for typographic, syntactic, and logical errors, for correct implementation of its design, and for satisfaction of its requirements. (2) (IEEE) Testing conducted to verify the implementation of the design for one software element; e.g., a unit or module; or a collection of software elements. Syn: component testing.

 

testing, usability. Tests designed to evaluate the machine/user interface. Are the communication device(s) designed in a manner such that the information is displayed in an understandable fashion, enabling the operator to correctly interact with the system?

 

testing, valid case. A testing technique using valid [normal or expected] input values or conditions. See: equivalence class partitioning.

 

testing, volume. Testing designed to challenge a system's ability to manage the maximum amount of data over a period of time. This type of testing also evaluates a system's ability to handle overload situations in an orderly fashion.

 

testing, worst case. Testing which encompasses upper and lower limits, and circumstances which pose the greatest chance of finding errors. Syn: most appropriate challenge conditions. See: testing, boundary value; testing, invalid case; testing, special case; testing, stress; testing, volume.

 

time sharing. (IEEE) A mode of operation that permits two or more users to execute computer programs concurrently on the same computer system by interleaving the execution of their programs. May be implemented by time slicing, priority-based interrupts, or other scheduling methods.

 

timing. (IEEE) The process of estimating or measuring the amount of execution time required for a software system or component. Contrast with sizing.

 

timing analyzer. (IEEE) A software tool that estimates or measures the execution time of a computer program or portion of a computer program, either by summing the execution times of the instructions along specified paths or by inserting probes at specified points in the program and measuring the execution time between probes.
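
The probe-based technique mentioned above can be illustrated with a short, hypothetical Python sketch: probes are inserted before and after the code of interest and the elapsed time between them is measured.

```python
# Timing-probe sketch: measure execution time between two inserted probes.
import time

def workload(n):                         # hypothetical portion of a program
    return sum(i * i for i in range(n))

probe_start = time.perf_counter()        # probe inserted before the code of interest
result = workload(1_000_000)
probe_end = time.perf_counter()          # probe inserted after it

print(f"result={result}, elapsed between probes: {probe_end - probe_start:.4f} s")
```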

 

timing and sizing analysis. (IEEE) Analysis of the safety implications of safety-critical requirements that relate to execution time, clock time, and memory allocation.

 

top-down design. Pertaining to design methodology that starts with the highest level of abstraction and proceeds through progressively lower levels. See: structured design.

 

touch sensitive. (ANSI) Pertaining to a device that allows a user to interact with a computer system by touching an area on the surface of the device with a finger, pencil, or other object, e.g., a touch sensitive keypad or screen.

 

touch screen. A touch sensitive display screen that uses a clear panel over or on the screen surface. The panel is a matrix of cells, an input device, that transmits pressure information to the software.

 

trace. (IEEE) (1) A record of the execution of a computer program, showing the sequence of instructions executed, the names and values of variables, or both. Types include execution trace, retrospective trace, subroutine trace, symbolic trace, variable trace. (2) To produce a record as in (1). (3) To establish a relationship between two or more products of the development process; e.g., to establish the relationship between a given requirement and the design element that implements that requirement.

 

traceability. (IEEE) (1) The degree to which a relationship can be established between two or more products of the development process, especially products having a predecessor-successor or master-subordinate relationship to one another; e.g., the degree to which the requirements and design of a given software component match. See: consistency. (2) The degree to which each element in a software development product establishes its reason for existing; e.g., the degree to which each element in a bubble chart references the requirement that it satisfies. See: traceability analysis, traceability matrix.

 

traceability analysis. (IEEE) The tracing of (1) Software Requirements Specifications requirements to system requirements in concept documentation, (2) software design descriptions to software requirements specifications and software requirements specifications to software design descriptions, (3) source code to corresponding design specifications and design specifications to source code. Analyze identified relationships for correctness, consistency, completeness, and accuracy. See: traceability, traceability matrix.

 

traceability matrix. (IEEE) A matrix that records the relationship between two or more products; e.g., a matrix that records the relationship between the requirements and the design of a given software component. See: traceability, traceability analysis.
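
For illustration only, with hypothetical requirement and design identifiers, a traceability matrix can be held as a simple table keyed by requirement:

```python
# Traceability matrix sketch: each requirement is mapped to the design
# elements that address it; an empty entry flags an untraced requirement.

matrix = {
    "REQ-001": ["DES-API-01"],
    "REQ-002": ["DES-API-01", "DES-DB-03"],
    "REQ-003": [],                      # untraced requirement -- a gap to investigate
}

for requirement, design_elements in matrix.items():
    trace = ", ".join(design_elements) if design_elements else "NOT TRACED"
    print(f"{requirement}: {trace}")
```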

 

transaction. (ANSI) (1) A command, message, or input record that explicitly or implicitly calls for a processing action, such as updating a file. (2) An exchange between an end user and an interactive system. (3) In a database management system, a unit of processing activity that accomplishes a specific purpose such as a retrieval, an update, a modification, or a deletion of one or more data elements of a storage structure.

 

transaction analysis. A structured software design technique, deriving the structure of a system from analyzing the transactions that the system is required to process.

 

transaction flowgraph. (Beizer) A model of the structure of the system's [program's] behavior, i.e., functionality.

 

transaction matrix. (IEEE) A matrix that identifies possible requests for database access and relates each request to information categories or elements in the database.

 

transform analysis. A structured software design technique in which system structure is derived from analyzing the flow of data through the system and the transformations that must be performed on the data.

 

translation. (NIST) Converting from one language form to another. See: assembling, compilation, interpret.

 

transmission control protocol/Internet protocol. A set of communications protocols developed for the Defense Advanced Research Projects Agency to internetwork dissimilar systems. It is used by many corporations, almost all American universities, and agencies of the federal government. The File Transfer Protocol and Simple Mail Transfer Protocol provide file transfer and electronic mail capability. The TELNET protocol provides a terminal emulation capability that allows a user to interact with any other type of computer in the network. The TCP protocol controls the transfer of the data, and the IP protocol provides the routing mechanism.

 

trojan horse. A method of attacking a computer system, typically by providing a useful program which contains code intended to compromise a computer system by secretly providing for unauthorized access, the unauthorized collection of privileged system or user data, the unauthorized reading or altering of files, the performance of unintended and unexpected functions, or the malicious destruction of software and hardware. See: bomb, virus, worm.

 

truth table. (1) (ISO) An operation table for a logic operation. (2) A table that describes a logic function by listing all possible combinations of input values, and indicating, for each combination, the output value.
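
As an illustration, the following sketch prints the truth table of a two-input logic function (exclusive OR) by listing every combination of input values together with the resulting output.

```python
# Truth table sketch: list all input combinations and the output of a
# two-input logic operation (exclusive OR).
from itertools import product

def xor(a, b):
    return a != b

print(" a | b | a XOR b")
for a, b in product([0, 1], repeat=2):
    print(f" {a} | {b} |    {int(xor(a, b))}")
```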

 

tuning. (NIST) Determining what parts of a program are being executed the most. A tool that instruments a program to obtain execution frequencies of statements is a tool with this feature.

 

twisted pair. A pair of thin-diameter insulated wires commonly used in telephone wiring. The wires are twisted around each other to minimize interference from other twisted pairs in the cable. Twisted pairs have less bandwidth than coaxial cable or optical fiber. Abbreviated UTP for Unshielded Twisted Pair. Syn: twisted wire pair.

 
     
 

Source Documents

 

  1. The New IEEE Standard Dictionary of Electrical and Electronics Terms, IEEE Std. 100-1992.
  2. IEEE Standards Collection, Software Engineering, 1994 Edition, published by the Institute of Electrical and Electronics Engineers, Inc.
  3. National Bureau of Standards [NBS] Special Publication 500-75 Validation, Verification, and Testing of Computer Software, 1981.
  4. Federal Information Processing Standards [FIPS] Publication 101, Guideline For Lifecycle Validation, Verification, and Testing of Computer Software, 1983.
  5. Federal Information Processing Standards [FIPS] Publication 105, Guideline for Software Documentation Management, 1984.
  6. American National Standard for Information Systems, Dictionary for Information Systems, American National Standards Institute, 1991.
  7. FDA Technical Report, Software Development Activities, July 1987.
  8. FDA Guide to Inspection of Computerized Systems in Drug Processing, 1983.
  9. FDA Guideline on General Principles of Process Validation, May 1987.
  10. Reviewer Guidance for Computer Controlled Medical Devices Undergoing 510(k) Review, Office of Device Evaluation, CDRH, FDA, August 1991.
  11. HHS Publication FDA 90-4236, Preproduction Quality Assurance Planning.
  12. MIL-STD-882C, Military Standard System Safety Program Requirements, 19JAN1993.
  13. International Electrotechnical Commission, International Standard 1025, Fault Tree Analysis.
  14. International Electrotechnical Commission, International Standard 812, Analysis Techniques for System Reliability - Procedure for Failure Mode and Effects Analysis [FMEA].
  15. FDA recommendations, Application of the Medical Device GMP to Computerized Devices and Manufacturing Processes, May 1992.
  16. Pressman, R., Software Engineering, A Practitioner's Approach, Third Edition, McGraw-Hill, Inc., 1992.
  17. Myers, G., The Art of Software Testing, Wiley Interscience, 1979.
  18. Beizer, B., Software Testing Techniques, Second Edition, Van Nostrand Reinhold, 1990.
  19. Additional general references used in developing some definitions are:
  20. Bohl, M., Information Processing, Fourth Edition, Science Research Associates, Inc., 1984.
  21. Freedman, A., The Computer Glossary, Sixth Edition, American Management Association, 1993.
  22. McGraw-Hill Electronics Dictionary, Fifth Edition, 1994, McGraw-Hill Inc.
  23. McGraw-Hill Dictionary of Scientific & Technical Terms, Fifth Edition, 1994, McGraw-Hill Inc.
  24. Webster's New Universal Unabridged Dictionary, Deluxe Second Edition, 1979.


The bulk of this information was obtained from FDA.gov.

 

BIOS. basic input/output system.

 

bps. bits per second.

 

band. Range of frequencies used for transmitting a signal. A band can be identified by the difference between its lower and upper limits, i.e. bandwidth, as well as by its actual lower and upper limits; e.g., a 10 MHz band in the 100 to 110 MHz range.

 

bandwidth. The transmission capacity of a computer channel, communications line or bus. It is expressed in cycles per second [Hz], and also is often stated in bits or bytes per second. See: band.

 

bar code. (ISO) A code representing characters by sets of parallel bars of varying thickness and separation that are read optically by transverse scanning.

 

baseline. (NIST) A specification or product that has been formally reviewed and agreed upon, that serves as the basis for further development, and that can be changed only through formal change control procedures.

 

BASIC. An acronym for Beginner's All-purpose Symbolic Instruction Code, a high-level programming language intended to facilitate learning to program in an interactive environment.

 

basic input/output system. Firmware that activates peripheral devices in a PC. Includes routines for the keyboard, screen, disk, parallel port and serial port, and for internal services such as time and date. It accepts requests from the device drivers in the operating system as well as from application programs. It also contains autostart functions that test the system on startup and prepare the computer for operation. It loads the operating system and passes control to it.

 

batch. (IEEE) Pertaining to a system or mode of operation in which inputs are collected and processed all at one time, rather than being processed as they arrive, and a job, once started, proceeds to completion without additional input or user interaction. Contrast with conversational, interactive, on-line, real time.

 

batch processing. Execution of programs serially with no interactive processing. Contrast with real time processing.

 

baud. The signalling rate of a line. It is the switching speed, or number of transitions [voltage or frequency changes] made per second. At low speeds, baud is equal to bits per second; e.g., 300 baud is equal to 300 bps. However, one baud can be made to represent more than one bit per second.
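
A short illustrative calculation (the modem parameters are hypothetical): when each signal transition encodes more than one bit, the bit rate exceeds the baud rate.

```python
# Baud vs. bits per second: each signal transition (baud) may encode several bits.
baud_rate = 2400          # signal transitions per second (hypothetical line)
bits_per_symbol = 4       # 2**4 = 16 distinguishable signal states
bit_rate = baud_rate * bits_per_symbol
print(f"{baud_rate} baud x {bits_per_symbol} bits/symbol = {bit_rate} bps")
```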

 

benchmark. A standard against which measurements or comparisons can be made.

 

bias. A measure of how closely the mean value in a series of replicate measurements approaches the true value. See: accuracy, precision, calibration.

 

binary. The base two number system. Permissible digits are "0" and "1".
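
A small illustrative example of the correspondence between decimal and binary notation:

```python
# Binary notation example: the decimal value 13 written in base two.
print(bin(13))          # '0b1101'  -> 1*8 + 1*4 + 0*2 + 1*1 = 13
print(int("1101", 2))   # 13
```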

 

bit. A contraction of the term binary digit. The bit is the basic unit of digital data. It may be in one of two states, logic 1 or logic 0. It may be thought of as a switch which is either on or off. Bits are usually combined into computer words of various sizes, such as the byte.

 

bits per second. A measure of the speed of data transfer in a communications system.

 

black-box testing. See: testing, functional.

 

block. (ISO) (1) A string of records, words, or characters that for technical or logical purposes are treated as a unity. (2) A collection of contiguous records that are recorded as a unit, and the units are separated by interblock gaps. (3) A group of bits or digits that are transmitted as a unit and that may be encoded for error-control purposes. (4) In programming languages, a subdivision of a program that serves to group related statements, delimit routines, specify storage allocation, delineate the applicability of labels, or segment parts of the program for other purposes. In FORTRAN, a block may be a sequence of statements; in COBOL, it may be a physical record.

 
