Testing

References:

  1. Bernd Bruegge and Allen H. Dutoit, "Object-Oriented Software Engineering", Third Edition, Chapter 11.
  2. Mauro Pezze and Michal Young, "Software Testing and Analysis - Process, Principles, and Techniques".
  3. Ian Sommerville, "Software Engineering 8".

11.1 - Introduction: Testing the Space Shuttle

11.2 - An Overview of Testing

11.3 - Testing Concepts

11.3.1 - Faults, Erroneous States, and Failures

11.3.2 - Test Cases

11.3.3 - Test Stubs and Drivers

11.3.4 - Corrections

Correcting one fault can often introduce new faults. The following techniques can help reduce this problem:

11.4 - Testing Activities

11.4.1 - Component Inspection

11.4.2 - Usability Testing

11.4.3 - Unit Testing

Testing of individual units, such as a class or even an individual method.
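
For example, a single method can be exercised in isolation by a small test class. The sketch below assumes JUnit 5 and a made-up MathUtil class:

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // Hypothetical unit under test: a small arithmetic utility.
    class MathUtil {
        static int max(int a, int b) {
            return (a > b) ? a : b;
        }
    }

    // The unit test exercises one method of one class in isolation.
    class MathUtilTest {
        @Test
        void maxReturnsTheLargerArgument() {
            assertEquals(7, MathUtil.max(3, 7));
            assertEquals(7, MathUtil.max(7, 3));
            assertEquals(5, MathUtil.max(5, 5));   // equal arguments
        }
    }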

Equivalence testing

Partitioning the possible input values into ranges ( equivalence classes ) for which all tests are expected to yield equivalent results, e.g. negative numbers, zero, positive integers, numbers larger than 32768, etc. One representative value from each class is then tested on behalf of the whole class.
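
A minimal sketch, assuming JUnit 5 and a hypothetical isValidScore method that accepts scores in the range 0 to 100: one representative value is tested from each equivalence class.

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.*;

    // Hypothetical unit under test: accepts scores in the range 0..100.
    class Grade {
        static boolean isValidScore(int score) {
            return score >= 0 && score <= 100;
        }
    }

    // Equivalence testing: one value per class stands in for the whole class.
    class GradeEquivalenceTest {
        @Test
        void negativeScoresAreRejected()       { assertFalse(Grade.isValidScore(-17)); }

        @Test
        void scoresInsideTheRangeAreAccepted() { assertTrue(Grade.isValidScore(42)); }

        @Test
        void scoresAboveTheRangeAreRejected()  { assertFalse(Grade.isValidScore(3000)); }
    }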

Boundary testing

Values on the "edges" or boundaries of equivalence ranges are most likely to cause faults. For example, leap years have special rules for years divisible by 100 or 400. ( Empty strings are often taken as a special or boundary case. )
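
A sketch of boundary testing applied to the leap-year example, assuming JUnit 5 and a hypothetical LeapYear class; the century years are exactly where the special rules change the outcome.

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.*;

    // Hypothetical unit under test, implementing the Gregorian leap-year rules.
    class LeapYear {
        static boolean isLeap(int year) {
            if (year % 400 == 0) return true;    // divisible by 400: leap
            if (year % 100 == 0) return false;   // divisible by 100 only: not leap
            return year % 4 == 0;                // otherwise leap iff divisible by 4
        }
    }

    // Boundary testing: years where the rules switch are the likeliest to expose faults.
    class LeapYearBoundaryTest {
        @Test
        void ordinaryLeapYear()        { assertTrue(LeapYear.isLeap(2004)); }

        @Test
        void ordinaryNonLeapYear()     { assertFalse(LeapYear.isLeap(2001)); }

        @Test
        void centuryYearIsNotLeap()    { assertFalse(LeapYear.isLeap(1900)); }

        @Test
        void fourHundredthYearIsLeap() { assertTrue(LeapYear.isLeap(2000)); }
    }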

Path testing

Ensure that every path through the program's flow graph is exercised by at least one test case.
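
A sketch of path testing, assuming JUnit 5 and a hypothetical cost method with two independent decisions; the four test cases together force every combination of branch outcomes, and therefore every path.

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // Hypothetical unit under test with two decisions, giving four paths.
    class Shipping {
        static double cost(double weight, boolean express) {
            double cost = (weight <= 1.0) ? 5.0 : 5.0 + 2.0 * (weight - 1.0);
            if (express) {
                cost *= 2.0;    // express shipping doubles the price
            }
            return cost;
        }
    }

    // Path testing: each test case below exercises a different path.
    class ShippingPathTest {
        @Test
        void lightAndStandard() { assertEquals( 5.0, Shipping.cost(0.5, false), 1e-9); }

        @Test
        void lightAndExpress()  { assertEquals(10.0, Shipping.cost(0.5, true),  1e-9); }

        @Test
        void heavyAndStandard() { assertEquals( 9.0, Shipping.cost(3.0, false), 1e-9); }

        @Test
        void heavyAndExpress()  { assertEquals(18.0, Shipping.cost(3.0, true),  1e-9); }
    }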

State-based testing

Polymorphism testing

All possible bindings need to be tested for each message that can be sent, i.e. a polymorphic call is tested once for each class that might receive it. ( And sometimes all combinations of bindings. )
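
A sketch of polymorphism testing, assuming JUnit 5 and a hypothetical Shape hierarchy; the same message ( area ) is tested once for each class that can receive it.

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // Hypothetical polymorphic hierarchy: shape.area() can bind to either
    // concrete implementation at run time.
    interface Shape { double area(); }

    class Rectangle implements Shape {
        private final double w, h;
        Rectangle(double w, double h) { this.w = w; this.h = h; }
        public double area() { return w * h; }
    }

    class Circle implements Shape {
        private final double r;
        Circle(double r) { this.r = r; }
        public double area() { return Math.PI * r * r; }
    }

    // Polymorphism testing: one test per possible binding of the area() call.
    class ShapePolymorphismTest {
        @Test
        void rectangleBinding() {
            Shape s = new Rectangle(2.0, 3.0);
            assertEquals(6.0, s.area(), 1e-9);
        }

        @Test
        void circleBinding() {
            Shape s = new Circle(1.0);
            assertEquals(Math.PI, s.area(), 1e-9);
        }
    }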

Sample Unit Testing

The following are sample solutions to old exam questions on testing. They are somewhat more complete than what is really expected of students taking an exam, but more appropriate for inclusion in a report.

11.4.4 - Integration Testing

After units have been tested individually, combinations of units must be tested. ( The focus is primarily on the interfaces between the units. )

Horizontal integration testing strategies

Integrate and test the system one layer at a time, e.g. big bang, bottom-up, top-down, or sandwich testing.

Vertical integration testing strategies

Focus on early testing and integration of all units necessary to provide partial functionality of the product.
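
As a sketch of how the interface between two units can be tested before both are finished, the JUnit 5 test below uses a stub in place of a component that has not been integrated yet. ( All class names are hypothetical. )

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // Hypothetical lower-level component that is not yet available.
    interface TaxService {
        double rateFor(String state);
    }

    // Unit under test: depends on TaxService only through its interface.
    class Invoice {
        private final TaxService taxes;
        Invoice(TaxService taxes) { this.taxes = taxes; }

        double total(double subtotal, String state) {
            return subtotal * (1.0 + taxes.rateFor(state));
        }
    }

    // Integration test: a stub with a canned answer stands in for the real
    // TaxService, so the Invoice/TaxService interface can be exercised now.
    class InvoiceIntegrationTest {
        @Test
        void totalAppliesTheRateReturnedByTheService() {
            TaxService stub = state -> 0.10;
            Invoice invoice = new Invoice(stub);
            assertEquals(110.0, invoice.total(100.0, "IL"), 1e-9);
        }
    }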

11.4.5 - System Testing

Functional testing

Tests functional requirements, as outlined in use-cases.

Performance testing

Tests non-functional requirements, as outlined in the requirements documents:

  • Stress testing tests the system under unusually heavy loads. ( A sketch follows this list. )
  • Volume testing tests the system with large volumes of data, such as large files or databases.
  • Security testing probes the system for vulnerabilities, often using tiger teams that try to break in.
  • Timing testing verifies the system's timing constraints.
  • Recovery testing tests the system's ability to recover from erroneous states.
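
A minimal sketch of a stress-test driver, with all names hypothetical: many threads issue requests against the system concurrently and any failures are counted.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicInteger;

    public class StressTestDriver {

        // Stand-in for the operation placed under load, e.g. a request handler.
        static void handleRequest() {
            // ... call into the system under test ...
        }

        public static void main(String[] args) throws InterruptedException {
            final int threads = 50, requestsPerThread = 1_000;
            ExecutorService pool = Executors.newFixedThreadPool(threads);
            AtomicInteger failures = new AtomicInteger();

            // Each thread hammers the operation in a tight loop.
            for (int t = 0; t < threads; t++) {
                pool.submit(() -> {
                    for (int i = 0; i < requestsPerThread; i++) {
                        try {
                            handleRequest();
                        } catch (Exception e) {
                            failures.incrementAndGet();
                        }
                    }
                });
            }
            pool.shutdown();
            pool.awaitTermination(1, TimeUnit.MINUTES);
            System.out.println("failures under load: " + failures.get());
        }
    }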

Pilot testing

Testing done in the field by a select group of end users, e.g. alpha tests ( conducted in the development environment ) and beta tests ( conducted at the users' own sites ).

Acceptance testing

Testing performed by the client, in the development environment.

Installation testing

Testing performed after the system has been installed at the client's site.

11.5 - Managing Testing

11.5.1 - Planning Testing

11.5.2 - Documenting Testing

Four types of testing documents:

11.5.3 - Assigning Responsibilities

11.5.4 - Regression Testing

11.5.5 - Automating Testing

11.5.6 - Model-Based Testing

Supplemental Material I - From Pezze & Young

The following material is excerpted from "Software Testing and Analysis - Process, Principles, and Techniques", by Pezze and Young. It is a required textbook when I teach CS 442, Software Engineering II.

Supplemental Material II - From Sommerville

The following material is excerpted from "Software Engineering 8", by Ian Sommerville. It is an alternate textbook that has been used in CS 440 in past semesters.

Sommerville's inspection process involves two meetings: one to get an overview understanding of the artifact, presented by the author, and another to review the findings of the inspection process:

Like Pezze, Sommerville breaks inspection checklists down into categories:

The format of Sommerville's test plan document is similar to our author's: