The terms system validation and verification refer to two basic concerns, "are we building the right product?" and "are we building the product right?" Satisfactory answers to both questions are a prerequisite to customer acceptance.
Established Approaches to Validation/Verification in Systems Engineering
In systems engineering, planning for validation and verification begins in the early stages of "requirements development."
Figure 1 shows the pathway from goals/scenarios through to the validation/verification of scenarios.
Because most systems are multifunctional, many scenarios may be needed to describe the system's intended behavior completely. Input from multiple stakeholders leads to a shared view of the system goals.
Figure 2 shows the high-level components in the iterative development and evaluation of a system.
Figure 2. Iterative Procedure for Testing/Validation/Verification
Each iteration begins with a description of desired behavior, possibly expressed as a series of messages between objects in a sequence diagram, and ends with an equivalent description of the "as modeled" system. At the end of each iteration, the "desired" and "as modeled" behaviors are compared.
How Traceability Helps Verification/Validation
Traceability mechanisms support the capture and use of trace data (i.e., to document, parse, organize, edit, interlink, change, and manage requirements and the traceability links between them).
Users view traceability as a transformation from requirements documents to design. The main applications of traceability are requirements decomposition, requirements allocation, compliance verification, and change control. See Figure 3.
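The allocation and compliance-gap ideas above can be sketched as a small data structure. This is a minimal, hypothetical illustration; the class and method names (`TraceabilityModel`, `allocate`, `unallocated`) are not from any specific traceability tool.

```python
from collections import defaultdict

class TraceabilityModel:
    """Sketch: traceability links as a map from requirements to design elements."""

    def __init__(self):
        # requirement id -> set of design element ids allocated to it
        self.links = defaultdict(set)

    def allocate(self, req_id, design_id):
        """Record that a design element is allocated to a requirement."""
        self.links[req_id].add(design_id)

    def unallocated(self, req_ids):
        """Requirements with no allocated design element -- a compliance gap."""
        return [r for r in req_ids if not self.links[r]]

model = TraceabilityModel()
model.allocate("REQ-1", "pump-controller")
gaps = model.unallocated(["REQ-1", "REQ-2"])   # REQ-2 has no allocation
```

Keeping the links explicit like this is what makes change control tractable: when a requirement changes, the affected design elements can be looked up directly.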
Compliance Verification Procedures (CVPs)
Procedures for requirements verification are an integral part of traceability.
Compliance verification procedures (CVPs) are developed to ensure that each requirement is satisfied.
If a requirement cannot be tested, then by definition, it is no longer a requirement.
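One way to picture a CVP is as an executable check paired with each requirement. The sketch below is illustrative only; the requirement texts, thresholds, and function names are all hypothetical.

```python
# Hypothetical CVPs: each requirement is paired with an executable check.
def verify_response_time(system):
    return system["response_ms"] <= 200

def verify_capacity(system):
    return system["capacity"] >= 1000

cvps = {
    "REQ-10: respond within 200 ms": verify_response_time,
    "REQ-11: support 1000 concurrent users": verify_capacity,
}

system_under_test = {"response_ms": 150, "capacity": 1200}
results = {req: cvp(system_under_test) for req, cvp in cvps.items()}
```

A requirement that cannot be expressed as some such check is, in the sense of the text above, untestable.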
Need for Early Concept Validation/Verification
The complexity of engineering systems is rapidly approaching the point where it will be impossible to verify the correctness of a design without also introducing a verification-aware discipline into the design process.
Looking ahead, a challenge we face is partitioning the validation/verification process into simpler problems/procedures that can be applied as early as possible in the design process. For example:
Formal Methods
A formal method (FM) is a set of techniques and tools based on mathematical modeling and formal logic used to specify requirements and design for computer systems and software. The use of FM on a project can assume various forms, ranging from occasional mathematical notation embedded in English specifications, to fully formal specifications using specification languages with precise semantics. At its most rigorous, FM involves computer assisted proofs of key properties regarding the behavior of the system.
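A lightweight flavor of this idea can be shown with run-time pre- and postconditions. This sketch only checks the properties on individual executions; a fully formal method would prove them for all inputs.

```python
# Sketch: a specification expressed as a precondition and a postcondition.
# A formal proof would establish the postcondition for every valid input;
# run-time assertions check it only for the executions we actually run.
def integer_sqrt(n):
    assert n >= 0, "precondition: n must be non-negative"
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    # postcondition: r is the largest integer whose square does not exceed n
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r
```

The assertions make the mathematical claim explicit in the code, which is the first step on the spectrum from "occasional mathematical notation" to fully formal, machine-checked specifications.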
Formal methods can benefit system design in two ways:
Two families of formal methods have been developed over the last 20 years, each with its own advantages and drawbacks:
Currently there are two major approaches to system validation and verification: (1) testing and (2) system inspection. Both approaches try to identify faults in a system.
The most common form of system verification is testing -- that is, given the specification of a system, test cases are created that exercise the fine-grained facets of the specification.
Each test case specifies a situation (i.e., a set of input data) and what is supposed to happen in that situation (i.e., a set of output data).
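The input/expected-output pairing described above can be sketched as a tiny data structure. The names (`TestCase`, `run_case`, `adder`) are illustrative, not from any particular test framework.

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """A situation (input data) and what is supposed to happen (expected output)."""
    name: str
    inputs: dict
    expected: object

def run_case(func, case):
    """Apply the inputs and compare the actual output with the expected output."""
    actual = func(**case.inputs)
    return actual == case.expected

def adder(a, b):
    return a + b

case = TestCase("two plus three", {"a": 2, "b": 3}, 5)
passed = run_case(adder, case)   # True
```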
Test Design
Test design requires the solution of problems similar to those encountered in the development (analysis, design) of a system. The key steps in test design development are as follows (Binder, 2001):
Test Execution/Automation
Test execution is concerned with the application of the tests on the system. Typically, test execution will involve the following steps (Binder, 2001):
Test automation streamlines the many steps in the testing procedure.
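A minimal sketch of such automation, assuming test cases of the input/expected-output form discussed earlier: run every case, record pass/fail, and collect the failures. All names here are hypothetical.

```python
def run_suite(func, cases):
    """Execute each (name, inputs, expected) case and summarize the outcomes."""
    results = []
    for name, inputs, expected in cases:
        try:
            outcome = func(*inputs) == expected
        except Exception:
            outcome = False   # an unexpected error also counts as a failure
        results.append((name, outcome))
    failures = [name for name, ok in results if not ok]
    return results, failures

def double(x):
    return 2 * x

cases = [("doubles two", (2,), 4), ("doubles zero", (0,), 0)]
results, failures = run_suite(double, cases)
```

Even this small loop automates the repetitive steps -- setup, execution, comparison, and reporting -- that would otherwise be performed by hand for every test run.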
Dealing with the Outcome of a Test
When a test fails, we can say with complete confidence that a failure has been detected.
But what can we say when a system passes a test? At the very minimum, we can say that the system passes the test for that specific set of input data. The benefit of this observation may be small.
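The asymmetry between failing and passing tests can be made concrete with a deliberately buggy example (the function name is invented for illustration):

```python
# A buggy absolute-value function that happens to pass the one test
# it was checked against.
def buggy_abs(x):
    return x          # bug: negative inputs are returned unchanged

assert buggy_abs(5) == 5   # the test passes...
# ...yet buggy_abs(-5) returns -5, so the system is still faulty.
```

A passing test rules out one specific failure; it says nothing about the inputs that were never tried.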
Types of Testing
For the development of complex systems, test procedures can be simplified through the use of separate test procedures for separate concerns. The concerns (and their corresponding test procedures) include:
System Test and System Test Patterns
Developed in October 2002 by Mark Austin
Copyright © 2002, Mark Austin, University of Maryland