Generating the evidence that a device complies with the design controls' definition of "good" is a critical piece of the Intelligent Design Control methodology. We need not only traceability to the test cases; the test cases must also trace to the appropriate level of the product "V", distinguish between verification and validation, and link to the resulting test results.
Documents and Work Items
Test cases can be created in more than one location: they can be built either in the overall repository or in the context of documents. Documents are essentially containers holding work item content and other contextual discussion. In general, medical device regulators expect test protocols to be approved prior to execution of the test. It is therefore convenient to use documents containing test cases to define the scope of test runs, which can then be pre-approved in a Part 11 compliant change management system, as sketched below.
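To make the pre-approval idea concrete, here is a minimal sketch of a document status workflow, assuming three illustrative states. This is not the Polarion workflow engine; a real Part 11 compliant system would additionally enforce electronic signatures and an audit trail, which this sketch omits.

```python
# Hypothetical linear approval workflow for a test protocol document.
# State names are illustrative only.
ALLOWED_TRANSITIONS = {
    "draft": {"in_review"},
    "in_review": {"approved", "draft"},  # reviewers may send it back
    "approved": set(),                   # changes require a new revision
}

def advance(status: str, new_status: str) -> str:
    """Move a document to a new status, rejecting illegal transitions."""
    if new_status not in ALLOWED_TRANSITIONS[status]:
        raise ValueError(f"cannot move document from {status} to {new_status}")
    return new_status

status = advance("draft", "in_review")
status = advance(status, "approved")  # test runs may now be executed
```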
Providing context for the test cases as free text outside the individual work items is a convenient way to capture information that is lengthy and repeated across many of the test cases defined in the document. For this reason, documents are a key reusable artifact for developing a testing library.
Test Cases
Polarion includes a special work item type, the test case. Test cases contain the structure needed to define a set of test steps, each with a description and an expected result.
Test Steps
The test steps are then used in the test run to walk the tester through the items in need of verification, and capture any necessary objective evidence of compliance.
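As a minimal sketch of this structure, the following uses hypothetical Python dataclasses. The names are illustrative only and are not the Polarion schema.

```python
from dataclasses import dataclass, field

@dataclass
class TestStep:
    description: str      # the action the tester performs
    expected_result: str  # what "pass" looks like for this step

@dataclass
class TestCase:
    case_id: str                # work item ID, e.g. "TC-101"
    title: str
    verifies: list[str]         # requirement IDs this case traces to
    kind: str = "verification"  # "verification" or "validation"
    steps: list[TestStep] = field(default_factory=list)

# Example: one test case tracing to a single design-input requirement
tc = TestCase(
    case_id="TC-101",
    title="Alarm sounds on over-pressure",
    verifies=["REQ-42"],
    steps=[
        TestStep("Raise line pressure above the alarm limit",
                 "Audible alarm within 2 seconds"),
    ],
)
```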
*Figure: Test Protocol Document*
As discussed above, the scope of the test cases can be established by the test protocol, the protocol routed through an approval workflow, and the test run applied to the document's test cases. The creation of a test report then becomes a trivial bit of automation provided by the reporting engine.
Test Runs
Once a protocol has been developed and approved, it is common to then execute a test run against it. The typical logic for this activity is shown below.
*Figure: Testing Logic*
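The same logic can be sketched in code. Continuing the hypothetical dataclasses above, this is an illustrative console walk-through, not the Polarion execution interface.

```python
from dataclasses import dataclass

# Reuses TestStep, TestCase, and tc from the sketch above.

@dataclass
class StepResult:
    step: TestStep
    actual_result: str  # objective evidence captured at execution time
    passed: bool

def execute_test_run(test_case: TestCase) -> list[StepResult]:
    """Walk the tester through each step and record the outcome."""
    results = []
    for step in test_case.steps:
        print(f"Action:   {step.description}")
        print(f"Expected: {step.expected_result}")
        actual = input("Actual result observed: ")
        verdict = input("Pass? [y/n]: ").strip().lower() == "y"
        results.append(StepResult(step, actual, verdict))
    return results

# A run passes only if every step passes; the step records are
# retained as the evidence set for the run.
run_results = execute_test_run(tc)
run_passed = all(r.passed for r in run_results)
```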
Test runs then become the data set containing the evidence behind verification and validation approval. The following is an example of a test run execution interface.
*Figure: Test Run*
Test Reporting
Once the tests have been executed, the results are included in the test case traceability and are available for reporting. One example of such reporting is the standard test result dashboard shown below.
*Figure: Test Results Document*
The system interface provides a test engineer with all results needed to determine the current level of product compliance with performance requirements. In addition, the data can now be automatically built into test report documents without further formatting.
*Figure: Test Reporting*
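As a minimal sketch of this roll-up, continuing the hypothetical objects above, the following aggregates step results into report lines. It is illustrative only and is not the Polarion reporting engine.

```python
# Reuses TestCase, StepResult, tc, and run_results from the sketches above.

def summarize(run_results: list[StepResult]) -> dict[str, int]:
    """Count passed and failed steps for a run."""
    passed = sum(1 for r in run_results if r.passed)
    return {
        "total_steps": len(run_results),
        "passed": passed,
        "failed": len(run_results) - passed,
    }

def report_lines(test_case: TestCase, run_results: list[StepResult]) -> list[str]:
    """Build report text with the traceability context attached."""
    s = summarize(run_results)
    status = "PASS" if s["failed"] == 0 else "FAIL"
    return [
        f"Test report for {test_case.case_id}: {test_case.title}",
        f"Traces to: {', '.join(test_case.verifies)} ({test_case.kind})",
        f"Steps: {s['total_steps']}, passed: {s['passed']}, failed: {s['failed']}",
        f"Overall result: {status}",
    ]

print("\n".join(report_lines(tc, run_results)))
```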
As you can see, integrating verification and validation testing with the product design controls provides a rapid, accurate method for assuring a complete test suite and recovering contextual evidence.
Next up is integration of risk.




