Andrew Dallas

May 14, 2010

Caution: V&V May Be Hazardous to Software Quality

With the continued uncertainty regarding how much additional information FDA will ultimately require for the 510(k) and PMA application processes, some companies are responding by dramatically increasing the level of detail and the volume of their technical documentation. This trend is particularly evident in software engineering design documentation and in verification and validation (V&V) documentation.

Federal regulations for medical devices require that OEMs provide the following:

• Design input and output requirements documentation.
• Design verification performed and documented to confirm that design outputs meet design input requirements.
• Design validation performed and documented to ensure that the medical device meets user needs for intended use.

The House That Software Built

Industry has recently seen a major rise in the level of detail in software design and V&V documentation, at least in part because of uncertainty regarding the amount of information FDA requires. I have seen companies attempt to make V&V documentation so detailed that anyone can execute the test protocols. This may be motivated by a desire to employ low-cost testers instead of experienced software quality assurance (SQA) engineers. The problem is that by the time SQA engineers have documented a test protocol detailed enough for anyone to execute, they have spent so much time writing it that they could have run the protocol themselves. The approach yields absolutely no time or cost savings.
 

Creating excessively detailed design and test documentation is problematic. Even small feature changes are costly to implement, because the V&V documentation must then be changed. The resulting changes cascade so that all related documents must be modified. This is significant rework. In some cases, even desirable features are not added because the resulting document changes are prohibitive from a cost or scheduling perspective.
 

An alarming practice is having SQA engineers spend 90% of their time on a project developing and maintaining V&V documents. For detailed V&V documentation to be ready for execution by the end of the project, SQA engineers must begin writing the documents immediately after the functional and design specifications are complete. By the time the documentation is complete, the software is also complete, leaving actual software testing to occur at the very end of the software project.
 

This obsession with creating such detailed V&V documentation may lead some medical device software managers to forget a basic tenet of software development best practices. Specifically, software quality can most efficiently be achieved by building software incrementally, on a solid foundation of thoroughly tested software.
 

In software development, a milestone is a point at which the state of the software is formally tested. At each milestone, the software must be thoroughly tested and debugged before moving on to the next one. Often a milestone includes a logically grouped set of features that must be completed before additional features can be started. Building on a solid foundation is an apt metaphor: just as a well-built house must be constructed on a solid foundation, each milestone represents a portion of that foundation. Conducting software testing only after the system is complete leaves little time to execute that testing. It leads to poor-quality software, which can only be mitigated by extending the project deadline to allow for intense testing and the subsequent rework.
 

Correcting software bugs at this stage of product development could require entire sections of the software to be completely rewritten and retested. Returning to the foundation metaphor, if the first layer of a foundation crumbles under the weight of the successive layers, the whole foundation must be torn down and rebuilt.
 

While the software development team is coding the first set of base features, the QA team should be planning test strategy, not writing protocols. Once the first round of coding is complete (the first milestone), QA should begin executing software tests, including some ad hoc testing. It is a common misconception that ad hoc testing is random testing. During ad hoc software testing, experienced SQA engineers quickly run tests of their own choosing based on their knowledge of the software, best practices in SQA, and their own creativity. There is no requirement that ad hoc testing be thoroughly documented, and no FDA guideline states that every test must be documented. A common question that is raised regarding ad hoc testing is “if it's not thoroughly documented, how do you know what you tested?” A better question to ask is “have our most experienced SQA engineers performed sufficient ad hoc testing?”
 

FDA Wants What It Wants: No More, No Less

FDA regulations require that OEMs demonstrate through documentation the activities performed to verify that requirements were met, as well as the activities performed to confirm that the software conforms to user needs and intended use (see the sidebar “FDA and Software”).
 

“A conclusion that software is validated is highly dependent upon comprehensive software testing, inspections, analyses, and other verification tasks performed at each stage of the software development life cycle.”
—General Principles of Software Validation; Final Guidance for Industry and FDA Staff

Documented tests describe a logical series of steps with known inputs and an expected result. One shortcoming of this approach is that users do not always follow a logical series of steps. Users can interact with software in unpredictable ways because of distractions, mistakes, or misunderstandings of the system workflow. Additionally, there can be interactions between software modules that cannot be predicted during test case planning. So although documented testing is the correct approach for confirming that requirements are fulfilled, it is poorly suited to uncovering the defects that users could encounter.
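
To make the distinction concrete, the sketch below (written in Python purely for illustration; the function, its parameters, and the numeric values are hypothetical and not drawn from any real device) shows what a documented verification test reduces to: known inputs, a scripted sequence of steps, and one expected result. Each such test confirms a single predictable path; the unpredictable user behavior and module interactions described above fall outside it.

    import unittest

    # Hypothetical code under test -- the name, parameters, and limits are
    # illustrative only and are not taken from any real device software.
    def compute_infusion_rate(volume_ml, duration_min):
        """Return the infusion rate in mL/hr, rejecting nonpositive inputs."""
        if volume_ml <= 0 or duration_min <= 0:
            raise ValueError("volume and duration must be positive")
        return round(volume_ml / duration_min * 60, 1)

    class TestInfusionRateRequirement(unittest.TestCase):
        """Verification test traceable to a hypothetical requirement:
        the system shall compute infusion rate as volume/duration, in mL/hr."""

        def test_known_inputs_expected_result(self):
            # Step 1: supply known inputs (100 mL delivered over 30 minutes).
            # Step 2: compare the output against the documented expected result.
            self.assertEqual(compute_infusion_rate(100, 30), 200.0)

        def test_invalid_input_rejected(self):
            # A documented negative case: a zero duration must be rejected.
            with self.assertRaises(ValueError):
                compute_infusion_rate(100, 0)

    if __name__ == "__main__":
        unittest.main()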
 

Therefore, an appropriate amount of ad hoc testing is consistent with FDA guidelines. The key is finding the balance between documentation and testing. Sufficient time and resources must be devoted to writing the V&V documentation as the software is being coded so that the tests are ready for execution when coding is complete. But the need to generate documentation should not prevent ad hoc software testing from occurring as milestones are reached.
 

Detailed V&V documentation is not a substitute for well-executed software testing and a proper SQA process. Conversely, extensive software testing does not eliminate the requirement to produce and execute sufficiently detailed V&V documentation. It is the SQA team's responsibility to estimate the time needed to develop V&V documentation. The appropriate level of detail depends on clinical and safety considerations, as well as on the project’s size and complexity. However, conducting intense software testing before developing the V&V documentation gives the SQA team valuable experience in using the software, which makes the time estimates for developing and executing the V&V documentation much easier to calculate. It also reduces the time required to write the documentation.
 

It is unreasonable to expect V&V documentation to trace every possible path through today’s software. Modern software is simply far too complex. This is why SQA engineers often employ sophisticated test methods, e.g., automated testing, to reach high levels of software quality. In addition, the software development team must also employ detailed code inspections, static analysis, and run-time analysis tools where appropriate as part of the formal development and test process.
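
As a minimal sketch of what automation adds (again in Python, reusing the hypothetical infusion-rate function from the earlier example; the input values are invented for illustration), a single automated test can sweep combinations of inputs that would be impractical to enumerate as individually documented manual test steps. Static analysis and run-time analysis tools complement this kind of testing but are not shown here.

    import itertools
    import unittest

    # Same hypothetical function used in the earlier sketch.
    def compute_infusion_rate(volume_ml, duration_min):
        if volume_ml <= 0 or duration_min <= 0:
            raise ValueError("volume and duration must be positive")
        return round(volume_ml / duration_min * 60, 1)

    class TestInfusionRateSweep(unittest.TestCase):
        """Automated sweep over input combinations that would be tedious
        to write up as separate, fully documented manual test cases."""

        def test_input_combinations(self):
            volumes_ml = [10, 100, 1000, 5000]      # representative values, invented
            durations_min = [1, 30, 60, 480]
            for volume, duration in itertools.product(volumes_ml, durations_min):
                with self.subTest(volume=volume, duration=duration):
                    rate = compute_infusion_rate(volume, duration)
                    self.assertGreater(rate, 0.0)   # invariant: rate is always positive

    if __name__ == "__main__":
        unittest.main()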
 

Although the V&V documentation must be sufficiently detailed to satisfy FDA’s guidelines, these guidelines exist to ensure product safety and efficacy. FDA never intended the requirements to be so burdensome as to limit testing of the medical device software. V&V is not software testing. Verification testing ensures specified requirements have been fulfilled. Validation testing ensures that particular requirements for a specific intended use can be consistently fulfilled.

When considering software engineering documentation, it may be useful to address functional specifications, design documentation, and V&V documentation with the following question: “What is the purpose of this document, and is it sufficiently detailed to achieve that purpose?”
 

Although there is no formula for determining the optimum balance between testing and V&V documentation, it may be safe to say that if the SQA team is spending 95% of its time on a project writing and updating documentation, then not enough software testing is being executed. Some projects require more documentation than others because of overall system complexity, clinical safety, or any number of other factors. However, the same general rule applies.
 

Conclusion

Even if FDA were to change its guidelines on engineering documentation, its goal is, and always will be, to ensure the highest-quality software possible with respect to patient safety and product efficacy. Documentation, even perfectly detailed documentation, is not a substitute for good SQA practices. If a quality management system is so documentation-heavy that SQA engineers spend the majority of their time documenting and very little time testing, it is time to review the system.
 

Documentation must be sufficient to meet the clinical and safety requirements of a product and FDA guidelines. But software testing must also be sufficient to meet best practices in software development. Finding the correct balance is the key to achieving both goals.
