SOFTWARE : Implementation of Design Controls Offers Practical Benefits


Originally published July, 1996

Daniel P. Olivier

A major change in FDA's proposed revised good manufacturing practices (GMP) regulation is the addition of design controls.1 The design control elements of the revised GMPs closely resemble the design control elements defined in International Organization for Standardization (ISO) 9001 section 4.4 (see Table I).2

Some in industry are concerned that imposing design controls will escalate costs without commensurate benefit. Manufacturers point out that new regulatory requirements have historically increased costs and prolonged development and manufacturing schedules while returning minimal payback, and design controls are perceived as another such added cost that hampers productivity and delays time to market. The claim that design controls can instead add to the bottom line, through increased quality, higher productivity, and reduced time to market, requires some justification. This article describes a practical implementation of design controls that can provide quantifiable benefits in increased quality and reduced development schedules.

WHAT ARE DESIGN CONTROLS?

It is important to clarify each of the elements of design controls. The GMP design control categories will be used for purposes of this discussion rather than the categories as defined in ISO 9001. Design controls are considered from a general engineering perspective and are not meant to be applied to a single discipline, such as software, but rather to all product development.

Design and Development Plan. The first element of design controls is to have an organized plan as the basis for establishing design and development activities.3 The plan should define the activities that are to be conducted in support of the design effort and the parties responsible for these activities. The plan should also define the relationship between departments, technical disciplines, and subcontractors that are contributing to the design. Because the responsibilities of individual parties may change throughout the design process, the plan is commonly updated as the design evolves. The plan should provide a general framework to guide the design and development process; if the plan is too specific, it cannot be followed and will quickly become out of date.

Defining a design and development plan helps establish an understanding among all parties of what deliverables are to be provided and their scheduled delivery dates. Review milestones, regulatory submission requirements, and potential risk conditions are identified at the start of the design process. The plan is not meant to constrain a manufacturer's ability to change the design, but to provide assurance that a formal process is applied that will ensure the quality objectives are continually stressed.

A template for a design and development plan is provided in Figure 1. This outline reflects a standard approach for project management plans as defined by the Institute of Electrical and Electronics Engineers (IEEE) in the IEEE Standard for Software Project Management Plans, 1058.1-1987.4 The outline shown has been tailored for medical device applications recognizing regulatory submission and safety risk assessment requirements.
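
Figure 1 itself is not reproduced here, but as a rough illustration of what such an outline covers, the sketch below lists plan sections loosely patterned on the major clauses of IEEE Std 1058.1-1987, with medical device additions for regulatory submissions and safety risk assessment. The section and item names are illustrative assumptions, not the actual template in Figure 1.

```python
# Illustrative sketch only: a design and development plan skeleton expressed as
# a checklist, loosely following the major sections of IEEE Std 1058.1-1987.
# The section names and the medical device additions (regulatory submissions,
# safety risk assessment) are assumptions for illustration, not the article's
# actual Figure 1 template.

PLAN_SECTIONS = {
    "Introduction": ["Project overview", "Deliverables", "Evolution of the plan"],
    "Project organization": ["Organizational structure", "Responsibilities",
                             "Interfaces to subcontractors and other departments"],
    "Managerial process": ["Objectives and priorities", "Risk management",
                           "Review milestones", "Monitoring and controlling mechanisms"],
    "Technical process": ["Methods, tools, and techniques", "Documentation plan",
                          "Verification and validation approach"],
    "Medical device additions": ["Regulatory submission requirements",
                                 "Safety risk assessment activities"],
    "Work packages, schedule, and budget": ["Work packages", "Dependencies",
                                            "Schedule", "Resource allocation"],
}


def print_plan_outline(sections: dict[str, list[str]]) -> None:
    """Print the plan skeleton so a team can confirm every section is addressed."""
    for heading, items in sections.items():
        print(heading)
        for item in items:
            print(f"  - {item}")


if __name__ == "__main__":
    print_plan_outline(PLAN_SECTIONS)
```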

Design Input. The design input stage results in the development of the preliminary requirements specification. This document defines the essential elements of the product, and the final version becomes the basis for evaluating when the product's design and development is complete. A key element of this phase is for all affected parties to agree when the specification is sufficiently complete and correct to begin formal development. Failure to gain buy-in and understanding of objectives, markets, and key requirements can later result in significant problems as new changes are introduced.5 Changes introduced late in the development process are much more expensive to implement and may delay the project when pressures to accelerate the schedule become more demanding.

Design input reviews are often criticized by management as introducing schedule delays up front, when design implementation activities such as developing schematics and writing code are perceived to be what is really needed to accelerate development. The fallacy of this complaint is that rushing into implementation before the requirements are defined and agreed upon can later result in considerable effort wasted in rework. Numerous changes are introduced late in the project when engineering realizes that "the needs of the customer" were never clarified. For an engineering manager, one of the best strategies for managing schedule risk is a signed requirements specification that forms the basis for minimizing future design changes.

Design Output. The design output includes the documents, design descriptions, analyses, test procedures, and test results produced during actual design activities. The final design output must demonstrate that the design satisfies the design input requirements.6 To satisfy this objective, the requirements must be quantifiable so test cases can be developed to show that the desired functions have been correctly implemented.
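
To illustrate what a quantifiable requirement and its corresponding test case might look like, consider the hedged sketch below. The requirement identifier (REQ-042), the AlarmController module, and the 5-second alarm latency limit are invented for this example; they are not drawn from any particular device specification.

```python
# Minimal sketch: a hypothetical, quantifiable requirement and a unit test that
# demonstrates it. The requirement REQ-042, the AlarmController class, and the
# 5-second limit are invented for illustration.
import unittest


class AlarmController:
    """Toy controller: raises an alarm when a reading exceeds its limit."""

    ALARM_LATENCY_LIMIT_S = 5.0  # REQ-042: alarm within 5 seconds of a violation

    def __init__(self, high_limit: float):
        self.high_limit = high_limit
        self.alarm_time: float | None = None

    def process_reading(self, value: float, timestamp_s: float) -> None:
        # In this toy model the alarm is asserted as soon as a violation is seen.
        if value > self.high_limit and self.alarm_time is None:
            self.alarm_time = timestamp_s


class TestReq042AlarmLatency(unittest.TestCase):
    """Traceable to REQ-042: alarm asserted within 5 s of the limit violation."""

    def test_alarm_within_latency_limit(self):
        ctrl = AlarmController(high_limit=100.0)
        ctrl.process_reading(value=120.0, timestamp_s=10.0)  # violation at t = 10 s
        self.assertIsNotNone(ctrl.alarm_time)
        self.assertLessEqual(ctrl.alarm_time - 10.0, AlarmController.ALARM_LATENCY_LIMIT_S)


if __name__ == "__main__":
    unittest.main()
```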

The actual presentation of the design output may be accomplished in many ways, based on the application, design approach, and tools used in support of the design process. For hardware development this may include functional block diagrams, schematics, drawings, and detailed component specifications. For software the design output may include design descriptions, module hierarchy charts, object-oriented diagrams, and other representations of the program.

Design Review. Reviews have been shown to be an effective method for improving product quality. The revised GMP and the ISO standard have recognized the benefit of reviews and have mandated that formal design reviews be conducted at "appropriate stages" of the design process. The wording "formal reviews" means that they must be documented. The review process is essential for ensuring the quality of the design input and design output specifications.7 Reviews are important factors in increasing product quality and improving the efficiency of the design and development process. Some of the benefits derived from reviews include the early identification of problems, open discussion among development groups, the opportunity to train new personnel, identification of unnecessarily complex design characteristics, and the potential for better solutions.

The most obvious benefit is the early identification of errors, which includes identifying design deficiencies or possible reliability errors. The earlier in the design process that errors are identified, the more easily they can be corrected. A requirements or coding error identified during final acceptance testing is very expensive to correct. Not only must the error be corrected, but changes may also be required in dependent code. Documentation must also be updated, and previously run acceptance test procedures must be reexecuted.

When reviews are conducted, they provide a forum for open discussion among the various development groups, frequently clarifying misunderstandings and inconsistent interpretations of requirements. During the design process many unexpected problems are identified, and resolving them results in slight variances in implementation. Although the individual designer often perceives these variations as having minimal impact, failure to communicate them can have significant implications for other designers responsible for interfacing components.

The review is an excellent opportunity to educate new personnel about the techniques used by more experienced designers to solve complex problems. The training of newly hired engineers rarely provides adequate insight into the nuances of product development. Instead, the new engineers are often left on their own to unravel the intricacies of equipment design that can be extremely complicated and the result of years of dedicated effort. Design reviews provide an excellent forum for training on new products, emerging technology, and sophisticated implementation techniques that have been learned by the more experienced design engineers.

Review meetings provide an excellent opportunity to make intricate and sometimes unnecessarily complex design characteristics visible. When a design must be presented to engineering peers, there is added incentive to ensure that the design is not only correct, but also structured and documented in accordance with company standards. In the absence of a formal design review, documentation tasks are often left to the last minute and either rushed or never addressed.

Because reviews offer the opportunity to discuss designs, better solutions may become apparent. Formalizing and documenting the design architecture is often the first step in identifying improvements: until the design is written down, it may remain vague and not well understood, even by the designer. Formal documentation of the design architecture can often lead to more elegant solutions that improve performance, reduce errors, and lower maintenance costs.

Design Verification and Validation. Verification and validation activities must demonstrate that the design satisfies the requirements as defined in the design input specification. Verification is the incremental checking performed during each phase of the design and development process to ensure that the design process is being followed and that the low-level derived requirements have been correctly implemented. Verification activities include unit level testing, analysis studies, integration level testing, and reviews.

Validation is assurance that the product conforms to the defined input requirements and needs of the customer. Validation must also encompass stress testing in an actual or simulated use environment. There are numerous instances of products being tested to the requirements defined by the manufacturer but failing in the field. This is because the user environment was much more demanding than or different from that defined by the requirements specification. It does a manufacturer little good to have a product that meets specifications but fails in the marketplace due to an incomplete or incorrect requirements definition.

Test procedures for both hardware and software must be developed to demonstrate the implementation of the design at multiple levels. For most development efforts, test procedures are characterized at the unit, integration, and system levels, as specified in the FDA Reviewer Guidance and ISO 9000-3.8,9 To satisfy the requirements of the GMP regulation and ISO, the defined test procedures and the results of testing must be formally documented.
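
One way to keep the required evidence organized, sketched here only as an illustration and not as a prescribed format, is to record each test procedure, its level, the requirements it verifies, and its result in a simple structure from which a validation summary can be generated. The identifiers and field names are assumptions for this example.

```python
# Illustrative sketch, not a prescribed format: a simple record structure for
# documenting test procedures and their results at the unit, integration, and
# system levels, so the evidence required by the GMP and ISO can be retained.
from dataclasses import dataclass, field
from enum import Enum


class TestLevel(Enum):
    UNIT = "unit"
    INTEGRATION = "integration"
    SYSTEM = "system"


@dataclass
class TestRecord:
    procedure_id: str           # e.g., "TP-UNIT-017" (identifier invented here)
    level: TestLevel
    requirement_ids: list[str]  # design input requirements the procedure verifies
    result: str                 # "pass" / "fail"
    executed_by: str
    executed_on: str            # ISO date string
    anomalies: list[str] = field(default_factory=list)


def summarize(records: list[TestRecord]) -> dict[TestLevel, tuple[int, int]]:
    """Return (passed, total) counts per test level for a validation summary report."""
    summary: dict[TestLevel, tuple[int, int]] = {}
    for level in TestLevel:
        at_level = [r for r in records if r.level == level]
        passed = sum(1 for r in at_level if r.result == "pass")
        summary[level] = (passed, len(at_level))
    return summary
```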

The revised GMP regulation includes the requirement that verification and validation also encompass a risk assessment that identifies potential sources of patient or operator injury and evaluates the probability and severity of occurrence. This risk assessment is a very effective technique for providing the manufacturer added confidence that, if failures are experienced, they will not lead to injury.
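
A common way to structure such an assessment, shown below purely as an illustration rather than as the method the regulation prescribes, is to rank each identified hazard by severity and probability of occurrence and flag any combination that exceeds an acceptability threshold. The scales and the threshold value in the sketch are assumed for this example.

```python
# Hedged sketch of one common risk-screening approach: rank each identified
# hazard by severity and probability of occurrence and flag combinations that
# exceed an acceptability threshold. The scales and threshold below are
# illustrative assumptions, not values defined by the GMP regulation.
from dataclasses import dataclass

SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}
PROBABILITY = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4, "frequent": 5}
ACCEPTABLE_RISK_INDEX = 6  # assumed threshold for this example


@dataclass
class Hazard:
    description: str
    severity: str
    probability: str

    @property
    def risk_index(self) -> int:
        return SEVERITY[self.severity] * PROBABILITY[self.probability]

    def requires_mitigation(self) -> bool:
        return self.risk_index > ACCEPTABLE_RISK_INDEX


hazards = [
    Hazard("Alarm fails to sound on limit violation", "critical", "remote"),
    Hazard("Display shows stale reading", "marginal", "occasional"),
]
for h in hazards:
    print(f"{h.description}: index={h.risk_index}, mitigate={h.requires_mitigation()}")
```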

Design Transfer. Design transfer means ensuring that the product produced by manufacturing is the same as that envisioned by the design and development engineers. Design transfer is an explicit element of the GMP regulation (820.30(g)) but is not defined as a separate element within the ISO design controls. It is FDA's experience that this element of a manufacturing process is often neglected, leading to product recalls. The classic example is a design that is completed by engineering and "thrown over the wall" to manufacturing. Manufacturing is left with an incomplete understanding of the requirements, insufficient information on critical tolerances, and no opportunity to say whether the defined tolerances can be achieved. Design transfer should include not only coordination between design engineering and manufacturing at the end of the design process, but also manufacturing involvement in the early design and development stages, beginning with the design input phase. Early coordination can support manufacturing process validation and also prevent costly rework when it is learned that the capabilities demonstrated by prototype units cannot be achieved with production systems.

Design Changes. Changes to the approved design are accomplished with controls similar to those applied to the initial development process. The changed design must be subjected to the same documentation, validation, review, and approval procedures. This does not mean the entire validation must be reexecuted for a small design change, but that the applicable procedures must be followed. For a small change, a rationale may be documented justifying why only a subset of test procedures, covering the affected functions, is needed to validate the changed design. In all cases, changes should be reviewed and approved by individuals with the same responsibilities as those who performed the original review and approval.
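
A requirements-to-test traceability mapping makes that rationale straightforward to document. The sketch below, with invented requirement and procedure identifiers, shows how the subset of procedures affected by a small change might be derived.

```python
# Minimal sketch, under assumed data: use a requirements-to-test traceability
# mapping to justify which subset of validation procedures must be reexecuted
# for a small design change. The requirement and procedure identifiers are
# invented for illustration.

# Which test procedures verify which design input requirements.
TRACEABILITY = {
    "REQ-010": ["TP-SYS-001", "TP-INT-004"],
    "REQ-042": ["TP-SYS-007", "TP-UNIT-017"],
    "REQ-055": ["TP-SYS-002"],
}


def regression_subset(changed_requirements: list[str]) -> set[str]:
    """Return the test procedures affected by the changed requirements."""
    affected: set[str] = set()
    for req in changed_requirements:
        affected.update(TRACEABILITY.get(req, []))
    return affected


# A change touching only REQ-042 would need TP-SYS-007 and TP-UNIT-017 rerun,
# with the rationale (this mapping) documented and approved in the usual way.
print(sorted(regression_subset(["REQ-042"])))
```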

Design History File. The design history file has been introduced in the revised GMP regulation as a repository for the documentation that describes the design history for a device. This includes documentation defined as a part of design controls and data on supplemental analyses, reviews, and studies. It is important for a manufacturer to characterize which of these document versions are to be maintained as part of the design history file.10 Some documents may be identified as informal and not officially retained for future audit purposes. This distinction may have significant implications for large projects that undergo numerous iterations during the development process.

BENEFITS OF DESIGN CONTROL

The preceding discussion has suggested that there are tangible benefits to implementing design controls as defined in both the revised GMP and ISO 9001. Yet in many cases, when the time for conducting formal reviews draws near, the pressure to accelerate time to market results in reviews being canceled in an attempt to compress schedules. Though numerous instances show that cost savings can be realized through design reviews, the success stories are often dismissed as practical for other projects but not applicable to the current product.

The addition of reviews during the early project stages will require more time, with the perception that valuable code and prototype development time will be lost. However, history has demonstrated that the investment in reviews is repaid in reduced rework during testing and reduced errors in maintenance (see Figure 2).

The chart shows two critical characteristics of design control reviews. The first is a savings in the effort invested in the development project. The added time required up front for reviews is represented by the shaded area on the left of the graph; the additional time spent in rework and refining the product during later stages if reviews are not conducted is shown by the shaded area on the right. The area on the left is smaller than the area on the right, representing less effort and, therefore, reduced cost when design reviews are conducted. The effort is reduced because errors are found early in the development process, when they are much easier to correct. Studies cited elsewhere report, for example, that AT&T Bell Laboratories found inspections (a form of review) to be 20 times as effective as testing in finding errors.11

A second benefit of design control reviews, as shown in Figure 2, is increased quality. The improvement is demonstrated by the area under the curve for the maintenance phase, which is smaller when design control reviews are conducted, representing fewer errors and less rework. With design control reviews, errors are identified and removed at each project phase, so fewer residual errors are propagated to subsequent phases, resulting in a higher level of quality for the final product. Researchers have found a tenfold reduction in the number of errors reaching each phase of testing for projects with a full system of reviews.12

The best way to overcome the tendency to delay and cancel reviews is to collect measures of the actual costs and benefits realized as a result of reviews. The amount of effort invested in reviews (cost) and the number and type of errors found (benefits) should be quantified. When the effort associated with the correction of errors during each phase of development has been measured, the cost return for reviews can be calculated. Realizing that the cost of error correction is significantly greater for errors found in system test and after release helps in understanding the significance of the savings that can be achieved through effective reviews. Real data can help to accurately evaluate the benefits of design control reviews.
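
As a toy illustration of such a calculation, the sketch below estimates the net return on review effort from three measured quantities: the hours spent in reviews, the number of errors they found, and the average cost of fixing the same class of error had it escaped to system test. The numbers are invented for the example; real figures would come from a manufacturer's own measurements.

```python
# Toy calculation with made-up numbers: estimate the net return on review
# effort from (a) hours spent in reviews, (b) errors found in review, and
# (c) the measured average cost of fixing the same class of error if it had
# instead escaped to system test. None of these figures come from the article.

review_effort_hours = 40                 # preparation + meeting time for one project phase
errors_found_in_review = 12              # defects logged during the reviews
avg_fix_cost_in_review_hours = 2         # measured average rework per defect found in review
avg_fix_cost_in_system_test_hours = 20   # measured average rework per defect found in test

cost_with_reviews = review_effort_hours + errors_found_in_review * avg_fix_cost_in_review_hours
cost_without_reviews = errors_found_in_review * avg_fix_cost_in_system_test_hours

savings_hours = cost_without_reviews - cost_with_reviews
print(f"Estimated hours saved by reviewing: {savings_hours}")  # 240 - 64 = 176
```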

CONCLUSION

It is important to understand that there is a learning curve to overcome when implementing design controls. As companies begin implementation, they sometimes become frustrated by the amount of time spent in review meetings and become discouraged, seeing reviews bog down in discussions not directly related to product quality. These experiences may lead to decisions to revert to previous practices on the grounds that the review costs do not justify the benefits. The arguments presented here suggest that this is a shortsighted decision. The benefits achieved from design control reviews are real, but a dedicated commitment to realizing them is essential. The learning-curve problems can be addressed through training of personnel and monitoring of review effectiveness.

The real argument for conducting design control reviews should rest not on regulatory compliance alone, but on measurable business benefits, including reduced development schedules and improved product quality.

REFERENCES

01."CFR 820 Working Draft of the CGMP Final Rule--Regulation," Federal Register, July 1995.

02.ISO 9001, Quality Systems--Model for Quality Assurance in Design, Development, Production, Installation and Servicing, Geneva, International Organization for Standardization (ISO), 1994.

03.Link D, "Design and Development Planning," in Med Dev Diag Indust Design Control Reprint Series, Santa Monica, CA, Canon Communications, pp 6­9, 1994.

04.Software Project Management Plans, IEEE Std 1058.1-1987, Piscataway, NJ, Institute of Electrical and Electronics Engineers, 1987.

05.Wurzel RD, "Design Input: Listening to the Marketplace," in Med Dev Diag Indust Design Control Reprint Series, Santa Monica, CA, Canon Communications, pp 10­14, 1994.

06.Sandberg J, "Design Output," in Med Dev Diag Indust Design Control Reprint Series, Santa Monica, CA, Canon Communications, pp 15­20, 1994.

07.Stevens JL, "Design Verification," in Med Dev Diag Indust Design Control Reprint Series, Santa Monica, CA, Canon Communications, pp 21­24, 1994.

08."Reviewer Guidance for Computer Controlled Medical Devices Undergoing 510(k) Review," Rockville, MD, FDA, August 1991.

09.Reilly SC, "Design History Files: The Why, What, and How," Med Dev Diag Indust, 17(5): 162­168, 1995.

10.ISO 9000-3, Quality Management and Quality Assurance Standards--Part 3: Guidelines for the Application of 9001 to the Development, Supply and Maintenance of Software, Geneva, ISO, June 1991.

11.Humphrey WS, Managing the Software Process, Menlo Park, CA, Addison-Wesley, pp 186­187, 1989.

12.Handbook of Walkthroughs, Inspections, and Technical Reviews, 3rd ed, Freedman DP, and Weinberg GM (eds), New York City, Dorset House, p 12, 1990.

Dan Olivier is president of Computer Applications Specialists, Inc. (San Diego), a software validation and development services company.
