LEAK TESTING : Computerized Dry-Air Leak Testing for Process Control

Medical Device & Diagnostic Industry Magazine | MDDI Article Index

Originally published January 1996

Jacques Hoffmann


The growing need for more-stringent quality control, faster throughput, and lower manufacturing costs is increasing the importance and use of volumetric leak-rate testing of components by medical device manufacturers. This trend is also being driven by improvements in the technologies that underlie leak-rate-testing systems, particularly those systems based on mass flow.

Device components intended to contain liquid, gas, pressure, or a vacuum environment--which are common to analytical, diagnostic, and therapeutic systems--may require specific leak-rate readings for various characteristics, including material porosity, seals, assembly deficiencies, fit-and-function problems, and fastening and joining integrity. Volumetric leak-rate data for both accepted and rejected parts may also be required by OEMs as input for statistical process control (SPC) and other quality assurance programs. Such component-level data may be needed to ensure that final assemblies or products comply with stringent medical regulations and standards, particularly in light of the revision of FDA's good manufacturing practices regulation. In some cases, leak testing can also verify the ability of packaging to maintain interior sterility.


In the past, however, the acceptance of high-speed leak detection technology by the medical device industry has lagged behind other industries for various technical and economic reasons. One reason is that many medical products are manufactured in small quantities or in myriad models or variations. Often, the required test routines involve conducting a series of tests in sequence on the same device. Another reason is that some medical devices are manufactured from resilient materials that expand when pressurized. This results in long test times and runs the risk of producing inaccurate results. In addition, many test environments in the device industry require cleanrooms or settings with similarly hygienic conditions. In combination, these factors have often made it difficult for device manufacturers to justify purchasing automatic leak-test equipment.

Despite these obstacles, increasingly stringent leakage and flow specifications are prompting many device manufacturers to seriously consider high-speed automated leak testing. At the same time, advances in leak-test technology have provided the flexibility to test a variety of different devices requiring different test routines using a single, low-cost leak-test instrument. Such instruments can be furnished with low-cost sequencers that allow multiple tests to be performed in sequence on one device or on several similar ones.

Highly sensitive leak-sensing transducers linked to sophisticated software now permit rapid, accurate leak tests, even for nonrigid products. These computerized systems can recognize the distortion characteristics of resilient enclosures and subtract data that do not represent a true leak.


Historically, leak testing of containment parts has involved some form of wet-bubble testing. The goal is simply to determine whether or not a part leaks. To do so, an operator pressurizes the part, submerges it in water, and then watches for a stream of escaping bubbles, which signals a leak.

Although it can detect and locate minute leaks, wet-bubble testing cannot measure the exact rate of leakage. The process is also slow and demands the constant attention of a skilled observer. Another drawback is that the part being tested usually must be dried before it can proceed through manufacturing or shipping.

Early dry-air leak-testing systems were also limited primarily to go/no-go indications, either because they could not produce a volumetric leak-rate reading or because uncontrollable variables made the accuracy of their readings suspect. Advances in sensor technology and software, however, have allowed dry-air leak testing to overcome these shortcomings. A fully or semiautomated dry-air leak-testing system speeds testing and lessens the need for specially trained operators. Multiple testing programs can be stored and then recalled quickly, allowing one testing station to readily accommodate a range of parts.

These computerized systems can provide graphic, real-time monitoring of test conditions and test-cycle dynamics and produce unambiguous readings keyed to product serial numbers. This information can be printed out, stored for later review, or downloaded to other computers in a form compatible with standard spreadsheet software. These systems can keep running counts of accept-versus-reject results, as well as the causes of rejects and the date and time of each test. They can perform calculations on stored data to prepare statistical records such as averages and standard deviations as part of SPC analysis. The result is typically improved efficiency, productivity, and quality.
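The record keeping described above can be illustrated with a short sketch. The serial numbers, leak-rate values, and accept/reject threshold below are all hypothetical, but the arithmetic (running accept/reject counts plus the mean and standard deviation used in SPC analysis) mirrors what such a system computes.

```python
from statistics import mean, stdev

# Hypothetical leak-rate readings (sccm) keyed to product serial numbers,
# as a computerized test station might store them for SPC analysis.
results = {
    "SN-1001": 0.42, "SN-1002": 0.45, "SN-1003": 0.39,
    "SN-1004": 0.88, "SN-1005": 0.41, "SN-1006": 0.44,
}
REJECT_LIMIT = 0.60  # assumed accept/reject threshold, sccm

accepted = {sn: q for sn, q in results.items() if q <= REJECT_LIMIT}
rejected = {sn: q for sn, q in results.items() if q > REJECT_LIMIT}

avg = mean(results.values())   # running average for SPC records
sd = stdev(results.values())   # standard deviation for SPC records

print(f"accepted: {len(accepted)}, rejected: {len(rejected)}")
print(f"mean leak rate: {avg:.3f} sccm, std dev: {sd:.3f} sccm")
```

In practice these values would be appended to a dated log per test, then exported in a spreadsheet-compatible form as the article describes.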


Dry-air leak testing provides a way to satisfy today's more stringent testing requirements without creating a bottleneck in the production process. Two basic methods of such testing are suitable for production line applications. One measures the rate of pressure decay, while the other directly measures the leak rate based on mass flow.

In the pressure-decay method, the test part is pressurized and then isolated from the pressure source (see Figure 1). Calculations are then used to convert any change in the part's gage pressure over time into a measure of the rate of leakage. A faster, more accurate version of this approach, known as the differential-pressure method, pressurizes a reference volume along with the test part (see Figure 2). A transducer then reads the difference in pressure between the reference and the test item over time, and calculations are again needed to convert this difference into a measure of leakage. With both methods, ambient temperature changes, drafts, deformed test parts, seal creep, and other adverse conditions can cause problems.
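The conversion from pressure change to leak rate can be sketched as follows. Assuming ideal-gas, isothermal conditions and a known internal test volume (all numeric values here are hypothetical), a pressure drop dP over an interval dt corresponds to a volumetric leak rate referenced to atmospheric pressure:

```python
# Illustrative pressure-decay calculation (assumed ideal-gas, isothermal
# conditions; all values hypothetical). A pressure drop dP in a known test
# volume V over time dt converts to a leak rate in standard cc per minute:
#     Q = V * (dP / P_atm) / dt

P_ATM = 14.7        # atmospheric reference pressure, psia
volume_cc = 250.0   # internal volume of the test part, cc (assumed)
p_start = 30.0      # first gage-pressure reading, psi
p_end = 29.8        # second reading, taken after the decay interval
dt_min = 0.5        # interval between the two measurements, minutes

dP = p_start - p_end
leak_rate_sccm = volume_cc * (dP / P_ATM) / dt_min
print(f"leak rate: {leak_rate_sccm:.2f} sccm")
```

Note that the calculation requires two pressure readings separated in time, which is why temperature drift and other ambient changes during the interval degrade its accuracy.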

The mass-flow method also involves pressurizing the test part (see Figure 3), but any leakage is naturally compensated for by air flowing into it. This flow can come from either a reference volume reservoir, pressurized along with the test part, or an air-supply line controlled by a regulator. In either case, the amount of air that enters the test part to replace the leakage flow is measured directly in standard cubic centimeters per minute.

In the past, measuring leak rates using the mass-flow method was considered slower and less reliable than the pressure-decay method. But improved sensors, coupled with microprocessor-based electronics and control reservoirs, have dramatically improved the relative performance of mass-flow systems. These reservoirs provide a pressure source more stable than the conventional supply-line regulators used previously, allowing the test to be isolated from the air-supply line. Automated testing stations employing the mass-flow technique can now provide accurate, reliable, and rapid leak detection in the most challenging production line environments.

Improved sensors have also helped make the mass-flow technique better suited to process-control applications. Mass-flow sensing employs the principle of heat transfer, in which leakage flow is directed across a heated element, which transfers some of its heat to the flowing gas. Temperature-sensitive resistors measure the temperature of both the incoming and outgoing flow, and the temperature transducer bridge is balanced when both resistors are exposed to the same temperature. When the flow crossing one resistor is hotter than the flow crossing the other, the bridge becomes unbalanced. The resulting output voltage is proportional to mass flow, which provides the measurement of the leakage rate.
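Because the bridge's output voltage is proportional to mass flow, converting a reading to a leak rate reduces to applying a single calibration factor. The sketch below assumes a hypothetical factor obtained by calibrating against a known leak standard:

```python
# Sketch of converting a thermal-bridge output voltage to a leak rate.
# The sensor output is proportional to mass flow, so one calibration
# factor (hypothetical here) maps volts to sccm.

CAL_SCCM_PER_VOLT = 2.5  # assumed, from calibration against a known leak

def leak_rate(bridge_voltage: float) -> float:
    """Return leak rate in sccm for a given bridge output voltage."""
    return CAL_SCCM_PER_VOLT * bridge_voltage

# A balanced bridge (no flow) reads 0 V; an unbalanced bridge reads a
# voltage proportional to the leakage flow crossing the heated element.
print(f"{leak_rate(0.0):.2f} sccm")   # prints "0.00 sccm"
print(f"{leak_rate(0.12):.2f} sccm")  # prints "0.30 sccm"
```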


One reason mass-flow systems are faster than pressure-decay systems is that the former involve one measurement, whereas the latter require the pressure of the test part to be measured twice. In pressure-decay systems, a sufficient amount of time has to elapse between the two measurements, whereas the single-point measurement in mass-flow systems typically takes less than one second.

Measuring the pressure of the test part twice also doubles the opportunity for errors, which will produce equivalent or greater errors in the subsequent leak-rate calculations. Because of external variables, the probability of measurement error increases along with the length of the interval between measurements.

In addition to being faster and less prone to measurement error, mass-flow sensing can provide highly accurate leak readings over a much wider range of leak-to-volume ratios and testing conditions than differential-pressure systems, at roughly the same cost. Mass-flow systems are particularly well suited to the rapid measurement of small leaks and leaks in cavities with volumes of 20 L or more. With computerized control, they can also test different passages within the same part simultaneously.

In light of the improvements that have been made, the mass-flow leak sensor can now be considered a precision gage. In typical applications of computerized mass-flow leak-testing systems, repeatability and reproducibility studies consistently yield less than 10% variation.
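A figure like the "less than 10% variation" cited above comes from repeatability and reproducibility studies. As a simplified illustration only (not a full Gage R&R study, and with hypothetical readings), repeated measurements of the same master part can be expressed as a percentage of the mean:

```python
from statistics import mean, stdev

# Simplified repeatability check (not a full Gage R&R study): repeated
# readings of one master part on one station, with variation expressed
# as a percentage of the mean reading. Values are hypothetical.
readings_sccm = [0.50, 0.51, 0.49, 0.50, 0.52, 0.49, 0.50, 0.51]

variation_pct = 100.0 * stdev(readings_sccm) / mean(readings_sccm)
print(f"variation: {variation_pct:.1f}% of mean")
```

A full study would also vary operators and fixtures to separate repeatability from reproducibility.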


For some applications, helium-based test methods offer advantages over both wet-bubble and dry-air techniques, although their higher cost usually restricts them to specialized applications that require their added sensitivity. These methods use mass spectrometers as detectors. For example, while dry-air leak measurement techniques can detect leak rates as low as 0.001 sccs under ideal circumstances, some critical devices (e.g., valves, tubing, fittings, and instrument housings developed for use with hazardous gases or liquids) must be tested for leak rates that are much smaller still. For these applications, automated testing systems using mass spectrometers can measure leakage as slow as 0.000001 sccs, while providing the same sort of data generation now available with other dry-air leak-testing techniques.

Helium testing can detect slower leak rates for two reasons. First, the helium mass spectrometer is much more sensitive than air-leak sensors. Second, the relatively small helium molecule can pass through pores that would block the larger molecules of most other air-component gases (e.g., oxygen and nitrogen). In situations where helium-based leak testing can detect pores small enough to block the passage of microbes, this technique can be used to help audit packaging processes or for production testing of package components such as formed plastic trays or covers. Materials that absorb helium do not lend themselves to this technique, however.

The most common helium-based test method involves pressurizing the test part with helium or a helium-air mixture within a test chamber (see Figure 4). The chamber is then evacuated, drawing helium through any leakage points in the test part into the surrounding vacuum. A mass spectrometer then samples the vacuum chamber and ionizes any helium present. Even trace amounts of the element are readily detected.

Helium techniques are also particularly useful for testing sealed enclosures for leakage, because the air inside the enclosure cannot be pressurized or measured directly by traditional means. In helium testing, the sealed enclosure can be placed in a pressurized, helium-filled environment (a process referred to as bombing the test item), removed, and then placed in a vacuum. If any helium penetrates the enclosure while it is being bombed, it will be drawn out under vacuum conditions and detected as leakage.


Recent advances in dry-air leak measurement technology now make it possible to use leak-rate readings as a reliable quantitative indicator of product quality and process control. While helium-based methods using mass spectrometry can detect smaller leak points and are less susceptible to variations in temperature, they are more expensive than other methods, and are therefore usually restricted to specialized device applications. Pressure-decay methods, including the differential-pressure technique, provide a feasible alternative when the economics of production demand lower-cost testing. Mass-flow systems offer a sort of middle ground between the two, capable of providing rapid, highly accurate readings over the widest range of testing conditions and leak-to-volume ratios. They represent the best choice when device manufacturers need true volumetric measurements of actual part leakage or shorter test cycles, or both.

Jacques Hoffmann is president of InterTech Development Co. (Skokie, IL).
