Solving the Systems Interconnectivity Problem

Quality and operations should focus on eliminating friction from their decision-making process by transitioning clerical tasks humans perform to computers.

June 1, 2023


Jake Stowe

An essential feature of GxP industries is the productive tension between operations and quality. Operations (development in the case of medtech) pulls in the direction of action and constraints. Quality pulls in the direction of deliberation and ideals. In a high-functioning organization, the groups maintain a tenuous balance, but there is inevitable conflict by design.

I was first exposed to this tension as a major non-conformance investigator at a massive biopharmaceutical manufacturing plant. My team investigated, corrected, and reported on the thorniest and most subtle failures of a complex quality management system. The team was highly skilled, consisting of PhDs and multi-decade veterans of pharmaceutical manufacturing. While I had no experience in GxP when I joined the team, I had a keen eye for operations improvement, honed by a career in construction project management. 

The first step of improving any process is to thoroughly understand it. So I spent months asking questions just to be able to follow even simple team conversations. As I gained proficiency, I had two realizations. First, I developed a deep appreciation for the power and structure of GxP decision-making, including the productive tension. It is the only way to make safety-critical products. Second, I saw that much of what is often described as productive tension is actually just waste and inefficiency. The source of this inefficiency is what I’d like to call the systems interconnectivity problem.

What keyed me into the interconnectivity problem was a chance conversation with one of my mentors on the team, J.  One morning when I approached his desk to ask him a question, I ended up watching him work.     

Staring into his laptop, J. was scrolling through a multi-thousand-page PDF. He would locate a line of data, carefully center his cursor, then highlight and copy-paste the data into a spreadsheet. He repeated this same process over and over.

I couldn't help myself.

"J, what are you doing?"

He looked up at me, slightly annoyed. "I need to do a cross-batch data analysis, and I’m getting the data into a spreadsheet. This is one of the batch records." He turned back to the screen.

"How many batches are you analyzing?"

"Like a hundred."

I did some quick math and realized he was going to have to sift through hundreds of thousands of pages of data.

"How long will that take?"

"Maybe 40, 50 hours? But I won’t do it all at once. I’ve got a couple other investigations I’m working on. It usually takes me four to six weeks to finish."

"Usually? You do this often?  Is this critical to the investigation?"

He turned back to me and laughed. "I can't do anything else on it until this is done. Now will you leave me alone for a sec?"

It's hard to overstate how deeply this conversation affected me. J. was a biology PhD who had worked at the plant for years. His job was incredibly important, sensitive, and urgent. And he had just casually told me he was going to spend a week copy-pasting from PDFs into a spreadsheet. This was a month-long critical-path activity in one of his investigations.

Try as I might, I could not move on from this incident. I lost sleep over it. I obsessed about fixing what seemed to be an insane process. After a few days of searching, I stumbled on an obscure way to query the batch record database programmatically. I asked J. for a list of the batches and the line items in the records he needed for his analysis. Within an hour I emailed him a spreadsheet with all of it. As I approached his desk, I saw him staring at the spreadsheet in amazement. Like it was magic.

One of the most frequent sources of conflict between operations and quality is time: the time it takes to make decisions. Organizations run on decisions. Each day lost to an unmade decision cannot be regained. Each decision made incorrectly has consequences.

You can broadly sort the work that leads to a decision into two categories: 

  • Deliberation: The analysis, debate, and reflection that lead to an understanding of the implications of a decision. Deliberation is valuable.

  • Friction: Data accumulation, alignment, transmission, transformation, and presentation. Friction exists wherever a human is doing work that a computer can do orders of magnitude faster and better. Friction is inherently waste.

Now let’s return to my colleague, J. Would a week of mindless manual work add value to the decision that was to be made? I don't think anyone could possibly answer yes to this question. The spreadsheet was not only a waste of J.'s time as an experienced professional balancing multiple priorities; it was also a waste of the organization's time. Important decisions depended on that analysis and were delayed by a manual process that should have been completed in seconds rather than weeks. And even though my hack at solving J.’s issue worked at the plant, it wouldn’t scale.

If you accept the following premises, the argument is clear:

  1. Time to decision is a key — though by no means the only — area of conflict between quality and operations. 

  2. The time to make a decision can be broken down into deliberation and friction. One valuable, the other not.

  3. Friction makes up the vast majority of the time it takes to make decisions.

From there, the conclusion is obvious: Quality and operations should focus on eliminating friction from their decision-making process. But how? If we define friction as existing wherever a human is doing work that a computer can do orders of magnitude faster and better, then we can eliminate friction by transitioning those clerical tasks from humans to computers.

But don’t we already use computers? Indeed, working at a company of any size in this age requires computer literacy. If you think about the software you use daily, you might find that its primary function is to capture data. When I worked as a manager in pharmaceutical manufacturing, my employees had to use several major software applications to do their jobs. These systems captured information about samples, batch record data, or equipment maintenance. My impression was that the hardest part of their job was navigating all of these different systems, and the hours spent doing so ultimately decreased their productivity.

Note that in my anecdote about J., his problem was not access to data but extracting that data efficiently into a separate system. And that brings me back to my original thesis: only by interconnecting disparate systems so data can be moved efficiently will we eliminate this friction. Hence, the systems interconnectivity problem.

The following questions may help you gauge the relevance and urgency of the interconnectivity problem for your own organization:

  1. What is the total inventory of software systems that people on my team(s) must use for their daily work?

  2. How often do they use each of these systems?

  3. How often does data need to be combined from these systems? How is that combination accomplished today?

  4. How often do people on my team copy-paste information from one system into another system or document?

  5. What types of decisions do these activities gate, and how does that impact the time it takes to make them?

I think the answers will surprise you. 

Once you’ve assessed the interconnectivity issues within your organization, the natural next step is to attempt to resolve them. Here are three possible approaches:

Rely on programmatic queries: For the most urgent problems, where friction is limiting the organization in a critical process, see if there is a solution like the one I developed with J. Software built on large databases often has standardized programmatic ways to filter and sort large datasets. This method does not entirely eliminate manual work; it just makes it far more efficient. Speak with your IT department or the system owner to determine whether there is a better way to accomplish the routine tasks your team is performing. Using new methods may change the way the data is verified prior to making a GxP decision, so consulting the quality team first is also a must.
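To make that concrete, here is a minimal sketch of what such a query might look like, written in Python and assuming a read-only copy of the batch record database is available. Every database, table, and column name below is hypothetical; your system’s schema and approved access method will differ, and the output of any script like this would need quality review before supporting a GxP decision.

```python
# Minimal sketch only: pull specific line items for a list of batches
# from a (hypothetical) read-only copy of the batch record database
# into a CSV file that opens directly in a spreadsheet. All table and
# column names are invented; substitute your system's actual schema.
import csv
import sqlite3  # stand-in for whatever database driver your system requires

BATCH_IDS = ["B-1001", "B-1002", "B-1003"]             # the ~100 batches under analysis
LINE_ITEMS = ["fill_weight", "final_ph", "yield_pct"]  # the line items needed

conn = sqlite3.connect("batch_records_readonly.db")    # hypothetical read-only copy

# Parameterized IN clauses: one "?" placeholder per batch ID and line item.
batch_slots = ",".join("?" * len(BATCH_IDS))
item_slots = ",".join("?" * len(LINE_ITEMS))
query = (
    "SELECT batch_id, item_name, item_value, recorded_at "
    "FROM batch_record_entries "
    f"WHERE batch_id IN ({batch_slots}) AND item_name IN ({item_slots}) "
    "ORDER BY batch_id, item_name"
)
rows = conn.execute(query, BATCH_IDS + LINE_ITEMS).fetchall()

# Write the result where a spreadsheet can pick it up.
with open("cross_batch_data.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["batch_id", "item_name", "item_value", "recorded_at"])
    writer.writerows(rows)

print(f"Wrote {len(rows)} rows covering {len(BATCH_IDS)} batches.")
```

A query along these lines turns weeks of copy-pasting into seconds of compute, which was the entire point of the fix I made for J.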

Connect systems through application programming interfaces (APIs): A second, more robust method is to link systems directly. APIs allow systems to pass data automatically, without a human in the loop. If your teams spend a lot of time copy-pasting data from one system to another, building a connection between those two systems could be extremely valuable. This is another place where IT and quality must be consulted, especially since a connection that supports GxP decisions will need to be validated.
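As an illustration of the idea, the sketch below pulls completed results from one system’s REST API and pushes them into another’s, again in Python. The endpoints, field names, and authentication are all invented for the example; a production integration would typically run as a validated, always-on service rather than an ad hoc script.

```python
# Minimal sketch only: mirror completed test results from a LIMS into a
# QMS over their REST APIs, with no human copy-paste in the loop. The
# URLs, fields, and token are placeholders; real systems will differ,
# and a connection supporting GxP decisions must be validated first.
import requests

LIMS_URL = "https://lims.example.com/api/v1/results?status=completed"
QMS_URL = "https://qms.example.com/api/v1/test-records"
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credentials

# 1. Pull newly completed results from the source system.
resp = requests.get(LIMS_URL, headers=HEADERS, timeout=30)
resp.raise_for_status()
results = resp.json()

# 2. Reshape each record into the destination system's schema and push it.
for r in results:
    record = {
        "sample_id": r["sample_id"],
        "analyte": r["analyte"],
        "value": r["value"],
        "units": r["units"],
        "source_system": "LIMS",  # preserve provenance for the audit trail
    }
    post = requests.post(QMS_URL, json=record, headers=HEADERS, timeout=30)
    post.raise_for_status()

print(f"Transferred {len(results)} records without manual entry.")
```

Note the design choice: the source system remains the system of record, and the connection only mirrors data into the destination, which helps keep the validation scope narrow.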

Adopt a connected lifecycle management layer: Finally, you could take a more holistic approach across your entire organization by considering a new class of product that overlays, but does not replace, all of your software systems. These connected lifecycle management solutions bridge disparate systems, allowing for real-time data access and ensuring data alignment. These tools can be a safer approach to delivering medical software because they eliminate error-prone manual copying and create data transparency across teams and groups.

In conclusion, any of these approaches, or some combination of them, can help bridge the inevitable tension between operations and quality. Like couples counseling, these strategies improve communication and build stronger relationships between teams. A decrease in friction leads to faster decision-making and accelerated time to market, which is good news for manufacturers and patients alike.

Jake Stowe is head of customer success with Ketryx. He is a former Amgen QA and manufacturing manager who spent time as a graduate research fellow at Amazon while studying systems engineering at MIT.    
