In biopharmaceutical manufacturing, the interactions between cells, nutrients, and reagents in culture determine product quality. The central challenge for process developers is modeling these complex relationships while accounting for variability in materials and other inputs.
So says Wei Xie, PhD, assistant professor of mechanical and industrial engineering at Northeastern University, who suggests a technique called uncertainty analysis—which tries to account for unknowns in predictive models—can yield better processes.
“Uncertainty analysis, studying the output variability due to the variability of inputs, can accelerate the development of digital twins for integrated biopharmaceutical manufacturing and facilitate bioprocess innovations,” says Xie. “Since biomanufacturing involves a system of biological systems, with hundreds of biological, physical, and chemical factors dynamically interacting at molecular, cellular, and system levels and impacting production outcomes, uncertainty analysis plays a fundamental role in supporting the FDA’s requirements on Quality-by-Design (QbD).”
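To make the idea concrete, the sketch below propagates variability in two illustrative inputs (media glucose concentration and culture pH) through a toy Monod-style yield model via Monte Carlo sampling. The model form and every parameter value here are assumptions chosen for illustration, not Xie's process model.

```python
# A minimal sketch of uncertainty analysis by Monte Carlo propagation.
# The growth model and parameter ranges below are illustrative
# assumptions, not the actual biomanufacturing process model.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 10_000

# Uncertain inputs: initial glucose (g/L) and culture pH.
glucose = rng.normal(loc=5.0, scale=0.5, size=n_samples)
ph = rng.normal(loc=7.0, scale=0.1, size=n_samples)

def titer(glucose, ph):
    """Toy Monod-style yield model: substrate-limited production,
    penalized by deviation from an optimal pH of 7.0."""
    monod = glucose / (glucose + 1.5)            # substrate limitation
    ph_penalty = np.exp(-((ph - 7.0) ** 2) / 0.05)
    return 2.0 * monod * ph_penalty              # product titer, g/L

y = titer(glucose, ph)
print(f"mean titer: {y.mean():.3f} g/L, std: {y.std():.3f} g/L")
```

The output standard deviation is exactly the "output variability due to the variability of inputs" that Xie describes; in practice the toy model would be replaced by a validated process model or digital twin.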
Xie also notes that uncertainty analysis could provide a science-based understanding that guides reliable discovery and manufacturing process development and ensures product quality consistency, especially for personalized drugs.
Modeling uncertainty
Yet to date, industry use of the approach has been minimal. Part of the reason for this is that, until recently, none of the techniques available were developed with drug production in mind, continues Xie.
“Good uncertainty analysis methods in the biopharmaceutical industry are currently lacking. Uncertainty analyses are built on, and limited by, the selection of process models used to quantify input-output relationships,” Xie tells GEN. “Existing biomanufacturing process models are typically divided into black-box data-based models and mechanistic models. Purely data-based models do not characterize causal interactions and interdependencies of inputs and outputs, such as critical process parameters (CPPs) and critical quality attributes (CQAs). Thus, they are less interpretable and require a large amount of data or experiments for process development.
“On the other hand, mechanistic models often ignore inherent process uncertainty. This limits their performance in terms of prediction reliability and mechanism learning.”
To address this, Xie and colleagues developed an uncertainty analysis method that combines bioprocess knowledge graph (KG) modeling with a Shapley value (SV)-based prediction risk analysis framework. The idea is to let drug makers understand how, when, and where components are interacting in culture.
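The hybrid-modeling idea can be sketched, under stated assumptions, as a mechanistic core corrected by a data-driven residual, with an explicit noise term standing in for inherent process uncertainty. The exponential-growth core, the residual function, and all numbers below are hypothetical stand-ins, not the published KG hybrid model.

```python
# A hedged sketch of a hybrid model: a mechanistic core plus a
# data-driven residual correction and an explicit noise term for
# inherent process uncertainty. All values here are assumptions.
import numpy as np

rng = np.random.default_rng(2)

def mechanistic_growth(x0, mu, hours):
    """Deterministic exponential-growth core (the mechanistic part)."""
    return x0 * np.exp(mu * hours)

def hybrid_predict(x0, mu, hours, residual_fn, noise_sd, n_draws=1000):
    """Mechanistic prediction corrected by a learned residual, with
    stochastic draws representing inherent batch-to-batch variation."""
    base = mechanistic_growth(x0, mu, hours)
    corrected = base + residual_fn(hours)        # data-driven correction
    return corrected + rng.normal(0.0, noise_sd, size=n_draws)

# Hypothetical residual learned from past batches (a stand-in for a
# fitted regression model).
residual_fn = lambda t: -0.02 * t

draws = hybrid_predict(x0=0.5, mu=0.04, hours=72,
                       residual_fn=residual_fn, noise_sd=0.3)
print(f"predicted density: {draws.mean():.2f} ± {draws.std():.2f} (1e6 cells/mL)")
```

Keeping the stochastic term explicit, rather than discarding it as mechanistic models often do, is what lets downstream uncertainty analysis quantify prediction risk.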
“Basically, built in conjunction with the KG hybrid model, which characterizes spatial-temporal causal interdependencies within end-to-end biomanufacturing processes, the SV-based prediction risk analysis provides an interpretable and reliable assessment of the uncertainty contribution of each input, or a selected set of inputs, by applying a game-theoretic approach,” explains Xie.
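As a rough illustration of the game-theoretic attribution, the sketch below computes exact Shapley values over a two-input toy model, using the variance of the conditional mean as each coalition's value (so-called Shapley effects). It is not the KG-SV implementation itself; the model, the independence of the inputs, and the sample sizes are all assumptions.

```python
# A hedged sketch of Shapley-value attribution of output variance
# ("Shapley effects"). The two-input model is a stand-in; the actual
# KG-SV framework attributes uncertainty through a knowledge-graph
# hybrid model, which is not reproduced here.
import itertools
import math
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # Toy process model: output depends on both inputs and their interaction.
    return x[:, 0] + 2.0 * x[:, 1] + x[:, 0] * x[:, 1]

d = 2                      # number of uncertain inputs
n_outer, n_inner = 400, 400

def sample(n):
    return rng.normal(size=(n, d))

def coalition_value(subset):
    """Var(E[Y | X_subset]): output variance explained by the coalition."""
    if not subset:
        return 0.0
    outer = sample(n_outer)
    cond_means = np.empty(n_outer)
    for k in range(n_outer):
        inner = sample(n_inner)
        inner[:, list(subset)] = outer[k, list(subset)]  # fix the coalition
        cond_means[k] = model(inner).mean()
    return cond_means.var()

# Exact Shapley combination over all coalitions (feasible for small d).
phi = np.zeros(d)
for i in range(d):
    others = [j for j in range(d) if j != i]
    for r in range(len(others) + 1):
        for s in itertools.combinations(others, r):
            w = (math.factorial(len(s)) * math.factorial(d - len(s) - 1)
                 / math.factorial(d))
            phi[i] += w * (coalition_value(s + (i,)) - coalition_value(s))

print("uncertainty contribution per input:", phi / phi.sum())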
The system—known as the KG-SV framework—characterizes the spatial-temporal causal interdependencies of CPPs/CQAs and, according to Xie, can reduce the “design space” of experiments during process development.
“Our KG hybrid modeling and interpretable uncertainty analysis can overcome the limits of current approaches and improve industry practices. It leads to benefits such as the identification of critical input factors—media compositions, enzymes, and pH—that affect yield and product quality, and support for root-cause analysis.
“It also provides guidance on process monitoring and QC/QA testing to facilitate real-time release, as well as guidance about which type of data collection will be most informative, which in turn will accelerate the development of flexible, intensified, reliable, automated production processes.”
Xie and colleagues at Northeastern are developing process analytical technology (PAT) software and an online training platform designed to illustrate the benefits of the approach.