The ability to gather process data on the production line is reshaping biopharmaceutical manufacturing, allowing drug companies to make more consistent, higher-quality medicines more efficiently and at lower cost.

The challenge now is managing, and making use of, the masses of information generated during each production run, according to the authors of new research, who warn that companies lacking effective data infrastructure risk falling behind.

Sophia Bongard, PhD, a bioprocess research scientist at Roche Diagnostics in Germany, told GEN that advances such as process intensification and continuous processing are driving the shift: “The biopharmaceutical industry needs to rethink data acquisition and analytics due to the increasing complexity and volume of data generated in bioprocessing.”

“These approaches lead to an increase in the complexity and volume of data generated, as modern bioprocesses can produce millions of data points from online sensors, offline analytics, and calculated dimensions,” she continues. “The rise of automated processes and analytical capabilities, such as continuous process analytical technologies (PAT), necessitates real-time monitoring and automated control of bioreactors.”
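To put those volumes in perspective, a back-of-envelope calculation shows how quickly online readings accumulate over a single run. The sensor count, sampling rate, and run length below are illustrative assumptions, not figures from Bongard or the research:

```python
# Rough estimate of online sensor data from one bioprocess run.
# All parameters are illustrative assumptions.
ONLINE_SENSORS = 30        # pH, DO, temperature, off-gas, capacitance, ...
SAMPLING_RATE_HZ = 1.0     # one reading per sensor per second
RUN_DAYS = 14              # a typical fed-batch duration

seconds_per_run = RUN_DAYS * 24 * 60 * 60
online_points = int(ONLINE_SENSORS * SAMPLING_RATE_HZ * seconds_per_run)
print(f"Online readings per run: {online_points:,}")   # 36,288,000
```

Even under these modest assumptions, a single run yields tens of millions of readings before offline analytics and calculated dimensions are added.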

Context

Current IT systems can struggle to store process data effectively, creating information silos, and this lack of interconnectivity can mean that valuable insights are missed, Bongard says.

“The need to contextualize data with metadata and ensure that data is findable, accessible, interoperable, and reusable [FAIR] is critical,” she points out. “Without comprehensive data strategies and easy data access, data scientists may spend most of their time on data wrangling and clean-up instead of analysis and insight generation.”
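What contextualizing data with metadata can look like in practice: below is a minimal, hypothetical record schema (the field names and values are invented for illustration) that keeps each measurement findable, interoperable, and reusable rather than leaving it an orphaned number:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class Measurement:
    """One process measurement plus the metadata that makes it FAIR."""
    batch_id: str        # findable: unique identifier for the run
    parameter: str       # interoperable: controlled-vocabulary name
    value: float
    unit: str            # interoperable: unit stated, never implied
    timestamp: datetime
    sensor_id: str       # reusable: provenance of the reading
    method: str          # reusable: how the value was obtained

reading = Measurement(
    batch_id="RUN-2024-0042",
    parameter="dissolved_oxygen",
    value=41.7,
    unit="percent_air_saturation",
    timestamp=datetime(2024, 3, 5, 14, 30, tzinfo=timezone.utc),
    sensor_id="DO-probe-03",
    method="optical_in_line",
)
print(asdict(reading))
```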

The lack of standardization is another challenge, according to Bongard, who says the use of different data formats makes integrating information into a cohesive process model harder than it needs to be.
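The problem is easy to see with a toy example. Here, a process historian and a LIMS (both hypothetical; every field name and unit below is invented) report the same temperature in incompatible shapes, and a small sketch maps both into one schema:

```python
from datetime import datetime

# The same quantity, reported in two incompatible shapes and units:
historian_row = {"tag": "BR01.TEMP", "val": 310.15, "unit": "K", "ts": 1709647800}
lims_row = {"analyte": "temperature", "result": "37.0", "uom": "degC",
            "sampled_at": "2024-03-05T14:10:00+00:00"}

def from_historian(row):
    return {"parameter": "temperature", "value": round(row["val"] - 273.15, 2),
            "unit": "degC", "timestamp": row["ts"]}

def from_lims(row):
    ts = datetime.fromisoformat(row["sampled_at"]).timestamp()
    return {"parameter": row["analyte"], "value": float(row["result"]),
            "unit": row["uom"], "timestamp": ts}

unified = [from_historian(historian_row), from_lims(lims_row)]
print(unified)
```

Multiply this by dozens of instruments and vendors, and the appeal of a shared schema is obvious.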

“To address these challenges, the biopharmaceutical industry must invest in data engineering capabilities and develop a FAIR end-to-end data platform that can handle the complexity and volume of data generated,” Bongard says. “This involves integrating various data sources, rapidly prototyping visualizations, analysis, and machine learning solutions, and facilitating the integration of third-party tools and technologies.”

In addition to facilitating ease of data management and accessibility, the ideal IT infrastructure should also integrate elements of automation and artificial intelligence.

“AI algorithms and machine learning models are revolutionizing the process of identifying and developing new therapeutics, ultimately speeding up the journey from bench to patient,” explains Bongard. “In bioprocess development, machine learning can help in optimizing processes and predicting outcomes, leading to more efficient and effective production methods.”
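A minimal sketch of the kind of outcome prediction Bongard describes, using scikit-learn on synthetic data (the parameters, target, and their relationships are invented for illustration, not drawn from any real process):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Synthetic stand-in for historical batches: pH, DO (%), temperature (°C),
# with final product titer (g/L) as the outcome to predict.
n_runs = 200
X = rng.uniform([6.8, 30.0, 35.5], [7.2, 60.0, 37.5], size=(n_runs, 3))
titer = (2.0 + 0.5 * X[:, 1] / 60.0
         - 3.0 * np.abs(X[:, 2] - 36.5)
         + rng.normal(0.0, 0.1, n_runs))

X_train, X_test, y_train, y_test = train_test_split(X, titer, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"Held-out R^2: {r2_score(y_test, model.predict(X_test)):.2f}")
```

In practice the features would come from the contextualized, harmonized data described above, which is exactly why the underlying data infrastructure matters.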

Automation, on the other hand, is crucial for managing the increased complexity and volume of data generated by modern bioprocesses.

“Automated processes and analytical capabilities, such as continuous PAT, are essential for real-time in-line monitoring of important process parameters and quality attributes,” says Bongard. “Automation paves the way for automated control and feeding of bioreactors, which is vital for maintaining the consistency and quality of biopharmaceutical products.

“Together, AI and automation enhance the ability to process and analyze large datasets, improve decision-making, and increase the overall efficiency and productivity of bioprocessing operations. They are integral to the digital transformation of biopharmaceutical manufacturing processes, optimizing efficiency and quality.”
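As a rough illustration of the automated feeding Bongard describes, here is a minimal proportional control loop. The sensor and pump functions are stand-ins, and real bioreactor control runs on validated systems, not a script like this:

```python
import random

SETPOINT_G_PER_L = 2.0     # target glucose concentration (illustrative)
KP = 0.8                   # proportional gain, tuned per process in practice
BASE_RATE_ML_PER_H = 10.0  # nominal feed rate (illustrative)

def read_glucose():
    """Stand-in for an in-line PAT measurement (e.g., Raman-based)."""
    return SETPOINT_G_PER_L + random.uniform(-0.5, 0.5)

def set_feed_rate(ml_per_h):
    """Stand-in for the actuator call to the bioreactor control system."""
    print(f"feed pump -> {ml_per_h:.1f} mL/h")

for _ in range(5):  # one pass per control interval
    error = SETPOINT_G_PER_L - read_glucose()   # below setpoint -> feed more
    set_feed_rate(max(0.0, BASE_RATE_ML_PER_H * (1.0 + KP * error)))
```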
