Bioprocessing technologies designed for high-volume products are not well suited to small-batch personalized medicines, according to researchers who say firms making patient-specific medicines need better purification systems.
Today, most purification methods are designed to separate biopharmaceuticals from the cellular debris, reagents, and nutrients present in the process stream. The aim is to pass large volumes of liquid through filtration systems as efficiently as possible.
However, for personalized medicines produced in smaller quantities, the challenges are different, according to Dong-Pyo Kim, PhD, director of the Center for Intelligent Microprocess of Pharmaceutical Synthesis at Pohang University of Science and Technology (POSTECH) in South Korea.
“The personalized medicines industry needs new, super-efficient purification technologies because it handles complex biological materials like nucleic acids, proteins, and cells that must be extremely pure for safety and effectiveness. A small impurity can result in significant immunological side effects.”
And there are other motivations for the development of more efficient downstream technologies, with cost reduction being the obvious example.
Kim tells GEN: “The purification process represents approximately 60–90% of the total cost involved in producing biotherapeutics. New approaches must focus on boosting purity, reducing costs, offering flexibility in production scale, and fitting smoothly with other manufacturing steps.
“Current technologies just aren’t flexible or precise enough for small, individualized batches, making it tough to keep consistent quality and meet regulations, which also pushes up production costs,” he says.
Microfluidics
Rather than relying on current technologies, Kim and colleagues argue in a new study, personalized medicine developers should adopt systems designed to process smaller volumes, so-called “microfluidics,” for downstream processing.
“Microfluidic technologies are a game-changer for purifying personalized medicines, specifically for the small patient populations suffering from genetic and rare disorders, because they offer high precision and scalability. They can handle small, customized batches efficiently, which is perfect for personalized treatments.
“These systems cut costs by using smaller quantities of reagents and minimizing waste. Plus, they provide high-resolution separation to distinguish closely related biomolecules and achieve great recovery rates, making the most of valuable therapeutic agents,” Kim says.
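The arithmetic behind that recovery argument is straightforward: losses compound across every purification step, so a shorter train with higher per-step recovery keeps far more of a scarce therapeutic agent. The sketch below illustrates the point with purely hypothetical step counts and recovery figures; they are not taken from Kim's study or any specific process.

```python
# Minimal sketch: how per-step recovery compounds across a purification train.
# All step counts and recovery figures are hypothetical, for illustration only.

def cumulative_recovery(step_recoveries):
    """Multiply per-step recoveries to get the overall yield of the train."""
    total = 1.0
    for recovery in step_recoveries:
        total *= recovery
    return total

# A conventional multi-step train with modest per-step recovery (assumed values).
conventional = [0.85, 0.80, 0.90, 0.85]
# A shorter, higher-recovery microfluidic train (assumed values).
microfluidic = [0.95, 0.93]

print(f"Conventional train yield: {cumulative_recovery(conventional):.0%}")  # ~52%
print(f"Microfluidic train yield:  {cumulative_recovery(microfluidic):.0%}")  # ~88%
```

With these illustrative numbers, only about half of the product survives the four-step conventional train, versus nearly 90% for the two-step microfluidic one, a gap that matters when each batch serves a single patient.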
He also advocates combining microfluidics with automated, artificial intelligence (AI)-based monitoring, analytics, and modeling technologies.
“Combining microfluidic technologies with AI and automation systems brings a lot to the table, specifically for producing biotherapeutics and personalized medicines. AI can analyze real-time data from microfluidic processes to fine-tune things like flow rates and reaction conditions, making everything run smoother and more efficiently. It also uses past data to predict outcomes, helping us make better decisions, and keeping experiments on track.
“With automation in the mix, tasks like preparing samples and purifying substances become super precise and repeatable. This setup speeds up the whole development process by cutting down on manual work and analyzing data faster. It also keeps a close eye on quality, ensuring each batch meets high standards while managing costs better by optimizing how we use resources.”
Kim adds, “Overall, it’s a powerhouse combo that’s pushing the boundaries of biopharmaceutical processing.”
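To make the closed-loop idea concrete, here is a minimal sketch of the kind of feedback control Kim describes, in which a controller reads an in-line purity measurement and nudges a pump's flow-rate setpoint toward a target. The process model, sensor, gain, and limits are all assumed for illustration; a real system would talk to instrument APIs and use a validated model rather than this toy proportional rule.

```python
import random

# Minimal, self-contained sketch of closed-loop control for a microfluidic
# purification step. The process model, gain, and limits are hypothetical
# illustrations, not parameters from any system described in the article.

TARGET_PURITY = 0.99   # desired product purity at the chip outlet
GAIN = 2.0             # proportional gain, mL/min per unit of purity error (assumed)

def simulated_purity(flow_rate_ml_min):
    """Toy process model: slower flow gives longer residence time and higher purity."""
    base = 1.0 - 0.05 * flow_rate_ml_min      # purity falls as flow speeds up
    noise = random.gauss(0.0, 0.002)          # small in-line sensor noise
    return min(1.0, max(0.0, base + noise))

def control_loop(initial_rate=2.0, steps=50):
    """Nudge the flow-rate setpoint toward the purity target each cycle."""
    rate = initial_rate
    purity = simulated_purity(rate)
    for _ in range(steps):
        purity = simulated_purity(rate)
        error = TARGET_PURITY - purity
        # Slow the pump when purity is below target; clamp to a safe range.
        rate = min(5.0, max(0.1, rate - GAIN * error))
    return rate, purity

if __name__ == "__main__":
    final_rate, final_purity = control_loop()
    print(f"Settled flow rate: {final_rate:.2f} mL/min at purity {final_purity:.3f}")
```

Run as written, the loop settles the flow rate near the value at which the toy model reaches the purity target; in practice, the AI layer would replace the fixed proportional rule with a model trained on historical run data.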