Bioprocessing is a major growth industry within the life sciences. But why is this cell growth industry emerging so quickly?
The rise of large-molecule therapeutics, vaccines, and biosimilars has been rapid, and the FDA has further fueled the sector with its Quality by Design (QbD) requirements, backed by recent high-profile fines for major pharma batch failures. The momentum is now unstoppable.
But a “big data” problem is brewing in fermenters around the world. Bioprocess development, technology transfer, and QA/QC all produce mountains of process, regulatory, and scientific data. Yet many organizations still run outdated information systems, siloed from the rest of the business.
As the sector grows and faces challenges of diversity, scale, and price competition, it needs to manage its data better and integrate more effectively with its parent companies and customers. Better data management, plus more efficient process and data sharing, are key components of the sector’s successful expansion. This was a major focus of the recent BIO International Convention, where hundreds of contract companies, institutes, and suppliers filled the booth space traditionally occupied by biotech firms.
Integration Is Essential
When developing a process and undertaking experimental design, each step needs to be captured, compared against historical data, and integrated with other data to secure IP and provide process insight. Iterative recipe improvements, machine and statistical process control data, procedural sign-offs, and analysis of yields against historical information all require, and produce, data. Yet many of these processes remain paper-driven and laborious, even with a legacy LIMS in place.
The impact for a CMO seeking to attract business or a pharmaceutical company looking to keep the FDA happy is clear: go digital and get more efficient along the way.
According to Atrium Research, some vendors have developed simple ELN-like systems to digitize the paper process (the so-called “paper-to-glass” approach), but this fails to exploit the full benefits of computing and data management. When deciding how to update their information architecture, most manufacturing organizations in the market are looking for a number of key capabilities beyond LIMS:
- Capture and compute data and share workflows across multiple procedures through the process
- Start with a flexible process, then gradually lock down and validate
- Compare and analyze real-time data against historian data
- Develop, store, and share real-time insight into processes
- Enable B2B collaboration with granular security and scalability
These requirements speak to the efficiency of the process development activity itself. Yet the ability to bridge research and manufacturing silos is critical both to the organization and to first-class technology transfer.
Huge organizational efficiency can be gained by consuming data generated in the research phase and passing it on to other systems, such as Manufacturing Execution Systems, via a single multidisciplinary data platform. This integrated approach is new and has already attracted the attention of a majority of players in the sector. Their interest goes well beyond tactical access to, and dashboarding of, process data. It is about understanding and managing a complex data asset and using it effectively across the organization.
To date, the document-driven approach to bioprocess development has been hugely inefficient. Collecting data and putting it into a vault may tick a box on a worksheet, but it does not fundamentally improve the organization.
With secure, validated access to up-to-date experimental and process data, stage-gated, versioned reports can be generated quickly by query rather than by pulling data from multiple documents, as most biomanufacturers still must do today. This is where data management overtakes legacy document and paper management.
Rather than living in a document-heavy environment, fast-moving bioprocess organizations can now take advantage of a flexible, less expensive data-driven world with documents generated on demand. Into this data-centric environment, modular enterprise ELNs (such as IDBS’ E-WorkBook) provide a data management platform combining the flexibility needed for research with the ability to lock down and validate procedural execution workflows.
Integrated with design-of-experiments systems, such as those from Umetrics, and with internal historian systems, these platforms are able to capture, compute, compare, and secure process data. They then integrate upstream with enterprise resource planning (ERP) systems, such as SAP.
Companies such as Lonza have made process technology transfer a criterion when selecting enterprise data-management systems. As a result, they have already seen improved quality, faster design transfer, and greater operational effectiveness. In the bioprocess area, and particularly for CMOs, where costs are well understood, these savings can reach the bottom line quickly.