August 1, 2010 (Vol. 30, No. 14)

Advanced Tools Provide Better Solutions for Critical Aspect of Biologics Development

Scaling up biologics production from the bench to clinical and commercial scale is a central consideration of process design and development. Issues, challenges, and new technologies in scale-up were all discussed at “BioProcess International Europe,” held recently in Vienna. The meeting began with a case study of scale-up in action, when Russell Thirsk, Ph.D., vice president and global head of influenza cell culture at Novartis Vaccines and Diagnostics, described large-scale manufacture of H1N1 influenza vaccine.

Global capacity for vaccine manufacture allows for just 20–30% of the population to be vaccinated against seasonal influenza and would not be sufficient to deal with an H1N1 pandemic. The reality is that it would not be cost-effective to maintain such a large infrastructure merely waiting for an event that might not occur.

One solution, Dr. Thirsk noted, is to use an adjuvant. “This allows expansion of the amount of vaccine available by reducing the dose needed.” An adjuvant is also useful because it confers some cross-protection against flu strains the vaccine was not designed for, which is important in a pandemic scenario because of the likely antigenic drift of the virus.

Novartis has made three vaccines against H1N1—two of them manufactured in eggs and one in mammalian cell culture. One of the egg-manufactured vaccines includes an adjuvant and the other does not, while the cell-manufactured vaccine contains an adjuvant. It took Novartis four months from registration to supply of the product.

“This is an incredibly short time frame, but it is what is required for response to a pandemic,” Dr. Thirsk said. The cell-based vaccine was faster to produce than the egg-based vaccines. So far, Novartis has made 120 million doses of vaccine overall.

Dr. Thirsk noted that one egg can produce just one to two doses of vaccine, so the logistics of this route do not make sense in terms of scale-up. Furthermore, many strains of influenza do not grow in eggs, and relying on hens to produce eggs for vaccines against avian flu strains may invite contamination.

In the H1N1 project, Novartis used Madin-Darby canine kidney cell culture in a closed bioreactor with no animal components. The resulting vaccine, Celtura®, is approved in Germany and Switzerland, and Novartis is seeking approvals elsewhere. “Close cooperation with universities and other institutions was a key success factor in the speed of the project,” Dr. Thirsk said. “We think this is the way the influenza vaccine business is going to move in the future.”

Novartis has a big stake in this future, with its new facility at Holly Springs, NC, which produces flu vaccines with cell-based rather than egg-based culture. The company received nearly $500 million from the Department of Health and Human Services to help build the $1 billion, 130,000 sq. ft. plant, which Novartis believes is capable of producing 150 million doses of vaccine within six months of a pandemic declaration.

Process Design

Bo Kara, Ph.D., director of science and technology at MSD Biologics (formerly Avecia Biologics), which was recently acquired by Merck & Co., talked about how to build scalability into biologics process development through good design. He pointed out that this effort pays off: a robust manufacturing process can decrease the time from launch to peak sales by two years, which may represent revenue of up to $600 million.

One element in the company’s success in this respect is its investment in process platforms, where scale-up issues can be better understood and managed. To this end, MSD Biologics has a range of optimized platforms that are well understood and scalable. These are all off the shelf but include some custom optimization.

MSD Biologics is using advanced tools for process design, including predictive models such as neural nets, statistical experimental design, and other chemometric and multivariate techniques, to aid in acquiring process understanding. Also important is actual manufacturing experience. Carrying out a process at large scale builds up manufacturing understanding that, in combination with good small-scale data derived using statistically designed experiments, allows an effective control strategy to be developed. The result is good process robustness and consistency, which ultimately supports the establishment of a design space.
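
To illustrate the kind of multivariate modeling Dr. Kara described, here is a minimal sketch of a partial least squares (PLS) model relating a few process inputs to a quality response. The factors, data ranges, and number of latent components are illustrative assumptions, not details from the talk.

```python
# Minimal PLS sketch: relate hypothetical process inputs to a quality
# response. All variable names, ranges, and data are invented.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# Hypothetical DoE data: columns = pH, temperature (C), feed rate (mL/min)
X = rng.uniform([6.8, 35.0, 0.5], [7.2, 37.0, 1.5], size=(30, 3))
# Hypothetical response (e.g., titer) with some noise
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + 3.0 * X[:, 2] + rng.normal(0, 0.1, 30)

pls = PLSRegression(n_components=2)  # project 3 inputs onto 2 latent factors
pls.fit(X, y)
print("R^2 on training data:", pls.score(X, y))
```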

Dr. Kara listed useful tools for process characterization such as failure mode and effects analysis (FMEA), design of experiments (DoE), and lab models, as well as experimental studies at lab, pilot, and full scale. Risk-assessment tools such as FMEA are an important element in process characterization because they focus efforts where they are required. They can and should be used during the whole life cycle of the product, including preclinical, clinical, and commercial manufacture.
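
As a rough illustration of how an FMEA-style risk assessment focuses effort, the sketch below ranks failure modes by risk priority number (RPN), the product of severity, occurrence, and detection scores. The failure modes and scores are invented for illustration, not taken from the presentation.

```python
# FMEA-style risk ranking sketch: each failure mode gets severity (S),
# occurrence (O), and detection (D) scores (1-10), and a risk priority
# number RPN = S * O * D. All entries are hypothetical examples.
failure_modes = [
    ("pH probe drift",        8, 4, 3),
    ("Media lot variability", 6, 5, 5),
    ("Column overload",       7, 3, 2),
]
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN = {s * o * d}")
```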

Dr. Kara noted that during the product life cycle, risk assessments may be done for different reasons but with a common emphasis on reflecting current understanding and process robustness. He concluded that good design happens early on and looks at scalability, using well-understood platforms.



Managing Complex Datasets

Matt Osborne, Ph.D., cell culture lead at Eli Lilly, said the Kinsale, Ireland, plant is now expanding into biologics with the completion of a €300 million (about $369 million) investment in a facility using cell culture technology for the manufacture of antibody-based protein therapeutics.

He described work on managing complex datasets to derive an integrated analytical and process control strategy. His presentation outlined the practice of applying quality by design (QbD) principles and handling the complex datasets these methodologies produce.

An important aspect of this is risk assessment of critical quality attributes (CQAs), the qualities of a product that can affect the patient. Tools that have been found useful for this at Lilly include the cause-and-effect matrix, Ishikawa fishbone diagrams, and FMEA, all of which can map out what will have an impact on CQAs.
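
One simple way to picture how a cause-and-effect matrix maps parameters onto CQAs is the sketch below, which weights CQAs by importance, scores each process parameter for impact, and ranks parameters by weighted total. All names and numbers are illustrative assumptions, not Lilly's actual assessments.

```python
# Hypothetical cause-and-effect matrix: CQAs are weighted by importance,
# each process parameter is scored (0-9) for its impact on each CQA, and
# parameters are ranked by weighted total. Values are invented.
cqa_weights = {"aggregates": 10, "charge variants": 7, "HCP": 8}
impact_scores = {
    "elution pH":        {"aggregates": 9, "charge variants": 6, "HCP": 3},
    "load density":      {"aggregates": 4, "charge variants": 1, "HCP": 7},
    "wash conductivity": {"aggregates": 2, "charge variants": 3, "HCP": 8},
}
totals = {
    param: sum(cqa_weights[c] * s for c, s in scores.items())
    for param, scores in impact_scores.items()
}
for param, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{param}: {total}")
```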

“There is a huge amount of data from such studies and it is a challenge to integrate it all into something usable,” Dr. Osborne observed. To this end, the company has built a data warehouse and developed a strategy of building a meta model with all the data, allowing for factors such as type of bioreactor, scale of manufacture, and the site where manufacturing takes place. These models are subsequently used to inform design space verification experiments.
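
In spirit, a meta model of the kind described might look like the sketch below: a regression pooling runs across bioreactor type, scale, and site as factors. The dataset, factor levels, and model form are hypothetical assumptions, not Lilly's actual model.

```python
# Sketch of a "meta model" pooling runs across bioreactor type, scale,
# and manufacturing site. The data and factor levels are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame({
    "bioreactor": rng.choice(["stirred", "wave"], n),
    "scale_l":    rng.choice([5, 200, 2000], n),
    "site":       rng.choice(["A", "B"], n),
})
# Hypothetical response with a small site effect plus noise
df["titer"] = 3.0 + 0.2 * (df["site"] == "B") + rng.normal(0, 0.1, n)

# Categorical factors enter via C(); scale enters on a log axis
model = smf.ols("titer ~ C(bioreactor) + C(site) + np.log(scale_l)", data=df).fit()
print(model.params)
```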

One example of this approach involved a protein A unit operation, where a model was built using DoE data. This showed clearly which process parameters are related and the acceptable ranges (design space) for those parameters. For instance, protein A elution buffer concentration and elution buffer pH have a relationship in design space. Ultimately, everything that could have an impact on CQAs is included and mapped onto a “decision tree,” which reveals how the parameters are classified and how they can be used to inform the process control strategy. Dr. Osborne concluded that QbD leads to large and often disparate datasets, and judicious use of statistics is required to integrate and analyze them. “This is one of the biggest challenges we face.”
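
To make the design-space idea concrete, the sketch below fits a quadratic response surface to hypothetical DoE data for elution pH and buffer concentration, then flags which combinations are predicted to meet an assumed yield limit. The data, model form, parameter ranges, and 90% acceptance limit are all illustrative, not the actual Lilly results.

```python
# Design-space sketch: fit a quadratic response surface to hypothetical
# DoE data (elution pH, buffer concentration), then check which settings
# are predicted to satisfy an assumed yield acceptance limit.
import numpy as np

rng = np.random.default_rng(2)
ph   = rng.uniform(3.0, 4.0, 25)     # elution pH (hypothetical range)
conc = rng.uniform(25.0, 100.0, 25)  # buffer concentration, mM (hypothetical)
yield_pct = (95 - 20 * (ph - 3.5) ** 2 - 0.001 * (conc - 60) ** 2
             + rng.normal(0, 0.5, 25))

# Quadratic model with an interaction term, fitted by least squares
A = np.column_stack([np.ones_like(ph), ph, conc, ph * conc, ph**2, conc**2])
coef, *_ = np.linalg.lstsq(A, yield_pct, rcond=None)

def predicted_yield(p, c):
    return coef @ np.array([1.0, p, c, p * c, p**2, c**2])

# Grid check: which (pH, conc) pairs fall inside the acceptable region?
for p in np.linspace(3.0, 4.0, 5):
    for c in np.linspace(25, 100, 4):
        ok = predicted_yield(p, c) >= 90.0
        print(f"pH {p:.2f}, {c:5.1f} mM -> {'inside' if ok else 'outside'} design space")
```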

New technologies that can provide large-scale information at small scale are of great interest. Joey Studts, Ph.D., associate director of purification development at Boehringer Ingelheim, said that it is important to think commercially as early as possible in process development. Miniaturization and automation both play an especially important role in Boehringer’s strategy for developing specific, robust processes for the manufacture of monoclonal antibodies.

Dr. Studts described how the company’s RAPPTor®, a fully automated purification screening platform, contributes to process development. Developed with the help of Tecan, RAPPTor (rapid protein purification technology) enables 96 variables to be measured on 500 samples a day, allowing early identification of conditions for full-scale downstream processing. The platform is applied as early as possible. “The first chance we get with any material, we put it into RAPPTor,” Dr. Studts said. “It makes a real impact on downstream-processing decisions.”

Understanding the scalability issue is key to early process learning using automation. “We’ve tried to optimize protocols so they best reflect scale-up.” Various DoE strategies can be used in this context, namely 2-D, multilevel, and full factorial designs, with the last giving the most information.
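
Of the DoE strategies mentioned, a full factorial design is the simplest to enumerate: every combination of every factor level is run. The sketch below generates one for factors resembling those in the case study that follows; the factor names and levels are assumptions for illustration.

```python
# Full factorial DoE sketch: enumerate every combination of factor
# levels. Factor names and levels are hypothetical.
from itertools import product

factors = {
    "resin": ["A", "B", "C"],
    "pH":    [5.0, 5.5, 6.0],
    "elution_conductivity_mS": [10, 15, 20],
}
runs = list(product(*factors.values()))
print(f"{len(runs)} runs in the full factorial")  # 3 * 3 * 3 = 27
for run in runs[:3]:
    print(dict(zip(factors.keys(), run)))
```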

In one Boehringer case study, the researchers wanted to find the optimal conditions for a cation-exchange polishing step for a therapeutic antibody. They looked at three resin types, a pH range of 5 to 6, and both binding and elution conductivity. The readouts were yield, host cell protein content, and monomer yield. A clear picture of the impact of each variable was obtained using RAPPTor.

The sweet spot aimed for was a monomer yield of over 99.2% and a product yield of 85–100%. Conditions predicted by RAPPTor for achieving the sweet spot included a pH of 5.75, and these gave good results on a real column; thus, the DoE and column results were comparable. “This gave us a lot of process learning from the very beginning,” Dr. Studts said.
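
The sweet-spot screen can be expressed as a simple filter over model predictions, as in the sketch below, which applies the two acceptance criteria from the case study (monomer yield above 99.2%, product yield 85–100%). The prediction values themselves are invented for illustration.

```python
# Sweet-spot filter sketch: keep only conditions whose predicted outputs
# meet both acceptance criteria from the case study. The prediction
# rows below are hypothetical, not actual RAPPTor output.
predictions = [
    # (pH, monomer_yield_pct, product_yield_pct)
    (5.50, 99.0, 88.0),
    (5.75, 99.4, 91.0),
    (6.00, 99.3, 82.0),
]
sweet_spot = [
    (ph, mono, prod)
    for ph, mono, prod in predictions
    if mono > 99.2 and 85.0 <= prod <= 100.0
]
print(sweet_spot)  # only pH 5.75 satisfies both criteria here
```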

The second case study using RAPPTor also involved a therapeutic antibody. Here, conditions for scale-up were chosen on the basis of the most critical output parameter, and the process could readily be optimized for a 13-fold scale-up.

Using a QbD approach throughout development and scale-up made it possible to learn early what the design space looked like for each process step, Dr. Studts noted. “Efforts should be focused on design space and process understanding in scale-up,” he said. In summary, the aim is to eliminate waiting time by carrying out process optimization and process learning much earlier and at small scale using RAPPTor, and data from this screening has now been shown to be scalable.

Bioreactor System

A new tool that aids scale-up studies is ambr™, the microscale bioreactor system from The Automation Partnership (TAP). According to the company, ambr mimics the performance and characteristics of a classic larger-scale (5–10 L) bioreactor at the microscale (10–15 mL). ambr workstations are now available, offering parallel processing and evaluation of up to 48 microbioreactor experiments in an automated benchtop system.

“Unlike other systems, ambr has features of a large-scale bioreactor, with each ambr microbioreactor having an impeller and a sparge tube as well as pH and DO monitoring, and the system generates enough material to predict how each clone will perform,” Ian Ransome, TAP’s director of sales, explained. “Additionally, it is one of the only systems designed specifically for culturing mammalian cells.”

Researchers tested ambr extensively and showed that it provides data similar to that from 5 L bioreactors and is more representative than shake flasks for predicting how cells will perform in bioreactors, Ransome said.



Susan Aldridge, Ph.D. ([email protected]), is a freelance science and medical writer specializing in biotechnology, pharmaceuticals, chemistry, medicine, and health.
