September 1, 2007 (Vol. 27, No. 15)

Angelo DePalma, Ph.D., Writer, GEN

FDA’s PAT Has Not Brought About the Changes in Manufacturing That Were Envisioned

Over the last five years, process monitoring has gained new impetus. Inspired by the FDA’s Process Analytical Technology (PAT) initiative, the objective of process monitoring today is nothing short of real-time, in-line analytics.

According to the FDA’s vision, real-time process data should help companies design better pharmaceutical manufacturing processes, improve existing processes, and lead to greater operational flexibility once a process is approved.

“If you understand the design space of the process, it should be possible to file an application based on that design space,” says Eugene Johnston, who heads the North Carolina office of The Biologics Consulting Group (www.bcg-usa.com). Design space comprises those parameters, demonstrated through experiment, under which the process operates acceptably and within which critical quality attributes are unaffected. Under this approach, bioprocessors would not need to re-file regulatory documents provided the process does not stray outside the design space.
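To make the design-space idea concrete, here is a minimal sketch, in Python, of the kind of range check it implies. The parameter names and limits are hypothetical, invented for illustration only, and not drawn from any actual filing.

```python
# Minimal sketch of a design-space check (hypothetical parameters and ranges).
# A process stays "within the design space" as long as every critical
# parameter remains inside its experimentally justified range.

DESIGN_SPACE = {                      # hypothetical ranges, illustration only
    "temperature_C": (36.5, 37.5),
    "pH": (6.9, 7.3),
    "dissolved_oxygen_pct": (30.0, 60.0),
}

def within_design_space(readings: dict) -> bool:
    """Return True if every monitored parameter lies inside its allowed range."""
    return all(
        lo <= readings[name] <= hi
        for name, (lo, hi) in DESIGN_SPACE.items()
    )

print(within_design_space(
    {"temperature_C": 37.1, "pH": 7.0, "dissolved_oxygen_pct": 45.0}
))  # True: no re-filing would be needed under this reading of the guidance
```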

It all sounds great: process understanding, greater flexibility, reduced regulatory filings. In practice, however, biotech companies have been reluctant to file this way, preferring to stick with tried-and-true process limits. That is one reason why the adoption of real-time PAT has been slow industry-wide. And biotechnology companies are even more bogged down than their small-molecule counterparts. For once, nobody has been blaming regulators.

In part that is because bioprocessors are comfortable with defined limits. If process development were to specify a protein titer, rather than a fixed duration, as the endpoint for a fermentation, translating that process from development to manufacturing could create scheduling difficulties for subsequent process steps, particularly purification. “It is hard to define a cell culture process as lasting for between one and two weeks and expect purification to be ready when you are done,” Johnston states.
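A minimal sketch of the operational difference Johnston is describing, using hypothetical numbers: with a titer-based endpoint, the harvest day falls out of the data rather than the calendar, which is exactly what complicates downstream scheduling.

```python
# Sketch: a titer-based endpoint versus a fixed-duration endpoint
# (hypothetical target and sample data; one measurement per day).

TARGET_TITER_G_PER_L = 2.5   # hypothetical harvest criterion
MAX_RUN_DAYS = 14            # hypothetical upper bound on run length

def harvest_day(daily_titers):
    """Return the day the run ends: the first day the titer target is met,
    or MAX_RUN_DAYS if it never is. The harvest day is not known in advance."""
    for day, titer in enumerate(daily_titers, start=1):
        if titer >= TARGET_TITER_G_PER_L or day >= MAX_RUN_DAYS:
            return day
    return MAX_RUN_DAYS

print(harvest_day([0.4, 0.9, 1.5, 2.0, 2.4, 2.6]))  # 6: the data, not the calendar, ends the run
```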

In addition, there appears to be some concern, as Johnston explains, about a disconnect between quality assurance and development groups over which analytical methods require validation in a PAT environment, and to what extent. This, he says, underlies QA’s discomfort with fuzzy limits on process parameters.

“Also, it brings up situations where manufacturing personnel have to make decisions on the floor that they did not need to make in the past.” According to Johnston, companies need to overcome such internal obstacles before they can move forward with process monitoring, especially PAT.

Another roadblock relates to dealing with living organisms and their complex products. Not every protein is equally “PAT-able.” For example, applying PAT to a relatively homogeneous, nonglycosylated alpha interferon will be much easier than applying it to a highly heterogeneous protein. “Companies thinking of applying PAT for the first time will probably do well to look at a simple protein first.”

See No Evil

Originally promulgated in 2002 as part of the FDA’s GMPs for the 21st Century initiative, PAT promises greater control over pharmaceutical and biotech manufacturing, and ultimately, higher quality. According to the FDA’s 2005 PAT update, “The goal of PAT is to understand and control the manufacturing process, which is consistent with our current drug-quality system. Quality cannot be tested into products; it should be built-in or should be by design.”

PAT makes sense, which is why industries ranging from chemicals and materials to semiconductors and foods have adopted some form of in-line, in-process analytics. But despite what the pundits and editors write, regardless of how hard the FDA pushes, and no matter what the apparent benefits may be, new technologies all too often meet their Waterloo at the doorsteps of drug manufacturing plants. For every good reason for adopting PAT, there seem to be two for not doing so, or for implementing it in a watered-down form.

So despite an initial frenzy of activity, PAT seems no closer to transforming the manufacture of high-value drug products than when the FDA promulgated its landmark guidance five years ago. “PAT has not brought about the revolution that pushes pharmaceutical manufacturing into the 21st century,” comments Gregory Page, Ph.D., a life sciences practice leader at Deloitte & Touche.

Legacy processes, and the regulatory burden of revising and revalidating them, represent a major impediment in both time and cost. The cost and complexity of the redesign and resubmission that most PAT deployments would entail apparently outweigh any potential savings from fewer lost batches. The industry therefore remains divided on whether PAT makes sense for existing processes.

Another hurdle falls into the see-no-evil category. The sheer volume of data acquired during real-time monitoring and control presents bioprocessors with difficult questions. The first is determining which parameters, alone or in combination, are significant; the second is how to use the data in some intelligent manner.

“When you have data, there is an idea that you have to do something with it,” says Dr. Page. “Biomanufacturers already have more data than they know what to do with.” Defining acceptable process parameters is critical because of the need to investigate when a measurement goes out of specification.

“How do you handle little glitches that drift out of spec?” Although such aberrations usually are quality-neutral, biomanufacturers evidently feel their existence must be justified. This is why testing-in quality retains such appeal in biotech, and probably will for some time.
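As a rough illustration of the triage Dr. Page alludes to, the following sketch distinguishes momentary out-of-spec glitches from sustained excursions that would clearly warrant investigation. The specification limits, window size, and data stream are hypothetical.

```python
# Sketch: separate momentary out-of-spec "glitches" from sustained excursions
# in a stream of continuous measurements (hypothetical limits and window).

SPEC_LOW, SPEC_HIGH = 6.9, 7.3   # hypothetical pH specification
SUSTAINED_POINTS = 5             # consecutive out-of-spec points that trigger investigation

def classify_excursions(readings):
    """Yield (index, value, 'glitch' or 'sustained') for each out-of-spec point."""
    run = 0
    for i, x in enumerate(readings):
        if SPEC_LOW <= x <= SPEC_HIGH:
            run = 0
            continue
        run += 1
        yield i, x, "sustained" if run >= SUSTAINED_POINTS else "glitch"

stream = [7.0, 7.1, 7.35, 7.0, 7.4, 7.41, 7.42, 7.45, 7.44, 7.0]
for idx, val, kind in classify_excursions(stream):
    print(idx, val, kind)   # the lone 7.35 is a glitch; the later run becomes sustained
```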

Biotech Is Different

Data overload is just one reason why PAT is getting more play for new processes than for existing ones, for which quality attributes are already set. The transition from static data points to continuous monitoring has not been well thought out, Dr. Page notes. For existing processes, manufacturers who are used to dealing with a limited number of data points must now address and act on a continuum of data.

“PAT is a great concept, but I don’t think the benefits have been demonstrated in real time and for real-life processes.”

It has been said many times that all PAT seeks to do is provide the drug industry with the same analytics enjoyed by potato chip makers. That is something of a simplification.

Most process industries tend to manufacture commodity products: chemicals, polymers, fertilizer, and foods. Others, like semiconductors, are constrained by extremely high cost of goods and capital investment, as well as by micron-scale manufacturing.

The manufacture of commodity-type goods, being capacity-limited, absolutely requires in-line analytics to squeeze every bit of value from the process. One could argue that pharm/biotech should also be thinking along these lines, but that is a different subject. According to current thinking, commodity process industries can afford to take risks with product because batches can be fixed post-production by reblending, which is impossible in biotech.

Microchip production, on the other hand, is value-limited by the extremely tight tolerances of etching and wiring microscopic circuits. Unable to see the intricacies of their processes, chip-makers are forced to rely on process analytics.

Finally, recent drug-safety alarms may have created an atmosphere where manufacturers are more averse to accepting risk-based protocols through which process parameter discrepancies become acceptable, observes Dr. Page. “In this environment there is not much incentive to admit you can live with a higher level of process variation. It just doesn’t sound right.”

Another View

With PAT, much of the focus has switched to the broader concept of quality by design (QBD). “It is difficult to do PAT without QBD,” says Duncan Low, Ph.D., scientific executive director at Amgen (www.amgen.com). This suggests that biotech companies might continue to put off deploying PAT if they can meet their QBD objectives through more traditional means, for example through cell engineering.

Dr. Low has been working with the ASTM E55 Committee on Manufacture of Pharmaceutical Products, which is developing standards for PAT. He expects the standards will provide greater flexibility for, and help streamline, systems validation and verification. In particular, the standards will allow manufacturers to use more of their vendors’ data and information for systems verification—including for analytical instrumentation—rather than repeating verification and validation work that has already been documented.

“This amounts to a more risk-based approach to systems verification,” Dr. Low says. The FDA and EMEA are involved in developing this consensus standard, but it is not a guidance or directive. Nor does it absolve biomanufacturers from demonstrating the scientific validity of their tests and measurements, for example, whether an assay correctly reflects protein titer. “You still must qualify that as part of your assay.”

Professor Carl-Fredrik Mandenius, who heads the biotechnology division at Linköping University, dismisses the notion that bioprocessors have too much data, or that real-time data acquisition will overwhelm them.

“All relevant analytes are normally well-known and -characterized,” he says. Methods exist for measuring them, and many have already been correlated with product quality, either during manufacturing or at release. It is a simple matter, he says, to rank process attributes by relevance to product quality, and apply some algorithm for assessing them during processing.
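A minimal sketch of the kind of ranking Prof. Mandenius describes: correlate historical process attributes with a release-quality score and order them by the strength of that association. The batch records, attribute names, and scores below are invented for illustration.

```python
# Sketch: rank process attributes by correlation with a product-quality score.
# Batch records and attribute names are hypothetical.
from statistics import correlation  # available in Python 3.10+

batches = [  # one record per historical batch (hypothetical values)
    {"titer": 2.1, "viability": 0.95, "osmolality": 310, "quality": 0.92},
    {"titer": 2.4, "viability": 0.97, "osmolality": 305, "quality": 0.96},
    {"titer": 1.8, "viability": 0.90, "osmolality": 330, "quality": 0.85},
    {"titer": 2.3, "viability": 0.96, "osmolality": 300, "quality": 0.95},
]

quality = [b["quality"] for b in batches]
attributes = [k for k in batches[0] if k != "quality"]

ranked = sorted(
    attributes,
    key=lambda a: abs(correlation([b[a] for b in batches], quality)),
    reverse=True,
)
print(ranked)  # attributes most strongly associated with quality come first
```

In practice the ranking would draw on far more batches and attributes, and on whichever multivariate method a company prefers; the point is simply that the attributes and their quality correlations are already known quantities.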

The problem is that adapting proven analytical methods to real-time monitoring has been difficult. “Existing methods, for example LC, are tedious and laborious when applied to real-time monitoring.

“It will take substantial effort to modify and improve analytics to make them sensitive, rapid, or automated for a PAT setting. A goal of PAT would be not only to render existing methods PAT-able, but to improve on them.”

Dr. Mandenius’s group works on optical sensors, particularly near-infrared for smaller molecules (below 1,000 Da), and surface plasmon resonance for measuring protein interactions. Both techniques have the advantage of being near-instantaneous (for example, no waiting for a protein to elute from a gel or column). Their drawbacks include a much smaller user base and less familiarity among analytical scientists than, say, chromatography.

So is process monitoring, in the FDA-inspired PAT sense, doomed to remain in the 20th century? It may be best to think of PAT as a goal—something to shoot for but not to obsess over. “Clearly, PAT is not going to save us,” says Dr. Page. “Pharmaceutical plants will likely never have the continuous tweaking we see in chemical plants. For new processes, PAT is becoming more common, and under the right conditions and with the right base information it can help modernize bioprocesses by making them faster and less expensive to run. But it will take a lot of time to produce real-time data that either a manufacturing or quality person can use.”
