September 15, 2006 (Vol. 26, No. 16)

Gail Dutton

Solutions for Dealing with Variability and Bottlenecks Are Among the Benefits

Bioprocessing simulations are gaining traction and sophistication in the biotech industry. Long a standard in other industries, “simulations are becoming requirements for all major design, construction, and de-bottlenecking projects,” according to Demetri Petrides, Ph.D., CEO, Intelligen (www.intelligen.com).

Bioprocess simulations have generally been run by chemical engineers. Until recently, biologists and microbiologists haven’t been exposed to what bioprocess simulations can do, points out Ian Gosling, Ph.D., CEO, ChemSim (www.chemsim.com). For companies that run batch processes once every two weeks, there has been no overriding reason to run simulations. That is changing, though. “The FDA’s emphasis on process analytical technology is pushing people to look at statistical approaches in an effort to better understand their processes,” Dr. Gosling notes.

“The ‘modelization’ of large molecules is an emerging field,” according to Jean-Michel Pin, aspenONE product manager, Aspen Technology (www.aspentech.com). “The key to their success is flexibility and agility of tools to manage best practices regardless of the type of operations.”

AspenTech, for example, started with Aspen BatchPlus to model small molecule behavior, Pin says. The new aspenONE Golden Batch Profiler builds on that history to give manufacturers better tools for understanding the behavior of large protein molecules in their reactors.

The decades of data accumulated in other industries, however, either don’t exist for biotech or aren’t well-tested. Hence, the big challenge for users is accumulating the appropriate data, including processing times and quantities, states Dr. Gosling. “Not all the data is in the same place. Therefore, effort goes into the initial development without much payback until you’ve collected all the data and put it into the system.

“ChemSim runs a whole range of simulation applications, from the simplest to dynamic models,” Dr. Gosling says. He’s seeing a wave of interest in large-scale processes, including biorefineries, which involve capital equipment on a much larger scale than typical biotech projects. Other projects are geared to fuel cells and biofuels, as well as biotech.

Simulation in Scheduling

Biotech companies are becoming more involved in manufacturing and are learning that designing a process with manufacturing in mind from the outset saves time during production scale-up, facilitates regulatory compliance, and improves product quality. This is where bioprocess simulations step in with a chance at better asset management, manufacturing agility, and the ability to drive best practices, Pin notes.

The main application for bioprocess simulation in biotech involves scheduling batch processes. While it may seem mundane, it makes a difference in avoiding equipment conflicts and preventing bottlenecks, Dr. Gosling says.

“For scheduling, the general algorithms used in manufacturing apply to biotech,” he explains. To make software more comprehensive, developers are incorporating planning and logistics with engineering calculations.
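
To make the idea concrete, here is a minimal sketch, in Python, of the kind of equipment-conflict check a scheduling tool performs. It is a generic illustration, not the actual algorithm used by any of the vendors mentioned; the task names, equipment IDs, and times are hypothetical.

    from itertools import combinations

    def find_conflicts(tasks):
        """Return pairs of tasks that claim the same equipment at overlapping times."""
        conflicts = []
        for a, b in combinations(tasks, 2):
            if (a["equipment"] == b["equipment"]
                    and a["start"] < b["end"] and b["start"] < a["end"]):
                conflicts.append((a["name"], b["name"]))
        return conflicts

    # Hypothetical example: two batches share a buffer-prep tank, and batch B
    # starts before batch A has released it.
    tasks = [
        {"name": "Batch A buffer prep", "equipment": "BufferTank-1", "start": 0.0, "end": 6.0},
        {"name": "Batch B buffer prep", "equipment": "BufferTank-1", "start": 5.0, "end": 11.0},
    ]
    print(find_conflicts(tasks))  # [('Batch A buffer prep', 'Batch B buffer prep')]

A real scheduler goes further, shifting start times or reassigning equipment to resolve the conflicts it finds, but an overlap test like the one above is the core of the calculation.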

Intelligen offers SchedulePro, an integrated component for its SuperPro process engineering tool. “SuperPro models and analyzes a single bioprocess in detail, while SchedulePro is more of an industrial engineering tool that can model multiproduct facilities,” explains Dr. Petrides. It deals with variability caused by downtime, holidays, maintenance, and product changeovers, as well as the ongoing scheduling of manufacturing and R&D facilities.

SuperPro v7.0, which will be released this autumn, contains additional enhancements, but SchedulePro is most attuned to solving new types of problems for biotech users. “We have added models for secondary processing, like fill/finish,” says Charlie Siletti, Ph.D., director of planning and scheduling applications at Intelligen. Models for chromatography and membrane filtration have also been improved, and an easier method for handling vapor/liquid equilibrium is in development.

At Aspen, batch statistical process control (Batch SPC) was recently added to Aspen Q to help optimize production efficiency and quality in batch processes. “That is key to Six Sigma initiatives,” Pin says.
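
As a rough illustration of what batch SPC involves, the following Python sketch computes Shewhart-style control limits from per-batch results and flags batches that fall outside them. It is a deliberately simplified, hypothetical example (classical charts estimate sigma from moving ranges or subgroups), not AspenTech’s implementation.

    import statistics

    def control_limits(values, k=3.0):
        """Simplified control limits: mean +/- k sample standard deviations."""
        mean = statistics.mean(values)
        sd = statistics.stdev(values)
        return mean - k * sd, mean + k * sd

    batch_titers = [92.1, 93.4, 91.8, 92.7, 93.0, 88.2, 92.5]  # hypothetical per-batch values
    lcl, ucl = control_limits(batch_titers)
    flagged = [(i, t) for i, t in enumerate(batch_titers) if not lcl <= t <= ucl]
    print(f"LCL={lcl:.2f}, UCL={ucl:.2f}, flagged batches: {flagged}")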

Designing the Process

During process development, a common use for bioprocess simulation is for process mapping and cost analysis, Dr. Petrides says. “This facilitates communication among the various team members. Cost analysis identifies the expensive process steps and other cost items that have a major impact on the bottom line. Results of such analyses facilitate planning future R&D work.

“When a process is ready to move from R&D to manufacturing, such tools facilitate tech transfer and process fitting, like sizing, adding new equipment, and adjusting batch size,” adds Dr. Petrides. A good model is, in effect, an accurate description and representation of the process, he continues.
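
The cost analysis Dr. Petrides describes amounts to ranking process steps by their share of total batch cost. A minimal Python sketch, with entirely hypothetical step names and dollar figures:

    def cost_breakdown(step_costs):
        """Rank process steps by their share of total batch cost."""
        total = sum(step_costs.values())
        ranked = sorted(step_costs.items(), key=lambda kv: kv[1], reverse=True)
        return [(name, cost, 100.0 * cost / total) for name, cost in ranked]

    # Hypothetical per-batch costs (USD) for a simple antibody process.
    step_costs = {
        "Fermentation": 120_000,
        "Protein A capture": 310_000,
        "Polishing chromatography": 95_000,
        "Fill/finish": 40_000,
    }
    for name, cost, pct in cost_breakdown(step_costs):
        print(f"{name:26s} ${cost:>9,}  {pct:5.1f}%")

Even a table this crude shows where R&D effort would pay off; in this made-up example, the capture step dominates the batch cost.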

Variability and Bottlenecks

Additionally, biotech companies are reducing the cycle time of existing processes to meet increased market demand for successful products. “That often leads to conflicts or bottlenecks with buffer preparation and holding tanks, delivery lines, transfer panels, cleaning-in-place skids, and water for injection supply systems, for example,” according to Dr. Petrides. “Using simulation tools, engineering and operating companies can identify potential bottlenecks ahead of time and take corrective action.”

If you run a batch once every two weeks, those issues are not very important. “But, if you push, running batches every two and a half to three days, a little variability in the process time and validation time can add up and create conflicts,” Dr. Gosling says.

Dealing with such variability at Intelligen means allowing the software to work with external programs, so users may enhance their simulations with advanced modeling techniques like Monte Carlo simulation or optimization.
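
A Monte Carlo treatment of Dr. Gosling’s point is easy to sketch. The following Python snippet, using made-up duration distributions, estimates how often a batch is still occupying shared equipment when the next batch launches, at a two-week cadence versus a 2.5-day cadence:

    import random

    random.seed(1)  # reproducible demo

    def conflict_probability(launch_interval_h, n_trials=10_000):
        """Estimate the fraction of batches still occupying shared equipment
        when the next batch launches (hypothetical duration distributions)."""
        conflicts = 0
        for _ in range(n_trials):
            processing = random.gauss(52.0, 3.0)   # hours
            cleaning = random.gauss(6.0, 1.5)      # hours
            if processing + cleaning > launch_interval_h:
                conflicts += 1
        return conflicts / n_trials

    print(f"2-week cadence (336 h): {conflict_probability(336.0):.1%} of batches collide")
    print(f"2.5-day cadence (60 h): {conflict_probability(60.0):.1%} of batches collide")

With the numbers assumed above, the biweekly schedule essentially never collides, while the compressed schedule collides in roughly a quarter of runs; the exact figures are artifacts of the invented distributions, but this compounding effect of variability is exactly what such simulation tools are meant to expose.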

Automating the Process

Automation, which allows parametric studies to be run, is an important aspect of some of the latest simulations. Hence, building upon Aspen Technology’s history of advanced process controllers, Pin says the company’s next focus is on providing capabilities to model processes automatically. “We are looking at putting models online, so users can measure process variables as they move forward in time.”
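
A parametric study in this sense is just an automated sweep of a model input. As a trivial Python illustration (the response curve below is invented, not a real process model):

    def predicted_titer(temp_c):
        """Toy response model: titer (g/L) peaks near 36.5 deg C."""
        return 5.0 - 0.08 * (temp_c - 36.5) ** 2

    # Sweep the temperature setpoint and tabulate the model's prediction.
    for temp in (34.0, 35.0, 36.0, 36.5, 37.0, 38.0):
        print(f"T = {temp:4.1f} C  ->  predicted titer {predicted_titer(temp):.2f} g/L")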

Enhancing the Complexity

Historically, many simulations were designed as spreadsheets to answer a few questions about a specific process, limiting their predictive power. “People are asking more sophisticated questions nowadays,” according to Dr. Siletti. “We are seeing a need for models that include multiple processes, utilities, and buffer preparation.”

Product integration at Aspen takes two forms: integration between process engineering and production engineering, which facilitates knowledge transfer between these two distinct operational areas; and integration inside the plant, which collects process variables and automates them to drive production toward predictable behavior.

Software developers are responding by adding capabilities that extend to modeling multiproduct facilities under real-world time and scheduling constraints, as well as modeling uncertainty, and providing variability analysis.
