The aim of any formulation scientist is to create a product that is safe, effective, stable, and distinct from competitor drugs. It also helps if the product is as patient friendly as possible, to enhance compliance. These considerations are fundamental to formulation development regardless of whether the drug being produced is a small-molecule, chemically synthesized pill; a monoclonal antibody–based cancer medicine; or a cell therapy.
But there are other factors fueling innovation in the formulation space. According to experts in the field, these factors include the need for formulations that are straightforward and cost effective.
Contributing to efficiency
Turning a promising compound into a medicine is an expensive and time-consuming process. According to the Congressional Budget Office, estimates of the average R&D cost per new drug range from less than $1 billion to more than $2 billion. The agency also indicates that the development process often takes a decade or more.
Efforts to streamline drug development include the Cures Act, or the 21st Century Cures Act, which was signed into law in 2016. The legislation includes provisions that give the NIH additional resources. “The Cures Act,” the NIH states on its website, “implements measures to … alleviate administrative burdens that can prolong the start of clinical trials.”
Regulatory requirements have become more flexible in some respects, but streamlining development may depend on other parts of the process becoming more intensive, not less so. For example, formulation work is being compressed into earlier phases of development.
Traditionally, formulation was an afterthought, a task deferred until late-phase studies. But industry attitudes have changed, and many developers now begin final formulation development in parallel with the development of the simplified formats used when a drug is administered in early-phase trials.
Applying AI-based screening
In part, this is thanks to new screening methods, says Anastasia Holovchenko, PhD, a physicist at Janssen Pharmaceuticals. She says that large-capacity analysis technologies have accelerated formulation development.
“High-throughput analysis (HTA) enables analytical techniques for the characterization of thousands of samples per day in a given laboratory or on a given piece of equipment,” she elaborates. “Hence, it accelerates the process and allows developers to determine the best formulation for keeping the drug product stable for a longer time under stress conditions.”
Holovchenko is helping to take the process further. In partnership with colleagues in the Netherlands, she has developed an AI-based HTA approach for formulation screening. A key benefit of the approach, she emphasizes, is the way it minimizes the human element.
“Our program allows us to minimize human error in preanalytical screening by using computer vision instead of the human eye,” Holovchenko asserts. “It classifies images automatically within five seconds, and it gives a detailed output with an assessment of whether the sample is clean and suitable for HTA.
“This software allows us to prevent unnecessary measurements on contaminated samples by high-precision AI prescreening at the beginning of the HTA. In traditional screening methods, the human factor plays an important role in preanalytical formulation assessment. Foreign particles like dust or bubbles can be accidentally trapped in the sample and become a major source of sample contamination.
“Those particles can lead to inaccurate data generated during analytical testing. To overcome this issue, we developed an automated imaging program which enables AI computer vision for formulation screening purposes.”
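Holovchenko’s program is proprietary, but the underlying idea of automated prescreening can be illustrated generically. The sketch below is a hypothetical example, not the Janssen software; the function names, thresholds, and the downstream queue_for_hta call are assumptions. It uses simple computer vision to flag well images containing foreign particles such as dust or bubbles, so that only clean samples proceed to HTA.

```python
# Minimal prescreening sketch (hypothetical): flag sample-well images that
# contain foreign particles (e.g., dust or bubbles) before HTA measurement.
# This is classical computer vision, not the proprietary Janssen AI model.
import cv2
import numpy as np

def prescreen(image_path: str, min_area_px: int = 20, max_defects: int = 0) -> dict:
    """Return a pass/fail verdict plus the detected-particle count."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)

    # Suppress sensor noise, then highlight local deviations from the
    # smooth background expected of a clean, particle-free well.
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    background = cv2.medianBlur(blurred, 31)
    residual = cv2.absdiff(blurred, background)

    # Threshold the residual and count connected components large enough
    # to be dust, fibers, or bubbles rather than noise.
    _, mask = cv2.threshold(residual, 12, 255, cv2.THRESH_BINARY)
    n_labels, _, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    defects = sum(1 for i in range(1, n_labels)
                  if stats[i, cv2.CC_STAT_AREA] >= min_area_px)

    return {"clean": defects <= max_defects, "defect_count": defects}

# Example: only clean wells proceed to the HTA run.
# verdict = prescreen("well_A01.png")
# if verdict["clean"]:
#     queue_for_hta("well_A01.png")   # hypothetical downstream call
```

An AI-based prescreen would replace the hand-tuned thresholds with a trained classifier, but the workflow is the same: image in, pass/fail verdict out, before any measurement time is spent on a contaminated sample.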
Detecting contaminants
All medicines—whether they are small-molecule oral solid dosage forms or liquid preparations of protein-based therapeutics—can become contaminated by particles generated during the production process. Detecting these contaminants is vital to the production of safe and effective formulations.
“Many drugs contain particles that affect product stability and represent a critical risk for patient safety,” says Tobias Werk, CEO of Bionter, an analytical testing technology developer. “Indeed, those particles may cause blood vessel occlusions and induce immune responses to the administered drug or autoimmune responses. So, particle identification, quantification, and characterization are essential control strategies for ensuring quality and safety in biotherapeutic development.”
The tricky part is understanding at which point in the production process these particles are formed and predicting how corrective changes could impact the finished formulation. “Multiple factors can cause particle formation during manufacturing, storage, or transportation of liquid or reconstituted drugs,” Werk notes. “In complex protein-based therapeutics (such as monoclonal antibodies, recombinant proteins, fusion proteins, or antibody-drug conjugates), these particles can be of extrinsic or intrinsic origin.”
Traditional methods are iterative, with the focus on determining whether the contaminant is extrinsic (materials such as cellulose, glass, rubber, plastic, or metal generated by interactions between a drug and its packaging) or intrinsic (aggregates derived from the product itself).
Because many large-molecule drugs are polar, they tend to aggregate in suboptimal formulations. “The aggregates,” Werk explains, “may form due to prolonged storage; container properties; mechanical agitation during shipping and handling; changes in air, light, and temperature; or changes in state from solid to liquid.”
“Sometimes, in innovative therapeutic formulations, particles can also represent the active pharmaceutical ingredient or the drug delivery vehicle,” he points out. “Here, aggregate formation can hamper the effectiveness of the treatment, enhance immunogenicity, or cause immunotoxicity.”
Shadowing particles
A technique called light obscuration is widely used for the detection of particulates. The method is based upon the amount of light a particle blocks when passing through the detection window area of the particle counter.
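The principle lends itself to a simple numerical illustration. The sketch below uses made-up calibration points and pulse heights rather than instrument data: each voltage pulse is mapped to an equivalent spherical diameter via a calibration curve built from size standards, and counts at or above chosen size thresholds are normalized per milliliter, in the spirit of compendial subvisible-particle reporting.

```python
# Sketch of the light-obscuration principle (illustrative numbers only):
# each particle produces a voltage pulse proportional to the light it blocks;
# a calibration curve from size standards maps pulse height to an equivalent
# spherical diameter, and counts are reported per mL of sample.
import numpy as np

# Hypothetical calibration from polystyrene size standards (pulse mV -> um).
cal_pulse_mv = np.array([5.0, 20.0, 80.0, 320.0])
cal_diam_um = np.array([2.0, 5.0, 10.0, 25.0])

def pulses_to_diameters(pulse_mv):
    """Interpolate measured pulse heights onto the calibration curve."""
    return np.interp(pulse_mv, cal_pulse_mv, cal_diam_um)

def count_per_ml(pulse_mv, sample_volume_ml, thresholds_um=(10.0, 25.0)):
    """Count particles at or above each size threshold, normalized per mL."""
    diam = pulses_to_diameters(np.asarray(pulse_mv, dtype=float))
    return {t: int(np.sum(diam >= t)) / sample_volume_ml for t in thresholds_um}

# Example: simulated pulse heights from a 5 mL aliquot.
rng = np.random.default_rng(0)
pulses = rng.lognormal(mean=2.5, sigma=1.0, size=400)  # mV, illustrative
print(count_per_ml(pulses, sample_volume_ml=5.0))
```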
Light obscuration is the preferred method for quantifying particles in drug formulations; however, it has some limitations. “It can show a lower sensitivity in detecting some types of translucent, proteinaceous particles, resulting in lower counts or erroneous sizing of the subvisible particles,” Werk details. “Moreover, when the difference between the refractive index of the particles and that of the medium is small, the accuracy of the analytical results can diminish. Also, light obscuration cannot detect particles smaller than 1 µm.
“Finally, current light obscuration methods require many manual steps and consume the samples during testing, so that scientists cannot reuse the same sample for further particle characterization beyond compendial application. This latter factor is a significant disadvantage when scientists have limited sample material available and need to run several tests.”
To address the problem of sample loss during testing, Werk and colleagues at Bionter developed a subvisible particle analysis technique. “The reason for the destructive nature of the current technology lies in the mixing of the cleaning medium with the sample,” Werk relates. “Our workflow includes a drying step after ensuring the fluid path is clean. Therefore, the sample is not contaminated or diluted by the cleaning medium and can be used for further testing, minimizing the total sample volume required in drug product testing.
“The nondestructive nature of our particle counter also offers the possibility to monitor the evolution of particle populations over time on a single sample. Avoiding sample loss results in huge savings, as the cost of a single test run of a 25-mL sample typically ranges from $300 to more than $1,000.”
Ensuring stability
For Shwetha Iyer, a principal scientist at the Novartis Institutes for BioMedical Research, stability measures represent one of the most innovative aspects of formulation development, with next-generation antibody medicines being a major driver.
“Most of these molecules either require a highly concentrated drug product or come with a technically challenging framework, which could potentially lead to unfavorable biophysical attributes,” she explains. “If a product has these attributes, developing a stable liquid formulation can be quite challenging. Formulation scientists are constantly trying to understand the correlation between a molecule’s biophysical properties and their impact on a liquid formulation.”
The current gold-standard approach to gaining this understanding is forced degradation. As the name suggests, forced degradation is designed to help formulators understand the degradation of a drug substance or drug product under severe conditions.
The idea is to better understand the degradation pathways of a molecule and to define a strategy that can mitigate or control them. In effect, a forced degradation study is an early assessment of a molecule that helps the formulation scientist evaluate the pathways through which the molecule can potentially degrade and cause instabilities.
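Although stress conditions and analytical readouts vary, the data from such studies are often reduced to simple kinetics. The sketch below is a generic illustration, not the Novartis workflow: it assumes first-order loss of the intact molecule, fits a rate constant at each stress temperature, and uses an Arrhenius fit to project behavior at the intended storage temperature. All numbers are invented.

```python
# Generic sketch (not the Novartis workflow): fit first-order degradation
# rates from stressed-stability data at several temperatures, then use an
# Arrhenius fit to extrapolate the rate at the intended storage temperature.
# All values below are made-up illustrations.
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

# Hypothetical % intact molecule measured over days under thermal stress.
data = {
    313.15: ([0, 7, 14, 28], [100.0, 96.0, 92.5, 85.8]),  # 40 degC
    323.15: ([0, 7, 14, 28], [100.0, 91.0, 83.0, 69.5]),  # 50 degC
}

def first_order_k(days, pct_intact):
    """Slope of ln(C/C0) vs time gives the first-order rate constant (1/day)."""
    t = np.asarray(days, dtype=float)
    y = np.log(np.asarray(pct_intact, dtype=float) / pct_intact[0])
    slope, _ = np.polyfit(t, y, 1)
    return -slope

# Arrhenius fit: ln k = ln A - Ea / (R * T)
temps = sorted(data)                                  # stress temperatures, K
ks = [first_order_k(*data[T]) for T in temps]
slope, intercept = np.polyfit(1.0 / np.array(temps), np.log(ks), 1)
Ea = -slope * R                                       # activation energy, J/mol

# Extrapolate to 25 degC and estimate time to fall to 95% intact.
k_25 = np.exp(intercept + slope / 298.15)
t95_days = -np.log(0.95) / k_25
print(f"Ea ~ {Ea/1000:.0f} kJ/mol; projected time to 95% intact at 25 degC: "
      f"{t95_days:.0f} days")
```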
When to carry out these studies is a topic of debate, according to Iyer, who says, “Forced degradation studies are normally performed late in development. Sometimes, the degradation pathways identified are so severe that the only way to mitigate them is to develop a formulation that might not be commercially viable. Hence, we at Novartis are trying to understand a molecule’s degradation pathways much earlier than usual, particularly in my group, so that we only send the most developable candidate to our clinical teams.”