August 1, 2006 (Vol. 26, No. 14)

Kathy Liszewski

Line Begins to Blur with Lead Discovery

Lead optimization aims at enhancing the most promising compounds to improve effectiveness, diminish toxicity, or increase absorption. Many of the technologies for lead discovery overlap with lead optimization as researchers attempt to incorporate the best drug characteristics early in the process. While the approaches taken may vary, the central theme is the same: make it better, faster, and more efficient, according to speakers at Cambridge Healthtech’s recent “World Pharmaceutical Congress” in Philadelphia.

Optimizing Early Phases

How companies direct the early phases of lead optimization may spell the difference between success and failure, explained Jefferson W. Tilley, Ph.D., senior research director at Roche (www.rocheusa.com).

“At the very beginning, one needs to evaluate risks before committing significant resources to the project. First, of course, one must verify that the target is viable. Companies are increasingly recognizing the need to accurately determine if they have a truly druggable target. The design of the screening library also is critical. One needs high-quality libraries that also eliminate such things as reactive functional groups. This helps reduce false positives. Another critical early aspect is the assay itself. Assays must have high signal-to-noise ratios. If you take the time to refine your assays, you may be able to tune up the sensitivity by an order of magnitude. This could pay huge dividends later,” Dr. Tilley noted.

“At Roche, we have the capability for so-called cherry picking. This means that we will re-assay our positives with fresh samples once we identify a hit. This can be done efficiently with robotics. It’s important to be sure each hit is real. You can also extend the analysis with dose-response curves to determine IC50 or EC50 values. The key is to use a fresh sample to independently verify the compound.”
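The dose-response follow-up Dr. Tilley describes yields an IC50, the concentration at which a compound inhibits its target by half. A minimal sketch of estimating one by log-linear interpolation between the two doses that bracket 50% inhibition; the data are invented for illustration, and real workflows typically fit a four-parameter logistic (Hill) model instead:

```python
import math

def estimate_ic50(doses_um, inhibition_pct):
    """Interpolate the dose giving 50% inhibition on a log-dose scale."""
    pairs = sorted(zip(doses_um, inhibition_pct))
    for (d_lo, i_lo), (d_hi, i_hi) in zip(pairs, pairs[1:]):
        if i_lo <= 50.0 <= i_hi:  # found the bracketing pair of doses
            frac = (50.0 - i_lo) / (i_hi - i_lo)
            log_ic50 = math.log10(d_lo) + frac * (math.log10(d_hi) - math.log10(d_lo))
            return 10 ** log_ic50
    raise ValueError("50% inhibition not bracketed by the data")

# Hypothetical dose-response data: concentration (uM) vs. percent inhibition.
doses = [0.01, 0.1, 1.0, 10.0, 100.0]
inhib = [2.0, 15.0, 48.0, 85.0, 98.0]
print(f"IC50 = {estimate_ic50(doses, inhib):.2f} uM")
```

Interpolating on the log-dose scale matters because dose-response behavior is roughly sigmoidal in log concentration, not in concentration itself.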

In Silico Tools

Predix Pharmaceuticals (www.predixpharm.com) is optimizing its leads with a team approach that combines computational and medicinal chemists. “Lead optimization typically requires the synthesis of hundreds of compounds over several years until the desired pharmacological profile is obtained, such as affinity, safety, pharmacokinetics, ADME (absorption, distribution, metabolism, elimination), and toxicology,” explained Yael Marantz, Ph.D., senior director, computational drug discovery. “The in silico 3-D models and the virtual ADME profiles are especially important since they help disclose what to synthesize and, just as important, what not to synthesize.”

Dr. Marantz pointed to the company’s recent success in developing a lead candidate for the treatment of anxiety and depression. “This is one of the first examples of a candidate that was discovered and optimized, from beginning to end, using an in silico model-based approach.

“This candidate reached clinical trials less than two years from initiation. We spent less than six months in lead optimization, with only 31 compounds synthesized. We started by modeling the 3-D structure of the 5-HT (serotonin) receptor using our PREDICT modeling methodology. We virtually complexed the model with serotonin and easily identified a binding pocket in the extracellular domain. In silico screening identified 78 virtual hits. From that we generated a lead compound.

“Using these strategies, we have now brought four programs from initiation to clinical trials within the last four years. The most important strategy, though, is to develop a team of both computational and medicinal chemists who can work and think together on the project. It’s the team effort that really counts.”

High-content Cellular Imaging

Researchers at Boehringer-Ingelheim Pharmaceuticals (www.boehringer-ingelheim.com) are employing high-content cellular imaging to support compound prioritization and timely decision-making during the hit-to-lead and lead-optimization phases of their drug discovery projects.

“High-content imaging allows us to see if we have hit the intended biological target in the cell. It provides an important link between molecular screening and functional cellular assays,” reported Lore Gruenbaum, Ph.D., principal scientist. “A major advantage is that it allows you to obtain multiple readouts in the same cells. You can evaluate target-directed as well as off-target effects of compounds in the same experiment.”

High-content cellular imaging also provides tools to quantitatively analyze cellular events in order to evaluate compound potency and verify structure-activity relationships. “For example, we analyze phosphorylation and subcellular localization of endogenous target proteins,” Dr. Gruenbaum said.

Another benefit of high-content imaging is the ability to support critical early assessments of potential problems and liabilities found using the cellular models. “Such an approach allows one to look at a complex cellular response to the drug. The readouts can assess such parameters as cytotoxicity, apoptosis, and effects on cell cycle.”

Toxicogenomics-based Assays

The decision of which compounds to advance through the drug discovery pipeline depends largely on knowing how they will behave in the clinic. Characterizing leads for toxicity liabilities early in lead development helps ensure later success. In vitro ADME profiles help select viable candidates from the group of hits. However, they are insufficient to address all issues of toxicity during lead optimization.

Gene Logic (www.genelogic.com) is developing a primary rat hepatocyte system as a new assay platform for the prediction of human liver toxicity by compounds in the pipeline.

“In this system, we test compounds using rat models. We use a robust 96-well high-throughput cell culture and microarray assay platform so that we can screen panels of compounds,” said Larry Mertz, Ph.D., vp of R&D and product management. “Everyone is focusing on higher throughput. But this must also be cost-effective. With this new in vitro screen, you can increase volume and decrease costs.

“Toxicogenomics is a new enabling technology that allows for early assessment of compound liabilities due to potential toxicity. Our toxicogenomics approach is different because of its unique economy of scale and use of standardized operating procedures for assay consistency.”

“Toxicogenomics allows you to analyze biomarkers early on in the process,” Loralyn Mears, Ph.D., vp of marketing and partner alliances, genomics division, said. “Currently, a majority of the effort is directed at the time of injury. So, we are looking for tests with good prognostic value and are leveraging genomics with early discovery and high throughput.”

Genetic Susceptibility Factors

It is becoming increasingly clear that genetic factors play key roles in susceptibility to psychiatric illnesses. C. Anthony Altar, Ph.D., president and CSO of Psychiatric Genomics (PGI; www.psygenomics.com), commented, “Understanding the relevant genes and their biochemical pathways that underlie psychiatric diseases and psychiatric drug actions provides a rational approach for developing novel therapeutics.”

“At PGI, we start by evaluating post-mortem samples of the human brain to discover the changes in gene expression in afflicted individuals compared with normal controls (the disease signature),” Dr. Altar said. “We also identify gene responses of cultured human neurons exposed to the most effective psychiatric drugs (the drug signature). Those genes whose changes in expression overlap between disease and drug treatments are the optimal targets for identifying novel antibipolar and antischizophrenia drugs.”
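The overlap logic Dr. Altar describes can be sketched as a simple set comparison: genes changed in disease whose expression an effective drug moves in the opposite direction (i.e., reverses). The gene names and fold-change directions below are invented for illustration; PGI’s actual signatures and scoring are not public.

```python
# Signs indicate direction of expression change: +1 up-regulated, -1 down-regulated.
disease_signature = {"GAD1": -1, "RELN": -1, "BDNF": -1, "S100B": +1}  # patient vs. control
drug_signature    = {"GAD1": +1, "BDNF": +1, "S100B": -1, "HTR2A": -1}  # neurons + effective drug

# Candidate target genes: changed in disease, and moved in the opposite
# direction by effective drugs (the drug reverses the disease change).
targets = sorted(
    gene for gene in disease_signature
    if gene in drug_signature and drug_signature[gene] == -disease_signature[gene]
)
print(targets)  # → ['BDNF', 'GAD1', 'S100B']
```

A panel of such genes, read out together, is what lets a single assay score a compound for disease-relevant activity rather than activity at one isolated target.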

PGI selects 16 representative genes from the disease and drug signatures, and evaluates their change in cultured human neural cells exposed to compounds from their library. Drug candidate hits are identified by their effect on a 16-gene array.

“We perform a multiparameter high-throughput screen (MPHTS) using the 16-gene array based on quantitative nuclease protection,” explained Dr. Altar. “The MPHTS assay allows us to quantitatively measure gene expression without the need for gene amplification. This is a powerful tool that helps minimize errors while providing disease-relevant information about candidate drugs.”

Dr. Altar noted that this approach has identified mood stabilizer drug candidates that are behaviorally active at lower doses than existing drugs. “Potency is an important milestone. In bipolar disorder, dose is particularly problematic because patients may need to take gram quantities of existing drugs every day. This amplifies undesirable side effects. The PGI compounds we are developing have higher potency in the drug screen and in the behavioral models we have used.”

Crystallographic Approaches

SGX Pharmaceuticals (www.sgxpharm.com) is utilizing a crystallography-driven fragment and structure-based approach, FAST (Fragments of Active Structures), for lead discovery and optimization. According to Siegfried H. Reich, Ph.D., vp of drug discovery, “Fragment screening is conceptually different but complementary to high-throughput screening. We start with a small number of low molecular weight compounds, or fragments, that are amenable to crystallographic screening. Our library of 1,000 structurally diverse compounds is selected for low molecular weight, shape diversity, and drug-like properties. Although the hits identified typically bind with low affinity, between 10 and 1,000 µM, we can see how they bind to their target using X-ray crystallography. Then, we can optimize with parallel synthesis and structure-based design approaches. That includes redesigning the core, if desired, to create higher affinity compounds.

“From a chemist’s point of view, I much prefer to start out with small, low molecular weight compounds as opposed to higher molecular weight compounds, which is often the case with HTS-based approaches, then design and optimize from there,” Dr. Reich commented. “It’s an alternative and complementary approach in a sense because although you begin with a low affinity compound, you can generate and optimize leads with greater confidence, and importantly verify the optimization process along the way with crystallographic feedback.”
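The preference for small, weakly binding starting points is often quantified with ligand efficiency (LE): binding free energy per heavy atom, LE = -RT ln(Kd) / N_heavy. A minimal sketch, not SGX’s actual method, with invented example compounds showing why a 100 µM fragment can be a better starting point than a more potent but much larger HTS hit:

```python
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.0      # temperature, K

def ligand_efficiency(kd_molar, n_heavy_atoms):
    """Binding free energy per heavy atom, in kcal/mol (higher is better)."""
    return -R * T * math.log(kd_molar) / n_heavy_atoms

# Hypothetical comparison: a 100 uM fragment with 12 heavy atoms vs. a
# 10 nM HTS-derived lead with 38 heavy atoms.
le_fragment = ligand_efficiency(100e-6, 12)
le_lead = ligand_efficiency(10e-9, 38)
print(f"fragment LE = {le_fragment:.2f}, lead LE = {le_lead:.2f} kcal/mol per heavy atom")
```

Here the fragment delivers more binding energy per atom than the lead, leaving more room to grow the molecule while keeping it drug-sized.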

Another advantage to this approach is the quick turnaround of data afforded by SGX’s state-of-the-art data-collection technology, said Dr. Reich. “One’s data is only as good as the turnaround time. You must be able to keep things moving. Because we started out as a crystallographic platform company, we made a deep investment in structural biology from the beginning. Our company was originally built around the rapid generation of crystal structures. Now we are employing that expertise as a competitive edge in our drug discovery organization.”

Intelligent Design

Combining intelligent design and combinatorial synthesis of libraries can provide a much more efficient route to drug discovery and lead optimization, according to Zhengming Chen, Ph.D., director of chemistry at Dov Pharmaceuticals (www.dovpharm.com).

“The best practice in library synthesis today is determining which compounds should be synthesized rather than settling for what can be made,” Dr. Chen suggested. “At the core of this trend toward high-quality smart libraries is the link between library design and the specific biological target.”

The drive for intelligent design involves surveying all available knowledge about a target, such as genomics and proteomics databases covering the specific biological target protein, known patents on the target, known ligands, and scientific publications related to the target and its ligands.

“It is better and wiser to use intelligent design at lead-generation and lead-optimization stages. Our studies taught us several lessons. First, data and numbers alone are not sufficient for efficiently discovering new drugs. There is a huge amount of information being generated from genomics, proteomics, high-throughput screening, and rapid combinatorial chemistry.

“Second, intelligent design is emerging as a leading technique. The library size per scaffold peaked 5-6 years ago and has steadily declined since then. Now, instead of libraries containing several hundred compounds, we see libraries consisting of 20-100 compounds. These can be made more quickly, providing shorter cycle times in the drug discovery process.

“Third, the balance in drug discovery today is shifting from industrializing the drug discovery process to intellectualizing it. High-throughput synthesis and traditional medicinal chemistry are joining forces to create libraries that are both smaller and smarter in order to generate high quality in vitro and in vivo data. This is the future of drug discovery.”
