November 15, 2008 (Vol. 28, No. 20)

Gail Dutton

Integrating Genomics Technology and Information into Toxicology Studies Benefits Development

Whether toxicogenomics (gene-expression profiling) has lived up to its potential depends upon how it is used. The technology is applied mainly to determine mechanisms of action and, to a lesser extent, to predict toxicity. The key is to place the data into a contextual framework, according to many leading researchers.

“Toxicogenomics reflects an integration of genomics technologies and information into toxicology studies,” explained Cindy Afshari, Ph.D., scientific director, Amgen. Dr. Afshari hosted the “ILSI Health and Environmental Sciences Institute Genomics Applications in Safety Studies—Case Study Workshop” held last month.

Salah-Dine Chibout, Ph.D., global head of investigative toxicology at Novartis, said that toxicology identifies whether a drug is or is not toxic. Toxicogenomics, on the other hand, “takes you to another level of complexity, which could provide mechanisms of toxicity.”

“Proper execution and interpretation requires specialized expertise and training that not all toxicologists have,” Dr. Afshari emphasized. Consequently, many toxicogenomic studies occur in investigative or discovery toxicology groups. The move a decade ago to develop special toxicogenomics departments has largely fallen by the wayside as the technologies used in toxicogenomics such as microarrays have become mainstream.

Bruce Carlson, analyst for Kalorama Information, noted that “there are no good market numbers on toxicogenomics.” The market for gene-expression profiling, a cornerstone of the field, however, was valued at $1.1 billion globally in 2007. Over time, Dr. Afshari predicted, toxicogenomics and standard toxicology assessments are likely to become integrated.



Predictive Toxicogenomics

Lilly Research Laboratories is at the forefront of predictive toxicogenomics work. This area was the focus of intense interest when the field emerged a decade ago. Since then, it has been slow to fulfill its promise.

Craig E. Thomas, Ph.D., senior research advisor, investigative toxicology, suspects the barriers are as much cultural as technological. “As safety assessment in the pharma industry has traditionally been initiated in the latter stages of the discovery process, overcoming the natural fear of failure associated with anything new or unproven is even more daunting.”

“Our feeling is that one of the primary barriers to using transcriptomics predictively is an inability to put the data into a context that allows decision-making,” said Dr. Thomas.

The challenge for researchers is that with more than 30,000 data points for each microarray, “scientists will undoubtedly see expression changes relative to the controls. What was lacking for many years, however, was an ability to attach toxicologic significance to those changes. For example, is the pattern of gene-expression changes representative of, or associated with, an adverse event?”
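To make that challenge concrete, the sketch below shows the first step such an analysis typically takes: flagging probes whose expression differs from vehicle controls using per-gene t-tests with a false-discovery-rate correction. The probe counts, group sizes, and thresholds are illustrative assumptions, not any company’s actual pipeline.

```python
# A minimal, hypothetical sketch: flag expression changes relative to
# vehicle controls on a ~30,000-probe array using per-gene t-tests with
# Benjamini-Hochberg FDR correction. All counts and cutoffs are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_probes, n_control, n_treated = 30_000, 5, 5

# Simulated log2 intensities: rows = probes, columns = animals.
control = rng.normal(8.0, 1.0, size=(n_probes, n_control))
treated = rng.normal(8.0, 1.0, size=(n_probes, n_treated))
treated[:200] += 1.5  # pretend 200 probes truly respond to the compound

_, p = stats.ttest_ind(treated, control, axis=1)

# Benjamini-Hochberg adjustment: q_(i) = min over j >= i of m * p_(j) / j.
order = np.argsort(p)
adj = p[order] * n_probes / np.arange(1, n_probes + 1)
q = np.empty(n_probes)
q[order] = np.minimum.accumulate(adj[::-1])[::-1]

log2fc = treated.mean(axis=1) - control.mean(axis=1)
hits = (q < 0.05) & (np.abs(log2fc) > 1.0)
print(f"{hits.sum()} of {n_probes} probes changed vs. control")
```

Without the multiple-testing correction, thousands of the 30,000 probes would clear a nominal p < 0.05 by chance alone, which is precisely why raw expression changes carry little toxicologic significance on their own.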

Interpreting those changes, explained Dr. Thomas, requires developing a large chemogenomics database. During the past three years, Lilly has leveraged DrugMatrix®, a contextual database from Entelos.

“A key feature of having a large database of expression changes integrated with traditional toxicity endpoints is mathematically derived gene signatures that are predictive of, or coincident with, toxicologic endpoints,” he continued. Because the gene signatures are often composed of genes that are not readily associated biologically with outcomes, “you have to believe in the numbers.”
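As a rough illustration of what “mathematically derived” means in practice, the sketch below fits an L1-penalized classifier to expression profiles labeled with a toxicity outcome and reads off the genes it weights; those genes are selected by the statistics, not by known biology. The data, labels, and model choice are assumptions for illustration, not the DrugMatrix methodology.

```python
# Hypothetical sketch of deriving a gene signature from a chemogenomics
# database: an L1-penalized classifier selects a sparse set of predictive
# genes from labeled expression profiles. Data and labels are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_compounds, n_genes = 200, 1_000

X = rng.normal(size=(n_compounds, n_genes))   # one expression profile per study
y = rng.integers(0, 2, size=n_compounds)      # 1 = adverse histopathology finding
X[y == 1, :25] += 0.8                         # embed a weak 25-gene signal

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC: {auc:.2f}")

model.fit(X, y)
signature = np.flatnonzero(model.coef_[0])    # genes with nonzero weight
print(f"signature uses {signature.size} of {n_genes} genes")
```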

Dr. Thomas said that Lilly’s experience with toxicogenomics has been positive largely because of the contextual database that helped Lilly scientists move beyond purely retrospective studies to study the mechanism of toxicity.

He added that Lilly is unique in focusing much of its toxicogenomics work on in vitro studies, in which gene signatures are used to predict outcomes in animal studies. By addressing toxicology at the hit-to-lead stage, scientists can look across the structure/activity relationship to consider multiple chemical scaffolds.

“Researchers, historically, haven’t considered drug safety at this early stage because the tools were lacking,” he emphasized. The result was often a drug candidate optimized around one scaffold without any toxicology assessment.

“It’s still the early days,” he cautioned, and so the outcomes of long-term toxicology studies for molecules prioritized using genomics early in preclinical development remain to be seen. That said, the approach has contributed to a growing pipeline that currently features an all-time high of 50 distinct compounds in clinical development.

The biopharma industry as a whole, however, has experienced less success with predictive toxicogenomics. Only a few validated markers have emerged from this research.

Mark Fielden, Ph.D., senior scientist, Roche, pointed out that the problem reported across the broader industry is that “the biomarkers that have been identified haven’t been robust enough” for the technology to be used predictively. But the issue may lie with the models used. Roche has experienced success here, said Dr. Fielden, and this remains a ripe area of research.

Other issues outlined by Russell S. Thomas, Ph.D., director of the Center for Genomic Biology and senior investigator at the Hamner Institutes for Health Sciences, included questions of reproducibility, how to interpret the data, and how to build a profile to allow it to become clinically useful.


A Roche scientist evaluates an Affymetrix GeneChip array image for quality assessment related to a toxicogenomic Cyp-Induction project.

Mechanistic Actions

“Predictive toxicogenomics is taking longer to materialize than mechanistic applications but has promise,” according to Dr. Russell Thomas. “It may take five to seven years to be broadly applied and accepted by the FDA, which has a consortium looking at this.”

More companies have focused upon toxicogenomics as a way to better understand mechanism of action than to predict toxicity. “Application of toxicogenomics has provided insights into mechanisms driving particular target organ toxicities and has provided viable hypotheses for further testing,” Dr. Afshari explained. “We have also found that genomic analyses provide useful information around discriminating compounds to support decisions for ranking molecules” to advance through lead optimization.

Novartis is using known toxic compounds to develop the techniques to understand the mechanistic actions of toxicity. Such screening led to the recent validation of biomarkers for kidney toxicity. “There is a lot of demand,” Dr. Chibout said, particularly for kidney, liver, heart, and vascular systems.

Novartis has developed an internal database for identifying toxicity that is now used routinely. Such information allows a more accurate risk/benefit analysis, helping companies determine the relative value of advancing compounds through the developmental pipeline.

The technology’s strength, Dr. Fielden said, is as a hypothesis generator. When used to screen particular drugs, it generates thousands of expression changes at a time, allowing many hypotheses to be tested. Cell-based assays, in contrast, generate perhaps a half-dozen endpoints at most when multiplexed, and thus can test a smaller number of hypotheses. Roche therefore uses toxicogenomics to generate hypotheses when trying to understand mechanisms of toxicity, which are then tested more thoroughly using other assays.

Challenges

“One of the challenges for scientists in toxicology is that sometimes genomes that are not fully sequenced and/or annotated are used,” said Dr. Afshari, which “presents a challenge to network and pathway analyses.” For example, a study undertaken by the Health and Environmental Sciences Institute found that although pathway analyses were consistent among labs and platforms, gene-to-gene comparisons were more problematic.
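One reason pathway-level results can agree across labs even when gene lists do not is that enrichment statistics tolerate platforms flagging slightly different member genes. Below is a minimal sketch of the standard over-representation test; the gene identifiers and set sizes are invented for illustration.

```python
# Hypothetical over-representation test: is a pathway enriched among the
# differentially expressed genes? Uses the hypergeometric distribution;
# all gene identifiers and set sizes are invented.
from scipy.stats import hypergeom

total_genes = 20_000                            # genes measured on the array
changed = {f"g{i}" for i in range(300)}         # differentially expressed genes
pathway = {f"g{i}" for i in range(250, 320)}    # a 70-gene pathway

overlap = len(changed & pathway)
# P(overlap >= observed) given array size, pathway size, and changed-set size
p = hypergeom.sf(overlap - 1, total_genes, len(pathway), len(changed))
print(f"{overlap}/{len(pathway)} pathway genes changed, p = {p:.2e}")
```

Because the test depends only on how many pathway members appear in the changed set, two platforms that flag overlapping but non-identical gene lists can still converge on the same enriched pathways.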

Consortiums currently are addressing that issue by cross-validating many of the biomarkers that have been linked to various diseases. “There’s a decade of literature,” Dr. Fielden noted, “but a majority of the information is not well-understood.”

The C-Path Predictive Safety Testing Consortium (PSTC) is trying to leverage that work by identifying promising biomarkers from the literature, thereby allowing testing on smaller arrays or individual genes with results comparable to or better than those of larger arrays, he said.

The MicroArray Quality Control (MAQC) Consortium, on the other hand, is working to standardize microarrays to “give researchers confidence that the assay they are running is both accurate and precise,” said Dr. Fielden. That has the benefit of allowing researchers to better compare results in the literature and among labs, and thereby access a larger, more relevant body of data. That group also is establishing a set of best practices for the derivation and validation of multigene biomarkers or signatures.

Work also is under way to develop a database of carcinogenic chemicals and their specific effects in rats. Dr. Fielden and the PSTC are examining published toxicogenomics signatures and trying to validate those signatures across laboratories.

Results, he noted, are “fairly robust and show promise” for predicting rodent carcinogenicity. Other work is under way to rederive the carcinogenicity signature on a real-time PCR platform containing 20–30 genes in order to standardize the measurement platform.
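How a signature might be carried over to a small qPCR panel can be pictured with a simple scoring scheme: normalize each panel gene’s Ct value to a housekeeping gene, then combine the panel into one weighted score. The gene names, weights, and cutoff below are hypothetical; the article does not describe the PSTC’s actual scoring rule.

```python
# Hypothetical qPCR signature scoring: delta-Ct normalization against a
# housekeeping gene, then a weighted sum over the panel. Names, weights,
# and the decision cutoff are invented for illustration.
import numpy as np

panel = ["gene_a", "gene_b", "gene_c"]   # stand-ins for the 20-30 panel genes
weights = np.array([0.8, -0.5, 1.2])     # hypothetical signature weights

ct = {"gene_a": 24.1, "gene_b": 27.9, "gene_c": 22.4, "housekeeping": 18.0}

# Lower Ct means more transcript, so negate delta-Ct for an expression-like value.
expr = np.array([-(ct[g] - ct["housekeeping"]) for g in panel])
score = float(weights @ expr)
print(f"signature score: {score:.2f} ->", "flag" if score > 0.0 else "pass")
```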

Interest in toxicogenomics is predicated upon the ability to more accurately and more quickly identify toxic compounds to select the best drug candidate. “By doing so, you save a lot of money, as well as time,” said Dr. Chibout.

Future

“The field is still maturing, so lessons from use and applications are still emerging,” explained Dr. Afshari. Toxicogenomics has the potential to revolutionize the way drugs are developed if it is integrated with other disciplines, including pathology, molecular biology, physiology, bioinformatics, and clinical development. “If you do that, it is successful,” Dr. Chibout asserted. If not, it’s just more data.

It’s important, researchers agree, to augment toxicogenomic data with other information for decision-making, such as histopathology. “Sometimes the inference from the information can be used as a guide to understand drug effect,” noted Dr. Afshari. Any inference or key hypothesis directly resulting from toxicogenomics, however, should be followed up with other analyses to help bridge the gap between preclinical models and humans.

Right now, the information isn’t fully integrated with other disciplines. “For example,” added Dr. Chibout, “currently, we look at the gene. In the future, we are likely to look at proteins, metabolites, and epigenetics.” That broadening will come sooner rather than later, he predicted, as emerging data from those fields make them more amenable to integration with toxicogenomics.

The key to the successful deployment of toxicogenomics is to understand when to use it, and when to use something else. “Gene-expression profiling is just another tool in the toolbox,” emphasized Dr. Fielden. “Pick the right tool for the right problem.”
