High-throughput approaches that have made an enormous difference in compound screening are beginning to make inroads into the broader area of drug discovery and development. In this arena, though, the term high throughput is relative.
“For in vitro screening, hundreds of thousands of compounds screened with a 1,536-well plate is high throughput, but for iterative SAR-based drug discovery, 60 compounds per week might qualify,” notes Jeffrey Dage, Ph.D., senior research advisor for Eli Lilly (www.lilly.com). “For a pharmacologist, even screening 14 compounds every two weeks, rather than three in three months, could be high throughput.”
Dr. Dage is looking at strategies to make the process of progressing from a target to a drug that’s ready for market as fast as possible. His solution is to incorporate analytical chemistry early in the process and have continual involvement with the project team.
The issue, he says, is that adequate assays do not exist for many targets or for helping researchers understand project-specific issues. “You can’t always use FLIPR technology or fluorescent labeling; we need a new, label-free method to build an assay. Once you have an in vitro assay, the next barrier is moving it to an in vivo situation,” which means devising an assay that demonstrates in vivo target inhibition. “The next problem is efficacy,” which leads to biomarker assay development, and so on to the next barrier and the next. The focus, inevitably, is on surmounting the next hurdle.
“Looking at the big picture can allow the foundational work to happen even before the hurdle is directly impeding the team’s progress.”
Dr. Dage also advocates repurposing existing tools. Mass spectrometry, for example, is common in isolated functional laboratories but, if considered more broadly as a platform technology, could be beneficial during all stages of drug development. Applying it early on allows some experiments to be conducted on a single platform for in vitro and in vivo screening as well as in vivo pharmacokinetic work, thus minimizing variability and assay development time.
NMR is another option. “Is this a platform we can use across the value chain?” Dr. Dage asks. “It is similar to MRI, so how can we leverage it into discovery?” Nanostream technology, a parallel HPLC system, is another example of technology that can be leveraged. “We’ll continue to have new technological developments,” he points out, so it is imperative that companies learn to maximize their capabilities and determine whether they can become platform technologies. The goal is to develop efficiencies that keep things moving toward the market or to design the right experiments to fail drugs early.
There are a lot of different ways to do that, as Dr. Dage and other researchers will outline at the end of this month at “Lab Automation” in Palm Springs, CA. New tools are being added to the drug developers’ armamentarium that, each in its own way, enhance the process and speed the outcome.
Waters (www.waters.com) developed a new, label-free approach to analyzing protein mixtures that provides qualitative and quantitative results. Based upon technology developed within the past few years, this system allows researchers to confidently identify and quantify proteins and peptides in a single system, according to James Langridge, Ph.D., senior manager for proteomics. Called the IdentityE High Definition Proteomics System, it includes the nanoACQUITY UltraPerformance LC and Synapt HDMS systems. “A key element of the system is a powerful suite of software for the rigorous identification of proteins,” explains Dr. Langridge.
The key benefit of the system is reproducibility, according to Dr. Langridge. One of the challenges for the scientific community is instrument-to-instrument, day-to-day, and lab-to-lab variability of results. The IdentityE alleviates that concern, yielding a coefficient of variation of 10–15% for quantitative analysis, he reports. In terms of qualitative analyses, more than 90% of the data is reproduced, says Dr. Langridge. “We do two runs per sample to ensure reproducibility.”
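The 10–15% figure Dr. Langridge cites is a coefficient of variation, i.e., the standard deviation of a measured quantity across replicate runs divided by its mean. A minimal sketch in plain Python (the replicate amounts are hypothetical, not Waters data):

```python
import statistics

def coefficient_of_variation(values):
    """CV = sample standard deviation / mean, expressed as a percentage."""
    mean = statistics.mean(values)
    return 100.0 * statistics.stdev(values) / mean

# Hypothetical femtomole amounts for one peptide across four replicate runs.
replicate_amounts = [100.0, 120.0, 92.0, 112.0]
cv = coefficient_of_variation(replicate_amounts)
# A CV in the 10-15% range would match the reproducibility cited above.
```

Running two injections per sample, as described, is what supplies the replicates from which such a CV can be estimated.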
Customers are using the system for qualitative protein identification and, uniquely, to simultaneously detect the absolute amounts of multiple peptides and proteins at femtomole levels (versus calculating relative ratios). It’s also a method to reinterrogate previously collected data and identify new peptides and proteins as sequences are unraveled and protein databanks updated.
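Label-free absolute quantification of this kind is commonly done with a “top three peptides” scheme: the mean MS signal of a protein’s three most intense tryptic peptides is roughly proportional to its molar amount, so a spiked-in standard of known quantity calibrates the rest. A minimal sketch under that assumption (the intensities and the 50 fmol standard are hypothetical, and this is not necessarily Waters’ exact algorithm):

```python
def top3_signal(peptide_intensities):
    """Mean intensity of the three most intense peptides for a protein."""
    top = sorted(peptide_intensities, reverse=True)[:3]
    return sum(top) / len(top)

def absolute_amount_fmol(protein_peptides, standard_peptides, standard_fmol):
    """Scale a protein's top-3 signal against an internal standard of known amount."""
    return standard_fmol * top3_signal(protein_peptides) / top3_signal(standard_peptides)

# Hypothetical data: a 50 fmol spiked standard and one sample protein.
standard = [8000.0, 7500.0, 7000.0, 1200.0]
protein = [3200.0, 3000.0, 2800.0, 400.0]
amount = absolute_amount_fmol(protein, standard, 50.0)  # fmol on column
```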
The method has been applied to the study of lysosomal storage disorders. Its ability to detect changes in biomarker levels led to the re-examination of an existing biochemical test so that it is more widely applicable and specific, allowing therapeutic doses to be adjusted, according to Dr. Langridge.
“We’d like to stress how straightforward this is,” Dr. Langridge says. “You use the same methodology, independent of the sample.” This lets users focus on the biological results rather than the analytical system, running the same method day after day regardless of sample type. That’s possible, he explains, because of the data-independent nature of the approach and the implementation of standard operating procedures.
At Indiana University, Yehia Mechref, Ph.D., and his team are focusing on glycomics and glycoproteomics in a search for biomarkers to identify cancers earlier and track their progress.
Prostate cancer is a particular concern because, as Dr. Mechref points out, the standard PSA assay is, at best, 50% specific. “It’s not conclusive,” he admits. Dr. Mechref’s work, based upon glycoproteins, seems to offer greater specificity and identifies cancers earlier than standard methods. Rather than looking for the presence of PSA, “we’re looking at biomarkers from a different angle, focusing on post-translational modifications in glycans.” Some nonfunctional proteins become functional when exposed to cancerous conditions. “Therefore, we look for changes in glycans, which may be used as biomarkers.”
Currently, Dr. Mechref is analyzing samples to define potential biomarkers. Automating the process with standard lab automation tools increased throughput from 18 samples per day a year ago to 192 samples per day now. The assays he is developing, and will detail at “Lab Automation,” effectively measure glycoprotein changes in samples at micro- to nanomolar concentrations. His goal is eventually to push that to picomolar levels.
Thermo Fisher Scientific (www.thermofisher.com) is marrying its automation platforms with its robust reagent libraries and discovery infrastructure to develop an integrated platform for genome-wide RNAi screening with siRNA and miRNA reagents.
Traditionally, researchers would screen RNAi reagents against specific pathways of 15–20 genes whose inhibition may be important. Thermo Fisher now lets them screen against the approximately 22,000 genes in the human genome. The benefits, according to Dave Evans, Ph.D., senior director of RNAi discovery and therapeutic services, include gaining a more rapid understanding of the role specific genes play in disease mechanisms.
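The jump from a 15–20 gene pathway panel to roughly 22,000 genes is largely a logistics problem, which is why automation matters. A rough sketch of the plate arithmetic, assuming one gene per well on 384-well screening plates with a few wells reserved for controls (the plate format, replicate count, and control allotment are illustrative assumptions, not Thermo Fisher specifics):

```python
import math

def plates_needed(n_genes, wells_per_plate=384, replicates=1, control_wells=16):
    """Plates required to screen every gene once per replicate,
    reserving some wells on each plate for controls."""
    usable = wells_per_plate - control_wells
    return math.ceil(n_genes * replicates / usable)

pathway_screen = plates_needed(20)                      # a traditional focused screen
genome_screen = plates_needed(22_000, replicates=2)     # a duplicate genome-wide screen
```

A focused pathway screen fits on a single plate, while a duplicate genome-wide screen runs to well over a hundred plates under these assumptions.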
Application of high-content screening lets researchers look at subcellular effects induced by silencing individual genes or use end-point analysis to provide insight into the role these genes play in global parameters such as cell viability, Dr. Evans explains.
“One of the biggest problems in the industry is that drugs may not specifically target the appropriate mechanisms to induce therapeutic benefit across a large patient population, so many drugs fail in the clinic,” Dr. Evans says. Genome-wide RNAi screening, in contrast, helps researchers identify nodes in specific pathways where therapies against these targets may have a more profound effect.
“By combining the use of RNAi with coexposure to a drug, new targets can be discovered that, when silenced, synergize with and improve the potency and/or efficacy of the existing drug. Such adjunct targets have been identified for cancer therapeutics such as doxorubicin or the taxanes. By identifying targets that synergize with an existing drug, the dose can be reduced, which also may reduce unwanted side effects.”
Understanding and characterizing the mechanisms that lead to improved therapeutic benefits aid in repurposing failing drugs. “This approach is becoming part of the standard tool set in drug discovery,” Dr. Evans says, “but we’ve just scratched the surface with the technology.”
miRNAs, either mimics or inhibitors, can be assayed with this technology too. They are increasingly recognized as important in the differentiation and maturation of a wide variety of cell types, particularly stem cells, and they are generating interest in cancer and schizophrenia as well as other diseases.
Affymetrix (www.affymetrix.com) will be discussing its GeneChip® Array Station at the conference. The combination of the GeneChip Array Station and the GeneChip HT Array Plate Scanner increases workflow throughput and standardization, eliminating the variability inherent in manual processing, according to Shantanu Kaushikkar, product marketing manager for systems and software.
This combination lets researchers replace a single column approach with a 24- or 96-well plate, he says, explaining that although the technology has been used in research for a while, “the adoption rate is just starting to ramp up.”
The Affymetrix GeneChip Array Station is a dual-purpose instrument that automates target prep and array processing for high-throughput array plates, allowing a gradual scale-up without increasing staff size. It prepares a hybridization cocktail that can be added easily to the existing workflow, Kaushikkar says, letting researchers analyze 96 individual samples simultaneously.
The GeneChip HT Array Plate Scanner scans 24- and 96-well array plates for human, mouse, and rat samples. It focuses and scans each array in less than three minutes, according to Kaushikkar. It also boasts a dual-core processor in its computer workstation; an internal, automatic barcode reader; and software for data analysis and visualization.
These combined systems offer significant value in terms of standardization, Kaushikkar notes. Automation reduces variability by ensuring the same methods are applied to each set of samples regardless of where they are run or who runs them. The result is more consistent, higher-quality data that can be automatically tracked and added to the lab’s information management system.
“We have an open platform, which distinguishes us from others,” Kaushikkar adds. As an example, he notes that the target prep can be performed on hardware from Beckman Coulter (www.beckman.com), and data from the acquired images can be analyzed with software applications from a variety of vendors.