May 15, 2017 (Vol. 37, No. 10)

Biomarker Validation for Genomic Assays

Assay Developers Put Their Tests to the Test and Demonstrate the Clinical Utility of Genomic Biomarkers

    A liquid biopsy technology is being developed by UCLA scientists to detect tumor-causing mutations in saliva and other bodily fluids. The technology, called electric field-induced release and measurement (EFIRM), leverages highly specific molecular probes and an enzymatic amplification system to achieve high-sensitivity detection of circulating tumor DNA and proteins associated with cancer. The EFIRM detection process takes place on a high-throughput electrode array, allowing for the testing of multiple patients and multiple cancer markers simultaneously.

    Pity the genomic scientist who specializes in biomarker development, a discipline that is still being built around next-generation sequencing (NGS). This emerging field, which could be called next-generation molecular diagnostic assay validation, is currently grappling with a range of challenges: biospecimens that are rare and of uncertain quality, analytes that are present in scant quantities, and sample preparation guidelines and data handling conventions that are still in flux.

    In hopes of laying a firm foundation on these shifting sands, genomic scientists gathered at the fourth annual Genomic Sample Prep, Biomarker Assay Development and Validation, a conference that recently took place in San Francisco. This event, which was organized by Cambridge Healthtech Institute (CHI), provided an opportunity for biospecimen experts and assay developers to discuss the “major challenges and latest advances in sample preparation and the validation of NGS and other advanced diagnostics assays.”

    Outstanding presentations from the CHI conference are summarized in this article. Like the conference, this article begins with observations offered by Lin Wu, Ph.D., vice president of development, Roche Sequencing Solutions. Dr. Wu chose to emphasize best practices and approaches that could bring highly sensitive and robust assays to market. In particular, Dr. Wu cited guidelines that appeared in The Journal of Molecular Diagnostics.

    On March 16, a workgroup convened by the Centers for Disease Control and Prevention published a set of guidelines on molecular diagnostics (“Principles and Recommendations for Standardizing the Use of the Next-Generation Sequencing Variant File in Clinical Settings”). Less than a week later, on March 21, the Association for Molecular Pathology and the College of American Pathologists issued an oncology-oriented joint consensus recommendation (“Guidelines for Validation of Next-Generation Sequencing-Based Oncology Panels”).

    Such efforts, Dr. Wu stated, “provide recommendations for validating genomic assays analytically and clinically.” Moreover, they recognize that precious patient samples are critical for advancing genomic medicine.

    The importance of patient samples was also taken up by Benoit Bouche, Pharm.D., Ph.D., managing director, Trans-Hit Biomarkers. “Hundreds of R&D projects are stuck in labs because pharma companies behind projects cannot find the biospecimens needed for analytical and clinical validation of assays,” he lamented.

    The problem of limited tissue, commented Ping Qiu, Ph.D., principal scientist, Merck & Co., is especially troublesome in oncology applications, which also contend with tumor heterogeneity and sequencing artifacts. “Oncology applications of sequencing technologies are still in their infancy, especially in the immuno-oncology space,” noted Dr. Qiu, who also complained of a lack of consistency in laboratory practices. “There is currently no industry standard,” he said, “in terms of reference materials, DNA/RNA extraction, sample input amounts, or bioinformatics pipelines, etc.”

    The strategies pursued by different laboratories depend not only on the markers—whether DNA, RNA, or protein—but also on the types of questions that are being asked and the technologies that are being used. Suppose a biomarker assay identifies a clinically actionable somatic mutation in a cancer. Is the mutation present in 100% of the cells, or only 10%, or 5%?
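    The question of what fraction of cells carry a mutation is usually approached through the variant allele fraction observed in sequencing reads. The sketch below is a hypothetical illustration of that arithmetic; the read counts, tumor purity, and the assumption of a heterozygous mutation at a diploid locus are all invented for the example, not drawn from any assay discussed here.

```python
# Hypothetical illustration: estimating what fraction of cells carry a
# somatic mutation from sequencing read counts. All numbers are invented.

def variant_allele_fraction(alt_reads: int, ref_reads: int) -> float:
    """Fraction of reads supporting the variant allele."""
    total = alt_reads + ref_reads
    if total == 0:
        raise ValueError("no reads at this position")
    return alt_reads / total

def cellular_fraction(vaf: float, purity: float) -> float:
    """Rough fraction of tumor cells carrying a heterozygous mutation,
    assuming a diploid locus and the given tumor purity (0 to 1)."""
    return min(2 * vaf / purity, 1.0)

vaf = variant_allele_fraction(alt_reads=120, ref_reads=1880)
print(f"VAF = {vaf:.2%}")
print(f"~{cellular_fraction(vaf, purity=0.4):.0%} of tumor cells mutated")
```

    In practice such estimates are further complicated by copy-number changes and subclonal structure, which is precisely why pinpointing and validating the finding matters.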

    “You must pinpoint and validate,” asserted Madhuri Hegde, Ph.D., adjunct professor, Emory University, and vice president and chief scientific officer, global laboratory services, diagnostics, PerkinElmer. “This knowledge will drive the drug treatment for that patient.”

    Although the challenges addressed at the CHI conference remain daunting, several presenters highlighted advances in biospecimen science, including promising results with unconventional specimens. For example, David T.W. Wong, endowed professor, associate dean of research, UCLA School of Dentistry, discussed how saliva and salivaomics could be used in the detection of oncogenic mutations in human cancers. “Biomarker validation,” insisted Wong, “is the key element in biomarker research, particularly in patient research.”

  • “Real” Specimens in Potential Assays

    Dr. Wu verifies analytical performance of next-generation biomarker assays. “We are developing targeted biomarker panels to measure tumor-specific mutations in liquid biopsies,” she said, adding that Roche is also working on a panel to measure how many tumor mutations are in the patient’s blood overall.

    Analytical validation technologies are considered promising if they excel in terms of sensitivity, specificity, and reproducibility. Although performance attributes such as these are important, they do not, by themselves, guarantee success. Dr. Wu emphasized that analytical verification does not establish clinical utility; the biomarkers an assay measures may not yet be proven for clinical use. Still, analytical studies must be completed before an assay moves to clinical validation; otherwise, precious patient samples, trial resources, and time could be wasted.

    “It is very important for us to complete the analytical biomarker studies before we commercialize the biomarker assays,” confirmed Dr. Wu.

    Roche uses “real” clinical specimens and follows guideline recommendations such as the ones recently published in The Journal of Molecular Diagnostics. Admittedly, collection of hard-to-find real clinical specimens can be challenging and costly. Nevertheless, it’s important, and some of the analytical verification studies for a biomarker assay should be done with the “real” clinical specimen that the assay is intended to be used with, asserted Dr. Wu.

    For example, genomic DNA isolated from cell lines versus tumor tissues versus a patient’s blood will all be different. “Analytical validation can’t just be done with cell line DNA,” cautioned Dr. Wu. “It’s too clean, and it doesn’t give you the variability found in sample specimens.”

    It’s challenging to determine the ground truth when using clinical specimens. “You might say your assay device can measure certain copy numbers of DNA, but you need to verify your claim with an independent method,” advised Dr. Wu. “Typically, we develop more than one method to measure the same biomarker.” The biomarker is measured not only in the assay intended for the customer, but also in independent tests on the same clinical specimens. “I don’t see that being done with too many others in the industry as a standard practice,” she concluded.

  • Biobank Withdrawals in Support of Liquid Biopsies

     “Liquid biopsy, the evaluation of circulating tumor cells (CTCs) or circulating DNA in blood samples taken from cancer patients, is currently the hottest segment within the biomarker field,” exclaimed Dr. Bouche. “Street analysts predict sales could soon exceed $10 billion from opportunities opened up by liquid biopsy technologies.”

    Pharmaceutical and in vitro diagnostic (IVD) companies that are working on the same targets are also “competing” for the same biological sample types. Samples collected from patients treated by drugs targeting the PD-1/PD-L1 pathways, now just becoming available, are particularly difficult to find. “Lack of access to biospecimens is the main bottleneck for researchers,” observed Dr. Bouche.

    Trans-Hit Biomarkers asserts that it offers unparalleled and fully transparent access to biospecimens, and that it has a proprietary network of over 100 clinical sites and academic biobanks. Biospecimens are collected and curated from more than 400 hospitals all over the world, including North America, Europe, and Asia.

    In contrast, biospecimen brokerage firms whose core business is to provide off-the-shelf commoditized samples likely lack the depth of expertise and academic connections to meet liquid biopsy sample requirements, cautioned Dr. Bouche.

    Unprecedented demand for matched plasma and formalin-fixed, paraffin-embedded (FFPE) blocks spurred Trans-Hit Biomarkers and its U.S. partner, MT Group, to initiate and sponsor a consortium in lung cancer. The consortium aims to collect 5,000 high-quality unique patient series of specimens (matched tissues and biofluids at diagnosis and relapse) within one year. The samples will be ethically collected from different geographies, annotated with detailed clinical information, and screened for biomarkers of current interest (EGFR, KRAS, BRAF, ALK, MET, ROS1, RET, and PD-L1).

    “Our initiative will accelerate the development of new biomarkers that are of strategic interest for public health but are stuck in laboratories due to the scarcity of high-quality samples,” concluded Dr. Bouche.

  • Raw and Amplifiable DNA

    Mutational load (that is, number of mutations) is a predictive biomarker of response to checkpoint inhibitors such as Merck’s pembrolizumab. The more mutations a tumor has, the more likely a response. Whole exome sequencing (WES) can help define the mutational landscape and correlate mutational load with the likelihood of responding to immunotherapy.
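    Mutational load is conventionally reported as mutations per megabase of callable sequence, so results can be compared across assays with different footprints. The sketch below shows that normalization; the ~30 Mb exome size and the mutation count are illustrative assumptions, not figures from Dr. Qiu's work.

```python
# Back-of-the-envelope sketch of mutational load as reported from whole
# exome sequencing (WES): somatic mutations per megabase of callable
# sequence. The mutation count and 30 Mb exome size are assumptions.

def mutations_per_megabase(somatic_mutations: int, callable_bases: int) -> float:
    """Normalize a raw mutation count to the size of the sequenced territory."""
    return somatic_mutations / (callable_bases / 1_000_000)

tmb = mutations_per_megabase(somatic_mutations=300, callable_bases=30_000_000)
print(f"mutational load = {tmb:.1f} mutations/Mb")
```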

    “There’s a lot of preanalytical work that needs to be streamlined and standardized before the mutational load assay can be validated,” cautioned Dr. Qiu. For example, several parameters that are relevant to NGS-based tests—including the amount of DNA input for WES—are in need of standardization.

    The lack of standardization is complicating tests that extract nucleic acids from FFPE tissue. DNA/RNA extracted from FFPE tissue is typically highly degraded, fragmented, and cross-linked.

    Dr. Qiu has evaluated many different DNA/RNA extraction kits and quantification methods. This work led him to conclude that “10 different vendors might provide 10 different DNA input recommendations, ranging from 50 ng to 2 μg.” Standardization is lacking, and quantification is inaccurate because the reported amounts include poor-quality, unamplifiable DNA.

    “We investigated several more-accurate measurement methods called ‘amplifiability of the DNA’ to qualify the amplifiable copy of DNAs instead of the ‘raw’ DNA from FFPE extractions,” explained Dr. Qiu. Input guidelines based on copies of amplifiable or usable DNA can be more standardized and controllable than raw DNA input amounts.
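    One common way to count amplifiable rather than raw DNA is to run a qPCR assay against a standard curve and convert the quantification cycle (Cq) into copies. The sketch below is a hedged illustration of that idea only: the slope, intercept, Cq value, and input mass are invented, and the ~3.3 pg-per-haploid-genome conversion is a textbook approximation for human DNA, not a figure from Dr. Qiu's study.

```python
# Illustration of "amplifiable DNA" input: instead of the raw ng reading
# from a fluorometer, estimate the copies that actually amplify via a
# qPCR standard curve. Curve parameters and Cq are invented.

def amplifiable_copies(cq: float, slope: float = -3.32, intercept: float = 38.0) -> float:
    """Copies from a standard curve of the form Cq = slope*log10(copies) + intercept."""
    return 10 ** ((cq - intercept) / slope)

def amplifiable_fraction(copies: float, raw_ng: float, ng_per_copy: float = 3.3e-3) -> float:
    """Fraction of the fluorometric ('raw') mass that is amplifiable,
    assuming ~3.3 pg of genomic DNA per haploid genome copy."""
    return (copies * ng_per_copy) / raw_ng

copies = amplifiable_copies(cq=28.0)
print(f"~{copies:.0f} amplifiable copies")
print(f"amplifiable share of a 100 ng FFPE input: {amplifiable_fraction(copies, raw_ng=100):.1%}")
```

    Expressing input requirements in amplifiable copies, as Dr. Qiu suggests, makes them insensitive to how badly a given FFPE extraction is degraded.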

  • Bioinformatics Pipelines

    The bioinformatics pipeline (a set of ordered scripts that take raw data to data product to analysis results) is used to align reads from NGS assays, do variant calling, and filter mutations. “We download the reference sequence from the National Center for Biotechnology Information (NCBI) website,” informed Dr. Hegde. “And we design our own bioinformatics pipeline by writing scripts to pull regions we want to interrogate clinically.”

    The major step in pipeline validation is to check your assay sequence data, such as data from 5,000 known disease-causing genes in WES assays, against the reference sequence pulled. “There should be a one-to-one sequence match if your assay was designed properly,” asserted Dr. Hegde.
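    The one-to-one check Dr. Hegde describes can be reduced to a simple comparison: pull each clinically targeted region out of the reference and confirm the assay's consensus sequence matches base-for-base. The toy reference sequence, region name, and coordinates below are invented stand-ins, not real NCBI data.

```python
# Minimal sketch of pipeline validation against a pulled reference:
# every targeted region's assay consensus must match the reference exactly.
# The reference string, region, and coordinates are invented.

reference = {"chr17": "ATGCCGTAACGGTTAGCCATGA"}   # stand-in for an NCBI reference
targets = {"BRCA1_ex2": ("chr17", 3, 10)}          # hypothetical 0-based half-open region

def pull_region(ref: dict, chrom: str, start: int, end: int) -> str:
    """Extract the reference bases for one targeted region."""
    return ref[chrom][start:end]

def validate_assay(assay_consensus: dict) -> list:
    """Return the names of targets whose assay consensus disagrees with the reference."""
    mismatches = []
    for name, (chrom, start, end) in targets.items():
        if assay_consensus.get(name) != pull_region(reference, chrom, start, end):
            mismatches.append(name)
    return mismatches

print(validate_assay({"BRCA1_ex2": "CCGTAAC"}))  # [] -> one-to-one match
```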

    When different pipelines are used for biomarker validation, results vary. “If a researcher uses five different pipelines for the same raw data, the overlapping mutation rate is only about 50%,” Dr. Qiu pointed out. “If you’re getting results from different laboratories, you cannot compare the data unless you do a unified analysis by the same pipeline.”
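    The concordance problem Dr. Qiu raises can be quantified as the overlap between the variant sets each pipeline calls from the same raw data. The sketch below computes that overlap over invented variant tuples; with real pipelines the comparison would also have to normalize variant representation first.

```python
# Sketch of cross-pipeline concordance: the same raw data run through
# different pipelines yields different variant sets, so results are only
# comparable after a unified analysis. The variant tuples are invented.

def concordance(call_sets: list) -> float:
    """Fraction of all called variants that every pipeline agrees on
    (intersection over union of the call sets)."""
    union = set().union(*call_sets)
    shared = set.intersection(*map(set, call_sets))
    return len(shared) / len(union)

pipeline_a = {("chr7", 55259515, "T>G"), ("chr12", 25398284, "C>A")}
pipeline_b = {("chr7", 55259515, "T>G"), ("chr17", 7577121, "G>A")}
print(f"overlap = {concordance([pipeline_a, pipeline_b]):.0%}")
```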
