Next-Generation Sequencing vs. Microarrays
Is it time to switch?
With recent advancements and a radical decline in sequencing costs, the popularity of next-generation sequencing (NGS) has skyrocketed. As costs become less prohibitive and methods become simpler and more widespread, researchers are choosing NGS over microarrays for more of their genomic applications.
The growing maturity of NGS systems and of ancillary components such as library preparation protocols and data analysis tools has certainly contributed to the technology's increasing popularity among the research community. Whether it's a need for more accurate data, better resolution, pressure from granting agencies, or just plain fear of falling behind the technological forefront, it's clear that the demand for revolutionary sequencing technologies that deliver fast, inexpensive, and accurate genomic information has never been greater.
As outlined in a previous article (GEN Sep 1, 2012; Vol. 32, No. 15), NGS technologies have made great strides both economically and technically, and have been gaining in popularity since first appearing on the scene less than a decade ago. With the cost of sequencing a human genome soon to drop to just over $1,000 and a market trend toward cheaper instrumentation, NGS is more affordable than ever, even for projects with the tightest of budgets.
The immense number of journal articles citing NGS technologies sends a clear message to array users that NGS is no longer just for early adopters. Once considered cost prohibitive and technically out of reach, NGS has become a mainstream option for many laboratories, allowing researchers to generate more complete and accurate data than was previously possible with microarrays.
Microarrays’ Proven Track Record
With all the advancements that NGS has made, why is anyone still using microarrays? The answer is, "lots of reasons!" Microarray platforms have a proven track record spanning nearly two decades in the lab. And with practice comes mastery—researchers have grown comfortable both with operating the technology and analyzing the results. Microarrays are generally considered easier to use with less complicated and less labor-intensive sample preparation than NGS. The same goes for data analysis. While there are still many tools for data analysts to choose from, a general consensus has emerged on the major methods for processing the data. And, despite the rapid drop in the cost associated with NGS, arrays are still more economical and yield higher throughput, providing significant advantages when working with a large number of samples.
Time to Make the Switch?
So when is it time to make the switch from microarrays to NGS? What factors should be considered when deciding between the two technologies? While researchers facing such choices may feel overwhelmed, the decision boils down to a few key areas: research goals (e.g., discovery vs. profiling), access to the technology, maturity of the application, cost per sample, and desired throughput. These aspects are addressed separately below for each of the primary genomic applications. For some applications, such as chromatin immunoprecipitation, the transition to NGS is nearly complete, while for others, like cytogenetics, it has barely begun.
While dropping prices and maturing technology are helping NGS become the technology of choice for a wide range of applications, the transition away from microarrays will be a long and varied one. Different applications have different requirements, so researchers need to weigh their options carefully when deciding whether to switch to a new technology or platform. Whichever technology they choose, genomic researchers have never had more options.
Shawn C. Baker, Ph.D., is the CSO of BlueSEQ, an independent guide for researchers outsourcing their next-generation sequencing projects.