When members of the Metabolomics Society congregated in Boston recently for their fourth annual meeting, one topic that generated great interest was sample preparation for mass-spec analysis. While mass spec plays an ever-expanding role in research, medicine, forensics, and industrial quality control, a major challenge is processing samples in a way that avoids contamination, which can drastically degrade a sample's information content.
A. Daniel Jones, Ph.D., professor in the department of biochemistry and molecular biology at Michigan State University, discussed multiple LC/MS/MS platforms for large-scale screening for target metabolites without derivatization. The conventional approach is tedious, employing gas chromatography for separation of samples before mass-spec analysis. Most metabolites require chemical derivatization, along with homogenization, extraction, and partitioning as well as other preparative steps that can raise the cost of processing each sample to more than $5, clearly an unacceptably high figure. Reproducibility of derivatization is suspect, and not all metabolites can be detected.
“For all these reasons, we adopted LC/MS processing for large-scale metabolite profiling,” Dr. Jones explained. “Using microplates, we can process a large number of samples for less than a dollar a sample.”
Dr. Jones and his colleagues have conducted extensive studies on oxylipins and phytohormones in Arabidopsis and other plants using the LC/MS/MS platform, which has allowed them to obtain extensive data on the role of Jasmonic acid, a member of the jasmonate class of plant hormones. It is biosynthesized from linolenic acid by the octadecanoid pathway and plays an important role in tuber formation in potatoes, yams, and onions. It may be converted to derivatives such as methyl jasmonate and may also be conjugated to amino acids.
The team postulated that following wounding in Arabidopsis plants there would be a spike in metabolites of jasmonic acid, as the plant tissue sought to respond to the environmental insult by mobilizing a powerful metabolic response. Indeed, their analysis using LC/MS/MS revealed a wide range of jasmonic acid-amino acid conjugates induced by the wounding process.
In a related study of glucosinolate mutants in Arabidopsis seed extracts, the group faced a potential pitfall of poor chromatographic separation of the minimally retained metabolites. They resolved this problem by modifying the solvent before injection.
“Discovery metabolite profiling is achieved using fast-liquid chromatography separations and TOF-MS, which enables large-scale analysis of metabolic phenotypes,” according to Dr. Jones. “These will serve as a guide to functional genomics. Rapid metabolite profiling will be essential if we are to address spatial resolution and temporal dynamics of the metabolome.”
Quality Control Issues
“The quality of the data is only as good as the sample handling and preparation,” stated John W. Newman, Ph.D., who, with Theresa L. Pedersen, presented their thoughts on many of the pitfalls that workers can encounter while preparing samples for mass-spec analysis. Dr. Newman and Pedersen are research scientists at the USDA Western Human Nutrition Research Center.
Dr. Newman is adamant about quality control when incorporating commercial reagents into a protocol. “These metabolites can be expensive, but this does not mean purchased materials can be trusted,” he emphasized. For example, his team examined the targeted profiles of approximately 100 oxylipins using dilute solutions from a handful of sources as primary calibrants. Strikingly, major discrepancies between their 2003 and 2007 analyte lots were observed, ranging from 8% to 233% of theoretically predicted values. Dr. Newman's moral of the story: always check the quality of new standards against old ones, regardless of the manufacturer's quality-control assessments.
“For minimum quality control requirements you need a set of robust and reproducible, performance-based methods,” Dr. Newman added. “To control for systematic bias in sample handling, randomizing is recommended.
“Reducing sample size is often employed to save reagents and scale down mass-spec analysis. When reducing sample size, you should evaluate how small you can go, minimize the transfer steps, and compare replicate precision with larger samples.”
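The two recommendations above can be sketched in code: randomizing run order so that instrument drift does not track sample groups, and comparing replicate precision (as coefficient of variation) between full-size and reduced-size preparations. This is an illustrative sketch, not the USDA group's actual protocol; the intensity values are made up.

```python
import random
import statistics

def randomized_run_order(sample_ids, seed=None):
    """Shuffle the run order so systematic drift cannot track sample groups."""
    order = list(sample_ids)
    random.Random(seed).shuffle(order)
    return order

def percent_cv(replicates):
    """Coefficient of variation (%) for a set of replicate measurements."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate intensities at full vs. reduced sample size:
full = [10.2, 10.5, 9.9, 10.1]
reduced = [10.6, 9.1, 11.0, 9.7]
print(randomized_run_order(["ctrl1", "ctrl2", "case1", "case2"], seed=7))
print(percent_cv(full) < percent_cv(reduced))  # reduced size is noisier here
```

If the CV at the reduced size is materially worse than at the full size, the scale-down has gone too far.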
Dr. Newman also discussed sample-handling considerations, stressing that metabolite decomposition post-sampling can distort results. Inadvertently introduced biological and chemical factors can destroy analytes during all sample-handling steps. The result will be inaccurate or unreliable data.
Specific examples constitute a rogue's gallery of unwanted changes to the samples. These include inadvertent thawing of samples during processing, which may be counteracted by maintaining samples on dry ice; enzymatic deterioration of samples by lipases, proteases, and cyclooxygenases, which can be blocked with specific inhibitors; and auto-oxidation, for which antioxidants like butylated hydroxytoluene are commonly employed.
Dr. Newman has other recommendations for the handling of serum and urine during processing for mass-spec analysis. These include careful monitoring of hemolysis, control of serum coagulation time, and ultimately the strict regulation of processing time and temperature during sample work up.
Metabolome Coverage in Erythrocytes
Theodore Sana, Ph.D., a researcher at Agilent Technologies, discussed his application of a liquid chromatographic method with multimodal LC/MS detection. As he put it, “the metabolome consists of a wealth of diverse chemical compounds, with a large variety of structures, each displaying unique physico-chemical properties.”
Because of this byzantine complexity, it has so far not been possible to delineate the complete metabolome using a single technique. Indeed, for a broad, untargeted analysis of the metabolome most extraction procedures are not comprehensive enough and do not provide a unifying picture of the range of compounds comprising the metabolome.
So Dr. Sana and his colleagues are approaching the problem on a number of fronts, including sample-preparation improvements: lowering cost, saving time, expanding the use of internal standards (for determining extraction recovery), increasing extraction coverage, and applying new chromatographic separation technologies to the LC/MS platform.
In the Agilent metabolomics workflow for untargeted analysis, increased coverage is obtained by varying the pH of the sample extracts to maximize solubility of compounds in the aqueous phase. Dr. Sana discussed the application of this platform to workflow processing of red blood cell samples in order to maximize metabolite coverage. His team found that no single pH condition recovers all analytes, and no single ionization mode detects all analytes. They observed 2,370 total compounds in their extensive screening program.
A major component of his program is the development of an Accurate Mass Retention Time (AMRT) library for LC/MS analyses that currently includes over 1,000 standards. The library enables compound identification by matching compounds in sample extracts to standards on the basis of accurate neutral mass and retention time.
Using this information, the Agilent group has built a METLIN Personal Database for metabolomics research, covering over 22,000 standards, that can be installed on a PC. It is customizable and works with other Agilent software. In addition, the database can include a molecular formula generator (MFG) that calculates a molecular formula for each compound from its isotope pattern, using abundance and spacing information. Taken together, the mass, retention time, and MFG score can increase the confidence of making the correct identification without having to resort to MS/MS analysis.
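The matching logic behind such a library can be sketched as a lookup with a mass window (in ppm) and a retention-time window. The entries, masses, and tolerances below are illustrative placeholders, not values from the actual AMRT library.

```python
# Sketch of accurate-mass + retention-time library matching.
# Library entries and tolerance windows are illustrative only.
LIBRARY = [
    {"name": "citrate",    "neutral_mass": 192.0270, "rt_min": 1.8},
    {"name": "glutamine",  "neutral_mass": 146.0691, "rt_min": 1.2},
    {"name": "leucine",    "neutral_mass": 131.0946, "rt_min": 3.4},
    {"name": "isoleucine", "neutral_mass": 131.0946, "rt_min": 3.9},  # isomer of leucine
]

def match_feature(mass, rt, mass_ppm=10.0, rt_tol=0.3):
    """Return names of library entries within the mass (ppm) and RT windows."""
    hits = []
    for entry in LIBRARY:
        ppm_err = abs(mass - entry["neutral_mass"]) / entry["neutral_mass"] * 1e6
        if ppm_err <= mass_ppm and abs(rt - entry["rt_min"]) <= rt_tol:
            hits.append(entry["name"])
    return hits

# Leucine and isoleucine share a formula and mass; retention time separates them.
print(match_feature(131.0946, 3.4))  # ['leucine']
```

This also shows why mass alone is not enough: isomers are indistinguishable by accurate mass, and only the retention-time dimension (plus the MFG score) resolves them.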
Dr. Sana provided examples of the performance of the METLIN database using mass, retention time, and MFG score as identifying clues for metabolites. In the case he profiled, several compounds shared the same molecular formula and mass (i.e., they were isomers), but only one had the correct retention time. “We are confident that the compound matched a particular standard in the database based on its mass, retention time, and the quality of the MFG score,” Dr. Sana stated.
Dr. Sana concluded by stating that the Agilent extraction platform provided a significant increase in the coverage of recovered metabolites over neutral pH alone. Subsequently, these metabolites were analyzed using a chromatographic method compatible with multiple ionization modes and polarities. The database is specific to the separation method, and the retention times change if the chromatographic separation is changed. “The feasibility of identifying metabolites using mass, retention time, and isotope pattern is still a work in progress.”
Oliver Fiehn, Ph.D., a faculty member of the University of California, Davis Genome Center, discussed his group’s efforts in developing quality control in GC/MS-based metabolite profiling. Dr. Fiehn described how in gas chromatography metabolites may need to be modified in order to increase their volatility. Thus prepared, liquid samples are injected into a capillary column where they are separated before they are ionized and then introduced into the time-of-flight mass spectrometer. Their passage through the flight tube is measured, with small ions reaching the detector before large ones.
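The time-of-flight principle Dr. Fiehn alludes to follows from equating the acceleration energy zeU with kinetic energy: flight time t = L·√(m / (2zeU)), so arrival time scales with √(m/z) and lighter ions reach the detector first. A minimal sketch, with an assumed 1 m flight path and 20 kV accelerating voltage (illustrative instrument parameters):

```python
import math

def tof_flight_time(mz, flight_path_m=1.0, accel_voltage=20000.0):
    """Ideal linear-TOF flight time: t = L * sqrt(m / (2 z e U)).

    mz is mass-to-charge in daltons per elementary charge.
    Flight path and voltage are illustrative, not a specific instrument's.
    """
    DA_TO_KG = 1.66053906660e-27   # unified atomic mass constant
    E_CHARGE = 1.602176634e-19     # elementary charge (coulombs)
    m_kg = mz * DA_TO_KG
    return flight_path_m * math.sqrt(m_kg / (2 * E_CHARGE * accel_voltage))

# Lighter ions arrive first, and time scales as the square root of m/z:
assert tof_flight_time(100.0) < tof_flight_time(400.0)
print(round(tof_flight_time(400.0) / tof_flight_time(100.0), 3))  # 2.0
```

The square-root scaling is why a four-fold increase in m/z only doubles the flight time.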
According to Dr. Fiehn, the major problem in mass spec-based metabolomics is that the instrument comes into physical contact with a sample that is contaminated and highly heterogeneous. “The sample is ‘dirty’ because metabolomics gears up to detect everything with little or no clean-up steps, and this ‘dirt’ or ‘matrix’ causes major quality problems.”
Dr. Fiehn believes that the hot injection system is the most problematic part in the mechanics of gas chromatography-coupled mass spec. As the samples are introduced into the system through a hot injection, a phenomenon referred to as the Leidenfrost effect occurs: a liquid in near contact with a mass significantly hotter than its boiling point produces an insulating vapor layer that keeps the liquid from boiling rapidly.
“The droplet becomes flat as a small disk and hovers a fraction of a millimeter above the plate. It may move nervously, jumping around the hot griddle. Nebulization does not occur with fast-injection autosamplers,” stated Konrad Grob, chief of the GC Department at the Kantonales Laboratory in Zürich, Switzerland.
The Fiehn team dealt with these issues through a number of strategies. Among these were cleaning or exchanging the syringe needles, replacing the injector, and changing the gold plate on which the samples are sprayed.
“We believe that with appropriate quality control, GC/MS is the method of choice for primary metabolism; indeed, even amino acids can be quantified using daily QC calibrations,” Dr. Fiehn concluded.
Ron Bonner, Ph.D., and his analytical team at Applied Biosystems/MDS Analytical Technologies have been grappling with artifacts introduced into LC/MS-based analysis. He introduced his presentation by stating, “There is one source of expected (and useful) variance, but many sources of unexpected, confounding variance. While they may appear to be real, they may be marauders that mask the authentic distinguishing features of the data set.”
Variation may be introduced into samples by the collection process, by improper handling and storage, or by the limited long-term stability of the experimental material. Another source of variation is instrumentation failure: sensitivity drift, carryover, contamination (build-up or clear-out), and retention time shifts are all potential problems.
When the time for data analysis arrives, peak determination and failure to align the data properly may contribute to unwanted variability in the outcome. Finally, biological variation including presence of xenobiotics along with unaccounted differences in gender, age, diurnality, and individual variability all may introduce confounding variability into the assessment process.
To deal with this myriad of challenges, Dr. Bonner and his associates have developed a software program, MarkerView™, designed to handle many of these problems. Its general features include the ability to go from data to PCA quickly within one program, feature extraction and selection, retention time alignment, normalization, and scaling. The software also provides a variety of other features for crunching and managing the data.
Dr. Bonner also presented another software tool for identifying and eliminating spurious variance in mass-spec data sets. Principal Component Variable Grouping is a tool for data interpretation and visualization. It uses the samples to find correlated variables, identifying peaks that come from the same compound or share similar expression profiles. Correlated variables are assigned to a group, and the same symbols are used for group members in the loadings and profile plots.
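The grouping idea can be illustrated with a simple correlation-based sketch: variables (peaks) whose intensity profiles across samples are highly correlated are clustered together, since isotopes and adducts of one compound rise and fall in unison. This greedy sketch is an assumption-laden simplification, not the PCVG algorithm itself, and the intensity data are invented.

```python
import math

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def group_correlated(profiles, threshold=0.95):
    """Greedily group variables whose across-sample profiles correlate.

    Peaks from the same compound (isotopes, adducts, fragments) tend to
    rise and fall together across samples, so they land in one group.
    """
    groups = []
    for name, prof in profiles.items():
        for g in groups:
            if pearson(prof, profiles[g[0]]) >= threshold:
                g.append(name)
                break
        else:
            groups.append([name])
    return groups

# Made-up intensities for three peaks across five samples:
# mz181 and mz182 track each other (an isotope pair); mz250 does not.
profiles = {
    "mz181": [100, 220, 150, 300, 90],
    "mz182": [11, 23, 16, 31, 10],
    "mz250": [500, 480, 510, 150, 700],
}
print(group_correlated(profiles))  # [['mz181', 'mz182'], ['mz250']]
```

Collapsing each such group to a single representative variable is one way the "marauder" peaks stop inflating the apparent number of distinguishing features.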
Dr. Bonner provided examples of how the application of his tools can clean up messy data sets and transform them into manageable, meaningful collections of information from which solid scientific conclusions may be drawn. These included mass-spec analysis of individual samples of 14 different types of fruit and a study of the metabolic distribution of the drug vinpocetine in rats.
The fact that so much of the symposium was given over to a discussion of the many ways in which investigators can go astray suggests apprehension within the field over the quality standards of mass spec as applied to metabolomics. Neglect of these admonitions concerning quality control and rigorous adherence to analytical standards may go a long way toward accounting for the failure of metabolomics programs to come up with viable drug candidates and workable therapies; heeding them could go a long way toward changing that.