
Blood test uses DNA strands of dying cells

Curators:  Larry H. Bernstein, MD, FCAP and Aviva Lev-Ari, PhD, RN



Hadassah-Developed Blood Test Detects Multiple Sclerosis, Cancer & Brain Damage

A new blood test that uses the DNA strands of dying cells to detect diabetes, cancer, traumatic brain injury, and neurodegenerative disease has been developed by researchers at Hadassah Medical Organization (HMO) and The Hebrew University.

In a study involving 320 patients, the researchers were able to infer cell death in specific tissues by looking at the unique chemical modifications (called methylation patterns) of circulating DNA that these dying cells release. Previously, it had not been possible to measure cell death in specific human tissues non-invasively.

The findings are reported in the March 14, 2016 online edition of the Proceedings of the National Academy of Sciences USA, in an article entitled “Identification of tissue-specific cell death using methylation patterns of circulating DNA.” Prof. Benjamin Glaser, head of Endocrinology at Hadassah, and Dr. Ruth Shemer and Prof. Yuval Dor of The Hebrew University of Jerusalem led an international team in performing the groundbreaking research.

Cell death is a central feature in health and disease. It can signify the early stages of pathology (e.g. a developing tumor or the beginning of an autoimmune or neurodegenerative disease); it can illuminate whether a disease has progressed and whether a particular treatment, such as chemotherapy, is working; and it can alert physicians to unintended toxic effects of treatment or the early rejection of a transplant.

As the researchers relate: “The approach can be adapted to identify cfDNA (cell-free circulating DNA) derived from any cell type in the body, offering a minimally invasive window for diagnosing and monitoring a broad spectrum of human pathologies as well as providing a better understanding of normal tissue dynamics.”

“In the long run,” notes Prof. Glaser, “we envision a new type of blood test aimed at the sensitive detection of tissue damage, even without a-priori suspicion of disease in a specific organ. We believe that such a tool will have broad utility in diagnostic medicine and in the study of human biology.”

The research was performed by Hebrew University students Roni Lehmann-Werman, Daniel Neiman, Hai Zemmour, Joshua Moss and Judith Magenheim, aided by clinicians and scientists from Hadassah Medical Center, Sheba Medical Center, and from institutions in Germany, Sweden, the USA and Canada, who provided precious blood samples from patients.

Scientists have known for decades that dying cells release fragmented DNA into the blood; however, since the DNA sequence of all cells in the body is identical, it had not been possible to determine the tissue of origin of the circulating DNA. Knowing that each cell type carries a unique DNA methylation pattern, and that these methylation patterns account for the identity of cells, the researchers were able to use methylated DNA sequences as biomarkers to trace circulating DNA back to its tissue of origin and thereby identify a specific pathology. For example, they detected evidence of pancreatic beta-cell death in the blood of patients with new-onset type 1 diabetes, oligodendrocyte death in patients with relapsing multiple sclerosis, brain cell death in patients after traumatic or ischemic brain damage, and exocrine pancreatic cell death in patients with pancreatic cancer or pancreatitis.
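The marker-based readout described above can be sketched in a few lines: for each tissue-specific marker locus, count the fraction of bisulfite-sequencing reads whose CpG calls match that tissue's signature, then scale by total cfDNA. This is a hypothetical simplification, not the authors' actual pipeline; the read encoding and numbers are illustrative assumptions.

```python
def marker_fraction(reads, signature):
    """Fraction of reads at a marker locus whose CpG methylation calls
    match a tissue-specific signature. Each read is a string over
    {'M', 'U'} (methylated / unmethylated) at the marker's CpG sites."""
    if not reads:
        return 0.0
    return sum(1 for r in reads if r == signature) / len(reads)

def tissue_cfdna(fraction, total_cfdna_ng_per_ml):
    """Scale the marker fraction by total cfDNA concentration to estimate
    the absolute amount of circulating DNA attributable to the tissue."""
    return fraction * total_cfdna_ng_per_ml

# Toy example: a beta-cell marker that is fully unmethylated ('UUU')
# only in beta cells; 3 of 5 reads carry the beta-cell signature.
reads = ["UUU", "MMM", "UUU", "MMU", "UUU"]
frac = marker_fraction(reads, "UUU")
print(frac)                      # 0.6
print(tissue_cfdna(frac, 10.0))  # 6.0 (ng/ml attributed to beta cells)
```

In the real study a panel of such markers, rather than a single locus, is interrogated, which is what makes the signal tissue specific.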

Support for the research came from the Juvenile Diabetes Research Foundation, the Human Islet Research Network of the National Institutes of Health, the Sir Zalman Cowen Universities Fund, the DFG (a Trilateral German-Israel-Palestine program), and the Soyka pancreatic cancer fund.

 Identification of tissue-specific cell death using methylation patterns of circulating DNA.
Minimally invasive detection of cell death could prove an invaluable resource in many physiologic and pathologic situations. Cell-free circulating DNA (cfDNA) released from dying cells is emerging as a diagnostic tool for monitoring cancer dynamics and graft failure. However, existing methods rely on differences in DNA sequences in source tissues, so that cell death cannot be identified in tissues with a normal genome. We developed a method of detecting tissue-specific cell death in humans based on tissue-specific methylation patterns in cfDNA. We interrogated tissue-specific methylome databases to identify cell type-specific DNA methylation signatures and developed a method to detect these signatures in mixed DNA samples. We isolated cfDNA from plasma or serum of donors, treated the cfDNA with bisulfite, PCR-amplified the cfDNA, and sequenced it to quantify cfDNA carrying the methylation markers of the cell type of interest. Pancreatic β-cell DNA was identified in the circulation of patients with recently diagnosed type-1 diabetes and islet-graft recipients; oligodendrocyte DNA was identified in patients with relapsing multiple sclerosis; neuronal/glial DNA was identified in patients after traumatic brain injury or cardiac arrest; and exocrine pancreas DNA was identified in patients with pancreatic cancer or pancreatitis. This proof-of-concept study demonstrates that the tissue origins of cfDNA and thus the rate of death of specific cell types can be determined in humans. The approach can be adapted to identify cfDNA derived from any cell type in the body, offering a minimally invasive window for diagnosing and monitoring a broad spectrum of human pathologies as well as providing a better understanding of normal tissue dynamics.

While the findings are impressively organ specific, the authors did not directly prove that the DNA came from an actual dying cell. For example, you would want to see elevated troponin levels before attributing circulating DNA to injured myocardium. Likewise, for the brain-related cases you would want corroboration from imaging or, though impractical, a brain biopsy. The DNA-spiking experiment was clever, though. Also, what is the turnaround time for this test in practical use?

Larry HB

Very good comment. I was reluctant to put this up, but it was of interest and published in PNAS. Perhaps I can find more information. Troponin levels would remain elevated for about 48 hours, longer than CK and comparable to LD. What about the natriuretic peptides?

Glutamine and cancer: cell biology, physiology, and clinical opportunities

Christopher T. Hensley, Ajla T. Wasti, et al.

J Clin Invest 2013

Glutamine is an abundant and versatile nutrient that participates in energy formation, redox homeostasis, macromolecular synthesis, and signaling in cancer cells. These characteristics make glutamine metabolism an appealing target for new clinical strategies to detect, monitor, and treat cancer. Here we review the metabolic functions of glutamine as a super nutrient and the surprising roles of glutamine in supporting the biological hallmarks of malignancy. We also review recent efforts in imaging and therapeutics to exploit tumor cell glutamine dependence, discuss some of the challenges in this arena, and suggest a disease-focused paradigm to deploy these emerging approaches.

It has been nearly a century since the discovery that tumors display metabolic activities that distinguish them from differentiated, non-proliferating tissues and presumably contribute to their supraphysiological survival and growth (1). Interest in cancer metabolism was boosted by discoveries that oncogenes and tumor suppressors could regulate nutrient metabolism, and that mutations in some metabolic enzymes participate in the development of malignancy (2, 3). The persistent appeal of cancer metabolism as a line of investigation lies both in its ability to uncover fundamental aspects of malignancy and in the translational potential of exploiting cancer metabolism to improve the way we diagnose, monitor, and treat cancer. Furthermore, an improved understanding of how altered metabolism contributes to cancer has a high potential for synergy with translational efforts. For example, the demonstration that asparagine is a conditionally essential nutrient in rapidly growing cancer cells paved the way for L-asparaginase therapy in leukemia. Additionally, the avidity of some tumors for glucose uptake led to the development of 18F-fluoro-2-deoxyglucose (FDG) imaging by PET; this in turn stimulated hundreds of studies on the biological underpinnings of tumor glucose metabolism.

There continue to be large gaps in understanding which metabolic pathways are altered in cancer, whether these alterations benefit the tumor in a substantive way, and how this information could be used in clinical oncology. In this Review, we consider glutamine, a highly versatile nutrient whose metabolism has implications for tumor cell biology, metabolic imaging, and perhaps novel therapeutics.

Glutamine in intermediary metabolism

Glutamine metabolism has been reviewed extensively and is briefly outlined here (4, 5). The importance of glutamine as a nutrient in cancer derives from its abilities to donate its nitrogen and carbon into an array of growth-promoting pathways (Figure 1). At concentrations of 0.6–0.9 mmol/l, glutamine is the most abundant amino acid in plasma (6). Although most tissues can synthesize glutamine, during periods of rapid growth or other stresses, demand outpaces supply, and glutamine becomes conditionally essential (7). This requirement for glutamine is particularly true in cancer cells, many of which display oncogene-dependent addictions to glutamine in culture (8). Glutamine catabolism begins with its conversion to glutamate in reactions that either donate the amide nitrogen to biosynthetic pathways or release it as ammonia. The latter reactions are catalyzed by the glutaminases (GLSs), of which several isozymes are encoded by the human genes GLS and GLS2 (9). Classical studies revealed that GLS isozymes, particularly those encoded by GLS, are expressed in experimental tumors in rats and mice, where their enzyme activity correlates with growth rate and malignancy. Silencing GLS expression or inhibiting GLS activity is sufficient to delay tumor growth in a number of models (10–13). The role of GLS2 in cancer appears to be context specific and regulated by factors that are still incompletely characterized. In some tissues, GLS2 is a p53 target gene and seems to function in tumor suppression (14). On the other hand, GLS2 expression is enhanced in some neuroblastomas, where it contributes to cell survival (15). These observations, coupled with the demonstration that c-Myc stimulates GLS expression (12, 16), position at least some of the GLS isozymes as pro-oncogenic.

Figure 1. Glutamine metabolism as a target for diagnostic imaging and therapy in cancer. Glutamine is imported via SLC1A5 and other transporters, then enters a complex metabolic network by which its carbon and nitrogen are supplied to pathways that promote cell survival and growth. Enzymes discussed in the text are shown in green, and inhibitors that target various aspects of glutamine metabolism are shown in red. Green arrows denote reductive carboxylation. 18F-labeled analogs of glutamine are also under development as PET probes for localization of tumor tissue. AcCoA, acetyl-CoA; DON, 6-diazo-5-oxo-L-norleucine; GSH, glutathione; NEAA, nonessential amino acids; ME, malic enzyme; OAA, oxaloacetate; TA, transaminase; 968, compound 968; α-KG, α-ketoglutarate.

Glutamate, the product of the GLS reaction, is a precursor of glutathione, the major cellular antioxidant. It is also the source of amino groups for nonessential amino acids like alanine, aspartate, serine, and glycine, all of which are required for macromolecular synthesis. In glutamine-consuming cells, glutamate is also the major source of α-ketoglutarate, a TCA cycle intermediate and substrate for dioxygenases that modify proteins and DNA. These dioxygenases include prolyl hydroxylases, histone demethylases, and 5-methylcytosine hydroxylases. Their requirement for α-ketoglutarate, although likely accounting for only a small fraction of total α-ketoglutarate utilization, makes this metabolite an essential component of cell signaling and epigenetic networks.

Conversion of glutamate to α-ketoglutarate occurs either through oxidative deamination by glutamate dehydrogenase (GDH) in the mitochondrion or by transamination to produce nonessential amino acids in either the cytosol or the mitochondrion. During avid glucose metabolism, the transamination pathway predominates (17). When glucose is scarce, GDH becomes a major pathway to supply glutamine carbon to the TCA cycle, and is required for cell survival (17, 18). Metabolism of glutamine-derived α-ketoglutarate in the TCA cycle serves several purposes: it generates reducing equivalents for the electron transport chain (ETC) and oxidative phosphorylation, becoming a major source of energy (19); and it is an important anaplerotic nutrient, feeding net production of oxaloacetate to offset export of intermediates from the cycle to supply anabolism (20). Glutamine oxidation also supports redox homeostasis by supplying carbon to malic enzyme, some isoforms of which produce NADPH (Figure 1). In KRAS-driven pancreatic adenocarcinoma cells, a pathway involving glutamine-dependent NADPH production is essential for redox balance and growth (21). In these cells, glutamine is used to produce aspartate in the mitochondria. This aspartate is then trafficked to the cytosol, where it is deaminated to produce oxaloacetate and then malate, the substrate for malic enzyme.

Recent work has uncovered an unexpected role for glutamine in cells with reduced mitochondrial function. Despite glutamine’s conventional role as a respiratory substrate, several studies demonstrated a persistence of glutamine dependence in cells with permanent mitochondrial dysfunction from mutations in the ETC or TCA cycle, or transient impairment secondary to hypoxia (22–25). Under these conditions, glutamine-derived α-ketoglutarate is reductively carboxylated by NADPH-dependent isoforms of isocitrate dehydrogenase to produce isocitrate, citrate, and other TCA cycle intermediates (Figure 1). These conditions broaden glutamine’s utility as a carbon source because it becomes not only a major source of oxaloacetate, but also generates acetyl-CoA in what amounts to a striking rewiring of TCA cycle metabolism.

Glutamine promotes hallmarks of malignancy

Deregulated energetics. One hallmark of cancer cells is aberrant bioenergetics (26). Glutamine’s involvement in the pathways outlined above contributes to a phenotype conducive to energy formation, survival, and growth. In addition to its role in mitochondrial metabolism, glutamine also suppresses expression of thioredoxin-interacting protein, a negative regulator of glucose uptake (27). Thus, glutamine contributes to both of the energy-forming pathways in cancer cells: oxidative phosphorylation and glycolysis. Glutamine also modulates hallmarks not traditionally thought to be metabolic, as outlined below. These interactions highlight the complex interplay between glutamine metabolism and many aspects of cell biology.

Sustaining proliferative signaling. Pathological cancer cell growth relies on maintenance of proliferative signaling pathways with increased autonomy relative to non-malignant cells. Several lines of evidence argue that glutamine reinforces activity of these pathways. In some cancer cells, excess glutamine is exported in exchange for leucine and other essential amino acids. This exchange facilitates activation of the serine/threonine kinase mTOR, a major positive regulator of cell growth (28). In addition, glutamine-derived nitrogen is a component of amino sugars, known as hexosamines, that are used to glycosylate growth factor receptors and promote their localization to the cell surface. Disruption of hexosamine synthesis reduces the ability to initiate signaling pathways downstream of growth factors (29).

Enabling replicative immortality. Some aspects of glutamine metabolism oppose senescence and promote replicative immortality in cultured cells. In IMR90 lung fibroblasts, silencing either of two NADPH-generating isoforms of malic enzyme (ME1, ME2) rapidly induced senescence, while malic enzyme overexpression suppressed senescence (30). Both malic enzyme isoforms are repressed at the transcriptional level by p53 and contribute to enhanced levels of glutamine consumption and NADPH production in p53-deficient cells. The ability of p53-replete cells to resist senescence required the expression of ME1 and ME2, and silencing either enzyme reduced the growth of TP53+/+ and, to a lesser degree, TP53–/– tumors (30). These observations position malic enzymes as potential therapeutic targets.

Resisting cell death. Although many cancer cells require glutamine for survival, cells with enhanced expression of Myc oncoproteins are particularly sensitive to glutamine deprivation (8, 12, 16). In these cells, glutamine deprivation induces depletion of TCA cycle intermediates, depression of ATP levels, delayed growth, diminished glutathione pools, and apoptosis. Myc drives glutamine uptake and catabolism by activating the expression of genes involved in glutamine metabolism, including GLS and SLC1A5, which encodes the Na+-dependent amino acid transporter ASCT2 (12, 16). Silencing GLS mimicked some of the effects of glutamine deprivation, including growth suppression in Myc-expressing cells and tumors (10, 12). MYCN amplification occurs in 20%–25% of neuroblastomas and is correlated with poor outcome (31). In cells with high N-Myc levels, glutamine deprivation triggered an ATF4-dependent induction of apoptosis that could be prevented by restoring downstream metabolites oxaloacetate and α-ketoglutarate (15). In this model, pharmacological activation of ATF4, inhibition of glutamine metabolic enzymes, or combinations of these treatments mimicked the effects of glutamine deprivation in cells and suppressed growth of MYCN-amplified subcutaneous and transgenic tumors in mice.

The PKC isoform PKC-ζ also regulates glutamine metabolism. Loss of PKC-ζ enhances glutamine utilization and enables cells to survive glucose deprivation (32). This effect requires flux of carbon and nitrogen from glutamine into serine. PKC-ζ reduces the expression of phosphoglycerate dehydrogenase, an enzyme required for glutamine-dependent serine biosynthesis, and also phosphorylates and inactivates this enzyme. Thus, PKC-ζ loss, which promotes intestinal tumorigenesis in mice, enables cells to alter glutamine metabolism in response to nutrient stress.

Invasion and metastasis. Loss of the epithelial cell-cell adhesion molecule E-cadherin is a component of the epithelial-mesenchymal transition, and is sufficient to induce migration, invasion, and tumor progression (33, 34). Addiction to glutamine may oppose this process because glutamine favors stabilization of tight junctions in some cells (35). Furthermore, the selection of breast cancer cells with the ability to grow without glutamine yielded highly adaptable subpopulations with enhanced mesenchymal marker expression and improved capacity for anchorage-independent growth, therapeutic resistance, and metastasis in vivo (36). It is unknown whether this result reflects a primary role for glutamine in suppressing these markers of aggressiveness in breast cancer, or whether prolonged glutamine deprivation selects for cells with enhanced fitness across a number of phenotypes.

Organ-specific glutamine metabolism in health and disease

As a major player in carbon and nitrogen transport, glutamine metabolism displays complex inter-organ dynamics, with some organs functioning as net producers and others as consumers (Figure 2). Organ-specific glutamine metabolism has frequently been studied in humans and animal models by measuring the arteriovenous difference in plasma glutamine abundance. In healthy subjects, the plasma glutamine pool is largely the result of release from skeletal muscle (37–39). In rats, the lungs are comparable to muscle in terms of glutamine production (40, 41), and human lungs also have the capacity for marked glutamine release, although such release is most prominent in times of stress (42, 43). Stress-induced release from the lung is regulated by an induction of glutamine synthetase expression as a consequence of glucocorticoid signaling and other mechanisms (44, 45). Although this results in a small arteriovenous difference, the overall release of glutamine is significant because of the large pulmonary perfusion. In rats and humans, adipose tissue is a minor but potentially important source of glutamine (46, 47). The liver has the capacity to synthesize or catabolize glutamine, with these activities subject both to regional heterogeneity among hepatocytes and to the regulatory effects of systemic acidosis and hyperammonemia. However, the liver does not appear to be a major contributor to the plasma glutamine pool in healthy rats and humans (39, 48, 49).
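The arteriovenous-difference measurements described above reduce to simple arithmetic: net organ exchange equals the concentration difference across the organ times plasma flow. The sketch below uses hypothetical concentrations and flow rates purely for illustration; they are not values from the cited studies.

```python
def net_glutamine_flux(c_arterial, c_venous, plasma_flow):
    """Net organ glutamine exchange (mmol/min) from the arteriovenous
    difference. Concentrations in mmol/l, plasma flow in l/min;
    positive = net uptake, negative = net release."""
    return (c_arterial - c_venous) * plasma_flow

# A tiny A-V difference across a highly perfused organ (as described
# for the lung) can still amount to a large absolute release:
print(round(net_glutamine_flux(0.60, 0.62, 5.0), 3))  # -0.1 (net release)
# A larger A-V difference at low flow yields a smaller absolute flux:
print(round(net_glutamine_flux(0.60, 0.55, 0.5), 3))  # 0.025 (net uptake)
```

This is why the text notes that pulmonary glutamine release is significant despite the small arteriovenous difference: the flow term dominates.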

Figure 2. Model for inter-organ glutamine metabolism in health and cancer. Organs that release glutamine into the bloodstream are shown in green, and those that consume glutamine are in red; the shade denotes magnitude of consumption/release. For some organs (liver, kidneys), evidence from model systems and/or human studies suggests that there is a change in net glutamine flux during tumorigenesis.

Glutamine consumption occurs largely in the gut and kidney. The organs of the gastrointestinal tract drained by the portal vein, particularly the small intestine, are major consumers of plasma glutamine in both rats and humans (37, 38, 49, 50). Enterocytes oxidize more than half of glutamine carbon to CO2, accounting for a third of the respiration of these cells in fasting animals (51). The kidney consumes net quantities of glutamine to maintain acid-base balance (37, 38, 52, 53). During acidosis, the kidneys substantially increase their uptake of glutamine, cleaving it by GLS to produce ammonia, which is excreted along with organic acids to maintain physiologic pH (52, 54). Glutamine is also a major metabolic substrate in lymphocytes and macrophages, at least during mitogenic stimulation of primary cells in culture (55–57).

Importantly, cancer seems to cause major changes in inter-organ glutamine trafficking (Figure 2). Currently, much work in this area is derived from studies in methylcholanthrene-induced fibrosarcoma in the rat, a model of an aggressively growing, glutamine-consuming tumor. In this model, fibrosarcoma induces skeletal muscle expression of glutamine synthetase and greatly increases the release of glutamine into the circulation. As the tumor increases in size, intramuscular glutamine pools are depleted in association with loss of lean muscle mass, mimicking the cachectic phenotype of humans in advanced stages of cancer (52). Simultaneously, both the liver and the kidneys become net glutamine exporters, although the hepatic effect may be diminished as the tumor size becomes very large (48, 49, 52). Glutamine utilization by organs supplied by the portal vein is diminished in cancer (48). In addition to its function as a nutrient for the tumor itself, and possibly for cancer-associated immune cells, glutamine provides additional, indirect metabolic benefits to both the tumor and the host. For example, glutamine was used as a gluconeogenic substrate in cachectic mice with large orthotopic gliomas, providing a significant source of carbon in the plasma glucose pool (58). This glucose was taken up and metabolized by the tumor to produce lactate and to supply the TCA cycle.

It will be valuable to extend work in human inter-organ glutamine trafficking, both in healthy subjects and in cancer patients. Such studies will likely produce a better understanding of the pathophysiology of cancer cachexia, a major source of morbidity and mortality. Research in this area should also aid in the anticipation of organ-specific toxicities of drugs designed to interfere with glutamine metabolism. Alterations of glutamine handling in cancer may induce a different spectrum of toxicities compared with healthy subjects.

Tumors differ according to their need for glutamine

One important consideration is that not all cancer cells need an exogenous supply of glutamine. A panel of lung cancer cell lines displayed significant variability in their response to glutamine deprivation, with some cells possessing almost complete independence (59). Breast cancer cells also demonstrate systematic differences in glutamine dependence, with basal-type cells tending to be glutamine dependent and luminal-type cells tending to be glutamine independent (60). Resistance to glutamine deprivation is associated with the ability to synthesize glutamine de novo and/or to engage alternative pathways of anaplerosis (10, 60).

Tumors also display variable levels of glutamine metabolism in vivo. A study of orthotopic gliomas revealed that genetically diverse, human-derived tumors took up glutamine in the mouse brain but did not catabolize it (58). Rather, the tumors synthesized glutamine de novo and used pyruvate carboxylation for anaplerosis. Cells derived from these tumors did not require glutamine to survive or proliferate when cultured ex vivo. Glutamine synthesis from glucose was also a prominent feature of primary gliomas in human subjects infused with 13C-glucose at the time of surgical resection (61). Furthermore, an analysis of glutamine metabolism in lung and liver tumors revealed that both the tissue of origin and the oncogene influence whether the tumor produces or consumes glutamine (62). MET-induced hepatic tumors produced glutamine, whereas Myc-induced liver tumors catabolized it. In the lung, however, Myc expression was associated with glutamine accumulation.

This variability makes it imperative to develop ways to predict which tumors have the highest likelihood of responding to inhibitors of glutamine metabolism. Methods to image or otherwise quantify glutamine metabolism in vivo would be useful in this regard (63). Infusions of pre-surgical subjects with isotopically labeled glutamine, followed by extraction of metabolites from the tumor and analysis of 13C enrichment, can be used to detect both glutamine uptake and catabolism (58, 62). However, this approach requires a specimen of the tumor to be obtained. Approaches for glutamine-based imaging, which avoid this problem, include a number of glutamine analogs compatible with PET. Although glutamine could in principle be imaged using the radioisotopes 11C, 13N, or 18F, the relatively long half-life of the latter increases its appeal. In mice, 18F-(2S,4R)-4-fluoroglutamine is avidly taken up by tumors derived from highly glutaminolytic cells, and by glutamine-consuming organs including the intestine, kidney, liver, and pancreas (64). Labeled analogs of glutamate are also taken up by some tumors (65, 66). One of these, (4S)-4-(3-[18F]fluoropropyl)-L-glutamate (18F-FSPG, also called BAY 94-9392), was evaluated in small clinical trials involving patients with several types of cancer (65, 67). This analog enters the cell through the cystine/glutamate exchange transporter (the xC– transport system), which is linked to glutathione biosynthesis (68). The analog was well tolerated, with high tumor detection rates and good tumor-to-background ratios in hepatocellular carcinoma and lung cancer.
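The half-life argument above can be made concrete with the standard decay law, A(t) = A0 · 2^(−t/t½). Using approximate published half-lives (about 20.4 min for 11C, 10.0 min for 13N, and 109.8 min for 18F), the fraction of activity surviving a 60-minute synthesis-and-imaging window differs by more than an order of magnitude between 18F and the other two isotopes:

```python
# Approximate published half-lives (minutes) of PET-compatible isotopes.
HALF_LIFE_MIN = {"C-11": 20.4, "N-13": 10.0, "F-18": 109.8}

def fraction_remaining(isotope, t_min):
    """Fraction of initial radioactivity left after t minutes: 2**(-t/t_half)."""
    return 2.0 ** (-t_min / HALF_LIFE_MIN[isotope])

for iso in HALF_LIFE_MIN:
    print(iso, round(fraction_remaining(iso, 60.0), 3))
# F-18 retains roughly two-thirds of its activity after an hour,
# while C-11 and N-13 have largely decayed away.
```

This is the practical reason 18F-labeled glutamine analogs are favored: tracer synthesis, transport, and image acquisition all fit comfortably within a few half-lives.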

PET approaches detect analog uptake and retention but cannot provide information about downstream metabolism. Analysis of hyperpolarized nuclei can provide a real-time view of enzyme-catalyzed reactions. This technique involves redistribution of the populations of energy levels of a nucleus (e.g., 13C, 15N), resulting in a gain in magnetic resonance signal that can temporarily exceed 10,000-fold (69). This gain in signal enables rapid detection of both the labeled molecule and its downstream metabolites. Glutamine has been hyperpolarized on 15N and 13C (70, 71). In the latter case, the conversion of hyperpolarized glutamine to glutamate could be detected in intact hepatoma cells (70). If these analogs are translated to clinical studies, they might provide a dynamic view of the proximal reactions of glutaminolysis in vivo.

Pharmacological strategies to inhibit glutamine metabolism in cancer

Efforts to inhibit glutamine metabolism using amino acid analogs have an extensive history, including evaluation in clinical trials. Acivicin, 6-diazo-5-oxo-L-norleucine, and azaserine, three of the most widely studied analogs (Figure 1), all demonstrated variable degrees of gastrointestinal toxicity, myelosuppression, and neurotoxicity (72). Because these agents non-selectively target glutamine-consuming processes, recent interest has focused on developing methods directed at specific nodes of glutamine metabolism. First, ASCT2, the Na+-dependent neutral amino acid transporter encoded by SLC1A5, is broadly expressed in lung cancer cell lines and accounts for a majority of glutamine transport in those cells (Figure 1). It has been shown that γ-L-glutamyl-p-nitroanilide (GPNA) inhibits this transporter and limits lung cancer cell growth (73). Additional interest in GPNA lies in its ability to enhance the uptake of drugs imported via the monocarboxylate transporter MCT1. Suppressing glutamine uptake with GPNA enhances MCT1 stability and stimulates uptake of the glycolytic inhibitor 3-bromopyruvate (3-BrPyr) (74, 75). Because enforced MCT1 overexpression is sufficient to sensitize tumor xenografts to 3-BrPyr (76), GPNA may have a place in 3-BrPyr–based therapeutic regimens.

Two inhibitors of GLS isoforms have been characterized in recent years (Figure 1). Compound 968, an inhibitor of the GLS-encoded splice isoform GAC, inhibits the transformation of fibroblasts by oncogenic RhoGTPases and delays the growth of GLS-expressing lymphoma xenografts (13). Bis-2-(5-phenylacetamido-1,2,4-thiadiazol-2-yl)ethyl sulfide (BPTES) also potently inhibits GLS isoforms encoded by GLS (77). BPTES impairs ATP levels and growth rates of P493 lymphoma cells under both normoxic and hypoxic conditions and suppresses the growth of P493-derived xenografts (78).

Evidence also supports a role for targeting the flux from glutamate to α-ketoglutarate, although no potent, specific inhibitors of these enzymes are yet available for use in intact cells. Aminooxyacetate (AOA) inhibits aminotransferases non-specifically, but millimolar doses are typically required to achieve this effect in cultured cells (Figure 1). Nevertheless, AOA has demonstrated efficacy in both breast adenocarcinoma xenografts and autochthonous neuroblastomas in mice (15, 79). Epigallocatechin gallate (EGCG), a green tea polyphenol, has numerous pharmacological effects, one of which is to inhibit GDH (80). The effects of EGCG on GDH have been used to kill glutamine-addicted cancer cells during glucose deprivation or glycolytic inhibition (17, 18) and to suppress growth of neuroblastoma xenografts (15).

A paradigm to exploit glutamine metabolism in cancer

Recent advances in glutamine-based imaging, coupled with the successful application of glutamine metabolic inhibitors in mouse models of cancer, make it possible to conceive of treatment plans that feature consideration of tumor glutamine utilization. A key challenge will be predicting which tumors are most likely to respond to inhibitors of glutamine metabolism. Neuroblastoma is used here as an example of a tumor in which evidence supports the utility of strategies that would involve both glutamine-based imaging and therapy (Figure 3). Neuroblastoma is the second most common extracranial solid malignancy of childhood. High-risk neuroblastoma is defined by age, stage, and biological features of the tumor, including MYCN amplification, which occurs in some 20%–25% of cases (31). Because MYCN-amplified tumor cells require glutamine catabolism for survival and growth (15), glutamine-based PET at the time of standard diagnostic imaging could help predict which tumors would be likely to respond to inhibitors of glutamine metabolism. Infusion of 13C-glutamine coordinated with the diagnostic biopsy could then enable inspection of 13C enrichment in glutamine-derived metabolites from the tumor, confirming the activity of glutamine catabolic pathways. Following on evidence from mouse models of neuroblastoma, treatment could then include agents directed against glutamine catabolism (15). Of note, some tumors were sensitive to the ATF4 agonist fenretinide (FRT), alone or in combination with EGCG. Importantly, FRT has already been the focus of a Phase I clinical trial in children with solid tumors, including neuroblastoma, and was fairly well tolerated (81).

Figure 3. A strategy to integrate glutamine metabolism into the diagnosis, classification, treatment, and monitoring of neuroblastoma. Neuroblastoma commonly presents in children as an abdominal mass. A standard evaluation of a child with suspected neuroblastoma includes measurement of urine catecholamines, a bone scan, and full-body imaging with meta-iodobenzylguanidine (MIBG), all of which contribute to diagnosis and disease staging. In animal models, a subset of these tumors requires glutamine metabolism. This finding implies that approaches to image, quantify, or block glutamine metabolism (highlighted in red) in human neuroblastoma could be incorporated into the diagnosis and management of this disease. In particular, glutamine metabolic studies may help predict which tumors would respond to therapies targeting glutamine metabolism. HVA, homovanillic acid; VMA, vanillylmandelic acid.


Glutamine is a versatile nutrient required for the survival and growth of a potentially large subset of tumors. Work over the next several years should produce a more accurate picture of the molecular determinants of glutamine addiction and the identification of death pathways that execute cells when glutamine catabolism is impaired. Advancement of glutamine-based imaging into clinical practice should soon make it possible to differentiate tumors that take up glutamine from those that do not. Finally, the development of safe, high-potency inhibitors of key metabolic nodes should facilitate therapeutic regimens featuring inhibition of glutamine metabolism.

Therapeutic strategies impacting cancer cell glutamine metabolism

The metabolic adaptations that support oncogenic growth can also render cancer cells dependent on certain nutrients. Along with the Warburg effect, increased utilization of glutamine is one of the metabolic hallmarks of the transformed state. Glutamine catabolism is positively regulated by multiple oncogenic signals, including those transmitted by the Rho family of GTPases and by c-Myc. The recent identification of mechanistically distinct inhibitors of glutaminase, which can selectively block cellular transformation, has revived interest in the possibility of targeting glutamine metabolism in cancer therapy. Here, we outline the regulation and roles of glutamine metabolism within cancer cells and discuss possible strategies for, and the consequences of, impacting these processes therapeutically.

Cancer cell metabolism & glutamine addiction

Interest in the metabolic changes characteristic of malignant transformation has undergone a renaissance of sorts in the cancer biology and pharmaceutical communities. However, the recognition that an important connection exists between cellular metabolism and cancer began nearly a century ago with the work of Otto Warburg [1–3]. Warburg found that rapidly proliferating tumor cells exhibit elevated glucose uptake and glycolytic flux, and furthermore that much of the pyruvate generated by glycolysis is reduced to lactate rather than undergoing mitochondrial oxidation via the tricarboxylic acid (TCA) cycle (Figure 1). This phenomenon persists even under aerobic conditions (‘aerobic glycolysis’), and is known as the Warburg effect [4]. Warburg proposed that aerobic glycolysis was caused by defective mitochondria in cancer cells, but it is now known that mitochondrial dysfunction is relatively rare and that most tumors have an unimpaired capacity for oxidative phosphorylation [5]. In fact, the most important selective advantages provided by the Warburg effect are still debated. Although aerobic glycolysis is an inefficient way to produce ATP (2 ATP/glucose vs ~36 ATP/glucose by complete oxidation), a high glycolytic flux can generate ATP rapidly and furthermore can provide a biosynthetic advantage by supplying precursors and reducing equivalents for the synthesis of macromolecules [4]. The mechanisms underlying the Warburg effect are also not yet fully resolved, although it is increasingly clear that a number of oncogenes and tumor suppressors contribute to the phenomenon. The PI3K/Akt/mTORC1 signaling axis, for example, is a key regulator of aerobic glycolysis and biosynthesis, driving the surface expression of nutrient transporters and the upregulation of glycolytic enzymes [6].
The HIF transcription factor also upregulates expression of glucose transporters and glycolytic enzymes in response to hypoxia and growth factors (or loss of the von Hippel–Lindau [VHL] tumor suppressor), and the oncogenic transcription factor c-Myc similarly induces expression of proteins important for glycolysis [6].
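The rate-versus-yield tradeoff described above is simple arithmetic, and a toy calculation makes it concrete. The ATP yields per glucose are the figures quoted in the text; the relative glucose-consumption rates are hypothetical values chosen purely for illustration, not measurements:

```python
# Back-of-the-envelope comparison of ATP output from aerobic glycolysis vs.
# complete oxidation. Yields per glucose are the figures quoted in the text;
# the flux values are hypothetical, chosen only to illustrate how a high
# glycolytic flux can out-produce a slower oxidative flux in ATP per unit time.

ATP_PER_GLUCOSE_GLYCOLYSIS = 2    # aerobic glycolysis (glucose -> lactate)
ATP_PER_GLUCOSE_OXIDATION = 36    # approximate complete oxidation

# Assumed relative glucose consumption rates (arbitrary units per unit time):
glycolytic_flux = 100   # Warburg-like cell: fast glucose uptake
oxidative_flux = 5      # slower, fully oxidative metabolism

atp_rate_glycolysis = glycolytic_flux * ATP_PER_GLUCOSE_GLYCOLYSIS  # 200
atp_rate_oxidation = oxidative_flux * ATP_PER_GLUCOSE_OXIDATION     # 180

# Despite an 18-fold lower yield per glucose, the glycolytic cell regenerates
# ATP faster, and the unused glucose carbon remains available for biosynthesis.
print(atp_rate_glycolysis, atp_rate_oxidation)
```

The point of the sketch is that per-molecule inefficiency is compatible with a higher absolute ATP production rate when substrate uptake is high enough.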

Figure 1

Cell proliferation requires metabolic reprogramming

A second major change in the metabolic program of many cancer cells, and the primary focus of this review, is the alteration of glutamine metabolism. Glutamine is the major carrier of nitrogen between organs, and the most abundant amino acid in plasma [7]. It is also a key nutrient for numerous intracellular processes including oxidative metabolism and ATP generation, biosynthesis of proteins, lipids and nucleic acids, and also redox homeostasis and the regulation of signal transduction pathways [8–10]. Although most mammalian cells are capable of synthesizing glutamine, the demand for this amino acid can become so great during rapid proliferation that an additional extracellular supply is required; hence glutamine is considered conditionally essential [11]. Indeed, many cancer cells are ‘glutamine addicted’, and cannot survive in the absence of an exogenous glutamine supply [12,13].

An important step in the elevation of glutamine catabolism is the activation of the mitochondrial enzyme glutaminase, which catalyzes the hydrolysis of glutamine to generate glutamate and ammonium. The subsequent deamination of glutamate releases a second ammonium to yield the TCA cycle intermediate α-ketoglutarate (α-KG), a reaction catalyzed by glutamate dehydrogenase (GLUD1). This series of reactions is particularly important in rapidly proliferating cells, in which a considerable proportion of the TCA cycle metabolite citrate is exported from mitochondria in order to generate cytosolic acetyl-CoA for lipid biosynthesis [14]. Replenishment of TCA cycle intermediates (anaplerosis) is therefore required, and glutamine often serves as the key anaplerotic substrate through its conversion via glutamate to α-KG (Figure 1).

Mammals express two genes for glutaminase enzymes [15–17]. The GLS gene encodes a protein initially characterized in kidney and thus called kidney-type glutaminase (KGA), although this enzyme and its shorter splice variant glutaminase C (GAC), collectively referred to as GLS, are now known to be widely distributed [18–20]. The KGA and GAC isoforms share identical N-terminal and catalytic domains, encoded by exons 1–14 of the GLS gene, but have distinct C-termini derived from exon 15 in the case of GAC and exons 16–19 in the case of KGA [21]. Upregulation of GLS, in particular the GAC isoform, is common in cancer cells and the degree of GLS overexpression correlates with both the degree of malignancy and the tumor grade in human breast cancer samples [22,23]. The GLS2 gene encodes a protein originally discovered and characterized in liver, which has thus been referred to as liver-type glutaminase and, more recently, as glutaminase 2 (GLS2) [15].

Both KGA and GAC can be activated by inorganic phosphate (Pi), and this activation correlates closely with a dimer-to-tetramer transition for each enzyme [7,22]. As the concentration of Pi is raised, the apparent catalytic constant, kcat(app), increases and simultaneously the apparent Michaelis constant, Km(app), decreases; consequently the catalytic efficiency rises dramatically, especially in the case of GAC [22]. X-ray crystal structures of GAC and KGA in different states indicate that the positioning of a key loop within each monomer (Glu312 to Pro329), located between the active site and the dimer–dimer interface, is critical for mediating tetramerization-induced activation [22,24]. Given the ability of Pi to promote tetramerization and activation of GAC and KGA, it has been proposed that the elevated mitochondrial Pi levels found under hypoxic conditions, which are commonly encountered in the tumor microenvironment, could be one trigger for GLS activation [22].
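The compounding effect of a rising kcat(app) and a falling Km(app) is captured by the catalytic efficiency kcat/Km. A minimal Michaelis–Menten sketch illustrates it; all constants below are hypothetical placeholders, not the values measured in the cited crystallographic and kinetic studies:

```python
# Illustrative Michaelis-Menten calculation showing how a simultaneous rise in
# apparent kcat and fall in apparent Km (as reported for Pi-activated GAC)
# compounds into a large gain in catalytic efficiency kcat/Km.
# All numerical values are hypothetical, for illustration only.

def catalytic_efficiency(kcat, km):
    """kcat/Km, the standard measure of enzyme catalytic efficiency."""
    return kcat / km

def mm_rate(kcat, km, enzyme, substrate):
    """Michaelis-Menten rate: v = kcat * [E] * [S] / (Km + [S])."""
    return kcat * enzyme * substrate / (km + substrate)

# Hypothetical apparent constants at low vs. high inorganic phosphate (Pi):
low_pi = {"kcat": 10.0, "km": 20.0}    # s^-1, mM
high_pi = {"kcat": 40.0, "km": 5.0}    # s^-1, mM

eff_low = catalytic_efficiency(**low_pi)    # 0.5 mM^-1 s^-1
eff_high = catalytic_efficiency(**high_pi)  # 8.0 mM^-1 s^-1
print(f"efficiency gain: {eff_high / eff_low:.0f}x")

# At a sub-saturating glutamine concentration the rate gain is similarly large,
# because both the higher kcat and the lower Km pull in the same direction:
v_low = mm_rate(enzyme=1.0, substrate=2.0, **low_pi)
v_high = mm_rate(enzyme=1.0, substrate=2.0, **high_pi)
```

With these placeholder numbers a 4-fold rise in kcat(app) and a 4-fold drop in Km(app) yield a 16-fold gain in kcat/Km, which is why simultaneous changes in both constants are described as raising efficiency "dramatically."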

Oncogenic alterations affecting glutamine metabolism

At least two classes of cellular signals regulate glutamine metabolism, influencing both the expression level and the enzymatic activity of GLS. The transcription factor c-Myc can suppress the expression of microRNAs miR-23a and miR-23b and, in doing so, upregulates GLS (specifically GAC) expression [13,25]. Independent of changes in GAC expression, oncogenic diffuse B-cell lymphoma protein (Dbl), a GEF for Rho GTPases, and oncogenic variants of downstream Rho GTPases are able to signal to activate GAC in a manner that is dependent on NF-κB [23]. Mitochondria isolated from Dbl- or Rho GTPase-transformed NIH-3T3 fibroblasts demonstrate significantly higher basal glutaminase activity than mitochondria isolated from non-transformed cells [23]. Furthermore, the enzymatic activity of GAC immunoprecipitated from Dbl-transformed cells is elevated relative to GAC from non-transformed cells, indicating the presence of activating post-translational modification(s) [23]. Indeed, when GAC isolated from Dbl-transformed cells is treated with alkaline phosphatase, basal enzymatic activity is dramatically reduced [23]. Collectively, these findings point to phosphorylation events underlying the activation of GAC in transformed cells. Similarly, phosphorylation-dependent regulation of KGA activity downstream of the Raf-Mek-Erk signaling axis occurs in response to EGF stimulation [24].

It is becoming clear that, in addition to c-Myc and Dbl, many other oncogenic signals and environmental conditions can impact cellular glutamine metabolism. Loss of the retinoblastoma tumor suppressor, for example, leads to a marked increase in glutamine uptake and catabolism, and renders mouse embryonic fibroblasts dependent on exogenous glutamine [26]. Cells transformed by KRAS also exhibit increased expression of genes associated with glutamine metabolism and a corresponding increased utilization of glutamine for anabolic synthesis [27]. In fact, KRAS signaling appears to induce glutamine dependence, since the deleterious effects of glutamine withdrawal in KRAS-driven cells can be rescued by expression of a dominant-negative GEF for Ras [28]. Downstream of Ras, the Raf-MEK-ERK signaling pathway has been implicated in the upregulation of glutamine uptake and metabolism [24,29]. A recent study using human pancreatic ductal adenocarcinoma cells identified a novel KRAS-regulated metabolic pathway through which glutamine supports cell growth [30]. Proliferation of KRAS-mutant pancreatic ductal adenocarcinoma cells depends on GLS-catalyzed production of glutamate, but not on downstream deamination of glutamate to α-KG; instead, transaminase-mediated glutamate metabolism is essential for growth. Glutamine-derived aspartate is subsequently transported into the cytoplasm, where it is converted by aspartate transaminase into oxaloacetate, which can be used to generate malate and pyruvate. This series of reactions maintains NADPH levels and thus the cellular redox state [30].

Other recent studies have revealed that another pathway for glutamine metabolism can be essential under hypoxic conditions, and also in cancer cells with mitochondrial defects or loss of the VHL tumor suppressor [31–35]. In these situations, glutamine-derived α-KG undergoes reductive carboxylation by IDH1 or IDH2 to generate citrate, which can be exported from mitochondria to support lipogenesis (Figure 1). Activation of HIF is both necessary and sufficient for driving the reductive carboxylation phenotype in renal cell carcinoma, and suppression of HIF activity can induce a switch from glutamine-mediated lipogenesis back to glucose-mediated lipogenesis [32,35]. Furthermore, loss of VHL and consequent downstream activation of HIF renders renal cell carcinoma cells sensitive to inhibitors of GLS [35]. Evidently, the metabolic routes through which glutamine supports cancer cell proliferation vary with genetic background and with microenvironmental conditions. Nevertheless, it is increasingly clear that diverse oncogenic signals promote glutamine utilization and furthermore that hypoxia, a common condition within poorly vascularized tumors, increases glutamine dependence.


Consistent with the critical role of TCA cycle anaplerosis in cancer cell proliferation, a range of glutamine-dependent cancer cell lines are sensitive to silencing or inhibition of GLS [23,93]. Although loss of GLS suppresses proliferation, in some cases the induction of a compensatory anaplerotic mechanism mediated by pyruvate carboxylase (PC) allows the use of glucose- rather than glutamine-derived carbon for anaplerosis [93]. Low glutamine conditions render glioblastoma cells completely dependent on PC for proliferation; reciprocally, glucose deprivation causes them to become dependent on GLUD1, presumably as a mediator of glutamine-dependent anaplerosis [94]. These studies provide insight into the possibility of inhibiting glutamine-dependent TCA cycle anaplerosis (e.g., with 968 or BPTES) and indicate that high expression of PC could represent a means of resistance to GLS inhibitors.

In c-Myc-induced human Burkitt lymphoma P493 cells, entry of glucose-derived carbon into the TCA cycle is attenuated under hypoxia, whereas glutamine oxidation via the TCA cycle persists [95]. Upon complete withdrawal of glucose, the TCA cycle continues to function and is driven by glutamine. The proportions of viable and proliferating cell populations are almost identical in glucose-replete and -deplete conditions so long as glutamine is present. Inhibition of GLS by BPTES causes a decrease in ATP and glutathione levels, with a simultaneous increase in reactive oxygen species production. Strikingly, whereas BPTES treatment under aerobic conditions suppresses proliferation, under hypoxic conditions it results in cell death, an effect ascribed to glutamine’s critical roles in alleviating oxidative stress in addition to supporting bioenergetics.

In addition to oxidative deamination, glutamine-derived carbon can also reach the TCA cycle through transamination [96], and recent studies indicate that inhibition of this process could be a promising strategy for cancer treatment [30,97,98]. The transaminase inhibitor aminooxyacetate selectively suppresses proliferation of the aggressive breast cancer cell line MDA-MB-231 relative to normal human mammary epithelial cells, and similar effects were observed with siRNA knockdown of aspartate transaminase [97]. Treatment with aminooxyacetate killed glutamine-dependent glioblastoma cells, in a manner that could be rescued by α-KG and was dependent on c-Myc expression [13]. Transaminase inhibitors have also been found to suppress both anchorage-dependent and anchorage-independent growth of lung carcinoma cells [98].

Reductive carboxylation

The central metabolic precursor for fatty acid biosynthesis is acetyl-CoA, which can be generated from pyruvate in the mitochondria by pyruvate dehydrogenase. Since acetyl-CoA cannot cross the inner mitochondrial membrane, it is exported to the cytosol via the citrate shuttle following its condensation with oxaloacetate in the TCA cycle (Figure 3). In the cytosol, citrate is converted back to acetyl-CoA and oxaloacetate in a reaction catalyzed by ATP citrate lyase. In addition to its synthesis from glycolytic pyruvate, citrate can also be generated by reductive carboxylation of α-KG [99]. Across a range of cancer cell lines, 10–25% of lipogenic acetyl-CoA is generated from glutamine via this reductive pathway; indeed, reductive metabolism is the primary route for incorporation of glutamine, glutamate and α-KG carbon into lipids [32]. Some of the reductive carboxylation of α-KG is catalyzed by cytosolic IDH1, as well as by mitochondrial IDH2 and/or IDH3.

In A549 lung carcinoma cells, glutamine dependence and reductive carboxylation flux increase under hypoxic conditions [32,34], such that glutamine-derived α-KG accounts for approximately 80% of the carbon used for de novo lipogenesis. Similarly, in melanoma cells, the major source of carbon for acetyl-CoA, citrate and fatty acids switches from glucose under normoxia to glutamine (via reductive carboxylation) under hypoxia [31]. The hypoxic switch to reductive glutamine metabolism is dependent on HIF, and constitutive activation of HIF is sufficient to induce the preferential reductive metabolism of α-KG even under normoxic conditions [32]. Tumor cells with mitochondrial defects, such as electron-transport chain mutations/inhibition, also use glutamine-dependent reductive carboxylation as the major pathway for citrate generation, and loss of electron-transport chain activity is sufficient to induce a switch from glucose to glutamine as the primary source of lipogenic carbon [33].
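The source fractions reported in these 13C-tracing studies amount to simple mixing of two contributing fluxes into the lipogenic acetyl-CoA pool. The sketch below uses hypothetical flux values chosen only to reproduce the ballpark fractions quoted above (roughly 10–25% under normoxia, ~80% under hypoxia); it is not a model of any specific dataset:

```python
# Illustrative source-fraction bookkeeping for lipogenic acetyl-CoA, in the
# spirit of the 13C-tracing studies cited in the text. Flux values are
# hypothetical and serve only to show the arithmetic.

def lipogenic_fraction_from_glutamine(reductive_flux, glucose_flux):
    """Fraction of citrate-derived acetyl-CoA coming from glutamine via
    reductive carboxylation, given the two contributing fluxes (same units)."""
    return reductive_flux / (reductive_flux + glucose_flux)

# Normoxia: glucose-derived (pyruvate dehydrogenase) citrate dominates.
normoxia = lipogenic_fraction_from_glutamine(reductive_flux=20, glucose_flux=80)

# Hypoxia: pyruvate oxidation is suppressed (HIF-dependent), so the
# reductive-carboxylation route dominates citrate production.
hypoxia = lipogenic_fraction_from_glutamine(reductive_flux=80, glucose_flux=20)

print(normoxia, hypoxia)  # fractions of lipogenic carbon from glutamine
```

Swapping which flux dominates is all it takes to move the glutamine contribution from a minority (~20%) to a large majority (~80%) of lipogenic carbon, which is the pattern the hypoxia experiments report.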

Together these studies indicate that mitochondrial defects/inhibition, and/or hypoxia, might sensitize cancer cells to inhibition of GLS. The fact that P493 cells are more sensitive to BPTES under hypoxic conditions could in part be explained by an increased reliance on glutamine-dependent reductive carboxylation for lipogenesis [95]. Intriguingly, cancer cells harboring neomorphic mutations in IDH1, which result in production of the oncometabolite 2-hydroxyglutarate, are also sensitized to GLS inhibition [100]. 2-hydroxyglutarate is generated primarily from glutamine-derived α-KG [100,101], and therefore tumors expressing mutant IDH might be especially susceptible to alterations in α-KG levels.


As with all therapies, the potential side effects of strategies impacting glutamine metabolism must be seriously considered. The widespread use of l-asparaginase to lower plasma asparagine and glutamine concentrations in ALL patients demonstrates the potential for glutamine metabolism to be safely targeted, and also sheds light on potential toxicological consequences. For example, glutamine is known to be essential for the proliferation of lymphocytes, macrophages and neutrophils, and immunosuppression is a known side effect of l-asparaginase treatment, requiring close monitoring [11,105]. Evidence from early trials using glutamine-mimetic anti-metabolites, such as l-DON, indicates that these unselective molecules can cause excessive gastrointestinal toxicity and neurotoxicity. Within the brain, GLS converts glutamine into the neurotransmitter glutamate in neurons; astrocytes then take up synaptically released glutamate and convert it back to glutamine, which is subsequently transported back to neurons [106,107].


It has become clear during the past decade that altered metabolism plays a critical, in some cases even causal, role in the development and maintenance of cancers. It is now accepted that virtually all oncogenes and tumor suppressors impact metabolic pathways [5]. Furthermore, mutations in certain metabolic enzymes (e.g., isocitrate dehydrogenase, succinate dehydrogenase and fumarate hydratase) are associated with both familial and sporadic human cancers [113]. With this realization has come a renewed interest in the possibility of selectively targeting the metabolism of cancer cells as a therapeutic strategy. The use of l-asparaginase to treat ALL by depleting plasma asparagine and glutamine levels and the promising outcome of the first use of dichloroacetate (which acts, at least in part, through its inhibition of the metabolic enzyme pyruvate dehydrogenase kinase) in glioblastoma patients [114,115], support the notion that cancer metabolism can be safely and effectively targeted in the clinic. The metabolic adaptations of cancer cells must balance the requirements for modestly increased ATP synthesis, dramatically upregulated macromolecular biosynthesis and maintenance of redox balance. By serving as a carbon source for energy generation, a carbon and nitrogen source for biosynthesis and a precursor of the cellular antioxidant glutathione, glutamine is able to contribute to each of these requirements.

The countless combinations of genetic alterations that are found in human neoplasias mean that there is not a single rigid metabolic program that is characteristic of all transformed cells. This perhaps explains why some current anti-metabolite chemotherapies (e.g., those targeting nucleotide synthesis) are effective only for certain malignancies. A deeper understanding of the metabolic alterations within specific genetic contexts will allow for better-targeted therapeutic interventions. Furthermore, it seems highly likely that combination therapies based on drug synergisms will be especially important for exploiting therapeutic windows within which cancer cells, but not normal cells, are impacted [37]. Glucose and glutamine metabolic pathways, for example, might be able to compensate for one another under some circumstances. When glucose metabolism is impaired in glioblastoma cells, glutamine catabolism becomes essential for survival [94]; reciprocally, suppression of GLS expression causes cells to become fully dependent on glucose-driven TCA cycle anaplerosis via PC [93]. The implication is that PC inhibition could synergize with GLS inhibition.

A topic warranting further investigation is the role that GLS2 plays in cellular metabolism. GLS, in particular the GAC isoform, is upregulated downstream of oncogenes and downregulated by tumor suppressors, and is essential for growth of many cancer cells. In contrast, GLS2 is activated by the ‘universal’ tumor suppressor p53, and furthermore is significantly downregulated in liver tumors and can block transformed characteristics of some cancer cells when overexpressed [116–118]. Emphasizing the importance of genetic context, it was recently reported that GLS2 is significantly upregulated in neuroblastomas overexpressing N-Myc [119]. There are various possible explanations for the apparently different roles of two enzymes that catalyze the same reaction. Because the regulation of GLS and GLS2 is distinct, the two enzymes will be engaged under different conditions. The two enzymes have different kinetic characteristics, and therefore might influence energy metabolism and antioxidant defense in different manners [20]. There is also evidence that GLS2 may act, directly or indirectly, as a transcription factor [118]. Finally, it is possible that the different interactions of GLS and GLS2 with other proteins are responsible for their apparently different roles.


Mitochondria as biosynthetic factories for cancer proliferation

Christopher S Ahn and Christian M Metallo

Cancer & Metabolism (2015) 3:1

Unchecked growth and proliferation is a hallmark of cancer, and numerous oncogenic mutations reprogram cellular metabolism to fuel these processes. As a central metabolic organelle, mitochondria execute critical biochemical functions for the synthesis of fundamental cellular components, including fatty acids, amino acids, and nucleotides. Despite the extensive interest in the glycolytic phenotype of many cancer cells, tumors contain fully functional mitochondria that support proliferation and survival. Furthermore, tumor cells commonly increase flux through one or more mitochondrial pathways, and pharmacological inhibition of mitochondrial metabolism is emerging as a potential therapeutic strategy in some cancers. Here, we review the biosynthetic roles of mitochondrial metabolism in tumors and highlight specific cancers where these processes are activated.


Recent characterizations of metabolic enzymes as tumor suppressors and oncogene-driven metabolic reprogramming have reinvigorated interest in cancer metabolism. Although therapies targeting metabolic processes have long been a staple in cancer treatment (e.g. inhibition of folate metabolism via methotrexate), the focused therapeutic potential surrounding these findings has generated a renewed appreciation for Otto Warburg's work of almost a century ago. Warburg observed that tumor cells ferment much of the glucose taken up during growth to lactate, thus using glycolysis as a major means of adenosine triphosphate (ATP) regeneration [1]. However, the observation of decreased respiration in cancer cells and the idea that "the respiration of all cancer cells is damaged" belie the critical role of mitochondria in biosynthesis and cell survival [1]. On the contrary, functional mitochondria are present in all proliferative cells within our body (including all tumors), as they are responsible for converting the diverse nutrients available to cells into the fundamental building blocks required for cell growth. These organelles execute numerous functions in cancer cells to promote tumor growth and survival in response to stress. Here, we outline the critical biosynthetic functions served by mitochondria within tumors (Figure 1). Although many of these functions are similarly important in normal, proliferating cells, we have attempted to highlight potential points where mitochondrial metabolism may be therapeutically targeted to slow cancer growth. This review is organized by specific metabolic pathways or processes (i.e., glucose metabolism and lipogenesis, amino acid metabolism, and nucleotide biosynthesis). Tumors or cancer cell types where enzymes in each pathway have been specifically observed to be dysregulated are described within the text and summarized in Table 1.

Figure 1

Biosynthetic nodes within mitochondria. Metabolic pathways within mitochondria that contribute to biosynthesis in cancer and other proliferating cells. TCA metabolism and FOCM enable cells to convert carbohydrates and amino acids to lipids, non-essential amino acids, nucleotides (including purines used for cofactor synthesis), glutathione, heme, and other cellular components. Critical biosynthetic routes are indicated by yellow arrows. Enzymatic reactions that are dependent on redox-sensitive cofactors are depicted in red.

Table 1

Overview of mitochondrial biosynthetic enzymes important in cancer

TCA cycle, anaplerosis, and AcCoA metabolism

Cancers in which three or more mitochondrial enzymes have been studied and found to be differentially regulated (or mutated, as indicated) in cancers vs. control groups are included. Dysregulation of each enzyme was demonstrated in clinical tumor samples, animal models, or cell lines at the levels of genes, mRNA, protein, metabolites, and/or flux.

Figure 2

Coordination of carbon and nitrogen metabolism across amino acids. Glutamate and aKG are key substrates in numerous transamination reactions and can also serve as precursors for glutamine, proline, and the TCA cycle. Mitochondrial enzymes catalyzing these reactions are highlighted in blue, and TCA cycle intermediates are highlighted in orange (pyruvate enters the TCA cycle as acetyl-CoA or oxaloacetate).

Figure 3

Biosynthetic sources for purine and pyrimidine synthesis. Sources and fates of nitrogen, carbon, and oxygen atoms are colored as indicated. Italicized metabolites can be sourced from the mitochondria or cytosol. The double bond formed by the action of DHODH/ubiquinone is also indicated.

Mitochondria operate as both engine and factory in eukaryotes, coordinating cellular energy production and the availability of fundamental building blocks that are required for cell proliferation. Cancer cells must therefore balance their relative bioenergetic and biosynthetic needs to grow, proliferate, and survive within the physical constraints of energy and mass conservation. In contrast to quiescent cells, which predominantly use oxidative mitochondrial metabolism to produce ATP and take up glucose at much lower rates than proliferating cells, tumor cells exhibit increased glycolytic rates to provide an elevated flux of substrate for biosynthetic pathways, including those executed within mitochondria. Given these higher rates of nutrient utilization, metabolic flux through mitochondrial pathways and the associated ROS production can often be higher in cancer cells. Not surprisingly, activation of cellular antioxidant response pathways is commonly observed in cancer or subpopulations of cells within tumors [46,78]. Cellular compartmentalization affords a degree of protection from such damaging side products of metabolism, and methods which are able to deconvolute the relative contributions of each cellular compartment (e.g. mitochondria, cytosol, peroxisome, etc.) to cancer metabolism will be crucial to more completely understand the metabolism of cancer cells in the future [74,79]. Ultimately, while mitochondrial dysregulation is widely considered to be a hallmark of cancer, numerous mitochondrial functions remain critical for tumor growth and are emerging as clinical targets.

Following this point, it comes as no surprise that mitochondrial metabolism is highly active in virtually all tumors (i.e., cancer cells, stroma, or both), and investigators have begun targeting these pathways to explore potential efficacy. Indeed, some evidence suggests that biguanides such as metformin or phenformin may limit tumor incidence and burden in humans and animals [80,81]. These effects are presumably due, at least in part, to inhibition of complex I of the electron transport chain, which significantly perturbs mitochondrial function [82,83]. However, more insights are needed into the mechanisms of these compounds in patients to determine the therapeutic potential of targeting this and other components of mitochondria. In developing new therapies that target cancer metabolism, researchers will face challenges similar to those that are relevant for many established chemotherapies, since deleterious effects on normal proliferating cells that also depend on mitochondrial metabolism (and aerobic glycolysis) are likely to arise.

As we acquire a more detailed picture of how specific genetic modifications in a patient’s tumor correlate with its metabolic profile, opportunities for designing targeted or combinatorial therapies will become increasingly apparent. Cancer therapies that address tumor-specific mitochondrial dysregulation and dysfunction may be particularly effective. For example, some cancer cells harbor mutations in TCA enzymes (e.g., FH, SDH, IDH2) or regulatory proteins that control mitophagy (i.e., LKB1) [84]. Such tumors may be compromised with respect to some aspects of mitochondrial biosynthesis and dependent on alternate pathways for growth and/or survival such that synthetically lethal targets emerge. Ultimately, such strategies will require clinicians and researchers to coordinate metabolic, biochemical, and genetic information in the design of therapeutic strategies.


David Terrano, M.D., Ph.D. commented on your update
“Not well versed in Nat peptides so I could not say. I also hesitate with any PNAS paper because those in their academy tend to have a fast track to publication. It has been that way since at least the early 2000’s, when I began research. I don’t doubt their goal and approach (this same group leads the way in methylation-based diagnosis of CNS neoplasms, which is apparently highly accurate). But when I see “dying cells” I know what that means biochemically and look for those hallmarks. Organ specific oligonucleosomes would be a nice cell death surrogate.”


The importance of spatially-localized and quantified image interpretation in cancer management

Writer & reporter: Dror Nir, PhD

I became involved in the development of quantified imaging-based tissue characterization more than a decade ago. From the start, it was clear to me that what clinicians need will not be answered by just identifying whether a certain organ harbors cancer. If imaging devices are to play a significant role in future medicine, as a complementary source of information to biomarkers and gene sequencing, the minimum value expected of them is accurate direction of biopsy needles and treatment tools to the malignant locations in the organ. Therefore, in designing the first version of Prostate-HistoScanning (“PHS”), I went to the trouble of characterizing localized tissue volumes at the level of approximately 0.1 cc. Thanks to that, the imaging-interpretation overlay of PHS localizes suspicious lesions with an accuracy of 5 mm within the prostate gland; Detection, localisation and characterisation of prostate cancer by prostate HistoScanning(™).

I then started more ambitious research aimed at exploring the feasibility of identifying sub-structures within the cancer lesion itself. The preliminary results of this exploration were so promising that they surprised not only the clinicians I was working with but also me. It seems that, using quality ultrasound, one can find imaging biomarkers that allow differentiation of the internal structures of a cancerous lesion. Unfortunately for everyone involved in this work, including me, this scientific effort was interrupted by financial constraints before reaching maturity.

This short introduction explains why I find the publication below important enough to post and bring to your attention.

I hope for your agreement on the matter.

Quantitative Imaging in Cancer Evolution and Ecology

Robert A. Gatenby, MD, Olya Grove, PhD and Robert J. Gillies, PhD

From the Departments of Radiology and Cancer Imaging and Metabolism, Moffitt Cancer Center, 12902 Magnolia Dr, Tampa, FL 33612. Address correspondence to R.A.G.


Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations recently have been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. 
By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral Darwinian dynamics before and during therapy. Advances in image analysis will place clinical imaging in an increasingly central role in the development of evolution-based patient-specific cancer therapy.

© RSNA, 2013



Cancers are heterogeneous across a wide range of temporal and spatial scales. Morphologic heterogeneity between and within cancers is readily apparent in clinical imaging, and subjective descriptors of these differences, such as necrotic, spiculated, and enhancing, are common in the radiology lexicon. In the past several years, radiology research has increasingly focused on quantifying these imaging variations in an effort to understand their clinical and biologic implications (1,2). In parallel, technical advances now permit extensive molecular characterization of tumor cells in individual patients. This has led to increasing emphasis on personalized cancer therapy, in which treatment is based on the presence of specific molecular targets (3). However, recent studies (4,5) have shown that multiple genetic subpopulations coexist within cancers, reflecting extensive intratumoral somatic evolution. This heterogeneity is a clear barrier to therapy based on molecular targets, since the identified targets do not always represent the entire population of tumor cells in a patient (6,7). It is ironic that cancer, a disease extensively and primarily analyzed genetically, is also the most genetically flexible of all diseases and, therefore, least amenable to such an approach.

Genetic variations in tumors are typically ascribed to a mutator phenotype that generates new clones, some of which expand into large populations (8). However, although identification of genotypes is of substantial interest, it is insufficient for complete characterization of tumor dynamics because evolution is governed by the interactions of environmental selection forces with the phenotypic, not genotypic, properties of populations as shown, for example, by evolutionary convergence to identical phenotypes among cave fish even when they are from different species (9–11). This connection between tissue selection forces and cellular properties has the potential to provide a strong bridge between medical imaging and the cellular and molecular properties of cancers.

We postulate that differences within tumors at different spatial scales (ie, at the radiologic, cellular, and molecular [genetic] levels) are related. Tumor characteristics observable at clinical imaging reflect molecular-, cellular-, and tissue-level dynamics; thus, they may be useful in understanding the underlying evolving biology in individual patients. A challenge is that such mapping across spatial and temporal scales requires not only objective reproducible metrics for imaging features but also a theoretical construct that bridges those scales (Fig 1).


Figure 1a: Computed tomographic (CT) scan of right upper lobe lung cancer in a 50-year-old woman.


Figure 1b: Isoattenuation map shows regional heterogeneity at the tissue scale (measured in centimeters).


Figure 1c & 1d: (c, d) Whole-slide digital images (original magnification, ×3) of a histologic slice of the same tumor at the mesoscopic scale (measured in millimeters) (c), coupled with a masked image of regional morphologic differences showing spatial heterogeneity (d).


Figure 1e: Subsegment of the whole slide image shows the microscopic scale (measured in micrometers) (original magnification, ×50).


Figure 1f: Pattern recognition masked image shows regional heterogeneity. In a, the CT image of non–small cell lung cancer can be analyzed to display gradients of attenuation, which reveals heterogeneous and spatially distinct environments (b). Histologic images in the same patient (c, e) reveal heterogeneities in tissue structure and density on the same scale as seen in the CT images. These images can be analyzed at much higher definition to identify differences in morphologies of individual cells (3), and these analyses reveal clusters of cells with similar morphologic features (d, f). An important goal of radiomics is to bridge radiologic data with cellular and molecular characteristics observed microscopically.

To promote the development and implementation of quantitative imaging methods, protocols, and software tools, the National Cancer Institute has established the Quantitative Imaging Network. One goal of this program is to identify reproducible quantifiable imaging features of tumors that will permit data mining and explicit examination of links between the imaging findings and the underlying molecular and cellular characteristics of the tumors. In the quest for more personalized cancer treatments, these quantitative radiologic features potentially represent nondestructive temporally and spatially variable predictive and prognostic biomarkers that readily can be obtained in each patient before, during, and after therapy.

Quantitative imaging requires computational technologies that can be used to reliably extract mineable data from radiographic images. This feature information can then be correlated with molecular and cellular properties by using bioinformatics methods. Most existing methods are agnostic and focus on statistical descriptions of existing data, without presupposing the existence of specific relationships. Although this is a valid approach, a more profound understanding of quantitative imaging information may be obtained with a theoretical hypothesis-driven framework. Such models use links between observable tumor characteristics and microenvironmental selection factors to make testable predictions about emergent phenotypes. One such theoretical framework is the developing paradigm of cancer as an ecologic and evolutionary process.

For decades, landscape ecologists have studied the effects of heterogeneity in physical features on interactions between populations of organisms and their environments, often by using observation and quantification of images at various scales (1214). We propose that analytic models of this type can easily be applied to radiologic studies of cancer to uncover underlying molecular, cellular, and microenvironmental drivers of tumor behavior and specifically, tumor adaptations and responses to therapy (15).

In this article, we review recent developments in quantitative imaging metrics and discuss how they correlate with underlying genetic data and clinical outcomes. We then introduce the concept of using ecology and evolutionary models for spatially explicit image analysis as an exciting potential avenue of investigation.


Quantitative Imaging and Radiomics

In patients with cancer, quantitative measurements are commonly limited to measurement of tumor size with one-dimensional (Response Evaluation Criteria in Solid Tumors [or RECIST]) or two-dimensional (World Health Organization) long-axis measurements (16). These measures do not reflect the complexity of tumor morphology or behavior, and in many cases, changes in these measures are not predictive of therapeutic benefit (17). In contrast, radiomics (18) is a high-throughput process in which a large number of shape, edge, and texture imaging features are extracted, quantified, and stored in databases in an objective, reproducible, and mineable form (Figs 1, 2). Once transformed into a quantitative form, radiologic tumor properties can be linked to underlying genetic alterations (the field is called radiogenomics) (19–21) and to medical outcomes (22–27). Researchers are currently working to develop both a standardized lexicon to describe tumor features (28,29) and a standard method to convert these descriptors into quantitative mineable data (30,31) (Fig 3).


Figure 2: Contrast-enhanced CT scans show non–small cell lung cancer (left) and corresponding cluster map (right). Subregions within the tumor are identified by clustering pixels based on the attenuation of pixels and their cumulative standard deviation across the region. While the entire region of interest of the tumor, lacking the spatial information, yields a weighted mean attenuation of 859.5 HU with a large and skewed standard deviation of 243.64 HU, the identified subregions have vastly different statistics. Mean attenuation was 438.9 HU ± 45 in the blue subregion, 210.91 HU ± 79 in the yellow subregion, and 1077.6 HU ± 18 in the red subregion.



Figure 3: Chart shows the five processes in radiomics.

Several recent articles underscore the potential power of feature analysis. After manually extracting more than 100 CT image features, Segal and colleagues found that a subset of 14 features predicted 80% of the gene expression pattern in patients with hepatocellular carcinoma (21). A similar extraction of features from contrast agent–enhanced magnetic resonance (MR) images of glioblastoma was used to predict immunohistochemically identified protein expression patterns (22). Other radiomic features, such as texture, can be used to predict response to therapy in patients with renal cancer (32) and prognosis in those with metastatic colon cancer (33).

These pioneering studies were relatively small because the image analysis was performed manually, and the studies were consequently underpowered. Thus, recent work in radiomics has focused on technical developments that permit automated extraction of image features with the potential for high throughput. Such methods, which rely heavily on novel machine learning algorithms, can more completely cover the range of quantitative features that can describe tumor heterogeneity, such as texture, shape, or margin gradients or, importantly, different environments, or niches, within the tumors.

Generally speaking, texture in a biomedical image is quantified by identifying repeating patterns. Texture analyses fall into two broad categories based on the concepts of first- and second-order spatial statistics. First-order statistics are computed by using individual pixel values, and no relationships between neighboring pixels are assumed or evaluated. Texture analysis methods based on first-order statistics usually involve calculating cumulative statistics of pixel values and their histograms across the region of interest. Second-order statistics, on the other hand, are used to evaluate the likelihood of observing spatially correlated pixels (34). Hence, second-order texture analyses focus on the detection and quantification of nonrandom distributions of pixels throughout the region of interest.
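The distinction between first- and second-order texture statistics can be sketched in a few lines of Python. This is a toy illustration on an invented 4×4 gray-level grid, not the article's pipeline; real radiomics software computes many more features over full images.

```python
# First- vs second-order texture statistics on a toy 2-D "image"
# (a grid of integer gray levels). Illustrative sketch only.
import math
from collections import Counter

image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [2, 2, 3, 3],
    [2, 2, 3, 3],
]

# First-order statistics: computed from individual pixel values alone,
# ignoring where the pixels sit relative to one another.
pixels = [v for row in image for v in row]
n = len(pixels)
mean = sum(pixels) / n
var = sum((v - mean) ** 2 for v in pixels) / n
hist = Counter(pixels)
entropy = -sum((c / n) * math.log2(c / n) for c in hist.values())

# Second-order statistics: a gray-level co-occurrence matrix (GLCM)
# counts how often gray level i appears immediately to the left of
# gray level j (offset = one pixel to the right). This captures the
# spatial arrangement that first-order statistics discard.
levels = 4
glcm = [[0] * levels for _ in range(levels)]
for row in image:
    for a, b in zip(row, row[1:]):
        glcm[a][b] += 1
```

Note that any image with the same histogram has identical first-order statistics, while the GLCM changes as soon as the pixels are rearranged.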

The technical developments that permit second-order texture analysis in tumors by using regional enhancement patterns on dynamic contrast-enhanced MR images were reviewed recently (35). One such technique that is used to measure heterogeneity of contrast enhancement uses the Factor Analysis of Medical Image Sequences (or FAMIS) algorithm, which divides tumors into regions based on their patterns of enhancement (36). Factor Analysis of Medical Image Sequences–based analyses yielded better prognostic information when compared with region of interest–based methods in numerous cancer types (19–21,37–39), and they were a precursor to the Food and Drug Administration–approved three-time-point method (40). A number of additional promising methods have been developed. Rose and colleagues showed that a structured fractal-based approach to texture analysis improved differentiation between low- and high-grade brain cancers by orders of magnitude (41). Ahmed and colleagues used gray level co-occurrence matrix analyses of dynamic contrast-enhanced images to distinguish benign from malignant breast masses with high diagnostic accuracy (area under the receiver operating characteristic curve, 0.92) (26). Others have shown that Minkowski functional structured methods that convolve images with differently kernelled masks can be used to distinguish subtle differences in contrast enhancement patterns and can enable significant differentiation between treatment groups (42).

It is not surprising that analyses of heterogeneity in enhancement patterns can improve diagnosis and prognosis, as this heterogeneity is fundamentally based on perfusion deficits, which generate significant microenvironmental selection pressures. However, texture analysis is not limited to enhancement patterns. For example, measures of heterogeneity in diffusion-weighted MR images can reveal differences in cellular density in tumors, which can be matched to histologic findings (43). Measures of heterogeneity in T1- and T2-weighted images can be used to distinguish benign from malignant soft-tissue masses (23). CT-based texture features have been shown to be highly significant independent predictors of survival in patients with non–small cell lung cancer (24).

Texture analyses can also be applied to positron emission tomographic (PET) data, where they can provide information about metabolic heterogeneity (25,26). In a recent study, Nair and colleagues identified 14 quantitative PET imaging features that correlated with gene expression (19). This led to an association of metagene clusters to imaging features and yielded prognostic models with hazard ratios near 6. In a study of esophageal cancer, in which 38 quantitative features describing fluorodeoxyglucose uptake were extracted, measures of metabolic heterogeneity at baseline enabled prediction of response with significantly higher sensitivity than any whole region of interest standardized uptake value measurement (22). It is also notable that these extensive texture-based features are generally more reproducible than simple measures of the standardized uptake value (27), which can be highly variable in a clinical setting (44).


Spatially Explicit Analysis of Tumor Heterogeneity

Although radiomic analyses have shown high prognostic power, they are not inherently spatially explicit. Quantitative border, shape, and texture features are typically generated over a region of interest that comprises the entire tumor (45). This approach implicitly assumes that tumors are heterogeneous but well mixed. However, spatially explicit subregions of cancers are readily apparent on contrast-enhanced MR or CT images, as perfusion can vary markedly within the tumor, even over short distances, with changes in tumor cell density and necrosis.

An example is shown in Figure 2, which shows a contrast-enhanced CT scan of non–small cell lung cancer. Note that there are many subregions within this tumor that can be identified with attenuation gradient (attenuation per centimeter) edge detection algorithms. Each subregion has a characteristic quantitative attenuation, with a narrow standard deviation, whereas the mean attenuation over the entire region of interest is a weighted average of the values across all subregions, with a correspondingly large and skewed distribution. We contend that these subregions represent distinct habitats within the tumor, each with a distinct set of environmental selection forces.
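The point about whole-ROI statistics hiding distinct subregions can be made concrete with a small sketch. The attenuation values and subregion labels below are invented for illustration (loosely echoing the subregions of Figure 2), not taken from the study's data.

```python
# Whole-ROI statistics vs per-subregion statistics for a handful of
# (attenuation in HU, subregion label) pixels. Values are invented.
import statistics

pixels = [
    (430, "blue"), (445, "blue"), (440, "blue"),
    (200, "yellow"), (220, "yellow"), (215, "yellow"),
    (1070, "red"), (1080, "red"), (1085, "red"),
]

values = [v for v, _ in pixels]
whole_mean = statistics.mean(values)
whole_std = statistics.stdev(values)  # large: it mixes unlike regions

# Group pixels by subregion and compute per-region statistics.
by_region = {}
for v, label in pixels:
    by_region.setdefault(label, []).append(v)

region_stats = {
    label: (statistics.mean(vs), statistics.stdev(vs))
    for label, vs in by_region.items()
}
# Each subregion has a narrow standard deviation, while the whole-ROI
# mean is just a weighted average across all subregions.
```

The whole-ROI mean here lands near 576 HU with a standard deviation of several hundred HU, even though no individual subregion behaves that way, which is exactly the information a spatially explicit analysis preserves.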

These observations, along with the recent identification of regional variations in the genetic properties of tumor cells, indicate the need to abandon the conceptual model of cancers as bounded organlike structures. Rather than a single self-organized system, cancers represent a patchwork of habitats, each with a unique set of environmental selection forces and cellular evolution strategies. For example, regions of the tumor that are poorly perfused can be populated by only those cells that are well adapted to low-oxygen, low-glucose, and high-acid environmental conditions. Such adaptive responses to regional heterogeneity result in microenvironmental selection and hence, emergence of genetic variations within tumors. The concept of adaptive response is an important departure from the traditional view that genetic heterogeneity is the product of increased random mutations, which implies that molecular heterogeneity is fundamentally unpredictable and, therefore, chaotic. The Darwinian model proposes that genetic heterogeneity is the result of a predictable and reproducible selection of successful adaptive strategies to local microenvironmental conditions.

Current cross-sectional imaging modalities can be used to identify regional variations in selection forces by using contrast-enhanced, cell density–based, or metabolic features. Clinical imaging can also be used to identify evidence of cellular adaptation. For example, if a region of low perfusion on a contrast-enhanced study is necrotic, then an adaptive population is absent or minimal. However, if the poorly perfused area is cellular, then there is presumptive evidence of an adapted proliferating population. While the specific genetic properties of this population cannot be determined, the phenotype of the adaptive strategy is predictable since the environmental conditions are more or less known. Thus, standard medical images can be used to infer specific emergent phenotypes and, with ongoing research, these phenotypes can be associated with underlying genetic changes.

This area of investigation will likely be challenging. As noted earlier, the most obvious spatially heterogeneous imaging feature in tumors is perfusion heterogeneity on contrast-enhanced CT or MR images. It generally has been assumed that the links between contrast enhancement, blood flow, perfusion, and tumor cell characteristics are straightforward. That is, tumor regions with decreased blood flow will exhibit low perfusion, low cell density, and high necrosis. In reality, however, the dynamics are actually much more complex. As shown in Figure 4, when using multiple superimposed sequences from MR imaging of malignant gliomas, regions of tumor that are poorly perfused on contrast-enhanced T1-weighted images may exhibit areas of low or high water content on T2-weighted images and low or high diffusion on diffusion-weighted images. Thus, high or low cell densities can coexist in poorly perfused volumes, creating perfusion-diffusion mismatches. Regions with poor perfusion with high cell density are of particular clinical interest because they represent a cell population that is apparently adapted to microenvironmental conditions associated with poor perfusion. The associated hypoxia, acidosis, and nutrient deprivation select for cells that are resistant to apoptosis and thus are likely to be resistant to therapy (46,47).


Figure 4: Left: Contrast-enhanced T1 image from subject TCGA-02-0034 in The Cancer Genome Atlas–Glioblastoma Multiforme repository of MR volumes of glioblastoma multiforme cases. Right: Spatial distribution of MR imaging–defined habitats within the tumor. The blue region (low T1 postgadolinium, low fluid-attenuated inversion recovery) is particularly notable because it presumably represents a habitat with low blood flow but high cell density, indicating a population presumably adapted to hypoxic acidic conditions.

Furthermore, other selection forces not related to perfusion are likely to be present within tumors. For example, evolutionary models suggest that cancer cells, even in stable microenvironments, tend to speciate into “engineers” that maximize tumor cell growth by promoting angiogenesis and “pioneers” that proliferate by invading normal tissue and co-opting the blood supply. These invasive tumor phenotypes can exist only at the tumor edge, where movement into a normal tissue microenvironment can be rewarded by increased proliferation. This evolutionary dynamic may contribute to distinct differences between the tumor edges and the tumor cores, which frequently can be seen at analysis of cross-sectional images (Fig 5).


Figure 5a: CT images obtained with conventional entropy filtering in two patients with non–small cell lung cancer with no apparent textural differences show similar entropy values across all sections. 


Figure 5b: Contour plots obtained after the CT scans were convolved with the entropy filter. Further subdividing each section in the tumor stack into tumor edge and core regions (dotted black contour) reveals varying textural behavior across sections. Two distinct patterns have emerged, and preliminary analysis shows that the change of mean entropy value between core and edge regions correlates negatively with survival.

Interpretation of the subsegmentation of tumors will require computational models to understand and predict the complex nonlinear dynamics that lead to heterogeneous combinations of radiographic features. We have exploited ecologic methods and models to investigate regional variations in cancer environmental and cellular properties that lead to specific imaging characteristics. Conceptually, this approach assumes that regional variations in tumors can be viewed as a coalition of distinct ecologic communities or habitats of cells in which the environment is governed, at least to first order, by variations in vascular density and blood flow. The environmental conditions that result from alterations in blood flow, such as hypoxia, acidosis, immune response, growth factors, and glucose, represent evolutionary selection forces that give rise to local-regional phenotypic adaptations. Phenotypic alterations can result from epigenetic, genetic, or chromosomal rearrangements, and these in turn will affect prognosis and response to therapy. Changes in habitats or the relative abundance of specific ecologic communities over time and in response to therapy may be a valuable metric with which to measure treatment efficacy and emergence of resistant populations.


Emerging Strategies for Tumor Habitat Characterization

A method for converting images to spatially explicit tumor habitats is shown in Figure 4. Here, three-dimensional MR imaging data sets from a glioblastoma are segmented. Each voxel in the tumor is defined by a scale that includes its image intensity in different sequences. In this case, the imaging sets are from (a) a contrast-enhanced T1 sequence, (b) a fast spin-echo T2 sequence, and (c) a fluid-attenuated inversion-recovery (or FLAIR) sequence. Voxels in each sequence can be defined as high or low based on their value compared with the mean signal value. By using just two sequences, a contrast-enhanced T1 sequence and a fluid-attenuated inversion-recovery sequence, we can define four habitats: high or low postgadolinium T1 divided into high or low fluid-attenuated inversion recovery. When these voxel habitats are projected into the tumor volume, we find they cluster into spatially distinct regions. These habitats can be evaluated both in terms of their relative contributions to the total tumor volume and in terms of their interactions with each other, based on the imaging characteristics at the interfaces between regions. Similar spatially explicit analysis can be performed with CT scans (Fig 5).
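The two-sequence, four-habitat labeling scheme described above reduces to a simple per-voxel rule. The sketch below uses invented voxel intensities for two sequences; a real implementation would operate on registered 3-D volumes rather than flat lists.

```python
# Minimal sketch of habitat labeling: each voxel is called "high" or
# "low" in each of two MR sequences by comparison with that sequence's
# mean, yielding four habitats. Intensities below are invented.
from collections import Counter

t1_post = [10.0, 12.0, 3.0, 2.0, 11.0, 1.5]   # contrast-enhanced T1
flair   = [5.0, 1.0, 6.0, 1.2, 4.8, 7.0]      # FLAIR

t1_mean = sum(t1_post) / len(t1_post)
fl_mean = sum(flair) / len(flair)

def habitat(t1, fl):
    """Return one of four habitat labels for a single voxel."""
    t1_state = "highT1" if t1 >= t1_mean else "lowT1"
    fl_state = "highFLAIR" if fl >= fl_mean else "lowFLAIR"
    return f"{t1_state}/{fl_state}"

labels = [habitat(t1, fl) for t1, fl in zip(t1_post, flair)]

# Relative contribution of each habitat to the total "tumor" volume.
fractions = {h: c / len(labels) for h, c in Counter(labels).items()}
```

Projecting these labels back into the image volume is what reveals that voxels of the same habitat cluster into spatially distinct regions.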

Analysis of spatial patterns in cross-sectional images will ultimately require methods that bridge spatial scales from microns to millimeters. One possible method is a general class of numeric tools that is already widely used in terrestrial and marine ecology research to link species occurrence or abundance with environmental parameters. Species distribution models (48–51) are used to gain ecologic and evolutionary insights and to predict distributions of species or morphs across landscapes, sometimes extrapolating in space and time. They can easily be used to link the environmental selection forces in MR imaging-defined habitats to the evolutionary dynamics of cancer cells.


Imaging can have an enormous role in the development and implementation of patient-specific therapies in cancer. The achievement of this goal will require new methods that expand and ultimately replace the current subjective qualitative assessments of tumor characteristics. The need for quantitative imaging has been clearly recognized by the National Cancer Institute and has resulted in formation of the Quantitative Imaging Network. A critical objective of this imaging consortium is to use objective, reproducible, and quantitative feature metrics extracted from clinical images to develop patient-specific imaging-based prognostic models and personalized cancer therapies.

It is increasingly clear that tumors are not homogeneous organlike systems. Rather, they contain regional coalitions of ecologic communities that consist of evolving cancer, stroma, and immune cell populations. The clinical consequence of such niche variations is that spatial and temporal variations of tumor phenotypes will inevitably evolve and present substantial challenges to targeted therapies. Hence, future research in cancer imaging will likely focus on spatially explicit analysis of tumor regions.

Clinical imaging can readily characterize regional variations in blood flow, cell density, and necrosis. When viewed in a Darwinian evolutionary context, these features reflect regional variations in environmental selection forces and can, at least in principle, be used to predict the likely adaptive strategies of the local cancer population. Hence, analyses of radiologic data can be used to inform evolutionary models and then can be mapped to regional population dynamics. Ecologic and evolutionary principles may provide a theoretical framework to link imaging to the cellular and molecular features of cancer cells and ultimately lead to a more comprehensive understanding of specific cancer biology in individual patients.



  • Marked heterogeneity in genetic properties of different cells in the same tumor is typical and reflects ongoing intratumoral evolution.
  • Evolution within tumors is governed by Darwinian dynamics, with identifiable environmental selection forces that interact with phenotypic (not genotypic) properties of tumor cells in a predictable and reproducible manner; clinical imaging is uniquely suited to measure temporal and spatial heterogeneity within tumors that is both a cause and a consequence of this evolution.
  • Subjective radiologic descriptors of cancers are inadequate to capture this heterogeneity and must be replaced by quantitative metrics that enable statistical comparisons between features describing intratumoral heterogeneity and clinical outcomes and molecular properties.
  • Spatially explicit mapping of tumor regions, for example by superimposing multiple imaging sequences, may permit patient-specific characterization of intratumoral evolution and ecology, leading to patient- and tumor-specific therapies.
  • We summarize current information on quantitative analysis of radiologic images and propose that future quantitative imaging must become spatially explicit to identify intratumoral habitats before and during therapy.

Disclosures of Conflicts of Interest: R.A.G. No relevant conflicts of interest to disclose. O.G. No relevant conflicts of interest to disclose. R.J.G. No relevant conflicts of interest to disclose.



The authors thank Mark Lloyd, MS; Joel Brown, PhD; Dmitry Goldgoff, PhD; and Larry Hall, PhD, for their input to image analysis and for their lively and informative discussions.


  • Received December 18, 2012; revision requested February 5, 2013; revision received March 11; accepted April 9; final version accepted April 29.
  • Funding: This research was supported by the National Institutes of Health (grants U54CA143970-01, U01CA143062, R01CA077575, and R01CA170595).



Diagnostics and Biomarkers: Novel Genomics Industry Trends vs Present Market Conditions and Historical Scientific Leaders Memoirs

Larry H Bernstein, MD, FCAP, Author and Curator

This article has two parts:

  • Part 1: Novel Genomics Industry Trends in Diagnostics and Biomarkers vs Present Market Transient Conditions


  • Part 2: Historical Scientific Leaders Memoirs


Part 1: Novel Genomics Industry Trends in Diagnostics and Biomarkers vs Present Market Transient Conditions


Based on “Forging a path from companion diagnostics to holistic decision support,” L.E.K. Executive Insights, 2013;14(12).

Companion diagnostics and their companion therapies are defined here as a method of identifying

  • LIKELY responders to therapies that are specific for patients with a particular molecular profile.

The consequence of this definition is that a diagnostic restricted to specific patient types gives access to

  • novel therapies that may otherwise not be approved or reimbursed in other, perhaps “similar,” patients
  • who lack the key identifier(s) needed to permit that therapy,
  • and who therefore have a poor expected response.

The concept is new because:

(1) The diagnoses may be closely related by classical criteria, but at the same time they are
not alike with respect to efficacy of treatment with a standard therapy.
(2) The companion diagnostics is restricted to dealing with a targeted drug-specific question
without regard to other clinical issues.
(3) The efficacy issue it clarifies relies on a deep molecular/metabolic insight that is available only through
emergent genomic/proteomic analysis, which has become accessible at rapidly declining cost.

The limitation example given is HER2 testing for use of Herceptin in therapy for non-candidates (HER2-negative patients).
The problem is that the current format is a “one test/one drug” match, but decision support may require a combination of

  • validated biomarkers obtained on a small biopsy sample (technically manageable), with potentially confusing results.

While HER2-negative patients are more likely to be pre-menopausal with a more aggressive tumor than postmenopausal patients,

  • the HER2-negative designation does not preclude treatment with Herceptin.

So the Herceptin would be given in combination, but with what other drug in a non-candidate?

The point that L.E.K. makes is that, beyond providing highly validated biomarkers linked to approved therapies, it is necessary to pursue more holistic decision support tests that interrogate multiple biomarkers (panels of companion diagnostic markers) and discover signatures for treatments, used together with a broad range of information, such as,

  • traditional tests,
  • imaging,
  • clinical trials,
  • outcomes data,
  • EMR data,
  • reimbursement and coverage data.

A comprehensive solution of this nature appears to be some distance from realization.  However, is this the direction that will lead to tomorrow’s treatment decision support approaches?

Surveying the Decision Support Testing Landscape

As a starting point, L.E.K. characterized the landscape of available tests in the U.S. that inform treatment decisions, compiled from ~50 leading diagnostics companies operating in the U.S. from 2004 to 2011. L.E.K. identified more than 200 decision support tests that were classified by test purpose and, more specifically, by whether tests inform treatment decisions for a single drug/class (e.g., companion diagnostics) vs. more holistic treatment decisions across multiple drugs/classes (i.e., multi-agent response tests).

Treatment Decision Support Tests

Companion Diagnostics (single drug/class): predict response/safety or guide dosing of a single drug or class

  • HercepTest (Dako): determines HER2 protein overexpression for Herceptin treatment selection
  • Vysis ALK Break Apart FISH (Abbott Labs): predicts the NSCLC patient response to Xalkori

Other Decision Support (multiple drugs/classes): provide prognostic and predictive information on the benefit of treatment

  • Oncotype Dx (Genomic Health, Inc.): predicts both recurrence of breast cancer and potential patient benefit from chemotherapy regimens
  • PML-RARα (Clarient, Inc.): predicts response to all-trans retinoic acid (ATRA) and other chemotherapy agents
  • TRUGENE (Siemens): measures resistance to multiple HIV-1 anti-retroviral agents

Multi-agent Response: inform targeted therapy class selection by interrogating a panel of biomarkers

  • Target Now (Caris Life Sciences): examines a tumor’s molecular profile to tailor treatment options
  • ResponseDX: Lung (Response Genetics, Inc.): examines multiple biomarkers to guide therapeutic treatment decisions for NSCLC patients

Source: L.E.K. Analysis

Includes IVD and LDT tests from

  1. top-15 IVD test suppliers,
  2. top-four large reference labs,
  3. top-five AP labs, and
  4. top-20 specialty reference labs.

For descriptive purposes only; may not map to exact regulatory labeling.

Most tests are companion diagnostics and other decision support tests that provide guidance on

  • single drug/class therapy decisions.

However, holistic decision support tests (e.g., multi-agent response) are growing the fastest, at a 56% CAGR.
The emergence of multi-agent response tests suggests diagnostics companies are already seeing the need to aggregate individual tests (e.g., companion diagnostics) into panels of appropriate markers addressing a given clinical decision need. L.E.K. believes this trend is likely to continue as

  • increasing numbers of biomarkers become validated for diseases, and as
  • multiplexing tools that enable the aggregation of multiple biomarker interrogations into a single test

are deployed in the clinic.
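As a rough illustration of how individual companion-diagnostic results might be aggregated into a single panel readout, here is a minimal rules-based sketch in Python. The marker names, result values, and suggested actions are hypothetical placeholders, not taken from any actual product.

```python
# Hypothetical sketch of a multi-marker decision support panel.
# Markers, result values, and actions are illustrative only.
PANEL_RULES = [
    ("HER2", "positive",   "consider HER2-targeted therapy"),
    ("ALK",  "rearranged", "consider ALK inhibitor"),
    ("EGFR", "mutated",    "consider EGFR inhibitor"),
]

def interpret_panel(results):
    """Map a dict of marker results to a list of treatment suggestions."""
    suggestions = []
    for marker, actionable_value, action in PANEL_RULES:
        if results.get(marker) == actionable_value:
            suggestions.append(action)
    return suggestions or ["no targeted option indicated by this panel"]

print(interpret_panel({"HER2": "negative", "ALK": "rearranged", "EGFR": "wild-type"}))
# -> ['consider ALK inhibitor']
```

The point of the sketch is only that aggregation replaces N separate one-test/one-drug reports with a single readout over a shared marker panel.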

Personalized Medicine Partnerships

L.E.K. also completed an assessment of publicly available personalized medicine partnership activity from 2009-2011 for ~150 leading organizations operating in the U.S. to look at broader decision support trends and emergence of more holistic solutions beyond diagnostic tests.

A survey of partnership deals was conducted for

  • top-10 academic medical center research institutions,
  • top-25 biopharma,
  • top-four healthcare IT companies,
  • top-three healthcare imaging companies,
  • top-20 IVD manufacturers,
  • top-20 laboratories,
  • top-10 payers/PBMs,
  • top-15 personalized healthcare companies,
  • top-10 regulatory/guideline entities, and
  • top-20 tools vendors for the period of 01/01/2009 – 12/31/2011.
    Source: Company websites, GenomeWeb, L.E.K. analysis

Across the sample, we identified 189 publicly announced partnerships, of which ~65% focused on more traditional areas (biomarker discovery, companion diagnostics and targeted therapies). However, a significant portion (~30%) included elements geared toward creating more holistic decision support models.

Partnerships categorized as holistic decision support by L.E.K. were focused on

  • mining large patient datasets (e.g., from payers or providers),
  • molecular profiling (e.g., deploying next-generation sequencing),
  • creating information technology (IT) infrastructure needed to enable holistic decision support models and
  • integrating various datasets to create richer decision support solutions.

Interestingly, holistic decision support partnerships often included stakeholders outside of biopharma and diagnostics such as

  • research tools,
  • payers/PBMs,
  • healthcare IT companies as well as
  • emerging personalized healthcare (PHC) companies (e.g., Knome, Foundation Medicine and 23andMe).

This finding suggests that these new stakeholders will be increasingly important in influencing care decisions going forward.

Holistic Treatment Decision Support

Molecular Profiling

  • Life Technologies: sequencing of triple-negative breast cancer patients to identify potential treatment strategies
  • Foundation Medicine: deployment of a cancer genomics analysis platform to support Novartis clinical research efforts

Predictive Genomics

  • Clarient, Inc. (GE Healthcare): biomarker profiling of patients within Acorn’s network of providers to support clinical research efforts
  • Beth Israel Deaconess Medical Center: whole genome analysis to guide patient management

Outcomes Data Mining (partners not named in the source)

  • evaluate comparative effectiveness of selected marketed therapies
  • leverage information linking drug response and CYP2C9/CYP2C19 variation
  • leverage patient genotype, phenotype and outcome for treatment decisions and targeted therapeutics

Healthcare IT Infrastructure

  • deploy IBM’s Watson-based solution for evidence-based healthcare decision-making support
  • Moffitt Cancer Center: deploy Oracle’s informatics platform to store and manage patient medical information

Data Integration

  • Siemens Diagnostics / Susquehanna Health: integration of imaging and laboratory diagnostics
  • integration of advanced tissue diagnostics, digital pathology, annotated biorepository and EMR to create next-generation treatment decision support solutions
  • GE Healthcare: integration of genomics with imaging data in CVD

L.E.K. believes the likely debate won’t center on which models and companies will prevail. It appears that the industry is now moving along the continuum to a truly holistic capability.
The mainstay of personalized medicine today will become integrated and enhanced by other data.

The companies that succeed will be able to capture vast amounts of information

  • and synthesize it for personalized care.

Holistic models will be powered by increasingly larger datasets and sophisticated decision-making algorithms.
This will require the participation of an increasingly broad range of participants to provide the

  • science, technologies, infrastructure and tools necessary for deployment.

There are a number of questions posed by this study, but only some are of interest to this discussion:

Group A.    Pharmaceuticals and Devices

  • How will holistic decision support impact the landscape?
    (e.g., treatment/testing algorithms, decision making, clinical trials)

Group B.     Diagnostics and   Decision Support

  •   What components will be required to build out holistic solutions?

– Testing technologies

– Information (e.g., associations, outcomes, trial databases, records)

– IT infrastructure for data integration and management, simulation and reporting

  •  How can various components be brought together to build seamless holistic  decision support solutions?

Group C.      Providers and Payers

  • In which areas should models be deployed over time?
  • Where are the clinical and economic arguments most compelling?

Part 2: Historical Scientific Leaders Memoirs – Realtime Clinical Expert Support

Gil David and Larry Bernstein, in consultation with Prof. Ronald Coifman
of the Yale University Applied Mathematics Program, have developed

a software system that is the equivalent of an intelligent Electronic Health Records Dashboard that

  • provides empirical medical reference and
  • suggests quantitative diagnostic options.

The current design of the Electronic Medical Record (EMR) is a linear presentation of portions of the record

  • by services
  • by diagnostic method, and
  • by date, to cite examples.

This allows perusal through a graphical user interface (GUI) that partitions the information or necessary reports

  • in a workstation, accessed by keying on icons.

This requires that the medical practitioner find the

  • history,
  • medications,
  • laboratory reports,
  • cardiac imaging and
  • EKGs, and
  • radiology in different workspaces.

The introduction of a DASHBOARD has allowed a presentation of

  • drug reactions
  • allergies
  • primary and secondary diagnoses, and
  • critical information

about any patient whose record the caregiver needs to access.

The advantage of this innovation is obvious.  The startup problem is what information is presented and

  • how it is displayed, which is a source of variability and a key to its success.

We are proposing an innovation that supersedes the main design elements of a DASHBOARD and utilizes

  • the conjoined syndromic features of the disparate data elements.

So the important determinant of the success of this endeavor is that

  • it facilitates both the workflow and the decision-making process with a reduction of medical error.

Continuing work is in progress in extending the capabilities with model datasets, and sufficient data because

  • the extraction of data from disparate sources will, in the long run, further improve this process.

Consider, for instance, the finding of ST depression on the EKG coincident with an elevated cardiac biomarker (troponin), particularly in the absence of substantially reduced renal function. The conversion of hematology-based data into useful clinical information requires the establishment of problem-solving constructs based on the measured data.

The most commonly ordered test used for managing patients worldwide is the hemogram that often incorporates

  • the review of a peripheral smear.

While the hemogram has undergone progressive modification of the measured features over time, the subsequent expansion of the panel of tests has provided a window into the cellular changes in the

  • production
  • release
  • or suppression

of the formed elements from the blood-forming organ into the circulation. In the hemogram one can view

  • data reflecting the characteristics of a broad spectrum of medical conditions.

Progressive modification of the measured features of the hemogram has delineated characteristics expressed as measurements of

  • size
  • density, and
  • concentration,

resulting in many characteristic features of classification. In the diagnosis of hematological disorders

  • proliferation of marrow precursors, the
  • domination of a cell line, and features of
  • suppression of hematopoiesis

provide a two dimensional model.  Other dimensions are created by considering

  • the maturity of the circulating cells.

The application of rules-based, automated problem solving should provide a valid approach to

  • the classification and interpretation of the data used to determine a knowledge-based clinical opinion.

The exponential growth of knowledge since the mapping of the human genome has been enabled by parallel advances in applied mathematics that have not been part of traditional clinical problem solving.

As the complexity of statistical models has increased

  • the dependencies have become less clear to the individual.

Contemporary statistical modeling has a primary goal of finding an underlying structure in studied data sets.
The development of an evidence-based inference engine that can substantially interpret the data at hand and

  • convert it in real time to a “knowledge-based opinion”

could improve clinical decision-making by incorporating

  • multiple complex clinical features as well as duration of onset into the model.

An example of a difficult area for clinical problem solving is found in the diagnosis of SIRS and associated sepsis. SIRS (and associated sepsis) is a costly diagnosis in hospitalized patients.  Failure to diagnose sepsis in a timely manner creates a potential financial and safety hazard.  The early diagnosis of SIRS/sepsis is made by the clinician’s application of defined criteria:

  • temperature
  • heart rate
  • respiratory rate and
  • WBC count

The application of those clinical criteria, however, defines the condition after it has developed and

  • has not provided a reliable method for the early diagnosis of SIRS.
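The four criteria above can be expressed as a simple computation. The sketch below uses the conventional consensus thresholds for SIRS (two or more criteria met); it is an illustration of the rule structure, not a validated clinical tool, and it omits the PaCO2 and band-count variants of the criteria.

```python
def sirs_criteria_met(temp_c, hr, rr, wbc_k):
    """Count the four classic SIRS criteria that are met.
    Thresholds follow the standard consensus definition."""
    criteria = [
        temp_c > 38.0 or temp_c < 36.0,  # temperature, deg C
        hr > 90,                         # heart rate, beats/min
        rr > 20,                         # respiratory rate, breaths/min
        wbc_k > 12.0 or wbc_k < 4.0,     # WBC, 10^3 cells/uL
    ]
    return sum(criteria)

def meets_sirs(temp_c, hr, rr, wbc_k):
    # SIRS is conventionally defined as two or more criteria met.
    return sirs_criteria_met(temp_c, hr, rr, wbc_k) >= 2

print(meets_sirs(38.6, 104, 24, 13.5))  # -> True
```

Note how the rule only fires after the physiology has already crossed the thresholds, which is exactly the late-detection limitation the text describes.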

The early diagnosis of SIRS may possibly be enhanced by the measurement of proteomic biomarkers, including

  • transthyretin
  • C-reactive protein
  • procalcitonin
  • mean arterial pressure

Immature granulocyte (IG) measurement has been proposed as a

  • readily available indicator of the presence of granulocyte precursors (left shift).

The use of such markers, obtained by automated systems

  • in conjunction with innovative statistical modeling, provides
  • a promising approach to enhance workflow and decision making.

Such a system utilizes the conjoined syndromic features of

  • disparate data elements with an anticipated reduction of medical error.

How we frame our expectations is so important that it determines

  • the data we collect to examine the process.

In the absence of data to support an assumed benefit, there is no proof of validity at whatever cost.
This has meaning for

  • hospital operations,
  • for nonhospital laboratory operations,
  • for companies in the diagnostic business, and
  • for planning of health systems.

The problem was stated by L.L. Weed in “Idols of the Mind” (Dec 13, 2006): “a root cause of a major defect in the health care system is that, while we falsely admire and extol the intellectual powers of highly educated physicians, we do not search for the external aids their minds require.”  HIT use has been

  • focused on information retrieval, leaving
  • the unaided mind burdened with information processing.

We deal with problems in the interpretation of data presented to the physician, and how through better

  • design of the software that presents this data the situation could be improved.

The computer architecture that the physician uses to view the results is more often than not presented

  • as the designer would prefer, and not as the end-user would like.

In order to optimize the interface for the physician, the system would have a “front-to-back” design, with
the call up for any patient ideally consisting of a dashboard design that presents the crucial information

  • that the physician would likely act on in an easily accessible manner.

The key point is that each item used has to be closely related to a corresponding criterion needed for a decision.

Feature Extraction.

This further breakdown in the modern era is determined by genetically characteristic gene sequences
that are transcribed into what we measure.  Eugene Rypka contributed greatly to clarifying the extraction
of features in a series of articles, which

  • set the groundwork for the methods used today in clinical microbiology.

The method he describes is termed S-clustering, and

  • will have a significant bearing on how we can view laboratory data.

He describes S-clustering as extracting features from endogenous data that

  • amplify or maximize structural information to create distinctive classes.

The method classifies by taking the number of features

  • with sufficient variety to map into a theoretic standard.

The mapping is done by

  • a truth table, and each variable is scaled to assign values for each message choice.

The number of messages and the number of choices forms an N-by-N table.  He points out that the message

  • choice in an antibody titer would be converted from 0 + ++ +++ to 0 1 2 3.

Even though there may be a large number of measured values, the variety is reduced

  • by this compression, at some risk of loss of information.
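The titer example can be made concrete. The sketch below illustrates only the compression step — raw qualitative readings mapped onto a small ordinal scale of message choices — and is not Rypka’s actual S-clustering procedure.

```python
# Sketch of the message-choice scaling step described in the text:
# qualitative antibody-titer readings are compressed to ordinal values
# before classification, reducing variety at some risk of information loss.
TITER_SCALE = {"0": 0, "+": 1, "++": 2, "+++": 3}

def scale_observations(raw_readings):
    """Compress raw titer readings to ordinal message choices."""
    return [TITER_SCALE[r] for r in raw_readings]

print(scale_observations(["0", "++", "+++", "+"]))  # -> [0, 2, 3, 1]
```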

Yet the real issue is how a combination of variables falls into a table with meaningful information. We are concerned with accurate assignment into uniquely variable groups by information in test relationships. One determines the effectiveness of each variable by

  • its contribution to information gain in the system.

The reference or null set is the class having no information.  Uncertainty in assigning to a classification is

  • only relieved by providing sufficient information.

The possibility of realizing a good model for approximating the effects of factors supported by the data used

  • for inference owes much to the discovery of the Kullback-Leibler distance, or “information”, and Akaike
  • found a simple relationship between K-L information and Fisher’s maximized log-likelihood function.

In the last 60 years the application of entropy comparable to

  • the entropy of physics, information, noise, and signal processing,
  • has been fully developed by Shannon, Kullback, and others, and has been integrated with modern statistics,
  • as a result of the seminal work of Akaike, Leo Goodman, Magidson and Vermunt, and work by Coifman.

Gil David et al. introduced AUTOMATED processing of the data available to the ordering physician, which

  • can be anticipated to have an enormous impact on the diagnosis and treatment of perhaps half of the top 20 most common
  • causes of hospital admission that carry a high cost and morbidity.

For example: anemias (iron deficiency, vitamin B12 and folate deficiency, and hemolytic anemia or myelodysplastic syndrome); pneumonia; systemic inflammatory response syndrome (SIRS) with or without bacteremia; multiple organ failure and hemodynamic shock; electrolyte/acid base balance disorders; acute and chronic liver disease; acute and chronic renal disease; diabetes mellitus; protein-energy malnutrition; acute respiratory distress of the newborn; acute coronary syndrome; congestive heart failure; disordered bone mineral metabolism; hemostatic disorders; leukemia and lymphoma; malabsorption syndromes; and cancer(s)[breast, prostate, colorectal, pancreas, stomach, liver, esophagus, thyroid, and parathyroid].

Rudolph RA, Bernstein LH, Babb J: Information-Induction for the diagnosis of myocardial infarction. Clin Chem 1988;34:2031-2038.

Bernstein LH (Chairman). Prealbumin in Nutritional Care Consensus Group.

Measurement of visceral protein status in assessing protein and energy malnutrition: standard of care. Nutrition 1995; 11:169-171.

Bernstein LH, Qamar A, McPherson C, Zarich S, Rudolph R. Diagnosis of myocardial infarction: integration of serum markers and clinical descriptors using information theory. Yale J Biol Med 1999; 72: 5-13.

Kaplan LA, Chapman JF, Bock JL, Santa Maria E, Clejan S, Huddleston DJ, Reed RG, Bernstein LH, Gillen-Goldstein J. Prediction of respiratory distress syndrome using the Abbott FLM-II amniotic fluid assay. The National Academy of Clinical Biochemistry (NACB) Fetal Lung Maturity Assessment Project.  Clin Chim Acta 2002; 326(8): 61-68.

Bernstein LH, Qamar A, McPherson C, Zarich S. Evaluating a new graphical ordinal logit method (GOLDminer) in the diagnosis of myocardial infarction utilizing clinical features and laboratory data. Yale J Biol Med 1999; 72:259-268.

Bernstein L, Bradley K, Zarich SA. GOLDmineR: improving models for classifying patients with chest pain. Yale J Biol Med 2002; 75:183-198.

Ronald Raphael Coifman and Mladen Victor Wickerhauser. Adapted Waveform Analysis as a Tool for Modeling, Feature Extraction, and Denoising. Optical Engineering, 33(7):2170–2174, July 1994.

R. Coifman and N. Saito. Constructions of local orthonormal bases for classification and regression. C. R. Acad. Sci. Paris, 319 Série I:191-196, 1994.

Realtime Clinical Expert Support and Validation System

We have developed a software system that is the equivalent of an intelligent Electronic Health Records Dashboard that provides empirical medical reference and suggests quantitative diagnostics options.

The primary purpose is to

  1. gather medical information,
  2. generate metrics,
  3. analyze them in realtime and
  4. provide a differential diagnosis,
  5. meet the highest standard of accuracy.

The system builds a unique characterization of each patient and provides a list of other patients who share this profile, thereby drawing on the vast aggregated knowledge (diagnosis, analysis, treatment, etc.) of the medical community. The

  • main mathematical breakthroughs are provided by accurate patient profiling and inference methodologies,
  • in which anomalous subprofiles are extracted and compared to potentially relevant cases.

As the model grows and its knowledge database is extended, the diagnostic and prognostic capabilities become more accurate and precise. We anticipate that implementing this diagnostic amplifier would result in

  • higher physician productivity at a time of great human resource limitations,
  • safer prescribing practices,
  • rapid identification of unusual patients,
  • better assignment of patients to observation, inpatient beds,
    intensive care, or referral to clinic,
  • shortened patient ICU stays and total bed days.

The main benefit is a real time assessment as well as diagnostic options based on

  • comparable cases,
  • flags for risk and potential problems

as illustrated in the following case, acquired on 04/21/10. The patient was diagnosed by our system with severe SIRS at a grade of 0.61.
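As a sketch of the profile-matching idea described above, patients can be ranked by the distance between normalized laboratory feature vectors. The feature set, values, and diagnoses below are hypothetical, and this is an illustration of the mechanism rather than the actual system:

```python
import math

def nearest_profiles(patient, cohort, k=3):
    """Rank cohort records by Euclidean distance between the index patient's
    normalized lab-feature vector and each stored profile (hypothetical features)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    ranked = sorted(cohort, key=lambda rec: dist(patient, rec["features"]))
    return ranked[:k]

# Hypothetical normalized features: [WBC, temperature, heart rate, lactate]
cohort = [
    {"id": "A", "features": [0.9, 0.8, 0.7, 0.9], "dx": "severe SIRS"},
    {"id": "B", "features": [0.2, 0.1, 0.3, 0.1], "dx": "normal"},
    {"id": "C", "features": [0.8, 0.9, 0.8, 0.7], "dx": "sepsis"},
]
matches = nearest_profiles([0.85, 0.8, 0.75, 0.8], cohort, k=2)
print([m["id"] for m in matches])  # ['A', 'C']
```

The actual system extracts anomalous subprofiles rather than comparing whole records, but the nearest-neighbor ranking conveys the basic step of comparing a patient against previously characterized cases.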

Graphical presentation of patient status

The patient was treated for SIRS and the blood tests were repeated during the following week. The full combined record of our system’s assessment of the patient, as derived from the further hematology tests, is illustrated below. The yellow line shows the diagnosis that corresponds to the first blood test (as also shown in the image above). The red line shows the next diagnosis that was performed a week later.

Progression changes in patient ICU stay with SIRS

Chemistry of Herceptin [Trastuzumab] is explained with images in



The Cost Burden of Disease: U.S. and Michigan CHRT Brief. January 2010.

The National Hospital Bill: The Most Expensive Conditions by Payer, 2006. HCUP Brief #59.

Rudolph RA, Bernstein LH, Babb J: Information-Induction for the diagnosis of myocardial infarction. Clin Chem 1988;34:2031-2038.


W Ruts, S De Deyne, E Ameel, W Vanpaemel,T Verbeemen, And G Storms. Dutch norm data for 13 semantic categories and 338 exemplars. Behavior Research Methods, Instruments, & Computers 2004; 36 (3): 506–515.

De Deyne, S Verheyen, E Ameel, W Vanpaemel, MJ Dry, WVoorspoels, and G Storms.  Exemplar by feature applicability matrices and other Dutch normative data for semantic concepts.  Behavior Research Methods 2008; 40 (4): 1030-1048

Landauer, T. K., Ross, B. H., & Didner, R. S. (1979). Processing visually presented single words: A reaction time analysis [Technical memorandum].  Murray Hill, NJ: Bell Laboratories. Lewandowsky, S. (1991).

Weed L. Automation of the problem oriented medical record. NCHSR Research Digest Series DHEW. 1977;(HRA)77-3177.

Naegele TA. Letter to the Editor. Amer J Crit Care 1993:2(5):433.

Retinal prosthetic strategy with the capacity to restore normal vision, Sheila Nirenberg and Chethan Pandarinath


Other related articles published include the following:


  • The Automated Second Opinion Generator

Larry H Bernstein, MD, FCAP


  • The electronic health record: How far we have travelled and where is journey’s end

Larry H Bernstein, MD, FCAP


  • The potential contribution of informatics to healthcare is more than currently estimated.

Larry H Bernstein, MD, FCAP


  • Clinical Decision Support Systems for Management Decision Making of Cardiovascular Diseases

Justin Pearlman, MD, PhD, FACC and Aviva Lev-Ari, PhD, RN


  • Demonstration of a diagnostic clinical laboratory neural network applied to three laboratory data conditioning problems

Larry H Bernstein, MD, FCAP


  • CRACKING THE CODE OF HUMAN LIFE: The Birth of BioInformatics & Computational Genomics

Larry H Bernstein, MD, FCAP


  • Genetics of conduction disease atrioventricular AV conduction disease block gene mutations transcription excitability and energy homeostasis

Aviva Lev-Ari, PhD, RN


  • Identification of biomarkers that are related to the actin cytoskeleton

Larry H Bernstein, MD, FCAP


  • Regression: A richly textured method for comparison of predictor variables

Larry H Bernstein, MD, FCAP


  • Diagnostic evaluation of SIRS by immature granulocytes

Larry H Bernstein, MD, FCAP


  • Big data in genomic medicine

Larry H Bernstein, MD, FCAP


  • Automated inferential diagnosis of SIRS, sepsis, septic shock

Larry H Bernstein, MD, FCAP


  • A Software Agent for Diagnosis of ACUTE MYOCARDIAL INFARCTION

Isaac E. Mayzlin, Ph.D., David Mayzlin and Larry H. Bernstein, MD, FCAP


  • Artificial Vision: Cornell and Stanford Researchers crack Retinal Code

Reporter: Aviva Lev-Ari, PhD, RN


  • Vinod Khosla: 20 doctor included speculations, musings of a technology optimist or technology will replace 80 percent of what doctors do

Aviva Lev-Ari, PhD, RN


  • Biomaterials Technology: Models of Tissue Engineering for Reperfusion and Implantable Devices for Revascularization

Larry H Bernstein, MD, FACP and Aviva Lev-Ari, PhD, RN


  • The Heart: Vasculature Protection – A Concept-based Pharmacological Therapy including THYMOSIN

Aviva Lev-Ari, PhD, RN 2/28/2013


  • FDA Pending 510(k) for The Latest Cardiovascular Imaging Technology

Aviva Lev-Ari, PhD, RN 1/28/2013


  • PCI Outcomes, Increased Ischemic Risk associated with Elevated Plasma Fibrinogen not Platelet Reactivity

Aviva Lev-Ari, PhD, RN 1/10/2013


  • The ACUITY-PCI score: Will it Replace Four Established Risk Scores — TIMI, GRACE, SYNTAX, and Clinical SYNTAX

Aviva Lev-Ari, PhD, RN 1/3/2013


  • Coronary artery disease in symptomatic patients referred for coronary angiography: Predicted by Serum Protein Profiles

Aviva Lev-Ari, PhD, RN 12/29/2012


  • New Definition of MI Unveiled, Fractional Flow Reserve (FFR)CT for Tagging Ischemia

Aviva Lev-Ari, PhD, RN 8/27/2012

Additional Related articles

  • Hospital EHRs Inadequate for Big Data; Need for Specialized -Omics Systems
  • Apple Inc. (AAPL), QUALCOMM, Inc. (QCOM): Disruptions Needed
  • Netsmart Names Dr. Ian Chuang Senior Vice President, Healthcare Informatics and Chief Medical Officer
  • Strategic partnership signals new age of stratified medicine
  • Personalized breast cancer therapeutic with companion diagnostic poised for clinical trials in H2

Read Full Post »

Reproductive Genetic Testing

Reporter and Curator: Sudipta Saha, Ph.D.

Reproductive genetics, a field of medical genetics integrated with reproductive medicine, assisted reproduction, and developmental genetics, involves a wide array of genetic tests that are conducted with the intent of informing individuals about the possible outcomes of current or future pregnancies. The tests themselves can include the analysis of chromosomes, DNA, RNA, genes, and/or gene products to determine whether an alteration is present that is causing or is likely to cause a specific disease or condition.

Types of Tests

In general, reproductive genetic testing involves the following categories of tests:

Carrier testing is performed to determine whether an individual carries one copy of an altered gene for a particular recessive disease. The term recessive refers to diseases that will occur only if both copies of a gene that an individual receives have a disease-associated mutation; thus, each child born to two carriers of a mutation in the same gene has a 25 percent risk of being affected with the disorder. Examples of carrier tests include those for

Couples are likely to have carrier tests if they are at higher risk of having a child with a specific disorder because of their racial or ethnic heritage or family history. Carrier testing is often done in the context of family planning and reproductive health.
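The 25 percent figure quoted above follows directly from Mendelian segregation. A minimal enumeration makes this concrete (here 'a' denotes a hypothetical disease-associated allele and 'A' a normal one):

```python
from itertools import product

def affected_risk(parent1, parent2):
    """Probability a child inherits two disease alleles ('a'), given each
    parent's pair of alleles, by enumerating all equally likely combinations."""
    outcomes = list(product(parent1, parent2))
    affected = sum(1 for pair in outcomes if pair == ("a", "a"))
    return affected / len(outcomes)

# Two carriers, each with one normal and one disease-associated allele:
print(affected_risk(("A", "a"), ("A", "a")))  # 0.25
```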

Preimplantation diagnosis is used following in vitro fertilization to diagnose a genetic disease or condition in a preimplantation embryo. Preimplantation genetic diagnosis is essentially an alternative to prenatal diagnosis, as it allows testing to occur months earlier than conventional tests such as amniocentesis (typically performed around week 18 of pregnancy), even before a pregnancy begins. Doctors can test a single cell from an eight-cell embryo that is just days old to determine, among other things, whether it is male or female. This can provide crucial information for genetic diseases that afflict just one sex. Preimplantation genetic diagnosis has been applied to patients carrying chromosomal rearrangements, such as translocations, in which it has been proven to decrease the number of spontaneous abortions and prevent the birth of children affected with chromosome imbalances. Preimplantation genetic diagnosis techniques have also been applied to

  • increase implantation rates,
  • reduce the incidence of spontaneous abortion, and
  • prevent trisomic offspring in women of advanced maternal age undergoing fertility treatment.

A third group of patients receiving preimplantation genetic diagnosis are those at risk of transmitting a single gene disorder to their offspring. The number of monogenic disorders that have been diagnosed in preimplantation embryos has increased each year. So far, at least 700 healthy babies have been born worldwide following the procedure, and the number is growing rapidly.

Prenatal diagnosis is used to diagnose a genetic disease or condition in a developing fetus.

The techniques currently in use or under investigation for prenatal diagnosis include

  • (1) fetal tissue sampling through amniocentesis, chorionic villi sampling (CVS), percutaneous umbilical blood sampling, percutaneous skin biopsy, and other organ biopsies, including muscle and liver biopsy;
  • (2) fetal visualization through ultrasound, fetal echocardiography, embryoscopy, fetoscopy, magnetic resonance imaging, and radiography;
  • (3) screening for neural tube defects by measuring maternal serum alpha-fetoprotein (MSAFP);
  • (4) screening for fetal Down Syndrome by measuring MSAFP, unconjugated estriol, and human chorionic gonadotropin;
  • (5) separation of fetal cells from the mother’s blood; and
  • (6) preimplantation biopsy of blastocysts obtained by in vitro fertilization.

The more common techniques are amniocentesis, performed at the 14th to 20th week of gestation, and CVS, performed between the 9th and 13th week of gestation. If the fetus is found to be affected with a disorder, the couple can plan for the birth of an affected child or opt for elective abortion.

Newborn screening is performed in newborns on a public health basis by the states to detect certain genetic diseases for which early diagnosis and treatment are available. Newborn screening is one of the largest public health activities in the United States. It is aimed at the early identification of infants who are affected by certain genetic, metabolic or infectious conditions, reaching approximately 4 million children born each year. According to the Centers for Disease Control and Prevention (CDC), approximately 3,000 babies each year in the United States are found to have severe disorders detected through screening. States test blood spots collected from newborns for 2 to over 30 metabolic and genetic diseases, such as

  • phenylketonuria,
  • hypothyroidism,
  • galactosemia,
  • sickle cell disease, and
  • medium chain acyl CoA dehydrogenase deficiency.

The goal of this screening is to identify affected newborns quickly in order to provide treatment that can prevent mental retardation, severe illness or death.

It is possible that somatic cell nuclear transfer (cloning) techniques could eventually be employed for the purposes of reproductive genetic testing. In addition, germline gene transfer is a technique that could be used to test and then alter the genetic makeup of the embryo. To date, however, these techniques have not been used in human studies.

Ethical Issues

Any procedure that provides information that could lead to a decision to terminate a pregnancy is not without controversy. Although prenatal diagnosis has been routine for nearly 20 years, some ethicists remain concerned that the ability to eliminate potential offspring with genetic defects contributes to making society overall less tolerant of disability. Others have argued that prenatal diagnosis is sometimes driven by economic concerns because as a society we have chosen not to provide affordable and accessible health care to everyone. Thus, prenatal diagnosis can save money by preventing the birth of defective and costly children. For reproductive genetic procedures that involve greater risk to the fetus, e.g., preimplantation diagnosis, concerns remain about whether the diseases being averted warrant the risks involved in the procedures themselves. These concerns are likely to escalate should

  • cloning or
  • germline gene transfer

be undertaken as a way to genetically test and select healthy offspring.


Read Full Post »


Larry H Bernstein, MD
Leaders in Pharmaceutical Intelligence


I call attention to an interesting article that just came out.   The estimated improvement in healthcare cost savings and diagnostic accuracy is substantial.   I have written about the unused potential that we have not yet seen.  In short, there is justification for substantial investment of resources in this, as has been proposed as a critical goal.  Does this mean a reduction in staffing?  I wouldn’t look at it that way.  The two huge benefits that would accrue are:


  1. workflow efficiency, reducing stress and facilitating decision-making.
  2. scientifically, primary knowledge-based decision support by well-developed algorithms that have been at the heart of computational genomics.




Can computers save health care? IU research shows lower costs, better outcomes

Cost per unit of outcome was $189, versus $497 for treatment as usual

 Last modified: Monday, February 11, 2013


BLOOMINGTON, Ind. — New research from Indiana University has found that machine learning — the same computer science discipline that helped create voice recognition systems, self-driving cars and credit card fraud detection systems — can drastically improve both the cost and quality of health care in the United States.



 Physicians using an artificial intelligence framework that predicts future outcomes would have better patient outcomes while significantly lowering health care costs.



Using an artificial intelligence framework combining Markov Decision Processes and Dynamic Decision Networks, IU School of Informatics and Computing researchers Casey Bennett and Kris Hauser show how simulation modeling that understands and predicts the outcomes of treatment could


  • reduce health care costs by over 50 percent while also
  • improving patient outcomes by nearly 50 percent.


The work by Hauser, an assistant professor of computer science, and Ph.D. student Bennett improves upon their earlier work that


  • showed how machine learning could determine the best treatment at a single point in time for an individual patient.


By using a new framework that employs sequential decision-making, the previous single-decision research


  • can be expanded into models that simulate numerous alternative treatment paths out into the future;
  • maintain beliefs about patient health status over time even when measurements are unavailable or uncertain; and
  • continually plan/re-plan as new information becomes available.

In other words, it can “think like a doctor.”  (Perhaps better because of the limitation in the amount of information a bright, competent physician can handle without error!)


“The Markov Decision Processes and Dynamic Decision Networks enable the system to deliberate about the future, considering all the different possible sequences of actions and effects in advance, even in cases where we are unsure of the effects,” Bennett said.  Moreover, the approach is non-disease-specific — it could work for any diagnosis or disorder, simply by plugging in the relevant information.  (This actually raises the question of what the information input is, and the cost of inputting.)
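To make the deliberation idea concrete, the core of a Markov Decision Process solver fits in a few lines. This is a generic value-iteration sketch over an invented toy model (the states, transition probabilities, and rewards are made up for illustration, not taken from the IU study):

```python
def value_iteration(states, actions, transition, reward, gamma=0.9, tol=1e-6):
    """Generic value iteration for a finite MDP.
    transition[s][a] -> list of (next_state, probability);
    reward[s][a]     -> immediate reward (e.g. outcome gain minus treatment cost)."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            best = max(
                reward[s][a] + gamma * sum(p * V[s2] for s2, p in transition[s][a])
                for a in actions
            )
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

# Toy two-state model with hypothetical numbers: "sick"/"well",
# actions "treat" (costly, likely recovery) vs "wait".
states = ["sick", "well"]
actions = ["treat", "wait"]
transition = {
    "sick": {"treat": [("well", 0.8), ("sick", 0.2)],
             "wait":  [("well", 0.1), ("sick", 0.9)]},
    "well": {"treat": [("well", 1.0)],
             "wait":  [("well", 0.9), ("sick", 0.1)]},
}
reward = {"sick": {"treat": -5.0, "wait": -1.0},
          "well": {"treat": -4.0, "wait": 10.0}}
V = value_iteration(states, actions, transition, reward)
print(V["well"] > V["sick"])  # True
```

In the published framework, states would encode patient health status, actions would be treatment choices, and rewards would trade outcome gains against treatment costs; the Dynamic Decision Network component additionally maintains beliefs over the states when measurements are unavailable or uncertain.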


The new work addresses three vexing issues related to health care in the U.S.:


  1. rising costs, expected to reach 30 percent of the gross domestic product by 2050;
  2. quality of care in which patients receive a correct diagnosis and treatment less than half the time on a first visit; and
  3. a lag time of 13 to 17 years between research and practice in clinical care.

  Framework for Simulating Clinical Decision-Making


“We’re using modern computational approaches to learn from clinical data and develop complex plans through the simulation of numerous, alternative sequential decision paths,” Bennett said. “The framework here easily out-performs the current treatment-as-usual, case-rate/fee-for-service models of health care.”  (see the above)


Bennett is also a data architect and research fellow with Centerstone Research Institute, the research arm of Centerstone, the nation’s largest not-for-profit provider of community-based behavioral health care. The two researchers had access to clinical data, demographics and other information on over 6,700 patients who had major clinical depression diagnoses, of which about 65 to 70 percent had co-occurring chronic physical disorders like diabetes, hypertension and cardiovascular disease.  Using 500 randomly selected patients from that group for simulations, the two


  • compared actual doctor performance and patient outcomes against
  • sequential decision-making models

using real patient data.

They found a great disparity in the cost per unit of outcome change:

  1. the artificial intelligence model cost $189, compared with the treatment-as-usual cost of $497; and
  2. the AI approach obtained a 30 to 35 percent increase in patient outcomes.

Bennett said that “tweaking certain model parameters could enhance the outcome advantage to about 50 percent more improvement at about half the cost.”


While most medical decisions are based on case-by-case, experience-based approaches, there is a growing body of evidence that complex treatment decisions might be effectively improved by AI modeling.  Hauser said “Modeling lets us see more possibilities out to a further point –  because they just don’t have all of that information available to them.”  (Even then, the other issue is the processing of the information presented.)



Using the growing availability of electronic health records, health information exchanges, large public biomedical databases and machine learning algorithms, the researchers believe the approach could serve as the basis for personalized treatment through integration of diverse, large-scale data passed along to clinicians at the time of decision-making for each patient. Centerstone alone, Bennett noted, has access to health information on over 1 million patients each year. “Even with the development of new AI techniques that can approximate or even surpass human decision-making performance, we believe that the most effective long-term path could be combining artificial intelligence with human clinicians,” Bennett said. “Let humans do what they do well, and let machines do what they do well. In the end, we may maximize the potential of both.”



“Artificial Intelligence Framework for Simulating Clinical Decision-Making: A Markov Decision Process Approach” was published recently in Artificial Intelligence in Medicine. The research was funded by the Ayers Foundation, the Joe C. Davis Foundation and Indiana University.


For more information or to speak with Hauser or Bennett, please contact Steve Chaplin, IU Communications, at 812-856-1896 or



IBM Watson Finally Graduates Medical School


It’s been more than a year since IBM’s Watson computer appeared on Jeopardy and defeated several of the game show’s top champions. Since then the supercomputer has been furiously “studying” the healthcare literature in the hope that it can beat a far more hideous enemy: the 400-plus biomolecular puzzles we collectively refer to as cancer.




Anomaly Based Interpretation of Clinical and Laboratory Syndromic Classes

Larry H Bernstein, MD, Gil David, PhD, Ronald R Coifman, PhD.  Program in Applied Mathematics, Yale University, Triplex Medical Science.


Statement of Inferential Second Opinion

Realtime Clinical Expert Support and Validation System

Gil David and Larry Bernstein have developed, in consultation with Prof. Ronald Coifman of the Yale University Applied Mathematics Program, a software system that is the equivalent of an intelligent Electronic Health Records Dashboard: it provides empirical medical reference and suggests quantitative diagnostic options.


The current design of the Electronic Medical Record (EMR) is a linear presentation of portions of the record by

  • services,
  • diagnostic method, and
  • date, to cite examples.

This allows perusal through a graphical user interface (GUI) that partitions the information or necessary reports in a workstation entered by keying to icons.  This requires that the medical practitioner find

  • the history,
  • medications,
  • laboratory reports,
  • cardiac imaging and EKGs, and
  • radiology

in different workspaces.  The introduction of a DASHBOARD has allowed a presentation of

  • drug reactions,
  • allergies,
  • primary and secondary diagnoses, and
  • critical information about any patient whose record the caregiver needs to access.

The advantage of this innovation is obvious.  The startup problem is what information is presented and how it is displayed, which is a source of variability and a key to its success.


We are proposing an innovation that supersedes the main design elements of a DASHBOARD and utilizes the conjoined syndromic features of the disparate data elements.  The important determinant of the success of this endeavor is that it facilitates both

  1. the workflow and
  2. the decision-making process

with a reduction of medical error.  This has become extremely important and urgent in the 10 years since the publication of “To Err is Human”, and the newly published finding that reduction of error is as elusive as reduction in cost.  Whether such efforts are counterproductive when approached in the wrong way may be subject to debate.

We initially confine our approach to laboratory data because it is collected on all patients, ambulatory and acutely ill, because the data are objective and quality controlled, and because laboratory combinatorial patterns emerge with the development and course of disease.  Continuing work is in progress to extend these capabilities with model data sets and sufficient data.

It is true that the extraction of data from disparate sources will, in the long run, further improve this process.  Consider, for instance, the finding of ST depression on EKG coincident with an increase of a cardiac biomarker (troponin) above a level determined by receiver operating characteristic (ROC) analysis, particularly in the absence of substantially reduced renal function.
The conversion of hematology-based data into useful clinical information requires the establishment of problem-solving constructs based on the measured data.  Traditionally this has been accomplished by an intuitive interpretation of the data by the individual clinician.  Through the application of geometric clustering analysis, the data may be interpreted in a more sophisticated fashion in order to create a more reliable and valid knowledge-based opinion.
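As an illustration of clustering applied to such data, a minimal k-means pass over two hypothetical scaled hemogram features is sketched below; the geometric clustering used in practice is considerably more sophisticated than this toy example:

```python
import random

def kmeans(points, k=2, iters=50, seed=0):
    """Minimal k-means: partition feature vectors (e.g. scaled hemogram
    values) into k clusters by Euclidean distance."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            clusters[i].append(p)
        # Recompute each center as the mean of its cluster.
        centers = [
            [sum(col) / len(c) for col in zip(*c)] if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers, clusters

# Hypothetical scaled [WBC, immature-granulocyte fraction] pairs:
points = [[0.1, 0.05], [0.15, 0.1], [0.9, 0.8], [0.85, 0.9], [0.2, 0.1]]
centers, clusters = kmeans(points, k=2)
print(sorted(len(c) for c in clusters))  # the two well-separated groups
```

On well-separated data like this, the three low-valued profiles and the two high-valued ones fall into distinct clusters, which is the sense in which combinatorial laboratory patterns can be grouped into syndromic classes.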
The most commonly ordered test used for managing patients worldwide is the hemogram, which often incorporates the review of a peripheral smear.  While the hemogram has undergone progressive modification of its measured features over time, the subsequent expansion of the panel of tests has provided a window into the cellular changes in the production, release or suppression of the formed elements from the blood-forming organ to the circulation.  In the hemogram one can view data reflecting the characteristics of a broad spectrum of medical conditions.
Progressive modification of the measured features of the hemogram has delineated characteristics expressed as measurements of
  • size,
  • density, and
  • concentration,
resulting in more than a dozen composite variables, including the
  1. mean corpuscular volume (MCV),
  2. mean corpuscular hemoglobin concentration (MCHC),
  3. mean corpuscular hemoglobin (MCH),
  4. total white cell count (WBC),
  5. total lymphocyte count,
  6. neutrophil count (mature granulocyte count and bands),
  7. monocytes,
  8. eosinophils,
  9. basophils,
  10. platelet count, and
  11. mean platelet volume (MPV),
  12. blasts,
  13. reticulocytes and
  14. platelet clumps,
  15. perhaps the percent immature neutrophils (not bands)
  16. as well as other features of classification.
The use of such variables, combined with additional clinical information including serum chemistry analysis (such as the Comprehensive Metabolic Profile (CMP)), in conjunction with the clinical history and examination, completes the traditional problem-solving construct. The intuitive approach applied by the individual clinician is limited, however,
  1. by experience,
  2. memory and
  3. cognition.
The application of rules-based, automated problem solving may provide a more reliable and valid approach to the classification and interpretation of the data used to determine a knowledge-based clinical opinion.
The classification of the available hematologic data in order to formulate a predictive model may be accomplished through mathematical models that offer a more reliable and valid approach than the intuitive knowledge-based opinion of the individual clinician.  The exponential growth of knowledge since the mapping of the human genome has been enabled by parallel advances in applied mathematics that have not been a part of traditional clinical problem solving.  In a univariate universe the individual has significant control in visualizing data because unlike data may be identified by methods that rely on distributional assumptions.  As the complexity of statistical models has increased, involving the use of several predictors for different clinical classifications, the dependencies have become less clear to the individual.  The powerful statistical tools now available are not dependent on distributional assumptions, and allow classification and prediction in a way that cannot be achieved by the individual clinician intuitively. Contemporary statistical modeling has a primary goal of finding an underlying structure in studied data sets.
In the diagnosis of anemia, the variables MCV, MCHC, and MCH classify the disease process into microcytic, normocytic, and macrocytic categories.  Further consideration of

  • the proliferation of marrow precursors,
  • the domination of a cell line, and
  • features of suppression of hematopoiesis

provides a two-dimensional model.  Several other possible dimensions are created by consideration of

  • the maturity of the circulating cells.
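This first classification step can be sketched directly from the standard red-cell index formulas and the conventional MCV cutoffs of roughly 80 and 100 fL (cutoffs vary somewhat by laboratory):

```python
def rbc_indices(hgb_g_dl, hct_pct, rbc_millions_per_ul):
    """Standard red-cell indices computed from the hemogram."""
    mcv = hct_pct / rbc_millions_per_ul * 10    # mean corpuscular volume, fL
    mch = hgb_g_dl / rbc_millions_per_ul * 10   # mean corpuscular hemoglobin, pg
    mchc = hgb_g_dl / hct_pct * 100             # MCH concentration, g/dL
    return mcv, mch, mchc

def classify_anemia(mcv, low=80.0, high=100.0):
    """Classic first-pass classification by MCV (conventional cutoffs)."""
    if mcv < low:
        return "microcytic"
    if mcv > high:
        return "macrocytic"
    return "normocytic"

mcv, mch, mchc = rbc_indices(hgb_g_dl=10.0, hct_pct=32.0, rbc_millions_per_ul=4.5)
print(round(mcv, 1), classify_anemia(mcv))  # 71.1 microcytic
```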
The development of an evidence-based inference engine that can substantially interpret the data at hand and convert it in real time to a “knowledge-based opinion” may improve clinical problem solving by incorporating multiple complex clinical features as well as duration of onset into the model.
An example of a difficult area for clinical problem solving is found in the diagnosis of SIRS and associated sepsis.  SIRS (and associated sepsis) is a costly diagnosis in hospitalized patients.   Failure to diagnose sepsis in a timely manner creates a potential financial and safety hazard.  The early diagnosis of SIRS/sepsis is made by the application of defined criteria (temperature, heart rate, respiratory rate and WBC count) by the clinician.   The application of those clinical criteria, however, defines the condition after it has developed and has not provided a reliable method for the early diagnosis of SIRS.  The early diagnosis of SIRS may possibly be enhanced by the measurement of proteomic biomarkers, including transthyretin, C-reactive protein and procalcitonin.  Immature granulocyte (IG) measurement has been proposed as a more readily available indicator of the presence of
  • granulocyte precursors (left shift).
The use of such markers, obtained by automated systems in conjunction with innovative statistical modeling, may provide a mechanism to enhance workflow and decision making.
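For reference, the defined SIRS criteria mentioned above can be encoded as a simple rule count; the thresholds below follow the standard consensus definition (two or more criteria suggest SIRS), and the example values are hypothetical:

```python
def sirs_criteria(temp_c, hr, rr, wbc_k, immature_pct=0.0):
    """Count the classic SIRS criteria met.
    wbc_k is the white cell count in thousands per microliter;
    immature_pct is the percentage of immature forms (bands)."""
    met = 0
    if temp_c > 38.0 or temp_c < 36.0:          # temperature
        met += 1
    if hr > 90:                                  # heart rate
        met += 1
    if rr > 20:                                  # respiratory rate
        met += 1
    if wbc_k > 12.0 or wbc_k < 4.0 or immature_pct > 10.0:  # WBC / left shift
        met += 1
    return met

n = sirs_criteria(temp_c=38.6, hr=104, rr=24, wbc_k=15.2, immature_pct=12.0)
print(n, "criteria met -> SIRS" if n >= 2 else "criteria met")
```

The point of the immature-granulocyte parameter is exactly the one made above: an automated IG measurement can feed the left-shift criterion without a manual differential, supporting earlier flagging.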
An accurate classification based on the multiplicity of available data can be provided by an innovative system that utilizes  the conjoined syndromic features of disparate data elements.  Such a system has the potential to facilitate both the workflow and the decision-making process with an anticipated reduction of medical error.
This study is only an extension of our approach to repairing a longstanding problem in the construction of the many-sided electronic medical record (EMR).  On the one hand, past history combined with the development of Diagnosis Related Groups (DRGs) in the 1980s have driven the technology development in the direction of “billing capture”, which has been a focus of epidemiological studies in health services research using data mining.

In a classic study carried out at Bell Laboratories, Didner found that information technologies reflect the view of the creators, not the users, and Front-to-Back Design (R Didner) is needed.  He expresses the view:

“Pre-printed forms are much more amenable to computer-based storage and processing, and would improve the efficiency with which the insurance carriers process this information.  However, pre-printed forms can have a rather severe downside. By providing pre-printed forms that a physician completes to record

  • the diagnostic questions asked,
  • as well as tests, and results,

the sequence of tests and questions might be altered from that which a physician would ordinarily follow.  This sequence change could improve outcomes in rare cases, but it is more likely to worsen outcomes.”

Decision Making in the Clinical Setting.   Robert S. Didner

A well-documented problem in the medical profession is the level of effort dedicated to administration and paperwork necessitated by health insurers, HMOs and other parties (ref).  This effort is currently estimated at 50% of a typical physician’s practice activity.  Obviously this contributes to the high cost of medical care.  A key element in the cost/effort composition is the retranscription of clinical data after the point at which it is collected.  Costs would be reduced, and accuracy improved, if the clinical data could be captured directly at the point it is generated, in a form suitable for transmission to insurers, or machine-transformable into other formats.  Such data capture could also be used to improve the form and structure of how this information is viewed by physicians, and could form the basis of a more comprehensive database linking clinical protocols to outcomes, improving knowledge of this relationship and hence clinical outcomes.
How we frame our expectations is so important that it determines the data we collect to examine the process.  In the absence of data to support an assumed benefit, there is no proof of validity, at whatever cost.   This has meaning for
  • hospital operations,
  • nonhospital laboratory operations,
  • companies in the diagnostic business, and
  • planning of health systems.
In 1983, a vision for creating the EMR was introduced by Lawrence Weed and others, as described by McGowan and Winstead-Fry.
McGowan JJ, Winstead-Fry P. Problem Knowledge Couplers: reengineering evidence-based medicine through interdisciplinary development, decision support, and research. Bull Med Libr Assoc. 1999 Oct;87(4):462–470. PMCID: PMC226622


Example of Markov Decision Process (MDP) transition automaton (Photo credit: Wikipedia)

Control loop of a Markov Decision Process (Photo credit: Wikipedia)


English: IBM’s Watson computer, Yorktown Heights, NY (Photo credit: Wikipedia)

English: Increasing decision stakes and systems uncertainties entail new problem solving strategies. Image based on a diagram by Funtowicz, S. and Ravetz, J. (1993) “Science for the post-normal age” Futures 25:735–55. (Photo credit: Wikipedia)



Read Full Post »

Larry H Bernstein, MD, FCAP,  Reporter




Lipid Profile Predicts Metastasis in Breast Cancer


Posted on October 24, 2012 by admin


Researchers at the Bellvitge Biomedical Research Institute (IDIBELL) and The Institute of Photonic Sciences (ICFO) have collaborated on the development of a diagnostic tool that identifies the metastatic ability of breast cancer cells. The analysis is based on the characterization of the lipid component of the cells, which is indicative of malignancy. This has allowed the researchers to develop a classifier to discriminate cells capable of inducing metastasis. The results of the study have been published in the online version of the scientific journal PLoS ONE.

The characterization of the lipids associated with malignancy was made possible by advances in Raman spectroscopy instrumentation, along with the versatility offered by experimental models of breast cancer. The results of this process form the basis for introducing the technique into routine cytological diagnosis, and it could be extended in the future to the diagnosis of other tumors.


Lipids (Photo credit: AJC1)


English: Breast cancer incidence by age in women in the United Kingdom 2006-2008. Reference: Excel chart for Figure 1.1: Breast Cancer (C50), Average Number of New Cases per Year and Age-Specific Incidence Rates, UK, 2006-2008 at Breast cancer – UK incidence statistics at Cancer Research UK. Section updated 18/07/11. (Photo credit: Wikipedia)

The researchers analyzed the principal components of the spectra, including in part the less discriminating ones, to assess the lipid composition profile of breast cancer cells. They generated a classification model that segregated metastatic from non-metastatic cells. “The algorithm for the discrimination of the metastatic ability is a first step toward the stratification of breast cancer cells using this quick and reactive tool,” explains the study coordinator, Àngels Sierra, researcher in the Biological Clues of the Invasive and Metastatic Phenotype group at IDIBELL.

Using cytology techniques, the researchers found a correlation between the activation of lipogenesis (the metabolic synthesis of fatty acids) and the amount of saturated fats in metastatic cells, indicating a worse prognosis and decreased survival. The lipid content of breast cancer cells might thus be a useful measure of various functions coupled to the progression of breast cancer. The work was supported by the Instituto de Salud Carlos III, the former Spanish Ministry of Science and Innovation and the private Cellex Barcelona Foundation.
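The general shape of such a spectral classifier, i.e. projecting spectra onto principal components and then separating the two classes, can be sketched with synthetic data. This is an illustration of the technique only, assuming a single lipid band whose intensity differs between classes; it does not reproduce the published model, its features, or its data.

```python
import numpy as np

# Synthetic "spectra" stand in for Raman measurements (hypothetical data).
rng = np.random.default_rng(0)

def make_spectra(n, lipid_peak):
    """n noisy 100-channel spectra with a band at channels 40-44
    whose intensity tracks lipid content."""
    base = rng.normal(0.0, 0.05, size=(n, 100))
    base[:, 40:45] += lipid_peak
    return base

X_met = make_spectra(30, lipid_peak=1.0)   # "metastatic-like": strong lipid band
X_non = make_spectra(30, lipid_peak=0.2)   # "non-metastatic-like": weak band
X = np.vstack([X_met, X_non])
y = np.array([1] * 30 + [0] * 30)

# PCA via SVD on mean-centered data; keep the first 2 components.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Nearest-centroid classification in the reduced PC space.
c1 = scores[y == 1].mean(axis=0)
c0 = scores[y == 0].mean(axis=0)
pred = (np.linalg.norm(scores - c1, axis=1) <
        np.linalg.norm(scores - c0, axis=1)).astype(int)
accuracy = (pred == y).mean()
```

With a clear spectral difference between the classes, the two groups separate cleanly in PC space; the real difficulty in practice lies in spectra where the discriminating bands are subtle.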




Read Full Post »

The Incentive for “Imaging-based cancer patients’ management”

Author and Curator: Dror Nir, PhD

It is generally agreed by radiologists and oncologists that in order to provide a comprehensive work-flow that complies with the principles of personalized medicine, future cancer patients’ management will heavily rely on “smart imaging” applications. These could be accompanied by highly sensitive and specific bio-markers, which are expected to be delivered by pharmaceutical companies in the upcoming decade. In the context of this post, smart imaging refers to imaging systems that are enhanced with tissue characterization and computerized image interpretation applications. It is expected that such systems will enable gathering of comprehensive clinical information on cancer tumors, such as location, size and rate of growth.

What is the main incentive for promoting cancer patients’ management based on smart imaging? 

It promises to enable personalized cancer patient management by providing the medical practitioner with a non-invasive and non-destructive tool to detect, stage and follow up cancer tumors in a standardized and reproducible manner. Furthermore, applying smart imaging that provides valuable disease-related information throughout a cancer patient’s management pathway should eventually reduce the growing burden of health-care costs related to cancer treatment.
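One concrete way imaging output (tumor size over time) becomes a standardized, reproducible follow-up metric is the volume doubling time. The sketch below shows the standard exponential-growth calculation; the lesion volumes used are hypothetical.

```python
import math

def doubling_time_days(v0: float, v1: float, dt_days: float) -> float:
    """Volume doubling time Td under the usual exponential-growth
    assumption v1 = v0 * 2**(t / Td), so Td = t * ln(2) / ln(v1 / v0)."""
    return dt_days * math.log(2) / math.log(v1 / v0)

# Hypothetical example: a lesion grows from 1.0 cm^3 to 1.5 cm^3 in 90 days.
td = doubling_time_days(1.0, 1.5, 90)   # roughly 154 days
```

Because the metric depends only on measured volumes and dates, two readers measuring the same lesions should arrive at the same number, which is the reproducibility the post argues for.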

Let’s briefly review the segments that are common to all cancer patients’ pathway: screening, treatment and costs.


Screening for cancer: It is well known that one of the important factors in the success of cancer treatment is the stage of the disease, which often depends on when the patient is diagnosed. In order to detect cancer as early as possible, i.e. before any symptoms appear, leaders in cancer patients’ management came up with the idea of screening. To date, two screening programs are the most discussed: the “officially approved and budgeted” breast cancer screening, and the unofficial, but still extremely costly, prostate cancer screening. After 20 years of practice, both are causing serious controversy:

In a trend analysis of the WHO mortality database [1], the authors, Autier P, Boniol M, Gavin A and Vatten LJ, argue that breast cancer mortality is the same in neighboring European countries with different levels of screening but similar access to treatment: “The contrast between the time differences in implementation of mammography screening and the similarity in reductions in mortality between the country pairs suggest that screening did not play a direct part in the reductions in breast cancer mortality”.

In a report on prostate cancer mortality at 11 years of follow-up [2], the authors, Schröder FH et al., address prostate cancer overdiagnosis and overtreatment: “To prevent one death from prostate cancer at 11 years of follow-up, 1055 men would need to be invited for screening and 37 cancers would need to be detected”.
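The arithmetic behind those quoted trial figures is worth making explicit: the number needed to invite is the reciprocal of the absolute risk reduction, so 1055 invitations per death prevented implies a very small absolute mortality benefit.

```python
# Arithmetic implied by the quoted trial figures (no new data introduced).
number_needed_to_invite = 1055
cancers_detected_per_death_prevented = 37   # the overdiagnosis burden quoted above

# Absolute risk reduction (ARR) is the reciprocal of the number needed to invite.
arr = 1 / number_needed_to_invite            # about 0.00095, i.e. ~0.095%
deaths_prevented_per_100k = 100_000 * arr    # about 95 per 100,000 men invited
```

Seen this way, the controversy is the trade-off between roughly 95 deaths prevented per 100,000 men invited and the 37 detected (and often treated) cancers per death prevented.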

The lobbying campaign that AdmeTech is conducting to raise the US administration’s awareness and obtain funding to improve prostate cancer treatment is a tribute to patients’ and practitioners’ frustration.




Treatment: The current state of the art in oncology is characterized by a shift in the decision-making process from an evidence-based-guidelines approach toward personalized medicine. Information gathered from large clinical trials regarding individual biological cancer characteristics is leading to a more comprehensive understanding of cancer.

Quoting from the National Cancer Institute website: “Advances accrued over the past decade of cancer research have fundamentally changed the conversations that Americans can have about cancer. Although many still think of a single disease affecting different parts of the body, research tells us through new tools and technologies, massive computing power, and new insights from other fields that cancer is, in fact, a collection of many diseases whose ultimate number, causes, and treatment represent a challenging biomedical puzzle. Yet cancer’s complexity also provides a range of opportunities to confront its many incarnations”.

Personalized medicine, whether it uses cytostatics, hormones, growth inhibitors, monoclonal antibodies, or loco-regional medical devices, proves more efficient, less toxic and less expensive, and creates new opportunities for cancer patients and health-care providers, including the medical industry.

To date, at least 50 types of systemic oncological treatments can be offered with much more quality and efficiency through patient selection and treatment outcome prediction.

Figure taken from presentation given by Prof. Jaak Janssens at the INTERVENTIONAL ONCOLOGY SOCIETY meeting held in Brussels in October 2011

For oncologists, recent technological developments in medical imaging-guided tissue acquisition technology (biopsy) create opportunities to provide representative fresh biological materials in a large enough quantity for all kinds of diagnostic tests.


Health-care economics: We are living in an era in which life expectancy is increasing while national treasuries are beyond their limits in supporting health-care costs. In the USA, of the nation’s 10 most expensive medical conditions, cancer has the highest cost per person. The total cost of treating cancer in the U.S. rose from about $95.5 billion in 2000 to $124.6 billion in 2010, the National Cancer Institute estimates. The true sum is probably higher, as this estimate is based on average costs from 2001-2006, before many expensive treatments came out; quoting: “new drugs often cost $100,000 or more a year. Patients are being put on them sooner in the course of their illness and for a longer time, sometimes for the rest of their lives.”
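As a quick check on the scale of those figures, the rise from $95.5 billion in 2000 to $124.6 billion in 2010 corresponds to a modest compound annual growth rate; the worked arithmetic is below (only the two figures quoted above are used).

```python
# Compound annual growth rate implied by the NCI cost figures quoted above.
cost_2000 = 95.5e9    # USD, year 2000
cost_2010 = 124.6e9   # USD, year 2010
years = 10

cagr = (cost_2010 / cost_2000) ** (1 / years) - 1   # roughly 2.7% per year
```

The post's point stands regardless: even a sub-3% annual growth rate compounds into tens of billions of dollars over a decade, and the estimate predates the most expensive modern drugs.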

With such high costs at stake, solutions to reduce the overall cost of cancer patients’ management should be considered. My experience is that introducing smart imaging applications into routine use could contribute to significant savings in the overall cost of cancer patients’ management, by enabling personalized treatment choice and timely monitoring of tumors’ response to treatment.



  1. BMJ. 2011 Jul 28;343:d4411. doi: 10.1136/bmj.d4411
  2. N Engl J Med. 2012 Mar 15;366(11):981-90

Read Full Post »

Older Posts »