

Cell Therapy Market to Grow Beyond Oncology As Big Pharma Expands Investments

Reporter: Irina Robu, PhD

Collaborations between Big Pharma and small- to mid-segment companies are currently focusing R&D on precision medicine. The market was valued at $2.70 billion in 2017 and is expected to reach $8.21 billion by 2025. A varied therapeutic focus and the implementation of advanced manufacturing technologies, such as single-use bioreactors, will pave the way for unique cell–gene and stem cell–gene combination therapies.
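Taken at face value, the quoted market figures imply a compound annual growth rate of roughly 15 percent; a quick sketch of the derivation:

```python
# Implied compound annual growth rate (CAGR) for the cell therapy market,
# using the figures quoted above: $2.70B in 2017 to a projected $8.21B in 2025.
start_value = 2.70   # market size in 2017, $ billions
end_value = 8.21     # projected market size in 2025, $ billions
years = 2025 - 2017  # 8-year horizon

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 15% per year
```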
Novartis and Gilead are the first companies to adopt pay-for-performance business models for their CAR-T cell therapies. In addition to innovative pricing models, pharma companies are also showing a preference for risk-sharing and fast-to-market models in order to support the development of novel therapies. Moreover, developments in cell culturing techniques, alongside the use of different stem cells such as adipose-derived stem cells, mesenchymal stem cells, and induced pluripotent stem cells, will reinforce the market with superior treatment options for non-oncological conditions such as neurological, musculoskeletal, and dermatological conditions.

With high demand for cell therapies, numerous growth opportunities are emerging, such as:

  • With more than 959 ongoing regenerative medicine clinical trials, the market finds opportunity across both stem cell and non-stem cell-based therapies.
  • Curative combination therapies, which find application in identifying the right patient as well as predicting the immune response in cancer patients.
  • Implementation of IT solutions and single-use manufacturing techniques for optimizing small-volume, high-value manufacturing of novel cell therapies, radically reducing time to market.
  • Emerging business models, which help market players focus on academic and research collaborations, together with industry collaborations, to support therapeutic and technological innovations.

Source

https://www.newswire.ca/news-releases/cell-therapy-market-to-grow-beyond-oncology-as-big-pharma-expands-investments-826628110.html

 



Allergan, Pfizer Deal Goes Through with Allergan Bigger Than Pfizer: But at What Cost to R&D?

Curator: Stephen J. Williams, Ph.D.

Just recently this site had a post entitled Pfizer Near Allergan Buyout Deal But Will Fed Allow It? 

Now, as Bloomberg reports, the international deal between Allergan and Pfizer has gone through, resulting in a tax inversion and nary a discouraging word from the US Federal Government (their blessing for future tax inversions?). And as a Bloomberg Go guest speculated, it may finally spark Congress to do something about it, or perhaps not. For details see the Bloomberg transcript below:

 

Pfizer Inc. and Allergan Plc agreed to combine in a record $160 billion deal, creating a drugmaking behemoth called Pfizer Plc with products from Viagra to Botox and a low-cost tax base.

QuickTake Tax Inversion

Pfizer will exchange 11.3 shares for each Allergan share, valuing the smaller drugmaker at $363.63 a share, according to a statement Monday. That’s a premium of about 27 percent above Allergan’s stock price on Oct. 28, before news of the companies’ discussions became public. Pfizer investors will be able to opt for cash instead of stock in the combined company in exchange for their shares, with as much as $12 billion to be paid out.
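As a quick sanity check on the terms quoted above (a back-of-envelope sketch derived from the reported numbers, not from the statement itself):

```python
# Sanity-check the deal terms quoted above: 11.3 Pfizer shares per Allergan
# share, valuing Allergan at $363.63 a share at a ~27% premium.
exchange_ratio = 11.3
allergan_value = 363.63          # implied value per Allergan share, $

implied_pfizer_price = allergan_value / exchange_ratio
pre_news_price = allergan_value / 1.27   # back out the Oct. 28 reference price

print(f"Implied Pfizer share price: ${implied_pfizer_price:.2f}")         # ~ $32.18
print(f"Implied pre-announcement Allergan price: ${pre_news_price:.2f}")  # ~ $286.32
```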

The transaction is structured so that Dublin-based Allergan is technically buying its much larger partner, a move that makes it easier for the company to locate its tax address in Ireland, though the drugmaker’s operational headquarters will be in New York. Pfizer Chief Executive Officer Ian Read will be chairman and CEO of the new company, with Allergan CEO Brent Saunders as president and chief operating officer, overseeing sales, manufacturing and strategy.

The deal will begin adding to Pfizer’s adjusted earnings starting in 2018 and will boost profit by 10 percent the following year, the companies said. Pfizer’s 11 board members will join four from Allergan, including Saunders and Executive Chairman Paul Bisaro.

Pfizer dropped 2.1 percent to $31.51 at 9:34 a.m. in New York, while Allergan fell 2 percent to $306.17. The combined company will trade on the New York Stock Exchange. Pfizer said it will start a $5 billion accelerated share buyback program in the first half of 2016. The deal is expected to be completed by the end of next year.

Unprecedented Deal

Pfizer, based in New York, makes medications including Viagra, pain drug Lyrica and the Prevnar pneumococcal vaccine, and Allergan produces Botox and the Alzheimer’s drug Namenda. Together, barring any divestitures, the companies will be the biggest pharmaceutical company by annual sales, with about $60 billion.

The deal will be unprecedented on many levels. It’s the largest acquisition so far this year. It’s the largest ever in the pharmaceutical world, eclipsing Pfizer’s purchase of Warner-Lambert Co. in 2000 for $116 billion. And if the new company is able to establish itself abroad for a lower tax rate, a controversial process called an inversion, it will be the largest such move in history.

The U.S. Treasury Department has increasingly targeted such strategies, most recently announcing new guidance on how it will value assets owned by U.S. companies that undertake inversions. The U.S. has the highest tax rate for businesses in the world, at 35 percent, and is one of the only countries to tax corporate profits wherever they are earned. Previous moves by the U.S. Treasury have derailed other proposed inversions, including AbbVie Inc.’s plan to buy Ireland’s Shire Plc for an estimated $52 billion. Pfizer and Allergan’s deal appears structured to avoid the tax inversion rules.

Read has already reached out to lawmakers in both houses of Congress, including Senate Majority Leader Mitch McConnell, and is calling the White House Monday, according to a person with knowledge of the matter. His pitch is that the deal will help the companies invest in more innovative drugs and that Pfizer Plc would have 40,000 U.S. employees at the close of the transaction.

Facilitate Split

An agreement may also facilitate the widely discussed potential for Pfizer to reconfigure itself by splitting the newly enlarged company into two: one focused on new drug development, the other on selling older medications. Pfizer said Monday it will decide on a potential separation by the end of 2018. Pfizer earlier this year bought Hospira Inc., the maker of generic drugs often administered in hospitals, in a transaction valued at about $17 billion. The deal bolstered Pfizer’s established-drugs business, which combines strong cash flow and slow growth.

Allergan itself has been recently transformed, created through an acquisition by Actavis Plc that kept the Allergan name. The company agreed to sell its generics business to Israel’s Teva Pharmaceutical Industries Ltd. for about $40.5 billion and has been on a buying binge of its own. It now has more than 70 compounds in mid- to late-stage development.

But What About Pfizer R&D? Will That Be Put on the Back Burner?

A little while ago this site posted a talk given by Pfizer on their foray into personalized medicine in

11/19/2015 8 a.m. Building a Personalized Medicine Company & Keynote: President, Worldwide R&D, Pfizer Inc. 11th Annual Personalized Medicine Conference, November 18-19, 2015, Harvard Medical School

Here Pfizer had emphasized its commitment to discoveries in the personalized medicine area; however, the emphasis on “worldwide” may have been a hint of what was to come.

Just a few days ago Allergan’s CEO wrote a guest post in Forbes (edited by Matthew Herper):

Allergan CEO Brent Saunders: Here’s What I Really Think About R&D

There has been a lot of discussion about my views about pharmaceutical research and development. Let me cut to the chase. I’m pro-R&D, but I don’t believe that any single company can corner the market on innovation in even one therapeutic area. It doesn’t mean they shouldn’t do basic research where they have special insights, but even then they need to be open to the ideas of others. Innovation in healthcare is more important than ever. Other companies have had success with different models based on different capabilities, and we applaud every new drug approval. Here at Allergan, we’ve adopted a strategy we call “Open Science.” It is based on a simple concept: Sometimes great ideas come from places where they are least expected.

Allergan’s CEO goes on to stress innovation centers around academic hubs such as Boston and an emphasis on Alzheimer’s research and development, but is this just shop talk or is there an agenda and strategy here?

It is known that Allergan has not felt that building big labs to support an R&D strategy was in its best interests, but Derek Lowe’s Science blog In the Pipeline shows the change in sentiment about R&D: Allergan is in fact pro-R&D; it just doesn’t feel it is in its best interests to do it “in house.” (See Come to Think of It, Brent Saunders Likes R&D, Too! and the comments.)

And check out CEO Saunders’s Twitter feed, which gives some insight into his feelings on in-house R&D.


This is all very interesting and might mean, given the size of this deal and the fact that Allergan owns 40% of Pfizer, a massive sea change in the way big pharma conducts R&D, possibly focusing on smaller, “open-sourced” players.

Our Open Science approach allows us to strategically invest in innovation and be more nimble so that we can increase our R&D efficiency. It has led to a robust pipeline of experimental medicines. We currently have 70 mid- to late-stage programs in the pipeline, and since 2009, we have successfully brought 13 new drugs and devices to the market.

It also allows us to invest in areas that other companies have abandoned, like central nervous system (CNS) treatments. In CNS, clinical development costs are higher, and market approval probability is lower. But treating these disorders can bring hope to patients of all ages. According to the Centers for Disease Control & Prevention, one in 68 children has autism spectrum disorder. Alzheimer’s affects one in three people over the age of 85, based on data from the Chicago Health and Aging Project. Yet despite the 634 current open clinical trials for these diseases, there are no approved medicines for autism’s three core characteristics, nor drugs that treat Alzheimer’s underlying disease or delay its progression.

Other related articles published in this Open access Online Scientific Journal include the following:

On Allergan

https://pharmaceuticalintelligence.com/?s=Allergan

On Pfizer

https://pharmaceuticalintelligence.com/?s=Pfizer



Larry H Bernstein, MD, FCAP, Reporter

http://pharmaceuticalintelligence.org/2013-11-17/larryhbern/cancer-biomarkers

Clinical Laboratory News Nov 2013;  39( 11)

  The Vicious Cycle of Under-Valued Cancer Biomarkers

 Could Sweeping Changes Bring More Tests Into Clinical Practice?

By Genna Rollins

It’s a classic conundrum: biomarkers are essential to

  • diagnosing,
  • staging,
  • treating, and
  • monitoring cancer,

yet despite an explosion of research, only a trickle have made it into clinical practice. The factors behind this less-than-desirable circumstance are complex at best, but according to some observers, boil down to the healthcare system’s placing

  • more value on cancer therapeutics relative to biomarkers.

Without a better means of demonstrating the difference biomarkers make in clinical outcomes or management, they remain stuck in a loop of low value:

  1. inadequate funding for research that, in turn,
  2. limits the evidence for clinical utility,
  3. keeping their value low.

A panel of leading scholars, clinicians, and executives recently collaborated on this dilemma in the hope of starting a national dialogue toward breaking what they call a vicious cycle. Their proposed solutions—as ambitious as the problem is convoluted—can be implemented if the industry has the will to do so, they contend.

“People don’t value tumor biomarkers. They value therapeutics. We all talk about personalized medicine but we don’t really mean it if we’re not willing to value biomarkers the way we value therapeutics,” said Daniel Hayes, MD, Stuart B. Padnos professor of breast cancer research and clinical director of the breast oncology program at the University of Michigan Comprehensive Cancer Center in Ann Arbor. “However,

  • if we insist on doing biomarker research the same way we do therapeutic research, it gets very expensive.

But without putting the kind of money and research into it that’s necessary to determine clinical utility, payers don’t want to pay as much for a diagnostic as a therapeutic, and we’re stuck in this vicious cycle.”

Hayes has been on the forefront of thinking about how to bring more clinically meaningful biomarkers into cancer care, and he was the lead author of the group’s commentary, which was published

  • in the July 31, 2013 issue of Science Translational Medicine (Sci Transl Med 2013;196:1–7).

 A Bad Track Record

Whether or not they concur with all elements of Hayes’ and his colleagues’ vicious cycle concept, many in the cancer field agree that biomarkers have an anemic batting average. The scientific literature is replete with reports of promising cancer biomarkers, but

  • fewer than two dozen protein tumor markers have been cleared by the Food and Drug Administration (FDA)—
  • only about half in the past decade—and not all have been embraced by clinicians.
  • The same is true of lab-developed tests (LDTs).

While some paradigm-shifting cancer biomarkers with obvious clinical utility have been implemented rapidly in practice, others—be they FDA-cleared or LDTs—never really have found a niche. An example of the latter is

  • UGT1A1 testing prior to starting irinotecan hydrochloride chemotherapy in colorectal cancer patients.

Individuals who are homozygous for the UGT1A1*28 allele are at increased risk for neutropenia when taking irinotecan, but this testing just hasn’t caught on, according to Hayes’ co-author, Richard L. Schilsky, MD, chief medical officer of the American Society of Clinical Oncology (ASCO). “Practically no oncologist orders the test because

  • it’s not clear what you’re supposed to do if you get a result back that shows your patient is high-risk.

There’s not a specific recommendation about whether you’re supposed to

  1. omit the drug,
  2. lower the dose,
  3. lower the dose by how much, and
  4. whether lowering the dose actually mitigates the side-effects and
  5. whether or not it actually reduces the effectiveness of the treatment,” he explained.

On the other end of the spectrum is KRAS genetic testing in metastatic colon cancer. After seminal research published in 2008 showed that

  • patients without this mutation were more likely to respond to anti-epidermal growth factor receptor (anti-EGFR) monoclonal antibody therapy,

the test, even as an LDT, quickly became the de facto standard-of-care.

“A wealth of data came out all at the same time, and almost overnight

  • oncologists started ordering KRAS testing and
  • stopped prescribing the relevant drugs for patients whose tumors had KRAS mutations,” recalled Schilsky.

“That was a clear example of where if the tumor has a mutation,

  1. the drug doesn’t work, and
  2. you shouldn’t give it.

That’s the kind of discrete information that oncologists are always looking for.”

The KRAS test also hit the medical economics jackpot:

  • one anti-EGFR agent, cetuximab, costs anywhere from $110,000–160,000 per year,
  • making an easy argument for reserving this treatment for patients most likely to benefit from it.

At the same time, the cost of KRAS testing, about $400, speaks to Hayes’ and his colleagues’ arguments about the vicious cycle. If the test has that much clinical impact, shouldn’t it be valued higher in the medical system?
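The cost argument sketched above can be made concrete with a rough back-of-envelope calculation; note that the ~40% KRAS mutation prevalence used below is an illustrative assumption, not a figure from the article:

```python
# Back-of-envelope economics of KRAS testing before anti-EGFR therapy.
# Drug cost ($110k-160k/yr) and test cost ($400) are from the article;
# the ~40% KRAS mutation prevalence is an outside assumption for illustration.
test_cost = 400            # KRAS test, $ per patient
drug_cost_low = 110_000    # cetuximab, $ per patient-year (low end)
mutation_prevalence = 0.40 # assumed fraction of patients unlikely to benefit

# Per 100 patients tested: testing cost vs. ineffective drug spending avoided
patients = 100
testing_total = patients * test_cost
avoided_drug_spend = patients * mutation_prevalence * drug_cost_low

print(f"Testing 100 patients costs ${testing_total:,}")               # $40,000
print(f"Avoided ineffective drug spend: ${avoided_drug_spend:,.0f}")  # $4,400,000
```

Under these assumptions, a few hundred dollars of testing per patient averts millions in ineffective therapy, which is exactly the mismatch between the test's clinical impact and its reimbursement that the authors highlight.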

  Is Drug Development a Model?

“The pharmaceutical industry

  • potentially could be a model for how to do biomarker validation and evidence generation.

But then we’d have to charge a whole lot more for the tests, which everybody sees as commodities right now,” said Hayes’ co-author Debra Leonard, MD, PhD, professor and chair of pathology at the University of Vermont College of Medicine in Burlington.

  1. “Payers pay the cost of doing the test, but they
  2. aren’t calculating in the cost of all that evidence generation.

That’s why drug companies can charge so much, because all the cost of generating evidence is built into the price of a drug when it comes onto the market. There’s also good evidence to say that it does or doesn’t work, and

  • that’s not the case for tests.”

To Leonard’s point about evidence, the authors cited one cause of the vicious cycle as how FDA regulates approvals for diagnostic tests (See Figure, below). The agency by statute does not have authority to require that

  • proposed tests show clinical utility by improving clinical outcomes,

but that is exactly what the authors would like to see. “In the current regulatory environment, many tumor-biomarker tests

  • enter the market with analytical and clinical validity but insufficient information to establish their impact on healthcare outcomes.
  • Thus, few of these tests are included in evidence-based guidelines,

leaving healthcare professionals or third-party payers unsure of whether and how to use the tests or how much to pay for them,” they wrote.

 Vicious to Virtuous

A. Vicious Cycle

B. Virtuous Cycle

(A) The vicious cycle of tumor-biomarker research and clinical utility.

(B) A proposed virtuous cycle of tumor-biomarker research and clinical utility based on proposals herein.

Used with permission of Science Translational Medicine

  Shaking Up the FDA

The authors proposed several solutions to this challenge, some more audacious than others. On the bold side, they suggested that

  • FDA reorganize how it reviews oncology products, consolidating now separate drug and diagnostic reviews into a single oncologic product line managed jointly by the respective drug and biomarker divisions;
    • in the case of the former, the Office of Hematology and Oncology Products in the Center for Drug Evaluation and Research, and
    • in the latter, the Office of In Vitro Diagnostics in the Center for Devices and Radiological Health
  • In the same vein,
    • the authors proposed that FDA approve or clear tests only with rigorous evidence of both clinical utility and analytical validity, using ASCO level 1 evidence criteria.

If those recommendations might be long-term goals, possibly even requiring Congressional approval, others seem more approachable, but perhaps no less controversial. One is that

  • FDA begins regulating LDTs.

This contentious topic, under review for more than 3 years at FDA, would, the authors suggest, subject all proposed tests to a risk-based review process, regardless of the manufacturer or commercialization strategy behind them. However, not even all the authors back the idea.

“I don’t believe we can do away with LDTs,” said Leonard. “If the FDA had a better mechanism for looking at LDTs in their risk-based system, that might be helpful. But I worry about everything having to go through FDA because of the slowness of the FDA process and the expense of using the FDA process. There also doesn’t seem to be any idea of how we’re going to get from where we are today to an FDA approval process that actually works.”

Leonard also expressed skepticism that the FDA approval process inherently produces better or more clinically useful tests than do LDTs. “The FDA process does not look at

  1. clinical utility, and there is no evidence that
  2. FDA-cleared or -approved tests do any better when
    • they get into clinical practice than ones that haven’t gone through the FDA process and are LDTs,” she said,

citing the Health and Human Services Secretary’s Advisory Committee on Genetics, Health, and Society, which, in its 2008 report on the U.S. system of overseeing genetic testing found

  • a “paucity” of information about the clinical utility of genetic testing.

Other researchers who have thought about how to speed up the biomarker pipeline also find this recommendation troubling. “Basically eliminating LDTs, especially if this were not accompanied by a prior increase in reimbursement and research dollars, would be extremely negative,” said Leigh Anderson, PhD, CEO of Washington, D.C.-based SISCAPA Assay Technologies. “I’m involved in collaborative work with a number of groups trying to develop new tests mainly in the cancer area and I don’t think any of them would be in the position

  • to think seriously about going forward with those if the LDT route didn’t exist.

They would have to raise hundreds of millions of additional dollars to take that approach. It’s not trivial to develop an LDT, but to say

  • a proposed assay has to be treated as an FDA-cleared in vitro diagnostic [IVD] represents a significant additional barrier.”

  The Benefits of Biospecimen Banks

Although Anderson wasn’t on board with the authors’ proposal about LDTs, he lauded their recommendation that

  • all drug registration trials maintain a biospecimen bank,
  • funded by the sponsoring drug company,
  • so that subsequent researchers could access the samples for prospective-retrospective studies.

In fact, writing in a Clinical Chemistry opinion piece along with the journal’s editor-in-chief Nader Rifai, PhD, he recommended that

  • the National Institutes of Health develop a list of key clinical diagnostic questions
  • prioritized by disease impact and linked to studies or medical centers with corresponding biospecimens.

This “would allow

  • a much more informed and productive application of the existing biomarker resources and
  • would provide a much-needed basis for arguing for the enormous potential health-economic value of successful new tests,” they wrote (Clin Chem 2013;59:194–7).

Along with the need to provide higher levels of evidence for candidate cancer biomarkers, the authors called for a significant

  • ramp-up in biomarker research investment and higher reimbursement for tests that demonstrate clinical utility.

In addition, they recommended that

  • scientific journals adhere to higher standards in publishing tumor biomarker studies and
  • be as willing to publish biomarker studies that have negative results as those with positive findings.

Finally, they proposed that guideline bodies follow evidence-based recommendations for tumor biomarker test use.

  Emulating the PET Registry

To up the ante on these sweeping reforms, the authors believe addressing them in concert is the only way to break the vicious cycle. But how can the healthcare industry essentially reinvent a new paradigm for better valuing cancer biomarkers when the elements of doing so are like a gyrating Medusa’s head of knotty, seemingly intractable challenges? The authors agree the problem is too daunting if considered only in its entirety. But they and other experts suggest that several tangible actions could move the field along substantially without too much chaos or pain.

For example, in the area of building evidence that would open the door for better reimbursement for cancer biomarker tests,

  • Schilsky envisions a tissue or blood test equivalent to the National Oncologic PET Registry (NOPR).

This ground-breaking initiative managed by the American College of Radiology (ACR) and the ACR Imaging Network developed evidence for Medicare to reimburse PET scanning with F-18 fluorodeoxyglucose when it wasn’t covered at all. NOPR enabled reimbursement for this testing in cancer patients on the proviso that physicians agreed to enter data in a registry that would enable a fair assessment of the impact PET had on cancer patient management. Started in 2006, NOPR led in 2009 to coverage of PET scans as part of the initial treatment strategy for most solid tumors, coverage that recently

  • was expanded to include payment for up to three PET scans in patients with advanced cancer after their initial treatment, according to Schilsky.

By doing the same thing with selected tumor biomarkers, Schilsky suggested, “we immediately begin to capture information that we’re not currently getting on

  • the prevalence of use of certain tests,
  • the kinds of clinical decisions based on those tests, and
  • the outcomes of the patients who undergo the testing,”

he explained. “Then, for payers, it becomes much less of a Wild West environment. They will have information they can analyze and use to inform their coverage decisions.” Such a system also would differentiate the most clinically useful tests from less relevant ones, enabling payers to shift resources to the winning tests. This, in turn, would incentivize test developers “to put tests out that are likely to perform well,” Schilsky added.

The authors also point to efforts like the National Human Genome Research Institute’s recent decision to fund more than $25 million over 4 years to develop the Clinical Genome Resource for

  • authoritative information on genetic variants relevant to human disease and patient care.

The National Cancer Institute (NCI) also is planning a

  • web-based inventory of all biospecimens collected under its clinical trials cooperative group program, according to Schilsky.

  The Impact of Technology

Anderson believes the authors overlooked the impact technological advances could have on the vicious cycle, by speeding up the process of vetting candidate biomarkers. “An alternative way of doing mass spectrometry [MS]-based protein assays is

  • to analyze for specific targeted proteins in smaller numbers, so you might measure 10 or 100, but accurately and quickly,” he explained.

“Those kinds of directed assay methods which are not looking for everything, but instead for specific things that you hypothesize are important biomarkers, can be run

  • fast enough and cheaply enough that you can run hundreds and hundreds of samples in a practical way.

That then removes the primary technological limitation to getting the validation of biomarkers done.”

Anderson added that this type of MS-based directed protein analysis also could speed up the bench-to-bedside time for biomarkers. “The advantage is that the same method used in biomarker verification studies at the research stage can be

  • implemented at least in the large reference labs as LDTs, where
  • it provides a significant technology improvement over immunoassays,” said Anderson.

“That it can be taken all the way from research to LDT in a capable reference lab takes a lot of delay out of the introduction into clinical practice. That’s because you don’t need to redevelop immunoassays for different platforms. Eventually you might not even need to redevelop it for an IVD platform, once mass spectrometry IVD platforms exist.” He predicted that this approach could shave 5–10 years from the biomarker development process. Anderson’s company, SISCAPA Assay Technologies, provides mass-spec-based specific assays for biomarker proteins.

Researchers also have a responsibility to think about the clinical need they want to address before diving deep into discovery, suggested Ivan Blasutig, PhD, a clinical biochemist and assistant professor at Toronto General Hospital and the University of Toronto. “Many of the biomarkers discovered may have statistically significant results but when it comes to actual clinical use they don’t cut it. That’s one of the biggest issues,” he said.

 Analytics: The Achilles Heel

Blasutig collaborates closely with Eleftherios Diamandis, MD, PhD, who has written extensively about the challenges of bringing proposed cancer biomarkers into clinical practice. He, Diamandis, and others have emphasized how important robust analytics are in the early stages of biomarker discovery. In fact Blasutig and Diamandis recently wrote about how using what turned out to be an unreliable commercial kit for CUZD1 detection set back their research team by 2 years and about $500,000 in their quest to find a new pancreatic cancer biomarker (Clin Chem 2013 doi:10.1373/clinchem.2013.215236).

Blasutig and others encouraged clinical laboratorians to participate actively in biomarker discovery, as they bring a wealth of knowledge about analytical issues in validation that research chemists don’t necessarily share. Laboratorians also are more likely to be aware of resources like the NCI’s Early Detection Research Network, which gives guidance on topics such as completing sample collections and avoiding analytical bias, Blasutig suggested.

If the vicious cycle seems completely unwieldy and unrepairable, each person CLN contacted expressed confidence that even in the face of long odds, it can be changed. Leonard spoke for many: “I’m optimistic that we have to do this for our patients, and for our healthcare delivery system. There are a lot of good people around who are interested in having this conversation. So if we can just get all the parties at the table and see if there are some concrete steps that we can take, that would be a major step in the right direction.”

 Comment by reviewer:

The clinical laboratory has been concentrating on technical accuracy considerably beyond the clinical utility of many observations in clinical medicine.  This is not necessarily appreciated, so when a test is inconsistent with the clinical hypothesis, it may be rejected as error.  Errors may occur, but are rare, except when there is specimen misidentification.  However, we are still focused on a “silver bullet” approach to the use of diagnostic tests.  There is some variability in the expression of cancer cells, so subclusters are to be expected within a major class.  The level of applied mathematics needed to analyze these data has been refined enormously in the last decade, and has to be used on the selected groups of tests referred to, with all due respect, by Leigh Anderson, who has the imagination to pursue the highest accuracy in the large-scale MS analysis that his laboratory has pursued for many years.  This reviewer is interested in the “information content” of a combination of tests, once the accuracy of testing is no longer an issue.  Combining the high throughput and lower cost of processing with vastly better mathematical technology than is customary – on the fly – would be a breakthrough.  That would not be the end of this journey, because there would have to be centers for analysis distributed within a few hours of the treatment centers (or at those sites), so that testing and processing would enable better facilitation of treatment.



The importance of spatially-localized and quantified image interpretation in cancer management

Writer & reporter: Dror Nir, PhD

I became involved in the development of quantified imaging-based tissue characterization more than a decade ago. From the start, it was clear to me that what clinicians need will not be answered by just identifying whether a certain organ harbors cancer. If imaging devices are to play a significant role in future medicine as a complementary source of information to biomarkers and gene sequencing, the minimum value expected of them is accurate directing of biopsy needles and treatment tools to the malignant locations in the organ.  Therefore, in designing the first Prostate-HistoScanning (“PHS”) version, I went to the trouble of characterizing localized volumes of tissue at the level of approximately 0.1 cc. Thanks to that, the imaging-interpretation overlay of PHS localizes suspicious lesions with an accuracy of 5 mm within the prostate gland; Detection, localisation and characterisation of prostate cancer by prostate HistoScanning(™).

I then began a more ambitious research effort aimed at exploring the feasibility of identifying sub-structures within the cancer lesion itself. The preliminary results of this exploration were promising enough to surprise not only the clinicians I was working with but also myself. It seems that, using quality ultrasound, one can find imaging biomarkers that allow differentiation of the internal structures of a cancerous lesion. Unfortunately for everyone involved in this work, including me, this scientific effort was interrupted by financial constraints before reaching maturity.

This short introduction explains why I find the publication below important enough to post and bring to your attention.

I hope for your agreement on the matter.

Quantitative Imaging in Cancer Evolution and Ecology

Robert A. Gatenby, MD, Olya Grove, PhD and Robert J. Gillies, PhD

From the Departments of Radiology and Cancer Imaging and Metabolism, Moffitt Cancer Center, 12902 Magnolia Dr, Tampa, FL 33612. Address correspondence to  R.A.G. (e-mail: Robert.Gatenby@Moffitt.org).

Abstract

Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations recently have been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. 
By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral Darwinian dynamics before and during therapy. Advances in image analysis will place clinical imaging in an increasingly central role in the development of evolution-based patient-specific cancer therapy.

© RSNA, 2013

 

Introduction

Cancers are heterogeneous across a wide range of temporal and spatial scales. Morphologic heterogeneity between and within cancers is readily apparent in clinical imaging, and subjective descriptors of these differences, such as necrotic, spiculated, and enhancing, are common in the radiology lexicon. In the past several years, radiology research has increasingly focused on quantifying these imaging variations in an effort to understand their clinical and biologic implications (1,2). In parallel, technical advances now permit extensive molecular characterization of tumor cells in individual patients. This has led to increasing emphasis on personalized cancer therapy, in which treatment is based on the presence of specific molecular targets (3). However, recent studies (4,5) have shown that multiple genetic subpopulations coexist within cancers, reflecting extensive intratumoral somatic evolution. This heterogeneity is a clear barrier to therapy based on molecular targets, since the identified targets do not always represent the entire population of tumor cells in a patient (6,7). It is ironic that cancer, a disease extensively and primarily analyzed genetically, is also the most genetically flexible of all diseases and, therefore, least amenable to such an approach.

Genetic variations in tumors are typically ascribed to a mutator phenotype that generates new clones, some of which expand into large populations (8). However, although identification of genotypes is of substantial interest, it is insufficient for complete characterization of tumor dynamics because evolution is governed by the interactions of environmental selection forces with the phenotypic, not genotypic, properties of populations as shown, for example, by evolutionary convergence to identical phenotypes among cave fish even when they are from different species (9–11). This connection between tissue selection forces and cellular properties has the potential to provide a strong bridge between medical imaging and the cellular and molecular properties of cancers.

We postulate that differences within tumors at different spatial scales (ie, at the radiologic, cellular, and molecular [genetic] levels) are related. Tumor characteristics observable at clinical imaging reflect molecular-, cellular-, and tissue-level dynamics; thus, they may be useful in understanding the underlying evolving biology in individual patients. A challenge is that such mapping across spatial and temporal scales requires not only objective reproducible metrics for imaging features but also a theoretical construct that bridges those scales (Fig 1).


Figure 1a: Computed tomographic (CT) scan of right upper lobe lung cancer in a 50-year-old woman.


Figure 1b: Isoattenuation map shows regional heterogeneity at the tissue scale (measured in centimeters).


Figure 1c & 1d: (c, d)Whole-slide digital images (original magnification, ×3) of a histologic slice of the same tumor at the mesoscopic scale (measured in millimeters) (c) coupled with a masked image of regional morphologic differences showing spatial heterogeneity (d). 


Figure 1e: Subsegment of the whole slide image shows the microscopic scale (measured in micrometers) (original magnification, ×50).


Figure 1f: Pattern recognition masked image shows regional heterogeneity. In a, the CT image of non–small cell lung cancer can be analyzed to display gradients of attenuation, which reveals heterogeneous and spatially distinct environments (b). Histologic images in the same patient (c, e) reveal heterogeneities in tissue structure and density on the same scale as seen in the CT images. These images can be analyzed at much higher definition to identify differences in morphologies of individual cells (3), and these analyses reveal clusters of cells with similar morphologic features (d, f). An important goal of radiomics is to bridge radiologic data with cellular and molecular characteristics observed microscopically.

To promote the development and implementation of quantitative imaging methods, protocols, and software tools, the National Cancer Institute has established the Quantitative Imaging Network. One goal of this program is to identify reproducible quantifiable imaging features of tumors that will permit data mining and explicit examination of links between the imaging findings and the underlying molecular and cellular characteristics of the tumors. In the quest for more personalized cancer treatments, these quantitative radiologic features potentially represent nondestructive temporally and spatially variable predictive and prognostic biomarkers that readily can be obtained in each patient before, during, and after therapy.

Quantitative imaging requires computational technologies that can be used to reliably extract mineable data from radiographic images. This feature information can then be correlated with molecular and cellular properties by using bioinformatics methods. Most existing methods are agnostic and focus on statistical descriptions of existing data, without presupposing the existence of specific relationships. Although this is a valid approach, a more profound understanding of quantitative imaging information may be obtained with a theoretical hypothesis-driven framework. Such models use links between observable tumor characteristics and microenvironmental selection factors to make testable predictions about emergent phenotypes. One such theoretical framework is the developing paradigm of cancer as an ecologic and evolutionary process.

For decades, landscape ecologists have studied the effects of heterogeneity in physical features on interactions between populations of organisms and their environments, often by using observation and quantification of images at various scales (12–14). We propose that analytic models of this type can easily be applied to radiologic studies of cancer to uncover underlying molecular, cellular, and microenvironmental drivers of tumor behavior and specifically, tumor adaptations and responses to therapy (15).

In this article, we review recent developments in quantitative imaging metrics and discuss how they correlate with underlying genetic data and clinical outcomes. We then introduce the concept of using ecology and evolutionary models for spatially explicit image analysis as an exciting potential avenue of investigation.

 

Quantitative Imaging and Radiomics

In patients with cancer, quantitative measurements are commonly limited to measurement of tumor size with one-dimensional (Response Evaluation Criteria in Solid Tumors [or RECIST]) or two-dimensional (World Health Organization) long-axis measurements (16). These measures do not reflect the complexity of tumor morphology or behavior, and in many cases, changes in these measures are not predictive of therapeutic benefit (17). In contrast, radiomics (18) is a high-throughput process in which a large number of shape, edge, and texture imaging features are extracted, quantified, and stored in databases in an objective, reproducible, and mineable form (Figs 1, 2). Once transformed into a quantitative form, radiologic tumor properties can be linked to underlying genetic alterations (the field is called radiogenomics) (19–21) and to medical outcomes (22–27). Researchers are currently working to develop both a standardized lexicon to describe tumor features (28,29) and a standard method to convert these descriptors into quantitative mineable data (30,31) (Fig 3).


Figure 2: Contrast-enhanced CT scans show non–small cell lung cancer (left) and corresponding cluster map (right). Subregions within the tumor are identified by clustering pixels based on the attenuation of pixels and their cumulative standard deviation across the region. While the entire region of interest of the tumor, lacking the spatial information, yields a weighted mean attenuation of 859.5 HU with a large and skewed standard deviation of 243.64 HU, the identified subregions have vastly different statistics. Mean attenuation was 438.9 HU ± 45 in the blue subregion, 210.91 HU ± 79 in the yellow subregion, and 1077.6 HU ± 18 in the red subregion.

 


Figure 3: Chart shows the five processes in radiomics.

Several recent articles underscore the potential power of feature analysis. After manually extracting more than 100 CT image features, Segal and colleagues found that a subset of 14 features predicted 80% of the gene expression pattern in patients with hepatocellular carcinoma (21). A similar extraction of features from contrast agent–enhanced magnetic resonance (MR) images of glioblastoma was used to predict immunohistochemically identified protein expression patterns (22). Other radiomic features, such as texture, can be used to predict response to therapy in patients with renal cancer (32) and prognosis in those with metastatic colon cancer (33).

These pioneering studies were relatively small because the image analysis was performed manually, and the studies were consequently underpowered. Thus, recent work in radiomics has focused on technical developments that permit automated extraction of image features with the potential for high throughput. Such methods, which rely heavily on novel machine learning algorithms, can more completely cover the range of quantitative features that can describe tumor heterogeneity, such as texture, shape, or margin gradients or, importantly, different environments, or niches, within the tumors.

Generally speaking, texture in a biomedical image is quantified by identifying repeating patterns. Texture analyses fall into two broad categories based on the concepts of first- and second-order spatial statistics. First-order statistics are computed by using individual pixel values, and no relationships between neighboring pixels are assumed or evaluated. Texture analysis methods based on first-order statistics usually involve calculating cumulative statistics of pixel values and their histograms across the region of interest. Second-order statistics, on the other hand, are used to evaluate the likelihood of observing spatially correlated pixels (34). Hence, second-order texture analyses focus on the detection and quantification of nonrandom distributions of pixels throughout the region of interest.
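To make the distinction concrete, the following minimal NumPy sketch (a toy illustration, not a validated radiomics pipeline) computes first-order statistics from a region's pixel-value histogram and a second-order gray-level co-occurrence matrix (GLCM) from neighboring-pixel pairs; the image, bin count, and offset are all arbitrary choices for the example:

```python
import numpy as np

def first_order_stats(roi):
    """First-order statistics: computed from the pixel-value histogram,
    with no relationships between neighboring pixels assumed."""
    hist, _ = np.histogram(roi, bins=8, range=(roi.min(), roi.max() + 1))
    p = hist / hist.sum()
    p = p[p > 0]
    return {
        "mean": float(roi.mean()),
        "std": float(roi.std()),
        "entropy": float(-(p * np.log2(p)).sum()),
    }

def glcm(roi, levels, offset=(0, 1)):
    """Second-order statistics: co-occurrence matrix counting how often
    gray levels i and j appear at the given spatial offset."""
    m = np.zeros((levels, levels))
    dr, dc = offset
    rows, cols = roi.shape
    for r in range(rows - dr):
        for c in range(cols - dc):
            m[roi[r, c], roi[r + dr, c + dc]] += 1
    return m / m.sum()  # normalize counts to joint probabilities

# Toy "image": left half dark, right half bright.
img = np.array([[0, 0, 3, 3],
                [0, 0, 3, 3],
                [0, 0, 3, 3],
                [0, 0, 3, 3]])
stats = first_order_stats(img)
P = glcm(img, levels=4)
# GLCM contrast: sum_ij (i - j)^2 * P(i, j); sensitive to the spatial
# arrangement that the first-order histogram cannot see.
contrast = sum((i - j) ** 2 * P[i, j] for i in range(4) for j in range(4))
```

Note that the first-order statistics would be identical if the same pixels were shuffled randomly, while the GLCM contrast would change; that is exactly the nonrandom spatial structure second-order analysis is meant to detect.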

The technical developments that permit second-order texture analysis in tumors by using regional enhancement patterns on dynamic contrast-enhanced MR images were reviewed recently (35). One such technique that is used to measure heterogeneity of contrast enhancement uses the Factor Analysis of Medical Image Sequences (or FAMIS) algorithm, which divides tumors into regions based on their patterns of enhancement (36). Factor Analysis of Medical Image Sequences–based analyses yielded better prognostic information when compared with region of interest–based methods in numerous cancer types (19–21,37–39), and they were a precursor to the Food and Drug Administration–approved three-time-point method (40). A number of additional promising methods have been developed. Rose and colleagues showed that a structured fractal-based approach to texture analysis improved differentiation between low- and high-grade brain cancers by orders of magnitude (41). Ahmed and colleagues used gray level co-occurrence matrix analyses of dynamic contrast-enhanced images to distinguish benign from malignant breast masses with high diagnostic accuracy (area under the receiver operating characteristic curve, 0.92) (26). Others have shown that Minkowski functional structured methods that convolve images with differently kernelled masks can be used to distinguish subtle differences in contrast enhancement patterns and can enable significant differentiation between treatment groups (42).

It is not surprising that analyses of heterogeneity in enhancement patterns can improve diagnosis and prognosis, as this heterogeneity is fundamentally based on perfusion deficits, which generate significant microenvironmental selection pressures. However, texture analysis is not limited to enhancement patterns. For example, measures of heterogeneity in diffusion-weighted MR images can reveal differences in cellular density in tumors, which can be matched to histologic findings (43). Measures of heterogeneity in T1- and T2-weighted images can be used to distinguish benign from malignant soft-tissue masses (23). CT-based texture features have been shown to be highly significant independent predictors of survival in patients with non–small cell lung cancer (24).

Texture analyses can also be applied to positron emission tomographic (PET) data, where they can provide information about metabolic heterogeneity (25,26). In a recent study, Nair and colleagues identified 14 quantitative PET imaging features that correlated with gene expression (19). This led to an association of metagene clusters to imaging features and yielded prognostic models with hazard ratios near 6. In a study of esophageal cancer, in which 38 quantitative features describing fluorodeoxyglucose uptake were extracted, measures of metabolic heterogeneity at baseline enabled prediction of response with significantly higher sensitivity than any whole region of interest standardized uptake value measurement (22). It is also notable that these extensive texture-based features are generally more reproducible than simple measures of the standardized uptake value (27), which can be highly variable in a clinical setting (44).

 

Spatially Explicit Analysis of Tumor Heterogeneity

Although radiomic analyses have shown high prognostic power, they are not inherently spatially explicit. Quantitative border, shape, and texture features are typically generated over a region of interest that comprises the entire tumor (45). This approach implicitly assumes that tumors are heterogeneous but well mixed. However, spatially explicit subregions of cancers are readily apparent on contrast-enhanced MR or CT images, as perfusion can vary markedly within the tumor, even over short distances, with changes in tumor cell density and necrosis.

An example is shown in Figure 2, which shows a contrast-enhanced CT scan of non–small cell lung cancer. Note that there are many subregions within this tumor that can be identified with attenuation gradient (attenuation per centimeter) edge detection algorithms. Each subregion has a characteristic quantitative attenuation, with a narrow standard deviation, whereas the mean attenuation over the entire region of interest is a weighted average of the values across all subregions, with a correspondingly large and skewed distribution. We contend that these subregions represent distinct habitats within the tumor, each with a distinct set of environmental selection forces.
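The subregion statistics described for Figure 2 can be reproduced in miniature by clustering pixel attenuation values. The sketch below uses a quantile-seeded 1-D k-means in NumPy rather than the gradient edge-detection algorithm mentioned above; the attenuation bands and cluster count are invented for illustration:

```python
import numpy as np

def cluster_attenuation(values, k, iters=20):
    """Quantile-seeded 1-D k-means over pixel attenuation values (HU).
    A sketch only: assumes every cluster stays non-empty."""
    centers = np.quantile(values, [(i + 0.5) / k for i in range(k)])
    for _ in range(iters):
        # Assign each pixel to its nearest center, then recenter.
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        centers = np.array([values[labels == j].mean() for j in range(k)])
    return labels, centers

# Toy ROI: three attenuation bands standing in for distinct subregions.
hu = np.concatenate([200 + np.arange(10.0),
                     450 + np.arange(10.0),
                     1080 + np.arange(10.0)])
labels, centers = cluster_attenuation(hu, k=3)
# The whole-ROI standard deviation is large and misleading, while each
# identified subregion has a tight mean and narrow standard deviation.
```

This reproduces the qualitative point of Figure 2: a single region-of-interest mean hides subregions whose statistics differ vastly.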

These observations, along with the recent identification of regional variations in the genetic properties of tumor cells, indicate the need to abandon the conceptual model of cancers as bounded organlike structures. Rather than a single self-organized system, cancers represent a patchwork of habitats, each with a unique set of environmental selection forces and cellular evolution strategies. For example, regions of the tumor that are poorly perfused can be populated by only those cells that are well adapted to low-oxygen, low-glucose, and high-acid environmental conditions. Such adaptive responses to regional heterogeneity result in microenvironmental selection and hence, emergence of genetic variations within tumors. The concept of adaptive response is an important departure from the traditional view that genetic heterogeneity is the product of increased random mutations, which implies that molecular heterogeneity is fundamentally unpredictable and, therefore, chaotic. The Darwinian model proposes that genetic heterogeneity is the result of a predictable and reproducible selection of successful adaptive strategies to local microenvironmental conditions.

Current cross-sectional imaging modalities can be used to identify regional variations in selection forces by using contrast-enhanced, cell density–based, or metabolic features. Clinical imaging can also be used to identify evidence of cellular adaptation. For example, if a region of low perfusion on a contrast-enhanced study is necrotic, then an adaptive population is absent or minimal. However, if the poorly perfused area is cellular, then there is presumptive evidence of an adapted proliferating population. While the specific genetic properties of this population cannot be determined, the phenotype of the adaptive strategy is predictable since the environmental conditions are more or less known. Thus, standard medical images can be used to infer specific emergent phenotypes and, with ongoing research, these phenotypes can be associated with underlying genetic changes.

This area of investigation will likely be challenging. As noted earlier, the most obvious spatially heterogeneous imaging feature in tumors is perfusion heterogeneity on contrast-enhanced CT or MR images. It generally has been assumed that the links between contrast enhancement, blood flow, perfusion, and tumor cell characteristics are straightforward. That is, tumor regions with decreased blood flow will exhibit low perfusion, low cell density, and high necrosis. In reality, however, the dynamics are actually much more complex. As shown in Figure 4, when using multiple superimposed sequences from MR imaging of malignant gliomas, regions of tumor that are poorly perfused on contrast-enhanced T1-weighted images may exhibit areas of low or high water content on T2-weighted images and low or high diffusion on diffusion-weighted images. Thus, high or low cell densities can coexist in poorly perfused volumes, creating perfusion-diffusion mismatches. Regions with poor perfusion with high cell density are of particular clinical interest because they represent a cell population that is apparently adapted to microenvironmental conditions associated with poor perfusion. The associated hypoxia, acidosis, and nutrient deprivation select for cells that are resistant to apoptosis and thus are likely to be resistant to therapy (46,47).


Figure 4: Left: Contrast-enhanced T1 image from subject TCGA-02-0034 in The Cancer Genome Atlas–Glioblastoma Multiforme repository of MR volumes of glioblastoma multiforme cases. Right: Spatial distribution of MR imaging–defined habitats within the tumor. The blue region (low T1 postgadolinium, low fluid-attenuated inversion recovery) is particularly notable because it presumably represents a habitat with low blood flow but high cell density, indicating a population presumably adapted to hypoxic acidic conditions.

Furthermore, other selection forces not related to perfusion are likely to be present within tumors. For example, evolutionary models suggest that cancer cells, even in stable microenvironments, tend to speciate into “engineers” that maximize tumor cell growth by promoting angiogenesis and “pioneers” that proliferate by invading normal tissue and co-opting the blood supply. These invasive tumor phenotypes can exist only at the tumor edge, where movement into a normal tissue microenvironment can be rewarded by increased proliferation. This evolutionary dynamic may contribute to distinct differences between the tumor edges and the tumor cores, which frequently can be seen at analysis of cross-sectional images (Fig 5).


Figure 5a: CT images obtained with conventional entropy filtering in two patients with non–small cell lung cancer with no apparent textural differences show similar entropy values across all sections. 


Figure 5b: Contour plots obtained after the CT scans were convolved with the entropy filter. Further subdividing each section in the tumor stack into tumor edge and core regions (dotted black contour) reveals varying textural behavior across sections. Two distinct patterns have emerged, and preliminary analysis shows that the change of mean entropy value between core and edge regions correlates negatively with survival.

Interpretation of the subsegmentation of tumors will require computational models to understand and predict the complex nonlinear dynamics that lead to heterogeneous combinations of radiographic features. We have exploited ecologic methods and models to investigate regional variations in cancer environmental and cellular properties that lead to specific imaging characteristics. Conceptually, this approach assumes that regional variations in tumors can be viewed as a coalition of distinct ecologic communities or habitats of cells in which the environment is governed, at least to first order, by variations in vascular density and blood flow. The environmental conditions that result from alterations in blood flow, such as hypoxia, acidosis, immune response, growth factors, and glucose, represent evolutionary selection forces that give rise to local-regional phenotypic adaptations. Phenotypic alterations can result from epigenetic, genetic, or chromosomal rearrangements, and these in turn will affect prognosis and response to therapy. Changes in habitats or the relative abundance of specific ecologic communities over time and in response to therapy may be a valuable metric with which to measure treatment efficacy and emergence of resistant populations.

 

Emerging Strategies for Tumor Habitat Characterization

A method for converting images to spatially explicit tumor habitats is shown in Figure 4. Here, three-dimensional MR imaging data sets from a glioblastoma are segmented. Each voxel in the tumor is defined by a scale that includes its image intensity in different sequences. In this case, the imaging sets are from (a) a contrast-enhanced T1 sequence, (b) a fast spin-echo T2 sequence, and (c) a fluid-attenuated inversion-recovery (or FLAIR) sequence. Voxels in each sequence can be defined as high or low based on their value compared with the mean signal value. By using just two sequences, a contrast-enhanced T1 sequence and a fluid-attenuated inversion-recovery sequence, we can define four habitats: high or low postgadolinium T1 divided into high or low fluid-attenuated inversion recovery. When these voxel habitats are projected into the tumor volume, we find they cluster into spatially distinct regions. These habitats can be evaluated both in terms of their relative contributions to the total tumor volume and in terms of their interactions with each other, based on the imaging characteristics at the interfaces between regions. Similar spatially explicit analysis can be performed with CT scans (Fig 5).
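A minimal sketch of the voxel-habitat scheme just described, assuming the two sequences are already co-registered and resampled to the same grid (the toy 2×2 arrays below are invented so that each voxel lands in a different habitat):

```python
import numpy as np

def define_habitats(t1_post, flair):
    """Threshold each co-registered sequence at its mean signal value and
    combine the two binary maps into four habitat labels per voxel:
    0 = low T1/low FLAIR,  1 = low T1/high FLAIR,
    2 = high T1/low FLAIR, 3 = high T1/high FLAIR."""
    t1_high = (t1_post >= t1_post.mean()).astype(int)
    fl_high = (flair >= flair.mean()).astype(int)
    return 2 * t1_high + fl_high

def habitat_fractions(labels, n_habitats=4):
    """Relative contribution of each habitat to the total tumor volume."""
    return np.bincount(labels.ravel(), minlength=n_habitats) / labels.size

# Toy "volumes": one voxel per habitat.
t1 = np.array([[1.0, 1.0], [9.0, 9.0]])
fl = np.array([[1.0, 9.0], [1.0, 9.0]])
habitats = define_habitats(t1, fl)
```

The habitat-0 voxels (low postgadolinium T1, low FLAIR) correspond to the blue region highlighted in Figure 4; extending the same scheme to three sequences simply doubles the number of habitat labels.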

Analysis of spatial patterns in cross-sectional images will ultimately require methods that bridge spatial scales from microns to millimeters. One possible method is a general class of numeric tools that is already widely used in terrestrial and marine ecology research to link species occurrence or abundance with environmental parameters. Species distribution models (48–51) are used to gain ecologic and evolutionary insights and to predict distributions of species or morphs across landscapes, sometimes extrapolating in space and time. They can easily be used to link the environmental selection forces in MR imaging-defined habitats to the evolutionary dynamics of cancer cells.
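In the same spirit, a species-distribution-style model can be sketched as a plain logistic regression linking environmental covariates to the occurrence of a population; everything below (the single perfusion covariate, the occurrence labels, the learning rate) is hypothetical illustration, not a method from the article:

```python
import numpy as np

def fit_sdm(X, y, lr=0.1, epochs=2000):
    """Tiny species-distribution-style model: logistic regression from
    environmental covariates to probability of population occurrence,
    fitted by batch gradient descent on the log-loss."""
    X1 = np.hstack([np.ones((len(X), 1)), X])  # intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))
        w -= lr * X1.T @ (p - y) / len(y)      # log-loss gradient step
    return w

def predict_sdm(w, X):
    X1 = np.hstack([np.ones((len(X), 1)), X])
    return 1.0 / (1.0 + np.exp(-X1 @ w))

# Hypothetical covariate (e.g., a normalized perfusion score per habitat)
# and observed occurrence (1) or absence (0) of an adapted cell population.
X = np.array([[-2.0], [-1.0], [-0.5], [0.5], [1.0], [2.0]])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
w = fit_sdm(X, y)
probs = predict_sdm(w, X)
```

Real species distribution models use richer covariates and spatial autocorrelation terms, but the core linkage, environment in, occurrence probability out, is the same.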

Summary

Imaging can have an enormous role in the development and implementation of patient-specific therapies in cancer. The achievement of this goal will require new methods that expand and ultimately replace the current subjective qualitative assessments of tumor characteristics. The need for quantitative imaging has been clearly recognized by the National Cancer Institute and has resulted in formation of the Quantitative Imaging Network. A critical objective of this imaging consortium is to use objective, reproducible, and quantitative feature metrics extracted from clinical images to develop patient-specific imaging-based prognostic models and personalized cancer therapies.

It is increasingly clear that tumors are not homogeneous organlike systems. Rather, they contain regional coalitions of ecologic communities that consist of evolving cancer, stroma, and immune cell populations. The clinical consequence of such niche variations is that spatial and temporal variations of tumor phenotypes will inevitably evolve and present substantial challenges to targeted therapies. Hence, future research in cancer imaging will likely focus on spatially explicit analysis of tumor regions.

Clinical imaging can readily characterize regional variations in blood flow, cell density, and necrosis. When viewed in a Darwinian evolutionary context, these features reflect regional variations in environmental selection forces and can, at least in principle, be used to predict the likely adaptive strategies of the local cancer population. Hence, analyses of radiologic data can be used to inform evolutionary models and then can be mapped to regional population dynamics. Ecologic and evolutionary principles may provide a theoretical framework to link imaging to the cellular and molecular features of cancer cells and ultimately lead to a more comprehensive understanding of specific cancer biology in individual patients.

 

Essentials

  • Marked heterogeneity in genetic properties of different cells in the same tumor is typical and reflects ongoing intratumoral evolution.
  • Evolution within tumors is governed by Darwinian dynamics, with identifiable environmental selection forces that interact with phenotypic (not genotypic) properties of tumor cells in a predictable and reproducible manner; clinical imaging is uniquely suited to measure temporal and spatial heterogeneity within tumors that is both a cause and a consequence of this evolution.
  • Subjective radiologic descriptors of cancers are inadequate to capture this heterogeneity and must be replaced by quantitative metrics that enable statistical comparisons between features describing intratumoral heterogeneity and clinical outcomes and molecular properties.
  • Spatially explicit mapping of tumor regions, for example by superimposing multiple imaging sequences, may permit patient-specific characterization of intratumoral evolution and ecology, leading to patient- and tumor-specific therapies.
  • We summarize current information on quantitative analysis of radiologic images and propose that future quantitative imaging must become spatially explicit to identify intratumoral habitats before and during therapy.

Disclosures of Conflicts of Interest: R.A.G. No relevant conflicts of interest to disclose. O.G. No relevant conflicts of interest to disclose. R.J.G. No relevant conflicts of interest to disclose.

 

Acknowledgments

The authors thank Mark Lloyd, MS; Joel Brown, PhD; Dmitry Goldgoff, PhD; and Larry Hall, PhD, for their input to image analysis and for their lively and informative discussions.

Footnotes

  • Received December 18, 2012; revision requested February 5, 2013; revision received March 11; accepted April 9; final version accepted April 29.
  • Funding: This research was supported by the National Institutes of Health (grants U54CA143970-01, U01CA143062, R01CA077575, and R01CA170595).

