
Protecting Your Biotech IP and Market Strategy: Notes from Life Sciences Collaborative 2015 Meeting

 


Reporter: Stephen J. Williams, PhD

Article ID #169: Protecting Your Biotech IP and Market Strategy: Notes from Life Sciences Collaborative 2015 Meeting. Published on 3/11/2015

WordCloud Image Produced by Adam Tubman

Achievement Beyond Regulatory Approval – Design for Commercial Success

Stephen J. Williams, Ph.D.: Reporter

The Life Sciences Collaborative is a select Mid-Atlantic group of industry veterans and executives from the pharmaceutical, biotechnology, and medical device sectors whose mission is to increase the success of emerging life sciences businesses in the region through networking, education, training, and mentorship. The group met on Tuesday, March 3, 2015 at the University of the Sciences in Philadelphia (USP) to discuss post-approval regulatory issues and concerns such as designing strong patent protection, developing strategies for insurance reimbursement, and securing financing at any stage of a business.

The meeting was divided into three panel discussions and a keynote speech:

  1. Panel 1: Design for Market Protection– Intellectual Property Strategy Planning
  2. Panel 2: Design for Market Success– Commercial Strategy Planning
  3. Panel 3: Design for Investment– Financing Each Stage
  4. Keynote Speaker: Robert Radie, President & CEO Egalet Corporation

Below are Notes from each PANEL Discussion:

For more information about the Life Sciences Collaborative SEE

Website: http://www.lifesciencescollaborative.org/

Or On Facebook

Or On Twitter @LSCollaborative

Panel 1: Design for Market Protection; Intellectual Property Strategy Planning

Take-home Message: Developing a very strong Intellectual Property (IP) portfolio and strategy for a startup is CRITICALLY IMPORTANT for its long-term success. Potential investors, partners, and acquirers will focus on the strength of a startup’s IP, so it is important to take advantage of the legal services available. Do your DUE DILIGENCE.

Panelists:

John F. Ritter, JD, MBA; Director, Office of Technology Licensing, Princeton University

Cozette McAvoy; Senior Attorney Novartis Oncology Pharma Patents

Ryan O’Donnell; Partner Volpe & Koenig

Panel Moderator: Dipanjan “DJ” Nag, PhD, MBA, CLP, RTTP; President & CEO, IP Shakti, LLC

Notes:

Dr. Nag:

  • Sometimes IP can be a double-edged sword; e.g., Herbert Boyer, along with Paul Berg and Stanley Cohen, is credited with developing recombinant DNA technology, but they did not keep the IP tightly held, which opened the door for a biotech revolution (see the nice review from the Chemical Heritage Foundation).
  • Naked patent licenses are most profitable when trying to sell IP

John Ritter: Mr. Ritter gave Princeton University’s perspective on developing and promoting a university-based IP portfolio.

  • 30-40% of Princeton’s IP portfolio is related to life sciences
  • Universities prefer to seek provisional patent status because it is a quicker process and allows for publication
  • Princeton will work closely with investigators to walk them through the process – it is Very Important to have a support system in place, INCLUDING helping investigators and early startups establish a STRONG startup MANAGEMENT TEAM, and making important introductions to and DEVELOPING RELATIONSHIPS with investors and angels
  • Good to cast a wide net when looking at early development partners like pharma
  • A good example of a university that takes an active role in developing startups is the University of Pennsylvania’s Penn UPstart program.
  • In the last 2 years, many universities have been filing patents for startups as micro-entities

Comment from attendee: Universities are not using enough of their endowments for the purpose of supporting startups. Princeton is only using $500,000 for its accelerator program.

Cozette McAvoy: Mrs. McAvoy talked about monetizing your IP from an industry perspective

  • Industry is now looking at “indirect monetization” of their own and others’ IP portfolios. Indirect monetization refers to unlocking the “indirect value” of intellectual property; for example, research tools and processes, which may or may not be related to a tangible product.
  • Good to make a contractual bundle of IP – “the days of the $million check are gone”
  • Big companies like big pharma look to the PR (press relations) buzz surrounding new technology and products, SO IT IS IMPORTANT FOR A STARTUP TO FOCUS ON ITS PR

Ryan O’Donnell: talked about how life science IP has changed, especially due to the America Invents Act

  • Need to develop a GLOBAL IP strategy so that, whether it is a drug or a device, you can market in multiple countries
  • Diagnostics and genes are not patentable now – a major shift in patent strategy
  • Companies like Unified Patents can protect you against the patent trolls – if a patent is threatened by a patent troll (patent assertion entity), they will file a petition with the USPTO (US Patent and Trademark Office) requesting institution of inter partes review (IPR); this may cost $40,000 BUT IS WELL WORTH the money – BE PROACTIVE about your patents and IP

Panel 2: Design for Market Success; Commercial Strategy Planning

Take-home Message: Commercial strategy development is defined by market-facing data, reimbursement strategies, and commercial planning that inform labeling requirements, clinical study designs, healthcare economic outcomes, and pricing targets. Clarity from payers is extremely important to develop any market strategy. Develop this strategy early and seek advice from payers.

Panelists:

David Blaszczak; Founder, Precipio Health Strategies

Terri Bernacchi, PharmD, MBA; Founder & President Cambria Health Advisory Professionals

Paul Firuta; President US Commercial Operations, NPS Pharma

 

Panel Moderator: Matt Cabrey; Executive Director, Select Greater Philadelphia

 

Notes:

David Blaszczak:

  • Commercial payers are bundling payment: most important to get clarity from these payers
  • Payers are using clinical trials to alter marketing (labeling), so it is IMPORTANT to BUILD the LABEL in early clinical trial phases (phase I or II)
  • When a small company is in its early phases, it is best to team or partner with Medicare or a PBM (pharmacy benefit manager) and payers to help develop and spot tier 1 and tier 2 companies in their area

Terri Bernacchi:

  • Building a relationship with the payer is very important, but firms like hers will also look to patients and advocacy groups to see how they respond to a given therapy, and will decrease the price risk by bundling
  • Value-based contracting with manufacturers can save patient and payer $$
  • As most PBMs’ formularies are 80% generics, the goal is how to make money off of generics
  • Patent extension would have the greatest impact on price and value

Paul Firuta:

  • NPS Pharma is developing a pharmacy benefit program for orphan diseases
  • How you pay now depends on the mix of Medicare and private payers
  • The most important change which could affect price is a change in compliance regulations

Panel 3: Design for Investment; Financing Each Stage

Take-home Message: VC is a personal relationship so spend time making those relationships. Do your preparation on your value and your market. Look to non-VC avenues: they are out there.

Panelists:

Ting Pau Oei; Managing Director, Easton Capital (NYC)

Manya Deehr; CEO & Founder, Pediva Therapeutics

Sanjoy Dutta, PhD; Assistant VP, Translational Devel. & Intl. Res., Juvenile Diabetes Research Foundation

 

Panel Moderator: Shahram Hejazi, PhD; Venture Partner, BioAdvance

  • In 2000, his experience was that finding first capital was about “what are your assets”; now it has changed to value

Notes:

Ting Pau Oei:

  • Your very 1st capital is all about VALUE– so plan where you add value
  • Venture Capital is a PERSONAL RELATIONSHIP
  • 1) you need the management team, 2) you need to be able to communicate effectively (PowerPoint, elevator pitch, business plan); #1 and #2 will get you the important 2nd venture capital meeting; VCs don’t decide anything in the 1st meeting
  • VCs don’t normally do a good job of premarket valuation or premarket due diligence but know post-market valuation well
  • Best advice: show some phase 2 milestones and VC will knock on your door

Manya Deehr:

  • Investment is more niche oriented so find your niche investors
  • Define your product first and then match the investors
  • Biggest failure she has experienced: companies that go out too early looking for capital

Dr. Dutta: funding from a non-profit patient advocacy group perspective

  • Your First Capital: find alliances which can help you get out of the “valley of death”
  • Develop a targeted product and patient treatment profile
  • Non-profit groups ask three questions:

1) what is the value to patients (non-profits want to partner)

2) what is your timeline (we can wait longer than VCs; for example, the Cystic Fibrosis Foundation waited a long time but got great returns for its patients with Kalydeco™)

3) when can we see return

  • Long-term market projections (the landscape) are a knowledge gap for startups, and startups don’t have all the competitive intelligence
  • Have a plan B every step of the way

Other posts on this site related to Philadelphia Biotech, Startup Funding, Payer Issues, and Intellectual Property Issues include:

PCCI’s 7th Annual Roundtable “Crowdfunding for Life Sciences: A Bridge Over Troubled Waters?” May 12 2014 Embassy Suites Hotel, Chesterbrook PA 6:00-9:30 PM
The Vibrant Philly Biotech Scene: Focus on KannaLife Sciences and the Discipline and Potential of Pharmacognosy
The Vibrant Philly Biotech Scene: Focus on Computer-Aided Drug Design and Gfree Bio, LLC
The Vibrant Philly Biotech Scene: Focus on Vaccines and Philimmune, LLC
The Bioscience Crowdfunding Environment: The Bigger Better VC?
Foundations as a Funding Source
Venture Capital Funding in the Life Sciences: Phase4 Ventures – A Case Study
10 heart-focused apps & devices are crowdfunding for American Heart Association’s open innovation challenge
Funding, Deals & Partnerships
Medicare Panel Punts on Best Tx for Carotid Plaque
9:15AM–2:00PM, January 27, 2015 – Regulatory & Reimbursement Frameworks for Molecular Testing, LIVE @Silicon Valley 2015 Personalized Medicine World Conference, Mountain View, CA
FDA Commissioner, Dr. Margaret A. Hamburg on HealthCare for 310Million Americans and the Role of Personalized Medicine
Biosimilars: Intellectual Property Creation and Protection by Pioneer and by Biosimilar Manufacturers
Litigation on the Way: Broad Institute Gets Patent on Revolutionary Gene-Editing Method
The Patents for CRISPR, the DNA editing technology as the Biggest Biotech Discovery of the Century

 

 


The Union of Biomarkers and Drug Development

Author and Curator: Larry H. Bernstein, MD, FCAP

There has been consolidation going on for over a decade in both the pharmaceutical and the diagnostics industry, and at the same time the page is being rewritten for health care delivery.  I shall try to work through a clear picture of these not-coincidental events.

Key notables:

  1. A growing segment of the US population is reaching Medicare age
  2. There is also a large underserved population in both metropolitan and nonurban areas and a fragmentation of the middle class after a growth slowdown in the economy since the 2008 deep recession.
  3. The deep recession affecting worldwide economies was only buffered by availability of oil or natural gas.
  4. In addition, there was a self-destructive strategy to cut spending on national scales that withdrew the support needed for infrastructure renewal.
  5. The clinical diagnostics industry, despite a long history of being viewed as a loss leader, has had dramatic success; this has recently been followed by a pharmaceutical industry faced with an inability to introduce new products, leading to more competition in off-patent medications.
  6. The introduction of the Accountable Care Act has opened the opportunities for improved care, despite political opposition, and has probably sustained opportunity in the healthcare market.

Let’s take a look at this three-headed serpent – Pharma, Diagnostics, New Entity:
  • The patient
  • Insurance
  • Physician

Part I.   The Concept

When Illumina Buys Roche: The Dawning Of The Era Of Diagnostics Dominance

Robert J. Easton, Alain J. Gilbert, Olivier Lesueur, Rachel Laing, and Mark Ratner
http://PharmaMedtechBI.com    | IN VIVO: The Business & Medicine Report Jul/Aug 2014; 32(7).

  • With current technology and resources, a well-funded IVD company can create and pursue a strategy of information gathering and informatics application to create medical knowledge, enabling it to assume the risk and manage certain segments of patients
  • We see the first step in the process as the emergence of new specialty therapy companies coming from an IVD legacy, most likely focused in cancer, infection, or critical care

When Illumina Inc. acquired the regulatory consulting firm Myraqa, a specialist in in vitro diagnostics (IVD), in July, the press release announcement characterized the deal as one that would bolster Illumina’s in-house capabilities for clinical readiness and help prepare for its next growth phase in regulated markets. That’s not surprising given the US Food and Drug Administration’s (FDA) approval a year and a half ago of its MiSeq next-generation sequencer for clinical use. But the deal could also suggest Illumina is beginning to move along the path toward taking on clinical risk – that is, eventually advising physicians and patients, which would mean facing regulators directly.

Such a move – by Illumina, another life sciences tools firm, or an information specialist from the high-tech universe – is inevitable given the emerging power of diagnostics and traditional health care players’ reluctance to themselves take on such risk.

Alternatively, we believe that a well-funded diagnostics company could establish this position. Either way, such a champion would establish dominion over, and earn higher valuation than, less-aggressive players who only supply compartmentalized drug and device solutions.

Diagnostics companies have long been dogged by a fundamental issue:

  1. They are viewed and valued more along the lines of a commodity business than as firms that deliver a unique product or service.
  2. However, diagnostics companies are in a position to deliver such a unique product or service today because they are now advantaged by having access to more data points.
  3. If they were to cobble together the right capabilities, diagnostics companies would have the ability to turn information into true medical knowledge.

Example: PathGEN PathChip

nucleic-acid-based platform detects 296 viruses, bacteria, fungi & parasites

http://ow.ly/d/2GvQ
http://ow.ly/DSORV

This puts the diagnostics player in an unfamiliar realm, where it can ask what value it offers compared with a therapeutic. The key is that diagnostics can now offer unique information and potentially unique tools to capture that information. In order to do so, the diagnostics player has to create information from the data it generates and then supply that knowledge to users who will value and act on it. Complex genomic tests, as much as physical examination, may be the first meaningful touch point for physicians’ classification of disease.

Even if lab tests are more expensive, they are a cheaper means for deciding what to do first for a patient than the trial and error of prescribing medication without adequate information. Information is gaining in value as the amount of treatment data available on genomically characterizable subpopulations increases. In such a circumstance, the leverage lies in being able to apply a proprietary diagnostics platform – and, importantly, the data it generates. It is the ability to perform that advisory function that will add tremendous value above what any test provides.

Integrated Diagnostics Inc. and Biodesix Inc., with mass spectrometry, have the tools for unraveling disease processes, and numerous players are quite visibly in, or are getting into, the business of providing medical knowledge and clinical decision support in pursuit of a huge payout for those who actually solve important disease mysteries. Of course, one has to ask whether MS/MS is sufficient for the assigned task, and also whether the technology is ready for the kind of workload experienced in a clinical service compared to a research vehicle.  My impression (as a reviewer) is that now is not the time to take this seriously.

Roche has not realized its intent with Ventana, which failed to deliver on the promise of boosting Roche’s pipeline – a significant factor in the high price Roche paid. The combined company was to be “uniquely positioned to further expand Ventana’s business globally and together develop more cost-efficient, differentiated, and targeted medicines.”  On the other hand, Biodesix decided to use VeriStrat to look back and analyze important trial data to try to ascertain which patients (a subset) would benefit from ficlatuzumab. The predictive effect for the otherwise unimpressive trial results was observed in both progression-free survival and overall survival endpoints, and encouraged the companies to conduct a proof-of-concept study of ficlatuzumab in combination with Tarceva in advanced non-small cell lung cancer (NSCLC) patients selected using the VeriStrat test.

A second phase of IVD evolution will be far more challenging to pharma, when the most accomplished companies begin to assemble and integrate much broader data sets, thereby gaining knowledge sufficient to actually manage patients and dictate therapy, including drug selection. No individual physician has or will have access to all of this information on thousands of patients, combined with the informatics to tease out from trillions of data points the optimal personalized medical approach. When the IVD-origin knowledge integrator amasses enough data and understanding to guide therapy decisions in large categories, particularly drug choices, it will become more valuable than any of the drug suppliers.

This is an apparent reversal of fortune. The pharmaceutical industry has been considered the valued provider, while the IVD manufacturer has been the low-valued cousin. Now, it is by an ability to make drug administration more accurate that the IVD company can control the drug bill, to the detriment of drug developers, by finding algorithms that generate equal-to-innovative-drug outcomes using generics for most patients, thereby limiting the margins of drug suppliers and the upsides for new drug discovery/development.

It is here that there appears to be a misunderstanding of the whole picture of the development of the healthcare industry.  The pharmaceutical industry had high value added only insofar as it could replace market leaders for treatment before or at the time of patent expiration, which largely depended on either introducing a new class of drug or relieving the current drug in its class of undesired toxicities or “side effects.”  Otherwise, the drug armamentarium was time-limited to the expiration date. In other words, the value was dependent on a window of no competition.  In addition, as the regulation of healthcare costs was tightening under managed care, the introduction of new products that were deemed to be only marginally better could be substituted with “off-patent” drug products.

The other misunderstanding is related to the IVD sector.  Laboratory tests in the 1950s were manual, and they could be done by “technicians” who might not have completed specialized training in clinical laboratory sciences.  The first sign of progress was the introduction of continuous-flow chemistry, with a sampling probe, tubing to bring the reacting reagents into a photocell, and the timing of the reaction controlled by a coiled glass tube before introducing the colored product into a UV-visible photometer.  In perhaps a decade, the Technicon SMA 12 and 6 instruments were introduced that could do up to 18 tests from a single sample.

Part 2. Emergence of an IVD Clinical Automated Diagnostics Industry

Why tests are ordered

  1. Screening
  2. Diagnosis
  3. Monitoring

Historical Perspective

Case in Point 1:  Outstanding Contributions in Clinical Chemistry. 1991. Arthur Karmen.

Dr. Karmen was born in New York City in 1930. He graduated from the Bronx High School of Science in 1946 and earned an A.B. and M.D. in 1950 and 1954, respectively, from New York University. In 1952, while a medical student working on a summer project at Memorial Sloan-Kettering, he used paper chromatography of amino acids to demonstrate the presence of glutamic-oxaloacetic and glutamic-pyruvic transaminases (aspartate and alanine aminotransferases) in serum and blood. In 1954, he devised the spectrophotometric method for measuring aspartate aminotransferase in serum, which, with minor modifications, is still used for diagnostic testing today. When developing this assay, he studied the reaction of NADH with serum and demonstrated the presence of lactate and malate dehydrogenases, both of which were also later used in diagnosis. Using the spectrophotometric method, he found that aspartate aminotransferase increased in the period immediately after an acute myocardial infarction and did the pilot studies that showed its diagnostic utility in heart and liver diseases.  This became as important as the EKG. It was replaced in cardiology usage by the MB isoenzyme of creatine kinase, which was driven by Burton Sobel’s work on infarct size, and later by the troponins.

Case in point 2: Arterial Blood Gases.  Van Slyke. National Academy of Sciences.

The test is used to determine the pH of the blood, the partial pressure of carbon dioxide and oxygen, and the bicarbonate level. Many blood gas analyzers will also report concentrations of lactate, hemoglobin, several electrolytes, oxyhemoglobin, carboxyhemoglobin, and methemoglobin. ABG testing is mainly used in pulmonology and critical care medicine to assess gas exchange across the alveolar-capillary membrane.
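Blood gas analyzers measure pH and pCO2 directly and report bicarbonate as a value calculated from them via the Henderson-Hasselbalch relationship. A minimal Python sketch of that calculation (the function name is illustrative; 6.1 and 0.0307 mmol/L per mm Hg are the usual textbook constants for plasma, not values taken from any particular analyzer):

def calculated_bicarbonate(ph, pco2_mm_hg):
    """Estimate plasma bicarbonate (mmol/L) from pH and pCO2 (mm Hg)
    using Henderson-Hasselbalch: pH = 6.1 + log10(HCO3 / (0.0307 * pCO2))."""
    return 0.0307 * pco2_mm_hg * 10 ** (ph - 6.1)

# Normal arterial values (pH 7.40, pCO2 40 mm Hg) give roughly 24-25 mmol/L.
print(round(calculated_bicarbonate(7.40, 40.0), 1))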

DONALD DEXTER VAN SLYKE died on May 4, 1971, after a long and productive career that spanned three generations of biochemists and physicians. He left behind not only a bibliography of 317 journal publications and 5 books, but also more than 100 persons who had worked with him and distinguished themselves in biochemistry and academic medicine. His doctoral thesis, with Gomberg at the University of Michigan, was published in the Journal of the American Chemical Society in 1907.  Van Slyke received an invitation from Dr. Simon Flexner, Director of the Rockefeller Institute, to come to New York for an interview. In 1911 he spent a year in Berlin with Emil Fischer, who was then the leading chemist of the scientific world. He was particularly impressed by Fischer’s performing all laboratory operations quantitatively – a procedure Van followed throughout his life. Prior to going to Berlin, he published the classic nitrous acid method for the quantitative determination of primary aliphatic amino groups, the first of the many gasometric procedures devised by Van Slyke; it made possible the determination of amino acids. It was the primary method used to study the amino acid composition of proteins for years before chromatography. Thus, his first seven postdoctoral years were centered around the development of better methodology for protein composition and amino acid metabolism.

With his colleague G. M. Meyer, he first demonstrated that amino acids, liberated during digestion in the intestine, are absorbed into the bloodstream, that they are removed by the tissues, and that the liver alone possesses the ability to convert the amino acid nitrogen into urea.  From the study of the kinetics of urease action, Van Slyke and Cullen developed equations that depended upon two reactions: (1) the combination of enzyme and substrate in stoichiometric proportions and (2) the reaction of the combination into the end products. Published in 1914, this formulation, involving two velocity constants, was similar to that arrived at contemporaneously by Michaelis and Menten in Germany in 1913.
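For reference, the rate law that Michaelis and Menten published, equivalent in practical form to the Van Slyke-Cullen two-constant treatment described above, is usually written today as (this is the standard textbook statement in LaTeX notation, not a transcription of either original paper):

v = \frac{V_{\max}\,[S]}{K + [S]}

where v is the observed velocity, [S] the substrate concentration, V_max the limiting velocity at saturating substrate, and K a constant built from the velocity constants of enzyme-substrate combination and breakdown.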

He transferred to the Rockefeller Institute’s Hospital in 1914, under Dr. Rufus Cole, where “Men who were studying disease clinically had the right to go as deeply into its fundamental nature as their training allowed, and in the Rockefeller Institute’s Hospital every man who was caring for patients should also be engaged in more fundamental study.”  The study of diabetes was already under way by Dr. F. M. Allen, but patients inevitably died of acidosis.  Van Slyke reasoned that if incomplete oxidation of fatty acids in the body led to the accumulation of acetoacetic and beta-hydroxybutyric acids in the blood, then a reaction would result between these acids and the bicarbonate ions that would lead to a lower-than-normal bicarbonate concentration in blood plasma. The problem thus became one of devising an analytical method that would permit the quantitative determination of bicarbonate concentration in small amounts of blood plasma.  He ingeniously devised a volumetric glass apparatus that was easy to use and required less than ten minutes for the determination of the total carbon dioxide in one cubic centimeter of plasma.  It also was soon found to be an excellent apparatus by which to determine blood oxygen concentrations, thus leading to measurements of the percentage saturation of blood hemoglobin with oxygen. This found extensive application in the study of respiratory diseases, such as pneumonia and tuberculosis. It also led to the quantitative study of cyanosis and a monograph on the subject by C. Lundsgaard and Van Slyke.

In all, Van Slyke and his colleagues published twenty-one papers under the general title “Studies of Acidosis,” beginning in 1917 and ending in 1934. They included not only chemical manifestations of acidosis, but Van Slyke, in No. 17 of the series (1921), elaborated and expanded the subject to describe in chemical terms the normal and abnormal variations in the acid-base balance of the blood. This was a landmark in understanding acid-base balance pathology.  Within seven years after Van moved to the Hospital, he had published a total of fifty-three papers, thirty-three of them coauthored with clinical colleagues.

In 1920, Van Slyke and his colleagues undertook a comprehensive investigation of gas and electrolyte equilibria in blood. McLean and Henderson at Harvard had made preliminary studies of blood as a physico-chemical system, but realized that Van Slyke and his colleagues at the Rockefeller Hospital had superior techniques and the facilities necessary for such an undertaking. A collaboration thereupon began between the two laboratories, which resulted in rapid progress toward an exact physico-chemical description of the role of hemoglobin in the transport of oxygen and carbon dioxide, of the distribution of diffusible ions and water between erythrocytes and plasma, and of factors such as the degree of oxygenation of hemoglobin and hydrogen ion concentration that modified these distributions. In this work, Van Slyke revised his volumetric gas analysis apparatus into a manometric method.  The manometric apparatus proved to give results that were from five to ten times more accurate.

A series of papers on the CO2 titration curves of oxy- and deoxyhemoglobin, of oxygenated and reduced whole blood, and of blood subjected to different degrees of oxygenation, and on the distribution of diffusible ions in blood, resulted.  These developed equations that predicted the change in distribution of water and diffusible ions between blood plasma and blood cells when there was a change in pH of the oxygenated blood. A significant contribution of Van Slyke and his colleagues was the application of the Gibbs-Donnan Law to the blood – regarded as a two-phase system, in which one phase (the erythrocytes) contained a high concentration of nondiffusible negative ions, i.e., those associated with hemoglobin, and cations, which were not freely exchangeable between cells and plasma. By changing the pH through varying the CO2 tension, the concentration of negative hemoglobin charges changed by a predictable amount. This, in turn, changed the distribution of diffusible anions such as Cl– and HCO3– in order to restore the Gibbs-Donnan equilibrium. Redistribution of water occurred to restore osmotic equilibrium. The experimental results confirmed the predictions of the equations.
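Stated explicitly, the Gibbs-Donnan condition invoked above requires that the concentration ratios of the diffusible anions between erythrocytes (c) and plasma (p) be equal (the standard textbook statement of the law in LaTeX notation, not a formula quoted from Van Slyke’s papers):

\frac{[\mathrm{Cl^-}]_c}{[\mathrm{Cl^-}]_p} = \frac{[\mathrm{HCO_3^-}]_c}{[\mathrm{HCO_3^-}]_p} = r

A change in CO2 tension changes the pH and hence the net charge carried by hemoglobin, which shifts r; chloride and bicarbonate redistribute across the red cell membrane (the chloride shift), and water follows to restore osmotic equilibrium.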

As a spin-off from the physico-chemical study of the blood, Van undertook, in 1922, to put the concept of buffer value of weak electrolytes on a mathematically exact basis. This proved to be useful in determining buffer values of mixed, polyvalent, and amphoteric electrolytes, and put the understanding of buffering on a quantitative basis. A monograph in Medicine entitled “Observation on the Courses of Different Types of Bright’s Disease, and on the Resultant Changes in Renal Anatomy” was a landmark that related the changes occurring at different stages of renal deterioration to the quantitative changes taking place in kidney function. During this period, Van Slyke and R. M. Archibald identified glutamine as the source of urinary ammonia. During World War II, Van and his colleagues documented the effect of shock on renal function and, with R. A. Phillips, developed a simple method, based on specific gravity, suitable for use in the field.
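As a point of reference, the buffer value mentioned at the start of the preceding paragraph is the amount of strong base (or acid) required to change the pH by one unit; in the notation that became standard (again the textbook form, not a quotation from the 1922 paper):

\beta = \frac{dB}{d\,\mathrm{pH}} \approx 2.303\,C\,\alpha(1-\alpha)

for a single weak acid/conjugate base pair of total concentration C and degree of dissociation α, with a maximum of about 0.58 C at pH = pKa.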

Over 100 of Van’s 300 publications were devoted to methodology. The importance of Van Slyke’s contribution to clinical chemical methodology cannot be overestimated.
These included the blood organic constituents (carbohydrates, fats, proteins, amino acids, urea, nonprotein nitrogen, and phospholipids) and the inorganic constituents (total cations, calcium, chlorides, phosphate, and the gases carbon dioxide, carbon monoxide, and nitrogen). It was said that a Van Slyke manometric apparatus was almost all the special equipment needed to perform most of the clinical chemical analyses customarily performed prior to the introduction of photocolorimeters and spectrophotometers for such determinations.

The progress made in the medical sciences in genetics, immunology, endocrinology, and antibiotics during the second half of the twentieth century obscures at times the progress that was made in basic and necessary biochemical knowledge during the first half. Methods capable of giving accurate quantitative chemical information on biological material had to be painstakingly devised; basic questions on chemical behavior and metabolism had to be answered; and, finally, those factors that adversely modified the normal chemical reactions in the body so that abnormal conditions arise that we characterize as disease states had to be identified.

Viewed in retrospect, he combined in one scientific lifetime (1) basic contributions to the chemistry of body constituents and their chemical behavior in the body, (2) a chemical understanding of physiological functions of certain organ systems (notably the respiratory and renal), and (3) how such information could be exploited in the understanding and treatment of disease. That outstanding additions to knowledge in all three categories were possible was in large measure due to his sound and broadly based chemical preparation, his ingenuity in devising means of accurate measurements of chemical constituents, and the opportunity given him at the Hospital of the Rockefeller Institute to study disease in company with physicians.

In addition, he found time to work collaboratively with Dr. John P. Peters of Yale on the classic, two-volume Quantitative Clinical Chemistry. In 1922, John P. Peters, who had just gone to Yale from Van Slyke’s laboratory as an Associate Professor of Medicine, was asked by a publisher to write a modest handbook for clinicians describing useful chemical methods and discussing their application to clinical problems. It was originally to be called “Quantitative Chemistry in Clinical Medicine.” He soon found that it was going to be a bigger job than he could handle alone and asked Van Slyke to join him in writing it. Van agreed, and the two men proceeded to draw up an outline and divide up the writing of the first drafts of the chapters between them. They also agreed to exchange each chapter until it met the satisfaction of both. At the time it was published in 1931, it contained practically all that could be stated with confidence about those aspects of disease that could be and had been studied by chemical means. It was widely accepted throughout the medical world as the “Bible” of quantitative clinical chemistry, and to this day some of the chapters have not become outdated.

History of Laboratory Medicine at Yale University.

The roots of the Department of Laboratory Medicine at Yale can be traced back to John Peters, the head of what he called the “Chemical Division” of the Department of Internal Medicine, subsequently known as the Section of Metabolism, who co-authored with Donald Van Slyke the landmark 1931 textbook Quantitative Clinical Chemistry (2.3); and to Pauline Hald, research collaborator of Dr. Peters who subsequently served as Director of Clinical Chemistry at Yale-New Haven Hospital for many years. In 1947, Miss Hald reported the very first flame photometric measurements of sodium and potassium in serum (4). This study helped to lay the foundation for modern studies of metabolism and their application to clinical care.

The Laboratory Medicine program at Yale had its inception in 1958 as a section of Internal Medicine under the leadership of David Seligson. In 1965, Laboratory Medicine achieved autonomous section status and in 1971, became a full-fledged academic department. Dr. Seligson, who served as the first Chair, pioneered modern automation and computerized data processing in the clinical laboratory. In particular, he demonstrated the feasibility of discrete sample handling for automation that is now the basis of virtually all automated chemistry analyzers. In addition, Seligson and Zetner demonstrated the first clinical use of atomic absorption spectrophotometry. He was one of the founding members of the major Laboratory Medicine academic society, the Academy of Clinical Laboratory Physicians and Scientists.


Case in Point 3.  Nathan Gochman.  Developer of Automated Chemistries.

Nathan Gochman, PhD, has over 40 years of experience in the clinical diagnostics industry. This includes academic teaching and research, and 30 years in the pharmaceutical and in vitro diagnostics industry. He has managed R & D, technical marketing and technical support departments. As a leader in the industry he was President of the American Association for Clinical Chemistry (AACC) and the National Committee for Clinical Laboratory Standards (NCCLS, now CLSI). He is currently a Consultant to investment firms and IVD companies.


The clinical laboratory has become so productive, particularly in chemistry and immunology, and the labor, instrument, and reagent costs so well determined, that today a physician’s medical decisions are 80% determined by the clinical laboratory.  Medical information systems have lagged far behind.  Why is that?  Because the decision for an MIS has historically been based on billing capture.  Moreover, the historical use of chemical profiles was quite good at validating healthy status in an outpatient population, but the profiles became restricted under Diagnostic Related Groups.  Thus, it came to be that diagnostics was considered a “commodity.”  In order to be competitive, a laboratory had to provide “high complexity” tests that were drawn in by a large volume of “moderate complexity” tests.

Part 3. Biomarkers in Medical Practice

Case in Point 1.

A Solid Prognostic Biomarker

HDL-C: Target of Therapy or Fuggedaboutit?

Steven E. Nissen, MD, MACC, Peter Libby, MD

November 06, 2014

Steven E. Nissen, MD, MACC: I am Steve Nissen, chairman of the Department of Cardiovascular Medicine at the Cleveland Clinic. I am here with Dr Peter Libby, chief of cardiology at the Brigham and Women’s Hospital and professor of medicine at Harvard Medical School. We are going to discuss high-density lipoprotein cholesterol (HDL-C), a topic that has been very controversial recently. Peter, HDL-C has been a pretty good biomarker. The question is whether it is a good target.

Peter Libby, MD: Since the early days in Berkeley, when they were doing ultracentrifugation, and when it was reinforced and put on the map by the Framingham Study,[1] we have known that HDL-C is an extremely good biomarker of prospective cardiovascular risk, with an inverse relationship with all kinds of cardiovascular events. That is as solid a finding as you can get in observational epidemiology. It is a very reliable prospective marker. It’s natural that the pharmaceutical industry and those of us who are interested in risk reduction would focus on HDL-C as a target. That is where the controversies come in.

Dr Nissen: It has been difficult. My view is that the trials that have attempted to modulate HDL-C, or the drugs they used, have been flawed. Although the results have not been promising, the jury is still out. Torcetrapib, the cholesteryl ester transfer protein (CETP) inhibitor developed by Pfizer, had an off-target toxicity.[2] Niacin is not very effective, and there are a lot of downsides to the drug. That has been an issue, but people are still working on this. We have done some studies. We did our ApoA-1 Milano infusion study[3] about a decade ago, which showed very promising results with respect to shrinking plaques in coronary arteries. I remain open to the possibility that the right drug in the right trial will work.

Dr Libby: What do you do with the genetic data that have come out in the past couple of years? Sekar Kathiresan masterminded and organized an enormous collaboration[4] in which they looked, with contemporary genetics, at whether HDL had the genetic markers of being a causal risk factor. They came up empty-handed.

Dr Nissen: I am cautious about interpreting those data, like I am cautious about interpreting animal studies of atherosclerosis. We have both lived through this problem in which something works extremely well in animals but doesn’t work in humans, or it doesn’t work in animals but it works in humans. The genetic studies don’t seal the fate of HDL. I have an open mind about this. Drugs are complex. They work by complex mechanisms. It is my belief that what we have to do is test these hypotheses in well-designed clinical trials, which are rigorously performed with drugs that are clean—unlike torcetrapib—and don’t have off-target toxicities.

An Unmet Need: High Lp(a) Levels

Dr Nissen: I’m going to push back on that and make a couple of points. The HPS2-THRIVE study was flawed. They studied the wrong people. It was not a good study, and AIM-HIGH[8] was underpowered. I am not putting people on niacin. What do you do with a patient whose Lp(a) is 200 mg/dL?

Dr Libby: I’m waiting for the results of the PCSK9 and anacetrapib studies. You can tell me about evacetrapib.[9] Reducing Lp(a) is an unmet medical need. We both care for kindreds with high Lp(a) levels and premature coronary artery disease. We have no idea what to do with them other than to treat them with statins and lower their LDL-C levels.

Dr Nissen: I have taken a more cautious approach with respect to taking people off of niacin. If I have patients who are doing well and tolerating it (depending on why it was started), I am discontinuing niacin in some people. I am starting very few people on the drug, but I worry about the quality of the trial.

Dr Libby: So you are of the “don’t start don’t stop” school?

Dr Nissen: Yes. It’s difficult when the trial is fatally flawed. There were 11,000 patients from China in this study. I have known for years that if you give niacin to people of Asiatic ethnic descent, they have terrible flushing and they won’t continue the drug. One question is, what was the adherence? The adverse events would have been tolerable had there been efficacy. The concern here is that this study was destined to fail because they studied a low LDL/high HDL population, a group of people for whom niacin just isn’t used.

Triglycerides and HDL: Do We Have It Backwards?

Dr Libby: What about the recent genetic[10] and epidemiologic data that support triglycerides, and apolipoprotein C3 in particular as a causal risk factor? Have we been misled through all of the generations in whom we have been adjusting triglycerides for HDL-C and saying that triglycerides are not a causal risk factor because once we adjust for HDL, the risk goes away? Do you think we got it backwards?

Dr Nissen: The tricky factor here is that because of this intimate inverse relationship between triglycerides and HDL, we may be talking about the same phenomenon. That is one of the reasons that I am not certain we are not going to be able to find a therapy. What if you had a therapy that lowered triglycerides and raised HDL-C? Could that work? Could that combination be favorable? I want answers from rigorous, well-designed clinical trials that ask the right questions in the right populations. I am disappointed, just as I have been disappointed by the fibrate trials.[11,12] There is a class of drugs that raises HDL-C a little and lowers triglycerides a lot.

Dr Nissen: But the gemfibrozil studies (VA-HIT[13] and Helsinki Heart[14]) showed benefit.

The Dyslipidemia Bar Has Been Raised

Dr Libby: Those studies were from the pre-statin era. We both were involved in trials in which patients were on high-dose statins at baseline. Do you think that this is too high a bar?

Dr Nissen: The bar has been raised, and for the pharmaceutical industry, the studies that we need to find out whether lowering triglycerides or raising HDL is beneficial are going to be large. We are doing a study with evacetrapib. It has 12,000 patients. It’s fully enrolled. Evacetrapib is a very clean-looking drug. It doesn’t have such a long biological half-life as anacetrapib, so I am very encouraged that it won’t have that baggage of being around for 2-4 years. We’ve got a couple of shots on goal here. Don’t forget that we have multiple ongoing studies of HDL-C infusion therapies that are still under development. Those have some promise too. The jury is still out.

Dr Libby: We agree on the need to do rigorous, large-scale endpoint trials. Do the biomarker studies, but don’t wait to start the endpoint trial because that’s the proof in the pudding.

Dr Nissen: Exactly. We have had a little controversy about HDL-C. We often agree, but not always, and we may have a different perspective. Thanks for joining me in this interesting discussion of what will continue to be a controversial topic for the next several years until we get the results of the current ongoing trials.

Case in Point 2.

NSTEMI? Honesty in Coding and Communication?

Melissa Walton-Shirley

November 07, 2014

The complaint at ER triage: Weakness, fatigue, near syncope of several days’ duration, vomiting, and decreased sensorium.

The findings: O2sat: 88% on room air. BP: 88 systolic. Telemetry: Sinus tachycardia 120 bpm. Blood sugar: 500 mg/dL. Chest X ray: atelectasis. Urinalysis: pyuria. ECG: T-wave-inversion anterior leads. Echocardiography: normal left ventricular ejection fraction (LVEF) and wall motion. Troponin I: 0.3 ng/mL. CT angiography: negative for pulmonary embolism (PE). White blood cell count: 20K with left shift. Blood cultures: positive for Gram-negative rods.

The treatment: Intravenous fluids and IV levofloxacin—changed to ciprofloxacin.

The communication at discharge: “You had a severe urinary-tract infection and grew bacteria in your bloodstream. Also, you’ve had a slight heart attack. See your cardiologist immediately upon discharge – no more than 5 days from now.”

The diagnoses coded at discharge: Urosepsis and non-ST segment elevation MI (NSTEMI) 410.1.

One year earlier: This moderately obese patient was referred to our practice for a preoperative risk assessment. The surgery planned was a technically simple procedure, but due to the need for precise instrumentation, general endotracheal anesthesia (GETA) was being considered. The patient was diabetic, overweight, and short of air. A stress exam was equivocal for CAD due to poor exercise tolerance and suboptimal imaging. Upon further discussion, symptoms were progressive; therefore, cardiac cath was recommended, revealing angiographically normal coronaries and a predictably elevated left ventricular end diastolic pressure (LVEDP) in the mid-20s range. The patient was given a diagnosis of diastolic dysfunction, a prescription for better hypertension control, and in-depth discussion on exercise and the Mediterranean and DASH diets for weight loss. Symptoms improved with a low dose of diuretic. The surgery was completed without difficulty. Upon follow-up visit, the patient felt well, had lost a few pounds, and blood pressure was well controlled.

Five days after ER workup: While out of town, the patient developed profound weakness and went to the ER as described above. Fast forward to our office visit in the designated time frame of “no longer than 5 days’ postdischarge,” where the patient and family asked me about the “slight heart attack” that literally came on the heels of a normal coronary angiogram.

But the patient really didn’t have a “heart attack,” did they? The cardiologist aptly stated in his progress notes that it was likely a nonspecific troponin I leak. Yet the hospitalist framed the diagnosis of NSTEMI as item number 2 in the final diagnoses.

The motivations of personnel who code charts are largely innocent and likely a direct result of the lack of understanding of the coding system on the part of us as healthcare providers. I have a feeling, though, that hospitals aren’t anxious to correct this misperception, due to an opportunity for increased reimbursement. I contacted a director of a coding department for a large hospital who prefers to remain anonymous. She explained that NSTEMI ICD-9 code 410.1 falls in DRG 282 with a weight of 0.7562. The diagnosis of “demand ischemia,” code 411.89, a slightly less inappropriate code for a nonspecific troponin I leak, falls in DRG 311 with a weight of 0.5662. To determine reimbursement, one must multiply the weight by the average hospital Medicare base rate of $5370. Keep in mind that each hospital’s base rate and corresponding payment will vary. The difference in reimbursement for a large hospital bill between these two choices for coding is substantial, at over $1000 ($4060 vs $3040).
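As a quick check of the arithmetic, the DRG payment is simply the relative weight multiplied by the hospital’s Medicare base rate. A small Python sketch using the figures cited above (the $5370 base rate and the two DRG weights come from this example; actual base rates vary by hospital):

BASE_RATE = 5370.0  # average hospital Medicare base rate cited above, in dollars

drg_weights = {
    "DRG 282 (NSTEMI, ICD-9 410.1)": 0.7562,
    "DRG 311 (demand ischemia, ICD-9 411.89)": 0.5662,
}

payments = {drg: weight * BASE_RATE for drg, weight in drg_weights.items()}
for drg, payment in payments.items():
    print(f"{drg}: ${payment:,.0f}")

difference = max(payments.values()) - min(payments.values())
print(f"Difference per discharge: ${difference:,.0f}")  # roughly $1,020 (~$4,061 vs ~$3,040)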

Although hospitals that are already reeling from shrinking revenues will make more money on the front end by coding the troponin leak incorrectly as an NSTEMI, when multiple unnecessary tests are generated to follow up on a nondiagnostic troponin leak, the amount of available Centers for Medicare & Medicaid Services (CMS) reimbursement pie shrinks in the long run. Furthermore, this inappropriate categorization generates extreme concern on behalf of patients and family members that is often never laid to rest. The emotional toll of a “heart-attack” diagnosis has an impact on work fitness, quality of life, cost of medication, and the cost of future testing. If the patient lives for another 100 years, they will likely still list a “heart attack” in their medical history.

As a cardiologist, I resent the loose utilization of one of “my” heart-attack codes when it wasn’t that at all. At discharge, we need to develop a better way of communicating what exactly did happen. Equally important, we need to communicate what exactly didn’t happen as well.

Case in Point 3.

Blood Markers Predict CKD Heart Failure 

Published: Oct 3, 2014 | Updated: Oct 3, 2014

Elevated levels of high-sensitivity troponin T (hsTnT) and N-terminal pro-B-type natriuretic peptide (NT-proBNP) strongly predicted heart failure in patients with chronic kidney disease followed for a median of close to 6 years, researchers reported.

Compared with patients with the lowest blood levels of hsTnT, those with the highest had a nearly five-fold higher risk for developing heart failure and the risk was 10-fold higher in patients with the highest NT-proBNP levels compared with those with the lowest levels of the protein, researcher Nisha Bansal, MD, of the University of Washington in Seattle, and colleagues wrote online in the Journal of the American Society of Nephrology.

A separate study, published online in the Journal of the American Medical Association earlier in the week, also examined the comorbid conditions of heart and kidney disease, finding no benefit to the practice of treating cardiac surgery patients who developed acute kidney injury with infusions of the antihypertensive drug fenoldopam.

The study, reported by researcher Giovanni Landoni, MD, of the IRCCS San Raffaele Scientific Institute, Milan, Italy, and colleagues, was stopped early “for futility,” according to the authors, and the incidence of hypotension during drug infusion was significantly higher in patients infused with fenoldopam than placebo (26% vs. 15%; P=0.001).

Blood Markers Predict CKD Heart Failure

The study in patients with mild to moderate chronic kidney disease (CKD) was conducted to determine if blood markers could help identify patients at high risk for developing heart failure.

Heart failure is the most common cardiovascular complication among people with renal disease, occurring in about a quarter of CKD patients.

The two markers, hsTnT and NT-proBNP, are associated with overworked cardiac myocytes and have been shown to predict heart failure in the general population.

However, Bansal and colleagues noted, the markers have not been widely used in diagnosing heart failure among patients with CKD due to concerns that reduced renal excretion may raise levels of these markers, so that elevated levels would not reflect an actual increase in heart muscle strain.

To better understand the importance of elevated concentrations of hsTnT and NT-proBNP in CKD patients, the researchers examined their association with incident heart failure events in 3,483 participants in the ongoing observational Chronic Renal Insufficiency Cohort (CRIC) study.

All participants were recruited from June 2003 to August 2008, and all were free of heart failure at baseline. The researchers used Cox regression to examine the association of baseline levels of hsTnT and NT-proBNP with incident heart failure after adjustment for demographic influences, traditional cardiovascular risk factors, markers of kidney disease, pertinent medication use, and mineral metabolism markers.
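For readers less familiar with the method, the adjusted analysis described here is a Cox proportional hazards model with biomarker level as the exposure and time to incident heart failure as the outcome; the exponentiated coefficients correspond to the hazard ratios reported below. A minimal sketch using the lifelines package (the file and column names are hypothetical placeholders, not the actual CRIC variables):

import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis dataset: one row per participant, with follow-up time,
# an incident heart failure indicator, the biomarker exposure, and a few
# covariates standing in for the adjustments described in the text.
df = pd.read_csv("cric_like_cohort.csv")

cph = CoxPHFitter()
cph.fit(
    df[["time_to_hf_years", "incident_hf", "hs_tnt_top_quartile", "age", "diabetes", "egfr"]],
    duration_col="time_to_hf_years",
    event_col="incident_hf",
)
cph.print_summary()  # exp(coef) for hs_tnt_top_quartile is the adjusted hazard ratio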

At baseline, hsTnT levels ranged from ≤5.0 to 378.7 pg/mL and NT-proBNP levels ranged from ≤5 to 35,000 pg/mL. Compared with patients who had undetectable hsTnT, those in the highest quartile (>26.5 pg/mL) had a significantly higher rate of heart failure (hazard ratio 4.77; 95% CI 2.49-9.14).

Compared with those in the lowest NT-proBNP quintile (<47.6 pg/mL), patients in the highest quintile (>433.0 pg/mL) experienced an almost 10-fold increase in heart failure risk (HR 9.57; 95% CI 4.40-20.83).

The researchers noted that these associations remained robust after adjustment for potential confounders and for the other biomarker, suggesting that while hsTnT and NT-proBNP are complementary, they may be indicative of distinct biological pathways for heart failure.

Even Modest Increases in NT-proBNP Linked to Heart Failure

The findings are consistent with an earlier analysis that included 8,000 patients with albuminuria in the Prevention of REnal and Vascular ENd-stage Disease (PREVEND) study, which showed that hsTnT was associated with incident cardiovascular events, even after adjustment for eGFR and severity of albuminuria.

“Among participants in the CRIC study, those with the highest quartile of detectable hsTnT had a twofold higher odds of left ventricular hypertrophy compared with those in the lowest quartile,” Bansal and colleagues wrote, adding that the findings were similar after excluding participants with any cardiovascular disease at baseline.

Even modest elevations in NT-proBNP were associated with significantly increased rates of heart failure, including in subgroups stratified by eGFR, proteinuria, and diabetic status.

“NT-proBNP regulates blood pressure and body fluid volume by its natriuretic and diuretic actions, arterial dilation, and inhibition of the renin-aldosterone-angiotensin system and increased levels of this marker likely reflect myocardial stress induced by subclinical changes in volume or pressure, even in persons without clinical disease,” the researchers wrote.

The researchers concluded that further studies are needed to develop and validate risk prediction tools for clinical heart failure in patients with CKD, and to determine the potential role of these two biomarkers in a heart failure risk prediction and prevention strategy.

Fenoldopam ‘Widely Promoted’ in AKI Cardiac Surgery Setting

The JAMA study examined whether the selective dopamine D1 receptor agonist fenoldopam mesylate can reduce the need for dialysis in cardiac surgery patients who develop acute kidney injury (AKI).

Fenoldopam induces vasodilation of the renal, mesenteric, peripheral, and coronary arteries, and, unlike dopamine, it has no significant affinity for D2 receptors, meaning that it theoretically induces greater vasodilation in the renal medulla than in the cortex, the researchers wrote.

“Because of these hemodynamic effects, fenoldopam has been widely promoted for the prevention and therapy of AKI in the United States and many other countries with apparent favorable results in cardiac surgery and other settings,” Landoni and colleagues wrote.

The drug was approved in 1997 by the FDA for the indication of in-hospital, short-term management of severe hypertension. It has not been approved for renal indications, but is commonly used off-label in cardiac surgery patients who develop AKI.

Although a meta analysis of randomized trials, conducted by the researchers, indicated a reduction in the incidence and progression of AKI associated with the treatment, Landoni and colleagues wrote that the absence of a definitive trial “leaves clinicians uncertain as to whether fenoldopam should be prescribed after cardiac surgery to prevent deterioration in renal function.”

To address this uncertainty, the researchers conducted a prospective, randomized, parallel-group trial in 667 patients treated at 19 hospitals in Italy from March 2008 to April 2013.

All patients had been admitted to ICUs after cardiac surgery with early acute kidney injury (≥50% increase of serum creatinine level from baseline or low output of urine for ≥6 hours). A total of 338 received fenoldopam by continuous intravenous infusion for a total of 96 hours or until ICU discharge, while 329 patients received saline infusions.

The primary end point was the rate of renal replacement therapy, and secondary end points included mortality (intensive care unit and 30-day mortality) and the rate of hypotension during study drug infusion.

Study Showed No Benefit, Was Stopped Early

Yale Lampoon – A.A. Liebow, 1954

Not As a Doctor
[Fourth Year]

These lyrics, sung by John Cole, Jack Gariepy and Ed Ransenhofer to music borrowed from Gilbert and Sullivan’s The Mikado, lampooned Averill Liebow, M.D., a pathologist noted for his demands on students. (CPC stands for clinical pathology conference.)

If you want to know what this is,
it’s a medical CPC
Where we give the house staff
the biz, for there’s no one so
wise as we!
We pathologists show them how,
Although it is too late now.
Our art is a sacred cow!

American physician, born 1911, Stryj in Galicia, Austria (now in Ukraine); died 1978.

Averill Abraham Liebow, born in Austria, was the “founding father” of pulmonary pathology in the United States. He started his career as a pathologist at Yale, where he remained for many years. In 1968 he moved to the University of California School of Medicine, San Diego, where he taught for 7 years as Professor and Chairman, Department of Pathology.

His work includes many classic studies of lung diseases. Best known of these is his famous classification of interstitial lung disease. He also published papers on sclerosing pneumocytoma, pulmonary alveolar proteinosis, meningothelial-like nodules, pulmonary hypertension, pulmonary veno-occlusive disease, lymphomatoid granulomatosis, pulmonary Langerhans cell histiocytosis, pulmonary epithelioid hemangioendothelioma, and pulmonary hyalinizing granuloma.

As a Lieutenant Colonel in the US Army Medical Corps, he was a member of the Atomic Bomb Casualty Commission, which studied the effects of the atomic bomb in Hiroshima and Nagasaki.

We thank Sanjay Mukhopadhyay, M.D., for information submitted.

When I was a resident at UCSD, Dr. Liebow held “Organ Recitals” every morning, including Mother’s Day. The organs had to be presented in a specified order: heart, lung, and so forth. On one occasion, we needed a heart for purification of human lactate dehydrogenase for a medical student project, so I presented the lung out of order. Dr. Liebow asked where the heart was, and I told the group it was normal and that I had frozen it for enzyme purification (smiles). “In the future, show it to me first.” He was generous to those who showed interest. As I was also doing research in Nathan Kaplan’s laboratory, he made special arrangements for me to mentor Deborah Peters, the daughter of a pulmonary physician and granddaughter of the Peters who collaborated with Van Slyke. I have mentored many students with great reward since then. He could look at a slide and tell you what the x-ray looked like. I did not encounter that again until he sent me to the Armed Forces Institute of Pathology in Washington, DC, during the Vietnam War and Watergate, where I worked in orthopedic pathology with Lent C. Johnson. Johnson would not review a case without the x-ray, and he taught the radiologists.

Part 3

My Cancer Genome from Vanderbilt University: Matching Tumor Mutations to Therapies & Clinical Trials

Reporter: Aviva Lev-Ari, PhD, RN

My Cancer Genome from Vanderbilt University: Matching Tumor Mutations to Therapies & Clinical Trials


GenomOncology and Vanderbilt-Ingram Cancer Center (VICC) today announced a partnership for the exclusive commercial development of a decision support tool based on My Cancer Genome™, an online precision cancer medicine knowledge resource for physicians, patients, caregivers and researchers.

Through this collaboration, GenomOncology and VICC will enhance My Cancer Genome through the development of a new genomics content management tool. The MyCancerGenome.org website will remain free and open to the public. In addition, GenomOncology will develop a decision support tool based on My Cancer Genome™ data that will enable automated interpretation of mutations in the genome of a patient’s tumor, providing actionable results in hours versus days.

Vanderbilt-Ingram Cancer Center (VICC) launched My Cancer Genome™ in January 2011 as an integral part of its Personalized Cancer Medicine Initiative, which helps physicians and researchers track the latest developments in precision cancer medicine and connect with clinical research trials. This web-based information tool is designed to quickly educate clinicians on the rapidly expanding list of genetic mutations that impact cancers and to enable research of treatment options based on specific mutations. For more information on My Cancer Genome™, visit www.mycancergenome.org/about/what-is-my-cancer-genome.
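As a rough sketch of what such a decision support lookup could do in principle, the example below matches a tumor's detected variants against a small gene/variant-to-therapy table. The table entries, function names, and matching rule are illustrative assumptions, not My Cancer Genome™ content or GenomOncology's actual tool.

```python
# Minimal sketch of a mutation-to-therapy lookup (illustrative knowledge table).
from typing import Dict, List, Tuple

# (gene, variant) -> candidate therapies; entries are examples only
KNOWLEDGE_BASE: Dict[Tuple[str, str], List[str]] = {
    ("EGFR", "L858R"): ["erlotinib", "gefitinib"],
    ("BRAF", "V600E"): ["vemurafenib", "dabrafenib"],
    ("ALK", "fusion"): ["crizotinib"],
}

def interpret_tumor_profile(detected: List[Tuple[str, str]]) -> Dict[str, List[str]]:
    """Return actionable findings as a map of variant -> candidate therapies."""
    findings = {}
    for gene, variant in detected:
        therapies = KNOWLEDGE_BASE.get((gene, variant))
        if therapies:
            findings[f"{gene} {variant}"] = therapies
    return findings

# Example: one actionable and one non-actionable variant in a tumor profile
print(interpret_tumor_profile([("EGFR", "L858R"), ("TP53", "R175H")]))
# -> {'EGFR L858R': ['erlotinib', 'gefitinib']}
```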

Therapies based on the specific genetic alterations that underlie a patient’s cancer not only result in better outcomes but often have fewer adverse reactions.

Up front fee

Nominal fee covers installation support, configuring the Workbench to your specification, designing and developing custom report(s) and training your team.

Per sample fee

GenomOncology is paid on signed-out clinical reports. This philosophy aligns GenomOncology with your laboratory, as we are incentivized to offer world-class support and solutions to differentiate your clinical NGS program. There is no annual license fee.

Part 4

Clinical Trial Services: Foundation Medicine & EmergingMed to Partner

Reporter: Aviva Lev-Ari, PhD, RN

Clinical Trial Services: Foundation Medicine & EmergingMed to Partner


Foundation Medicine and EmergingMed said today that they will partner to offer clinical trial navigation services for health care providers and their patients who have received one of Foundation Medicine’s tumor genomic profiling tests.

The firms will provide concierge services to help physicians

  • identify appropriate clinical trials for patients
  • based on the results of FoundationOne or FoundationOne Heme.

“By providing clinical trial navigation services, we aim to facilitate

  • timely and accurate clinical trial information and enrollment support services for physicians and patients,
  • enabling greater access to treatment options based on the unique genomic profile of a patient’s cancer.”

Currently, there are over 800 candidate therapies that target genomic alterations in clinical trials,

  • but “patients and physicians must identify and act on relevant options
  • when the patient’s clinical profile is aligned with the often short enrollment window for each trial.

These investigational therapies offer patients whose cancer has progressed or returned following standard treatment a potentially favorable second option after relapse. The new service is unique in notifying physicians when new clinical trials emerge that match a patient’s genomic and clinical profile.
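A minimal sketch of how such trial navigation might match a patient in principle: filter candidate trials by disease, targeted alteration, and an open enrollment window. The trial records, field names, and matching rule below are hypothetical and do not represent Foundation Medicine's or EmergingMed's actual system.

```python
# Sketch of matching a patient's genomic and clinical profile to open trials.
from dataclasses import dataclass
from datetime import date
from typing import List, Set

@dataclass
class Trial:
    trial_id: str
    disease: str
    targeted_alterations: Set[str]
    enrollment_opens: date
    enrollment_closes: date

@dataclass
class Patient:
    disease: str
    alterations: Set[str]

def matching_trials(patient: Patient, trials: List[Trial], today: date) -> List[Trial]:
    """Trials whose disease and targeted alteration match and whose window is open."""
    return [
        t for t in trials
        if t.disease == patient.disease
        and t.targeted_alterations & patient.alterations
        and t.enrollment_opens <= today <= t.enrollment_closes
    ]

# Hypothetical trial records and patient profile
trials = [
    Trial("NCT-EXAMPLE-1", "NSCLC", {"EGFR L858R"}, date(2014, 1, 1), date(2015, 6, 30)),
    Trial("NCT-EXAMPLE-2", "NSCLC", {"ALK fusion"}, date(2014, 1, 1), date(2015, 6, 30)),
]
patient = Patient("NSCLC", {"EGFR L858R", "TP53 R175H"})
print([t.trial_id for t in matching_trials(patient, trials, date(2015, 3, 1))])
# -> ['NCT-EXAMPLE-1']
```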

Google signs on to Foundation Medicine cancer Dx by offering tests to employees

By Emily Wasserman

Diagnostics luminary Foundation Medicine ($FMI) is generating some upward momentum, fueled by growing revenues and the success of its clinical tests. Tech giant Google ($GOOG) has taken note and is signing onto the company’s cancer diagnostics by offering them to employees.

Foundation Medicine CEO Michael Pellini said during the company’s Q3 earnings call that Google will start covering its DNA tests for employees and their family members suffering from cancer as part of its health benefits portfolio, Reuters reports.

Both sides stand to benefit from the deal, as Google looks to keep a leg up on Silicon Valley competitors and Foundation Medicine expands its cancer diagnostics platform. Last month, Apple ($AAPL) and Facebook ($FB) announced that they would begin covering the cost of egg freezing for female employees. A diagnostics partnership and attractive health benefits could work wonders for Google’s employee retention rates and bottom line.

In the meantime, Cambridge, MA-based Foundation Medicine is charging full speed ahead with its cancer diagnostics platform after filing for an IPO in September 2013. The company chalked up 6,428 clinical tests during Q3 2014, an eye-popping 149% increase year over year, and brought in total revenue for the quarter of $16.4 million, a 100% leap from last year. Foundation Medicine credits the promising numbers in part to new diagnostic partnerships and extended coverage for its tests.

In January, the company teamed up with Novartis ($NVS) to help the drugmaker evaluate potential candidates for its cancer therapies. In April, Foundation Medicine announced that it would develop a companion diagnostic test for a Clovis Oncology ($CLVS) drug under development to treat patients with ovarian cancer, building on an ongoing collaboration between the two companies.

Foundation Medicine also has its sights set on China’s growing diagnostics market, inking a deal in October with WuXi PharmaTech ($WX) that allows the company to perform lab testing for its FoundationOne assay at WuXi’s Shanghai-based Genome Center.

Pellini gave a nod to the deal with Google during a corporate earnings call on Wednesday, according to a person who listened in, and said Google employees were made aware of the new benefit the prior week.

Foundation Medicine teams with MD Anderson for new trial of cancer Dx

Second study to see if targeted therapy can change patient outcomes

August 15, 2014 | By FierceDiagnostics

Foundation Medicine ($FMI) is teaming up with the MD Anderson Cancer Center in Texas for a new trial of the Cambridge, MA-based company’s molecular diagnostic cancer test, which matches targeted therapies to individual patients.

The study is called IMPACT2 (Initiative for Molecular Profiling and Advanced Cancer Therapy) and is designed to build on results from the first IMPACT study, which found

  • 40% of the 1,144 patients enrolled had an identifiable genomic alteration.

The company said that

  • by matching specific gene alterations to therapies,
  • 27% of patients in the first study responded versus
  • 5% with an unmatched treatment, and
  • “progression-free survival” was longer in the matched group.
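Back-of-envelope arithmetic from the figures quoted above; the counts below are derived from the reported percentages for illustration, not taken from the primary publication.

```python
# Illustrative arithmetic on the reported IMPACT figures (approximate).
enrolled = 1144
with_alteration = round(0.40 * enrolled)   # ~458 patients with an identifiable alteration
matched_response_rate = 0.27
unmatched_response_rate = 0.05

print(f"patients with an identifiable alteration: ~{with_alteration}")
print(f"response-rate ratio, matched vs unmatched: "
      f"{matched_response_rate / unmatched_response_rate:.1f}x")
# -> ~458 patients; matched therapy response ~5.4x the unmatched rate
```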

The FoundationOne molecular diagnostic test

  • combines genetic sequencing and data gathering
  • to help oncologists choose the best treatment for individual patients.

Costing $5,800 per test, FoundationOne’s technology can uncover a large number of genetic alterations for 200 cancer-related genes,

  • blending genomic sequencing, information and clinical practice.

“Based on the IMPACT1 data, a validated, comprehensive profiling approach has already been adopted by many academic and community-based oncology practices,” Vincent Miller, chief medical officer of Foundation Medicine, said in a release. “This study has the potential to yield sufficient evidence necessary to support broader adoption across most newly diagnosed metastatic tumors.”

The company got a boost last month when the New York State Department of Health approved Foundation Medicine’s two initial cancer tests: the FoundationOne test and FoundationOne Heme, which creates a genetic profile for blood cancers. Typically,

  • diagnostics companies struggle to win insurance approval for their tests
  • even after they gain regulatory approval, leaving revenue growth relatively flat.

However, Foundation Medicine reported earlier this week that its Q2 revenue reached $14.5 million, compared to $5.9 million for the same period a year ago. Still,

  • net losses continue to soar as the company ramps up its commercial and business development operation,
  • hitting $13.7 million versus a $10.1 million deficit in the second quarter of 2013.

Oncology

There has been a remarkable transformation in our understanding of

  • the molecular genetic basis of cancer and its treatment during the past decade or so.

In depth genetic and genomic analysis of cancers has revealed that

  • each cancer type can be sub-classified into many groups based on the genetic profiles and
  • this information can be used to develop new targeted therapies and treatment options for cancer patients.

This panel will explore the technologies that are facilitating our understanding of cancer, and

  • how this information is being used in novel approaches for clinical development and treatment.
Oncology – Reported by Dr. Aviva Lev-Ari, Founder, Leaders in Pharmaceutical Intelligence

Opening Speaker & Moderator:

Lynda Chin, M.D.
Department Chair, Department of Genomic Medicine
MD Anderson Cancer Center

  • Who pays for PM (personalized medicine)?
  • potential of big data, analytics, and expert systems, so that not every MD needs to see all cases; profile the disease so similar patients get the same treatment
  • business model: IP, discovery, sharing, ownership – yet accelerate therapy
  • security of healthcare data
  • segmentation of the patient population
  • management of data and tracking of innovations
  • platforms to be shared for innovation
  • studies need to be longitudinal
  • How do we reconcile the course of disease with PM?
  • phenotyping the disease vs. a patient waiting for a cure/treatment

Panelists:

Roy Herbst, M.D., Ph.D.
Ensign Professor of Medicine and Professor of Pharmacology;
Chief of Medical Oncology, Yale Cancer Center and Smilow Cancer Hospital

Developing new drugs to match patient, disease and drug – finding the right patient for the right clinical trial

  • match patients to drugs
  • partnerships: out of 100 screened patients, 10 had the gene and 5 were able to enroll in the trial; without the biomarker, all 100 patients would have received the wrong drug for them (except those 5)
  • patients want to participate in trials close to home rather than having to travel – this is now reflected in the protocol
  • annotated databases – clinical trial informed consent – adaptive design of clinical trials vs. fixed protocols
  • even academic MDs can struggle to read genomics reports
  • patients are treated in the community – more training for MDs is needed
  • five companies collaborating – comparison of 6 drugs in the same class
  • if a drug exists and you have the patient, you must apply PM

Summary and Perspective:

The current changes in biotechnology have been reviewed, with an open question about the evolving relationship between in vitro diagnostics and biopharmaceuticals and the potential, particularly in cancer and infectious diseases, to add value in targeted therapy by matching patients to the treatment most likely to give a favorable outcome.

This reviewer does not see the major diagnostics leaders moving into the domain of direct patient care, even though there are signals in that direction. The Roche example is perhaps the most interesting: Roche became the elephant in the room after the introduction of Valium, subsequently bought Boehringer Mannheim Diagnostics to gain entry into the IVD market, and established a large presence in molecular diagnostics early. If it did anything to gain a foothold in the treatment realm, it would more likely forge a relationship with Foundation Medicine. Abbott Laboratories more than a decade ago was overextended; it had become the leader in IVD on the strength of its specialty tests, but it ran into quality-control difficulties in the high-volume testing market and ceded ground to Olympus and Roche, and in the mid-volume market to Beckman and Siemens. Of course, DuPont and Kodak, pioneering companies in IVD, both left the market.

In the long run, the biggest challenge is realizing the ability to eliminate the many treatments that would fail for a large number of patients; that concept has already been proven. However, when you look at the size of the subgroups, we are nowhere near a large-scale endeavor. In addition, much remains to be worked out that is not related to genomic expression under the “classic” model, but must take into account the emerging knowledge and greater understanding of the regulation of cell metabolism, not only in cancer but also in chronic inflammatory diseases.

Read Full Post »

Reporter: Alan F. Kaul, PharmD. MS. MBA, FCCP

As part of the Affordable Care Act (ACA) amendments to the Social Security Act, CMS, under the Hospital Readmissions Reduction Program (HRRP), is required to reduce payments to Inpatient Prospective Payment System (IPPS) hospitals with excess readmissions, commencing with discharges on or after October 1, 2012.

In the FY2012 IPPS final rule, CMS established that readmission measures for Acute Myocardial Infarction (AMI), Heart Failure (HF), and Pneumonia (PN), together with the calculated excess readmission ratio, will be used in part to calculate the readmission payment adjustment under the HRRP. Using risk-adjustment methodology endorsed by the National Quality Forum (NQF), with adjustments for clinically relevant factors such as patient comorbidity, frailty, and demographics, CMS will compare each hospital’s set of patients against a national dataset. CMS will look back at three years’ worth of each hospital’s discharges, requiring a minimum of 25 cases, to calculate a hospital’s excess readmission ratio for each condition. For FY 2013, each excess readmission ratio is based on discharges occurring from July 1, 2008 through June 30, 2011.
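As a simplified sketch of the excess readmission ratio idea, the example below divides risk-adjusted predicted readmissions by expected readmissions and enforces the 25-case minimum. The real CMS methodology uses NQF-endorsed hierarchical risk-adjustment models; the predicted and expected counts and the hospital numbers here are hypothetical.

```python
# Simplified sketch: excess readmission ratio = predicted / expected readmissions.
from typing import Optional

MIN_CASES = 25  # CMS requires at least 25 eligible cases per condition

def excess_readmission_ratio(predicted: float, expected: float, n_cases: int) -> Optional[float]:
    """Ratio > 1.0 means more readmissions than expected for a comparable national mix."""
    if n_cases < MIN_CASES:
        return None  # too few cases to score this condition
    return predicted / expected

# Hypothetical hospital, heart-failure cohort over a 3-year look-back
ratio = excess_readmission_ratio(predicted=61.0, expected=54.0, n_cases=412)
print(f"excess readmission ratio (HF): {ratio:.2f}")  # ~1.13 -> excess readmissions
```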

In its FY2013 proposed rulemaking cycle, CMS proposed to continue the program, specifying which hospitals will be subject to the HRRP, the methodologies used in its calculations, and a process for hospitals to review the information and submit proposed corrections before it is publicly released.

While CMS moves forward with its HRRP, hospitals’ readiness for population-based accountable care is questionable. A national survey of all hospitals was undertaken in 2011 to assess the current state of hospital readiness for the development of Accountable Care Organizations (ACOs). The survey received 1,672 responses, a 34% response rate. Several major themes were identified:

1. Only a small percentage of hospitals (3%) participate in ACOs and only 10% were preparing to do so.
2. Hospitals expect the share of their revenue from risk-based reimbursement to double from 9% to 18% over the next two years.
3. Most hospitals were engaged in numerous care coordination methods, though there was variation in specific practices.
4. Perceived barriers differ between hospitals preparing to participate in an ACO and hospitals already participating in ACOs. Of note, the greatest challenges for hospitals participating in ACOs were perceived to be reducing clinical variation and reducing costs; for those planning to participate, the greatest challenge was perceived to be increasing the size of the covered patient population.
5. ACO hospitals are significantly more involved in population health management services including coordination across the continuum of care.
6. There are significant gaps in care coordination functionality, for example low use of risk stratification and other care coordination activities. Only 38% of hospitals participating in ACOs, 33% of those planning to participate, and 24% of hospitals not exploring an ACO model assign case managers to patients at risk for hospital readmission for outpatient follow-up. Fewer than 25% of hospitals in each group have nurse case managers who work with patients with chronic diseases. Similarly, 23%, 21%, and 11%, respectively, of hospitals participating in, planning to participate in, or with no plans to participate in an ACO have a post-hospital discharge continuity-of-care program with scaled intensity. Approximately one-third of hospitals participating in or preparing to participate in an ACO have no chronic-care registry; about 20% have a registry for one condition and 40% for two conditions.
7. ACOs are striving to improve the quality of their services by using valid performance measures and making results available to the public and participating providers.

Sources:

https://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/AcuteInpatientPPS/Readmissions-Reduction-Program.html

Hospital Readiness for Population-based Accountable Care. Health Research and Educational Trust, Chicago:April 2012. Accessed at http://www.hope.org

Accessible at: http://www.hpoe.org/accointable-care

Read Full Post »