
Synthetic Biology: On Advanced Genome Interpretation for Gene Variants and Pathways: What is the Genetic Base of Atherosclerosis and Loss of Arterial Elasticity with Aging

Curator: Aviva Lev-Ari, PhD, RN

Article ID #52: Synthetic Biology: On Advanced Genome Interpretation for Gene Variants and Pathways: What is the Genetic Base of Atherosclerosis and Loss of Arterial Elasticity with Aging. Published on 5/17/2013

WordCloud Image Produced by Adam Tubman

UPDATED on 7/12/2021

  • Abstract. Synthetic biology is a field of scientific research that applies engineering principles to living organisms and living systems.
  • Introduction. This article is intended as a perspective on the field of synthetic biology. …
  • Genetic Manipulation—Plasmids. …
  • Genetic Manipulations—Genome. …
  • An Early Example of Synthetic Biology. …

UPDATED on 11/6/2018

Which biological systems should be engineered?

To solve real-world problems using emerging abilities in synthetic biology, research must focus on a few ambitious goals, argues Dan Fletcher, Professor of bioengineering and biophysics, and chair of the Department of Bioengineering at the University of California, Berkeley, USA. He is also a Chan Zuckerberg Biohub Investigator.
Start Quote

Artificial blood cells. Blood transfusions are crucial in treatments for everything from transplant surgery and cardiovascular procedures to car accidents, pregnancy-related complications and childhood malaria (see go.nature.com/2ozbfwt). In the United States alone, 36,000 units of red blood cells and 7,000 units of platelets are needed every day (see go.nature.com/2ycr2wo).

But maintaining an adequate supply of blood from voluntary donors can be challenging, especially in low- and middle-income countries. To complicate matters, blood from donors must be checked extensively to prevent the spread of infectious diseases, and can be kept for only a limited time: 42 days for red blood cells, and just 5 days for platelets. What if blood cells could be assembled from purified or synthesized components on demand?

In principle, cell-like compartments could be made that have the oxygen-carrying capacity of red blood cells or the clotting ability of platelets. The compartments would need to be built with molecules on their surfaces to protect the compartments from the immune system, resembling those on a normal blood cell. Other surface molecules would be needed to detect signals and trigger a response.

In the case of artificial platelets, that signal might be the protein collagen, to which circulating platelets are exposed when a blood vessel ruptures [5]. Such compartments would also need to be able to release certain molecules, such as factor V or the von Willebrand clotting factor. This could happen by building in a rudimentary form of exocytosis, for example, whereby a membrane-bound sac containing the molecule would be released by fusing with the compartment’s outer membrane.

It is already possible to encapsulate cytoplasmic components from living cells in membrane compartments [6,7]. Now a major challenge is developing ways to insert desired protein receptors into the lipid membrane [8], along with reconstituting receptor signalling.

Red blood cells and platelets are good candidates for the first functionally useful synthetic cellular system because they lack nuclei. Complex functions such as nuclear transport, protein synthesis and protein trafficking wouldn’t have to be replicated. If successful, we might look back with horror on the current practice of bleeding one person to treat another.

Micrograph of red blood cells, 3 T-lymphocytes and activated platelets

Human blood as viewed under a scanning electron microscope. Credit: Dennis Kunkel Microscopy/SPL

Designer immune cells. Immunotherapy is currently offering new hope for people with cancer by shaping how the immune system responds to tumours. Cancer cells often turn off the immune response that would otherwise destroy them. The use of therapeutic antibodies to stop this process has drastically increased survival rates for people with multiple cancers, including those of the skin, blood and lung [9]. Similarly successful is the technique of adoptive T-cell transfer. In this, a patient’s T cells or those of a donor are engineered to express a receptor that targets a protein (antigen) on the surface of tumour cells, resulting in the T cells killing the cancerous cells (called CAR-T therapies) [10]. All of this has opened the door to cleverly rewiring the downstream signalling that results in the destruction of tumour cells by white blood cells [11].

What if researchers went a step further and tried to create synthetic cells capable of moving towards, binding to and eliminating tumour cells?

In principle, untethered from evolutionary pressures, such cells could be designed to accomplish all sorts of tasks — from killing specific tumour cells and pathogens to removing brain amyloid plaques or cholesterol deposits. If mass production of artificial immune cells were possible, it might even lessen the need to tailor treatments to individuals — cutting costs and increasing accessibility.

To ensure that healthy cells are not targeted for destruction, engineers would also need to design complex signal-processing systems and safeguards. The designer immune cells would need to be capable of detecting and moving towards a chemical signal or tumour. (Reconstituting the complex process of cell motility is itself a major challenge, from the delivery of energy-generating ATP molecules to the assembly of actin and myosin motors that enable movement.)

Researchers have already made cell-like compartments that can change shape [12], and have installed signalling circuits within them [13]. These could eventually be used to control movement and mediate responses to external signals.

Smart delivery vehicles. The relative ease of exposing cells in the lab to drugs, as well as introducing new proteins and engineering genomes, belies how hard it is to deliver molecules to specific locations inside living organisms. One of the biggest challenges in most therapies is getting molecules to the right place in the right cell at the right time.

Harnessing the natural proclivity of viruses to deliver DNA and RNA molecules into cells has been successful [14]. But virus size limits cargo size, and viruses don’t necessarily infect the cell types researchers and clinicians are aiming at. Antibody-targeted synthetic vesicles have improved the delivery of drugs to some tumours. But getting the drug close to the tumour generally depends on the vesicles leaking from the patient’s circulatory system, so results have been mixed.

Could ‘smart’ delivery vehicles containing therapeutic cargo be designed to sense where they are in the body and move the cargo to where it needs to go, such as across the blood–brain barrier?

This has long been a dream of those in drug delivery. The challenges are similar to those of constructing artificial blood and immune cells: encapsulating defined components in a membrane, incorporating receptors into that membrane, and designing signal-processing systems to control movement and trigger release of the vehicle’s contents.

The development of immune-cell ‘backpacks’ is an exciting step in the right direction. In this, particles containing therapeutic molecules are tethered to immune cells, exploiting the motility and targeting ability of the cells to carry the molecules to particular locations [15].

A minimal chassis for expression. In each of the previous examples, the engineered cell-like system could conceivably be built to function over hours or days, without the need for additional protein production and regulation through gene expression. For many other tasks, however, such as the continuous production of insulin in the body, it will be crucial to have the ability to express proteins, upregulate or downregulate certain genes, and carry out functions for longer periods.

Engineering a ‘minimal chassis’ that is capable of sustained gene expression and functional homeostasis would be an invaluable starting point for building synthetic cells that produce proteins, form tissues and remain viable for months to years. This would require detailed understanding and incorporation of metabolic pathways, trafficking systems and nuclear import and export — an admittedly tall order.

It is already possible to synthesize DNA in the lab, whether through chemically reacting bases or using biological enzymes or large-scale assembly in a cell [16]. But we do not yet know how to ‘boot up’ DNA and turn a synthetic genome into a functional system in the absence of a live cell.

Since the early 2000s, biologists have achieved gene expression in synthetic compartments loaded with cytoplasmic extract [17]. And genetic circuits of increasing complexity (in which the expression of one protein results in the production or degradation of another) are now the subject of extensive research. Still to be accomplished are: long-lived gene expression, basic protein trafficking and energy production reminiscent of live cells.

End Quote

SOURCE

https://www.nature.com/articles/d41586-018-07291-3?utm_source=briefing-dy&utm_medium=email&utm_campaign=briefing&utm_content=20181106
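The genetic circuits described in the quoted passage, in which the expression of one protein drives the production of another, can be sketched as a toy dynamical model. The sketch below is purely illustrative: it assumes a hypothetical two-gene circuit (constitutively produced protein A activating production of protein B via simple Hill-type kinetics), not any specific published circuit.

```python
# Toy model of a minimal genetic circuit: protein A is produced at a constant
# rate and activates expression of protein B; both decay at a first-order
# rate. Integrated with simple Euler steps. Illustrative only; real cell-free
# expression kinetics are far more complex.

def simulate_circuit(steps=10000, dt=0.01,
                     k_a=1.0,    # constitutive production rate of A
                     k_b=2.0,    # maximal production rate of B
                     K=0.5,      # activation threshold (Hill constant)
                     deg=0.2):   # first-order degradation rate
    a, b = 0.0, 0.0
    for _ in range(steps):
        da = k_a - deg * a                  # constant production, linear decay
        db = k_b * a / (K + a) - deg * b    # A-dependent (Hill) production of B
        a += da * dt
        b += db * dt
    return a, b

a_ss, b_ss = simulate_circuit()
# Steady state: a -> k_a/deg; b -> (k_b * a/(K + a)) / deg
```

Increasing the degradation rate `deg` lowers both steady-state levels, which is one reason sustained expression (the "minimal chassis" problem discussed later in the quote) is hard: synthesis must continuously outpace decay.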

UPDATED on 10/14/2013

Genetics of Atherosclerotic Plaque in Patients with Chronic Coronary Artery Disease

372/3:15 Genetic influence on LpPLA2 activity at baseline as evaluated in the exome chip-enriched GWAS study among ~13,600 patients with chronic coronary artery disease in the STABILITY (STabilisation of Atherosclerotic plaque By Initiation of darapLadIb TherapY) trial.

L. Warren (1), L. Li (1), D. Fraser (1), J. Aponte (1), A. Yeo (2), R. Davies (3), C. Macphee (3), L. Hegg (3), L. Tarka (3), C. Held (4), R. Stewart (5), L. Wallentin (4), H. White (5), M. Nelson (1), D. Waterworth (3).

1) GlaxoSmithKline, Research Triangle Park, NC, USA;

2) GlaxoSmithKline, Stevenage, UK;

3) GlaxoSmithKline, Upper Merion, Pennsylvania, USA;

4) Uppsala Clinical Research Center, Department of Medical Sciences, Uppsala University, Uppsala, Sweden;

5) Green Lane Cardiovascular Service, Auckland City Hospital, Auckland, New Zealand.

STABILITY is an ongoing phase III cardiovascular outcomes study that compares the effects of darapladib enteric-coated (EC) tablets, 160 mg, versus placebo, when added to the standard of care, on the incidence of major adverse cardiovascular events (MACE) in subjects with chronic coronary heart disease (CHD). Blood samples for determination of the LpPLA2 activity level in plasma and for extraction of DNA were obtained at randomization. To identify genetic variants that may predict response to darapladib, we genotyped ~900K common and low-frequency coding variants using the Illumina OmniExpress GWAS plus exome chip in advance of study completion. Among the 15,828 intent-to-treat recruited subjects, 13,674 (86%) provided informed consent for genetic analysis. Our pharmacogenetic (PGx) analysis group comprises subjects from 39 countries on five continents, including 10,139 Whites of European heritage, 1,682 Asians of East Asian or Japanese heritage, 414 Asians of Central/South Asian heritage, 268 Blacks, 1,027 Hispanics and 144 others. Here we report association analysis of baseline levels of LpPLA2 to support future PGx analysis of drug response after trial completion. Among the 911,375 variants genotyped, 213,540 (23%) were rare (MAF < 0.5%).

Our analyses focused on the drug target, LpPLA2 enzyme activity measured at baseline. GWAS analysis of LpPLA2 activity, adjusting for age, gender and the top 20 principal component scores, identified 58 variants surpassing the genome-wide significance threshold (5e-08).

Genome-wide stepwise regression analyses identified multiple independent associations from PLA2G7, CELSR2, APOB, KIF6, and APOE, reflecting the dependency of LpPLA2 on LDL-cholesterol levels. Most notably, several low frequency and rare coding variants in PLA2G7 were identified to be strongly associated with LpPLA2 activity. They are V279F (MAF=1.0%, P= 1.7e-108), a previously known association, and four novel associations due to I1317N (MAF=0.05%, P=4.9e-8), Q287X (MAF=0.05%, P=1.6e-7), T278M (MAF=0.02%, P=7.6e-5) and L389S (MAF=0.04%, P=4.3e-4).

All of these variants lowered enzyme activity, and each appeared to be specific to a particular ethnic group. Our comprehensive PGx analyses of baseline data have already provided great insight into common and rare coding genetic variants associated with the drug target and related traits, and this knowledge will be invaluable in facilitating future PGx investigation of darapladib response.

SOURCE

http://www.ashg.org/2013meeting/pdf/46025_Platform_bookmark%20for%20Web%20Final%20from%20AGS.pdf
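The abstract above describes a standard single-variant association workflow: regress the trait (baseline LpPLA2 activity) on genotype dosage plus covariates, call hits at the genome-wide threshold of 5e-08, and classify variants as rare at MAF < 0.5%. The sketch below illustrates that workflow on synthetic data; the variable names, effect sizes, and the large-sample normal approximation for the p-value are assumptions for illustration, not the STABILITY analysis pipeline.

```python
# Illustrative single-variant GWAS test: OLS regression of a phenotype on
# genotype dosage plus covariates (stand-ins for age, principal components),
# with a Wald p-value compared against the genome-wide threshold, and a
# minor-allele-frequency helper for the rare-variant cutoff.
import math
import numpy as np

GWAS_THRESHOLD = 5e-8

def minor_allele_frequency(genotypes):
    """MAF from 0/1/2 alternate-allele counts (two alleles per subject)."""
    freq = sum(genotypes) / (2 * len(genotypes))
    return min(freq, 1 - freq)

def variant_pvalue(genotype, phenotype, covariates):
    """Two-sided Wald p-value for the genotype term in an OLS fit."""
    n = len(phenotype)
    X = np.column_stack([np.ones(n), genotype, covariates])
    beta, _, _, _ = np.linalg.lstsq(X, phenotype, rcond=None)
    resid = phenotype - X @ beta
    sigma2 = resid @ resid / (n - X.shape[1])          # residual variance
    se = math.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    z = float(beta[1]) / se                            # normal approx., large n
    return math.erfc(abs(z) / math.sqrt(2.0))

# Synthetic example: one variant with a genuine effect on the trait.
rng = np.random.default_rng(0)
g = rng.binomial(2, 0.3, size=2000).astype(float)      # genotype dosages
covs = rng.normal(size=(2000, 2))                      # stand-ins for age, PC1
y = 0.5 * g + covs @ np.array([0.2, -0.1]) + rng.normal(size=2000)

p = variant_pvalue(g, y, covs)                         # far below 5e-8 here
maf = minor_allele_frequency([1] + [0] * 199)          # 1 het in 200 subjects
```

In a real analysis this test would be repeated for each of the ~900K genotyped variants, which is why the stringent 5e-08 threshold (a Bonferroni-style correction for roughly a million tests) is used.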

Synthetic Biology: On Advanced Genome Interpretation for

  • Gene Variants and
  • Pathways,
  • Inversion Polymorphism,
  • Passenger Deletions,
  • De Novo Mutations,
  • Whole Genome Sequencing w/Linkage Analysis

What is the Genetic Base of Atherosclerosis and Loss of Arterial Elasticity with Aging?

In a recent publication by my colleague, Stephen J. Williams, Ph.D., on 5/15/2013, titled

Finding the Genetic Links in Common Disease:  Caveats of Whole Genome Sequencing Studies

http://pharmaceuticalintelligence.com/2013/05/15/finding-the-genetic-links-in-common-disease-caveats-of-whole-genome-sequencing-studies/

we learned that:

  • Groups of variants in the same gene confirmed link between APOC3 and higher risk for early-onset heart attack
  • No other significant gene variants linked with heart disease

APOC3 – apolipoprotein C-III – Potential Relevance to the Human Aging Process

Main reason for selection
Entry selected based on indirect or inconclusive evidence linking the gene product to ageing in humans or in one or more model systems
Description
APOC3 is involved in fat metabolism and may delay the catabolism of triglyceride-rich particles. Changes in APOC3 expression levels have been reported in aged mice [1754]. Results from mice suggest that FOXO1 may regulate the expression of APOC3 [1743]. Polymorphisms in the human APOC3 gene and promoter have been associated with lipoprotein profile, cardiovascular health, insulin (INS) sensitivity, and longevity [1756]. Therefore, APOC3 may impact on some age-related diseases, though its exact role in human ageing remains to be determined.

Cytogenetic information

Cytogenetic band
11q23.1-q2
Location
116,205,833 bp to 116,208,997 bp
Orientation
Plus strand

Display region using the UCSC Genome Browser

Protein information

Gene Ontology
Process: GO:0006869; lipid transport
GO:0016042; lipid catabolic process
GO:0042157; lipoprotein metabolic process
Function: GO:0005319; lipid transporter activity
Cellular component: GO:0005576; extracellular region
GO:0042627; chylomicron

Protein interactions and network

No interactions in records.

Retrieve sequences for APOC3

Promoter
ORF
CDS

Homologues in model organisms

Bos taurus
APOC3_BOVI
Mus musculus
Apoc3
Pan troglodytes
APOC3

In other databases

AnAge
This species has an entry in AnAge

Selected references

  • [2125] Pollin et al. (2008) A null mutation in human APOC3 confers a favorable plasma lipid profile and apparent cardioprotection. PubMed
  • [1756] Atzmon et al. (2006) Lipoprotein genotype and conserved pathway for exceptional longevity in humans. PubMed
  • [1755] Araki and Goto (2004) Dietary restriction in aged mice can partially restore impaired metabolism of apolipoprotein A-IV and C-III. PubMed
  • [1743] Altomonte et al. (2004) Foxo1 mediates insulin action on apoC-III and triglyceride metabolism. PubMed
  • [1754] Araki et al. (2004) Impaired lipid metabolism in aged mice as revealed by fasting-induced expression of apolipoprotein mRNAs in the liver and changes in serum lipids. PubMed
  • [1753] Panza et al. (2004) Vascular genetic factors and human longevity. PubMed
  • [1752] Anisimov et al. (2001) Age-associated accumulation of the apolipoprotein C-III gene T-455C polymorphism C

http://genomics.senescence.info/genes/entry.php?hgnc=APOC3

Apolipoprotein C-III is a protein component of very low density lipoprotein (VLDL). APOC3 inhibits lipoprotein lipase and hepatic lipase; it is thought to inhibit hepatic uptake [1] of triglyceride-rich particles. The APOA1, APOC3 and APOA4 genes are closely linked in both the rat and human genomes. The A-I and A-IV genes are transcribed from the same strand, while the A-I and C-III genes are convergently transcribed. An increase in apoC-III levels induces the development of hypertriglyceridemia.

Clinical significance

Two novel susceptibility haplotypes (specifically, P2-S2-X1 and P1-S2-X1) have been discovered in the ApoAI-CIII-AIV gene cluster on chromosome 11q23; these confer an approximately threefold higher risk of coronary heart disease in normal individuals [2] as well as in those with non-insulin-dependent diabetes mellitus [3]. Apo-CIII delays the catabolism of triglyceride-rich particles. Elevations of Apo-CIII found in genetic variation studies may predispose patients to non-alcoholic fatty liver disease.

  1. Mendivil CO, Zheng C, Furtado J, Lel J, Sacks FM (2009). “Metabolism of VLDL and LDL containing apolipoprotein C-III and not other small apolipoproteins”. Arteriosclerosis, Thrombosis, and Vascular Biology 30 (2): 239–45. doi:10.1161/ATVBAHA.109.197830. PMC 2818784. PMID 19910636.
  2. Singh PP, Singh M, Kaur TP, Grewal SS (2007). “A novel haplotype in ApoAI-CIII-AIV gene region is detrimental to Northwest Indians with coronary heart disease”. Int J Cardiol 130 (3): e93–5. doi:10.1016/j.ijcard.2007.07.029. PMID 17825930.
  3. Singh PP, Singh M, Gaur S, Grewal SS (2007). “The ApoAI-CIII-AIV gene cluster and its relation to lipid levels in type 2 diabetes mellitus and coronary heart disease: determination of a novel susceptible haplotype”. Diab Vasc Dis Res 4 (2): 124–29. doi:10.3132/dvdr.2007.030. PMID 17654446.

In 2013 we reported on the discovery of

Genetic Associations with Valvular Calcification and Aortic Stenosis

N Engl J Med 2013; 368:503-512

February 7, 2013. DOI: 10.1056/NEJMoa1109034

METHODS

We determined genomewide associations with the presence of aortic-valve calcification (among 6942 participants) and mitral annular calcification (among 3795 participants), as detected by computed tomographic (CT) scanning; the study population for this analysis included persons of white European ancestry from three cohorts participating in the Cohorts for Heart and Aging Research in Genomic Epidemiology consortium (discovery population). Findings were replicated in independent cohorts of persons with either CT-detected valvular calcification or clinical aortic stenosis.

CONCLUSIONS

Genetic variation in the LPA locus, mediated by Lp(a) levels, is associated with aortic-valve calcification across multiple ethnic groups and with incident clinical aortic stenosis. (Funded by the National Heart, Lung, and Blood Institute and others.)

SOURCE:

N Engl J Med 2013; 368:503-512

Related Research by Author & Curator of this article:

Artherogenesis: Predictor of CVD – the Smaller and Denser LDL Particles

Cardiovascular Biomarkers

Genetics of Conduction Disease: Atrioventricular (AV) Conduction Disease (block): Gene Mutations – Transcription, Excitability, and Energy Homeostasis

Genomics & Genetics of Cardiovascular Disease Diagnoses: A Literature Survey of AHA’s Circulation Cardiovascular Genetics, 3/2010 – 3/2013

Hypertriglyceridemia concurrent Hyperlipidemia: Vertical Density Gradient Ultracentrifugation a Better Test to Prevent Undertreatment of High-Risk Cardiac Patients

Hypertension and Vascular Compliance: 2013 Thought Frontier – An Arterial Elasticity Focus

Personalized Cardiovascular Genetic Medicine at Partners HealthCare and Harvard Medical School

Genomics Orientations for Individualized Medicine Volume One

Market Readiness Pulse for Advanced Genome Interpretation and Individualized Medicine

We present below the MARKET LEADER in interpretation of genomics computation results in the emerging new era of medicine, Genomic Medicine: Knome.com and its home-grown software powerhouse.

A second case study in Advanced Genome Interpretation and Individualized Medicine, presented after the market leader, is the Genome-Phenome Analyzer by SimulConsult (“A Simultaneous Consult On Your Patient’s Diagnosis”), Chestnut Hill, MA.

 

2012: The Year When Genomic Medicine Started Paying Off

Luke Timmerman

An excerpt of an interesting article mentioning Knome [emphasis ours]…

Remember a couple of years ago when people commemorated the 10-year anniversary of the first draft human genome sequencing? The storyline then, in 2010, was that we all went off to genome camp and only came home with a lousy T-shirt. Society, we were told, invested huge scientific resources in deciphering the code of life, and there wasn’t much of a payoff in the form of customized, personalized medicine.

That was an easy conclusion to reach then, when personalized medicine advocates could only point to a couple of effective targeted cancer drugs—Genentech’s Herceptin and Novartis’ Gleevec—and a couple of diagnostics. But that’s changing. My inbox the past week has been full of analyst reports from medical meetings, which mostly alerted readers to mere “incremental” advances with a number of genomic-based medicines and diagnostics. But that’s a matter of focusing on the trees, not the forest. This past year, we witnessed some really impressive progress from the early days of “clinical genomics” or “medical genomics.” The investment in deep understanding of genomics and biology is starting to look visionary.

The movement toward clinical genomics gathered steam back in June at the American Society of Clinical Oncology annual meeting. One of the hidden gem stories from ASCO was about little companies like Cambridge, MA-based Foundation Medicine and Cambridge, MA-based Knome that started seeing a surprising surge in demand from physicians for their services to help turn genomic data into medical information. The New York Times wrote a great story a month later about a young genomics researcher at Washington University in St. Louis who got cancer, had access to incredibly rich information about his tumors, and—after some wrestling with his insurance company—ended up getting a targeted drug nobody would have thought to prescribe without that information. And last month, I checked back on Stanford University researcher Mike Snyder, who made headlines this year using a smorgasbord of “omics” tools to correctly diagnose himself early with Type 2 diabetes, and then monitor his progress back into a healthy state. Read the entire article.

http://www.knome.com/knome-blog/2012-the-year-when-genomic-medicine-started-paying-off/

Knome and Real Time Genomics Ink Deal to Integrate and Sell the RTG Variant Platform on knoSYS™100 System

Partnership to bring accurate and fast genome analysis to translational researchers

CAMBRIDGE, MA –  May 6, 2013 – Knome Inc., the genome interpretation company, and Real Time Genomics, Inc., the genome analytics company, today announced that the Real Time Genomics (RTG) Variant platform will be integrated into every shipment of the knoSYS™100 interpretation system. The agreement enables customers to easily purchase the RTG analytics engine as an upgrade to the system. The product will combine two world-class commercial platforms to deliver end-to-end genome analytics and interpretation with superior accuracy and speed. Financial terms of the agreement were not disclosed.

“In the past year demand for genome interpretation has surged as translational researchers and clinicians adopt sequencing for human disease discovery and diagnosis,” said Wolfgang Daum, CEO of Knome. “Concomitant with that demand is the need for accurate and easy-to-use industrial grade analysis that meets expectations of clinical accuracy. The RTG platform is both incredibly fast and truly differentiating to customers doing family studies, and we are excited to add such a powerful platform to the knoSYS ecosystem.”

The partnership simplifies the purchasing process by allowing knoSYS customers to purchase the RTG platform directly from Knome sales representatives.

“The Knome system is a perfect complementary channel to further expand our commercial effort to bring the RTG platform to market,” said Steve Lombardi, CEO of Real Time Genomics. “Knome has built a recognizable brand around human clinical genome interpretation, and by delivering the RTG platform within their system, both companies are simplifying genomics to help customers understand human disease and guide clinical actions.”

About Knome

Knome Inc. (www.knome.com) is a leading provider of human genome interpretation systems and services. We help clients in two dozen countries identify the genetic basis of disease, tumor growth, and drug response. Designed to accelerate and industrialize the process of interpreting whole genomes, Knome’s big data technologies are helping to pave the healthcare industry’s transition to molecular-based, precision medicine.

About Real Time Genomics

Real Time Genomics (www.realtimegenomics.com) has a passion for genomics.  The company offers software tools and applications for the extraction of unique value from genomes.  Its competency lies in applying the combination of its patented core technology and deep computational expertise in algorithms to solve problems in next generation genomic analysis.  Real Time Genomics is a private San Francisco based company backed by investment from Catamount Ventures, Lightspeed Venture Partners, and GeneValue Ltd.

http://www.knome.com/knome-blog/knome-and-real-time-genomics-ink-deal-to-integrate-and-sell-the-rtg-variant-platform-on-knosys100-system/

Direct-to-Consumer Genomics Reinvents Itself

Malorye Allison

An excerpt of an interesting article mentioning Knome [emphasis ours]:

Cambridge, Massachusetts–based Knome made one of the splashiest entries into the field, but has now turned entirely to contract research. The company began providing DTC whole-genome sequencing to independently wealthy individuals at a time when the price was still sky high. The company’s first client, Dan Stoicescu, was a former biotech entrepreneur who paid $350,000 to have his genome sequenced in 2008 so he could review it “like a stock portfolio” as new genetic discoveries unfolded [4]. About a year later, the company was auctioning off a genome, with such frills as a dinner with renowned Harvard genomics researcher George Church, at a starting price of $68,000; at the time, a full-genome sequence was priced at $99,000, an indication of how quickly the cost of genome sequencing was falling.

Now, the company’s model is very different. “We stopped working with the ‘wealthy healthy’ in 2010,” says Jonas Lee, Knome’s chief marketing officer. “The model changed as sequencing changed.” The new emphasis, he says, is now on using Knome’s technology and technical expertise for genome interpretation. Knome’s customers are researchers, pharmaceutical companies and medical institutions, such as Johns Hopkins University School of Medicine in Baltimore, which in January signed the company up to interpret 1,000 genomes for a study of genetic variants underlying asthma in African American and African Caribbean populations.

Knome is trying to advance the clinical use of genomics, working with groups that “want to be prepared for what’s ahead,” Lee says. “We work with at least 50 academic institutions and 20 pharmaceutical companies looking at variants and drug response.” Cancer and idiopathic genetic diseases are the first sweet spots for genomic sequencing, he says. Although cancer genomics has been hot for a while, a recent string of discoveries of Mendelian diseases [5] made by whole-genome sequencing has lit up that field, too. Lee is also confident, however, that “chronic diseases like heart disease are right behind those.” The company also provides software tools. The price for its KnomeDiscovery sequencing and analysis service starts at about $12,000 per sample. Read the entire article here.

http://www.knome.com/knome-blog/direct-to-consumer-genomics-reinvents-itself/

Regenesis: How Synthetic Biology Will Reinvent Nature and Ourselves

VIEW VIDEO

http://www.colbertnation.com/the-colbert-report-videos/419824/october-04-2012/george-church

 

Knome Software Makes Sense of the Genome

The startup’s software takes raw genome data and creates a usable report for doctors.

DNA decoder: Knome’s software can tease out medically relevant changes in DNA that could disrupt the function of an individual gene or even a whole molecular pathway. As highlighted here, certain mutations in the BRCA2 gene, which affects the function of many other genes, can be associated with an increased risk of breast cancer.

A genome analysis company called Knome is introducing software that could help doctors and other medical professionals identify genetic variations within a patient’s genome that are linked to diseases or drug response. This new product, available for now only to select medical institutions, is a patient-focused spin on Knome’s existing products aimed at researchers and pharmaceutical companies. The Knome software turns a patient’s raw genome sequence into a medically relevant report on disease risks and drug metabolism. The software can be run within a clinic’s own network—rather than in the cloud, as is the case with some genome-interpretation services—which keeps the information private.

Advances in DNA sequencing technology have sharply reduced the amount of time and money required to identify all three billion base pairs of DNA in a person’s genome. But the use of genomic information for medical decisions is still limited because the process creates such large volumes of data. Less than five years ago, Knome, based in Cambridge, Massachusetts, made headlines by offering what seemed then like a low price—$350,000—for a genome sequencing and profiling package. The same service now costs just a few thousand dollars.

Today, genome profiling has two main uses in the clinic. It’s part of the search for the cause of rare genetic diseases, and it generates tumor-specific profiles to help doctors discover the weaknesses of a patient’s particular cancer. But within a few years, the technique could move beyond rare diseases and cancer. The information gleaned from a patient’s genome could explain the origin of specific disease, could help save costs by allowing doctors to pretreat future diseases, or could improve the effectiveness and safety of medications by allowing doctors to prescribe drugs that are tuned to a person’s ability to metabolize drugs.

But teasing out the relevant genetic information from a patient’s genome is not trivial. Finding the particular genetic variant that causes a specific disease or drug response can require expertise from many disciplines—from genetics to statistics to software engineering—and a lot of time. In any given patient’s genome, millions of places will differ from the standard reference genome. The vast majority of these differences, or variants, will be unrelated to a patient’s medical condition, but determining that can take between 20 minutes and two hours for each variant, says Heidi Rehm, a clinical geneticist who directs the Laboratory for Molecular Medicine at the Partners HealthCare Center for Personalized Genetic Medicine in Boston, and who will soon serve on the clinical advisory board of Knome. “If you scale that to … millions of variants, it becomes impossible.”

A software package like Knome’s can help whittle down the list based on factors such as disease type, the pattern of inheritance in a family, and the effects of given mutations on genes. Other companies have introduced Web- or cloud-based services to perform such an analysis, but Knome’s software suite can operate within a hospital’s network, which is critically important for privacy-concerned hospitals.

The greatest benefit of the widespread adoption of genomics in the clinic will come from the “clinical intelligence” doctors gain from networks of patient data, says Martin Tolar, CEO of Knome. Information about the association between certain genetic variants and disease or drug response could be anonymized—that is, no specific patient could be tied to the data—and shared among large hospital networks. Knome’s software will make it easy to share that kind of information, says Tolar.

“In the future, you could be in the situation where your physician will be able to pull the most appropriate information for your specific case that actually leads to recommendations about drugs and so forth,” he says.

http://www.technologyreview.com/news/428179/knome-software-makes-sense-of-the-genome/

An End-to-end Human Genome Interpretation System

The knoSYS™100 seamlessly integrates an interpretation application (knoSOFT) and informatics engine (kGAP) with a high-performance grid computer. Designed for whole genome, exome, and targeted NGS data, the knoSYS™100 helps labs quickly go “from reads to reports.”

Advanced Interpretation and Reporting Software

The knoSYS™100 ships with knoSOFT, an advanced application for managing sequence data through the informatics pipeline, filtering variants, running gene panels, classifying/interpreting variants, and reporting results.

knoSOFT has powerful and scalable multi-sample comparison features, capable of performing family studies, tumor/normal studies, and large case-control comparisons of hundreds of whole genomes.

Up to 10 simultaneous users are supported, including technicians running sequence data through the informatics pipeline, developers creating next-generation gene panels, geneticists researching causal variants, and production staff processing gene panels.

http://www.knome.com/knosys-100-overview/

Publications

View our collection of journal articles and genome research papers written by Knome employees, Knome board members, and other industry experts.

Publications by Knome employees and board members


21 Aug 2012

Discerning the Ancestry of European Americans in Genetic Association Studies

Co-authored by Dr. David Goldstein, Clinical and Scientific board member for Knome

Author summary: Genetic association studies analyze both phenotypes (such as disease status) and genotypes (at sites of DNA variation) of a given set of individuals. … more


20 Aug 2012

Phased Whole-Genome Genetic Risk in a Family Quartet Using a Major Allele Reference Sequence

Co-authored by Dr. George Church and Dr. Heidi Rehm, Clinical and Scientific Board Members for Knome

Author summary: An individual’s genetic profile plays an important role in determining risk for disease and response to medical therapy. The development of technologies that facilitate rapid whole-genome sequencing will provide unprecedented power in the estimation of disease risk. Here we develop methods to characterize genetic determinants of disease risk and … more

20 Aug 2012

A Genome-Wide Investigation of SNPs and CNVs in Schizophrenia

Co-authored by Dr. David Goldstein, Clinical and Scientific board member for Knome

Author summary: Schizophrenia is a highly heritable disease. While the drugs commonly used to treat schizophrenia offer important relief from some symptoms, other symptoms are not well treated, and the drugs cause serious adverse effects in many individuals. This has fueled intense interest over the years in identifying genetic contributors to … more


20 Aug 2012

Whole-Genome Sequencing of a Single Proband Together with Linkage Analysis Identifies a Mendelian Disease Gene

Co-authored by Dr. David Goldstein, Clinical and Scientific board member for Knome

Author summary: Metachondromatosis (MC) is an autosomal dominant condition characterized by exostoses (osteochondromas), commonly of the hands and feet, and enchondromas of long bone metaphyses and iliac crests. MC exostoses may regress or even resolve over time, and short stature … more

19 Aug 2012

Exploring Concordance and Discordance for Return of Incidental Findings from Clinical Sequencing Co-authored by Dr. Heidi Rehm, Clinical and Scientific board member for Knome

Introduction: There is an increasing consensus that whole-exome sequencing (WES) and whole-genome sequencing (WGS) will continue to improve in accuracy and decline in price and that the use of these technologies will eventually become an integral part of clinical medicine.1–7 … more

Publications by industry experts and thought-leaders

22 Aug 2012

Rate of De Novo Mutations and the Importance of Father’s Age to Disease Risk

Augustine Kong, Michael L. Frigge, Gisli Masson, Soren Besenbacher, Patrick Sulem, Gisli Magnusson, Sigurjon A. Gudjonsson, Asgeir Sigurdsson, Aslaug Jonasdottir, Adalbjorg Jonasdottir, Wendy S. W. Wong, Gunnar Sigurdsson, G. Bragi Walters, Stacy Steinberg, Hannes Helgason, Gudmar Thorleifsson, Daniel F. Gudbjartsson, Agnar Helgason, Olafur Th. Magnusson, Unnur Thorsteinsdottir, & Kari Stefansson

Abstract: Mutations generate sequence diversity and provide a substrate for selection. The rate of de novo mutations is therefore of major importance to evolution. Here we conduct a study of genome-wide mutation rates by sequencing the entire genomes of 78 … more

15 Aug 2012

Passenger Deletions Generate Therapeutic Vulnerabilities in Cancer

Florian L. Muller, Simona Colla, Elisa Aquilanti, Veronica E. Manzo, Giannicola Genovese, Jaclyn Lee, Daniel Eisenson, Rujuta Narurkar, Pingna Deng, Luigi Nezi, Michelle A. Lee, Baoli Hu, Jian Hu, Ergun Sahin, Derrick Ong, Eliot Fletcher-Sananikone, Dennis Ho, Lawrence Kwong, Cameron Brennan, Y. Alan Wang, Lynda Chin, & Ronald A. DePinho

Abstract: Inactivation of tumour-suppressor genes by homozygous deletion is a prototypic event in the cancer genome, yet such deletions often encompass neighbouring genes. We propose that homozygous deletions in such passenger genes can expose cancer-specific therapeutic vulnerabilities when the collaterally … more

1 Jul 2012

Structural Diversity and African Origin of the 17q21.31 Inversion Polymorphism

Karyn Meltz Steinberg, Francesca Antonacci, Peter H Sudmant, Jeffrey M Kidd, Catarina D Campbell, Laura Vives, Maika Malig, Laura Scheinfeldt, William Beggs, Muntaser Ibrahim, Godfrey Lema, Thomas B Nyambo, Sabah A Omar, Jean-Marie Bodo, Alain Froment, Michael P Donnelly, Kenneth K Kidd, Sarah A Tishkoff, & Evan E Eichler

Abstract: The 17q21.31 inversion polymorphism exists either as direct (H1) or inverted (H2) haplotypes with differential predispositions to disease and selection. We investigated its genetic diversity in 2,700 individuals, with an emphasis on African populations. We characterize eight structural haplotypes … more

http://www.knome.com/publications/

knome’s Systems & Software

Technical specifications

Connections and communications

Two networks: 40-Gigabit InfiniBand QDR via a Mellanox switch for storage traffic, and an HP ProCurve switch for general network traffic

High performance computing cluster

Four nodes, each with two 8-core/16-thread, 2.4 GHz, 64-bit Intel® Xeon® E5-2660 processors (20MB cache), 128GB of DDR3 ECC 1600 memory, and 2x2TB SATA drives (7,200 RPM)

Metadata server

2x2TB 3.5″ drives (6Gb/s SATA, RAID 1) and 2x300GB SSDs (RAID 1)

Object storage server

Lustre array: two arrays of 12x4TB 3.5″ drives with 6Gb/s SATA channels, each OSS powered by a 6-core, 64-bit Intel Xeon processor running at 2.0GHz with 32GB RAM.

knoSYS_server

96TB total, 64TB usable storage (redundancy for failure tolerance). Expandable to 384TB total.

Data sources

Reference genome GRCh37 (HG19)

dbSNP, v137

Condel (SIFT and PolyPhen-2)

HPO

OMIM

Exome Variant server, with allelisms and allele frequencies

1000 Genomes, with allelisms and allele frequencies

Human Gene Mutation db (HGMD)

Phastcons 46, mammalian conservation

PhyloP

Input/output formats

Input formats: kGAP accepts Illumina FASTQ and VCF 4.1 files as inputs

Output formats: annotated VCF files

Electrical and operating requirements

Line voltage: 110V to 120V AC, 200-240V (single phase)

Frequency: 50Hz to 60Hz

Current: 30A; RoHS compliant

Connection: NEMA L5-30

Operating temperature: 50° to 95° F

UPS included

Maximum operating altitude: 10,000 feet

Power consumption: 2,800 VA (peak)

Size and weight

Height 49.2 Inches (1250 mm)
Width 30.7 Inches (780 mm)
Depth 47.6 Inches (1210 mm)
Weight 394 lbs (179 kg)

Noise generation and heat dissipation

Enclosure provides 28dB of acoustic noise reduction; system suitable for placing in working lab environment

7,200W of active heat dissipation

Included in the package

knoSYS™100 hardware

Knome software: knoSOFT, kGAP

Operating system: Linux (CentOS 6.3)

http://www.knome.com/knosys-100-specifications/

Our research services group uses a set of advanced software tools designed for whole genome and exome interpretation. These tools are also available to our clients through our knomeBASE informatics service. In addition to various scripts, libraries, and conversion utilities, these tools include knomeVARIANTS and knomePATHWAYS.

knomeVARIANTS


knomeVARIANTS is a query kit that lets users search for candidate causal variants in studied genomes. It includes a query interface, scripting libraries, and data conversion utilities.

Users select cases and controls, input a putative inheritance mode, and add sensible filter criteria (variant functional class, rarity/novelty, location in prior candidate regions, etc.) to automatically generate a sorted short-list of leading candidates. The application includes a SQL query interface to let users query the database as they wish, including by complex or novel sets of criteria.
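The filtering workflow described above can be sketched in a few lines. The record fields, genes, and thresholds below are hypothetical and do not reflect Knome’s actual data model or API:

```python
# Illustrative variant triage in the spirit of the workflow above:
# keep rare, damaging variants consistent with the chosen inheritance
# mode. All field names and example records are invented.
variants = [
    {"gene": "LDLR",  "func": "nonsense",   "pop_freq": 0.0001,
     "cases_alt": 3, "controls_alt": 0},
    {"gene": "APOB",  "func": "missense",   "pop_freq": 0.12,
     "cases_alt": 2, "controls_alt": 2},
    {"gene": "PCSK9", "func": "frameshift", "pop_freq": 0.0005,
     "cases_alt": 3, "controls_alt": 1},
]

DAMAGING = {"nonsense", "frameshift", "splice-site"}

def candidate(v, max_freq=0.01, n_cases=3):
    """Dominant-model filter: damaging, rare, carried by every case,
    and absent from all controls."""
    return (v["func"] in DAMAGING
            and v["pop_freq"] <= max_freq
            and v["cases_alt"] == n_cases
            and v["controls_alt"] == 0)

shortlist = [v["gene"] for v in variants if candidate(v)]
print(shortlist)  # → ['LDLR']
```

A real run applies the same logic over millions of records, which is why the SQL interface and scriptability matter.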

In addition to querying, the application lets users export subsets of the database for viewing in MS Excel. Subsets can be output that target common research foci, including the following:

  • Sites implicated in phenotypes, regardless of subject genotypes
  • Sites where at least one studied genome mismatches the reference
  • Sites where a particular set of one or more genomes, but no other genomes, show a novel variant
  • Sites in phenotype-implicated genes
  • Sites with nonsense, frameshift, splice-site, or read-through variants, relative to reference
  • Sites where some but not all subject genomes were called

knomePATHWAYS


knomePATHWAYS is a visualization tool that overlays the variants found in each sample genome onto known gene interaction networks, helping users spot functional interactions between variants in distinct genes, as well as pathways enriched for variants in cases versus controls, in differential drug-responder groups, and so on.

knomePATHWAYS integrates reference data from many sources, including GO, HPRD, and MsigDB (which includes KEGG and Reactome data). The application is particularly helpful in addressing higher-order questions, such as finding candidate genes and protein pathways, that are not readily addressed from tabular annotation data alone.

http://www.knome.com/interpretation-toolkit/
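Pathway enrichment of the kind knomePATHWAYS highlights is commonly scored with a hypergeometric test: given how many of a sample’s damaged genes fall in a pathway, how surprising is the overlap? A stdlib-only sketch (the gene counts are invented, and this is a generic method, not Knome’s specific algorithm):

```python
from math import comb

def hypergeom_enrichment_p(hits, pathway_size, mutated_total, genome_genes):
    """P(X >= hits) under a hypergeometric null: drawing `mutated_total`
    genes at random from `genome_genes`, how often do at least `hits`
    land in a pathway of `pathway_size` genes?"""
    total = comb(genome_genes, mutated_total)
    p = 0.0
    for k in range(hits, min(pathway_size, mutated_total) + 1):
        p += comb(pathway_size, k) * comb(genome_genes - pathway_size,
                                          mutated_total - k) / total
    return p

# Hypothetical numbers: 6 of a case's 40 damaged genes fall in a
# 50-gene pathway, out of ~20,000 genes genome-wide.
p = hypergeom_enrichment_p(hits=6, pathway_size=50,
                           mutated_total=40, genome_genes=20_000)
print(f"enrichment p ≈ {p:.2e}")
```

With an expected overlap of only 0.1 genes, six hits yield a vanishingly small p-value, flagging the pathway for review.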

Genome-Phenome Analyzer by SimulConsult

A Simultaneous Consult On Your Patient’s Diagnosis

Clinicians can get a “simultaneous consult” about their patient’s diagnosis using SimulConsult’s diagnostic decision support software.

Using the free “phenome” version, medical professionals can enter patient findings into the software and get an initial differential diagnosis and suggestions about other useful findings, including tests.  The database used by the software has > 4,000 diagnoses, most complete for genetics and neurology.  It includes all genes in GeneTests and all diseases in GeneReviews.  The information about diseases is entered by clinicians, referenced to the literature and peer-reviewed by experts.  The software takes into account pertinent negatives, temporal information, and cost of tests, information ignored in other diagnostic approaches.  It transforms medical diagnosis by lowering costs, reducing errors and eliminating the medical diagnostic odysseys experienced by far too many patients and their families.

http://www.simulconsult.com/index.html

Using the “genome-phenome analyzer” version, a lab can combine a genome variant table with the phenotypic data entered by the referring clinician, thereby using the full power of genome + phenome to arrive at a diagnosis in seconds.  An innovative measure of pertinence of genes focuses attention on the genes accounting for the clinical picture, even if more than one gene is involved.  The referring clinician can use the results in the free phenome version of the software, for example adding information from confirmatory tests or adding new findings that develop over time.  For details, click here.

http://www.simulconsult.com/genome/index.html
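Conceptually, combining phenome and genome evidence can be sketched as a scored ranking of candidate diagnoses. The priors, finding likelihoods, and variant bonus below are invented for illustration; SimulConsult’s actual algorithm is proprietary:

```python
# Toy genome + phenome diagnosis ranking. All numbers are hypothetical.
diseases = {
    "Disease A": {"prior": 0.01, "findings": {"seizures": 0.9, "ataxia": 0.2},
                  "gene": "GENE1"},
    "Disease B": {"prior": 0.02, "findings": {"seizures": 0.3, "ataxia": 0.8},
                  "gene": "GENE2"},
}

patient_findings = ["seizures", "ataxia"]
variant_genes = {"GENE2"}        # damaging variants from the lab's table

def score(d):
    """Naive product of prior and finding likelihoods, boosted when the
    variant table implicates the disease gene."""
    s = d["prior"]
    for f in patient_findings:
        s *= d["findings"].get(f, 0.01)  # small default for unlisted findings
    if d["gene"] in variant_genes:
        s *= 50                          # genome evidence raises pertinence
    return s

ranked = sorted(diseases, key=lambda name: score(diseases[name]), reverse=True)
print(ranked)  # → ['Disease B', 'Disease A']
```

The point of the sketch is the interplay: phenotype alone leaves the two candidates close, but the variant table tips the ranking decisively.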

Michael M. Segal MD, PhD, Founder, Chairman and Chief Scientist. Dr. Segal did his undergraduate work at Harvard and his MD and PhD at Columbia, where his thesis project outlined rules for the types of chemical synapses that will form in a nervous system. After his residency in pediatric neurology at Columbia, he moved to Harvard Medical School, where he joined the faculty and developed the microisland system for studying small numbers of brain neurons in culture. Using this system, he developed a simplified model of epilepsy, work that won him national and international young investigator awards, and set the stage for later work on the molecular mechanism of attention deficit disorder. Dr. Segal has a long history of interest in computers, and patterned the SimulConsult software after the way that experienced clinicians actually think about diagnosis. He is on the Electronic Communication Committee of the Child Neurology Society and the Scientific Program Committee of the American Medical Informatics Association.

http://www.simulconsult.com/company/management.html

Read Full Post »

Treatment, Prevention and Cost of Cardiovascular Disease: Current & Predicted Cost of Care and the Potential for Improved Individualized Care Using Clinical Decision Support Systems

Author, and Content Consultant to e-SERIES A: Cardiovascular Diseases: Justin Pearlman, MD, PhD, FACC

Author and Curator: Larry H Bernstein, MD, FACP

and

Curator: Aviva Lev-Ari, PhD, RN

This article has the following FIVE parts:

1. Forecasting the Impact of Heart Failure in the United States : A Policy Statement From the American Heart Association

2. A Case Study from the GENETIC CONNECTIONS — In The Family: Heart Disease Seeking Clues to Heart Disease in DNA of an Unlucky Family

3. Arterial Stiffness and Cardiovascular Events : The Framingham Heart Study

4. Arterial Elasticity in Quest for a Drug Stabilizer: Isolated Systolic Hypertension
caused by Arterial Stiffening Ineffectively Treated by Vasodilatation Antihypertensives

5. Clinical Decision Support Systems: Realtime Clinical Expert Support — Biomarkers of Cardiovascular Disease : Molecular Basis and Practical Considerations


1. Forecasting the Impact of Heart Failure in the United States : A Policy Statement From the American Heart Association

PA Heidenreich, NM Albert, LA Allen, DA Bluemke, J Butler, et al. Circulation: Heart Failure 2013;6.
Print ISSN: 1941-3289, Online ISSN: 1941-3297.

Heart failure (HF) poses a major burden on productivity and cost of national healthcare expenditures

  • among older Americans, more are hospitalized for HF than for any other medical condition.

As the population ages, the prevalence of HF is expected to increase.

The purpose of this report is to

  • provide an in-depth look at how the changing demographics in the United States will impact the prevalence and cost of care for HF for different US populations.

 Projections of HF Prevalence

Prevalence estimates for HF were determined from

Projections of the US Population With HF From 2010 to 2030 for Different Age Groups

Year    All ages     18–44 y    45–64 y      65–79 y      ≥80 y
2012    5,813,262    396,578    1,907,141    2,192,233    1,317,310
2015    6,190,606    402,926    1,949,669    2,483,853    1,354,158
2020    6,859,623    417,600    1,974,585    3,004,002    1,463,436
2025    7,644,674    434,635    1,969,852    3,526,347    1,713,840
2030    8,489,428    450,275    2,000,896    3,857,729    2,180,528

Future Costs of HF

The future costs of HF were estimated by methods developed by the American Heart Association

  • project the prevalence and costs of HF from 2012 to 2030
  • factor out  the costs attributable to comorbid conditions.

The model does this by assuming that

(1) HF prevalence percentages will remain constant by age, sex, and race/ethnicity;

(2) the costs of technological innovation will rise at the current rate.

HF prevalence and costs (direct and indirect) were projected using the following steps:

1. HF prevalence and average cost per person were estimated by age group (18–44, 45–64, 65–79, ≥80 years), gender (male, female), and race/ethnicity (white non-Hispanic, white Hispanic, black, other) [32]. The initial HF cost per person and its rate of increase were determined for each demographic group as a percentage of total healthcare expenditures.

2. Inflation is separately addressed by correcting dollar values from Medical Expenditure Panel Survey (MEPS) to 2010 dollars.

3. An adjustment was made for nursing home spending; the estimates project the incremental cost of care attributable to heart failure (HF).

4. Total HF population prevalence and costs were projected by multiplying the US Census–projected population of each demographic group by the percentage prevalence and average cost.

5. The total work loss and home productivity loss costs were generated by multiplying per capita work days lost attributable to HF by (1) prevalence of HF, (2) the probability of employment given HF (for work loss costs only), (3) mean per capita daily earnings, and (4) US Census population projection counts.
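The five steps reduce to multiplying census projections by group-specific prevalence and cost figures. A minimal sketch with invented inputs (the real AHA model uses MEPS- and Census-derived values for every demographic cell):

```python
# Sketch of the prevalence/cost projection in steps 1-5, for two
# hypothetical demographic groups. Prevalence, costs, and populations
# are invented for illustration only.
groups = {
    # (prevalence, annual HF cost per person $, projected 2030 population)
    "65-79, male, white non-Hispanic": (0.080, 12_000, 14_000_000),
    ">=80, female, black":             (0.120, 15_000,  1_500_000),
}

total_cases = total_direct = 0.0
for prev, cost, pop in groups.values():
    cases = pop * prev            # step 4: census population x prevalence
    total_cases += cases
    total_direct += cases * cost  # x average cost per person

# Step 5: work-loss cost = cases x employment probability
#         x days lost x mean daily earnings
days_lost, employed_frac, daily_earnings = 8, 0.25, 180
work_loss = total_cases * employed_frac * days_lost * daily_earnings

print(f"cases: {total_cases:,.0f}, direct: ${total_direct / 1e9:.1f}B, "
      f"work loss: ${work_loss / 1e6:.0f}M")
```

Scaling this loop over every age/gender/race cell and every projection year reproduces the structure of the tables that follow.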

Projections of Indirect Costs

Indirect costs of lost productivity from morbidity and premature mortality were estimated as detailed below.
Morbidity costs represent the value of lost earnings attributable to HF and include loss of work among

  • currently employed individuals and those too sick to work, as well as
  • home productivity loss, which is the value of household services performed by household members who do not receive pay for the services.

Total Costs Attributable to Heart Failure (HF)

Projections of Total Cost of Care ($ Billions) for HF for Different Age Groups of the US Population

Year / Cost type          All     18–44    45–64    65–79     ≥80
2012
  Medical                20.9      0.33     3.67     8.46     8.42
  Indirect: Morbidity     5.42     0.52     1.92     2.05     0.93
  Indirect: Mortality     4.35     0.66     2.53     0.98     0.18
  Total                  30.7      1.51     8.12    11.5      9.53
2020
  Medical                31.1      0.43     4.58    14.2     11.8
  Indirect: Morbidity     7.09     0.66     2.20     3.11     1.12
  Indirect: Mortality     5.39     0.79     2.89     1.49     0.22
  Total                  43.6      1.88     9.67    18.8     13.2
2030
  Medical                53.1      0.59     5.86    23.3     23.4
  Indirect: Morbidity     9.80     0.91     2.54     4.48     1.87
  Indirect: Mortality     6.84     0.98     3.32     2.16     0.37
  Total                  69.7      2.48    11.7     29.9     25.6

Excludes HF care costs that have been attributed to comorbid conditions.

Cost of Care

Total medical costs are projected to increase from $20.9 billion in 2012 to $53.1 billion in 2030, a 2.5-fold increase. Assuming continuation of current hospitalization practices, the majority (80%) of these costs stem from hospitalization, and most of the increase comes from direct costs. Indirect costs are expected to rise as well, but at a lower rate, from $9.8 billion to $16.6 billion, an increase of 69%.

Direct costs (the cost of medical care) are thus expected to increase at a faster rate than indirect costs (the value of premature deaths and lost productivity).

The total cost of HF (direct and indirect costs) is expected to increase from the current $30.7 billion to at least $69.8 billion by 2030. This will amount to $244 for every US adult in 2030.

Thus the burden of HF for the US healthcare system will grow substantially during the next 18 years if current trends continue.
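These growth figures can be cross-checked directly against the projection table above:

```python
# Cross-check of the cost growth figures (values in $ billions).
medical_2012, medical_2030 = 20.9, 53.1
indirect_2012, indirect_2030 = 9.8, 16.6   # morbidity + mortality, rounded

print(f"medical: {medical_2030 / medical_2012:.1f}-fold")           # ~2.5-fold
print(f"indirect: {(indirect_2030 / indirect_2012 - 1):.0%} rise")  # ~69%
print(f"total 2030: ${medical_2030 + indirect_2030:.1f}B")          # ~$69.7B
```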

It is estimated that

  • by 2030, the prevalence of HF in the United States will increase by 25%, to 3.0%.
  • >8 million people in the US (1 in every 33) will have HF by 2030.
  • the projected total direct medical costs of HF between 2012 and 2030 (in 2010 dollars) will increase from $21 billion to $53 billion.
  • Total costs, including indirect costs for HF, are estimated to increase from $31 billion in 2012 to $70 billion in 2030.
  • If one assumes all costs of cardiac care for HF patients are attributable to HF
    (no cost attribution to comorbid conditions), the 2030 projected cost estimates of treating patients with HF will be 3-fold higher ($160 billion in direct costs).

Projections can be lowered if action is taken to reduce the health and economic burden of HF. Strategies, plans, and implementation to prevent HF and improve the efficiency of care are needed.

Causes and Stages of HF

If the projections for accelerating HF costs are to be avoided, attention to the different causes of HF and their risk factors is warranted.
HF is a clinical syndrome that results from a variety of cardiac and systemic disorders:

  1. idiopathic dilated cardiomyopathy
  2. cardiac valvular disease
  3. pericarditis or pericardial effusion
  4. ischemic heart disease
  5. primary or secondary hypertension
  6. renovascular disease
  7. advanced liver disease with decreased venous return
  8. pulmonary hypertension
  9. prolonged hypoalbuminemia with generalized interstitial edema
  10. diabetic nephropathy
  11. heart muscle infiltration disease such as primary or secondary amyloidosis
  12. myocarditis
  13. rhythm disorders
  14. congenital diseases
  15. accidental trauma (war, chest trauma)
  16. toxicities (methamphetamine, cocaine, heavy metals, chemotherapy)

HF generally causes symptoms:

  • shortness of breath
  • fatigue
  • swelling (edema)
  • inability to lie flat (orthopnea, paroxysmal nocturnal dyspnea)
  • possibly cough, wheezing

In the Western world the predominant causes of HF are:

  • coronary artery disease
  • valvular disease
  • hypertension
  • viral, alcohol, methamphetamine or other drug  toxicity cardiomyopathy
  • stress (catechol toxicity, takotsubo “broken heart” cardiomyopathy)
  • atrial fibrillation/rapid heart rates
  • thyroid disease

In 2001, the American College of Cardiology and AHA practice guidelines for chronic HF promoted a classification system that encompasses 4 stages of HF.

  • Stage A: Patients at high risk for developing HF in the future but no functional or structural heart disorder.
  • Stage B: a structural heart disorder but no symptoms.
  • Stage C: previous or current symptoms of heart failure, manageable with medical treatment.
  • Stage D: advanced disease requiring hospital-based support, a heart transplant or palliative care.

Stages A and B are considered precursors to clinical HF and are meant

  1. to alert healthcare providers to known risk factors for HF and
  2. the available therapies aimed at mitigating disease progression.

Stage A patients have risk factors for HF such as hypertension, atherosclerotic heart disease, and/or diabetes mellitus.

Stage B patients are asymptomatic but have developed structural heart disease from a variety of potential insults to the heart muscle, such as myocardial infarction or valvular heart disease.

Stages C and D represent the symptomatic phases of HF, with stage C manageable and stage D failing medical management, resulting in marked symptoms at rest or with minimal activity despite optimal medical therapy.

Therapeutic interventions include:

  • dietary salt restriction and diuretics
  • medications known to prolong survival (beta blockers, ACE inhibitors, aldosterone inhibitors)
  • implantable devices such as pacemakers and defibrillators
  • stoppage of tobacco, toxic drugs, excess alcohol

Classic demographic risk factors for the development of HF include

  • older age, male gender, ethnicity, and low socioeconomic status.
  • comorbid disease states contribute to the development of HF
    • Ischemic heart disease
    • Hypertension

Diabetes mellitus, insulin resistance, and obesity are also linked to HF development,

  • with diabetes mellitus increasing the risk of HF by ≈2-fold in men and up to 5-fold in women.

Smoking remains the single largest preventable cause of disease and premature death in the United States.

Translation of Scientific Evidence into Clinical Practice

In multiple studies, failures to apply evidence-based management strategies are blamed for avoidable hospitalizations and/or deaths from HF.

Improved implementation of guidelines can delay, mitigate or prevent the onset of HF, and improve survival. Performance improvement programs have facilitated the implementation of evidence-based therapies in both hospital and ambulatory care settings.

Care transition programs by hospitals have become more widespread

  • in an effort to reduce avoidable readmissions.

The interventions used by these programs include

  • initiating discharge planning early in the course of hospital care,
  • actively involving patients and families or caregivers in the plan of care,
  • providing new processes and systems that ensure patient understanding of the plan of care before discharge from the hospital, and
  • improving quality of care by continually monitoring adherence to national evidence-based guidelines with appropriate adaptations for individual differences in needs and responses.

In multiple studies, adherence to the HF plan of care was associated with reduced all-cause mortality as well as reduced HF hospitalization.

It is anticipated that care transition programs may increase appropriate admissions while decreasing inappropriate admissions.

This would have a potentially beneficial impact on the 30-day all-cause readmission rate that has become

  • a focus of public reporting in pay for performance.

More than a quarter of Medicare spending occurs in the last year of life, and

  • the costs of care during the last 6 months for a patient with HF have been increasing (11% from 2000 to 2007).

Improving end-of-life care cost effectiveness for patients with stage D HF will require ongoing

  • improved prediction of outcomes
  • integration of multiple aspects of care
  • educated examination of alternatives and priorities
  • improved decision-making
  • unbiased allocation of resources and coverage for this process rather than unbalanced coverage favoring catastrophic care

Palliative care, including formal hospice care, is increasingly advocated for patients with advanced HF.
Offering palliative care to patients with HF may lead to

  • more conservative (and less expensive) treatment
  • consistent with many patients’ goals for care

The use of hospice services is growing among the HF population,

  • HF now the second most common reason for entering hospice
  • but hospice declaration may trigger automated restrictions on care, which can be an impediment to electing hospice

A recent study of patients in hospice care found that

  • patients with HF were more likely than patients with cancer to use hospice services longer than 6 months or to be discharged from hospice care alive.

Highlights:

1. Increasing incidence and costs of care for heart failure projected from 2012 to 2030

2. Direct costs rising at greater rate than indirect costs

3. American Heart Association has defined 4 stages of HF, the last 2 of which are advanced

4. Stages C & D are clinically overt and contribute to rehospitalization

5. Stage D accounts for a significant use of end-of-life hospice care

6. There are evidence-based guidelines for the provision of coordinated care that are not widely applied at present

Basic questions raised:

1. If stages A & B are under the radar, then what measures can best trigger the use of evidence-based guidelines for care?
2. Why are evidence-based guidelines commonly not deployed?

  • Flaws in the “evidence” due to bias, design errors, or limited ability to extrapolate to the patients it should address
  • Delays in education, convincing of caretakers, and deployment
  • Inadequate resources
  • Financial or other disincentives

The arguments for introducing coordinated care and for evidence-based guidelines are strong.

Arguments AGAINST slavish imposition of evidence-based medicine include genetic individuality (what is best on average is not necessarily best for each genetically and behaviorally distinct individual). Strict adherence to evidence-based guidelines also stifles innovative exploration. Nonetheless, deviations from evidence-based plans should be cautious, well-documented, and well-informed, not due to misaligned incentives, ignorance, carelessness, or error.

The question of when and how to intervene most cost effectively is unanswered. If some patients are salt-sensitive as a contribution to the prevalence of hypertension and heart failure, should EVERYONE be salt restricted or should there be a more concerted effort to define who is salt sensitive? What if it proved more cost-effective to restrict salt intake for everyone, even though many might be fine with high sodium intake, and some might even benefit from or require high sodium intake? Is it reasonable to impose costs, hurdles, even possible harm on some as a cheaper way to achieve “greater good”?
These issues are highly relevant to the proposed emphasis on holistic solutions.

2. A Case Study from the GENETIC CONNECTIONS — In The Family: Heart Disease Seeking Clues to Heart Disease in DNA of an Unlucky Family

By GINA KOLATA   2013.05.13  New York Times

Scientists are studying the genetic makeup of the Del Sontro family for

  • telltale mutations or aberrations in the DNA.

Robin Ashwood, one of Mr. Del Sontro’s sisters, found out she had extensive heart disease even though her electrocardiogram was normal. Six of her seven siblings also have heart disease, despite not having any of the traditional risk factors. Then, after a sister, just 47 years old, found out she had advanced heart disease, Mr. Del Sontro, then 43, went to a cardiologist. An X-ray of his arteries revealed the truth. Like his grandfather, his mother, his four brothers and two sisters, he had heart disease.

Now he and his extended family have joined an extraordinary federal research project that is using genetic sequencing to find factors that increase the risk of heart disease beyond the usual suspects: high cholesterol, high blood pressure, smoking, and diabetes. “We don’t know yet how many pathways there are to heart disease,” said Dr. Leslie Biesecker, who directs the study Mr. Del Sontro joined. “That’s the power of genetics. To try and dissect that.”

“I had bought the dream: if you just do the right things and eat the right things, you will be O.K.,” said Mr. Del Sontro, whose cholesterol and blood pressure are reassuringly low.

3. Arterial Stiffness and Cardiovascular Events : The Framingham Heart Study

GF Mitchell, Shih-Jen Hwang, RS Vasan, MG Larson.

Circulation. 2010;121:505-511.  http://circ.ahajournals.org/content/121/4/505
http://dx.doi.org/10.1161/CIRCULATIONAHA.109.886655

Various measures of arterial stiffness and wave reflection have been proposed as cardiovascular risk markers.
Prior studies have not assessed relations of a comprehensive panel of stiffness measures to prognosis.
First-onset major cardiovascular disease events in relation to arterial stiffness

  • pulse wave velocity [PWV]
  • wave reflection
    • augmentation index
    • carotid-brachial pressure amplification)
  • central pulse pressure

were analyzed in 2232 participants (mean age, 63 years; 58% women) in the Framingham Heart Study by a proportional hazards model. During median follow-up of 7.8 (range, 0.2 to 8.9) years,

  • 151 of 2232 participants (6.8%) experienced an event.

In multivariable models adjusted for

  • age
  • sex
  • systolic blood pressure
  • use of antihypertensive therapy
  • total and high-density lipoprotein cholesterol concentrations
  • smoking
  • presence of diabetes mellitus

higher aortic PWV was associated with a 48% increase in cardiovascular disease risk (hazard ratio, 1.48 per SD; 95% confidence interval, 1.16 to 1.91; P=0.002).

After PWV was added to a standard risk factor model, integrated discrimination improvement was 0.7% (95% confidence interval, 0.05% to 1.3%; P<0.05).

In contrast,

  • augmentation index,
  • central pulse pressure, and
  • pulse pressure amplification

were not related to cardiovascular disease outcomes in multivariable models.

Higher aortic stiffness assessed by PWV

  • is associated with increased risk for a first cardiovascular event.

Aortic PWV improves risk prediction when added to standard risk factors and may represent

  • a valuable biomarker of cardiovascular disease risk.
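The reported effect size can be sanity-checked in a few lines. In a Cox proportional hazards model, a hazard ratio of 1.48 per SD corresponds to a log-hazard coefficient of ln(1.48), and relative risk compounds multiplicatively across SD units; the 2-SD subject below is purely illustrative, not a figure from the study:

```python
import math

# Framingham estimate: hazard ratio of 1.48 per SD of aortic PWV
hr_per_sd = 1.48
beta = math.log(hr_per_sd)  # Cox log-hazard coefficient per SD unit

# Relative hazard for a hypothetical subject 2 SD above the cohort mean
relative_hazard = math.exp(beta * 2)
print(round(relative_hazard, 2))
```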

We shall here visit a recent article by Justin D. Pearlman and Aviva Lev-Ari, PhD, RN, on the pros and cons of drug stabilizers for arterial elasticity as an alternative or adjunct to diuretics and vasodilators in the management of hypertension, titled

4. Hypertension and Vascular Compliance: 2013 Thought Frontier – An Arterial Elasticity Focus

http://pharmaceuticalintelligence.com/2013/05/11/arterial-elasticity-in-quest-for-a-drug-stabilizer-isolated-systolic-hypertension-caused-by-arterial-stiffening-ineffectively-treated-by-vasodilatation-antihypertensives/

Speaking at the 2013 International Conference on Prehypertension and Cardiometabolic Syndrome, meeting cochair Dr Reuven Zimlichman (Tel Aviv University, Israel) argued that, for a growing number of patients, the conventional tools are inappropriate:

  • the definitions of hypertension
  • the risk-factor tables used to guide treatment

Most antihypertensives today work by producing vasodilation or decreasing blood volume, which may be

  • ineffective treatments for patients in whom average arterial diameter and circulating volume are not the causes of hypertension, and which, taken as targets of therapy, may promote decompensation

In the future, he predicts, “we will have to start looking for a totally different medication that will aim to

  • improve or at least to stabilize arterial elasticity: medication that might affect factors that determine the stiffness of the arteries, like collagen, like fibroblasts.

Those are not the aim of any group of antihypertensive medications today.”

Zimlichman believes existing databases could be used to develop algorithms that focus on

  • inelasticity as a mechanism of hypertensive disease

He also points out that

  • ambulatory blood-pressure-monitoring devices can measure elasticity

http://www.theheart.org/article/1502067.do

A related article was published on the relationship between arterial stiffening and primary hypertension.

Arterial stiffening provides sufficient explanation for primary hypertension.

KH Pettersen, SM Bugenhagen, J Nauman, DA Beard, SW Omholt.

By use of empirically well-constrained computer models describing the coupled function of the baroreceptor reflex and mechanics of the circulatory system, we demonstrate quantitatively that

  • arterial stiffening seems sufficient to explain age-related emergence of hypertension.

Specifically, the models reproduce

  • the empirically observed chronic changes in pulse pressure with age, and
  • the impaired capacity of hypertensive individuals to regulate short-term changes in blood pressure

The results suggest that a major target for treating chronic hypertension in the elderly may be

  • the reestablishment of a proper baroreflex response.

http://arxiv.org/abs/1305.0727v2?goback=%2Egde_4346921_member_240018699
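The mechanical intuition behind this result can be shown with the crudest Windkessel approximation, pulse pressure ≈ stroke volume / arterial compliance (ignoring wave reflection and reflex control, which the paper's models do include); the compliance values below are hypothetical, chosen only to show that halving compliance doubles pulse pressure:

```python
def pulse_pressure(stroke_volume_ml: float, compliance_ml_per_mmhg: float) -> float:
    """Crude Windkessel estimate: PP ~ SV / C."""
    return stroke_volume_ml / compliance_ml_per_mmhg

sv = 70.0                            # mL per beat, a typical textbook stroke volume
pp_young = pulse_pressure(sv, 1.4)   # compliant aorta (hypothetical C)
pp_stiff = pulse_pressure(sv, 0.7)   # stiffened aorta: compliance halved
print(pp_young, pp_stiff)
```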

5. Clinical Decision Support Systems: Realtime Clinical Expert Support: Biomarkers of Cardiovascular Disease — Molecular Basis and Practical Considerations

RS Vasan.  Circulation. 2006;113:2335-2362

http://dx.doi.org/10.1161/CIRCULATIONAHA.104.482570

http://circ.ahajournals.org/content/113/19/2335

Substantial data indicate that CVD is a life course disease that begins with the evolution of risk factors that contribute to

  • subclinical atherosclerosis.

Subclinical disease culminates in overt CVD. The onset of CVD itself portends an adverse prognosis with greater

  • risks of recurrent adverse cardiovascular events, morbidity, and mortality.

Clinical assessment alone has limitations. Clinicians have used additional tools to aid clinical assessment and to enhance their ability to identify the “vulnerable” patient at risk for CVD, as suggested by a recent National Institutes of Health (NIH) panel.

Biomarkers are one such tool: they can help identify high-risk individuals, diagnose disease conditions promptly, and guide prognosis and treatment.

Biological marker (biomarker): A laboratory test value that is objectively measured and evaluated as an indicator of

  1. normal biological processes,
  2. pathogenic processes, or
  3. pharmacological responses to a therapeutic intervention.

Type 0 biomarker: A marker of the natural history of a disease

  • Type 0 correlates longitudinally with known clinical indices/predicts outcomes.

Type I biomarker: A marker that captures the effects of a therapeutic intervention

  • Type I assesses an aspect of treatment mechanism of action.

Type 2 biomarker (surrogate end point):  A marker intended to predict outcomes on the basis of

  • epidemiologic
  • therapeutic
  • pathophysiologic or
  • other scientific evidence.

With biomarkers monitoring disease progression or response to therapy, the patient can serve as his or her own control (follow-up values may be compared to baseline values).

Costs may be less important for prognostic markers when they are largely restricted to people with disease (total cost = cost per person × number to be tested, plus downstream costs). Some biomarkers (e.g., an exercise stress test) may be used for both diagnostic and prognostic purposes.

Generally there are cost differences in establishing a prognostic value versus diagnostic value of a biomarker:

  • prognostic utility typically requires a large sample and a prospective design, whereas
  • diagnostic value often can be determined with a smaller sample in a cross-sectional design

Regardless of the intended use, it is important to remember that biomarkers that do not change disease management

  • cannot affect patient outcome and therefore
  • are unlikely to be cost-effective (judged in terms of quality-adjusted life-years gained).

Typically, for a biomarker to change management, it is important to have evidence that risk reduction strategies should vary with biomarker levels, and/or biomarker-guided management achieves advantages over a management scheme that ignores the biomarker levels.

Typically, this means that biomarker levels should be modifiable by therapy.

Gil David and Larry Bernstein have developed, in consultation with Prof. Ronald Coifman, in the Yale University Applied Mathematics Program, a software system that is the equivalent of an intelligent Electronic Health Records Dashboard that

  • provides empirical medical reference and
  • suggests quantitative diagnostics options.

The current design of the Electronic Medical Record (EMR) is a
linear presentation of portions of the record

  • by services
  • by diagnostic method, and
  • by date

to cite examples.

This allows perusal through a graphical user interface (GUI) that

  • partitions the information or necessary reports on a workstation, entered by keying on icons, and
  • presents decision support.

Examples of data partitions include:

  • history
  • medications
  • laboratory reports
  • imaging
  • EKGs

The introduction of a DASHBOARD adds presentation of

  • drug reactions
  • allergies
  • primary and secondary diagnoses, and
  • critical information

about any patient whose record the caregiver needs to access.

A basic issue for such a tool is what information is presented and how it is displayed.

A determinant of the success of this endeavor is if it

  • facilitates workflow
  • facilitates decision-making process
  • reduces medical error.

Continuing work is in progress to extend the capabilities with model datasets and sufficient data, based on the assumption that computer extraction of data from disparate sources will, in the long run, further improve this process.

For instance, there is synergistic value in finding coincidence of:

  • ST shift on EKG
  • elevated cardiac biomarker (troponin)
  • in the absence of substantially reduced renal function.

Similarly, the conversion of hematology based data into useful clinical information requires the establishment of problem-solving constructs based on the measured data.

The most commonly ordered test used for managing patients worldwide is the hemogram that often incorporates

  • morphologic review of a peripheral smear
  • descriptive statistics

While the hemogram has undergone progressive modification of the measured features over time, the subsequent expansion of the panel of tests has provided a window into the cellular changes in the

  • production
  • release
  • or suppression

of the formed elements from the blood-forming organ into the circulation. In the hemogram one can view data reflecting the characteristics of a broad spectrum of medical conditions.

Progressive modification of the measured features of the hemogram has delineated characteristics expressed as measurements of

  • size
  • density, and
  • concentration

resulting in many characteristic features of classification. In the diagnosis of hematological disorders, one looks for

  • proliferation of marrow precursors
  • domination of a cell line
  • suppression of hematopoiesis

Other dimensions are created by considering

  • the maturity and size of the circulating cells.

The application of rules-based, automated problem solving should provide a valid approach to

  • the classification and interpretation of the data used to determine a knowledge-based clinical opinion.

The exponential growth of knowledge since the mapping of the human genome has been enabled by parallel advances in applied mathematics that have not been a part of traditional clinical problem solving.

As the complexity of statistical models has increased

  • the dependencies have become less clear to the individual.

Contemporary statistical modeling has a primary goal of finding an underlying structure in studied data sets.
The development of an evidence-based inference engine that can substantially interpret the data at hand and

  • convert it in real time to a “knowledge-based opinion”

could improve clinical decision-making by incorporating into the model

  • multiple complex clinical features as well as onset and duration.

An example of a difficult area for clinical problem solving is found in the diagnosis of Systemic Inflammatory Response Syndrome (SIRS) and associated sepsis. SIRS is a costly diagnosis in hospitalized patients, and failure to diagnose it in a timely manner increases both the financial and the safety hazard. The early diagnosis of SIRS/sepsis is made by the clinician's application of defined criteria:

  • temperature
  • heart rate
  • respiratory rate and
  • WBC count

The application of those clinical criteria, however, defines the condition after it has developed, leaving unanswered the hope for

  • a reliable method for earlier diagnosis of SIRS.
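The conventional rule set is simple enough to encode directly, which also makes the limitation obvious: the score only turns positive once the physiology has already shifted. A minimal sketch using the standard thresholds (the respiratory criterion is simplified to respiratory rate alone):

```python
def sirs_score(temp_c: float, heart_rate: float, resp_rate: float, wbc_k: float) -> int:
    """Count the four classic SIRS criteria; a score >= 2 is conventionally positive."""
    criteria = [
        temp_c > 38.0 or temp_c < 36.0,   # fever or hypothermia
        heart_rate > 90,                  # tachycardia
        resp_rate > 20,                   # tachypnea
        wbc_k > 12.0 or wbc_k < 4.0,      # leukocytosis or leukopenia (x10^3/uL)
    ]
    return sum(criteria)

print(sirs_score(39.1, 112, 24, 15.2))  # febrile, tachycardic patient
```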

The early diagnosis of SIRS may possibly be enhanced by the measurement of biomarkers and physiologic variables, including

  • transthyretin
  • C-reactive protein
  • procalcitonin
  • mean arterial pressure

Immature granulocyte (IG) measurement has been proposed as a

  • readily available indicator of the presence of granulocyte precursors (left shift).

The use of such markers, obtained by automated systems in conjunction with innovative statistical modeling, provides

  • a promising support to early accurate decision making.

Such a system aims to reduce medical error by utilizing

  • the conjoined syndromic features of disparate data elements.

How we frame our expectations is important. It determines

  • the data we collect to examine the process.

In the absence of data to support an assumed benefit, there is no proof of validity at whatever cost.

Potential arenas of benefit include:

  • hospital operations
  • nonhospital laboratory studies
  • companies in the diagnostic business
  • planners of health systems

The problem was stated by L.L. Weed in “Idols of the Mind” (Dec 13, 2006):
“a root cause of a major defect in the health care system is that, while we falsely admire and extol the intellectual powers of highly educated physicians, we do not search for the external aids their minds require.” Hospital information technology (HIT) use has been focused on information retrieval, leaving

  • the unaided mind burdened with information processing.

We deal with problems in the interpretation of data presented to the physician, and how the situation could be improved through better

  • design of the software that presents data.

The computer architecture that the physician uses to view the results is more often than not presented

  • as the designer would prefer, and not as the end-user would like.

In order to optimize the interface for the physician, the system could have a “front-to-back” design, with the call-up for any patient providing:

  • A dashboard design that presents the crucial information that the physician would likely act on in an easily accessible manner
  • Each item used has to be closely related to a corresponding criterion needed for a decision.

Feature Extraction.

Eugene Rypka contributed greatly to clarifying the extraction of features in a series of articles, which

  • set the groundwork for the methods used today in clinical microbiology.

The method he describes is termed S-clustering, and

  • will have a significant bearing on how we can view laboratory data.

He describes S-clustering as extracting features from endogenous data that

  • amplify or maximize structural information to create distinctive classes.

The method classifies by taking the number of features with sufficient variety to generate maps.

The mapping is done by

  • a truth table NxN of messages and choices
  • each variable is scaled to assign values for each message choice.

For example, the message for an antibody titer would be converted from 0, +, ++, +++ to 0, 1, 2, 3.

Even though there may be a large number of measured values, the variety is reduced by this compression, at the cost of some information.
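The recoding step in the titer example can be sketched directly (the mapping is the one given above; the helper name is ours, for illustration):

```python
# Ordinal recoding of semiquantitative messages, as in the antibody titer example
TITER_SCALE = {"0": 0, "+": 1, "++": 2, "+++": 3}

def encode_messages(messages):
    """Map each semiquantitative result to its scaled integer value."""
    return [TITER_SCALE[m] for m in messages]

print(encode_messages(["0", "++", "+++", "+"]))
```

Distinct raw readings collapse onto a small ordinal scale, which is exactly the variety reduction described in the text.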

The main issue is

  • how a combination of variables falls into a table to convey meaningful information.

We are concerned with

  • accurate assignment into uniquely variable groups by information in test relationships.

One determines the effectiveness of each variable by its contribution to information gain in the system. The reference or null set is the class having no information.  Uncertainty in assigning to a classification can be countered by providing sufficient information.

The possibility of realizing a good model for approximating the effects of factors supported by the data used

  • for inference owes much to the discovery of Kullback–Leibler distance or “information”, and
  • Akaike found a simple relationship between K-L information and Fisher’s maximized log-likelihood function.

In the last 60 years, the concept of entropy, comparable to

  • the entropy of physics, information, noise, and signal processing
  • developed by Shannon, Kullback, and others

has been integrated with modern statistics as a result of the seminal work of Akaike, Leo Goodman, Magidson and Vermunt, and work by Coifman.
Akaike pioneered recognition that the choice of model influences results in a measurable manner. In particular, a larger number of variables promotes further explanation of variance, so a model selection criterion is needed that penalizes for the number of variables when success is measured by explanation of variance.
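Akaike's criterion is two lines of code; the log-likelihoods below are invented solely to show the parameter penalty at work:

```python
def aic(log_likelihood: float, n_params: int) -> float:
    """Akaike information criterion: AIC = 2k - 2 ln L (lower is better)."""
    return 2 * n_params - 2 * log_likelihood

# A richer model fits a little better but pays for three extra parameters
simple = aic(log_likelihood=-100.0, n_params=5)
rich = aic(log_likelihood=-99.0, n_params=8)
print(simple, rich, simple < rich)
```

Here the simpler model wins despite its slightly lower likelihood, which is the point of the penalty.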

Gil David et al. introduced AUTOMATED processing of the data available to the ordering physician, from which one can anticipate an enormous impact on the diagnosis and treatment of perhaps half of the top 20 most common causes of hospital admission that carry a high cost and morbidity.

For example:

  1. anemias (iron deficiency, vitamin B12 and folate deficiency, and hemolytic anemia or myelodysplastic syndrome);
  2. pneumonia; systemic inflammatory response syndrome (SIRS) with or without bacteremia;
  3. multiple organ failure and hemodynamic shock;
  4. electrolyte/acid base balance disorders;
  5. acute and chronic liver disease;
  6. acute and chronic renal disease;
  7. diabetes mellitus;
  8. protein-energy malnutrition;
  9. acute respiratory distress of the newborn;
  10. acute coronary syndrome;
  11. congestive heart failure;
  12. hypertension
  13. disordered bone mineral metabolism;
  14. hemostatic disorders;
  15. leukemia and lymphoma;
  16. malabsorption syndromes; and
  17. cancers [breast, prostate, colorectal, pancreas, stomach, liver, esophagus, thyroid, and parathyroid];
  18. endocrine disorders
  19. prenatal and perinatal diseases

Rudolph RA, Bernstein LH, Babb J: Information-Induction for the diagnosis of
myocardial infarction. Clin Chem 1988;34:2031-2038.

Bernstein LH (Chairman). Prealbumin in Nutritional Care Consensus Group.

Measurement of visceral protein status in assessing protein and energy
malnutrition: standard of care. Nutrition 1995; 11:169-171.

Bernstein LH, Qamar A, McPherson C, Zarich S, Rudolph R. Diagnosis of myocardial infarction:
integration of serum markers and clinical descriptors using information theory.
Yale J Biol Med 1999; 72: 5-13.

Kaplan L.A.; Chapman J.F.; Bock J.L.; Santa Maria E.; Clejan S.; Huddleston D.J.; Reed R.G.;
Bernstein L.H.; Gillen-Goldstein J. Prediction of Respiratory Distress Syndrome using the
Abbott FLM-II amniotic fluid assay. The National Academy of Clinical Biochemistry (NACB)
Fetal Lung Maturity Assessment Project.  Clin Chim Acta 2002; 326(8): 61-68.

Bernstein LH, Qamar A, McPherson C, Zarich S. Evaluating a new graphical ordinal logit method
(GOLDminer) in the diagnosis of myocardial infarction utilizing clinical features and laboratory
data. Yale J Biol Med 1999; 72:259-268.

Bernstein L, Bradley K, Zarich SA. GOLDmineR: Improving models for classifying patients with
chest pain. Yale J Biol Med 2002; 75, pp. 183-198.

Ronald Raphael Coifman and Mladen Victor Wickerhauser. Adapted Waveform Analysis as a Tool for Modeling, Feature Extraction, and Denoising. Optical Engineering, 33(7):2170–2174, July 1994.

R. Coifman and N. Saito. Constructions of local orthonormal bases for classification and regression.
C. R. Acad. Sci. Paris, 319 Série I:191-196, 1994.

Realtime Clinical Expert Support and validation System

We have developed a software system that is the equivalent of an intelligent Electronic Health Records Dashboard that provides empirical medical reference and suggests quantitative diagnostics options. The primary purpose is to gather medical information, generate metrics, analyze them in realtime and provide a differential diagnosis, meeting the highest standard of accuracy. For each patient the system builds a unique characterization and provides a list of other patients who share this profile, thereby

  • utilizing the vast aggregated knowledge (diagnosis, analysis, treatment, etc.) of the medical community.
  • The main mathematical breakthroughs are provided by accurate patient profiling and inference methodologies
  • in which anomalous subprofiles are extracted and compared to potentially relevant cases.
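A toy sketch of the profile-matching idea, comparing a new patient's normalized feature profile against stored cases by cosine similarity (the profiles, case names, and the choice of cosine similarity are all illustrative assumptions, not the system's actual methodology):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical normalized lab profiles of previously characterized patients
cohort = {
    "case_a": [0.88, 0.25, 0.65],
    "case_b": [0.10, 0.90, 0.20],
}
new_patient = [0.90, 0.20, 0.70]

# The most similar stored case suggests comparable diagnoses and treatments
best_match = max(cohort, key=lambda name: cosine(new_patient, cohort[name]))
print(best_match)
```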

As the model grows and its knowledge database is extended, the diagnostic and the prognostic become more accurate and precise.
We anticipate that the effect of implementing this diagnostic amplifier would result in

  • higher physician productivity at a time of great human resource limitations,
  • safer prescribing practices,
  • rapid identification of unusual patients,
  • better assignment of patients to observation, inpatient beds,
    intensive care, or referral to clinic,
  • shortened length of patients ICU and bed days.

The main benefit is a

  1. real time assessment as well as
  2. diagnostic options based on comparable cases,
  3. flags for risk and potential problems

as illustrated in the following case acquired on 04/21/10. The patient was diagnosed by our system with severe SIRS at a grade of 0.61.

Graphical presentation of patient status

The patient was treated for SIRS and the blood tests were repeated during the following week. The full combined record of our system’s assessment of the patient, as derived from the further hematology tests, is illustrated below. The yellow line shows the diagnosis that corresponds to the first blood test (as also shown in the image above). The red line shows the next diagnosis that was performed a week later.

Progression changes in patient ICU stay with SIRS

The MISSIVE(c) system, by Justin Pearlman, is an alternative approach that includes not only automated data retrieval and reformatting of data for decision support, but also an integrated set of tools to speed up analysis, structured for quality and error reduction, coupled to facilitated report generation, incorporation of just-in-time knowledge and group expertise, standards of care, evidence-based planning, and both physician and patient instruction.

See also in Pharmaceutical Intelligence:

The Cost Burden of Disease: U.S. and Michigan. CHRT Brief. January 2010. @www.chrt.org

The National Hospital Bill: The Most Expensive Conditions by Payer, 2006. HCUP Brief #59.


W Ruts, S De Deyne, E Ameel, W Vanpaemel, T Verbeemen, and G Storms. Dutch norm data for 13 semantic categories and 338 exemplars. Behavior Research Methods, Instruments, & Computers 2004; 36(3): 506–515.

S De Deyne, S Verheyen, E Ameel, W Vanpaemel, MJ Dry, W Voorspoels, and G Storms. Exemplar by feature applicability matrices and other Dutch normative data for semantic concepts. Behavior Research Methods 2008; 40(4): 1030-1048.

Landauer, T. K., Ross, B. H., & Didner, R. S. (1979). Processing visually presented single words: A reaction time analysis [Technical memorandum]. Murray Hill, NJ: Bell Laboratories.
Lewandowsky, S. (1991).

Weed L. Automation of the problem oriented medical record. NCHSR Research Digest Series DHEW. 1977;(HRA)77-3177.

Naegele TA. Letter to the Editor. Amer J Crit Care 1993;2(5):433.

Sheila Nirenberg/Cornell and Chethan Pandarinath/Stanford, “Retinal prosthetic strategy with the capacity to restore normal vision,” Proceedings of the National Academy of Sciences.

Other related articles published in this Open Access Online Scientific Journal include the following:

http://pharmaceuticalintelligence.com/2012/08/13/the-automated-second-opinion-generator/

http://pharmaceuticalintelligence.com/2012/09/21/the-electronic-health-record-how-far-we-
have-travelled-and-where-is-journeys-end/

http://pharmaceuticalintelligence.com/2013/02/18/the-potential-contribution-of-
informatics-to-healthcare-is-more-than-currently-estimated/

http://pharmaceuticalintelligence.com/2013/05/04/cardiovascular-diseases-decision-support-
systems-for-disease-management-decision-making/?goback=%2Egde_4346921_member_239739196

http://pharmaceuticalintelligence.com/2012/08/13/demonstration-of-a-diagnostic-clinical-
laboratory-neural-network-agent-applied-to-three-laboratory-data-conditioning-problems/

http://pharmaceuticalintelligence.com/2012/12/17/big-data-in-genomic-medicine/

http://pharmaceuticalintelligence.com/2013/02/13/cracking-the-code-of-human-life-
the-birth-of-bioinformatics-and-computational-genomics/

http://pharmaceuticalintelligence.com/2013/04/28/genetics-of-conduction-disease-
atrioventricular-av-conduction-disease-block-gene-mutations-transcription-excitability-
and-energy-homeostasis/

http://pharmaceuticalintelligence.com/2012/12/10/identification-of-biomarkers-that-
are-relatedto-the-actin-cytoskeleton/

http://pharmaceuticalintelligence.com/2012/08/14/regression-a-richly-textured-method-
for-comparison-and-classification-of-predictor-variables/

http://pharmaceuticalintelligence.com/2012/08/02/diagnostic-evaluation-of-sirs-by-
immature-granulocytes/

http://pharmaceuticalintelligence.com/2012/08/01/automated-inferential-diagnosis-
of-sirs-sepsis-septic-shock/

http://pharmaceuticalintelligence.com/2012/08/12/1815/

http://pharmaceuticalintelligence.com/2012/08/15/1946/

http://pharmaceuticalintelligence.com/2013/05/13/vinod-khosla-20-doctor-included-speculations-
musings-of-a-technology-optimist-or-technology-will-replace-80-of-what-doctors-do/

http://pharmaceuticalintelligence.com/2013/05/05/bioengineering-of-vascular-and-tissue-models/

The Heart: Vasculature Protection – A Concept-based Pharmacological Therapy including THYMOSIN
Aviva Lev-Ari, PhD, RN 2/28/2013
http://pharmaceuticalintelligence.com/2013/02/28/the-heart-vasculature-protection-a-concept-
based-pharmacological-therapy-including-thymosin/

FDA Pending 510(k) for The Latest Cardiovascular Imaging Technology
Aviva Lev-Ari, PhD, RN 1/28/2013
http://pharmaceuticalintelligence.com/2013/01/28/fda-pending-510k-for-the-latest-
cardiovascular-imaging-technology/

PCI Outcomes, Increased Ischemic Risk associated with Elevated Plasma Fibrinogen not
Platelet Reactivity    Aviva Lev-Ari, PhD, RN 1/10/2013
http://pharmaceuticalintelligence.com/2013/01/10/pci-outcomes-increased-ischemic-risk-
associated-with-elevated-plasma-fibrinogen-not-platelet-reactivity/

The ACUITY-PCI score: Will it Replace Four Established Risk Scores — TIMI, GRACE, SYNTAX,
and Clinical SYNTAX   Aviva Lev-Ari, PhD, RN 1/3/2013
http://pharmaceuticalintelligence.com/2013/01/03/the-acuity-pci-score-will-it-replace-four-
established-risk-scores-timi-grace-syntax-and-clinical-syntax/

Coronary artery disease in symptomatic patients referred for coronary angiography: Predicted by
Serum Protein Profiles    Aviva Lev-Ari, PhD, RN 12/29/2012
http://pharmaceuticalintelligence.com/2012/12/29/coronary-artery-disease-in-symptomatic-
patients-referred-for-coronary-angiography-predicted-by-serum-protein-profiles/

New Definition of MI Unveiled, Fractional Flow Reserve (FFR)CT for Tagging Ischemia
Aviva Lev-Ari, PhD, RN 8/27/2012
http://pharmaceuticalintelligence.com/2012/08/27/new-definition-of-mi-unveiled-
fractional-flow-reserve-ffrct-for-tagging-ischemia/


Diagnostics and Biomarkers: Novel Genomics Industry Trends vs Present Market Conditions and Historical Scientific Leaders Memoirs

Larry H Bernstein, MD, FCAP, Author and Curator

This article has two parts:

  • Part 1: Novel Genomics Industry Trends in Diagnostics and Biomarkers vs Present Market Transient Conditions

and

  • Part 2: Historical Scientific Leaders Memoirs

 

Part 1: Novel Genomics Industry Trends in Diagnostics and Biomarkers vs Present Market Transient Conditions

 

Based on “Forging a path from companion diagnostics to holistic decision support”, L.E.K.

Executive Insights, 2013;14(12). http://www.LEK.com

Companion diagnostics and their companion therapies are defined here as a method of identifying

  • LIKELY responders to therapies that are specific for patients with a specific molecular profile.

The consequence of this definition is that restricting the diagnostics to specific patient types gives access to

  • novel therapies that may otherwise not be approved or reimbursed in other, perhaps “similar” patients
  • who lack a matching identification of the key identifier(s) needed to permit that therapy,
  • and who thus have a poor expected response.

The concept is new because:

(1) The diagnoses may be closely related by classical criteria, but at the same time they are
not alike with respect to efficacy of treatment with a standard therapy.
(2) The companion diagnostics is restricted to dealing with a targeted drug-specific question
without regard to other clinical issues.
(3) The efficacy issue it clarifies is reliant on a deep molecular/metabolic insight that is not available except through
emergent genomic/proteomic analysis, which has become available at a rapidly declining cost.

The limitation example given is HER2 testing for use of Herceptin in therapy for non-candidates (HER2-negative patients).
The problem is that the current format is a “one test/one drug” match, but decision support may require a combination of

  • validated biomarkers obtained on a small biopsy sample (technically manageable) with confusing results.

While HER2 negative patients are more likely to be pre-menopausal with a more aggressive tumor than postmenopausal,

  • the HER2 negative designation does not preclude treatment with Herceptin.

So Herceptin would be given in combination, but with what other drug in a non-candidate?

The point that L.E.K. makes is that, beyond providing highly validated biomarkers linked to approved therapies, it is necessary to pursue more holistic decision support tests that interrogate multiple biomarkers (panels of companion diagnostic markers) and discover signatures for treatments, used together with a broad range of information, such as

  • traditional tests,
  • imaging,
  • clinical trials,
  • outcomes data,
  • EMR data,
  • reimbursement and coverage data.

A comprehensive solution of this nature appears to be some distance from realization. However, is this the direction that will lead to tomorrow’s treatment decision support approaches?

Surveying the Decision Support Testing Landscape

As a starting point, L.E.K. characterized the landscape of available tests in the U.S. that inform treatment decisions compiled from ~50 leading diagnostics companies operating in the U.S. between 2004-2011. L.E.K. identified more than 200 decision support tests that were classified by test purpose, and more specifically,  whether tests inform treatment decisions for a single drug/class (e.g., companion diagnostics) vs. more holistic treatment decisions across multiple drugs/classes (i.e., multiagent response tests).

Treatment Decision Support Tests

Companion Diagnostics (single drug/class): predict response/safety or guide dosing of a single drug or class

  • HercepTest (Dako): determines HER2 protein overexpression for Herceptin treatment selection
  • Vysis ALK Break Apart FISH (Abbott Labs): predicts the NSCLC patient response to Xalkori

Other Decision Support (multiple drugs/classes): provide prognostic and predictive information on the benefit of treatment

  • Oncotype DX (Genomic Health, Inc.): predicts both recurrence of breast cancer and potential patient benefit from chemotherapy regimens
  • PML-RARα (Clarient, Inc.): predicts response to all-trans retinoic acid (ATRA) and other chemotherapy agents
  • TRUGENE (Siemens): measures resistance to multiple HIV-1 anti-retroviral agents

Multi-agent Response: inform targeted therapy class selection by interrogating a panel of biomarkers

  • Target Now (Caris Life Sciences): examines a tumor’s molecular profile to tailor treatment options
  • ResponseDX: Lung (Response Genetics, Inc.): examines multiple biomarkers to guide therapeutic treatment decisions for NSCLC patients

Source: L.E.K. Analysis

Includes IVD and LDT tests from

  1. top-15 IVD test suppliers,
  2. top-four large reference labs,
  3. top-five AP labs, and
  4. top-20 specialty reference labs.

For descriptive purposes only, may not map to exact regulatory labeling

Most tests are companion diagnostics and other decision support tests that provide guidance on

  • single drug/class therapy decisions.

However, holistic decision support tests (e.g., multi-agent response) are growing the fastest, at a 56% CAGR.
The emergence of multi-agent response tests suggests diagnostics companies are already seeing the need to aggregate individual tests (e.g., companion diagnostics) into panels of appropriate markers addressing a given clinical decision need. L.E.K. believes this trend is likely to continue as

  • increasing numbers of biomarkers become validated for diseases, and
  • multiplexing tools enabling the aggregation of multiple biomarker interrogations into a single test

become deployed in the clinic.
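For readers who want to check growth figures like the 56% CAGR above, the standard compound annual growth rate formula is easy to encode. The endpoint test counts below are hypothetical placeholders chosen to reproduce a rate of roughly 56%, not L.E.K.'s actual data.

```python
def cagr(begin_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate between two values over a period in years."""
    if begin_value <= 0 or years <= 0:
        raise ValueError("begin_value and years must be positive")
    return (end_value / begin_value) ** (1.0 / years) - 1.0

# Hypothetical: 3 multi-agent response tests in 2004 growing to 67 by 2011
growth = cagr(3, 67, 2011 - 2004)
print(f"{growth:.0%}")  # 56%
```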

Personalized Medicine Partnerships

L.E.K. also completed an assessment of publicly available personalized medicine partnership activity from 2009-2011 for ~150 leading organizations operating in the U.S. to look at broader decision support trends and emergence of more holistic solutions beyond diagnostic tests.

A survey of partnership deals was conducted for

  • top-10 academic medical centers/research institutions,
  • top-25 biopharma,
  • top-four healthcare IT companies,
  • top-three healthcare imaging companies,
  • top-20 IVD manufacturers,
  • top-20 laboratories,
  • top-10 payers/PBMs,
  • top-15 personalized healthcare companies,
  • top-10 regulatory/guideline entities, and
  • top-20 tools vendors for the period of 01/01/2009 – 12/31/2011.
    Source: Company websites, GenomeWeb, L.E.K. analysis

Across the sample we identified 189 publicly announced partnerships of which ~65% focused on more traditional areas (biomarker discovery, companion diagnostics and targeted therapies). However, a significant portion (~30%) included elements geared towards creating more holistic decision support models.

Partnerships categorized as holistic decision support by L.E.K. were focused on

  • mining large patient datasets (e.g., from payers or providers),
  • molecular profiling (e.g., deploying next-generation sequencing),
  • creating information technology (IT) infrastructure needed to enable holistic decision support models and
  • integrating various datasets to create richer decision support solutions.

Interestingly, holistic decision support partnerships often included stakeholders outside of biopharma and diagnostics such as

  • research tools,
  • payers/PBMs,
  • healthcare IT companies as well as
  • emerging personalized healthcare (PHC) companies (e.g., Knome, Foundation Medicine and 23andMe).

This finding suggests that these new stakeholders will be increasingly important in influencing care decisions going forward.

Holistic Treatment Decision Support (focus area; technology provider with the stakeholder deploying the solution; holistic decision support activity)

Molecular Profiling

  • Life Technologies with TGen/US Oncology: sequencing of triple-negative breast cancer patients to identify potential treatment strategies
  • Foundation Medicine with Novartis: deployment of a cancer genomics analysis platform to support Novartis clinical research efforts

Predictive Genomics

  • Clarient, Inc. (GE Healthcare) with Acorn Research: biomarker profiling of patients within Acorn’s network of providers to support clinical research efforts
  • GenomeQuest with Beth Israel Deaconess Medical Center: whole genome analysis to guide patient management

Outcomes Data Mining

  • AstraZeneca with WellPoint: evaluate comparative effectiveness of selected marketed therapies
  • 23andMe with NIH: leverage information linking drug response and CYP2C9/CYP2C19 variation
  • Pfizer with Medco: leverage patient genotype, phenotype and outcome for treatment decisions and targeted therapeutics

Healthcare IT Infrastructure

  • IBM with WellPoint: deploy IBM’s Watson-based solution for evidence-based healthcare decision-making support
  • Oracle with Moffitt Cancer Center: deploy Oracle’s informatics platform to store and manage patient medical information

Data Integration

  • Siemens Diagnostics with Susquehanna Health: integration of imaging and laboratory diagnostics
  • Cernostics with Geisinger Health: integration of advanced tissue diagnostics, digital pathology, an annotated biorepository and the EMR to create next-generation treatment decision support solutions
  • CardioDx with GE Healthcare: integration of genomics with imaging data in CVD

Implications

L.E.K. believes the likely debate won’t center on which models and companies will prevail. It appears that the industry is now moving along the continuum to a truly holistic capability.
The mainstay of personalized medicine today will become integrated and enhanced by other data.

The companies that succeed will be able to capture vast amounts of information

  • and synthesize it for personalized care.

Holistic models will be powered by increasingly larger datasets and sophisticated decision-making algorithms.
This will require the participation of an increasingly broad range of participants to provide the

  • science, technologies, infrastructure and tools necessary for deployment.

There are a number of questions posed by this study, but only some are of interest to this discussion:

Group A.    Pharmaceuticals and Devices

  • How will holistic decision support impact the landscape?
    (e.g., treatment/testing algorithms, decision making, clinical trials)

Group B.     Diagnostics and   Decision Support

  •   What components will be required to build out holistic solutions?

– Testing technologies

– Information (e.g., associations, outcomes, trial databases, records)

– IT infrastructure for data integration and management, simulation and reporting

  •  How can various components be brought together to build seamless holistic  decision support solutions?

Group C.      Providers and Payers

  •  In which areas should models be deployed over time?
  • Where are clinical and economic arguments  most compelling?

Part 2: Historical Scientific Leaders Memoirs – Realtime Clinical Expert Support

Gil David and Larry Bernstein have developed, in consultation with Prof. Ronald Coifman
of the Yale University Applied Mathematics Program,

A software system that is the equivalent of an intelligent Electronic Health Records Dashboard that

  • provides empirical medical reference and
  • suggests quantitative diagnostics options.

The current design of the Electronic Medical Record (EMR) is a linear presentation of portions of the record

  • by services
  • by diagnostic method, and
  • by date, to cite examples.

This allows perusal through a graphical user interface (GUI) that partitions the information or necessary reports

  • on a workstation, accessed by keying on icons.

This requires that the medical practitioner find the

  • history,
  • medications,
  • laboratory reports,
  • cardiac imaging and
  • EKGs, and
  • radiology in different workspaces.

The introduction of a DASHBOARD has allowed a presentation of

  • drug reactions
  • allergies
  • primary and secondary diagnoses, and
  • critical information

for any patient whose record the caregiver needs to access.

The advantage of this innovation is obvious. The startup problem is deciding what information is presented and

  • how it is displayed, which is a source of variability and a key to its success.

We are proposing an innovation that supersedes the main design elements of a DASHBOARD and utilizes

  • the conjoined syndromic features of the disparate data elements.

So the important determinant of the success of this endeavor is that

  • it facilitates both the workflow and the decision-making process with a reduction of medical error.

Continuing work is in progress to extend these capabilities with model datasets and sufficient data, because

  • the extraction of data from disparate sources will, in the long run, further improve this process.

One instance is the finding of ST depression on EKG coincident with an elevated cardiac biomarker (troponin), particularly in the absence of substantially reduced renal function. The conversion of hematology-based data into useful clinical information requires the establishment of problem-solving constructs based on the measured data.
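A minimal sketch of such a problem-solving construct follows. The function name and the troponin/eGFR cutoffs are illustrative assumptions for exposition, not validated clinical thresholds.

```python
def flag_acute_coronary_pattern(st_depression: bool,
                                troponin_ng_ml: float,
                                egfr_ml_min: float,
                                troponin_cutoff: float = 0.04,
                                egfr_cutoff: float = 30.0) -> bool:
    """Flag the conjunction described above: ST depression coincident with an
    elevated troponin, in the absence of substantially reduced renal function
    (which can raise troponin nonspecifically)."""
    renal_function_adequate = egfr_ml_min >= egfr_cutoff
    return st_depression and troponin_ng_ml > troponin_cutoff and renal_function_adequate

print(flag_acute_coronary_pattern(True, 0.12, 85.0))  # True
print(flag_acute_coronary_pattern(True, 0.12, 20.0))  # False: renal failure confounds
```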

The most commonly ordered test used for managing patients worldwide is the hemogram that often incorporates

  • the review of a peripheral smear.

While the hemogram has undergone progressive modification of the measured features over time, the subsequent expansion of the panel of tests has provided a window into the cellular changes in the

  • production
  • release
  • or suppression

of the formed elements from the blood-forming organ into the circulation. In the hemogram one can view

  • data reflecting the characteristics of a broad spectrum of medical conditions.

Progressive modification of the measured features of the hemogram has delineated characteristics expressed as measurements of

  • size
  • density, and
  • concentration,

resulting in many characteristic features of classification. In the diagnosis of hematological disorders

  • proliferation of marrow precursors, the
  • domination of a cell line, and features of
  • suppression of hematopoiesis

provide a two-dimensional model.  Other dimensions are created by considering

  • the maturity of the circulating cells.

The application of rules-based, automated problem solving should provide a valid approach to

  • the classification and interpretation of the data used to determine a knowledge-based clinical opinion.

The exponential growth of knowledge since the mapping of the human genome has been enabled by parallel advances in applied mathematics that have not been a part of traditional clinical problem solving.

As the complexity of statistical models has increased

  • the dependencies have become less clear to the individual.

Contemporary statistical modeling has a primary goal of finding an underlying structure in studied data sets.
The development of an evidence-based inference engine that can substantially interpret the data at hand and

  • convert it in real time to a “knowledge-based opinion”

could improve clinical decision-making by incorporating

  • multiple complex clinical features as well as duration of onset into the model.

An example of a difficult area for clinical problem solving is found in the diagnosis of SIRS and associated sepsis. SIRS (and associated sepsis) is a costly diagnosis in hospitalized patients.  Failure to diagnose sepsis in a timely manner creates a potential financial and safety hazard.  The early diagnosis of SIRS/sepsis is made by the clinician’s application of defined criteria:

  • temperature
  • heart rate
  • respiratory rate and
  • WBC count

The application of those clinical criteria, however, defines the condition after it has developed and

  • has not provided a reliable method for the early diagnosis of SIRS.
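The consensus rule (two or more of the four criteria) translates directly into code; the cutoffs below follow the commonly cited SIRS definition, but they are shown here only as a sketch of how the criteria become computable.

```python
def sirs_criteria_met(temp_c: float, heart_rate: float,
                      resp_rate: float, wbc_k_per_ul: float) -> bool:
    """Return True when two or more of the four classic SIRS criteria are present."""
    criteria = [
        temp_c > 38.0 or temp_c < 36.0,             # fever or hypothermia
        heart_rate > 90,                            # tachycardia
        resp_rate > 20,                             # tachypnea
        wbc_k_per_ul > 12.0 or wbc_k_per_ul < 4.0,  # leukocytosis or leukopenia
    ]
    return sum(criteria) >= 2

print(sirs_criteria_met(38.6, 112, 24, 15.2))  # True
print(sirs_criteria_met(37.0, 80, 14, 7.5))    # False
```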

The early diagnosis of SIRS may possibly be enhanced by the measurement of biomarkers and physiologic indicators, including

  • transthyretin
  • C-reactive protein
  • procalcitonin
  • mean arterial pressure

Immature granulocyte (IG) measurement has been proposed as a

  • readily available indicator of the presence of granulocyte precursors (left shift).

The use of such markers, obtained by automated systems

  • in conjunction with innovative statistical modeling, provides
  • a promising approach to enhance workflow and decision making.

Such a system utilizes the conjoined syndromic features of

  • disparate data elements with an anticipated reduction of medical error.

How we frame our expectations is so important that it determines

  • the data we collect to examine the process.

In the absence of data to support an assumed benefit, there is no proof of validity, whatever the cost.
This has meaning for

  • hospital operations,
  • for nonhospital laboratory operations,
  • for companies in the diagnostic business, and
  • for planning of health systems.

The problem was stated by L.L. Weed in “Idols of the Mind” (Dec 13, 2006): “a root cause of a major defect in the health care system is that, while we falsely admire and extol the intellectual powers of highly educated physicians, we do not search for the external aids their minds require”.  HIT use has been

  • focused on information retrieval, leaving
  • the unaided mind burdened with information processing.

We deal with problems in the interpretation of data presented to the physician, and how through better

  • design of the software that presents this data the situation could be improved.

The computer architecture that the physician uses to view the results is more often than not presented

  • as the designer would prefer, and not as the end-user would like.

In order to optimize the interface for the physician, the system would have a “front-to-back” design, with
the call-up for any patient ideally consisting of a dashboard design that presents the crucial information

  • that the physician would likely act on in an easily accessible manner.

The key point is that each item used has to be closely related to a corresponding criterion needed for a decision.

Feature Extraction.

This further breakdown in the modern era is determined by genetically characteristic gene sequences
that are transcribed into what we measure.  Eugene Rypka contributed greatly to clarifying the extraction
of features in a series of articles, which

  • set the groundwork for the methods used today in clinical microbiology.

The method he describes is termed S-clustering, and

  • will have a significant bearing on how we can view laboratory data.

He describes S-clustering as extracting features from endogenous data that

  • amplify or maximize structural information to create distinctive classes.

The method classifies by taking the number of features

  • with sufficient variety to map into a theoretic standard.

The mapping is done by

  • a truth table, and each variable is scaled to assign values for each message choice.

The number of messages and the number of choices forms an N-by-N table.  He points out that the message

  • choice in an antibody titer would be converted from 0 + ++ +++ to 0 1 2 3.

Even though there may be a large number of measured values, the variety is reduced

  • by this compression, at some risk of loss of information.
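The titer example above can be sketched as a simple ordinal encoding; the variable names are hypothetical, and real S-clustering involves more than this compression step.

```python
# Rypka-style scaling of message choices: an antibody titer reported as
# 0 / + / ++ / +++ maps onto the ordinal codes 0..3.
TITER_SCALE = {"0": 0, "+": 1, "++": 2, "+++": 3}

def encode_profile(raw, scales):
    """Compress raw qualitative results into a tuple of ordinal codes;
    profiles with the same tuple fall into the same class."""
    return tuple(scales[var][raw[var]] for var in sorted(raw))

scales = {"anti_A": TITER_SCALE, "anti_B": TITER_SCALE}
print(encode_profile({"anti_A": "++", "anti_B": "0"}, scales))  # (2, 0)
```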

Yet the real issue is how a combination of variables falls into a table with meaningful information. We are concerned with accurate assignment into uniquely variable groups by information in test relationships. One determines the effectiveness of each variable by

  • its contribution to information gain in the system.

The reference or null set is the class having no information.  Uncertainty in assigning to a classification is

  • only relieved by providing sufficient information.
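A variable's "contribution to information gain" can be made concrete with Shannon entropy: the gain is the reduction in class uncertainty after splitting on that variable. The toy labels below are invented for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, feature_values):
    """H(class) minus the weighted within-group entropy after splitting
    on one variable: the variable's contribution to classification."""
    n = len(labels)
    groups = {}
    for lab, val in zip(labels, feature_values):
        groups.setdefault(val, []).append(lab)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

labels = ["sirs", "sirs", "none", "none"]
print(information_gain(labels, [1, 1, 0, 0]))  # 1.0 (perfectly informative variable)
print(information_gain(labels, [1, 0, 1, 0]))  # 0.0 (uninformative variable)
```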

The possibility of realizing a good model for approximating the effects of factors supported by data used

  • for inference owes much to the discovery of Kullback–Leibler distance, or “information”, and Akaike
  • found a simple relationship between K-L information and Fisher’s maximized log-likelihood function.

In the last 60 years the application of entropy, comparable to

  • the entropy of physics, to information, noise, and signal processing,
  • has been fully developed by Shannon, Kullback, and others, and has been integrated with modern statistics
  • as a result of the seminal work of Akaike, Leo Goodman, Magidson and Vermunt, and work by Coifman.
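Both quantities named here reduce to short formulas. Below, K-L divergence for discrete distributions and Akaike's criterion (AIC = -2 log L + 2k) are sketched with made-up numbers.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P||Q), in nats, for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def aic(max_log_likelihood, n_params):
    """Akaike information criterion: an estimate of relative K-L information
    loss computed from the maximized log-likelihood and parameter count."""
    return -2.0 * max_log_likelihood + 2.0 * n_params

print(round(kl_divergence([0.5, 0.5], [0.9, 0.1]), 3))  # 0.511
print(aic(max_log_likelihood=-120.0, n_params=3))       # 246.0
```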

Gil David et al. introduced AUTOMATED processing of the data available to the ordering physician, which

  • can be anticipated to have an enormous impact on diagnosis and treatment of perhaps half of the top 20 most common
  • causes of hospital admission that carry a high cost and morbidity.

For example: anemias (iron deficiency, vitamin B12 and folate deficiency, and hemolytic anemia or myelodysplastic syndrome); pneumonia; systemic inflammatory response syndrome (SIRS) with or without bacteremia; multiple organ failure and hemodynamic shock; electrolyte/acid base balance disorders; acute and chronic liver disease; acute and chronic renal disease; diabetes mellitus; protein-energy malnutrition; acute respiratory distress of the newborn; acute coronary syndrome; congestive heart failure; disordered bone mineral metabolism; hemostatic disorders; leukemia and lymphoma; malabsorption syndromes; and cancer(s)[breast, prostate, colorectal, pancreas, stomach, liver, esophagus, thyroid, and parathyroid].

Rudolph RA, Bernstein LH, Babb J: Information-Induction for the diagnosis of myocardial infarction. Clin Chem 1988;34:2031-2038.

Bernstein LH (Chairman), Prealbumin in Nutritional Care Consensus Group. Measurement of visceral protein status in assessing protein and energy malnutrition: standard of care. Nutrition 1995; 11:169-171.

Bernstein LH, Qamar A, McPherson C, Zarich S, Rudolph R. Diagnosis of myocardial infarction: integration of serum markers and clinical descriptors using information theory. Yale J Biol Med 1999; 72: 5-13.

Kaplan L.A.; Chapman J.F.; Bock J.L.; Santa Maria E.; Clejan S.; Huddleston D.J.; Reed R.G.; Bernstein L.H.; Gillen-Goldstein J. Prediction of Respiratory Distress Syndrome using the Abbott FLM-II amniotic fluid assay. The National Academy of Clinical Biochemistry (NACB) Fetal Lung Maturity Assessment Project.  Clin Chim Acta 2002; 326(8): 61-68.

Bernstein LH, Qamar A, McPherson C, Zarich S. Evaluating a new graphical ordinal logit method (GOLDminer) in the diagnosis of myocardial infarction utilizing clinical features and laboratory data. Yale J Biol Med 1999; 72:259-268.

Bernstein L, Bradley K, Zarich SA. GOLDmineR: Improving models for classifying patients with chest pain. Yale J Biol Med 2002; 75, pp. 183-198.

Ronald Raphael Coifman and Mladen Victor Wickerhauser. Adapted Waveform Analysis as a Tool for Modeling, Feature Extraction, and Denoising. Optical Engineering, 33(7):2170–2174, July 1994.

R. Coifman and N. Saito. Constructions of local orthonormal bases for classification and regression. C. R. Acad. Sci. Paris, 319 Série I:191-196, 1994.

Realtime Clinical Expert Support and Validation System

We have developed a software system that is the equivalent of an intelligent Electronic Health Records Dashboard that provides empirical medical reference and suggests quantitative diagnostics options.

The primary purpose is to

  1. gather medical information,
  2. generate metrics,
  3. analyze them in realtime and
  4. provide a differential diagnosis,
  5. meeting the highest standard of accuracy.

The system builds its unique characterization and provides a list of other patients that share this unique profile, therefore utilizing the vast aggregated knowledge (diagnosis, analysis, treatment, etc.) of the medical community. The

  • main mathematical breakthroughs are provided by accurate patient profiling and inference methodologies
  • in which anomalous subprofiles are extracted and compared to potentially relevant cases.
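As a toy illustration of matching a new patient to "other patients that share this unique profile", the sketch below ranks stored cases by distance between standardized lab profiles. The case identifiers and vectors are invented, and the actual system's profiling and anomaly-extraction mathematics is far more sophisticated.

```python
import math

def similar_cases(patient, cohort, k=2):
    """Rank previously characterized cases by Euclidean distance to the
    new patient's (standardized) lab profile and return the k closest."""
    ranked = sorted(cohort, key=lambda case_id: math.dist(patient, cohort[case_id]))
    return ranked[:k]

cohort = {
    "case_17": [1.1, 0.90, 2.0],
    "case_42": [0.2, 0.10, 0.3],
    "case_88": [1.0, 0.95, 1.9],
}
print(similar_cases([1.05, 0.95, 1.95], cohort))  # ['case_88', 'case_17']
```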

As the model grows and its knowledge database is extended, the diagnostic and prognostic outputs become more accurate and precise. We anticipate that the effect of implementing this diagnostic amplifier would result in

  • higher physician productivity at a time of great human resource limitations,
  • safer prescribing practices,
  • rapid identification of unusual patients,
  • better assignment of patients to observation, inpatient beds,
    intensive care, or referral to clinic,
  • shortened lengths of patients’ ICU stays and bed days.

The main benefit is a real time assessment as well as diagnostic options based on

  • comparable cases,
  • flags for risk and potential problems

as illustrated in the following case acquired on 04/21/10. The patient was diagnosed by our system with severe SIRS at a grade of 0.61.

Graphical presentation of patient status

The patient was treated for SIRS and the blood tests were repeated during the following week. The full combined record of our system’s assessment of the patient, as derived from the further hematology tests, is illustrated below. The yellow line shows the diagnosis that corresponds to the first blood test (as also shown in the image above). The red line shows the next diagnosis that was performed a week later.

Progression changes in patient ICU stay with SIRS

Chemistry of Herceptin [Trastuzumab] is explained with images in

http://www.chm.bris.ac.uk/motm/herceptin/index_files/Page450.htm


REFERENCES

The Cost Burden of Disease: U.S. and Michigan. CHRT Brief, January 2010.
www.chrt.org

The National Hospital Bill: The Most Expensive Conditions by Payer, 2006. HCUP Brief #59.

W Ruts, S De Deyne, E Ameel, W Vanpaemel,T Verbeemen, And G Storms. Dutch norm data for 13 semantic categories and 338 exemplars. Behavior Research Methods, Instruments, & Computers 2004; 36 (3): 506–515.

De Deyne, S Verheyen, E Ameel, W Vanpaemel, MJ Dry, WVoorspoels, and G Storms.  Exemplar by feature applicability matrices and other Dutch normative data for semantic concepts.  Behavior Research Methods 2008; 40 (4): 1030-1048

Landauer, T. K., Ross, B. H., & Didner, R. S. (1979). Processing visually presented single words: A reaction time analysis [Technical memorandum].  Murray Hill, NJ: Bell Laboratories. Lewandowsky, S. (1991).

Weed L. Automation of the problem oriented medical record. NCHSR Research Digest Series DHEW. 1977;(HRA)77-3177.

Naegele TA. Letter to the Editor. Amer J Crit Care 1993:2(5):433.

Retinal prosthetic strategy with the capacity to restore normal vision, Sheila Nirenberg and Chethan Pandarinath

http://www.pnas.org/content/109/37/15012


Other related articles published in http://pharmaceuticalintelligence.com include the following:

  • The Automated Second Opinion Generator (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2012/08/13/the-automated-second-opinion-generator/
  • The electronic health record: How far we have travelled and where is journeys end (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2012/09/21/the-electronic-health-record-how-far-we-have-travelled-and-where-is-journeys-end/
  • The potential contribution of informatics to healthcare is more than currently estimated (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2013/02/18/the-potential-contribution-of-informatics-to-healthcare-is-more-than-currently-estimated/
  • Clinical Decision Support Systems for Management Decision Making of Cardiovascular Diseases (Justin Pearlman, MD, PhD, FACC and Aviva Lev-Ari, PhD, RN)
    https://pharmaceuticalintelligence.com/2013/05/04/cardiovascular-diseases-decision-support-systems-for-disease-management-decision-making/
  • Demonstration of a diagnostic clinical laboratory neural network applied to three laboratory data conditioning problems (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2012/08/13/demonstration-of-a-diagnostic-clinical-laboratory-neural-network-agent-applied-to-three-laboratory-data-conditioning-problems/
  • CRACKING THE CODE OF HUMAN LIFE: The Birth of BioInformatics & Computational Genomics (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2014/08/30/cracking-the-code-of-human-life-the-birth-of-bioinformatics-computational-genomics/
  • Genetics of conduction disease atrioventricular AV conduction disease block gene mutations transcription excitability and energy homeostasis (Aviva Lev-Ari, PhD, RN)
    https://pharmaceuticalintelligence.com/2013/04/28/genetics-of-conduction-disease-atrioventricular-av-conduction-disease-block-gene-mutations-transcription-excitability-and-energy-homeostasis/
  • Identification of biomarkers that are related to the actin cytoskeleton (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2012/12/10/identification-of-biomarkers-that-are-related-to-the-actin-cytoskeleton/
  • Regression: A richly textured method for comparison of predictor variables (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2012/08/14/regression-a-richly-textured-method-for-comparison-and-classification-of-predictor-variables/
  • Diagnostic evaluation of SIRS by immature granulocytes (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2012/08/02/diagnostic-evaluation-of-sirs-by-immature-granulocytes/
  • Big data in genomic medicine (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2012/12/17/big-data-in-genomic-medicine/
  • Automated inferential diagnosis of SIRS, sepsis, septic shock (Larry H Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2012/08/01/automated-inferential-diagnosis-of-sirs-sepsis-septic-shock/
  • A Software Agent for Diagnosis of ACUTE MYOCARDIAL INFARCTION (Isaac E. Mayzlin, Ph.D., David Mayzlin and Larry H. Bernstein, MD, FCAP)
    https://pharmaceuticalintelligence.com/2012/08/12/1815/
  • Artificial Vision: Cornell and Stanford Researchers crack Retinal Code (Aviva Lev-Ari, PhD, RN)
    https://pharmaceuticalintelligence.com/2012/08/15/1946/
  • Vinod Khosla: 20 doctor included speculations, musings of a technology optimist or technology will replace 80 percent of what doctors do (Aviva Lev-Ari, PhD, RN)
    https://pharmaceuticalintelligence.com/2013/05/13/vinod-khosla-20-doctor-included-speculations-musings-of-a-technology-optimist-or-technology-will-replace-80-of-what-doctors-do/
  • Biomaterials Technology: Models of Tissue Engineering for Reperfusion and Implantable Devices for Revascularization (Larry H Bernstein, MD, FACP and Aviva Lev-Ari, PhD, RN)
    https://pharmaceuticalintelligence.com/2013/05/05/bioengineering-of-vascular-and-tissue-models/
  • The Heart: Vasculature Protection – A Concept-based Pharmacological Therapy including THYMOSIN (Aviva Lev-Ari, PhD, RN, 2/28/2013)
    https://pharmaceuticalintelligence.com/2013/02/28/the-heart-vasculature-protection-a-concept-based-pharmacological-therapy-including-thymosin/
  • FDA Pending 510(k) for The Latest Cardiovascular Imaging Technology (Aviva Lev-Ari, PhD, RN, 1/28/2013)
    https://pharmaceuticalintelligence.com/2013/01/28/fda-pending-510k-for-the-latest-cardiovascular-imaging-technology/
  • PCI Outcomes, Increased Ischemic Risk associated with Elevated Plasma Fibrinogen not Platelet Reactivity (Aviva Lev-Ari, PhD, RN, 1/10/2013)
    https://pharmaceuticalintelligence.com/2013/01/10/pci-outcomes-increased-ischemic-risk-associated-with-elevated-plasma-fibrinogen-not-platelet-reactivity/
  • The ACUITY-PCI score: Will it Replace Four Established Risk Scores — TIMI, GRACE, SYNTAX, and Clinical SYNTAX (Aviva Lev-Ari, PhD, RN, 1/3/2013)
    https://pharmaceuticalintelligence.com/2013/01/03/the-acuity-pci-score-will-it-replace-four-established-risk-scores-timi-grace-syntax-and-clinical-syntax/
  • Coronary artery disease in symptomatic patients referred for coronary angiography: Predicted by Serum Protein Profiles (Aviva Lev-Ari, PhD, RN, 12/29/2012)
    https://pharmaceuticalintelligence.com/2012/12/29/coronary-artery-disease-in-symptomatic-patients-referred-for-coronary-angiography-predicted-by-serum-protein-profiles
  • New Definition of MI Unveiled, Fractional Flow Reserve (FFR)CT for Tagging Ischemia (Aviva Lev-Ari, PhD, RN, 8/27/2012)
    https://pharmaceuticalintelligence.com/2012/08/27/new-definition-of-mi-unveiled-fractional-flow-reserve-ffrct-for-tagging-ischemia/

Additional Related articles

  • Hospital EHRs Inadequate for Big Data; Need for Specialized -Omics Systems (labsoftnews.typepad.com)
  • Apple Inc. (AAPL), QUALCOMM, Inc. (QCOM): Disruptions Needed (insidermonkey.com)
  • Netsmart Names Dr. Ian Chuang Senior Vice President, Healthcare Informatics and Chief Medical Officer (prweb.com)
  • Strategic partnership signals new age of stratified medicine (prweb.com)
  • Personalized breast cancer therapeutic with companion diagnostic poised for clinical trials in H2 (medcitynews.com)

Read Full Post »

Metabolomics: its Applications in Food and Nutrition Research

Reporter and Curator: Sudipta Saha, Ph.D.


Metabolomics is a relatively new field of “omics” research concerned with the high-throughput identification and quantification of small molecule (<1500 Da) metabolites in the metabolome. The metabolome is formally defined as the collection of all small molecule metabolites or chemicals that can be found in a cell, organ or organism. These small molecules can include a range of endogenous and exogenous chemical entities such as peptides, amino acids, nucleic acids, carbohydrates, organic acids, vitamins, polyphenols, alkaloids, minerals and just about any other chemical that can be used, ingested or synthesized by a given cell or organism.

Metabolomics is ideally positioned to be used in many areas of food science and nutrition research including food component analysis, food quality/authenticity assessment, food consumption monitoring and physiological monitoring in food intervention studies. However, the potential impact of metabolomics is still limited by two factors: (1) technology and (2) databases. In terms of instrumentation, it is clear that significant improvements need to be made to make metabolite detection and quantification technology more robust, automated and comprehensive. While promising advances have been made, current techniques are only capable of detecting perhaps 1/10th of the relevant metabolome. This expanded breadth and depth of coverage is particularly important in food and nutrition studies.

Many more reference spectral or chromatographic databases on metabolites, food components and phytochemicals need to be developed and made public. It is only through these databases that nutritionally relevant compounds can be routinely identified or quantified. Indeed, a comprehensive effort, similar to that undertaken to annotate the human metabolome, needs to be made to complete and annotate the “food metabolome”. Similar efforts also need to be directed towards creating publicly accessible, comprehensive nutritional phenotype databases that include quantitative metabolomic (and other omic) data collected from diet-challenge or food intervention experiments. While these kinds of endeavours may take years to complete and cost millions of dollars, hopefully the food science community (and its funding agencies) will find a way of coordinating its activities to complete these efforts. Indeed, having a public resource like a food metabolome database or a nutritional phenotype database could be as valuable to food scientists as GenBank has been to molecular biologists.

Source References:

http://www.sciencedirect.com/science/article/pii/S0924224408000770

http://www.sciencedirect.com/science/article/pii/B9780123945983000010

http://www.sciencedirect.com/science/article/pii/S092422440900226X

http://www.sciencedirect.com/science/article/pii/S1359644605036093

http://www.sciencedirect.com/science/article/pii/B9780080885049000520

http://www.sciencedirect.com/science/article/pii/B9780123744135000051

Other articles related to this topic were published on this Open Access Online Scientific Journal, including the following:

Ca2+ signaling: transcriptional control

Larry H. Bernstein, MD, FCAP, Reporter, RN 03/06/2013

http://pharmaceuticalintelligence.com/2013/03/06/ca2-signaling-transcriptional-control/

Harnessing Personalized Medicine for Cancer Management, Prospects of Prevention and Cure: Opinions of Cancer Scientific Leaders @ http://pharmaceuticalintelligence.com

Aviva Lev-Ari, PhD, RN 01/12/2013

http://pharmaceuticalintelligence.com/2013/01/12/harnessing-personalized-medicine-for-cancer-management-prospects-of-prevention-and-cure-opinions-of-cancer-scientific-leaders-httppharmaceuticalintelligence-com/

Breakthrough Digestive Disorders Research: Conditions affecting the Gastrointestinal Tract.

Aviva Lev-Ari, PhD, RN 12/12/2012

http://pharmaceuticalintelligence.com/2012/12/12/breakthrough-digestive-disorders-research-conditions-affecting-the-gastrointestinal-tract/

A Second Look at the Transthyretin Nutrition Inflammatory Conundrum

Larry H. Bernstein, MD, FCAP, Reporter, RN 12/03/2012

http://pharmaceuticalintelligence.com/2012/12/03/a-second-look-at-the-transthyretin-nutrition-inflammatory-conundrum/

Metabolic drivers in aggressive brain tumors

Prabodh Kandala, PhD, RN 11/11/2012

http://pharmaceuticalintelligence.com/2012/11/11/metabolic-drivers-in-aggressive-brain-tumors/

Metabolite Identification Combining Genetic and Metabolic Information: Genetic association links unknown metabolites to functionally related genes

Aviva Lev-Ari, PhD, RN 10/22/2012

http://pharmaceuticalintelligence.com/2012/10/22/metabolite-identification-combining-genetic-and-metabolic-information-genetic-association-links-unknown-metabolites-to-functionally-related-genes/

Advances in Separations Technology for the “OMICs” and Clarification of Therapeutic Targets

Larry H. Bernstein, MD, FCAP, Reporter, RN 10/22/2012

http://pharmaceuticalintelligence.com/2012/10/22/advances-in-separations-technology-for-the-omics-and-clarification-of-therapeutic-targets/

Expanding the Genetic Alphabet and linking the genome to the metabolome

Larry H. Bernstein, MD, FCAP, Reporter, RN 09/24/2012

http://pharmaceuticalintelligence.com/2012/09/24/expanding-the-genetic-alphabet-and-linking-the-genome-to-the-metabolome/

Therapeutic Targets for Diabetes and Related Metabolic Disorders

Aviva Lev-Ari, PhD, RN 08/20/2012

http://pharmaceuticalintelligence.com/2012/08/20/therapeutic-targets-for-diabetes-and-related-metabolic-disorders/

The Automated Second Opinion Generator

Larry H. Bernstein, MD, FCAP, Reporter, RN 08/13/2012

http://pharmaceuticalintelligence.com/2012/08/13/the-automated-second-opinion-generator/

 

Read Full Post »

Reporter: Aviva Lev-Ari, PhD, RN

 

Medicare Reveals Hospital Charge Information

By David Pittman, Washington Correspondent, MedPage Today

Published: May 08, 2013

 

WASHINGTON — The Obama administration made public on Wednesday previously unpublished hospital charges for the 100 most common inpatient treatments in 2011, saying a similar release of physician data is on the horizon.

The massive data file reveals wide variation in charges for these 100 services listed in hospitals’ “chargemasters” — industry jargon for what hospitals charge. The data set represents added transparency the administration hopes will influence consumer behavior.

“Making this available for free for the first time will save consumers money by arming them with information that can help them make better choices,” Health and Human Services Secretary Kathleen Sebelius said in a call with reporters Wednesday.

The data only include inpatient hospital services, but when asked about physician fees and other inpatient services, a top Centers for Medicare & Medicaid Services (CMS) official said those data could come later as the agency expands its price transparency initiative.

“We don’t have a set timetable for expansion for this data release,” Jonathan Blum, PhD, acting principal deputy administrator at CMS, said on the same call as Sebelius. “I think it is fair to say we intend to build upon this data release.”

Blum said multiple times in his call with reporters that CMS will study the impact this information has on consumer behavior and what value the public places on it.

Journalist Steven Brill — who wrote a March 4 Time magazine cover story on healthcare-pricing practices largely credited for CMS’ action Wednesday — said in a blog that Sebelius and CMS should next focus on outpatient services.

“The Feds need to publish chargemaster and Medicare pricing for the most frequent outpatient procedures and diagnostic tests at clinics — two huge profit venues in the medical world,” Brill wrote. “This will be harder — the government doesn’t collect that data as comprehensively — but those outpatient centers and clinics provide a huge portion of American medical care.”

A quick scan of the hospital data released Wednesday reveals wide variation for the same procedure in the same town.

For example, St. Dominic Hospital in Jackson, Miss., charged nearly $26,000 to implant a pacemaker while the University of Mississippi Medical Center across town charged more than $57,000 for the same procedure.

In Washington, the George Washington University Hospital charged nearly $69,000 for a lower-leg joint replacement without major complications. That same procedure cost just under $30,000 at Sibley Memorial Hospital — a nonprofit community hospital 5 miles away.

A joint replacement ranged from $5,300 at a hospital in Ada, Okla., to $223,000 at a hospital in Monterey Park, Calif., CMS said.

“Hospitals that charge two or three times the going rate rightfully face greater scrutiny,” Sebelius said.

Said Blum, “We’re really trying to help elevate the conversation and continue the conversation and to ask questions why there is so much variation.”

Common explanations for the varying costs — patients’ health status, hospital payer mix, teaching status — don’t seem accurate or clear from data CMS released, Blum said, adding that making such information public will help researchers, consumers, and others better ask questions and engage in debate over costs.

Opponents to such transparency note that chargemaster prices are irrelevant to most patients. Private insurance companies and Medicare negotiate their own prices with hospitals.

Instead, it’s only the uninsured who face the prices on the chargemaster.

“Most perniciously, uninsured people are the ones who usually pay the highest prices for their hospital care,” Ron Pollack, executive director of the liberal patient rights group Families USA here, said in a statement. “It is absurd – and, indeed, unconscionable – that the people least capable of paying for their hospital care bear the largest, and often unaffordable, cost burdens.”

The American Hospital Association (AHA) said healthcare’s “charge” system is a matter of financing that urgently needs updating.

“The complex and bewildering interplay among ‘charges,’ ‘rates,’ ‘bills’ and ‘payments’ across dozens of payers, public and private, does not serve any stakeholder well, including hospitals,” AHA president and chief executive Rich Umbdenstock said in a statement. “This is especially true when what is most important to a patient is knowing what his or her financial responsibility will be.”

The Federation of American Hospitals declined to comment.

 

David Pittman

 

David Pittman is MedPage Today’s Washington Correspondent, following the intersection of policy and healthcare. He covers Congress, FDA, and other health agencies in Washington, as well as major healthcare events. David holds bachelor’s degrees in journalism and chemistry from the University of Georgia and previously worked at the Amarillo Globe-News in Texas, Chemical & Engineering News and, most recently, FDAnews.

 

Read Full Post »

Clinical Decision Support Systems for Management Decision Making of Cardiovascular Diseases

Author, and Content Consultant to e-SERIES A: Cardiovascular Diseases: Justin Pearlman, MD, PhD, FACC

and

Curator: Aviva Lev-Ari, PhD, RN


WordCloud Image Produced by Adam Tubman

Clinical Decision Support Systems (CDSS)

A clinical decision support system (CDSS) is an interactive decision support system (DSS). It generally relies on computer software designed to assist physicians and other health professionals with decision-making tasks, such as determining a diagnosis or selecting further tests or treatments. A functional definition proposed by Robert Hayward of the Centre for Health Evidence defines CDSS as follows: “Clinical Decision Support systems link health observations with health knowledge to influence health choices by clinicians for improved health care”. CDSS is a major topic in artificial intelligence in medicine.

Vinod Khosla of Khosla Ventures, in a Fortune Magazine article, “Technology will replace 80% of what doctors do” (December 4, 2012), wrote about CDSS as a harbinger of science in medicine.

Computer-assisted decision support is in its infancy, but we have already begun to see meaningful impact on healthcare. Meaningful use of computer systems is now rewarded under the HITECH Act. Studies have demonstrated the ability of computerized clinical decision support systems to lower diagnostic errors of omission significantly, by directly countering cognitive bias. Isabel is a differential diagnosis tool and, according to a Stony Brook study, matched the diagnoses of experienced clinicians in 74% of complex cases. The system improved to a 95% match after a more rigorous entry of patient data. The IBM supercomputer, Watson, after beating all humans at the intelligence-based task of playing Jeopardy, is now turning its attention to medical diagnosis. It can process natural language questions and is fast at parsing high volumes of medical information, reading and understanding 200 million pages of text in 3 seconds.

Examples of CDSS

  1. CADUCEUS
  2. DiagnosisPro
  3. Dxplain
  4. MYCIN
  5. RODIA

VIEW VIDEO

“When Should a Physician Deviate from the Diagnostic Decision Support Tool and What Are the Associated Risks?”

Introduction

Justin D. Pearlman, MD, PhD

A Decision Support System consists of one or more tools to help achieve good decisions. For example, decisions that can benefit from DSS include whether or not to undergo surgery, whether or not to undergo a stress test first, whether or not to have an annual mammogram starting at a particular age, or a computed tomography (CT) to screen for lung cancer, whether or not to utilize intensive care support such as a ventilator, chest shocks, chest compressions, forced feeding, strong antibiotics and so on versus care directed to comfort measures only without regard to longevity.

Any DSS can be viewed like a digestive tract, chewing on input, and producing output, and like the digestive tract, the output may only be valuable to a farmer. A well designed DSS is efficient in the input, timely in its processing and useful in the output. Mathematically, a DSS is a model with input parameters and an output variable or set of variables that can be used to determine an action. The input can be categorical (alive, dead), semi-quantitative (cold-warm-hot), or quantitative (temperature, systolic blood pressure, heart rate, oxygen saturation). The output can be binary (yes-no) or it can express probabilities or confidence intervals.

The process of defining specifications for a function and then deriving a useful function is called mathematical modeling. We will derive the function for “average” as an example. By way of specifications, we want to take a list of numbers as input, and come out with a single number that represents the middle of the pack or “central tendency.”   The order of the list should not matter, and if we change scales, the output should scale the same way. For example, if we use centimeters instead of inches, and we apply 2.54 centimeters to an inch, then the output should increase by the multiplier 2.54. If the list of numbers are all the same then the output should be the consistent value. Representing these specifications symbolically:

1. order doesn’t matter: f(a,b) = f(b,a), where “a” and “b” are input values, “f” is the function.

2. multipliers pass through (linearity):  f(ka,kb)=k f(a,b), where k is a scalar e.g. 2.54 cm/inch.

3. identity:  f(a,a,a,…) = a

Properties 1 and 2 lead us to consider linear functions consisting of sums and multipliers: f(a,b,c)=Aa+Bb+Cc …, where the capital letters are multipliers by “constants” – numbers that are independent of the list values a,b,c, and since the order should not matter, we simplify to f(a,b,c)=K (a+b+c+…) because a constant multiplier K makes order not matter. Property 3 forces us to pick K = 1/N where N is the length of the list. These properties lead us to the mathematical solution: average = sum of list of numbers divided by the length of the list.
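The derivation above can be checked directly. A minimal Python sketch, verifying each of the three properties numerically:

```python
def average(values):
    """The mean: the unique function satisfying the three properties above."""
    return sum(values) / len(values)

# Property 1: order doesn't matter
assert average([3, 7]) == average([7, 3])

# Property 2: a scalar multiplier passes through (e.g. 2.54 cm per inch)
assert abs(average([2.54 * 3, 2.54 * 7]) - 2.54 * average([3, 7])) < 1e-9

# Property 3: identity on a constant list
assert average([5, 5, 5]) == 5
```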

A coin flip is a simple DSS: heads I do it, tails I don’t. The challenge of a good DSS is to perform better than random choice and also perform better (more accurately, more efficiently, more reliably, more timely and/or under more adverse conditions) than unassisted human decision making.

Therefore, I propose the following guiding principles for DSS design: choose inputs wisely (accessible, timely, efficient, relevant), determine to what you want output to be sensitive AND to what you want output to be insensitive, and be very clear about your measures of success.

For example, consider designing a DSS to determine whether a patient should receive the full range of support capabilities of an intensive care unit (ICU), or not. Politicians have cited the large bump in the cost of the last year of life as an opportunity to reduce costs of healthcare, and now pay primary care doctors to encourage patients to establish advance directives not to use ICU services. From the DSS standpoint, the reasoning is flawed because the decision not to use ICU services should be sensitive to benefit as well as cost, commonly called cost-benefit analysis. If we measure success of ICU services by the benefit of quality life net gain (QLNG, “quailing”), measured in quality life-years (QuaLYs), and achieve 50% success with that, then the cost per QuaLY measures the cost-benefit of ICU services. In various cost-benefit decisions, the US Congress has decided to proceed if the cost is under $20,000 to $100,000 per QuaLY. If ICU services are achieving such a cost-benefit, then it is not logical to summarily block such services in advance. Rather, the ways to reduce those costs include improving the cost efficiency of ICU care, and improving the decision-making of who will benefit.

An example of a DSS is the prediction of plane failure from a thousand measurements of strain and function of various parts of an airplane. The desired output is probability of failure to complete the next flight safely. Cost-Benefit analysis then establishes what threshold or operating point merits grounding the plane for further inspection and preventative maintenance repairs. If a DSS reports probability of failure, then the decision (to ground the plane) needs to establish a threshold at which a certain probability triggers the decision to ground the plane.

The notion of an operating point brings up another important concept in decision support. At first blush, one might think the success of a DSS is determined by its ability to correctly identify a predicted outcome, such as futility of ICU care (when will the end result be no quality life net gain). The flaw in that measure of success is that it depends on prevalence in the study group. As an extreme example, if you study a group of patients with fatal gunshot wounds to the head, none will benefit and the DSS requirement is trivial and any DSS that says no for that group has performed well. At the other extreme, if all patients become healthy, the DSS requirement is also trivial, just say yes. Therefore the proper assessment of a DSS should pay attention to the prevalence and the operating point.

The impact of prevalence and operating point on decision-making is addressed by receiver-operator curves. Consider looking at the blood concentration of Troponin-I (TnI) as the sole determinant to decide who is having a heart attack.  If one plots a graph with horizontal axis troponin level and vertical axis ultimate proof of heart attack, the percentage of hits will generally be higher for higher values of TnI. To create such a graph, we compute a “truth table” which reports whether the test was above or below a decision threshold operating point, and whether or not the disease (heart attack) was in fact present:

TRUTH TABLE

                    Disease      Not Disease
Test Positive       TP           FP
Test Negative       FN           TN
Total               TP+FN        FP+TN

The sensitivity to the disease is the true positive rate (TPR), the percentage of all disease cases that are ranked by the decision support as positive: TPR = TP/(TP+FN). 100% sensitivity can be achieved trivially by lowering the threshold for a positive test to zero, at a cost.  While sensitivity is necessary for success it is not sufficient. In addition to wanting sensitivity to disease, we want to avoid labeling non-disease as disease. That is often measured by specificity, the true negative rate (TNR), the percentage of those without disease who are correctly identified as not having disease: TNR = TN/(FP+TN). I propose also we define the complement to specificity, the anti-sensitivity, as the false positive rate (FPR), FPR = FP/(FP+TN) = 1 – TNR. Anti-sensitivity is a penalty cost of lowering the diagnostic threshold to boost sensitivity, as the concomitant rise in anti-sensitivity means a growing number of non-disease subjects are labeled as having disease. We want high sensitivity to true disease without high anti-sensitivity to false disease, and we want to be insensitive to common distractors. In these formulas, note that false negatives (FN) are True for disease, and false positives (FP) are False for disease, so the denominators add FN to TP for total True disease, and add FP to TN for total False for disease.
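These rates follow directly from the truth-table counts. A minimal sketch in Python, using hypothetical counts (not from any real troponin dataset):

```python
def truth_table_metrics(tp, fp, fn, tn):
    """Compute the rates defined above from truth-table counts."""
    sensitivity = tp / (tp + fn)        # true positive rate (TPR)
    specificity = tn / (fp + tn)        # true negative rate (TNR)
    anti_sensitivity = fp / (fp + tn)   # false positive rate = 1 - TNR
    return sensitivity, specificity, anti_sensitivity

# Hypothetical counts at one troponin threshold
tpr, tnr, fpr = truth_table_metrics(tp=90, fp=20, fn=10, tn=80)
assert abs(fpr - (1 - tnr)) < 1e-12  # anti-sensitivity complements specificity
```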

The graph in figure 1 justifies the definition of anti-sensitivity. It is an ROC or “Receiver-Operator Curve” which is a plot of sensitivity versus anti-sensitivity for different diagnostic thresholds of a test (operating points). Note, higher sensitivity comes at the cost of higher anti-sensitivity. Where to operate (what threshold to use for diagnosis) can be selected according to cost-benefit analysis of sensitivity versus anti-sensitivity (and specificity).

Figure 1 ROC (Receiver-Operator Curve): Graph of sensitivity (true positive rate) versus anti-sensitivity (false positive rate), computed by changing the operating point (the threshold for declaring a test numeric value positive for disease). A high area under the curve (AUC) is favorable because it means less anti-sensitivity for high sensitivity (the upper left corner of the shaded area is further left and higher). The dots on the curve are operating points. An inclusive operating point (high on the curve, high sensitivity) is used for screening tests, whereas an exclusive operating point (low on the curve, low anti-sensitivity) is used for definitive diagnosis.
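The operating-point sweep that traces out such a curve can be sketched as follows; the troponin-like values and thresholds below are invented purely for illustration:

```python
def roc_points(scores_disease, scores_healthy, thresholds):
    """Sweep operating points: at each threshold, a test value >= threshold
    is called positive. Returns (anti-sensitivity, sensitivity) pairs,
    i.e. (FPR, TPR), tracing out the ROC curve."""
    points = []
    for t in thresholds:
        tpr = sum(s >= t for s in scores_disease) / len(scores_disease)
        fpr = sum(s >= t for s in scores_healthy) / len(scores_healthy)
        points.append((fpr, tpr))
    return points

# Hypothetical test values for diseased and healthy subjects
disease = [0.8, 1.5, 2.2, 3.0, 4.1]
healthy = [0.1, 0.3, 0.5, 0.9, 1.6]
curve = roc_points(disease, healthy, thresholds=[0.0, 1.0, 2.0, 5.0])
# Lowering the threshold raises sensitivity and anti-sensitivity together
assert curve[0] == (1.0, 1.0)   # threshold 0: everything called positive
assert curve[-1] == (0.0, 0.0)  # threshold 5: everything called negative
```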

Cost-benefit analysis generally is based on a semi-lattice, or upside-down branching tree, which represents all choices and outcomes. It is important to include all branches down to final outcomes. For example, if the test is a mammogram to screen for breast cancer, the cost is not just the cost of the test, nor is the benefit simply “early diagnosis.” The cost-benefit calculation forces us to put a numerical value on the impact, such as a financial cost of an avoidable death, or to express the result in terms of expected quality life-years. The cost, however, is not just the cost of the mammogram but also of downstream events, such as the cost of the needle biopsies for the suspicious “positives,” and so on.

Figure 2 Semi-lattice Decision Tree: Starting from all patients, create a branch point for your test result, and add further branch points for any subsequent step-wise outcomes until you reach the “bottom line.” Assign a value to each, resulting in a numerical net cost and net benefit. If tests invoke risks (for example, needle biopsy of lung can collapse a lung and require hospitalization for a chest tube), then insert branch points for whether the complication occurs or not, as the treatment of a complication counts as part of the cost. The intermediary nodes can have probability of occurrence as their numeric factor, and the bottom line can apply the net probability of the path leading to a value as a multiplier to the dollar value (a 10% chance of costing $10,000 counts as an expectation cost of 0.1 x 10,000 = $1,000).
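The bottom-line arithmetic in the caption generalizes to a small recursive fold over the tree. A minimal sketch, using the caption's hypothetical 10%-of-$10,000 complication branch:

```python
def expected_value(node):
    """Fold a semi-lattice decision tree bottom-up. A leaf is a dollar
    value; an internal node is a list of (probability, subtree) branches."""
    if isinstance(node, (int, float)):
        return node
    return sum(p * expected_value(child) for p, child in node)

# Hypothetical biopsy branch: 10% chance of a $10,000 complication
biopsy = [(0.10, 10_000), (0.90, 0)]
assert abs(expected_value(biopsy) - 1_000) < 1e-6

# Branches nest: e.g. a screening arm whose 5% "positive" path leads to the biopsy
screen = [(0.05, biopsy), (0.95, 0)]
assert abs(expected_value(screen) - 50) < 1e-6
```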

A third area of discussion is the statistical power of a DSS – how reliable is it in the application that you care about? Commonly DSS design is contrary to common statistical applications which address significance of a deviation in a small number of variables that have been measured many times in a large population. Instead, DSS often uses many variables to fully describe or characterize the status of a small population. For example, thousands of different measurements may be performed on a few dozen airplanes, aiming to predict when the plane should be grounded for repairs. A similar inversion of numbers – numerous variables, small number of cases – is common in genomics studies.

The success of a DSS is measured by its predictive value compared to outcomes or other measures of success. Thus measures of success include positive predictive value, negative predictive value, and confidence. A major problem with DSS is the inversion of the usually desired ratio of repetitions to measurement variables. When you get a single medical lab test, you have a single measurement value, such as a potassium level, and a large number of normal subjects for comparison. If we knew the mean μ and standard deviation σ that describe the distribution of normal values in the population at large, then we could compute the confidence in the decision to call our observed value abnormal based on the normal distribution:

f(x) = \frac{1}{\sigma\sqrt{2\pi}} e^{ -\frac{(x-\mu)^2}{2\sigma^2} }

A value may be deemed distinctive based on a 95% confidence interval if it falls outside of the norm, say by more than twice the standard deviation σ, thereby establishing that it is unlikely to be random as the distance from the mean excludes 95% of the normal distribution.

The determination of confidence in an observed set of results stems from maximized likelihood estimates. Earlier in this article we described how to derive the mean, or center, of a set of measurements. A similar analysis can derive the standard deviation (square root of variance) as a measure of spread around the mean, as well as other descriptive statistics based on sample values. These formulas describe the distribution of sample values about the mean. The calculation is based on a simple inversion. If we knew the mean and variance of a population of values for a measurement, we could calculate the likelihood of each new measurement falling a particular distance from the mean, and we could calculate the combined likelihood for a set of observed values. Maximized Likelihood Estimation (MLE) simply inverts the method of calculation. Instead of treating the mean and variance as known, we treat the sample observations as the known data and estimate the spread about an unknown mean from a set of N normal samples x_i (one can apply calculus to maximize the joint likelihood of the observations x_i under the frequency distribution above, in order to derive the following formulas):

\sigma = \sqrt{\frac{1}{N}\left[(x_1-\mu)^2 + (x_2-\mu)^2 + \cdots + (x_N - \mu)^2\right]}, {\rm \ \ where\ \ } \mu = \frac{1}{N} (x_1 + \cdots + x_N),

The frequency distribution (a function of mean and spread) reports the frequency of observing x if it is drawn from a population with the specified mean μ and standard deviation σ . We can invert that by treating the observations, x, as known and the mean μ and standard deviation σ unknown, then calculate the values μ and  σ that maximize the likelihood of our sample set as coming from the dynamically described population.
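A minimal sketch of these maximum-likelihood formulas in Python; the sample values are arbitrary:

```python
import math

def mle_normal(samples):
    """Maximum-likelihood estimates of mu and sigma for normal samples,
    matching the formulas above (note the 1/N divisor, not 1/(N-1))."""
    n = len(samples)
    mu = sum(samples) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in samples) / n)
    return mu, sigma

mu, sigma = mle_normal([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
# For this sample set: mu = 5.0, sigma = 2.0
```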

In DSS there is typically an inversion of the usual requirements: the number of samples is small rather than large, and the number of variables is large rather than small. This inversion has major consequences for data confidence. If you measure 14 independent variables instead of one, each at 95% confidence, the net confidence drops exponentially to less than 50%: 0.95^14 ≈ 49%. In the airplane grounding screen tests, 1000 independent variables, at 95% confidence each, yield a net confidence of only 5 x 10^-23, which is 10 sextillion times less than 50% confidence. This same problem arises in genomics research, in which we have a large array of gene product measurements on a small number of patients. Standard statistical tools are problematic at high variable counts. One can turn to qualitative grouping tools such as exploratory factor analysis, or recover statistical robustness with HykGene, a combined cluster and ranking method devised by the author to dramatically improve the ability to identify distinctions with confidence when the number of variables is high.
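The exponential decay of net confidence quoted above is a one-line calculation:

```python
def joint_confidence(per_variable, n_variables):
    """Net confidence when n independent variables are each assessed
    at the same per-variable confidence level."""
    return per_variable ** n_variables

assert 0.48 < joint_confidence(0.95, 14) < 0.49   # ~49%, already below 50%
assert joint_confidence(0.95, 1000) < 1e-22       # ~5 x 10^-23
```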

Evolution of DSS

Aviva Lev-Ari, PhD, RN

The examples provided above refer to binary models, one family of DSS. Another type of DSS is multivariate in nature, in which multivariate scenarios constitute the alternative choice options. Development in the DSS field over the last decade has involved the design of recommendation engines built on manifested preference functions, trading off multiple objectives simultaneously against a cost function. A game-theoretic context is embedded in recommendation engines. Their output is, in fact, an array of options, each with a probability of reward assigned by the recommendation engine.

Underlining Computation Engines

Methodological Basis of Clinical DSS

There are many different methodologies that a CDSS can use to provide support to the health care professional.

The basic components of a CDSS include a dynamic (medical) knowledge base and an inference mechanism (usually a set of rules derived from the experts and evidence-based medicine) and implemented through medical logic modules based on a language such as Arden syntax. It could be based on Expert systems or artificial neural networks or both (connectionist expert systems).

Bayesian Network

The Bayesian network is a knowledge-based graphical representation of a set of variables and the probabilistic relationships among them, for example between diseases and symptoms. It is based on conditional probabilities, the probability of an event given the occurrence of another event, as in the interpretation of diagnostic tests. Bayes’ rule lets us compute the probability of an event from more readily available information, and it consistently updates the options as new evidence is presented. In the context of CDSS, a Bayesian network can be used to compute the probabilities of the possible diseases given the observed symptoms.
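A minimal sketch of Bayes' rule as a CDSS might apply it to a single finding; the prevalence and test characteristics below are hypothetical:

```python
def posterior(prior, p_pos_given_disease, p_pos_given_healthy):
    """Bayes' rule: P(disease | positive finding), from the prior
    prevalence and the two conditional probabilities of the finding."""
    p_pos = (prior * p_pos_given_disease
             + (1 - prior) * p_pos_given_healthy)
    return prior * p_pos_given_disease / p_pos

# Hypothetical: 1% prevalence, 90% sensitive test, 10% false-positive rate
p = posterior(0.01, 0.90, 0.10)
# Even a fairly good test yields only about an 8% posterior at low prevalence
```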

Advantages of Bayesian networks include encoding the knowledge and conclusions of experts in the form of probabilities, assisting decision making as new information becomes available, and being based on unbiased probabilities that are applicable to many models.

Disadvantages of Bayesian networks include the difficulty of obtaining the probability knowledge for possible diagnoses and their impracticality for large, complex systems with multiple symptoms. Bayesian calculations over multiple simultaneous symptoms can be overwhelming for users.

Example of a Bayesian network in the CDSS context is the Iliad system which makes use of Bayesian reasoning to calculate posterior probabilities of possible diagnoses depending on the symptoms provided. The system now covers about 1500 diagnoses based on thousands of findings.

Another example is the DXplain system that uses a modified form of the Bayesian logic. This CDSS produces a list of ranked diagnoses associated with the symptoms.

A third example is SimulConsult, which began in the area of neurogenetics. By the end of 2010 it covered ~2,600 diseases in neurology and genetics, or roughly 25% of known diagnoses. It addresses the core issue of Bayesian systems, that of a scalable way to input data and calculate probabilities, by focusing specialty by specialty and achieving completeness. Such completeness allows the system to calculate the relative probabilities, rather than the person inputting the data. Using the peer-reviewed medical literature as its source, and applying two levels of peer-review to the data entries, SimulConsult can add a disease with less than a total of four hours of clinician time. It is widely used by pediatric neurologists today in the US and in 85 countries around the world.

Neural Network

An artificial neural network (ANN) is a nonknowledge-based, adaptive CDSS that uses a form of artificial intelligence, also known as machine learning, that allows the system to learn from past experiences and examples and to recognize patterns in clinical information. It consists of nodes called neurons and weighted connections that transmit signals between the neurons in a forward or looped fashion. An ANN consists of three main layers: input (data receiver or findings), output (communicates results or possible diseases) and hidden (processes data). The system becomes more efficient with known results for large amounts of data.

The advantages of ANNs include eliminating the need to program rules into the system or obtain direct input from experts. An ANN CDSS can process incomplete data by making educated guesses about missing values, and it improves with every use through adaptive learning. Additionally, ANN systems do not require large databases to store outcome data with associated probabilities. Among the disadvantages, the training process can be time consuming, which may discourage effective use. Because ANN systems derive their own formulas for weighting and combining data from statistical patterns recognized over time, their reasoning can be difficult to interpret, leading users to doubt the system’s reliability.

Examples include the diagnosis of appendicitis, back pain, myocardial infarction, psychiatric emergencies and skin disorders. ANN diagnostic predictions of pulmonary embolism were in some cases even better than physicians’ predictions. Additionally, ANN-based applications have been useful in the analysis of ECG (a.k.a. EKG) waveforms.
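The three-layer structure described above can be sketched as a single forward pass. The weights here are invented rather than learned; a real CDSS would fit them to past cases with known outcomes.

```python
# Minimal feed-forward pass: findings in, a disease likelihood out.
# Weights are invented for illustration, not trained.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_output):
    """Input layer -> hidden layer -> single output neuron."""
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs)))
              for row in w_hidden]
    return sigmoid(sum(w * h for w, h in zip(w_output, hidden)))

# Two findings encoded as 0/1, two hidden neurons, one output.
w_hidden = [[2.0, -1.0], [1.5, 1.5]]   # invented weights
w_output = [1.0, 2.0]

risk = forward([1.0, 0.0], w_hidden, w_output)
print(round(risk, 3))
```

Training adjusts the weights so that outputs match known outcomes, which is why the weighting formulas the network derives can be opaque to clinicians.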

Genetic Algorithms

A genetic algorithm (GA) is a nonknowledge-based method, developed by John Holland at the University of Michigan in the 1970s, that draws on Darwin’s evolutionary principle of survival of the fittest. Candidate solutions are rearranged and recombined to form new solutions that are better than the previous ones. Like neural networks, genetic algorithms derive their information from patient data.

An advantage of genetic algorithms is that they iterate toward an optimal solution: a fitness function determines which solutions are good and which can be eliminated. A disadvantage is the lack of transparency in the reasoning behind the recommendations, which makes the approach unattractive to physicians. The main challenge is defining the fitness criteria; using a genetic algorithm also requires that many components, such as multiple drugs, symptoms and treatment therapies, be available to recombine. Genetic algorithms have proved useful in the diagnosis of female urinary incontinence.
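The iterative fitness-select-recombine loop can be sketched on a toy objective; counting 1-bits stands in for a real clinical scoring rule, and all parameters are invented.

```python
# Toy genetic algorithm: rank by fitness, keep the fittest, recombine
# and mutate to produce the next generation.

import random

random.seed(0)

def fitness(bits):
    return sum(bits)                      # toy objective: maximize 1s

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.05):
    return [1 - b if random.random() < rate else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(20)]
for _ in range(40):                       # generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                    # "survival of the fittest"
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(10)]

best = max(pop, key=fitness)
print(fitness(best))
```

Note that nothing in the loop explains *why* the surviving solution is good, which is the transparency problem mentioned above.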

Rule-Based System

A rule-based expert system attempts to capture knowledge of domain experts into expressions that can be evaluated known as rules; an example rule might read, “If the patient has high blood pressure, he or she is at risk for a stroke.” Once enough of these rules have been compiled into a rule base, the current working knowledge will be evaluated against the rule base by chaining rules together until a conclusion is reached. Some of the advantages of a rule-based expert system are the fact that it makes it easy to store a large amount of information, and coming up with the rules will help to clarify the logic used in the decision-making process. However, it can be difficult for an expert to transfer their knowledge into distinct rules, and many rules can be required for a system to be effective.
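The rule base and chaining described above can be sketched as follows; the rules are invented for illustration, and each is an (if-facts, then-fact) pair fired until no new conclusions appear.

```python
# Toy forward-chaining rule engine. Rules and facts are invented.

RULES = [
    ({"high_blood_pressure"}, "stroke_risk"),
    ({"stroke_risk", "smoker"}, "urgent_counseling"),
]

def forward_chain(facts, rules):
    """Fire rules repeatedly until the set of facts stops growing."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"high_blood_pressure", "smoker"}, RULES))
```

The second rule only fires because the first one added "stroke_risk", which is the chaining the text describes; a useful system needs many hundreds of such rules.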

Rule-based systems can aid physicians in many different areas, including diagnosis and treatment. An example of a rule-based expert system in the clinical setting is MYCIN. Developed at Stanford University by Edward Shortliffe in the 1970s, MYCIN was based on around 600 rules and was used to help identify the type of bacteria causing an infection. While useful, MYCIN also demonstrates the scale such systems require: a rule base of about 600 rules for a comparatively narrow problem space.

The Stanford AI group subsequently developed ONCOCIN, another rules-based expert system coded in Lisp in the early 1980s.[8] The system was intended to reduce the number of clinical trial protocol violations, and reduce the time required to make decisions about the timing and dosing of chemotherapy in late phase clinical trials. As with MYCIN, the domain of medical knowledge addressed by ONCOCIN was limited in scope and consisted of a series of eligibility criteria, laboratory values, and diagnostic testing and chemotherapy treatment protocols that could be translated into unambiguous rules. Oncocin was put into production in the Stanford Oncology Clinic.

Logical Condition

The methodology behind logical conditions is fairly simple: given a variable and a bound, check whether the variable is within or outside the bound, and take action based on the result. An example statement might be “Is the patient’s heart rate less than 50 BPM?” Multiple statements can be linked together to form more complex conditions. A decision table can provide an easy-to-analyze representation of these statements.
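The bounds check described above can be sketched as a small alert table; the variable names and thresholds are illustrative only, not clinical guidance.

```python
# Toy decision table: each row is (variable, low bound, high bound,
# alert message). Thresholds are invented for illustration.

ALERT_RULES = [
    ("heart_rate", 50, 120, "heart rate out of bounds"),
    ("spo2",       92, 100, "oxygen saturation low"),
]

def check(vitals):
    """Return alert messages for any vital outside its bounds."""
    alerts = []
    for name, low, high, message in ALERT_RULES:
        value = vitals.get(name)
        if value is not None and not (low <= value <= high):
            alerts.append(message)
    return alerts

print(check({"heart_rate": 45, "spo2": 97}))
```

Laying the conditions out as a table keeps every threshold reviewable in one place, which matters when the number of alerts grows.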

In the clinical setting, logical conditions are primarily used to provide alerts and reminders to individuals across the care domain. For example, an alert may warn an anesthesiologist that their patient’s heart rate is too low; a reminder could tell a nurse to isolate a patient based on their health condition; finally, another reminder could tell a doctor to make sure he discusses smoking cessation with his patient. Alerts and reminders have been shown to help increase physician compliance with many different guidelines; however, the risk exists that creating too many alerts and reminders could overwhelm doctors, nurses, and other staff and cause them to ignore the alerts altogether.

Causal Probabilistic Network

The primary basis behind the causal network methodology is cause and effect. In a clinical causal probabilistic network, nodes represent items such as symptoms, patient states or disease categories. Connections between nodes indicate a cause-and-effect relationship. A system based on this logic attempts to trace a path from symptom nodes all the way to disease-classification nodes, using probability to determine which path is the best fit. Advantages of this approach are that it helps model the progression of a disease over time and the interaction between diseases; however, medical knowledge does not always establish exactly what causes certain symptoms, and it can be difficult to choose the level of detail to which the model should be built.
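A toy three-tier network in this spirit, with invented nodes and edge probabilities, might trace paths like this: symptom to state to disease, scoring each path by the product of its edge probabilities.

```python
# Toy causal probabilistic network: edges carry probabilities, and the
# best-fitting explanation is the highest-scoring symptom->state->disease
# path. All nodes and numbers are invented.

EDGES = {
    ("blurred_vision",   "elevated_iop"):     0.6,
    ("eye_pain",         "elevated_iop"):     0.4,
    ("elevated_iop",     "glaucoma"):         0.7,
    ("blurred_vision",   "refractive_error"): 0.5,
    ("refractive_error", "myopia"):           0.9,
}

def best_path(symptom, diseases):
    """Return (score, path) for the most probable causal chain."""
    best = (0.0, None)
    for (s, state), p1 in EDGES.items():
        if s != symptom:
            continue
        for (st, disease), p2 in EDGES.items():
            if st == state and disease in diseases:
                score = p1 * p2
                if score > best[0]:
                    best = (score, (symptom, state, disease))
    return best

print(best_path("blurred_vision", {"glaucoma", "myopia"}))
```

Real systems also let one disease node feed another, which is how they model disease progression and interaction over time.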

The first clinical decision support system to use a causal probabilistic network was CASNET, used to assist in the diagnosis of glaucoma. CASNET featured a hierarchical representation of knowledge, splitting all of its nodes into one of three separate tiers: symptoms, states and diseases.

  1. “Decision support systems.” OpenClinical, 26 July 2005. <http://www.openclinical.org/dss.html>. Accessed 17 Feb. 2009.
  2. Berner, Eta S., ed. Clinical Decision Support Systems. New York, NY: Springer, 2007.
  3. Khosla, Vinod (December 4, 2012). “Technology will replace 80% of what doctors do”. Retrieved April 25, 2013.
  4. Garg AX, Adhikari NK, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, et al. (2005). “Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review.” JAMA 293 (10): 1223–38. doi:10.1001/jama.293.10.1223. PMID 15755945.
  5. Kawamoto K, Houlihan CA, Balas EA, Lobach DF (2005). “Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success.” BMJ 330 (7494): 765. doi:10.1136/bmj.38398.500764.8F. PMC 555881. PMID 15767266.
  6. Gluud C, Nikolova D (2007). “Likely country of origin in publications on randomised controlled trials and controlled clinical trials during the last 60 years.” Trials 8: 7. doi:10.1186/1745-6215-8-7. PMC 1808475. PMID 17326823.
  7. Wagholikar K. “Modeling Paradigms for Medical Diagnostic Decision Support: A Survey and Future Directions”. Journal of Medical Systems. Retrieved 2012.
  8. Shortliffe EH, Scott AC, Bischoff MB, Campbell AB, van Melle W, Jacobs CD. “ONCOCIN: An expert system for oncology protocol management.” Seventh International Joint Conference on Artificial Intelligence, Vancouver, B.C., 1981.

SOURCE for Computation Engines Section and REFERENCES:

http://en.wikipedia.org/wiki/Clinical_decision_support_system

Cardiovascular Diseases: Decision Support Systems (DSS) for Disease Management Decision Making – DSS analyzes information from hospital cardiovascular patients in real time and compares it with a database of thousands of previous cases to predict the most likely outcome.

Can aviation technology reduce heart surgery complications?

Algorithm for real-time analysis of data holds promise for forecasting
August 13, 2012

British researchers are working to adapt technology from the aviation industry to help prevent complications among heart patients after surgery. Up to 1,000 sensors aboard aircraft help airlines determine when a plane requires maintenance, reports The Engineer, serving as a model for the British risk-prediction system.

The system analyzes information from hospital cardiovascular patients in real time and compares it with a database of thousands of previous cases to predict the most likely outcome.

“There are vast amounts of clinical data currently collected which is not analyzed in any meaningful way. This tool has the potential to identify subtle early signs of complications from real-time data,” Stuart Grant, a research fellow in surgery at University Hospital of South Manchester, says in a hospital statement. Grant is part of the Academic Surgery Unit working with Lancaster University on the project, which is still its early stages.

The software predicts the patient’s condition over a 24-hour period using four metrics: systolic blood pressure, heart rate, respiration rate and peripheral oxygen saturation, explains EE Times.

As a comparison tool, the researchers obtained a database of 30,000 patient records from the Massachusetts Institute of Technology and combined it with a smaller, more specialized database from Manchester.
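One plausible reading of “compares it with a database of thousands of previous cases” is a nearest-neighbor lookup over the four metrics named above. The records below are invented, and a real system would normalize each metric before comparing.

```python
# Toy case-based prediction: find the k most similar past patients on
# (systolic BP, heart rate, respiration rate, SpO2) and take their
# majority outcome. All records are invented; metrics are unscaled here.

def predict(current, past_cases, k=3):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(past_cases, key=lambda c: dist(current, c[0]))[:k]
    outcomes = [o for _, o in nearest]
    return max(set(outcomes), key=outcomes.count)

CASES = [
    ((120, 70, 14, 98), "stable"),
    ((118, 75, 16, 97), "stable"),
    ((90, 120, 26, 88), "complication"),
    ((85, 130, 30, 85), "complication"),
    ((122, 68, 15, 99), "stable"),
]

print(predict((88, 125, 28, 87), CASES))
```

With 30,000 records the same idea applies, but the similarity measure and the weighting of each metric become the hard part.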

In six months of testing, its accuracy was about 75 percent, The Engineer reports. More data and an improved algorithm could boost that rate to 85 percent, the researchers believe. Making the software web-based would allow physicians to access the data anywhere, even on tablets or phones, and could enable remote consultation with specialists.

In their next step, the researchers are applying for more funding and for ethical clearance for a large-scale trial.

U.S. researchers are working on a similar crystal ball, but one covering an array of conditions. Researchers from the University of Washington, MIT and Columbia University are using a statistical model that can predict future ailments based on a patient’s history–and that of thousands of others.

And the U.S. Department of Health & Human Services is using mathematical modeling to analyze effects of specific healthcare interventions.

Predictive modeling also holds promise to make clinical research easier by using algorithms to examine multiple scenarios based on different kinds of patient populations, specified health conditions and various treatment regimens.

To learn more:
– here’s the Engineer article
– check out the hospital report
– read the EE Times article

Related Articles:
Algorithm looks to past to predict future health conditions
HHS moves to mathematical modeling for research, intervention evaluation
Decision support, predictive modeling may speed clinical research

SOURCE:

Can aviation technology reduce heart surgery complications? – FierceHealthIT http://www.fiercehealthit.com/story/can-aviation-technology-reduce-heart-surgery-complications/2012-08-13#ixzz2SITHc61J

http://www.fiercehealthit.com/story/study-decision-support-systems-must-be-flexible-adaptable-transparent/2012-08-20

Medical Decision Making Tools: Overview of DSS available to date  

http://www.openclinical.org/dss.html

Clinical Decision Support Systems – used for Cardiovascular Medical Decisions

Stud Health Technol Inform. 2010;160(Pt 2):846-50.

AALIM: a cardiac clinical decision support system powered by advanced multi-modal analytics.

Amir A, Beymer D, Grace J, Greenspan H, Gruhl D, Hobbs A, Pohl K, Syeda-Mahmood T, Terdiman J, Wang F.

Source

IBM Almaden Research Center, San Jose, CA, USA.

Abstract

Modern Electronic Medical Record (EMR) systems often integrate large amounts of data from multiple disparate sources. To do so, EMR systems must align the data to create consistency between these sources. The data should also be presented in a manner that allows a clinician to quickly understand the complete condition and history of a patient’s health. We develop the AALIM system to address these issues using advanced multimodal analytics. First, it extracts and computes multiple features and cues from the patient records and medical tests. This additional metadata facilitates more accurate alignment of the various modalities, enables consistency check and empowers a clear, concise presentation of the patient’s complete health information. The system further provides a multimodal search for similar cases within the EMR system, and derives related conditions and drugs information from them. We applied our approach to cardiac data from a major medical care organization and found that it produced results with sufficient quality to assist the clinician making appropriate clinical decisions.

PMID: 20841805 [PubMed – indexed for MEDLINE]

DSS development for Enhancement of Heart Drug Compliance by Cardiac Patients 

A good example of a thorough and effective CDSS development process is an electronic checklist developed by Riggio et al. at Thomas Jefferson University Hospital (TJUH) [12]. TJUH had a computerized physician order-entry system in place. To meet congestive heart failure and acute myocardial infarction quality measures (e.g., use of aspirin, beta blockers, and angiotensin-converting enzyme (ACE) inhibitors), a multidisciplinary team including a focus group of residents developed a checklist, embedded in the computerized discharge instructions, that required resident physicians to prescribe the recommended medications or choose from a drop-down list of contraindications. The checklist was vetted by several committees, including the medical executive committee, and presented at resident conferences for feedback and suggestions. Implementation resulted in a dramatic improvement in compliance.
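The TJUH checklist logic can be sketched as follows: at discharge, each guideline medication must either be prescribed or carry a documented contraindication. The drug and contraindication names are illustrative only.

```python
# Toy discharge checklist: a discharge is complete only when every
# guideline medication is prescribed or has a recorded contraindication.

GUIDELINE_MEDS = ["aspirin", "beta_blocker", "ace_inhibitor"]

def discharge_blockers(prescribed, contraindications):
    """Return meds still unaccounted for; empty means the
    discharge instructions are complete."""
    return [med for med in GUIDELINE_MEDS
            if med not in prescribed and med not in contraindications]

blockers = discharge_blockers(
    prescribed={"aspirin"},
    contraindications={"beta_blocker": "severe asthma"},
)
print(blockers)
```

Embedding such a check in the order-entry workflow is what forced residents either to prescribe or to document a reason not to, driving the compliance improvement described.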

http://virtualmentor.ama-assn.org/2011/03/medu1-1103.html

Early DSS Development at Stanford Medical Center in the 70s

MYCIN (1976). MYCIN was a rule-based expert system designed to diagnose and recommend treatment for certain blood infections (antimicrobial selection for patients with bacteremia or meningitis). It was later extended to handle other infectious diseases. Clinical knowledge in MYCIN is represented as a set of IF-THEN rules with certainty factors attached to diagnoses. It was a goal-directed system, using a basic backward-chaining reasoning strategy (resulting in exhaustive depth-first search of the rule base for relevant rules, with additional heuristic support to control the search for a proposed solution). MYCIN was developed in the mid-1970s by Ted Shortliffe and colleagues at Stanford University. It is probably the most famous early expert system, described by Mark Musen as being “the first convincing demonstration of the power of the rule-based approach in the development of robust clinical decision-support systems” [Musen, 1999].
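A much-simplified sketch of backward chaining with certainty factors in MYCIN’s style follows. The rule and all numbers are invented, and real MYCIN had around 600 far richer rules and more elaborate evidence-combination conventions.

```python
# Toy goal-directed reasoning: work backward from a diagnosis, find a
# rule that concludes it, and scale the rule's certainty factor by the
# certainty of its premises (min over conjuncts, a MYCIN convention).

RULES = {
    # conclusion: (list of premise tuples, certainty factor of the rule)
    "bacteremia": ([("gram_negative", "rod_shaped")], 0.7),
}

def certainty(goal, evidence):
    """Certainty in [0, 1] of the goal given evidence certainties."""
    if goal in evidence:
        return evidence[goal]
    if goal not in RULES:
        return 0.0
    premises_list, cf_rule = RULES[goal]
    best = 0.0
    for premises in premises_list:
        cf = min(certainty(p, evidence) for p in premises)
        best = max(best, cf * cf_rule)
    return best

print(certainty("bacteremia", {"gram_negative": 1.0, "rod_shaped": 0.8}))
```

The recursion is the backward chaining: a premise that is not directly in evidence would itself be looked up as a sub-goal.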

The EMYCIN (Essential MYCIN) expert system shell, employing MYCIN’s control structures was developed at Stanford in 1980. This domain-independent framework was used to build diagnostic rule-based expert systems such as PUFF, a system designed to interpret pulmonary function tests for patients with lung disease.

http://www.bmj.com/content/346/bmj.f657

ECG for Detection of MI: DSS use in Cardiovascular Disease Management

http://faculty.ksu.edu.sa/AlBarrak/Documents/Clinical%20Decision%20Support%20Systems_Ch01.pdf

Olsson et al. also showed that neural networks did a better job than two experienced cardiologists in detecting acute myocardial infarction in electrocardiograms with concomitant left bundle branch block.

Olsson SE, Ohlsson M, Ohlin H, Edenbrandt L. Neural networks—a diagnostic tool in acute myocardial infarction with concomitant left bundle branch block. Clin Physiol Funct Imaging 2002;22:295–299.

Sven-Erik Olsson, Hans Öhlin, Mattias Ohlsson and Lars Edenbrandt
Neural networks – a diagnostic tool in acute myocardial infarction with concomitant left bundle branch block
Clinical Physiology and Functional Imaging 22, 295-299 (2002) 

Abstract
The prognosis of acute myocardial infarction (AMI) improves by early revascularization. However the presence of left bundle branch block (LBBB) in the electrocardiogram (ECG) increases the difficulty in recognizing an AMI and different ECG criteria for the diagnosis of AMI have proved to be of limited value. The purpose of this study was to detect AMI in ECGs with LBBB using artificial neural networks and to compare the performance of the networks to that of six sets of conventional ECG criteria and two experienced cardiologists. A total of 518 ECGs, recorded at an emergency department, with a QRS duration > 120 ms and an LBBB configuration, were selected from the clinical ECG database. Of this sample 120 ECGs were recorded on patients with AMI, the remaining 398 ECGs being used as a control group. Artificial neural networks of feed-forward type were trained to classify the ECGs as AMI or not AMI. The neural network showed higher sensitivities than both the cardiologists and the criteria when compared at the same levels of specificity. The sensitivity of the neural network was 12% (P = 0.02) and 19% (P = 0.001) higher than that of the cardiologists. Artificial neural networks can be trained to detect AMI in ECGs with concomitant LBBB more effectively than conventional ECG criteria or experienced cardiologists.

http://home.thep.lu.se/~mattias/publications/papers/lu_tp_00_38_abs.html

Additional SOURCES:

http://www.implementationscience.com/content/6/1/92


 Comment of Note

During 1979-1983 Dr. Aviva Lev-Ari was part of Prof. Ronald A. Howard’s (Stanford University) study team, the consulting group to Stanford Medical Center during MYCIN feature-enhancement development.

Professor Howard is one of the founders of the decision analysis discipline. His books on probabilistic modeling, decision analysis, dynamic programming, and Markov processes serve as major references for courses and research in these fields.

https://engineering.stanford.edu/profile/rhoward

It was Prof. Howard of EES, Prof. Amos Tversky of Behavioral Science (advisor of Dr. Lev-Ari’s Masters Thesis at HUJ), and Prof. Kenneth Arrow of Economics, with 15 doctoral students in the early 80s, who formed the Interdisciplinary Decision Analysis Core Group at Stanford. Students of Prof. Howard, chiefly James E. Matheson, started the Decision Analysis Practice at Stanford Research Institute (SRI, Int’l) in Menlo Park, CA.

http://www.sri.com/

Dr. Lev-Ari was hired in 3/1985 to head SRI’s effort in algorithm-based DSS development. The models she developed were applied in problem solving for SRI clients, among them pharmaceutical manufacturers: Ciba Geigy (now NOVARTIS), DuPont, FMC, and Rhone-Poulenc (now Sanofi-Aventis).

Read Full Post »

Curator: Aviva Lev-Ari, PhD, RN

First post published on 4/30/2012

We went pretty far:

On 4/30/2013 – Great News to Share

News from the National Academy of Sciences

Date: April 30, 2013

FOR IMMEDIATE RELEASE

National Academy of Sciences Members and Foreign Associates Elected

The National Academy of Sciences announced today the election of 84 new members and 21 foreign associates from 14 countries in recognition of their distinguished and continuing achievements in original research.

Those elected today bring the total number of active members to 2,179 and the total number of foreign associates to 437. Foreign associates are nonvoting members of the Academy, with citizenship outside the United States.

Newly elected members and their affiliations at the time of election are:

We congratulate OUR BOARD MEMBER for being elected 

Feldman, Marcus W.

Director, Morrison Institute for Population and Resource Studies, and Burnet C. and Mildred Finley Wohlford Professor of Biological Sciences, department of biological sciences, Stanford University, Stanford, Calif.

http://www.nasonline.org/news-and-multimedia/news/2013_04_30_NAS_Election.html

Our Site Statistics Shines:

Discussions

URL Clicks

ncbi.nlm.nih.gov 1,121
nature.com 587
genomeweb.com 220
sciencedirect.com 192
medicregister.com 179
pnas.org 150
nejm.org 137

Contributing Authors:

All Time

Author Views
2012pharmaceutical – Aviva Lev-Ari 56,648
larryhbern 22,368
Dr. Sudipta Saha 8,638
tildabarliya 8,146
ritusaxena 6,272
Dror Nir 4,726
sjwilliamspa 4,371
aviralvatsa 3,481
anamikasarkar 1,776
pkandala 1,637
Alan F. Kaul, PharmD., MS, MBA, FCCP 1,101
megbaker58 858
zraviv06 618
zs22 506
Aashir Awan, Phd 461
Demet Sag, Ph.D., CRA, GCP 347
howarddonohue 309
jdpmdphd 193
Ed Kislauskis 171
anayou1 144
jukkakarjalainen 139
Dr.Sreedhar Tirunagari 99
S. Chakrabarti, Ph.D. 65
apreconasia 57

OUR BioMed e-BOOKS Series

Forthcoming on e-Book List of Amazon-KINDLE

Larry Bernstein, MD FCAP and Aviral Vatsa, MBBS, PhD, Editors

Target Market: CARDIOLOGISTS

Justin D. Pearlman MD ME PhD MA FACC, Editor (9/2013)

Target Market: CARDIOLOGISTS


Larry Bernstein, MD FACP, Sr. Editor, SJ Williams, PhD and Aviva Lev-Ari, PhD, RN, Editors  
(7/2013)

Target Market: Life Scientists and BIG PHARMA, Academia and the Public

Larry Bernstein, MD, FCAP & Ritu Saxena, PhD, Editors


Target Market: GI MDs, Nutritionists, Food Industry, Academia and the Public


SJ. Williams, PhD, Sr. Editor,  Tilda Barliya, PhD and Ritu Saxena, PhD, Editors


Target Market: ONCOLOGISTS

Dr. Tilda Barliya, Editor


Target Market: BIG PHARMA & Academics

– Editor TBA

– Editor TBA

List of ADDITIONAL TITLES for 2014 e-Books

– Editor TBA

– Editor TBA

– Editor TBA

Read Full Post »

Reporter: Aviva Lev-Ari, PhD, RN

The reader is encouraged to review the following ANALYSIS of this subject matter:

Genomics & Genetics of Cardiovascular Disease Diagnoses: A Literature Survey of AHA’s Circulation Cardiovascular Genetics, 3/2010 – 3/2013

and

10 Years On, Still Much To Be Learned From Human Genome Map

Advances made in genetics of disease, but creating new drugs more complex than first thought

By Amanda Gardner
HealthDay Reporter

FRIDAY, April 12 (HealthDay News) — As scientists mark the 10th anniversary Sunday of the completion of the Human Genome Project, they will note how that watershed effort has led to the discovery of the genetic underpinnings of almost 5,000 diseases.

And it has made it possible to develop personalized treatments that have prolonged the lives of many.

But the scientists will also acknowledge that, while the project has unlocked many mysteries that once shrouded diseases, there’s still much to be learned before new drugs can be developed to target illness-causing mutations in human DNA.

“What we’ve learned over the past 10 years is that we’re still far from really understanding the complexity of the human genome,” said Eric Schadt, chairman of genetics and genomic sciences at Mount Sinai Icahn School of Medicine in New York City. “Human disease is way more complicated than the old view that single hits to single genes cause diseases.

“In most forms of diseases, it’s whole constellations of genes operating in networks,” Schadt explained. “That becomes a much harder problem. How do you target networks with a single drug?

“We keep learning how much we really don’t know and how much further we need to go,” he added. “That’s the big story.”

A decade ago, the Human Genome Project was hailed as a major milestone because researchers identified all of the nearly 25,000 genes in human DNA and sequenced the 3 billion chemical base pairs comprising that DNA.

The feat took 13 years and cost close to $3 billion, but the genetic information gleaned from the project gave scientists the tools needed to pinpoint how changes in specific genes could kick-start some diseases.

One of the most tangible benefits of the project has been the development of ever more sophisticated sequencing technology and a dramatic lowering of the cost of using that technology.

Today, the cost of sequencing one human genome is closer to $5,000 and can be done in a day or two, said Dr. Eric Green, director of the National Human Genome Research Institute in Bethesda, Md.

What that means is that the pace of research, and its attendant discoveries, has been accelerated.

When the project first began, scientists knew the genetic basis of about 53 diseases. Today, that number is close to 5,000, Green noted. That means doctors can now test patients to see if they carry gene mutations that raise their risk for certain diseases, and counsel them accordingly on ways they might prevent or delay illness. There are currently almost 2,000 genetic tests for specific diseases or conditions, according to the U.S. National Institutes of Health.

There have also been breakthroughs with some rare diseases.

In 2011, 6-year-old Nicholas Volker became the first child to be saved by the new technology. He had undergone a hundred surgeries, including the removal of his colon, as doctors tried to identify his mysterious bowel disease. Genomic sequencing uncovered a genetic mutation that could be treated with a bone marrow transplant consisting of cells from umbilical cord blood.

“Knowing more of the basic genetics that makes up an individual has allowed us to diagnose far more genetic diseases,” said Dr. Barbara Pober, a medical geneticist at the Frank H. Netter, M.D. School of Medicine at Quinnipiac University in North Haven, Conn.

Once a diagnosis has been made, doctors can now use gene sequencing to determine treatment for some diseases. For instance, breast cancer patients can be tested to see how they will respond to the drug Herceptin. HIV patients can be tested to determine their response to the drug abacavir. And those on the widely used blood thinner warfarin can be tested to determine the most effective dose, according to the NIH.

The field of pharmacogenetics, still in its infancy, enables doctors to use a patient’s genetic information to figure out which cancer drugs the patient will best respond to before treatment even starts.

The U.S. Food and Drug Administration now includes genetic information on labeling for more than 100 drugs, up from just four 10 years ago, Green said.

The goal of developing new drugs to target diseases with genetic roots, however, will take much longer to realize.

Although the NIH states that there are roughly 350 biotechnological products currently being tested in clinical trials, new drugs take a decade or more to develop. Not only that, the knowledge gained from the Human Genome Project has actually made the field of genetic medicine even more complex. Scientists are finding that many diseases are triggered by interaction involving multiple gene variants, making it difficult to design a treatment that targets all the culprits in a particular illness.

And the complexities don’t end there.

Not long ago, scientists discovered that so-called “junk” DNA, which makes up 98 percent of the genome, is not junk at all but serves critical regulatory functions.

What’s more, about 10 percent of the human genome still hasn’t been sequenced and can’t be sequenced by existing technology, Green added. “There are parts of the genome we didn’t know existed back when the genome was completed,” he said.

More information

For more on developments over the past 10 years, visit the Human Genome Project website.

SOURCES: Eric Green, M.D., Ph.D., director, National Human Genome Research Institute, Bethesda, Md.; Barbara Pober, M.D., professor, medical sciences, Frank H. Netter, M.D., School of Medicine, Quinnipiac University, North Haven, Conn.; Eric Schadt, Ph.D., professor and chairman, department of genetics and genomic sciences, Mount Sinai Icahn School of Medicine, New York City

Last Updated: April 12, 2013

Health News Copyright © 2013 HealthDay. All rights reserved.

http://consumer.healthday.com/Article.asp?AID=675381

Read Full Post »

Drug Eluting Stents: On MIT’s Edelman Lab’s Contributions to Vascular Biology and its Pioneering Research on DES


Author: Larry H Bernstein, MD, FACP

and 

Curator: Aviva Lev-Ari, PhD, RN
http://PharmaceuticalIntelligence.com/2013/04/25/Contributions-to-vascular-biology/

This is the first of a three-part series on the evolution of vascular biology and the studies of the effects of biomaterials in vascular reconstruction and drug delivery. The work has embraced a collaboration of cardiologists at Harvard Medical School, its affiliated hospitals, and MIT,
requiring cardiovascular scientists at the PhD and MD level, physicists, and computational biologists working in concert, and
an exploration of the depth of the contributions by a distinguished physician, scientist, and thinker.

The first part – Vascular Biology and Disease – will cover the advances in the research on

  • vascular biology,
  • signaling pathways,
  • drug diffusion across the endothelium and
  • the interactions with the underlying muscularis (media),
  • with additional considerations for type 2 diabetes mellitus.

The second part – Stents and Drug Delivery – will cover the

  • purposes,
  • properties and
  • evolution of stent technology with
  • the acquired knowledge of the pharmacodynamics of drug interactions and drug distribution.

The third part – Problems and Promise of Biomaterials Technology – will cover the shortcomings of the cardiovascular devices, and opportunities for improvement.

Vascular Biology and Cardiovascular Disease

Early work on endothelial injury and drug release principles

The insertion of a catheter for the administration of heparin is not an innocuous procedure. Heparin is infused to block coagulation, lowering the risk of a dangerous

  • clot formation and
  • dissemination.

It was shown experimentally that the continuous infusion of heparin

  • suppresses smooth muscle proliferation after endothelial injury. It may lead to
  • hemorrhage as a primary effect.

The anticoagulant property of heparin was removed by chemical modification without loss of the anti-proliferative effect.

In this study, MIT researchers placed ethylene-vinyl acetate copolymer matrices containing standard and modified heparin adjacent to rat carotid arteries at the time of balloon deendothelialization.

Matrix delivery of both heparin compounds effectively diminished this proliferation in comparison to controls without producing systemic anticoagulation or side effects.

This mode of therapy appeared more effective than administering the agents by either

  • intravenous pumps or
  • heparin/polymer matrices placed in a subcutaneous site distant from the injured carotid artery

This indicated that the site of placement at the site of injury is a factor in the microenvironment, and is a preference for avoiding restenosis after angioplasty and other interventions.

This raised the question of why the proliferation of vascular muscle occurs in the first place.
Edelman, Nugent and Karnovsky (1) showed that the proliferation first requires denudation of the vascular surface endothelium. This exposes the underlayer to basic fibroblast growth factor (bFGF), which stimulates mitogenesis of the exposed cells; the intact endothelium acts as a barrier to circulating bFGF.

To answer this question, they compared the effects of

  • intravenously administered 125I-labelled bFGF with
  • perivascular controlled release of bFGF, in which polymeric controlled-release devices delivered bFGF to the extravascular space without transendothelial transport.

Deposition within the blood vessel wall was rapid, distributed circumferentially, and substantially greater than that observed following intravenous injection.

The amount of bFGF deposited in arteries adjacent to the release devices was 40 times that deposited in similar arteries in animals who received a single intravenous bolus of bFGF.

The presence of intimal hyperplasia increased deposition of perivascularly released bFGF 2.4-fold but decreased the deposition of intravenously injected bFGF by 67%.

  • bFGF was 5- to 30-fold more abundant in solid organs after intravenous injection than it was following perivascular release, and
  • bFGF deposition was greatest in the kidney, liver, and spleen and was substantially lower in the heart and lung.

This result indicated that vascular deposition of bFGF does not require transendothelial transport, and that bFGF delivery is effective when perivascular. (2)

Drug activity studies have to be done in well-controlled and representative conditions. Edelman's Lab researchers studied the dose response of injured arteries to exogenous heparin in vivo by providing steady and predictable arterial levels of drug. Controlled-release devices were fabricated to direct heparin uniformly and at a steady rate to the adventitial surface of balloon-injured rat carotid arteries.

Researchers predicted the distribution of heparin throughout the arterial wall using computational simulations and correlated these concentrations with the biologic response of the tissues.

Researchers determined from this process that an in vivo arterial concentration of 0.3 mg/ml of heparin is required to maximally inhibit intimal hyperplasia after injury.

This estimation of the required tissue concentration of a drug is

  • independent of the route of administration and
  • applies to all forms of drug release.

In this way the Team was able to

  • evaluate the potential of widely disparate forms of drug release and
  • create rigorous criteria by which to guide the development of particular delivery strategies for local diseases. (3)
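The route-independence of the tissue-concentration criterion can be illustrated with a toy one-compartment, steady-state calculation. The 0.3 mg/ml target is from the study; the clearance value below is purely hypothetical, chosen for illustration.

```python
# Toy steady-state sketch: whatever the delivery route, the release rate needed
# to hold a target tissue concentration is rate = C_target * CL_tissue.
# The clearance value is hypothetical; only the 0.3 mg/ml target is from the study.
c_target = 0.3       # mg/ml, arterial heparin level that maximally inhibits hyperplasia
cl_tissue = 0.05     # ml/h, hypothetical local clearance from the arterial wall

required_rate = c_target * cl_tissue  # mg/h needed to maintain the target level
print(f"required delivery rate ~ {required_rate:.3f} mg/h")
```

The point of the sketch is that the delivery device only appears through the rate it must sustain, not through its route of administration.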

These findings derive chiefly from the following three studies:

(1) Effect of controlled adventitial heparin delivery on smooth muscle cell proliferation following endothelial injury. ER Edelman, DH Adams, and MJ Karnovsky. PNAS May 1990; 87: 3773-3777.


(2) Perivascular and intravenous administration of basic fibroblast growth factor: Vascular and solid organ deposition. ER Edelman, MA Nugent, and MJ Karnovsky. PNAS Feb 1993; 90: 1513-1517.


(3) Tissue concentration of heparin, not administered dose, correlates with the biological response of injured arteries in vivo. MA Lovich and ER Edelman. PNAS Sep 1999; 96: 11111–11116.

Vascular Injury and Repair

Perlecan is a heparan sulfate proteoglycan that may be critical for the regulation of vascular repair, inhibiting the binding and mitogenic activity of basic fibroblast growth factor-2 (bFGF-2) in vascular smooth muscle cells.

The Team generated

  • Clones of endothelial cells expressing an antisense vector targeting domain III of perlecan. The transfected cells produced significantly less perlecan than parent cells and had a reduced capacity to inhibit bFGF activity in vascular smooth muscle cells.
  • Endothelial cells were seeded onto three-dimensional polymeric matrices and implanted adjacent to porcine carotid arteries subjected to deep injury.
  • The parent endothelial cells prevented thrombosis, but perlecan deficient cells were ineffective.

The ability of endothelial cells to inhibit intimal hyperplasia, however, was only partly dependent on perlecan. The differential regulation by perlecan of these aspects of vascular repair may clarify why control of clinical clot formation does not lead to full control of intimal hyperplasia.

The use of genetically modified tissue-engineered cells provides a new approach for dissecting the role of specific factors within the blood vessel wall. (1)

Successful implementation of local arterial drug delivery requires transmural distribution of drug. The physicochemical properties of the applied compound govern its transport and tissue binding.

  • Hydrophilic compounds are cleared rapidly.
  • Hydrophobic drugs bind to fixed tissue elements, potentially prolonging tissue residence and biological effect.

Local vascular drug delivery provides

  • elevated concentrations of drug in the target tissue while
  • minimizing systemic side effects.

To better characterize local pharmacokinetics the Team examined the arterial transport of locally applied dextran and dextran derivatives in vivo.

Using a two-compartment pharmacokinetic model, they corrected the measured transmural flux of these compounds, delivered from a photo-polymerizable hydrogel, for systemic redistribution and elimination. The diffusivities and the transendothelial permeabilities were strongly dependent on molecular weight and charge:

  • For neutral dextrans, the diffusive resistance increased with molecular weight, approximately 4.1-fold between the molecular weights of 10 and 282 kDa.
  • Endothelial resistance increased 28-fold over the same molecular weight range.
  • The effective medial diffusive resistance was unaffected by cationic charge, as such molecules moved identically to neutral compounds, but increased approximately 40% when dextrans were negatively charged.

Transendothelial resistance was 20-fold lower for the cationic dextrans, and 11-fold higher for the anionic dextrans, when both were compared to neutral counterparts.

These results suggest that, while

  • low molecular weight drugs will rapidly traverse the arterial wall with the endothelium posing a minimal barrier,
  • the reverse is true for high molecular weight agents.

The deposition and distribution of locally released vascular therapeutic compounds might be predicted based upon chemical properties, such as molecular weight and charge. (2)
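The two quoted fold-changes can be turned into rough scaling exponents. The sketch below assumes a power-law dependence of resistance on molecular weight — an illustrative assumption not stated in the source — and back-computes the exponent from each reported fold-increase between 10 and 282 kDa.

```python
import math

# Reported fold-increases in resistance between 10 and 282 kDa (ref. 2).
mw_ratio = 282 / 10
medial_fold = 4.1      # medial diffusive resistance, neutral dextrans
endothelial_fold = 28  # transendothelial resistance

# If resistance ~ MW**alpha (illustrative power-law assumption),
# then alpha = ln(fold increase) / ln(MW ratio).
alpha_media = math.log(medial_fold) / math.log(mw_ratio)
alpha_endo = math.log(endothelial_fold) / math.log(mw_ratio)
print(f"medial exponent ~ {alpha_media:.2f}")       # ~0.42
print(f"endothelial exponent ~ {alpha_endo:.2f}")   # ~1.00
```

Under this assumption, medial resistance grows sub-linearly with molecular weight while the endothelial barrier grows roughly linearly, which is one way to read the statement that the endothelium is a minimal barrier only for small molecules.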

Paclitaxel is hydrophobic and has therapeutic potential against proliferative vascular disease. The favorable preclinical data with this compound may, in part, result from preferential tissue binding. The complexity of Paclitaxel pharmacokinetics requires in-depth investigation if this drug is to reach its full clinical potential in proliferative vascular diseases.

Equilibrium distribution of Paclitaxel reveals partitioning above and beyond perfusate concentration and a spatial gradient of drug across the arterial wall.

The effective diffusivity (Deff) was estimated from the Paclitaxel distribution data to

  • facilitate comparison of transport of Paclitaxel through arterial parenchyma with that of other vasoactive agents and to
  • characterize the disparity between endovascular and perivascular application of drug.

This transport parameter described the motion of drug in tissues given an applied concentration gradient and includes, in addition to diffusion,

  • the impact of steric hindrance within the arterial interstitium;
  • nonspecific binding to arterial elements; and, in the preparation used here,
  • convective effects from the applied transmural pressure gradient.

At all times, the effective diffusivity for endovascular delivery exceeded that of perivascular delivery. The arterial transport of Paclitaxel was quantified through application ex vivo and measurement of the subsequent transmural distribution.

  • Arterial Paclitaxel deposition at equilibrium varied across the arterial wall.
  • Permeation into the wall increased with time, from 15 minutes to 4 hours, and
  • varied with the origin of delivery.

In contrast to hydrophilic compounds, the concentration in tissue exceeded the applied concentration and the rate of transport was markedly slower. Furthermore, endovascular and perivascular Paclitaxel application led to differences in deposition across the blood vessel wall.

This leads to a conclusion that Paclitaxel interacts with arterial tissue elements  as it moves under the forces of

  • diffusion and
  • convection and
  • can establish substantial partitioning and spatial gradients across the tissue. (3)

Endovascular drug-eluting stents have changed the practice of cardiovascular revascularization, and yet it is unclear how they so dramatically reduce restenosis.

We do not know how to distinguish between the different formulations available. Researchers are now questioning whether individual properties of different drugs beyond lipid avidity affect arterial transport and distribution.

In bovine internal carotid segments:

  • tissue-loading profiles for hydrophobic Paclitaxel and Rapamycin are indistinguishable, reaching steady-state loading after 2 days, whereas
  • hydrophilic dextran reaches equilibrium in hours.

Paclitaxel and Rapamycin bind to the artery at 30–40 times bulk concentration, and bind to specific tissue elements.
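The 30–40× tissue loading can be rationalized with a simple saturable-binding sketch: total tissue drug is free drug plus drug bound to fixed sites (Langmuir form). All parameter values below are hypothetical, chosen only to show how abundant high-affinity binding sites drive the partition coefficient far above 1.

```python
# Illustrative equilibrium-binding sketch; parameters are hypothetical.
def partition(c_free, b_max, k_d):
    """Tissue:bulk partition coefficient = (free + bound) / free,
    with bound drug following a saturable (Langmuir) binding site."""
    bound = b_max * c_free / (k_d + c_free)
    return (c_free + bound) / c_free

# With plentiful high-affinity sites the partition coefficient lands in the
# 30-40x range reported for paclitaxel and rapamycin.
k = partition(c_free=0.01, b_max=1.0, k_d=0.02)
print(f"tissue:bulk partition ~ {k:.1f}")
```

The same form also explains why hydrophilic, non-binding compounds such as dextran show a partition near 1: with `b_max` near zero, the bound term vanishes.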

Transmural drug distribution profiles are markedly different for the two compounds.

  • Rapamycin binds specifically to the FKBP12 binding protein and distributes evenly through the artery, whereas
  • Paclitaxel binds specifically to microtubules and remains primarily in the subintimal space.

The binding of Rapamycin and Paclitaxel to specific intracellular proteins plays an essential role in

  • determining arterial transport and distribution and in
  • distinguishing one compound from another.

These results offer further insight into the

  • mechanism of local drug delivery and the
  • specific use of existing drug-eluting stent formulations. (4)

The Role of Amyloid beta (A-beta) in Creation of Vascular Toxic Plaque

Amyloid beta (A-beta) is a peptide family produced and deposited in neurons and endothelial cells (EC). It is found at subnanomolar concentrations in the plasma of healthy individuals. Simple conformational changes produce a form of A-beta, A-beta 42, which creates toxic plaque in the brains of Alzheimer's patients.

Oxidative stress-induced blood-brain barrier degeneration has been proposed as a key factor in A-beta 42 toxicity.

This cannot account for the lack of injury from the same peptide in healthy tissues. Researchers hypothesized that cell state mediates A-beta's effect. They examined viability, in the presence of A-beta secreted from transfected Chinese hamster ovary (CHO) cells, of

  • aortic Endothelial Cells (EC),
  • vascular smooth muscle cells (SMC) and
  • epithelial cells (EPI) in different states

A-beta was more toxic to all cell types when they were subconfluent. Subconfluent EC sprouted, and SMC and EPI were inhibited by A-beta. Confluent EC were virtually resistant to A-beta and suppressed A-beta production by the A-beta-producing CHO cells.

Products of subconfluent EC overcame this resistant state, stimulating the production and toxicity of A-beta 42. Confluent EC overgrew >35% beyond their quiescent state in the presence of A-beta conditioned in media from subconfluent EC.

These findings imply that A-beta 42 may well be even more cytotoxic to cells in injured or growth states and potentially explain the variable and potent effects of this protein.

One may now need to consider tissue and cell state in addition to local concentration of and exposure duration to A-beta.

The specific interactions of A-beta and EC in a state-dependent fashion may help understand further the common and divergent forms of vascular and cerebral toxicity of A-beta and the spectrum of AD. (5)

(1) Perlecan is required to inhibit thrombosis after deep vascular injury and contributes
to endothelial cell-mediated inhibition of intimal hyperplasia. MA Nugent, HM Nugent,
RV Iozzoi, K Sanchack, and ER Edelman. PNAS Jun 2000; 97(12): 6722-6727


(2) Correlation of transarterial transport of various dextrans with their physicochemical properties.
O Elmalak, MA Lovich, E Edelman. Biomaterials 2000; 21: 2263-2272


(3) Arterial Paclitaxel Distribution and Deposition. CJ Creel, MA Lovich, ER Edelman. Circ Res. 2000;86:879-884


(4) Specific binding to intracellular proteins determines arterial transport properties for rapamycin and Paclitaxel.
AD Levin, N Vukmirovic, Chao-Wei Hwang, and ER Edelman. PNAS Jun 2004; 101(25): 9463–9467.
www.pnas.org/cgi/doi/10.1073/pnas.0400918101

(5) Amyloid beta toxicity dependent upon endothelial cell state. M Balcells, JS Wallins, ER Edelman.
Neuroscience Letters 441 (2008) 319–322

Endothelial Damage as an Inflammatory State

Autoimmunity may drive vascular disease through anti-endothelial cell (EC) antibodies. This raises the question of whether the increased cardiovascular morbidity seen in concert with systemic illnesses may involve these antibodies.

Matrix-embedded ECs act as powerful regulators of vascular repair accompanied by significant reduction in expected systemic and local inflammation.

The Lab researchers compared the immune response against free and matrix-embedded ECs in naive mice and in mice with heightened EC immune reactivity. Mice were presensitized to EC with repeated subcutaneous injections of saline-suspended porcine EC (PAE) (5×10^5 cells).

On day 42, both naive mice (controls) and mice with heightened EC immune reactivity received 5×10^5 matrix-embedded or free PAEs. Circulating PAE-specific antibodies and effector T-cells were analyzed 90 days after implantation for

  • PAE-specific antibody-titers,
  • frequency of CD4+-effector cells, and
  • xenoreactive splenocytes

These were 2- to 4-fold lower (P<0.0001) when naive mice were injected with matrix-embedded instead of saline-suspended PAEs.

Though basal levels of circulating antibodies were significantly elevated after serial PAE injections (2210±341 mean fluorescence intensity, day 42) and almost doubled again 90 days after injection of a fourth set of free PAEs, antibody levels declined by half in recipients of matrix-embedded PAEs at day 42 (P<0.0001), as did levels of CD4+ effector cells and xenoreactive splenocytes.

A significant immune response to implantation of free PAE is elicited in naive mice, and it is even more pronounced in mice with pre-developed anti-endothelial immunity.

Matrix-embedding protects xenogeneic ECs against immune reaction in naive mice and in mice with heightened immune reactivity.

Matrix-embedded EC might offer a promising approach for treatment of advanced cardiovascular disease. (1)

Researchers examined the molecular mechanisms through which

mechanical force and hypertension modulate

endothelial cell regulation of vascular homeostasis.

Exposure to mechanical strain increased the paracrine inhibition of vascular smooth muscle cells (VSMCs) by endothelial cells.

Mechanical strain stimulated the production by endothelial cells of perlecan and heparan-sulfate glycosaminoglycans. By inhibiting the expression of perlecan with an antisense vector researchers demonstrated that perlecan was essential to the strain-mediated effects on endothelial cell growth control.

Mechanical regulation of perlecan expression in endothelial cells was

  • governed by a mechano-transduction pathway
  • requiring transforming growth factor (TGF-β) signaling and
  • intracellular signaling through the ERK pathway.

Immunohistochemical staining of the aortae of spontaneously hypertensive rats
demonstrated strong correlations between

  • endothelial TGF-β,
  • phosphorylated signaling intermediates, and
  • arterial thickening.

Studies on ex vivo arteries exposed to varying levels of pressure demonstrated that

ERK and TGF-beta signaling were required for pressure-induced upregulation of endothelial HSPG.

The Team’s findings suggest a novel feedback control mechanism in which

  • net arterial remodeling to hemodynamic forces is controlled by a dynamic interplay between growth stimulatory signals from vSMCs and
  • growth inhibitory signals from endothelial cells. (2)

Heparan sulfate proteoglycans (HSPGs) are potent regulators of vascular remodeling and repair. The major enzyme capable of degrading HSPGs is heparanase, which led the Team to examine
the role of heparanase in controlling

  • arterial structure,
  • mechanics, and
  • remodeling.

In vitro studies suggested heparanase expression in endothelial cells serves as a negative regulator of endothelial inhibition of vascular smooth muscle cell (vSMC) proliferation.

ECs inhibit vSMC proliferation through the interplay between

  • growth stimulatory signals from vSMCs and
  • growth inhibitory signals from ECs.

This would be expected if ECs had HSPGs that are degraded by heparanase.
Arterial structure and remodeling to injury is modified by heparanase expression.
Transgenic mice overexpressing heparanase had

  • increased arterial thickness,
  • cellular density, and
  • mechanical compliance.

Endovascular stenting studies in Zucker rats demonstrated increased heparanase expression in the neointima of obese, hyperlipidemic rats in comparison to lean rats.

The extent of heparanase expression within the neointima strongly correlated with the neointimal thickness following injury. To test the effects of heparanase overexpression on arterial repair, researchers developed a novel murine model of stent injury using small diameter self-expanding stents.

Using this model, researchers found that

  • increased neointimal formation and
  • macrophage recruitment

occur in transgenic mice overexpressing heparanase. Taken together, these results support a role for heparanase in the regulation of arterial structure, mechanics, and repair. (3)

The first host–donor reaction in transplantation occurs at the blood–tissue interface. When endothelial cells are the primary component of the implant (donor), they incite an immunologic reaction. Injections of free endothelial cell implants elicit a profound major histocompatibility complex (MHC) II-dominated immune response.

Endothelial cells embedded within three-dimensional matrices behave like quiescent endothelial cells.

Perivascular implants of such embedded ECs are the most potent inhibitor of intimal hyperplasia and thrombosis following controlled vascular injury, yet without any immune reactivity.

Allo- and even xenogeneic endothelial cells evoke no significant humoral or cellular immune response in immune-competent hosts when embedded within matrices. Moreover, endothelial implants are immune-modulatory, reducing the extent of the memory response to previous free cell implants.

Attenuated immunogenicity results in muted activation of adaptive and innate immune cells. These findings point toward a pivotal role of matrix–cell-interconnectivity for

  • the cellular immune phenotype and might therefore assist in the design  of
  • extracellular matrix components for successful tissue engineering. (4)

Because changes in subendothelial matrix composition are associated with alterations of the endothelial immune phenotype, researchers sought to understand if

  • cytokine-induced NF-κB activity and
  • downstream effects depend on substrate adherence of endothelial cells (EC).

The team compared the upstream

  • phosphorylation cascade,
  • activation of NF-κB, and
  • expression/secretion of downstream effectors

of EC grown on tissue culture polystyrene plates (TCPS) with EC embedded within collagen-based matrices (MEEC).

Adhesion of natural killer (NK) cells was quantified in vitro and in vivo.

  • NF-κB subunit p65 nuclear levels were significantly lower, and
  • p50 levels significantly higher, in cytokine-stimulated MEEC than in EC-TCPS.

Despite similar surface expression of TNF-α receptors, MEEC had significantly decreased secretion and expression of IL-6, IL-8, MCP-1, VCAM-1, and ICAM-1.

Attenuated fractalkine expression and secretion in MEEC (two to threefold lower than in EC-TCPS; p < 0.0002) correlated with 3.7-fold lower NK cell adhesion to EC (6,335 ± 420 vs. 1,735 ± 135 cpm; p < 0.0002).

Furthermore, NK cell infiltration into sites of EC implantation in vivo was significantly reduced when EC were embedded within matrix.

Matrix embedding enables control of EC substratum interaction.

This in turn regulates chemokine and surface-molecule expression and secretion, in particular of those compounds within NF-κB pathways, and thereby

  • chemoattraction of NK cells,
  • local inflammation, and
  • tissue repair. (5)

Monocyte recruitment and interaction with the endothelium is essential to vascular recovery.

Tie2 plays a key role in endothelial health and vascular remodeling.
Researchers studied monocyte-mediated Tie2/angiopoietin signaling following interaction of primary monocytes with endothelial cells and its role in endothelial cell survival.

The direct interaction of primary monocytes with subconfluent endothelial cells resulted in transient secretion of angiopoietin-1 from the monocytes and activation of endothelial Tie2. This effect was abolished by preactivation of monocytes with tumor necrosis factor-α (TNFα).

Although primary monocytes contained high levels of both angiopoietin-1 and angiopoietin-2, endothelial cells contained primarily angiopoietin-2.

Seeding of monocytes on serum-starved endothelial cells reduced caspase-3 activity by 46±5.1% (52±5.8% after TNFα treatment), and decreased single-stranded DNA levels by 41±4.2% and 40±3.5%, respectively.

This protective effect of monocytes on endothelial cells was reversed by Tie2 silencing with specific short interfering RNA.

The antiapoptotic effect of monocytes was further supported by the

  • activation of cell survival signaling pathways involving phosphatidylinositol 3-kinase,
  • STAT3, and
  • AKT.

Monocytes and endothelial cells form a unique Tie2/angiopoietin-1 signaling system that affects endothelial cell survival and may play a critical role in vascular remodeling and homeostasis. (6)

(1) Cell–Matrix Contact Prevents Recognition and Damage of Endothelial Cells in States of Heightened Immunity.
H Methe, ER Edelman. Circulation. 2006;114[suppl I]:I-233–I-238.
http://www.circulationaha.org/DOI/10.1161/CIRCULATIONAHA.105.000687

(2) Endothelial Cells Provide Feedback Control for Vascular Remodeling Through a Mechanosensitive Autocrine
TGFβ Signaling Pathway. AB Baker, DS Ettenson, M Jonas, MA Nugent, RV Iozzo, ER Edelman.
Circ. Res. 2008;103;289-297   http://dx.doi.org/10.1161/CIRCRESAHA.108.179465http://circres.ahajournals.org/cgi/content/full/103/3/289

(3) Heparanase Alters Arterial Structure, Mechanics, and Repair Following Endovascular Stenting in Mice.
AB Baker, A Groothuis, M Jonas, DS Ettenson…ER Edelman.   Circ. Res. 2009;104;380-387;
http://dx.doi.org/10.1161/CIRCRESAHA.108.180695  http://circres.ahajournals.org/cgi/content/full/104/3/380

(4) The effect of three-dimensional matrix-embedding of endothelial cells on the humoral and cellular immune response.
H Methe, S Hess, ER Edelman. Seminars in Immunology 20 (2008) 117–122. http://dx.doi.org/10.1016/j.smim.2007.12.005

(5) NF-kB Activity in Endothelial Cells Is Modulated by Cell Substratum Inter-actions and Influences Chemokine-Mediated
Adhesion of Natural Killer Cells.  S Hess, H Methe, Jong-Oh Kim, ER Edelman.
Cell Transplantation 2009; 18: 261–273


(6) Primary Monocytes Regulate Endothelial Cell Survival Through Secretion of Angiopoietin-1 and Activation of Endothelial Tie2.
SY Schubert, A Benarroch, J Monter-Solans and ER Edelman. Arterioscler Thromb Vasc Biol 2011;31;870-875
http://dx.doi.org/10.1161/ATVBAHA.110.218255

Neointimal Formation, Shear Stress, and Remodelling with Reference to Diabetes

Innate immunity is of major importance in vascular repair. The present study evaluated whether systemic and transient depletion of monocytes and macrophages with liposome-encapsulated bisphosphonates inhibits experimental in-stent neointimal formation.

The Experiment

Rabbits fed on a hypercholesterolemic diet underwent bilateral iliac artery balloon denudation and stent deployment.

Liposomal alendronate (3 or 6 mg/kg) was given concurrently with stenting.

  • Monocyte counts were reduced by 90% 24 to 48 hours after a single injection of liposomal alendronate, returning to basal levels at 6 days.

This treatment significantly reduced

  • intimal area at 28 days, from 3.88±0.93 to 2.08±0.58 and 2.16±0.62 mm².
  • Lumen area was increased from 2.87±0.44 to 3.57±0.65 and 3.45±0.58 mm², and
  • arterial stenosis was reduced from 58±11% to 37±8% and 38±7% in controls, in rabbits treated with 3 mg/kg, and with 6 mg/kg, respectively (mean±SD, n=8 rabbits/group, P<0.01 for all 3 parameters).
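The size of the treatment effect on intimal area can be expressed as a percent reduction from control, using the reported group means:

```python
# Percent reduction in intimal area from the reported means (ref. 1):
# 3.88 mm^2 in controls versus 2.08 and 2.16 mm^2 in the two treatment arms.
control = 3.88
treated = {"3 mg/kg": 2.08, "6 mg/kg": 2.16}

reductions = {dose: 100 * (control - area) / control for dose, area in treated.items()}
for dose, pct in reductions.items():
    print(f"{dose}: {pct:.0f}% reduction in intimal area")
```

Both doses cut intimal area by roughly 45%, consistent with the reported drop in arterial stenosis from 58±11% to 37–38%.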

No drug-related adverse effects were observed.
Reduction in neointimal formation was associated with

  • reduced arterial macrophage infiltration and proliferation at 6 days and with an
  • equal reduction in intimal macrophage and smooth muscle cell content at 28 days after injury.

Conversely, drug regimens ineffective in reducing monocyte levels did not inhibit neointimal formation.
Researchers have shown that a

  • single liposomal bisphosphonates injection concurrent with injury reduces in-stent neointimal formation and
  • arterial stenosis in hypercholesterolemic rabbits, accompanied by systemic transient depletion of monocytes and macrophages. (1)

Diabetes and insulin resistance are associated with increased disease risk and poor outcomes from cardiovascular interventions.

Even drug-eluting stents exhibit reduced efficacy in patients with diabetes.
Researchers reported the first study of vascular response to stent injury in insulin-resistant and diabetic animal models.

Endovascular stents were expanded in the aortae of

  • obese insulin-resistant and
  • type 2 diabetic Zucker rats,
  • in streptozotocin-induced type 1 diabetic Sprague-Dawley rats, and
  • in matched controls.

Insulin-resistant rats developed thicker neointima (0.46±0.08 versus 0.37±0.06 mm², P<0.05), with decreased lumen area (2.95±0.26 versus 3.29±0.15 mm², P<0.03) 14 days after stenting compared with controls, but without increased vascular inflammation (tissue macrophages).

Insulin-resistant and diabetic rat vessels did exhibit markedly altered signaling pathway activation 1 and 2 weeks after stenting, with up to a 98% increase in p-ERK (phospho-ERK)-stained cells and a 54% reduction in p-Akt (phospho-Akt)-stained cells. Western blotting confirmed a profound effect of insulin resistance and diabetes on Akt and ERK signaling in stented segments. The p-ERK/p-Akt ratio in stented segments correlated with the neointimal response (R² = 0.888, P<0.04), but not in lean controls.
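For readers unfamiliar with the R² statistic quoted above, the sketch below shows how it is computed for a predictor/response pair. The data are entirely hypothetical, invented only to demonstrate the calculation; they are not the study's measurements.

```python
import statistics

# Hypothetical, illustrative data only: p-ERK/p-Akt ratio per stented segment
# and the corresponding neointimal area (mm^2).
ratios = [0.8, 1.1, 1.5, 1.9, 2.4]
neointima = [0.30, 0.35, 0.44, 0.49, 0.58]

def r_squared(x, y):
    """Coefficient of determination for a simple linear fit of y on x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

r2 = r_squared(ratios, neointima)
print(f"R^2 = {r2:.3f}")
```

An R² of 0.888 as reported means that variation in the p-ERK/p-Akt ratio accounts for about 89% of the variation in neointimal response across segments.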

Transfemoral aortic stenting in rats provides insight into vascular responses in insulin resistance and diabetes.

Shifts in ERK and Akt signaling related to insulin resistance may reflect altered tissue repair in diabetes, accompanied by a shift in the metabolic:proliferative balance.

These findings may help explain the increased vascular morbidity in diabetes and suggest specific therapies for patients with insulin resistance and diabetes. (2)

Researchers investigated the role of Valsartan (V) alone or in combination with Simvastatin (S) on coronary atherosclerosis and vascular remodeling, and tested the hypothesis that V or V/S attenuate the pro-inflammatory effect of low endothelial shear stress (ESS).

Twenty-four diabetic, hyperlipidemic swine were allocated into Early (n = 12) and Late (n=12) groups.
Diabetic swine in each group were treated with Placebo (n=4), V (n = 4) and V/S (n = 4) and  followed for 8 weeks in the Early group and 30 weeks in the Late group.

Blood pressure, serum cholesterol and glucose were similar across the treatment subgroups.
ESS was calculated in plaque-free subsegments of interest (n = 109) in the Late group at week 23.
Coronary arteries of this group were harvested at week 30, and the subsegments of interest were identified, and analyzed histopathologically.

Intravascular geometrically correct 3-dimensional reconstruction of the coronary arteries of 12 swine was performed 23 weeks after initiation of diabetes mellitus and a hyperlipidemic diet. Local endothelial shear stress was calculated

  • in plaque-free subsegments of interest (n=142) with computational fluid dynamics, and
  • the coronary arteries (n=31) were harvested and the same subsegments were identified at 30 weeks.

V alone or in combination with S:

  • reduced the severity of inflammation in high-risk plaques,
  • attenuated the severity of enzymatic degradation of the arterial wall, reducing the severity of expansive remodeling, and
  • attenuated the pro-inflammatory effect of low ESS.

V alone or with S thus exerts a beneficial effect, reducing and stabilizing high-risk plaque characteristics independent of blood pressure- and lipid-lowering effects. (3)

This study tested the hypothesis that low endothelial shear stress  augments the

  • expression of matrix-degrading proteases, promoting the
  • formation of thin-capped atheromata.

Researchers assessed the messenger RNA and protein expression, and elastolytic activity of selected elastases and their endogenous inhibitors.

Subsegments with low endothelial shear stress at week 23 showed

  • reduced endothelial coverage,
  • enhanced lipid accumulation, and
  • intense infiltration of activated inflammatory cells at week 30.

These lesions showed increased expression of messenger RNAs encoding

  • matrix metalloproteinase-2, -9, and -12, and cathepsins K and S
  • relative to their endogenous inhibitors and
  • increased elastolytic activity.

Expression of these enzymes correlated positively with the severity of internal elastic lamina fragmentation.

Thin-capped atheromata in regions with

  • lower preceding endothelial shear stress had
  • reduced endothelial coverage,
  • intense lipid and inflammatory cell accumulation,
  • enhanced messenger RNA expression and
  • elastolytic activity of MMPs and cathepsins with
  • severe internal elastic lamina fragmentation.

Low endothelial shear stress induces endothelial discontinuity and

  • accumulation of activated inflammatory cells, thereby
  • augmenting the expression and activity of elastases in the intima and
  • shifting the balance with their inhibitors toward matrix breakdown.

Team’s results provide new insight into the mechanisms of regional formation of plaques with thin fibrous caps. (4)

Elevated CRP levels predict an increased incidence of cardiovascular events and poor outcomes following interventions. CRP has also been suggested to be a mediator of vascular injury.

Transgenic mice carrying the human CRP gene (CRPtg) are predisposed to arterial thrombosis post-injury.

Researchers examined whether CRP similarly modulates the proliferative and hyperplastic phases of vascular repair in CRPtg when thrombosis is controlled with daily aspirin and heparin at the time of trans-femoral arterial wire-injury.

Complete thrombotic arterial occlusion at 28 days was comparable for wild-type and CRPtg mice (14% and 19%, respectively). Neointimal area at 28 days was 2.5-fold lower in CRPtg (4190±3134 μm², n = 12) compared to wild-types (10,157±8890 μm², n = 11, p < 0.05).

Likewise, neointimal/media area ratio was 1.10±0.87 in wild-types and 0.45±0.24 in CRPtg (p < 0.05).

  • Seven days post-injury, cellular proliferation and apoptotic cell number in the intima were both less pronounced in CRPtg than wild-type.
  • No differences were seen in leukocyte infiltration or endothelial coverage.
CRPtg mice had significantly reduced p38 MAPK signaling pathway activation following injury.

The pro-thrombotic phenotype of CRPtg mice was suppressed by aspirin/heparin, revealing CRP’s influence on neointimal growth after trans-femoral arterial wire-injury.

  • Signaling pathway activation,
  • cellular proliferation, and
  • neointimal formation

were all reduced in CRPtg following vascular injury.
Increasingly, the Team became aware of CRP's multipotent effects. Once considered only a risk factor, and more recently a harmful agent, CRP is a far more complex regulator of vascular biology. (5)

(1) Liposomal Alendronate Inhibits Systemic Innate Immunity and Reduces In-Stent Neointimal
Hyperplasia in Rabbits. HD Danenberg, G Golomb, A Groothuis, J Gao…, ER Edelman.
Circulation. 2003;108:2798-2804


(2) Vascular Neointimal Formation and Signaling Pathway Activation in Response to Stent Injury
in Insulin-Resistant and Diabetic Animals. M Jonas, ER Edelman, A Groothuis, AB Baker, P Seifert, C Rogers.
Circ. Res. 2005;97:725-733. http://dx.doi.org/10.1161/01.RES.0000183730.52908.C6
http://circres.ahajournals.org/cgi/content/full/97/7/725

(3) Attenuation of inflammation and expansive remodeling by Valsartan alone or in combination with
Simvastatin in high-risk coronary atherosclerotic plaques. YS Chatzizisis, M Jonas, R Beigel, AU Coskun…
ER Edelman, CL Feldman, PH Stone.  Atherosclerosis 203 (2009) 387–394


(4) Augmented Expression and Activity of Extracellular Matrix-Degrading Enzymes in Regions of Low
Endothelial Shear Stress Colocalize With Coronary Atheromata With Thin Fibrous Caps in Pigs.
YS Chatzizisis, AB Baker, GK Sukhova,…P Libby, CL Feldman, ER Edelman, PH Stone
Circulation 2011;123:621-630. http://dx.doi.org/10.1161/CIRCULATIONAHA.110.970038
http://circ.ahajournals.org/cgi/content/full/123/6/621


(5) Neointimal formation is reduced after arterial injury in human crp transgenic mice
HD Danenberg, E Grad, RV Swaminathan, Z Chenc,…ER Edelman
Atherosclerosis 201 (2008) 85–91

A Rattle Bag of Science and the Art of Translation

Science Translational Medicine – A rattle bag of science and the art of translation
E. R. Edelman, G. A. FitzGerald.
Sci.Transl. Med. 3, 104ed3 (2011). http://dx.doi.org/10.1126/scitranslmed.3002131

Elazer R. Edelman is the Thomas D. and Virginia W. Cabot Professor of Health Sciences and Technology at MIT,
Professor of Medicine at Harvard Medical School, a coronary care unit cardiologist at the Brigham and Women’s
Hospital, and Director of the Harvard-MIT Biomedical Engineering Center. E-mail: ere@mit.edu

Garret A. FitzGerald is the McNeil Professor in Translational Medicine and Therapeutics, Chair of the Department of
Pharmacology, and Director of the Institute for Translational Medicine & Therapeutics, University of Pennsylvania.
E-mail: garret@upenn.edu

In 2011, the American Association for the Advancement of Science (AAAS)  founded Science Translational Medicine (STM)
to disseminate interdisciplinary science integrating basic and clinical research that defines and fosters new therapeutics, devices, and diagnostics.

Conceived and nourished under the creative vision of Elias Zerhouni and Katrina Kelner, the journal has attracted widespread attention.
Now, as we assume the mantle of co-chief scientific advisors, we look back on the journal’s early accomplishments, restate our mission, and make clear the kinds of manuscripts we seek and accept for publication.

STM’s mission, as articulated by Elias and Katrina, was to

“promote human health by providing a forum for communication and cross-fertilization among basic, translational, and clinical research practitioners and trainees from all relevant established and emerging disciplines.”

This statement remains relevant and accurate today.
 With this mission on our masthead, STM now receives ~25 manuscripts (full-length research articles) per week and publishes ~10% of them. Roughly half of the submissions are deemed inappropriate for the journal and are returned without review within 8 to 10 days of receipt.

Of those papers that undergo full peer review,

decisions to reject are made within 48 days and

the mean time to acceptance (including the revision period) is 125 days.

There is now an average wait of only 24 days between acceptance and publication.

Defining TRANSLATIONAL Medicine

In accord with the journal’s broad readership, the ideal manuscript meets five criteria: It
(i) reports a discovery of translational relevance with high-impact potential;
(ii) has a conceptual focus with interdisciplinary appeal;
(iii) elucidates a biological mechanism;
(iv) is innovative and novel; and
(v) is presented in clear, broadly accessible language.
STM seeks to publish research that describes how innovative concepts drive the creative biomedical science that ultimately improves the quality of people’s lives.

This is the broadest of our journal’s criteria, but it is also the one that sets us apart.
Translational relevance does not require demonstration of benefit in humans but does require the evident potential to advance clinical medicine, thus impacting the direction of our culture and the welfare of our communities. Conceptual focus and mechanistic emphasis discriminate our papers from those that contain observational descriptions of technical findings for which value is restricted to a specific discipline.

However, innovation and novelty may apply to a fundamental scientific discovery or to the nature of its application and relevance to the translational process. These criteria enable the journal to consider versatile technological advances that apply new and creative thinking but do not necessarily offer fresh insights into biological mechanisms. Finally, while the subsequent efforts of the STM editorial staff are not to be discounted, the clarity of writing and coherence of argument in a submitted manuscript will ease its progress through peer review.

On Causes – Hippocrates, Aristotle, Robert Koch, and the Dread Pirate Roberts

Elazer R. Edelman
Circulation 2001;104:2509-2512

The idea of risk factors for vascular disease has evolved

  • from a dichotomous to continuous hazard analysis and
  • from the consideration of a few factors to
  • mechanistic investigation of many interrelated risks.

However, confusion still abounds regarding issues of association and causation. Originally, the simple presence of

  • tobacco abuse, hypertension, and/or hypercholesterolemia were tallied, and
  • the cumulative score was predictive of subsequent coronary artery disease.

Since then, dose responses have been defined for these and other factors, and it has been suggested that almost 300 factors place patients at risk; among these are elevations in plasma homocysteine.
Recent studies shed interesting light on the mechanism of this potentially causal relationship, which was first noted in 1969.
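
The shift described above, from tallying dichotomous risk factors to continuous dose-response hazard models, can be sketched in a few lines of Python. The logistic coefficients below are illustrative placeholders, not fitted values from any published score:

```python
# Illustrative sketch (not a validated score): the evolution from a
# dichotomous tally of risk factors to a continuous dose-response model.
import math

def tally_score(smoker: bool, hypertensive: bool, hypercholesterolemic: bool) -> int:
    """Early approach: simply count which risk factors are present."""
    return sum([smoker, hypertensive, hypercholesterolemic])

def continuous_risk(sbp_mmhg: float, ldl_mg_dl: float, pack_years: float) -> float:
    """Later approach: a logistic dose-response model with hypothetical
    coefficients, returning a probability-like hazard in (0, 1)."""
    # Coefficients are placeholders chosen only to show monotone dose response.
    z = -8.0 + 0.02 * sbp_mmhg + 0.015 * ldl_mg_dl + 0.03 * pack_years
    return 1.0 / (1.0 + math.exp(-z))

print(tally_score(True, True, False))           # 2
print(round(continuous_risk(150, 160, 20), 3))  # 0.119
```

The point of the second function is that hazard rises smoothly with exposure dose rather than jumping when an arbitrary threshold is crossed.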

Aside from putative effects on vessel wall dynamics, there is now direct evidence that homocysteine is atherogenic. Twenty-fold increases in plasma homocysteine achieved by dietary manipulation of apoE–/– mice increased aortic root lesion size 2-fold and produced a prolonged chronic inflammatory mural response accompanied by elevations in vascular cell adhesion molecule-1 (VCAM-1) and tumor necrosis factor-α (TNF-α).

In long-term follow-up, homocysteine levels elevated by

  • dietary supplementation with methionine or homocysteine
  • promoted lesion size and plaque fibrosis in these
  • atherosclerosis-prone mice early in life, but without influencing ultimate plaque burden as the animals aged.

A number of mechanisms were proposed by which homocysteine achieved this effect, including

  • promotion of inflammation,
  • regulation of lipoprotein metabolism, and
  • modification of critical biochemical pathways and
  • metabolites including nitric oxide (NO).

See p 2569
In the present issue of Circulation,

Stühlinger et al. advance these mechanistic insights one critical step further by defining homocysteine’s effects at an enzymatic level.

The group led by Lentz published an association between levels of the

  • endogenous inhibitor of nitric oxide synthase,
  • asymmetric dimethylarginine (ADMA), and
  • homocysteine in cultured endothelial cells and in the serum of cynomolgus monkeys.

Such an association is interesting because the L-arginine–NO synthase pathway seems to be a critical component in the full range of endothelial cell biology and vascular dysfunction.

Stühlinger et al. now show that increased elaboration of ADMA by cultured endothelial cells exposed to homocysteine and its precursor L-methionine is associated with a dose-dependent impairment of the activity of endothelial dimethylarginine dimethylaminohydrolase (DDAH), the enzyme that degrades ADMA. Homocysteine directly inhibited DDAH activity in a cell-free system by targeting a critical sulfhydryl group on the enzyme.

Thus, one could envision that the balance of cardiovascular health and disease could well be determined by the ability of an intact nitric oxide synthase system to overcome environmental, dietary, and even genetic factors.
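
The enzymatic chain just described (homocysteine inhibits DDAH, ADMA accumulates, and nitric oxide synthase output falls) can be caricatured with a minimal steady-state model. All constants below are hypothetical, chosen only to show the direction of the effect, not measured values:

```python
# Minimal steady-state sketch of the homocysteine -> DDAH -> ADMA -> NO chain.
# Every parameter is an illustrative placeholder.

def ddah_activity(hcy_um: float, ki_um: float = 50.0) -> float:
    """Fractional DDAH activity under homocysteine inhibition
    (simple hyperbolic inhibition; ki_um is a hypothetical constant)."""
    return 1.0 / (1.0 + hcy_um / ki_um)

def steady_state_adma(hcy_um: float, production: float = 1.0,
                      clearance: float = 1.0) -> float:
    """ADMA level where constant production balances DDAH-mediated clearance."""
    return production / (clearance * ddah_activity(hcy_um))

def relative_no_output(hcy_um: float, adma_ic50: float = 2.0) -> float:
    """NOS output assuming ADMA acts as a simple competitive inhibitor."""
    adma = steady_state_adma(hcy_um)
    return 1.0 / (1.0 + adma / adma_ic50)

# As homocysteine climbs from normal toward severely elevated (µM),
# steady-state ADMA rises and relative NO output falls monotonically.
for hcy in (5, 50, 500):
    print(hcy, round(steady_state_adma(hcy), 2), round(relative_no_output(hcy), 2))
```

Even this toy model reproduces the qualitative claim: impaired DDAH lets ADMA accumulate, and the NO deficit grows with homocysteine dose.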

In patients with altered enzymatic defense systems,

  • elevated homocysteine,
  • oxidized lipoproteins,
  • inflammation, and other
  • vasotoxins

may dominate even the most potent defense mechanisms.
These studies raise a number of issues.
Do we need to add to our list of established cardiovascular risk factors to accommodate new findings and associations?
Is there a final common pathway for all risk factors or perhaps even a unified factor theory into which all potential risks can be grouped?
And, as always, should we consider nitric oxide at the core of this universality?
Finally, should we change our focus altogether and speak not of risk factors but of

  • genetic predisposition,
  • extent of biochemical aberration, and
  • degree of physical damage?

Some would view these remarkable success stories and the repeated association of hyperhomocyst(e)inemia with coronary, cerebral, and peripheral vascular disease and simply advocate for increased folic acid intake for all.

Indeed, this intervention of negligible cost and insignificant side effects is already partially in place; many foods are fortified with folate to prevent congenital neural tube defects.

This reader considers the seminal work by Vernon Young and Yves Ingenbleek on the relationship between

  • sulfur (S8) availability and regions distant from lava flows in Asia and the Indian subcontinent,
  • where they documented hyperhomocysteinemia and the consequences associated with:
  • veganism (not voluntary)
  • impaired methyl-donor reactions and transsulfuration pathways (not corrected by B12 or folate)
  • loss of lean body mass due to the constant S:N relationship (insufficient from plant sources)

What happens when we fail to continue to pursue causality,

  • the linkage of biological significance or scientific plausibility with
  • epidemiologically or statistically significant association?

In medicine, risk becomes the likelihood that people without a disease will acquire the disease through contact with factors thought to increase disease risk.

All of these risk factors are then, by nature, imprecise and nonspecific.
They are stochastic measures of what will happen to normal people who fall within particular ranges of these parameters.
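
As a minimal illustration of risk in this stochastic sense, with made-up counts, one can compare disease incidence between exposure groups:

```python
# Risk as defined above: the probability that initially disease-free people
# develop disease, compared across exposure groups. Counts are invented.

def incidence(cases: int, at_risk: int) -> float:
    """Cumulative incidence: new cases divided by the disease-free population."""
    return cases / at_risk

exposed = incidence(30, 1000)    # e.g., 30 of 1000 exposed people develop disease
unexposed = incidence(10, 1000)  # e.g., 10 of 1000 unexposed people develop disease
relative_risk = exposed / unexposed

print(exposed, unexposed, round(relative_risk, 2))  # 0.03 0.01 3.0
```

A relative risk of 3 says nothing about any individual: it is exactly the kind of population-level, nonspecific measure the paragraph above describes.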

The daring may be willing to accept these risks, citing friends and foes who live far longer, or far shorter, than risk alone would anticipate. Such concerns may well become moot if we can simultaneously identify patients at risk

  • by linking phenotype with genotype,
  • gene expression with protein elaboration, and
  • environmental exposures with the biochemical consequences and
  • direct anatomic aberrations they induce.

This kind of characterization may well replace a family history of arterial disease as a rough estimate of

  • genotype,
  • serum cholesterol as an indirect measure of the health of lipoprotein metabolism,
  • serum glucose as a crude determinant of the ravages of diabetes mellitus,
  • blood pressure measurement as a marker of long-standing endogenous exposure to altered flow, and
  • tobacco abuse as a marker of long-standing exposure to exogenous toxins.

Rather than identifying patients on the basis of their serum cholesterol, we will have a direct measure of their

  • LDL receptor number,
  • internalization rate,
  • macrophage content in the blood vessel wall, and
  • metalloproteinase activity, etc.

Serum glucose will similarly give way to direct measures of

  • insulin receptor metabolism,
  • oxidative state, and
  • glycated burden.

Evaluating a new way to open clogged arteries: Computational model offers insight into mechanisms of drug-coated balloons.

A new study from MIT analyzes the potential usefulness of a new treatment that combines the benefits of angioplasty balloons and drug-releasing stents, but may pose fewer risks. With this new approach, a balloon is inflated in the artery for only a brief period, during which it releases a drug that prevents cells from accumulating and clogging the arteries over time.
While approved for limited use in Europe, these drug-coated balloons are still in development in the United States and have not received FDA approval. The MIT study, which models the behavior of the balloons, should help scientists optimize their performance and aid regulators in evaluating their effectiveness and safety.
“Until now, people who evaluate such technology could not distinguish hype from promise,” says Elazer Edelman, the Thomas D. and Virginia W. Cabot Professor of Health Sciences and Technology and senior author of the paper describing the study, which appeared online recently in the journal Circulation.
Lead author of the paper is Vijaya Kolachalama, a former MIT postdoc who is now a principal member of the technical staff at the Charles Stark Draper Laboratory.
Edelman’s lab is investigating a possible alternative to the current treatments: drug-coated balloons. “We’re trying to understand how and when this therapy could work and identify the conditions in which it may not,” Kolachalama says. “It has its merits; it has some disadvantages.”

Modeling drug release

The drug-coated balloons are delivered by a catheter and inflated at the narrowed artery for about 30 seconds, sometimes longer. During that time, the balloon coating, containing a drug such as zotarolimus, is released from the balloon. The properties of the coating allow the drug to be absorbed by the body’s tissues. Once the drug is released, the balloon is removed.
In their new study, Kolachalama, Edelman and colleagues set out to rigorously characterize the properties of the drug-coated balloons. After performing experiments in tissue grown in the lab and in pigs, they developed a computer model that explains the dynamics of drug release and distribution. They found that factors such as the size of the balloon, the duration of delivery time, and the composition of the drug coating all influence how long the drug stays at the injury site and how effectively it clears the arteries.
One significant finding is that when the drug is released, some of it sticks to the lining of the blood vessels. Over time, that drug is slowly released back into the tissue, which explains why the drug’s effects last much longer than the initial 30-second release period.
“This is the first time we can explain the reasons why drug-coated balloons can work,” Kolachalama says. “The study also offers areas where people can consider thinking about optimizing drug transfer and delivery.”
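
A toy two-compartment sketch can reproduce the dynamic the researchers describe: a brief inflation loads drug onto the vessel lining, which then slowly re-releases it into tissue, so tissue drug persists long after the 30-second delivery. All rate constants below are hypothetical and this is not the MIT group's actual model:

```python
# Toy model of drug-coated balloon delivery (all rates hypothetical):
# balloon -> vessel lining (only while inflated) -> slow re-release to tissue.

def simulate(inflation_s: float = 30.0, total_s: float = 3600.0, dt: float = 0.1):
    k_load = 0.05      # balloon -> lining transfer while inflated (1/s)
    k_release = 0.001  # lining -> tissue re-release (1/s)
    k_clear = 0.0005   # tissue clearance (1/s)
    balloon, lining, tissue = 1.0, 0.0, 0.0  # normalized drug amounts
    t = 0.0
    while t < total_s:  # forward-Euler integration
        load = k_load * balloon if t < inflation_s else 0.0
        balloon -= load * dt
        lining += (load - k_release * lining) * dt
        tissue += (k_release * lining - k_clear * tissue) * dt
        t += dt
    return lining, tissue

lining_1h, tissue_1h = simulate()
# An hour after a 30-second inflation, most of the transferred drug has moved
# from the lining into the tissue, where it is still present.
print(round(lining_1h, 3), round(tissue_1h, 3))
```

The qualitative behavior matches the finding quoted above: the lining acts as a depot, so the drug's tissue effect outlasts the brief delivery window by orders of magnitude.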

http://circ.ahajournals.org/content/127/20/2047.short  
http://www.mit.edu/people/vbk/Circulation_2013.pdf 
http://www.sciencedaily.com/…13/05/130521121513.ht…    
Circulation, 2013; 127 (20): 2047 – 2055
http://dx.doi.org/10.1161/CIRCULATIONAHA.113.002051;

 

Conclusion

MIT’s Edelman Lab conducted pioneering work in vascular biology and animal models of drug-eluting stents, and was at the forefront of empirical molecular cardiology through its studies in vascular physiology, biology, and biomaterials for medical devices.

Related articles

MUC1* Ligand, NM23-H1, Is a Novel Growth Factor That Maintains Human Stem Cells in a More Naïve State (plosone.org)

Mass. General team develops implantable, bioengineered rat kidney (eurekalert.org)

Suppression of JAK2/STAT3 Signaling Reduces End-to-End Arterial Anastomosis Induced Cell Proliferation in Common Carotid Arteries of Rats (plosone.org)

Blood Vessel Function and Breathing Control Adversely Affected by Cutting Back on Sleep (medindia.net)

miRNA Biogenesis Enzyme Drosha Is Required for Vascular Smooth Muscle Cell Survival (plosone.org)

Cell-Permeable Peptide Shows Promise For Controlling Cardiovascular Disease (medicalnewstoday.com)

The Heart Revolution, by Kilmer McCully and Martha McCully

HarperCollins Publishers, 1999

http://books.google.com/books?id=iYLbuZFxEt8C&pg=PR20&dq=New+York+Times+homocysteine+and+Cholesterol&hl=en&sa=X&ei=_0F7UfDRA8zB4APozIHQAQ&ved=0CEMQ6AEwAg

 

Other Related Articles that were published on this Open Access Online Scientific Journal include the following:

Modeling Targeted Therapy

Larry H Bernstein, MD, FACP 3/2/2013

Quantum Biology And Computational Medicine

Larry H Bernstein, MD, FACP 4/3/2013

Virtual Biopsy – is it possible?

Larry H Bernstein, MD, FACP 3/3/2013

Reprogramming cell fate  3/2/2013

Larry H Bernstein, MD, FACP

How Methionine Imbalance with Sulfur-Insufficiency Leads to Hyperhomocysteinemia

Larry H Bernstein, MD, FACP 4/4/2013

http://pharmaceuticalintelligence.com/2013/04/04/sulfur-deficiency-and-hyperhomocusteinemia/

Amyloidosis with Cardiomyopathy

Larry H Bernstein, MD, FACP 3/31/2013

http://pharmaceuticalintelligence.com/2013/03/31/amyloidosis-with-cardiomyopathy/

Nitric Oxide, Platelets, Endothelium and Hemostasis

Larry H Bernstein, MD, FACP 11/8/2012

http://pharmaceuticalintelligence.com/2012/11/08/nitric-oxide-platelets-endothelium-and-hemostasis/

Mitochondrial Damage and Repair under Oxidative Stress

Larry H Bernstein, MD, FACP 10/28/2012

http://pharmaceuticalintelligence.com/2012/10/28/mitochondrial-damage-and-repair-under-oxidative-stress/

Endothelial Function and Cardiovascular Disease

Larry H Bernstein, MD, FACP 10/25/2012

http://pharmaceuticalintelligence.com/2012/10/25/endothelial-function-and-cardiovascular-disease/

Revascularization: PCI, Prior History of PCI vs CABG

Aviva Lev-Ari, PhD, RN 4/25/2013

http://pharmaceuticalintelligence.com/2013/04/25/revascularization-pci-prior-history-of-pci-vs-cabg/

Cholesteryl Ester Transfer Protein (CETP) Inhibitor: Potential of Anacetrapib to treat Atherosclerosis and CAD

Aviva Lev-Ari, PhD, RN 4/7/2013

http://pharmaceuticalintelligence.com/2013/04/07/cholesteryl-ester-transfer-protein-cetp-inhibitor-potential-of-anacetrapib-to-treat-atherosclerosis-and-cad/

Hypertriglyceridemia concurrent Hyperlipidemia: Vertical Density Gradient Ultracentrifugation a Better Test to Prevent Undertreatment of High-Risk Cardiac Patients

Aviva Lev-Ari, PhD, RN 4/4/2013

http://pharmaceuticalintelligence.com/2013/04/04/hypertriglyceridemia-concurrent-hyperlipidemia-vertical-density-gradient-ultracentrifugation-a-better-test-to-prevent-undertreatment-of-high-risk-cardiac-patients/

Fight against Atherosclerotic Cardiovascular Disease: A Biologics not a Small Molecule – Recombinant Human lecithin-cholesterol acyltransferase (rhLCAT) attracted AstraZeneca to acquire AlphaCore

Aviva Lev-Ari, PhD, RN 4/3/2013

http://pharmaceuticalintelligence.com/2013/04/03/fight-against-atherosclerotic-cardiovascular-disease-a-biologics-not-a-small-molecule-recombinant-human-lecithin-cholesterol-acyltransferase-rhlcat-attracted-astrazeneca-to-acquire-alphacore/

High-Density Lipoprotein (HDL): An Independent Predictor of Endothelial Function & Atherosclerosis, A Modulator, An Agonist, A Biomarker for Cardiovascular Risk

Aviva Lev-Ari, PhD, RN 3/31/2013

http://pharmaceuticalintelligence.com/2013/03/31/high-density-lipoprotein-hdl-an-independent-predictor-of-endothelial-function-artherosclerosis-a-modulator-an-agonist-a-biomarker-for-cardiovascular-risk/

Acute Chest Pain/ER Admission: Three Emerging Alternatives to Angiography and PCI

Aviva Lev-Ari, PhD, RN 3/10/2013

http://pharmaceuticalintelligence.com/2013/03/10/acute-chest-painer-admission-three-emerging-alternatives-to-angiography-and-pci/

Genomics & Genetics of Cardiovascular Disease Diagnoses: A Literature Survey of AHA’s Circulation Cardiovascular Genetics, 3/2010 – 3/2013

Lev-Ari, A. and L H Bernstein 3/7/2013

http://pharmaceuticalintelligence.com/2013/03/07/genomics-genetics-of-cardiovascular-disease-diagnoses-a-literature-survey-of-ahas-circulation-cardiovascular-genetics-32010-32013/

The Heart: Vasculature Protection – A Concept-based Pharmacological Therapy including THYMOSIN

Aviva Lev-Ari, PhD, RN 2/28/2013

http://pharmaceuticalintelligence.com/2013/02/28/the-heart-vasculature-protection-a-concept-based-pharmacological-therapy-including-thymosin/

Arteriogenesis and Cardiac Repair: Two Biomaterials – Injectable Thymosin beta4 and Myocardial Matrix Hydrogel

Aviva Lev-Ari, PhD, RN 2/27/2013

http://pharmaceuticalintelligence.com/2013/02/27/arteriogenesis-and-cardiac-repair-two-biomaterials-injectable-thymosin-beta4-and-myocardial-matrix-hydrogel/

Coronary artery disease in symptomatic patients referred for coronary angiography: Predicted by Serum Protein Profiles

Aviva Lev-Ari, PhD, RN 12/29/2012

http://pharmaceuticalintelligence.com/2012/12/29/coronary-artery-disease-in-symptomatic-patients-referred-for-coronary-angiography-predicted-by-serum-protein-profiles/

Special Considerations in Blood Lipoproteins, Viscosity, Assessment and Treatment

Bernstein, HL and Lev-Ari, A. 11/28/2012

http://pharmaceuticalintelligence.com/2012/11/28/special-considerations-in-blood-lipoproteins-viscosity-assessment-and-treatment/

Peroxisome proliferator-activated receptor (PPAR-gamma) Receptors Activation: PPARγ transrepression for Angiogenesis in Cardiovascular Disease and PPARγ transactivation for Treatment of Diabetes

Aviva Lev-Ari, PhD, RN 11/13/2012

http://pharmaceuticalintelligence.com/2012/11/13/peroxisome-proliferator-activated-receptor-ppar-gamma-receptors-activation-pparγ-transrepression-for-angiogenesis-in-cardiovascular-disease-and-pparγ-transactivation-for-treatment-of-dia/

Clinical Trials Results for Endothelin System: Pathophysiological role in Chronic Heart Failure, Acute Coronary Syndromes and MI – Marker of Disease Severity or Genetic Determination?

Aviva Lev-Ari, PhD, RN 10/19/2012

http://pharmaceuticalintelligence.com/2012/10/19/clinical-trials-results-for-endothelin-system-pathophysiological-role-in-chronic-heart-failure-acute-coronary-syndromes-and-mi-marker-of-disease-severity-or-genetic-determination/

Endothelin Receptors in Cardiovascular Diseases: The Role of eNOS Stimulation

Aviva Lev-Ari, PhD, RN 10/4/2012

http://pharmaceuticalintelligence.com/2012/10/04/endothelin-receptors-in-cardiovascular-diseases-the-role-of-enos-stimulation/

Inhibition of ET-1, ETA and ETA-ETB, Induction of NO production, stimulation of eNOS and Treatment Regime with PPAR-gamma agonists (TZD): cEPCs Endogenous Augmentation for Cardiovascular Risk Reduction – A Bibliography

Aviva Lev-Ari, PhD, RN 10/4/2012

http://pharmaceuticalintelligence.com/2012/10/04/inhibition-of-et-1-eta-and-eta-etb-induction-of-no-production-and-stimulation-of-enos-and-treatment-regime-with-ppar-gamma-agonists-tzd-cepcs-endogenous-augmentation-for-cardiovascular-risk-reduc/

Positioning a Therapeutic Concept for Endogenous Augmentation of cEPCs — Therapeutic Indications for Macrovascular Disease: Coronary, Cerebrovascular and Peripheral

Aviva Lev-Ari, PhD, RN 8/29/2012

http://pharmaceuticalintelligence.com/2012/08/29/positioning-a-therapeutic-concept-for-endogenous-augmentation-of-cepcs-therapeutic-indications-for-macrovascular-disease-coronary-cerebrovascular-and-peripheral/

Cardiovascular Outcomes: Function of circulating Endothelial Progenitor Cells (cEPCs): Exploring Pharmaco-therapy targeted at Endogenous Augmentation of cEPCs

Aviva Lev-Ari, PhD, RN 8/28/2012

http://pharmaceuticalintelligence.com/2012/08/28/cardiovascular-outcomes-function-of-circulating-endothelial-progenitor-cells-cepcs-exploring-pharmaco-therapy-targeted-at-endogenous-augmentation-of-cepcs/

Endothelial Dysfunction, Diminished Availability of cEPCs, Increasing CVD Risk for Macrovascular Disease – Therapeutic Potential of cEPCs

Aviva Lev-Ari, PhD, RN 8/27/2012

http://pharmaceuticalintelligence.com/2012/08/27/endothelial-dysfunction-diminished-availability-of-cepcs-increasing-cvd-risk-for-macrovascular-disease-therapeutic-potential-of-cepcs/

Vascular Medicine and Biology: CLASSIFICATION OF FAST ACTING THERAPY FOR PATIENTS AT HIGH RISK FOR MACROVASCULAR EVENTS Macrovascular Disease – Therapeutic Potential of cEPCs

Aviva Lev-Ari, PhD, RN 8/24/2012

http://pharmaceuticalintelligence.com/2012/08/24/vascular-medicine-and-biology-classification-of-fast-acting-therapy-for-patients-at-high-risk-for-macrovascular-events-macrovascular-disease-therapeutic-potential-of-cepcs/

Cardiovascular Disease (CVD) and the Role of agent alternatives in endothelial Nitric Oxide Synthase (eNOS) Activation and Nitric Oxide Production

Aviva Lev-Ari, PhD, RN 7/19/2012

http://pharmaceuticalintelligence.com/2012/07/19/cardiovascular-disease-cvd-and-the-role-of-agent-alternatives-in-endothelial-nitric-oxide-synthase-enos-activation-and-nitric-oxide-production/

Resident-cell-based Therapy in Human Ischaemic Heart Disease: Evolution in the PROMISE of Thymosin beta4 for Cardiac Repair

Aviva Lev-Ari, PhD, RN 4/30/2012

http://pharmaceuticalintelligence.com/2012/04/30/93/

Triple Antihypertensive Combination Therapy Significantly Lowers Blood Pressure in Hard-to-Treat Patients with Hypertension and Diabetes

Aviva Lev-Ari, PhD, RN 5/29/2012

http://pharmaceuticalintelligence.com/2012/05/29/445/

Macrovascular Disease – Therapeutic Potential of cEPCs: Reduction Methods for CV Risk

Aviva Lev-Ari, PhD, RN 7/2/2012

http://pharmaceuticalintelligence.com/2012/07/02/macrovascular-disease-therapeutic-potential-of-cepcs-reduction-methods-for-cv-risk/

Mitochondria Dysfunction and Cardiovascular Disease – Mitochondria: More than just the “powerhouse of the cell”

Aviva Lev-Ari, PhD, RN 7/9/2012

http://pharmaceuticalintelligence.com/2012/07/09/mitochondria-more-than-just-the-powerhouse-of-the-cell/

Bystolic’s generic Nebivolol – positive effect on circulating Endothelial Progenitor Cells endogenous augmentation

Aviva Lev-Ari, PhD, RN 7/16/2012

http://pharmaceuticalintelligence.com/2012/07/16/bystolics-generic-nebivolol-positive-effect-on-circulating-endothilial-progrnetor-cells-endogenous-augmentation/


Cardiac Surgery Theatre in China vs. in the US: Cardiac Repair Procedures, Medical Devices in Use, Technology in Hospitals, Surgeons’ Training and Cardiac Disease Severity”

Aviva Lev-Ari, PhD, RN 1/8/2013

http://pharmaceuticalintelligence.com/2013/01/08/cardiac-surgery-theatre-in-china-vs-in-the-us-cardiac-repair-procedures-medical-devices-in-use-technology-in-hospitals-surgeons-training-and-cardiac-disease-severity/

Heart Remodeling by Design – Implantable Synchronized Cardiac Assist Device: Abiomed’s Symphony

Aviva Lev-Ari, PhD, RN 7/23/2012

http://pharmaceuticalintelligence.com/2012/07/23/heart-remodeling-by-design-implantable-synchronized-cardiac-assist-device-abiomeds-symphony/


Dilated Cardiomyopathy: Decisions on implantable cardioverter-defibrillators (ICDs) using left ventricular ejection fraction (LVEF) and Midwall Fibrosis: Decisions on Replacement using late gadolinium enhancement cardiovascular MR (LGE-CMR)

Aviva Lev-Ari, PhD, RN 3/10/2013
http://pharmaceuticalintelligence.com/2013/03/10/dilated-cardiomyopathy-decisions-on-implantable-cardioverter-defibrillators-icds-using-left-ventricular-ejection-fraction-lvef-and-midwall-fibrosis-decisions-on-replacement-using-late-gadolinium/


FDA Pending 510(k) for The Latest Cardiovascular Imaging Technology

Aviva Lev-Ari, PhD, RN 1/28/2013
http://pharmaceuticalintelligence.com/2013/01/28/fda-pending-510k-for-the-latest-cardiovascular-imaging-technology/

PCI Outcomes, Increased Ischemic Risk associated with Elevated Plasma Fibrinogen not Platelet Reactivity

Aviva Lev-Ari, PhD, RN 1/10/2013
http://pharmaceuticalintelligence.com/2013/01/10/pci-outcomes-increased-ischemic-risk-associated-with-elevated-plasma-fibrinogen-not-platelet-reactivity/

The ACUITY-PCI score: Will it Replace Four Established Risk Scores — TIMI, GRACE, SYNTAX, and Clinical SYNTAX

Aviva Lev-Ari, PhD, RN
http://pharmaceuticalintelligence.com/2013/01/03/the-acuity-pci-score-will-it-replace-four-established-risk-scores-timi-grace-syntax-and-clinical-syntax/


Heart Renewal by pre-existing Cardiomyocytes: Source of New Heart Cell Growth Discovered

Aviva Lev-Ari, PhD, RN 12/23/2012
http://pharmaceuticalintelligence.com/2012/12/23/heart-renewal-by-pre-existing-cardiomyocytes-source-of-new-heart-cell-growth-discovered/

Cardiovascular Risk Inflammatory Marker: Risk Assessment for Coronary Heart Disease and Ischemic Stroke – Atherosclerosis.

Aviva Lev-Ari, PhD, RN 10/30/2012
http://pharmaceuticalintelligence.com/2012/10/30/cardiovascular-risk-inflammatory-marker-risk-assessment-for-coronary-heart-disease-and-ischemic-stroke-atherosclerosis/

To Stent or Not? A Critical Decision

Aviva Lev-Ari, PhD, RN 10/23/2012
http://pharmaceuticalintelligence.com/2012/10/23/to-stent-or-not-a-critical-decision/

New Definition of MI Unveiled, Fractional Flow Reserve (FFR)CT for Tagging Ischemia

Aviva Lev-Ari, PhD, RN 8/27/2012
http://pharmaceuticalintelligence.com/2012/08/27/new-definition-of-mi-unveiled-fractional-flow-reserve-ffrct-for-tagging-ischemia/

Ethical Considerations in Studying Drug Safety — The Institute of Medicine Report

Aviva Lev-Ari, PhD, RN 8/23/2012
http://pharmaceuticalintelligence.com/2012/08/23/ethical-considerations-in-studying-drug-safety-the-institute-of-medicine-report/

New Drug-Eluting Stent Works Well in STEMI

Aviva Lev-Ari, PhD, RN 8/22/2012
http://pharmaceuticalintelligence.com/2012/08/22/new-drug-eluting-stent-works-well-in-stemi/

Expected New Trends in Cardiology and Cardiovascular Medical Devices

Aviva Lev-Ari, PhD, RN 8/17/2012
http://pharmaceuticalintelligence.com/2012/08/17/expected-new-trends-in-cardiology-and-cardiovascular-medical-devices/

Coronary Artery Disease – Medical Devices Solutions: From First-In-Man Stent Implantation, via Medical Ethical Dilemmas to Drug Eluting Stents

Aviva Lev-Ari, PhD, RN 8/13/2012

http://pharmaceuticalintelligence.com/2012/08/13/coronary-artery-disease-medical-devices-solutions-from-first-in-man-stent-implantation-via-medical-ethical-dilemmas-to-drug-eluting-stents/

Percutaneous Endocardial Ablation of Scar-Related Ventricular Tachycardia

Aviva Lev-Ari, PhD, RN 7/18/2012

http://pharmaceuticalintelligence.com/2012/07/18/percutaneous-endocardial-ablation-of-scar-related-ventricular-tachycardia/

Competition in the Ecosystem of Medical Devices in Cardiac and Vascular Repair: Heart Valves, Stents, Catheterization Tools and Kits for Open Heart and Minimally Invasive Surgery (MIS)

Aviva Lev-Ari, PhD, RN 6/22/2012

http://pharmaceuticalintelligence.com/2012/06/22/competition-in-the-ecosystem-of-medical-devices-in-cardiac-and-vascular-repair-heart-valves-stents-catheterization-tools-and-kits-for-open-heart-and-minimally-invasive-surgery-mis/

Global Supplier Strategy for Market Penetration & Partnership Options (Niche Suppliers vs. National Leaders) in the Massachusetts Cardiology & Vascular Surgery Tools and Devices Market for Cardiac Operating Rooms and Angioplasty Suites

Aviva Lev-Ari, PhD, RN 6/22/2012

http://pharmaceuticalintelligence.com/2012/06/22/global-supplier-strategy-for-market-penetration-partnership-options-niche-suppliers-vs-national-leaders-in-the-massachusetts-cardiology-vascular-surgery-tools-and-devices-market-for-car/

Blood vessels (Photo credit: shoebappa)

Visceral Myopathy in Statins (Photo credit: Snipergirl)

Medical science has advanced significantly since 1507, when Leonardo da Vinci drew this diagram of the internal organs and vascular systems of a woman. (Photo credit: Wikipedia)

Lee Hood, MD, PhD, President and Co-founder of the Institute for Systems Biology (Photo credit: Wikipedia)

Read Full Post »

Economic Toll of Heart Failure in the US: Forecasting the Impact of Heart Failure in the United States – A Policy Statement From the American Heart Association

Reporter: Aviva Lev-Ari, PhD, RN

 

  • AHA Policy Statement

Forecasting the Impact of Heart Failure in the United States

A Policy Statement From the American Heart Association

  1. Paul A. Heidenreich, MD, MS, FAHA, Chair,

  2. Nancy M. Albert, PhD, RN, FAHA,
  3. Larry A. Allen, MD, MHS,
  4. David A. Bluemke, MD, PhD, FAHA,
  5. Javed Butler, MD, MPH, FAHA,
  6. Gregg C. Fonarow, MD, FAHA,
  7. John S. Ikonomidis, MD, PhD, FRCS(C), FAHA,
  8. Olga Khavjou, MA,
  9. Marvin A. Konstam, MD,
  10. Thomas M. Maddox, MD, MSc,
  11. Graham Nichol, MD, MPH, FRCP(C), FAHA,
  12. Michael Pham, MD, MPH,
  13. Ileana L. Piña, MD, MPH, FAHA,
  14. Justin G. Trogdon, PhD and
  15. on behalf of the American Heart Association Advocacy Coordinating Committee:
  • Council on Arteriosclerosis, Thrombosis and Vascular Biology,
  • Council on Cardiovascular Radiology and Intervention,
  • Council on Clinical Cardiology,
  • Council on Epidemiology and Prevention, and
  • Stroke Council

Abstract

Background—Heart failure (HF) is an important contributor to both the burden and cost of national healthcare expenditures, with more older Americans hospitalized for HF than for any other medical condition. With the aging of the population, the impact of HF is expected to increase substantially.

Methods and Results—We estimated future costs of HF by adapting a methodology developed by the American Heart Association to project the epidemiology and future costs of HF from 2012 to 2030 without double counting the costs attributed to comorbid conditions. The model assumes that HF prevalence will remain constant by age, sex, and race/ethnicity and that rising costs and technological innovation will continue at the same rate.

By 2030,

  • >8 million people in the United States (1 in every 33) will have HF.
  • Between 2012 and 2030, real (2010$) total direct medical costs of HF are projected to increase from $21 billion to $53 billion.
  • Total costs, including indirect costs for HF, are estimated to increase from $31 billion in 2012 to $70 billion in 2030.
  • If one assumes all costs of cardiac care for HF patients are attributable to HF (no cost attribution to comorbid conditions), the 2030 projected cost estimates of treating patients with HF will be 3-fold higher ($160 billion in direct costs).

Conclusions—The estimated prevalence and cost of care for HF will increase markedly because of aging of the population. Strategies to prevent HF and improve the efficiency of care are needed.
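The projections above imply modest but compounding annual growth. As a quick arithmetic check (a minimal sketch using only the 2012 and 2030 dollar figures stated in this abstract; the function name is illustrative, not from the statement):

```python
# Implied compound annual growth rates from the AHA projections:
# direct costs $21B (2012) -> $53B (2030); total costs $31B -> $70B.

def cagr(start: float, end: float, years: int) -> float:
    """Average annual growth rate taking `start` to `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

years = 2030 - 2012  # 18-year projection horizon
direct = cagr(21, 53, years)   # ~5.3% per year
total = cagr(31, 70, years)    # ~4.6% per year
print(f"direct: {direct:.1%}, total: {total:.1%}")
```

Direct medical costs grow somewhat faster than total (direct plus indirect) costs in these projections, consistent with the statement's emphasis on rising treatment costs in an aging population.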

http://circheartfailure.ahajournals.org/content/early/2013/04/24/HHF.0b013e318291329a.abstract

15-page PDF at the link below:

http://circheartfailure.ahajournals.org/content/early/2013/04/24/HHF.0b013e318291329a.full.pdf+html?sid=ad1efd74-a4e1-45b0-8a47-350e85435487

REFERENCE

Four Policy Statements From the American Heart Association

  1. AHA Policy Statement: Forecasting the Impact of Heart Failure in the United States: A Policy Statement From the American Heart Association

    • Paul A. Heidenreich,
    • Nancy M. Albert,
    • Larry A. Allen,
    • David A. Bluemke,
    • Javed Butler,
    • Gregg C. Fonarow,
    • John S. Ikonomidis,
    • Olga Khavjou,
    • Marvin A. Konstam,
    • Thomas M. Maddox,
    • Graham Nichol,
    • Michael Pham,
    • Ileana L. Piña,
    • and Justin G. Trogdon

    Circ Heart Fail. 2013; published online before print April 24, 2013. doi:10.1161/HHF.0b013e318291329a

  2. Special Report: Statement Regarding the Pre and Post Market Assessment of Durable, Implantable Ventricular Assist Devices in the United States

    • Michael A. Acker,
    • Francis D. Pagani,
    • Wendy Gattis Stough,
    • Douglas L. Mann,
    • Mariell Jessup,
    • Robert Kormos,
    • Mark S. Slaughter,
    • Timothy Baldwin,
    • Lynne Stevenson,
    • Keith D. Aaronson,
    • Leslie Miller,
    • David Naftel,
    • Clyde Yancy,
    • Joseph Rogers,
    • Jeffrey Teuteberg,
    • Randall C. Starling,
    • Bartley Griffith,
    • Steven Boyce,
    • Stephen Westaby,
    • Elizabeth Blume,
    • Peter Wearden,
    • Robert Higgins,
    • and Michael Mack

    Circ Heart Fail. 2013;6:e1-e11; published online before print November 12, 2012. doi:10.1161/HHF.0b013e318279f6b5

  3. Special Report: Statement Regarding the Pre and Post Market Assessment of Durable, Implantable Ventricular Assist Devices in the United States: Executive Summary

    • Michael A. Acker,
    • Francis D. Pagani,
    • Wendy Gattis Stough,
    • Douglas L. Mann,
    • Mariell Jessup,
    • Robert Kormos,
    • Mark S. Slaughter,
    • Timothy Baldwin,
    • Lynne Stevenson,
    • Keith D. Aaronson,
    • Leslie Miller,
    • David Naftel,
    • Clyde Yancy,
    • Joseph Rogers,
    • Jeffrey Teuteberg,
    • Randall C. Starling,
    • Bartley Griffith,
    • Steven Boyce,
    • Stephen Westaby,
    • Elizabeth Blume,
    • Peter Wearden,
    • Robert Higgins,
    • and Michael Mack

    Circ Heart Fail. 2013;6:145-150; published online before print November 12, 2012. doi:10.1161/HHF.0b013e318279f55d

  4. ACCF/AHA/HFSA Data and Survey Report: ACCF/AHA/HFSA 2011 Survey Results: Current Staffing Profile of Heart Failure Programs, Including Programs That Perform Heart Transplant and Mechanical Circulatory Support Device Implantation: A Report of the ACCF Heart Failure and Transplant Committee, AHA Heart Failure and Transplantation Committee, and Heart Failure Society of America

    • Mariell Jessup,
    • Nancy M. Albert,
    • David E. Lanfear,
    • JoAnn Lindenfeld,
    • Barry M. Massie,
    • Mary Norine Walsh,
    • and Mark J. Zucker

    Circ Heart Fail. 2011;4:378-387; published online before print April 4, 2011. doi:10.1161/HHF.0b013e3182186210


http://circheartfailure.ahajournals.org/search?fulltext=AHA+Scientific+Statements&sortspec=date&submit=Submit&andorexactfulltext=phrase

Read Full Post »
