
Archive for the ‘Cardiovascular Pharmacogenomics’ Category


Regulatory MicroRNAs in Aberrant Cholesterol Transport and Metabolism

Curator: Marzan Khan, B.Sc.

Aberrant lipid levels and cholesterol accumulation in the body lead to cardiometabolic disorders such as atherosclerosis, one of the leading causes of death in the Western world(1). The physical manifestation of this condition is the build-up of plaque along the arterial endothelium, which constricts the arteries and impedes smooth blood flow(2). This obstructive deposition of plaque, which marks the initiation of atherosclerosis, is enriched in LDL cholesterol (LDL-C) as well as foam cells, macrophages laden with toxic, oxidized LDL(2). As the condition progresses, the plaque further obstructs blood flow and promotes blood clots, ultimately leading to myocardial infarction, stroke and other cardiovascular diseases(2). For this reason, LDL is referred to as “the bad cholesterol”(2).

To date, statins have been the most widely prescribed lipid-lowering drugs; they inhibit the enzyme 3-hydroxy-3-methylglutaryl-CoA reductase (HMGCR), which catalyzes the rate-limiting step in de novo cholesterol biosynthesis(1). However, some people cannot continue the medication because of its harmful side effects(1). With the need to develop newer therapeutics to combat cardiovascular diseases, Harvard University researchers at Massachusetts General Hospital discovered four microRNAs that control cholesterol, triglyceride, and glucose homeostasis(3).

MicroRNAs are non-coding regulatory elements, approximately 22 nucleotides long, that control the post-transcriptional expression of genes(3). The liver is the center of carbohydrate and lipid metabolism, and stringent regulation of the endogenous LDL-receptor (LDL-R) pathway in the liver is crucial to maintaining a minimal concentration of LDL particles in the blood(3). Peripheral tissues and macrophages rid themselves of excess cholesterol through a mechanism mediated by ATP-binding cassette, subfamily A, member 1 (ABCA1)(3). ABCA1 generates nascent HDL particles, dubbed “the good cholesterol”, which travel back to the liver, where their cholesterol and triglyceride contents are excreted(3).

A meta-analysis of genome-wide association studies (GWAS) carried out by the researchers revealed four microRNAs (miR-128-1, miR-148a, miR-130b, and miR-301b) lying close to single-nucleotide polymorphisms (SNPs) associated with abnormal lipid and cholesterol metabolism and transport(3). Experimental analyses in relevant cell types, such as liver cells and macrophages, showed that these microRNAs bind to the 3’ UTRs of both LDL-R and ABCA1 transcripts and silence their expression. Overexpression of miR-128-1 and miR-148a in mouse models caused circulating HDL-C to drop. Corroborating the hypothesis further, their inhibition led to increased clearance of LDL from the blood and greater LDL accumulation in the liver(3).
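To make the silencing mechanism concrete, here is a minimal sketch of canonical miRNA seed matching: scanning a 3’ UTR for sites complementary to nucleotides 2-8 of the miRNA, the region that dominates target recognition. This is an illustration only, not the authors’ analysis pipeline; the 3’ UTR fragment below is invented, and the miR-148a-3p sequence should be checked against miRBase before any real use.

```python
# Minimal sketch of canonical miRNA seed matching (illustrative only).

def seed_match_sites(mirna, utr, seed_len=7):
    """Return start positions in a 3' UTR that pair with the miRNA seed
    (nucleotides 2-8). Sequences are 5'->3' RNA strings (A/C/G/U)."""
    complement = {"A": "U", "C": "G", "G": "C", "U": "A"}
    seed = mirna[1:1 + seed_len]  # positions 2-8 of the miRNA
    # A target site in the UTR is the reverse complement of the seed.
    site = "".join(complement[nt] for nt in reversed(seed))
    return [i for i in range(len(utr) - seed_len + 1)
            if utr[i:i + seed_len] == site]

miR_148a_3p = "UCAGUGCACUACAGAACUUUGU"      # mature sequence (check miRBase)
utr_fragment = "AAAUGCACUGAAACCUGCACUGUUU"  # made-up 3' UTR fragment
print(seed_match_sites(miR_148a_3p, utr_fragment))  # -> [3, 15]
```

A bare seed match is only a first-pass filter; tools such as TargetScan additionally weigh site context and conservation, and the cell-based experiments described above remain the decisive evidence.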

The finding that antisense inhibition of miR-128-1 increased insulin signaling in mice suggests that abnormal expression of miR-128-1 may promote insulin resistance in the metabolic syndrome, and defective insulin signaling in hepatic steatosis and dyslipidemia(3).

Further examination of miR-148a established that Liver X Receptor (LXR) activation of sterol regulatory element-binding protein 1c (SREBP-1c), the transcription factor that controls fatty acid production and glucose metabolism, also drives the expression of miR-148a(4,5). Chromatin immunoprecipitation combined with massively parallel sequencing (ChIP-seq) showed that the miR-148a promoter contains binding sites for SREBP-1c(4). More specifically, SREBP-1c binds the E-box2, E-box3 and E-box4 elements in the miR-148a promoter to control its expression(4).
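As a small illustration of what such a motif search involves, the sketch below scans a promoter sequence for E-box elements, whose consensus is CANNTG. The sequence is a placeholder, not the real miR-148a promoter, and an actual ChIP-seq analysis starts from aligned reads and called peaks rather than a single string.

```python
import re

# Scan a promoter fragment for E-box elements (consensus CANNTG).
# The sequence below is hypothetical, not the miR-148a locus.
EBOX = re.compile(r"CA..TG")  # '.' matches any nucleotide (the two Ns)

promoter = "GGGCACGTGATTTCAGCTGCCCATCACGTGTT"
for m in EBOX.finditer(promoter):
    print(f"E-box at position {m.start()}: {m.group()}")
```

Note that a consensus hit is only a candidate site; binding still has to be confirmed by ChIP enrichment, as the authors did.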

Earlier, the same group, led by Anders Näär, had found that another microRNA, miR-33, blocks HDL generation, and that this blockade reverses upon antisense targeting of miR-33(6).

These experimental data support the view that miRNAs are important regulators of lipoprotein receptors and transporter proteins, and they underscore the promise of antisense technologies for reversing the gene-silencing effects of these miRNAs on LDL-R and ABCA1(4). Such a therapeutic approach, which would lower LDL-C and raise HDL-C, appears to be a promising strategy for treating atherosclerosis and other cardiovascular diseases(4).

References:

1. Goedeke L, Wagschal A, Fernández-Hernando C, Näär AM. miRNA regulation of LDL-cholesterol metabolism. Biochim Biophys Acta. 2016 Dec;1861(12 Pt B):2047-2052.

https://www.ncbi.nlm.nih.gov/pubmed/26968099

2. Nordqvist J. Atherosclerosis: Causes, Symptoms and Treatments. Medical News Today. 13.08.2015.

http://www.medicalnewstoday.com/articles/247837.php

3. Wagschal A, Najafi-Shoushtari SH, Wang L, Goedeke L, Sinha S, deLemos AS, Black JC, Ramírez CM, Li Y, Tewhey R, Hatoum I, Shah N, Lu Y, Kristo F, Psychogios N, Vrbanac V, Lu YC, Hla T, de Cabo R, Tsang JS, Schadt E, Sabeti PC, Kathiresan S, Cohen DE, Whetstine J, Chung RT, Fernández-Hernando C, Kaplan LM, Bernards A, Gerszten RE, Näär AM. Genome-wide identification of microRNAs regulating cholesterol and triglyceride homeostasis. Nat Med. 2015 Nov;21(11):1290.

https://www.ncbi.nlm.nih.gov/pubmed/26501192

4. Goedeke L, Rotllan N, Canfrán-Duque A, Aranda JF, Ramírez CM, Araldi E, Lin CS, Anderson NN, Wagschal A, de Cabo R, Horton JD, Lasunción MA, Näär AM, Suárez Y, Fernández-Hernando C. MicroRNA-148a regulates LDL receptor and ABCA1 expression to control circulating lipoprotein levels. Nat Med. 2015 Nov;21(11):1280-9.

https://www.ncbi.nlm.nih.gov/pubmed/26437365

5. Eberlé D, Hegarty B, Bossard P, Ferré P, Foufelle F. SREBP transcription factors: master regulators of lipid homeostasis. Biochimie. 2004 Nov;86(11):839-48.

https://www.ncbi.nlm.nih.gov/pubmed/15589694

6. Harvard Medical School News. MicroRNAs and Metabolism.

https://hms.harvard.edu/news/micrornas-and-metabolism

7. Massachusetts General Hospital press release. Four microRNAs identified as playing key roles in cholesterol, lipid metabolism.

http://www.massgeneral.org/about/pressrelease.aspx?id=1862

 

Other related articles published in this Open Access Online Scientific Journal include the following:

 

  • Cardiovascular Diseases, Volume Three: Etiologies of Cardiovascular Diseases: Epigenetics, Genetics and Genomics,

on Amazon since 11/29/2015

http://www.amazon.com/dp/B018PNHJ84

 

HDL oxidation in type 2 diabetic patients

Larry H. Bernstein, MD, FCAP, Curator

https://pharmaceuticalintelligence.com/2015/11/27/hdl-oxidation-in-type-2-diabetic-patients/

 

HDL-C: Target of Therapy – Steven E. Nissen, MD, MACC, Cleveland Clinic vs Peter Libby, MD, BWH

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2014/11/07/hdl-c-target-of-therapy-steven-e-nissen-md-macc-cleveland-clinic-vs-peter-libby-md-bwh/

 

High-Density Lipoprotein (HDL): An Independent Predictor of Endothelial Function & Atherosclerosis, A Modulator, An Agonist, A Biomarker for Cardiovascular Risk

Curator: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2013/03/31/high-density-lipoprotein-hdl-an-independent-predictor-of-endothelial-function-artherosclerosis-a-modulator-an-agonist-a-biomarker-for-cardiovascular-risk/

 

Risk of Major Cardiovascular Events by LDL-Cholesterol Level (mg/dL): Among those treated with high-dose statin therapy, more than 40% of patients failed to achieve an LDL-cholesterol target of less than 70 mg/dL.

Reporter: Aviva Lev-Ari, PhD., RN

https://pharmaceuticalintelligence.com/2014/07/29/risk-of-major-cardiovascular-events-by-ldl-cholesterol-level-mgdl-among-those-treated-with-high-dose-statin-therapy-more-than-40-of-patients-failed-to-achieve-an-ldl-cholesterol-target-of-less-th/

 

LDL, HDL, TG, ApoA1 and ApoB: Genetic Loci Associated With Plasma Concentration of these Biomarkers – A Genome-Wide Analysis With Replication

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2013/12/18/ldl-hdl-tg-apoa1-and-apob-genetic-loci-associated-with-plasma-concentration-of-these-biomarkers-a-genome-wide-analysis-with-replication/

 

Two Mutations, in the PCSK9 Gene: Eliminates a Protein involved in Controlling LDL Cholesterol

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2013/04/15/two-mutations-in-a-pcsk9-gene-eliminates-a-protein-involve-in-controlling-ldl-cholesterol/

Artherogenesis: Predictor of CVD – the Smaller and Denser LDL Particles

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2012/11/15/artherogenesis-predictor-of-cvd-the-smaller-and-denser-ldl-particles/

 

A Concise Review of Cardiovascular Biomarkers of Hypertension

Curator: Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2016/04/25/a-concise-review-of-cardiovascular-biomarkers-of-hypertension/

 

Triglycerides: Is it a Risk Factor or a Risk Marker for Atherosclerosis and Cardiovascular Disease ? The Impact of Genetic Mutations on (ANGPTL4) Gene, encoder of (angiopoietin-like 4) Protein, inhibitor of Lipoprotein Lipase

Reporters, Curators and Authors: Aviva Lev-Ari, PhD, RN and Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2016/03/13/triglycerides-is-it-a-risk-factor-or-a-risk-marker-for-atherosclerosis-and-cardiovascular-disease-the-impact-of-genetic-mutations-on-angptl4-gene-encoder-of-angiopoietin-like-4-protein-that-in/

 

Excess Eating, Overweight, and Diabetic

Larry H Bernstein, MD, FCAP, Curator

https://pharmaceuticalintelligence.com/2015/11/15/excess-eating-overweight-and-diabetic/

 

Obesity Issues

Larry H. Bernstein, MD, FCAP, Curator

https://pharmaceuticalintelligence.com/2015/11/12/obesity-issues/

 






Series A: e-Books on Cardiovascular Diseases
 

Series A Content Consultant: Justin D Pearlman, MD, PhD, FACC

VOLUME THREE

Etiologies of Cardiovascular Diseases:

Epigenetics, Genetics and Genomics

http://www.amazon.com/dp/B018PNHJ84

 

by  

Larry H Bernstein, MD, FCAP, Senior Editor, Author and Curator

and

Aviva Lev-Ari, PhD, RN, Editor and Curator

Introduction to Volume Three 

PART 1
Genomics and Medicine

1.1  Genomics and Medicine: The Physician’s View

1.2  Ribozymes and RNA Machines – Work of Jennifer A. Doudna

1.3  Genomics and Medicine: Contributions of Genetics and Genomics to Cardiovascular Disease Diagnoses

1.4 Genomics Orientations for Individualized Medicine, Volume One

1.4.1 CVD Epidemiology, Ethnic subtypes Classification, and Medication Response Variability: Cardiology, Genomics and Individualized Heart Care: Framingham Heart Study (65 y-o study) & Jackson Heart Study (15 y-o study)

1.4.2 What comes after finishing the Euchromatic Sequence of the Human Genome?

1.5  Genomics in Medicine – Establishing a Patient-Centric View of Genomic Data

 

PART 2
Epigenetics – Modifiable Factors Causing Cardiovascular Diseases

2.1 Diseases Etiology

2.1.1 Environmental Contributors Implicated as Causing Cardiovascular Diseases

2.1.2 Diet: Solids, Fluid Intake and Nutraceuticals

2.1.3 Physical Activity and Prevention of Cardiovascular Diseases

2.1.4 Psychological Stress and Mental Health: Risk for Cardiovascular Diseases

2.1.5 Correlation between Cancer and Cardiovascular Diseases

2.1.6 Medical Etiologies for Cardiovascular Diseases: Evidence-based Medicine – Leading DIAGNOSES of Cardiovascular Diseases, Risk Biomarkers and Therapies

2.1.7 Signaling Pathways

2.1.8 Proteomics and Metabolomics

2.1.9 Sleep and Cardiovascular Diseases

2.2 Assessing Cardiovascular Disease with Biomarkers

2.2.1 Issues in Genomics of Cardiovascular Diseases

2.2.2 Endothelium, Angiogenesis, and Disordered Coagulation

2.2.3 Hypertension BioMarkers

2.2.4 Inflammatory, Atherosclerotic and Heart Failure Markers

2.2.5 Myocardial Markers

2.3  Therapeutic Implications: Focus on Ca(2+) signaling, platelets, endothelium

2.3.1 The Centrality of Ca(2+) Signaling and Cytoskeleton Involving Calmodulin Kinases and Ryanodine Receptors in Cardiac Failure, Arterial Smooth Muscle, Post-ischemic Arrhythmia, Similarities and Differences, and Pharmaceutical Targets

2.3.2 EMRE in the Mitochondrial Calcium Uniporter Complex

2.3.3 Platelets in Translational Research – 2: Discovery of Potential Anti-platelet Targets

2.3.4 The Final Considerations of the Role of Platelets and Platelet Endothelial Reactions in Atherosclerosis and Novel Treatments

2.3.5 Nitric Oxide Synthase Inhibitors (NOS-I)

2.3.6 Resistance to Receptor of Tyrosine Kinase

2.3.7 Oxidized Calcium Calmodulin Kinase and Atrial Fibrillation

2.3.8 Advanced Topics in Sepsis and the Cardiovascular System at its End Stage

2.4 Comorbidity of Diabetes and Aging

2.4.1 Heart and Aging Research in Genomic Epidemiology: 1700 MIs and 2300 coronary heart disease events among about 29 000 eligible patients

2.4.2 Pathophysiological Effects of Diabetes on Ischemic-Cardiovascular Disease and on Chronic Obstructive Pulmonary Disease (COPD)

2.4.3 Risks of Hypoglycemia in Diabetics with Chronic Kidney Disease (CKD)

2.4.4  Mitochondrial Mechanisms of Disease in Diabetes Mellitus

2.4.5 Mitochondria: More than just the “powerhouse of the cell”

2.4.6  Pathophysiology of GLP-1 in Type 2 Diabetes

2.4.7 Developments in the Genomics and Proteomics of Type 2 Diabetes Mellitus and Treatment Targets

2.4.8 CaMKII Inhibition in Obese, Diabetic Mice leads to Lower Blood Glucose Levels

2.4.9 Protein Target for Controlling Diabetes, Fractalkine: Mediator of cell-to-cell Adhesion through CX3CR1 Receptor, Released from cells, Stimulates Insulin Secretion

2.4.10 Peroxisome proliferator-activated receptor (PPAR-gamma) Receptors Activation: PPARγ transrepression for Angiogenesis in Cardiovascular Disease and PPARγ transactivation for Treatment of Diabetes

2.4.11 CABG or PCI: Patients with Diabetes – CABG Reigns Supreme

2.4.12 Reversal of Cardiac Mitochondrial Dysfunction

2.4.13  BARI 2D Trial Outcomes

2.4.14 Overview of new strategy for treatment of T2DM: SGLT2 inhibiting oral antidiabetic agents

2.5 Drug Toxicity and Cardiovascular Diseases

2.5.1 Predicting Drug Toxicity for Acute Cardiac Events

2.5.2 Cardiotoxicity and Cardiomyopathy Related to Drugs Adverse Effects

2.5.3 Decoding myocardial Ca2+ signals across multiple spatial scales: A role for sensitivity analysis

2.5.4. Leveraging Mathematical Models to Understand Population Variability in Response to Cardiac Drugs: Eric Sobie, PhD

2.5.5 Exploiting mathematical models to illuminate electrophysiological variability between individuals.

2.5.6 Clinical Effects and Cardiac Complications of Recreational Drug Use: Blood pressure changes, Myocardial ischemia and infarction, Aortic dissection, Valvular damage, and Endocarditis, Cardiomyopathy, Pulmonary edema and Pulmonary hypertension, Arrhythmias, Pneumothorax and Pneumopericardium

 

2.6 Male and Female Hormonal Replacement Therapy: The Benefits and the Deleterious Effects on Cardiovascular Diseases

2.6.1  Testosterone Therapy for Idiopathic Hypogonadotrophic Hypogonadism has Beneficial and Deleterious Effects on Cardiovascular Risk Factors

2.6.2 Heart Risks and Hormones (HRT) in Menopause: Contradiction or Clarification?

2.6.3 Calcium Dependent NOS Induction by Sex Hormones: Estrogen

2.6.4 Role of Progesterone in Breast Cancer Progression

PART 3
Determinants of Cardiovascular Diseases Genetics, Heredity and Genomics Discoveries

Introduction

3.1 Why cancer cells contain abnormal numbers of chromosomes (Aneuploidy)

3.1.1 Aneuploidy and Carcinogenesis

3.2 Functional Characterization of Cardiovascular Genomics: Disease Case Studies @ 2013 ASHG

3.3 Leading DIAGNOSES of Cardiovascular Diseases covered in Circulation: Cardiovascular Genetics, 3/2010 – 3/2013

3.3.1: Heredity of Cardiovascular Disorders

3.3.2: Myocardial Damage

3.3.3: Hypertension and Atherosclerosis

3.3.4: Ethnic Variation in Cardiac Structure and Systolic Function

3.3.5: Aging: Heart and Genetics

3.3.6: Genetics of Heart Rhythm

3.3.7: Hyperlipidemia, Hyper Cholesterolemia, Metabolic Syndrome

3.3.8: Stroke and Ischemic Stroke

3.3.9: Genetics and Vascular Pathologies and Platelet Aggregation, Cardiac Troponin T in Serum

3.3.10: Genomics and Valvular Disease

3.4  Commentary on Biomarkers for Genetics and Genomics of Cardiovascular Disease

PART 4
Individualized Medicine Guided by Genetics and Genomics Discoveries

4.1 Preventive Medicine: Cardiovascular Diseases

4.1.1 Personal Genomics for Preventive Cardiology Randomized Trial Design and Challenges

4.2 Gene-Therapy for Cardiovascular Diseases

4.2.1 Genetic Basis of Cardiomyopathy

4.3 Congenital Heart Disease/Defects

4.4 Cardiac Repair: Regenerative Medicine

4.4.1 A Powerful Tool For Repairing Damaged Hearts

4.4.2 Modified RNA Induces Vascular Regeneration After a Heart Attack

4.5 Pharmacogenomics for Cardiovascular Diseases

4.5.1 Blood Pressure Response to Antihypertensives: Hypertension Susceptibility Loci Study

4.5.2 Statin-Induced Low-Density Lipoprotein Cholesterol Reduction: Genetic Determinants in the Response to Rosuvastatin

4.5.3 SNPs in apoE are found to influence statin response significantly. Less frequent variants in PCSK9 and smaller effect sizes in SNPs in HMGCR

4.5.4 Voltage-Gated Calcium Channel and Pharmacogenetic Association with Adverse Cardiovascular Outcomes: Hypertension Treatment with Verapamil SR (CCB) vs Atenolol (BB) or Trandolapril (ACE)

4.5.5 Response to Rosuvastatin in Patients With Acute Myocardial Infarction: Hepatic Metabolism and Transporter Gene Variants Effect

4.5.6 Helping Physicians identify Gene-Drug Interactions for Treatment Decisions: New ‘CLIPMERGE’ program – Personalized Medicine @ The Mount Sinai Medical Center

4.5.7 Is Pharmacogenetic-based Dosing of Warfarin Superior for Anticoagulation Control?

Summary & Epilogue to Volume Three

 

 



Regulation of mesenchymal cell generation

 

Curator: Larry H. Bernstein, MD, FCAP

from Butyrov

LPBI

 

 

Controlling Mesenchymal Stem Cell Activity With Microparticles Loaded With Small Molecules

M. Butyrov      Beyond the Dish

Mesenchymal stem cells are the subject of many clinical trials and show a potent ability to down-regulate unwanted immune responses and quell inflammation. A genuine challenge with mesenchymal stem cells (MSCs) is controlling the genes they express and the proteins they secrete.

A new publication details the strategy of one enterprising laboratory to control MSC function. Jeffrey Karp of the Harvard Stem Cell Institute, Maneesha Inamdar of the Institute for Stem Cell Biology and Regenerative Medicine in Bangalore, India, and their colleagues used microparticles that are loaded with small molecules and are readily taken up by cultured MSCs.

In this paper, which appeared in Stem Cell Reports (DOI: http://dx.doi.org/10.1016/j.stemcr.2016.05.003), human MSCs were stimulated with a small signaling protein called Tumor Necrosis Factor-alpha (TNF-alpha). TNF-alpha makes MSCs “angry”: they pour out pro-inflammatory molecules upon stimulation. To these TNF-alpha-stimulated MSCs, however, Karp and colleagues added tiny microparticles loaded with a small molecule called TPCA-1, which inhibits the NF-κB signaling pathway, one of the major signal transduction pathways involved in inflammation.


Delivery of these TPCA-1-containing microparticles damped the production of pro-inflammatory molecules by the TNF-alpha-treated MSCs for at least 6 days. When the culture medium from TPCA-1-loaded MSCs was given to different cell types, the molecules secreted by these cells reduced the recruitment of white blood cells called monocytes, an indication of the anti-inflammatory character of TPCA-1-treated MSCs. The culture medium from these cells also prevented the differentiation of human cardiac fibroblasts into collagen-making cells called “myofibroblasts.” Myofibroblasts lay down the collagen that produces the heart scar after a heart attack, so this is a further indication of the anti-inflammatory nature of the molecules made by these TPCA-1-treated MSCs.

These results are important because they show that MSC activities can be manipulated without gene therapy. Such non-gene-therapy-based approaches could be used to fine-tune MSC activity and the types of molecules secreted by implanted MSCs. Furthermore, given the effect of these cells on monocytes and cardiac fibroblasts, microparticle-treated MSCs might prevent the adverse remodeling that occurs in the heart after a heart attack.

 

Controlled Inhibition of the Mesenchymal Stromal Cell Pro-inflammatory Secretome via Microparticle Engineering

Sudhir H. Ranganath, Zhixiang Tong, Oren Levy, Keir Martyn, Jeffrey M. Karp, Maneesha S. Inamdar
Stem Cell Reports. June 2016; 6(6): 926–939. http://dx.doi.org/10.1016/j.stemcr.2016.05.003
Mesenchymal stromal cells (MSCs) are promising therapeutic candidates given their potent immunomodulatory and anti-inflammatory secretome. However, controlling the MSC secretome post-transplantation is considered a major challenge that hinders their clinical efficacy. To address this, we used a microparticle-based engineering approach to non-genetically modulate pro-inflammatory pathways in human MSCs (hMSCs) under simulated inflammatory conditions. Here we show that microparticles loaded with TPCA-1, a small-molecule NF-κB inhibitor, when delivered to hMSCs can attenuate secretion of pro-inflammatory factors for at least 6 days in vitro. Conditioned medium (CM) derived from TPCA-1-loaded hMSCs also showed reduced ability to attract human monocytes and prevented differentiation of human cardiac fibroblasts to myofibroblasts, compared with CM from untreated or TPCA-1-preconditioned hMSCs. Thus, we provide a broadly applicable bioengineering solution to facilitate intracellular sustained release of agents that modulate signaling. We propose that this approach could be harnessed to improve control over the MSC secretome post-transplantation, especially to prevent adverse remodeling post-myocardial infarction.
Mesenchymal stromal cells (MSCs; also known as bone marrow stromal cells and earlier known as mesenchymal stem cells) are being explored as therapeutics in over 550 clinical trials registered with the US Food and Drug Administration (www.FDA.gov) for the treatment of a wide range of diseases (Ankrum et al., 2014c). Their immune-evasive properties (Ankrum et al., 2014c) and safe transplant record, allowing allogeneic administration without an immunosuppressive regimen, position MSCs as an appealing candidate for a potential off-the-shelf product. One of the primary mechanisms exploited in MSC therapeutics is a secretome-based paracrine effect, as evidenced in many pre-clinical studies (Ranganath et al., 2012). However, controlling the MSC secretome post-transplantation is considered a major challenge that hinders their clinical efficacy. For instance, upon transplantation, MSCs are subjected to a complex inflammatory milieu (soluble mediators and immune cells) in most injury settings. MSCs not only secrete anti-inflammatory factors, but also produce pro-inflammatory factors that may compromise their therapeutic efficacy. Table S1 lists a few in vitro and in vivo conditions that demonstrate the complex microenvironment under which MSCs switch between anti-inflammatory and pro-inflammatory phenotypes.
Levels of pro- or anti-inflammatory cytokines are not always predictive of the response, possibly due to the dynamic cytokine combinations (and concentrations) present in the cell microenvironment (Table S1). For example, a relatively low inflammatory stimulus (<20 ng/ml tumor necrosis factor alpha [TNF-α] alone or along with interferon-γ [IFN-γ]) can polarize MSCs toward pro-inflammatory effects (Bernardo and Fibbe, 2013), resulting in increased inflammation characterized by T cell proliferation and transplant rejection. Conversely, exposure to high levels of the inflammatory cytokine TNF-α has been shown in certain studies to result in MSC-mediated anti-inflammatory effects via secretion of potent mediators such as TSG6, PGE2, STC-1, IL-1Ra, and sTNFR1, as demonstrated in multiple inflammation-associated disease models (Prockop and Oh, 2012; Ylostalo et al., 2012). These effects are mediated via molecular pathways such as NF-κB, PI3K, Akt, and JAK-STAT (Ranganath et al., 2012). However, it is not clear that low and high levels of TNF-α always exert the same effect on the anti- versus pro-inflammatory MSC secretome. NF-κB is a central regulator of the anti-inflammatory secretome response in monolayer (Yagi et al., 2010) and spheroid MSCs (Bartosh et al., 2013; Ylostalo et al., 2012) and of TNF-α-mediated (20 ng/ml for 120 min) apoptosis (Peng et al., 2011). Given that NF-κB can promote secretion of pro-inflammatory components in the MSC secretome (Lee et al., 2010), we hypothesized that NF-κB inhibition via small molecules in MSCs subjected to a representative inflammatory stimulus (10 ng/ml TNF-α) would inhibit their pro-inflammatory responses.
Adverse remodeling, or cardiac fibrosis due to differentiation of cardiac fibroblasts (CF) into cardiac myofibroblasts (CMF) with a pro-inflammatory phenotype and collagen deposition, is a leading cause of heart failure. The secretome from exogenous MSCs has anti-fibrotic and angiogenic effects that can reduce scar formation (Preda et al., 2014) and improve ejection fraction when administered early or prior to adverse remodeling (Preda et al., 2014; Tang et al., 2010; Williams et al., 2013). Unfortunately, in many cases, due to poor prognosis, MSCs may not be administered in time to prevent adverse remodeling, to inhibit CF differentiation to CMF (Virag and Murry, 2003), or to prevent myocardial expression of TNF-α (Bozkurt et al., 1998; Mann, 2001). Also, when administered following an adverse remodeling event including CMF differentiation, MSCs may assume a pro-inflammatory phenotype and secretome (Naftali-Shani et al., 2013) under TNF-α (typically 5 pg/mg of total protein in myocardial infarction (MI) rat myocardium, which is not significantly higher than the 1–3 pg/mg protein in control rat myocardium) (Moro et al., 2007), resulting in impaired heart function. Hence, suppressing the pro-inflammatory response of hMSCs may maximize efficacy.
While the MSC phenotype can be controlled under regulated conditions in vitro, the in vivo response of MSCs post-transplantation is poorly controlled as it is dictated by highly dynamic and complex host microenvironments (Discher et al., 2009; Rodrigues et al., 2010). Factors including MSC tissue source (Melief et al., 2013; Naftali-Shani et al., 2013), donor and batch-to-batch variability with respect to cytokine secretion and response to inflammatory stimuli (Zhukareva et al., 2010), gender (Crisostomo et al., 2007), and age (Liang et al., 2013) also affect the response of MSCs. Thus, it is important to develop approaches to control the MSC secretome post-transplantation regardless of their source or expansion conditions. We hypothesized that engineering MSCs to induce a specific secretome profile under a simulated host microenvironment may maximize their therapeutic utility. Previously, the MSC secretome has been regulated via genetic engineering (Gnecchi et al., 2006; Wang et al., 2009) or cytokine/small-molecule preconditioning approaches (Crisostomo et al., 2008; Mias et al., 2008). Genetically engineered human MSCs (hMSCs) pose challenging long-term regulatory hurdles given that the potential tumorigenicity has not been well characterized, and while preconditioning hMSCs with cytokines/small molecules may be safer, the phenotype-altering effects are transient. …
While TPCA-1 pretreatment of hMSCs before TNF-α stimulation showed a significant reduction in CMF numbers, this was greatly enhanced when TPCA-1 was available intracellularly via the microparticles. CM from TNF + TPCAµP-hMSCs likely prevented differentiation of CF to CMF due to the continued intracellular inhibition of IKK-mediated NF-κB activation, thus preventing the release of an hMSC pro-inflammatory secretome. Surprisingly, the CMF number was reduced by over 2-fold, suggesting that TPCA-1 may activate hMSC pathways that revert the CMF phenotype. We cannot rule out the possibility that other contributors in the hMSC secretome that were not profiled might also be contributing, and that their action is facilitated by intracellular TPCA-1. For instance, in an in vitro 3D model of cardiac fibrosis under hypoxic conditions, reversal of CMF to the CF phenotype was shown to be due to reduced MSC TGF-β levels (Galie and Stegemann, 2014). Our assay revealed a similar trend in terms of collagen production from CF. CF treated with CM from control-hMSCs, TPCApre + TNF-hMSCs, or µP-hMSCs secreted elevated levels of collagen into the media, suggesting that inhibited levels of pro-inflammatory mediators alone could not be implicated in the reduction in collagen secretion. The reduced number of α-SMA+ CMF in CM from TNF + TPCAµP-hMSCs possibly contributed to the lower secretion level of collagen in the media (Figure 5B). Upon dedifferentiation or lowered α-SMA expression, it is possible that CMFs lose collagen-secretion ability. In regions of MI, CF switch to the myofibroblast phenotype due to stress from the infarct scar (Tomasek et al., 2002). The high expression of α-SMA typical of such CMF (Teunissen et al., 2007) has been implicated in remodeling due to their high contractility (Santiago et al., 2010). In addition, the collagen secretion capacity of CMF is very high (Petrov et al., 2002). Overall, attenuation in the number of collagen-secreting α-SMA+ CMF could be beneficial in preventing pathological remodeling or irreversible scar formation and allowing cardiac regeneration.
Here we have demonstrated that the pro-inflammatory hMSC secretome could be inhibited using a microparticle engineering approach delivering an intracellular NF-κB inhibitor, TPCA-1. It is, however, important to note that the MSC secretome composition may change depending on the level of TNF-α encountered in vivo following transplantation. Nevertheless, a similar approach could be beneficial in inflammatory disease settings such as chronic inflammation and macrophage-mediated atherosclerosis. The approach of microparticle engineering of an exogenous cell population by modulating a central regulatory pathway may find application in other cell types and pathways, and could provide an attractive strategy for harnessing any cell secretome for therapy. This approach could also potentially be employed to modulate the composition of extracellular vesicles (exosomes) for therapy.



https://www.youtube.com/v/_yEkeetKqtg?fs=1&hl=fr_FR

Please visit: www.openpediatrics.org OPENPediatrics™ is an interactive digital learning platform for healthcare clinicians sponsored by Boston Children’s Hospital.





Clinical Laboratory Challenges

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

CLINICAL LABORATORY NEWS   

The Lab and CJD: Safe Handling of Infectious Prion Proteins

Body fluids from individuals with possible Creutzfeldt-Jakob disease (CJD) present distinctive safety challenges for clinical laboratories. Sporadic, iatrogenic, and familial CJD (known collectively as classic CJD), along with variant CJD, kuru, Gerstmann-Sträussler-Scheinker, and fatal familial insomnia, are prion diseases, also known as transmissible spongiform encephalopathies. Prion diseases affect the central nervous system, and from the onset of symptoms they follow a typically rapid, progressive neurological decline. While prion diseases are rare, it is not uncommon for the most prevalent form—sporadic CJD—to be included in the differential diagnosis of individuals presenting with rapid cognitive decline. Thus, laboratories may deal with a significant number of possible CJD cases, and should have protocols in place to process specimens, even if a confirmatory diagnosis of CJD is made in only a fraction of these cases.

The Lab’s Role in Diagnosis

Laboratory protocols for handling specimens from individuals with possible, probable, and definitive cases of CJD are important to ensure timely and appropriate patient management. When the differential includes CJD, an attempt should be made to rule in or rule out other causes of rapid neurological decline. Laboratories should be prepared to process blood and cerebrospinal fluid (CSF) specimens in such cases for routine analyses.

Definitive diagnosis requires identification of prion aggregates in brain tissue, which can be achieved by immunohistochemistry, a Western blot for proteinase K-resistant prions, and/or by the presence of prion fibrils. Thus, confirmatory diagnosis is typically achieved at autopsy. A probable diagnosis of CJD is supported by elevated concentration of 14-3-3 protein in CSF (a non-specific marker of neurodegeneration), EEG, and MRI findings. Thus, the laboratory may be required to process and send CSF samples to a prion surveillance center for 14-3-3 testing, as well as blood samples for sequencing of the PRNP gene (in inherited cases).

Processing Biofluids

Laboratories should follow standard protective measures when working with biofluids potentially containing abnormally folded prions, such as donning standard personal protective equipment (PPE); avoiding or minimizing the use of sharps; using single-use disposable items; and processing specimens to minimize formation of aerosols and droplets. An additional safety consideration is the use of single-use disposal PPE; otherwise, re-usable items must be either cleaned using prion-specific decontamination methods, or destroyed.

Blood. In experimental models, infectivity has been detected in the blood; however, there have been no cases of secondary transmission of classic CJD via blood product transfusions in humans. As such, blood has been classified by the World Health Organization (WHO), on epidemiological evidence, as containing “no detectable infectivity,” which means it can be processed by routine methods. Similarly, except for CSF, all other body fluids contain no infectivity and can be processed following standard procedures.

In contrast to classic CJD, there have been four cases of suspected secondary transmission of variant CJD via transfused blood products in the United Kingdom. Variant CJD, the prion disease associated with mad cow disease, is unique in its distribution of prion aggregates outside of the central nervous system, including the lymph nodes, spleen, and tonsils. For regions where variant CJD is a concern, laboratories should consult their regulatory agencies for further guidance.

CSF. Relative to highly infectious tissues of the brain, spinal cord, and eye, infectivity has been identified less often in CSF and is considered to have “low infectivity,” along with kidney, liver, and lung tissue. Since CSF can contain infectious material, WHO has recommended that analyses not be performed on automated equipment due to challenges associated with decontamination. Laboratories should perform a risk assessment of their CSF processes, and, if deemed necessary, consider using manual methods as an alternative to automated systems.

Decontamination

The infectious agent in prion disease is unlike any other infectious pathogen encountered in the laboratory; it is formed of misfolded and aggregated prion proteins. This aggregated proteinaceous material forms the infectious unit, which is incredibly resilient to degradation. Moreover, in vitro studies have demonstrated that disrupting large aggregates into smaller aggregates increases cytotoxicity. Thus, if the aim is to abolish infectivity, all aggregates must be destroyed. Disinfection procedures used against viral, bacterial, and fungal pathogens, such as alcohol, boiling, formalin, dry heat (<300°C), autoclaving at 121°C for 15 minutes, and ionizing, ultraviolet, or microwave radiation, are either ineffective or only variably effective against aggregated prions.

The only means to ensure no risk of residual infectious prions is to use disposable materials. This is not always practical, as, for instance, a biosafety cabinet cannot be discarded if there is a CSF spill in the hood. Fortunately, there are several protocols considered sufficient for decontamination. For surfaces and heat-sensitive instruments, such as a biosafety cabinet, WHO recommends flooding the surface with 2N NaOH or undiluted NaClO, letting stand for 1 hour, mopping up, and rinsing with water. If the surface cannot tolerate NaOH or NaClO, thorough cleaning will remove most infectivity by dilution. Laboratories may derive some additional benefit by using one of the partially effective methods discussed previously. Non-disposable heat-resistant items preferably should be immersed in 1N NaOH, heated in a gravity displacement autoclave at 121°C for 30 min, cleaned and rinsed in water, then sterilized by routine methods. WHO has outlined several alternate decontamination methods. Using disposable cover sheets is one simple solution to avoid contaminating work surfaces and associated lengthy decontamination procedures.

With standard PPE—augmented by a few additional safety measures and prion-specific decontamination procedures—laboratories can safely manage biofluid testing in cases of prion disease.

 

The Microscopic World Inside Us  

Emerging Research Points to Microbiome’s Role in Health and Disease

Thousands of species of microbes—bacteria, viruses, fungi, and protozoa—inhabit every internal and external surface of the human body. Collectively, these microbes, known as the microbiome, outnumber the body’s human cells by about 10 to 1 and include more than 1,000 species of microorganisms and several million genes residing in the skin, the respiratory system, and the urogenital and gastrointestinal tracts. The microbiome’s complicated relationship with its human host is increasingly considered so crucial to health that researchers sometimes call it “the forgotten organ.”

Disturbances to the microbiome can arise from nutritional deficiencies, antibiotic use, and antiseptic modern life. Imbalances in the microbiome’s diverse microbial communities, which interact constantly with cells in the human body, may contribute to chronic health conditions, including diabetes, asthma and allergies, obesity and the metabolic syndrome, digestive disorders including irritable bowel syndrome (IBS), and autoimmune disorders like multiple sclerosis and rheumatoid arthritis, research shows.

While study of the microbiome is a growing research enterprise that has attracted enthusiastic media attention and venture capital, its findings are largely preliminary. But some laboratorians are already developing a greater appreciation for the microbiome’s contributions to human biochemistry and are considering a future in which they expect to measure changes in the microbiome to monitor disease and inform clinical practice.

Pivot Toward the Microbiome

Following the National Institutes of Health (NIH) Human Genome Project, many scientists noted the considerable genetic signal from microbes in the body and the existence of technology to analyze these microorganisms. That realization led NIH to establish the Human Microbiome Project in 2007, said Lita Proctor, PhD, its program director. In the project’s first phase, researchers studied healthy adults to produce a reference set of microbiomes and a resource of metagenomic sequences of bacteria in the airways, skin, oral cavities, and the gastrointestinal and vaginal tracts, plus a catalog of microbial genome sequences of reference strains. Researchers also evaluated specific diseases associated with disturbances in the microbiome, including gastrointestinal diseases such as Crohn’s disease, ulcerative colitis, IBS, and obesity, as well as urogenital conditions, those that involve the reproductive system, and skin diseases like eczema, psoriasis, and acne.

Phase 1 studies determined the composition of many parts of the microbiome, but did not define how that composition affects health or specific disease. The project’s second phase aims to “answer the question of what microbes actually do,” explained Proctor. Researchers are now examining properties of the microbiome including gene expression, protein, and human and microbial metabolite profiles in studies of pregnant women at risk for preterm birth, the gut hormones of patients at risk for IBS, and nasal microbiomes of patients at risk for type 2 diabetes.

Promising Lines of Research

Cystic fibrosis and microbiology investigator Michael Surette, PhD, sees promising microbiome research not just in terms of evidence of its effects on specific diseases, but also in what drives changes in the microbiome. Surette is Canada Research Chair in Interdisciplinary Microbiome Research in the Farncombe Family Digestive Health Research Institute at McMaster University in Hamilton, Ontario.

One type of study on factors driving microbiome change examines how alterations in composition and imbalances in individual patients relate to improving or worsening disease. “IBS, cystic fibrosis, and chronic obstructive pulmonary disease all have periods of instability or exacerbation,” he noted. Surette hopes that one day, tests will provide clinicians the ability to monitor changes in microbial composition over time and even predict when a patient’s condition is about to deteriorate. Monitoring perturbations to the gut microbiome might also help minimize collateral damage to the microbiome during aggressive antibiotic therapy for hospitalized patients, he added.

Monitoring changes to the microbiome also might be helpful for “culture negative” patients, who now may receive multiple, unsuccessful courses of different antibiotics that drive antibiotic resistance. Frustration with standard clinical biology diagnosis of lung infections in cystic fibrosis patients first sparked Surette’s investigations into the microbiome. He hopes that future tests involving the microbiome might also help asthma patients with neutrophilia, community-acquired pneumonia patients who harbor complex microbial lung communities lacking obvious pathogens, and hospitalized patients with pneumonia or sepsis. He envisions microbiome testing that would look for short-term changes indicating whether or not a drug is effective.

Companion Diagnostics

Daniel Peterson, MD, PhD, an assistant professor of pathology at Johns Hopkins University School of Medicine in Baltimore, believes the future of clinical testing involving the microbiome lies in companion diagnostics for novel treatments, and points to companies that are already developing and marketing tests that will require such assays.

Examples of microbiome-focused enterprises abound, including Genetic Analysis, based in Oslo, Norway, with its high-throughput test that uses 54 probes targeted to specific bacteria to measure intestinal gut flora imbalances in inflammatory bowel disease and irritable bowel syndrome patients. Paris, France-based Enterome is developing both novel drugs and companion diagnostics for microbiome-related diseases such as IBS and some metabolic diseases. Second Genome, based in South San Francisco, has developed an experimental drug, SGM-1019, that the company says blocks damaging activity of the microbiome in the intestine. Cambridge, Massachusetts-based Seres Therapeutics has received Food and Drug Administration orphan drug designation for SER-109, an oral therapeutic intended to correct microbial imbalances to prevent recurrent Clostridium difficile infection in adults.

One promising clinical use of the microbiome is fecal transplantation, which both prospective and retrospective studies have shown to be effective in patients with C. difficile infections who do not respond to front-line therapies, said James Versalovic, MD, PhD, director of Texas Children’s Hospital Microbiome Center and professor of pathology at Baylor College of Medicine in Houston. “Fecal transplants and other microbiome replacement strategies can radically change the composition of the microbiome in hours to days,” he explained.

But NIH’s Proctor discourages too much enthusiasm about fecal transplant. “Natural products like stool can have [side] effects,” she pointed out. “The [microbiome research] field needs to mature and we need to verify outcomes before anything becomes routine.”

Hurdles for Lab Testing

While he is hopeful that labs someday will use the microbiome to produce clinically useful information, Surette pointed to several problems that must be solved beforehand. First, molecular methods commonly used right now should be more quantitative and accurate. Additionally, research on the microbiome encompasses a wide variety of protocols, some of which are better at extracting particular types of bacteria and therefore can give biased views of communities living in the body. Also, tests may need to distinguish between dead and live microbes. Another hurdle is that labs using varied bioinformatic methods may produce different results from the same sample, a problem that Surette sees as ripe for a solution from clinical laboratorians, who have expertise in standardizing robust protocols and in automating tests.

One way laboratorians can prepare for future, routine microbiome testing is to expand their notion of clinical chemistry to include both microbial and human biochemistry. “The line between microbiome science and clinical science is blurring,” said Versalovic. “When developing future assays to detect biochemical changes in disease states, we must consider the contributions of microbial metabolites and proteins and how to tailor tests to detect them.” In the future, clinical labs may test for uniquely microbial metabolites in various disease states, he predicted.

 

Automated Review of Mass Spectrometry Results  

Can We Achieve Autoverification?

Author: Katherine Alexander and Andrea R. Terrell, PhD  // Date: NOV.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/november/automated-review-of-mass-spectrometry-results-can-we-achieve-autoverification

 

Paralleling the upswing in prescription drug misuse, clinical laboratories are receiving more requests for mass spectrometry (MS) testing as physicians rely on its specificity to monitor patient compliance with prescription regimens. However, as volume has increased, reimbursement has declined, forcing toxicology laboratories both to increase capacity and to lower their operational costs without sacrificing quality or turnaround time. Now, new solutions are available that bring automation to MS testing and help laboratories meet the growing demand for toxicology and other testing.

What is the typical MS workflow?

A typical workflow includes a long list of manual steps. By the time a sample is loaded onto the mass spectrometer, it has been collected, logged into the laboratory information management system (LIMS), and prepared for analysis using a variety of wet chemistry techniques.

Most commercial clinical laboratories receive enough samples for MS analysis to batch analyze those samples. A batch consists of calibrators, quality control (QC) samples, and patient/donor samples. Historically, the method would be selected (e.g., “analysis of opiates”), sample identification information would be entered manually into the MS software, and the instrument would begin analyzing each sample. Upon successful completion of the batch, the MS operator would view all of the analytical data, ensure the QC results were acceptable, and review each patient/donor specimen, looking at characteristics such as peak shape, ion ratios, retention time, and calculated concentration.

The operator would then post acceptable results into the LIMS manually or through an interface, and unacceptable results would be rescheduled or dealt with according to lab-specific protocols. In our laboratory we perform a final certification step for quality assurance by reviewing all information about the batch again, prior to releasing results for final reporting through the LIMS.
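Autoverification, in this context, means encoding those per-specimen checks so that only exceptional results reach a human. Below is a minimal sketch of such acceptance rules (retention-time window, ion-ratio tolerance, concentration within the calibration range). All names and tolerances here are hypothetical, not taken from any vendor's software or a specific SOP.

```python
from dataclasses import dataclass

@dataclass
class PeakResult:
    analyte: str
    retention_time: float  # minutes
    ion_ratio: float       # qualifier/quantifier abundance ratio
    concentration: float   # ng/mL

def autoverify(r: PeakResult, expected_rt: float, expected_ratio: float,
               amr: tuple) -> bool:
    """Accept a result only if every rule passes; otherwise route it
    to an operator for manual review. Tolerances are illustrative."""
    rt_ok = abs(r.retention_time - expected_rt) <= 0.1             # +/-0.1 min
    ratio_ok = abs(r.ion_ratio - expected_ratio) <= 0.2 * expected_ratio
    in_range = amr[0] <= r.concentration <= amr[1]                 # AMR limits
    return rt_ok and ratio_ok and in_range

sample = PeakResult("morphine", 2.43, 0.51, 38.0)
print(autoverify(sample, expected_rt=2.40, expected_ratio=0.50,
                 amr=(5.0, 500.0)))  # -> True
```

Framing the rules this way has a side benefit: every released result carries an auditable reason for passing, and every exception lands in front of a chemist.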

What problems are associated with this workflow?

The workflow described above results in too many highly trained chemists performing manual data entry and reviewing perfectly acceptable analytical results. Lab managers would prefer that MS operators and certifying scientists focus on troubleshooting problem samples rather than reviewing mounds of good data. Not only is the current process inefficient, it is mundane work prone to user errors. This risks fatigue, disengagement, and complacency by our highly skilled scientists.

Importantly, manual processes also take time. In most clinical lab environments, turnaround time is critical for patient care and industry competitiveness. Lab directors and managers are looking for solutions to automate mundane, error-prone tasks to save time and costs, reduce staff burnout, and maintain high levels of quality.

How can software automate data transfer from MS systems to LIMS?

Automation is not a new concept in the clinical lab. Labs have automated processes in shipping and receiving, sample preparation, liquid handling, and data delivery to the end user. As more labs implement MS, companies have begun to develop special software to automate data analysis and review workflows.

In July 2011, AIT Labs incorporated ASCENT into our workflow, eliminating the initial manual peak review step. ASCENT is an algorithm-based peak picking and data review system designed specifically for chromatographic data. The software employs robust statistical and modeling approaches to the raw instrument data to present the true signal, which often can be obscured by noise or matrix components.

The system also uses an exponentially modified Gaussian (EMG) equation to apply a best-fit model to integrated peaks through what is often a noisy signal. In our experience, applying the EMG yields cleaner data from what might appear to be poor chromatography and ultimately allows us to reduce the number of samples we might otherwise rerun.
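For reference, the EMG peak shape is a Gaussian convolved with a one-sided exponential decay, which reproduces the tailing commonly seen in chromatographic peaks. The sketch below implements the standard closed-form expression; the parameter values are illustrative and say nothing about how ASCENT is implemented internally.

```python
import numpy as np
from scipy.special import erfc

def emg(x, area, mu, sigma, lam):
    """Exponentially modified Gaussian: a Gaussian (mu, sigma) convolved
    with an exponential tail of rate lam."""
    return (area * lam / 2.0
            * np.exp(lam / 2.0 * (2.0 * mu + lam * sigma**2 - 2.0 * x))
            * erfc((mu + lam * sigma**2 - x) / (np.sqrt(2.0) * sigma)))

t = np.linspace(0.0, 5.0, 501)                # retention time, minutes
trace = emg(t, area=1.0, mu=2.0, sigma=0.05, lam=4.0)  # toy parameters
print(f"apex near t = {t[np.argmax(trace)]:.2f} min")
```

Fitting this function to the raw data points (for example with scipy.optimize.curve_fit) yields peak-area and retention-time estimates that hold up better under noise than simple baseline-to-baseline integration.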

How do you validate the quality of results?

We’ve developed a robust validation protocol to ensure that results are, at minimum, equivalent to results from our manual review. We begin by building the assay in ASCENT, entering assay-specific information from our internal standard operating procedure (SOP). Once the assay is configured, validation proceeds with parallel batch processing to compare results between software-reviewed data and staff-reviewed data. For new implementations we run eight to nine batches of 30–40 samples each; when we are modifying or upgrading an existing implementation we run a smaller number of batches. The parallel batches should contain multiple positive and negative results for all analytes in the method, preferably spanning the analytical measurement range of the assay.

The next step is to compare the results and calculate the percent difference between the data review methods. We require that two-thirds of the automated results fall within 20% of the manually reviewed result. In addition to validating patient sample correlation, we also test numerous quality assurance rules that should initiate a flag for further review.
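As a worked example of that criterion, the sketch below computes per-sample percent differences against the manually reviewed result and applies the two-thirds-within-20% rule; the paired values are fabricated for illustration.

```python
# Parallel-batch comparison: software-reviewed vs. staff-reviewed results.
manual = [12.1, 48.0, 250.0, 8.9, 102.0, 33.3]   # staff-reviewed (ng/mL)
auto   = [12.6, 46.5, 261.0, 10.4, 99.0, 41.0]   # software-reviewed (ng/mL)

def pct_diff(auto_val, manual_val):
    """Percent difference relative to the manually reviewed result."""
    return abs(auto_val - manual_val) / manual_val * 100.0

within = [pct_diff(a, m) <= 20.0 for a, m in zip(auto, manual)]
fraction = sum(within) / len(within)
print(f"{fraction:.0%} within 20% -> {'PASS' if fraction >= 2/3 else 'FAIL'}")
# -> 83% within 20% -> PASS
```

In practice the comparison would also be broken out per analyte and checked across the analytical measurement range, as described above.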

What are the biggest challenges during implementation and continual improvement initiatives?

On the technological side, our largest hurdle was loading the sequence files into ASCENT. We had created an in-house mechanism for our chemists to upload the 96-well plate map for their batch into the MS software. We had some difficulty transferring this information to ASCENT, but once we resolved this issue, the technical workflow proceeded fairly smoothly.

The greater challenge was changing our employees’ mindset from one of fear that automation would displace them, to a realization that learning this new technology would actually make them more valuable. Automating a non-mechanical process can be a difficult concept for hands-on scientists, so managers must be patient and help their employees understand that this kind of technology leverages the best attributes of software and people to create a powerful partnership.

We recommend that labs considering automated data analysis engage staff in the validation and implementation to spread the workload and the knowledge. As is true with most technology, it is best not to rely on just one or two super users. We also found it critical to add supervisor-level controls on data file manipulation, such as removing a sample that wasn’t run from the sequence table. This can prevent inadvertent deletion of a file, which would require reinjection of the entire batch.

 

Understanding Fibroblast Growth Factor 23

Author: Damien Gruson, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/understanding-fibroblast-growth-factor-23

What is the relationship of FGF-23 to heart failure?

Heart failure (HF) is an increasingly common syndrome associated with high morbidity, elevated hospital readmission rates, and high mortality. Improving diagnosis, prognosis, and treatment of HF requires a better understanding of its different sub-phenotypes. As researchers gained a comprehensive understanding of neurohormonal activation—one of the hallmarks of HF—they discovered several biomarkers, including natriuretic peptides, which now are playing an important role in sub-phenotyping HF and in driving more personalized management of this chronic condition.

Like the natriuretic peptides, fibroblast growth factor 23 (FGF-23) could become important in risk-stratifying and managing HF patients. Produced by osteocytes, FGF-23 is a key regulator of phosphorus homeostasis. It binds to renal and parathyroid FGF-Klotho receptor heterodimers, resulting in phosphate excretion, decreased 1-α-hydroxylation of 25-hydroxyvitamin D, and decreased parathyroid hormone (PTH) secretion. The relationship to PTH is important because impaired homeostasis of cations and decreased glomerular filtration rate might contribute to the rise of FGF-23. The amino-terminal portion of FGF-23 (amino acids 1-24) serves as a signal peptide allowing secretion into the blood, and the carboxyl-terminal portion (aa 180-251) participates in its biological action.

How might FGF-23 improve HF risk assessment?

Studies have shown that FGF-23 is related to the risk of cardiovascular diseases and mortality. It was first demonstrated that FGF-23 levels were independently associated with left ventricular mass index and hypertrophy as well as mortality in patients with chronic kidney disease (CKD). FGF-23 also has been associated with left ventricular dysfunction and atrial fibrillation in coronary artery disease subjects, even in the absence of impaired renal function.

FGF-23 and FGF receptors are both expressed in the myocardium. It is possible that FGF-23 has direct effects on the heart and participates in the pathophysiology of cardiovascular diseases and HF. Experiments have shown that, in cultured rat cardiomyocytes, FGF-23 stimulates pathological hypertrophy in vitro by activating the calcineurin-NFAT pathway, and that in wild-type mice, intra-myocardial or intravenous injection of FGF-23 results in left ventricular hypertrophy. As such, FGF-23 appears to be a potential stimulus of myocardial hypertrophy, and increased levels may contribute to the worsening of heart failure and long-term cardiovascular death.

Researchers have documented that HF patients have elevated FGF-23 circulating levels. They have also found a significant correlation between plasma levels of FGF-23 and B-type natriuretic peptide, a biomarker related to ventricular stretch and cardiac hypertrophy, in patients with left ventricular hypertrophy. As such, measuring FGF-23 levels might be a useful tool to predict long-term adverse cardiovascular events in HF patients.

Interestingly, researchers have documented a significant relationship between FGF-23 and PTH in both CKD and HF patients. As PTH stimulates FGF-23 expression, it could be that in HF patients, increased PTH levels increase the bone expression of FGF-23, which enhances its effects on the heart.

 

The Past, Present, and Future of Western Blotting in the Clinical Laboratory

Author: Curtis Balmer, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/the-past-present-and-future-of-western-blotting-in-the-clinical-laboratory

Much of the discussion about Western blotting centers around its performance as a biological research tool. This isn’t surprising. Since its introduction in the late 1970s, the Western blot has been adopted by biology labs of virtually every stripe, and become one of the most widely used techniques in the research armamentarium. However, Western blotting has also been employed in clinical laboratories to aid in the diagnosis of various diseases and disorders—an equally important and valuable application. Yet there has been relatively little discussion of its use in this context, or of how advances in Western blotting might affect its future clinical use.

Highlighting the clinical value of Western blotting, Stanley Naides, MD, medical director of Immunology at Quest Diagnostics, observed that, “Western blotting has been a very powerful tool in the laboratory and for clinical diagnosis. It’s one of many various methods that the laboratorian brings to aid the clinician in the diagnosis of disease, and the selection and monitoring of therapy.” Indeed, Western blotting has been used at one time or another to aid in the diagnosis of infectious diseases including hepatitis C (HCV), HIV, Lyme disease, and syphilis, as well as autoimmune disorders such as paraneoplastic disease and myositis conditions.

However, Naides was quick to point out that the choice of assays to use clinically is based on their demonstrated sensitivity and performance, and that the search for something better is never-ending. “We’re constantly looking for methods that improve detection of our target [protein],” Naides said. “There have been a number of instances where we’ve moved away from Western blotting because another method proves to be more sensitive.” But this search can also lead back to Western blotting. “We’ve gone away from other methods because there’s been a Western blot that’s been developed that’s more sensitive and specific. There’s that constant movement between methods as new tests are developed.”

In recent years, this quest has been leading clinical laboratories away from Western blotting toward more sensitive and specific diagnostic assays, at least for some diseases. Using confirmatory diagnosis of HCV infection as an example, Sai Patibandla, PhD, director of the immunoassay group at Siemens Healthcare Diagnostics, explained that movement away from Western blotting for confirmatory diagnosis of HCV infection began with a technical modification called Recombinant Immunoblotting Assay (RIBA). RIBA streamlines the conventional Western blot protocol by spotting recombinant antigen onto strips which are used to screen patient samples for antibodies against HCV. This approach eliminates the need to separate proteins and transfer them onto a membrane.

The RIBA HCV assay was initially manufactured by Chiron Corporation (acquired by Novartis Vaccines and Diagnostics in 2006). It received Food and Drug Administration (FDA) approval in 1999, and was marketed as Chiron RIBA HCV 3.0 Strip Immunoblot Assay. Patibandla explained that, at the time, the Chiron assay “…was the only FDA-approved confirmatory testing for HCV.” In 2013 the assay was discontinued and withdrawn from the market due to reports that it was producing false-positive results.

Since then, clinical laboratories have continued to move away from Western blot-based assays for confirmation of HCV in favor of the more sensitive technique of nucleic acid testing (NAT). “The migration is toward NAT for confirmation of HCV [diagnosis]. We don’t use immunoblots anymore. We don’t even have a blot now to confirm HCV,” Patibandla said.

Confirming HIV infection has followed a similar path. Indeed, in 2014 the Centers for Disease Control and Prevention issued updated recommendations for HIV testing that, in part, replaced Western blotting with NAT. This change was in response to the recognition that the HIV-1 Western blot assay was producing false-negative or indeterminate results early in the course of HIV infection.

At this juncture it is difficult to predict if this trend away from Western blotting in clinical laboratories will continue. One thing that is certain, however, is that clinicians and laboratorians are infinitely pragmatic, and will eagerly replace current techniques with ones shown to be more sensitive, specific, and effective. This raises the question of whether any of the many efforts currently underway to improve Western blotting will produce an assay that exceeds the sensitivity of currently employed techniques such as NAT.

Some of the most exciting and groundbreaking work in this area is being done by Amy Herr, PhD, a professor of bioengineering at the University of California, Berkeley. Herr’s group has taken on some of the most challenging limitations of Western blotting, and is developing techniques that could revolutionize the assay. For example, the Western blot is semi-quantitative at best. This weakness dramatically limits the types of answers it can provide about changes in protein concentrations under various conditions.

To make Western blotting more quantitative, Herr’s group is, among other things, identifying losses of protein sample mass during the assay protocol. About this, Herr explains that the conventional Western blot is an “open system” that involves lots of handling of assay materials, buffers, and reagents that makes it difficult to account for protein losses. Or, as Kevin Lowitz, a senior product manager at Thermo Fisher Scientific, described it, “Western blot is a [simple] technique, but a really laborious one, and there are just so many steps and so many opportunities to mess it up.”

Herr’s approach is to reduce the open aspects of Western blot. “We’ve been developing these more closed systems that allow us at each stage of the assay to account for [protein mass] losses. We can’t do this exactly for every target of interest, but it gives us a really good handle [on protein mass losses],” she said. One of the major mechanisms Herr’s lab is using to accomplish this is to secure proteins to the blot matrix with covalent bonding rather than with the much weaker hydrophobic interactions that typically keep the proteins in place on the membrane.

Herr’s group also has been developing microfluidic platforms that allow Western blotting to be done on single cells: “In our system we’re doing thousands of independent Westerns on single cells in four hours. And, hopefully, we’ll cut that down to one hour over the next couple years.”

Other exciting modifications that stand to dramatically increase the sensitivity, quantitation, and throughput of Western blotting also are being developed and explored. For example, the use of capillary electrophoresis—in which proteins are conveyed through a small electrolyte-filled tube and separated according to size and charge before being deposited onto a blotting membrane—dramatically reduces the amount of protein required for Western blot analysis, and thereby allows Westerns to be run on proteins from rare cells or on samples of extremely limited quantity.

Jillian Silva, PhD, an associate specialist at the University of California, San Francisco Helen Diller Family Comprehensive Cancer Center, explained that advances in detection are also extending the capabilities of Western blotting. “With the advent of fluorescence detection we have a way to quantitate Westerns, and it is now more quantitative than it’s ever been,” said Silva.

Whether or not these advances produce an assay that is adopted by clinical laboratories remains to be seen. The emphasis on Western blotting as a research rather than a clinical tool may bias advances in favor of the needs and priorities of researchers rather than clinicians, and as Patibandla pointed out, “In the research world Western blotting has a certain purpose. [Researchers] are always coming up with new things, and are trying to nail down new proteins, so you cannot take Western blotting away.” In contrast, she suggested that for now, clinical uses of Western blotting remain “limited.”

 

Adapting Next Generation Technologies to Clinical Molecular Oncology Service

Author: Ronald Carter, PhD, DVM  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/adapting-next-generation-technologies-to-clinical-molecular-oncology-service

Next generation technologies (NGT) deliver huge improvements in cost efficiency, accuracy, robustness, and in the amount of information they provide. Microarrays, high-throughput sequencing platforms, digital droplet PCR, and other technologies all offer unique combinations of desirable performance.

As stronger evidence of genetic testing’s clinical utility influences patterns of patient care, demand for NGT testing is increasing. This presents several challenges to clinical laboratories, including increased urgency, clinical importance, and breadth of application in molecular oncology, as well as more integration of genetic tests into synoptic reporting. Laboratories need to add NGT-based protocols while still providing old tests, and the pace of change is increasing. What follows is one viewpoint on the major challenges in adopting NGTs into diagnostic molecular oncology service.

Choosing a Platform

Instrument selection is a critical decision that has to align with intended test applications, sequencing chemistries, and analytical software. Although multiple platforms are available, a mainstream standard has not emerged. Depending on their goals, laboratories might set up NGTs for improved accuracy of mutation detection, massively higher sequencing capacity per test, massively more targets combined in one test (multiplexing), greater range in sequencing read length, much lower cost per base pair assessed, and economy of specimen volume.

When high-throughput instruments first made their appearance, laboratories paid more attention to the accuracy of base-reading: less accurate sequencing meant more data cleaning and resequencing (1). Now, new instrument designs have narrowed the differences, and test chemistry can have a comparatively large impact on analytical accuracy (Figure 1). The robustness of technical performance can also vary significantly depending upon specimen type. For example, Life Technologies’ sequencing platforms appear to be comparatively more tolerant of low DNA quality and concentration, which is an important consideration for fixed and processed tissues.

https://www.aacc.org/~/media/images/cln/articles/2015/october/carter_fig1_cln_oct15_ed.jpg

Figure 1 Comparison of Sequencing Chemistries

Sequence pile-ups of the same target sequence (2 large genes), all performed on the same analytical instrument. Results are from 4 different chemistries, as designed and supplied by the reagent manufacturers prior to optimization in the laboratory. Red lines represent the limits of exons; the height of the blue columns is proportional to depth of coverage. In this case, the intent of the test design was to provide high depth of coverage so that reflex Sanger sequencing would not be necessary. Courtesy of B. Sadikovic, University of Western Ontario.
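For readers unfamiliar with the metric the blue columns encode, here is a minimal sketch of how per-base depth of coverage is computed from aligned read intervals; the reads and positions are toy values, not data from the figure:

```python
from collections import Counter

def depth_of_coverage(reads, region_start, region_end):
    """Count how many aligned reads overlap each base of a region.

    reads: list of (start, end) alignment intervals, 1-based, inclusive.
    Returns a mapping {position: depth}."""
    depth = Counter()
    for start, end in reads:
        for pos in range(max(start, region_start), min(end, region_end) + 1):
            depth[pos] += 1
    return depth

# Three toy reads over a toy region:
reads = [(100, 150), (120, 170), (130, 180)]
cov = depth_of_coverage(reads, 100, 180)
print(cov[125], cov[140], cov[175])  # 2 3 1
```

Positions where the depth dips too low are exactly the spots where a lab might otherwise need reflex Sanger sequencing.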

 

In addition, batching, robotics, workload volume patterns, maintenance contracts, software licenses, and platform lifetime affect the cost per analyte and per specimen considerably. Royalties and reagent contracts also factor into the cost of operating NGT: in some applications, fees for intellectual property can represent more than 50% of the bench cost of performing a given test, and can increase substantially without warning.

Laboratories must also deal with the problem of obsolescence. Investing in a new platform brings the angst of knowing that better machines and chemistries are just around the corner. Laboratories are buying bigger pieces of equipment with shorter service lives. Before NGTs, major instruments could confidently be expected to remain current for at least 6 to 8 years. Now, a major instrument is obsolete much sooner, often within 2 to 3 years. This means that keeping it in service might cost more than investing in a new platform. Lease-purchase arrangements help mitigate year-to-year fluctuations in capital equipment costs, and maximize the value of old equipment at resale.

One Size Still Does Not Fit All

Laboratories face numerous technical considerations to optimize sequencing protocols, but the test has to be matched to the performance criteria needed for the clinical indication (2). For example, measuring response to treatment depends first upon the diagnostic recognition of mutation(s) in the tumor clone; the marker(s) then have to be quantifiable and indicative of tumor volume throughout the course of disease (Table 1).

As a result, diagnostic tests need to cover many different potential mutations, yet accurately identify any clinically relevant mutations actually present. On the other hand, tests for residual disease need to provide standardized, sensitive, and accurate quantification of a selected marker mutation against the normal background. A diagnostic panel might need 1% to 3% sensitivity across many different mutations. But quantifying early response to induction—and later assessment of minimal residual disease—needs a test that is reliably accurate to the 10⁻⁴ or 10⁻⁵ range for a specific analyte.
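To make that sensitivity gap concrete, here is a back-of-envelope sketch under a simple binomial model; the depths and read thresholds are illustrative, and a real assay must also model sequencing error, which dominates at very low allele fractions:

```python
import math

def prob_detect(depth: int, vaf: float, min_reads: int = 3) -> float:
    """Probability of seeing at least `min_reads` variant reads at a locus
    sequenced to `depth`, for a true variant allele fraction `vaf`.
    Simple binomial model; ignores sequencing error."""
    p_below = sum(
        math.comb(depth, k) * vaf**k * (1 - vaf) ** (depth - k)
        for k in range(min_reads)
    )
    return 1 - p_below

# A 2% variant is comfortably visible at a routine 500x depth:
print(round(prob_detect(500, 0.02), 3))      # ~0.997
# A 1-in-10,000 residual-disease target is essentially invisible there:
print(round(prob_detect(500, 1e-4), 6))      # ~0.00002
# It takes on the order of 100,000 informative reads (or enrichment):
print(round(prob_detect(100_000, 1e-4), 3))  # ~0.997
```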

Covering all types of mutations in one diagnostic test is not yet possible. For example, subtyping of acute myeloid leukemia is both old school (karyotype, fluorescent in situ hybridization, and/or PCR-based or array-based testing for fusion rearrangements, deletions, and segmental gains) and new school (NGT-based panel testing for molecular mutations).

Chemistries that cover both structural variants and copy number variants are not yet in general use, but the advantages of NGTs compared to traditional methods are becoming clearer, such as in colorectal cancer (3). Researchers are also using cell-free DNA (cfDNA) to quantify residual disease and detect resistance mutations (4). Once a clinically significant clone is identified, enrichment techniques help enable extremely sensitive quantification of residual disease (5).

Validation and Quality Assurance

Beyond choosing a platform, two distinct challenges arise in bringing NGTs into the lab. The first is assembling the resources for validation and quality assurance. The second is keeping tests up-to-date as new analytes are needed. Even if a given test chemistry has the flexibility to add analytes without revalidating the entire panel, keeping up with clinical advances is a constant priority.

Due to their throughput and multiplexing capacities, NGT platforms typically require considerable upfront investment to adopt, and training staff to perform testing takes even more time. Proper validation is harder to document: Assembling positive controls, documenting test performance criteria, developing quality assurance protocols, and conducting proficiency testing are all demanding. Labs meet these challenges in different ways. Laboratory-developed tests (LDTs) allow self-determined choice in design, innovation, and control of the test protocol, but can be very expensive to set up.

Food and Drug Administration (FDA)-approved methods are attractive but not always an option. More FDA-approved methods will be marketed, but FDA approval itself brings other trade-offs. There is a cost premium compared to LDTs, and the test methodologies are locked down and not modifiable. This is particularly frustrating for NGTs, which have the specific attraction of extensive multiplexing capacity and accommodating new analytes.

IT and the Evolution of Molecular Oncology Reporting Standards

The options for information technology (IT) pipelines for NGTs are improving rapidly. At the same time, recent studies still show significant inconsistencies and lack of reproducibility when it comes to interpreting variants in array comparative genomic hybridization, panel testing, tumor expression profiling, and tumor genome sequencing. It can be difficult to duplicate published performances in clinical studies because of a lack of sufficient information about the protocol (chemistry) and software. Building bioinformatics capacity is a key requirement, yet skilled people are in short supply and the qualifications needed to work as a bioinformatician in a clinical service are not yet clearly defined.

Tumor biology brings another level of complexity. Bioinformatic analysis must distinguish tumor-specific variants from germline genomic variants. Sequencing of paired normal tissue is often performed as a control, but virtual normal controls may have intriguing advantages (6). One of the biggest challenges is to reproducibly interpret the clinical significance of interactions between different mutations, even with commonly known, well-defined mutations (7). For multiple analyte panels, such as predictive testing for breast cancer, only the performance of the whole panel in a population of patients can be compared; individual patients may be scored into different risk categories by different tests, all for the same test indication.
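The paired-normal idea reduces, at its simplest, to set subtraction over variant calls. A minimal sketch; the variant keys and call sets are hypothetical toy data, and the "virtual normal" stands in for the population-derived control of reference 6:

```python
# Each variant keyed by (chromosome, position, ref_allele, alt_allele).
tumor_calls = {
    ("chr7", 55259515, "T", "G"),
    ("chr17", 7577120, "C", "T"),
    ("chr1", 1234567, "G", "A"),
}
# Calls from the patient's matched normal tissue (germline background):
normal_calls = {
    ("chr1", 1234567, "G", "A"),
}
# A "virtual normal": common variants from unrelated healthy genomes,
# usable when no matched tissue is available (see reference 6):
virtual_normal = {
    ("chr17", 7577120, "C", "T"),
}

somatic_candidates = tumor_calls - normal_calls - virtual_normal
print(sorted(somatic_candidates))  # [('chr7', 55259515, 'T', 'G')]
```

Real pipelines layer allele-fraction, strand, and quality filters on top of this subtraction.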

In large scale sequencing of tumor genomes, which types of mutations are most informative in detecting, quantifying, and predicting the behavior of the tumor over time? The amount and complexity of mutation varies considerably across different tumor types, and while some mutations are more common, stable, and clinically informative than others, the utility of a given tumor marker varies in different clinical situations. And, for a given tumor, treatment effect and metastasis lead to retesting for changes in drug sensitivities.

These complexities mean that IT must be designed into the process from the beginning. Like robotics, IT represents a major ancillary decision. One approach many labs choose is licensed technologies with shared databases that are updated in real time. These are attractive, despite their cost and licensing fees. New tests that incorporate proprietary IT with NGT platforms link the genetic signatures of tumors to clinically significant considerations like tumor classification, recommended methodologies for monitoring response, predicted drug sensitivities, eligible clinical trials, and prognostic classifications. In-house development of such solutions will be difficult, so licensing platforms from commercial partners is more likely to be the norm.

The Commercial Value of Health Records and Test Data

The future of cancer management likely rests on large-scale databases that link hereditary and somatic tumor testing with clinical outcomes. Multiple centers have such large studies underway, and data extraction and analysis are providing increasingly refined interpretations of clinical significance.

Extracting health outcomes to correlate with molecular test results is commercially valuable, as the pharmaceutical, insurance, and healthcare sectors focus on companion diagnostics, precision medicine, and evidence-based health technology assessment. Laboratories that can develop tests based on large-scale integration of test results to clinical utility will have an advantage.

NGTs do offer opportunities for net reductions in the cost of healthcare. But the lag between availability of a test and peer-evaluated demonstration of clinical utility can be considerable. Technical developments arise faster than evidence of clinical utility. For example, immunohistochemistry, estrogen receptor/progesterone receptor status, HER2/neu, and histology are still the major pathological criteria for prognostic evaluation of breast cancer at diagnosis, even though multiple analyte tumor profiling has been described for more than 15 years. Healthcare systems need a more concerted assessment of clinical utility if they are to take advantage of the promises of NGTs in cancer care.

Disruptive Advances

Without a doubt, “disruptive” is an appropriate buzzword in molecular oncology, and new technical advances are about to change how, where, and for whom testing is performed.

• Predictive Testing

Besides cost per analyte, one of the drivers for taking up new technologies is that they enable multiplexing many more analytes with less biopsy material. Single-analyte sequential testing for epidermal growth factor receptor (EGFR), anaplastic lymphoma kinase, and other targets on small biopsies is not sustainable when many more analytes are needed, and even now, a significant proportion of test requests cannot be completed due to lack of suitable biopsy material. Large panels incorporating all the mutations needed to cover multiple tumor types are replacing individual tests in companion diagnostics.

• Cell-Free Tumor DNA

Challenges of cfDNA include standardizing the collection and processing methodologies, timing sampling to minimize the effect of therapeutic toxicity on analytical accuracy, and identifying the most informative sample (DNA, RNA, or protein). But for more and more tumor types, it will be possible to differentiate benign versus malignant lesions, perform molecular subtyping, predict response, monitor treatment, or screen for early detection—all without a surgical biopsy.

cfDNA technologies can also be integrated into core laboratory instrumentation. For example, blood-based EGFR analysis for lung cancer is being developed on the Roche cobas 4800 platform, which will be a significant change from the current standard of testing based upon single tests of DNA extracted from formalin-fixed, paraffin-embedded sections selected by a pathologist (8).

• Whole Genome and Whole Exome Sequencing

Whole genome and whole exome tumor sequencing approaches provide a wealth of biologically important information, and will replace individual or multiple gene test panels as the technical cost of sequencing declines and interpretive accuracy improves (9). Laboratories can apply informatics selectively or broadly to extract much more information at relatively little increase in cost, and the interpretation of individual analytes will be improved by the context of the whole sequence.

• Minimal Residual Disease Testing

Massive resequencing and enrichment techniques can be used to detect minimal residual disease, and will provide an alternative to flow cytometry as costs decline. The challenge is to develop robust analytical platforms that can reliably produce results in a high proportion of patients with a given tumor type, despite using post-treatment specimens with therapy-induced degradation, and a very low proportion of target (tumor) sequence to benign background sequence.

The tumor markers should remain informative for the burden of disease despite clonal evolution across the multiple samples taken over the clinical course and treatment. Quantification needs to be accurate and sensitive down to the 10⁻⁵ range, and cost competitive with flow cytometry.
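One physical constraint behind the 10⁻⁵ figure is input quantity: the specimen must contain enough genome copies for mutant molecules to be present at all. A back-of-envelope sketch, using the standard approximation of ~6.6 pg of DNA per diploid human genome (the copy-number targets are illustrative):

```python
PG_PER_DIPLOID_GENOME = 6.6  # ~6.6 pg DNA per diploid human genome

def genome_equivalents(dna_ng: float) -> float:
    """Approximate number of diploid genome copies in `dna_ng` ng of DNA."""
    return dna_ng * 1000.0 / PG_PER_DIPLOID_GENOME

def min_input_ng(tumor_fraction: float, mutant_copies: int = 10) -> float:
    """DNA input (ng) needed for the sample to contain `mutant_copies`
    mutant genomes at a given tumor fraction (illustrative target)."""
    return mutant_copies / tumor_fraction * PG_PER_DIPLOID_GENOME / 1000.0

print(round(genome_equivalents(100)))  # ~15152 copies in 100 ng
print(round(min_input_ng(1e-4)))       # ~660 ng for 10 copies at 10^-4
print(round(min_input_ng(1e-5)))       # ~6600 ng (6.6 ug) at 10^-5
```

Degraded post-treatment specimens make those input requirements harder still to meet, which is part of the robustness challenge described above.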

• Point-of-Care Test Methodologies

Small, rapid, cheap, and single use point-of-care (POC) sequencing devices are coming. Some can multiplex with analytical times as short as 20 minutes. Accurate and timely testing will be possible in places like pharmacies, oncology clinics, patient service centers, and outreach programs. Whether physicians will trust and act on POC results alone, or will require confirmation by traditional laboratory-based testing, remains to be seen. However, in the simplest type of application, such as a patient known to have a particular mutation, the advantages of POC-based testing to quantify residual tumor burden are clear.

Conclusion

Molecular oncology is moving rapidly from an esoteric niche of diagnostics to a mainstream, required component of integrated clinical laboratory services. While NGTs are markedly reducing the cost per analyte and per specimen, and will certainly broaden the scope and volume of testing performed, the resources required to choose, install, and validate these new technologies are daunting for smaller labs. More rapid obsolescence and increased regulatory scrutiny for LDTs also present significant challenges. Aligning test capacity with approved clinical indications will require careful and constant attention to ensure competitiveness.

References

1. Liu L, Li Y, Li S, et al. Comparison of next-generation sequencing systems. J Biomed Biotechnol 2012; doi:10.1155/2012/251364.

2. Brownstein CA, Beggs AH, Homer N, et al. An international effort towards developing standards for best practices in analysis, interpretation and reporting of clinical genome sequencing results in the CLARITY Challenge. Genome Biol 2014;15:R53.

3. Haley L, Tseng LH, Zheng G, et al. Performance characteristics of next-generation sequencing in clinical mutation detection of colorectal cancers. Mod Pathol 2015; doi:10.1038/modpathol.2015.86 [Epub ahead of print, July 31, 2015].

4. Butler TM, Johnson-Camacho K, Peto M, et al. Exome sequencing of cell-free DNA from metastatic cancer patients identifies clinically actionable mutations distinct from primary disease. PLoS One 2015;10:e0136407.

5. Castellanos-Rizaldos E, Milbury CA, Guha M, et al. COLD-PCR enriches low-level variant DNA sequences and increases the sensitivity of genetic testing. Methods Mol Biol 2014;1102:623–39.

6. Hiltemann S, Jenster G, Trapman J, et al. Discriminating somatic and germline mutations in tumor DNA samples without matching normals. Genome Res 2015;25:1382–90.

7. Lammers PE, Lovly CM, Horn L. A patient with metastatic lung adenocarcinoma harboring concurrent EGFR L858R, EGFR germline T790M, and PIK3CA mutations: The challenge of interpreting results of comprehensive mutational testing in lung cancer. J Natl Compr Canc Netw 2015;12:6–11.

8. Weber B, Meldgaard P, Hager H, et al. Detection of EGFR mutations in plasma and biopsies from non-small cell lung cancer patients by allele-specific PCR assays. BMC Cancer 2014;14:294.

9. Vogelstein B, Papadopoulos N, Velculescu VE, et al. Cancer genome landscapes. Science 2013;339:1546–58.

10. Heitzer E, Auer M, Gasch C, et al. Complex tumor genomes inferred from single circulating tumor cells by array-CGH and next-generation sequencing. Cancer Res 2013;73:2965–75.

11. Healy B. BRCA genes — Bookmaking, fortunetelling, and medical care. N Engl J Med 1997;336:1448–9.

 

 

 



Praluent FDA Approved

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

PRALUENT® (alirocumab) is now approved for additional LDL-C lowering on top of maximally tolerated statin therapy in patients with HeFH or clinical ASCVD
INDICATIONS AND USAGE
PRALUENT is a PCSK9 (Proprotein Convertase Subtilisin/Kexin Type 9) inhibitor antibody indicated as adjunct to diet and maximally tolerated statin therapy for the treatment of adults with heterozygous familial hypercholesterolemia or clinical atherosclerotic cardiovascular disease, who require additional lowering of LDL-C.

The effect of PRALUENT on cardiovascular morbidity and mortality has not been determined.

DOSING INFORMATION
The recommended starting dose of PRALUENT is 75 mg administered subcutaneously once every 2 weeks, since the majority of patients achieve sufficient LDL-C reduction with this dosage. If the LDL-C response is inadequate, the dosage may be increased to the maximum dosage of 150 mg administered every 2 weeks.

Measure LDL-C levels within 4 to 8 weeks of initiating or titrating PRALUENT to assess response and adjust the dose, if needed.
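As a rough illustration of the titration scheme just described, here is a minimal sketch; the numeric LDL-C target is a clinician-chosen assumption rather than part of the label, and the function and parameter names are hypothetical:

```python
DOSES_MG = (75, 150)  # labeled PRALUENT doses, each given every 2 weeks

def next_dose(current_dose_mg: int, ldl_c_mg_dl: float,
              ldl_target_mg_dl: float) -> int:
    """Apply the labeled scheme: start at 75 mg Q2W; if the LDL-C
    response measured within 4-8 weeks of initiation or titration is
    inadequate, escalate to the maximum 150 mg Q2W. The numeric target
    is an illustrative assumption, not from the label."""
    if current_dose_mg not in DOSES_MG:
        raise ValueError("dose must be 75 or 150 mg")
    if ldl_c_mg_dl > ldl_target_mg_dl and current_dose_mg == 75:
        return 150  # inadequate response: up-titrate to maximum dose
    return current_dose_mg  # adequate response, or already at maximum

print(next_dose(75, ldl_c_mg_dl=95, ldl_target_mg_dl=70))  # 150
print(next_dose(75, ldl_c_mg_dl=60, ldl_target_mg_dl=70))  # 75
```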

PRALUENT is a human monoclonal antibody that binds to PCSK9
PRALUENT efficacy was investigated in 5 double-blind, placebo-controlled trials with 3499 patients enrolled: 36% with HeFH and 54% non-FH with clinical ASCVD.
All patients were receiving a maximally tolerated dose of statin with or without other lipid-modifying therapies
3 studies used an initial dose of 75 mg Q2W as part of an up-titration regimen with criteria-based up-titration to 150 mg Q2W at week 12 for patients who did not achieve their prespecified target LDL-C at week 8
2 studies with 150 mg Q2W dose only
Clinical ASCVD is defined in the ACC/AHA guidelines as acute coronary syndromes or a history of any of the following: myocardial infarction, stable or unstable angina, coronary or other arterial revascularization, transient ischemic attack or stroke, or peripheral arterial disease presumed to be of atherosclerotic origin.
All studies met their primary efficacy endpoint measured at week 24
All trials were at least 52 weeks in duration with the primary efficacy endpoint measured at week 24 (mean percent change in LDL-C from baseline)
The first and only FDA-approved PCSK9 inhibitor with 2 doses that allow you to adjust the dose based on your patients’ LDL-C lowering needs
MyPRALUENT™: Comprehensive support for you and your patients
MyPRALUENT is designed to help meet your needs and your patients’ needs
Speak with a MyPRALUENT Care Specialist at 1-844-PRALUENT (1-844-772-5836), option 1
IMPORTANT SAFETY INFORMATION
PRALUENT is contraindicated in patients with a history of a serious hypersensitivity reaction to PRALUENT. Reactions have included hypersensitivity vasculitis and hypersensitivity reactions requiring hospitalization.

Hypersensitivity reactions (e.g., pruritus, rash, urticaria), including some serious events (e.g., hypersensitivity vasculitis and hypersensitivity reactions requiring hospitalization), have been reported with PRALUENT treatment. If signs or symptoms of serious allergic reactions occur, discontinue treatment with PRALUENT, treat according to the standard of care, and monitor until signs and symptoms resolve.

The most commonly occurring adverse reactions (≥5% of patients treated with PRALUENT and occurring more frequently than with placebo) are nasopharyngitis, injection site reactions, and influenza.

Local injection site reactions including erythema/redness, itching, swelling, and pain/tenderness were reported more frequently in patients treated with PRALUENT (7.2% versus 5.1% for PRALUENT and placebo, respectively). Few patients discontinued treatment because of these reactions (0.2% versus 0.4% for PRALUENT and placebo, respectively), but patients receiving PRALUENT had a greater number of injection site reactions, had more reports of associated symptoms, and had reactions of longer average duration than patients receiving placebo.

Neurocognitive events were reported in 0.8% of patients treated with PRALUENT and 0.7% of patients treated with placebo. Confusion or memory impairment were reported more frequently by those treated with PRALUENT (0.2% for each) than in those treated with placebo (<0.1% for each).

Liver-related disorders (primarily related to abnormalities in liver enzymes) were reported in 2.5% of patients treated with PRALUENT and 1.8% of patients treated with placebo, leading to treatment discontinuation in 0.4% and 0.2% of patients, respectively. Increases in serum transaminases to greater than 3 times the upper limit of normal occurred in 1.7% of patients treated with PRALUENT and 1.4% of patients treated with placebo.

The most common adverse reactions leading to treatment discontinuation in patients treated with PRALUENT were allergic reactions (0.6% versus 0.2% for PRALUENT and placebo, respectively) and elevated liver enzymes (0.3% versus <0.1%).

PRALUENT is a human monoclonal antibody. As with all therapeutic proteins, there is a potential for immunogenicity with PRALUENT.

Please see full Prescribing Information.

 



Eric Topol, M.D.

Curators: Larry H Bernstein, MD, FCAP and Aviva Lev-Ari, PhD, RN

Eric Topol, M.D. is professor of genomics and holds the Scripps endowed chair in innovative medicine. He is the director of the Scripps Translational Science Institute in La Jolla, California. Previously, he led the Cleveland Clinic to its #1 ranking in heart care, started a new medical school, and led key discoveries in heart disease.

Professor of Genomics
Department of Molecular and Experimental Medicine
California Campus
Laboratory Website
etopol@scripps.edu
(858) 554-5708

Scripps Research Joint Appointments

Director, Scripps Translational Science Institute
Faculty, Graduate Program

Other Joint Appointments

Chief Academic Officer, Scripps Health
Senior Consultant, Scripps Clinic, Division of Cardiovascular Diseases

Research Focus

My research is on individualized medicine, using the genome and digital technologies to understand each person at a granular biologic and physiologic level in order to determine appropriate therapies and prevention. An example is the use of pharmacogenomics and our research on clopidogrel (Plavix). By determining why such a large proportion of people do not respond to this medication, we can use alternative treatment strategies to prevent blood clots.

 

Education

M.D., University of Rochester, New York, 1979
B.A., Biomedicine, University of Virginia, Charlottesville, 1975

Professional Experience

University of Virginia, B.A. With Highest Distinction, 1975
University of Rochester, M.D. With Honor, 1979
University of California, San Francisco, Internal Medicine Residency, 1979-1982
Johns Hopkins, Cardiology Fellowship, 1982-1985
University of Michigan, Professor with Tenure, Department of Internal Medicine, 1985-1991
Cleveland Clinic, Chairman of the Department of Cardiovascular Medicine, 1991-2006
Cleveland Clinic, Chief Academic Officer, 2000-2005
Cleveland Clinic Lerner College of Medicine, Founder and Provost
Case Western Reserve University, Professor of Genetics, 2003-2006

Awards & Professional Activities

Elected to Institute of Medicine, National Academy of Sciences
Simon Dack Award, American College of Cardiology
American Heart Association, Top 10 Research Advances (2001, 2004)
Top 10 Most Cited Researchers in Medicine, Institute for Scientific Information
Doctor of the Decade, Thomson Scientific Award

Selected References

Goetz L, Bethel K, Topol EJ. Rebooting cancer tissue handling in the sequencing era. JAMA 309: in press, 2013

Harper AR, Topol EJ. Pharmacogenomics in clinical practice and drug development. Nature Biotechnology 2012;30(11):1117-24. [PMID: 23138311]

Komatireddy R, Topol EJ. Medicine Unplugged: The Future of Laboratory Medicine. Clin Chem. 2012 Oct 15. [PMID: 23071365]

Harismendy O, Notani D, Song X, Rahim NG, Tanasa B, Heintzman N, Ren B, Fu X-D, Topol EJ, Rosenfeld MG, Frazer KA. 9p21 DNA variants associated with coronary artery disease impair interferon-γ signalling response. Nature 470(7333):264-268, 2011. [PMID 21307941]

Bloss CS, Schork NJ, Topol EJ. Effect of Direct-to-Consumer Genomewide Profiling to Assess Disease Risk. New England Journal of Medicine 364(6):524-534, 2011. [PMID 21226570]

Topol EJ, Schork NJ. Catapulting clopidogrel pharmacogenomics forward. Nature Medicine 17(1):40-41, 2011. [PMID 21217678]

Rosenberg S, et al, Topol EJ; PREDICT Investigators. Multicenter validation of the diagnostic accuracy of a blood-based gene expression test for assessing coronary artery disease in nondiabetic patients. Annals of Internal Medicine 153(7):425-434, 2010. [PMID 20921541]

Topol EJ. Transforming Medicine via Digital Innovation. Science Translational Medicine 2(16):16cm4, 2010. [PMID 20371472]

http://creativedestructionofmedicine.com/?p=3

The wireless future of medicine

http://www.ted.com/talks/eric_topol_the_wireless_future_of_medicine

The Financial Post’s Digital revolution in antiquated health-care industry a major operation

Listen to Dr. Topol’s podcast interview with Knowledge@Wharton

In his new book, The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Health Care, Eric Topol argues that medicine is set to undergo its biggest shakeup in history, pushed by demanding consumers and the availability of game-changing technology. Topol — a cardiologist, director of the Scripps Translational Science Institute and co-founder of the West Wireless Health Institute in La Jolla, Calif. — was recently interviewed for Knowledge@Wharton by C. William Hanson, III, a professor of anesthesiology and critical care, and director, surgical intensive care, at the Hospital of the University of Pennsylvania. Hanson’s latest book is titled Smart Medicine: How the Changing Role of Doctors Will Revolutionize Health Care, published in 2011.

Below is an edited transcript of the conversation.

William Hanson: I thought it might be worthwhile to quickly give you a sense of who I am and where I’m coming from [in this interview]. I’m an anesthesiologist and an intensivist, primarily a surgical intensivist, and serve as chief medical information officer at Penn. So I have some interests that will skew in that direction.

I love the title of your book. There are many people on both sides of this question. Some would say that the creative destruction of medicine is a pretty scary concept, and I think there would be plenty of us who would agree that something drastic needs to happen. You’re obviously in the latter category.

Eric Topol: I’m in the [group that feels] something drastic needs to happen. I think it can happen, it will happen and I’m hoping that we can help facilitate or catalyze that.

Hanson: You have been in a prominent role in terms of questioning traditional medical concepts. Maybe you could describe for the audience what your personal practice is and some of the issues in which you have engaged the traditional medical establishment in the past.

Topol: What I’ve done to try to change medicine in many different ways [includes] research on how to come up with better therapies. These were in large trials, as large as 40,000 patients with heart attacks, but also in [other] ways, such as starting a new medical school with a very innovative curriculum and challenging a drug safety issue which was really important for the public. So I’ve had different experiences over the years.

But what was changing for me was that four or five years ago, we recognized we had this new emerging capability of digitizing human beings, which we’ve never had before. Everybody is used to digitizing books, movies and newspapers, whatever. But when you digitize human beings by knowing the sequence of their DNA, all their physiologic metrics, like their vital signs, their anatomy [and so forth], this comes together as a unique kairos — this supreme opportune moment in medicine.

Hanson: That’s a nice lead in. I want to return to that digitization because it is something I’m dealing with in our IT systems, as I’m sure you are — what to do with the digitized information, how much of it to keep, how to analyze it. But I wanted to come back to your book title and to one of the gentlemen who endorsed the book, Clayton Christensen. He has written a couple of books as you know, including The Innovator’s Dilemma and The Innovator’s Prescription.

Topol: He has written three books on innovation and is renowned for his insights and leadership in [that] space. But there’s a little bit of a difference between us.

Hanson: Maybe you could elaborate on that.

Topol: Yes. I look to him as one of the real guiding lights of innovation. He’s not a physician. In the book, Innovator’s Prescription, he worked with a young physician, Jason Hwang, who is now out at Stanford.

Hanson: Yes, I have met him.

Topol: The difference, though, is that I am coming at it from almost three decades in the medical profession, and I’m not calling for an innovation. He calls it disruptive innovation. I’m looking at a much more radical thing. This is like taking what Clayton has popularized [and making it much bigger] … in terms of how transformative this can be, this whole ability to digitize human beings.

Hanson: In his work, he has talked about the digitization of the music industry, for example, and the film industry. Recognizing that you’re dealing at a much deeper level with the medical side of things, [it has to do with] how products enter the market at the low end and disrupt and take over higher-end products. For me and you at academic medical centers, where we think we’re providing state of the art care, I wonder to what extent we are likely to be made irrelevant by radical disruptions of the kind you’re talking about. What do you think about that?

Topol: I think that the medical community has been incredibly resistant to change. That’s across not just academic medical centers, but the whole continuum of medicine, [including] practicing physicians. But there is a consumer-driven health care revolution out there where each individual has access to their smart phone, all their vital signs and relevant data. There’s an ability to tap into their DNA sequence and all of their genomics. And of course that’s superimposed on this digital infrastructure that each person has now with a social network, with broadband Internet access and pervasive connectivity.

The Atlantic’s Q&A with Dr. Topol

Destroying Medicine to Rebuild It: Eric Topol on Patients Using Data

The emergency announcement on the transcontinental flight was terse and urgent: “Is there a doctor on board?” A passenger in distress was feeling intense pressure in his chest.

Eric Topol strode down the aisle to examine the passenger to see if he was having a heart attack, a diagnosis that normally would be tough at 35,000 feet. But Topol was armed with a prototype device that can take a person’s electrocardiogram (ECG) using a smartphone. The director of the Scripps Translational Science Institute near San Diego, he had just demonstrated how it worked during a lecture in Washington, D.C.

“It’s a case that fits over your iPhone with two built-in sensors connected to an app,” says Topol, showing me the device, made by Oklahoma City-based AliveCor. “You put your fingers on the sensors, or put them up to your chest, and it works like an ECG that you read in real-time on your phone.”

Dr. Topol’s guest blog for Forbes, The Power of Digitizing Human Beings, and his Q&A with Salon, The Coming Medical Revolution

Read Wired’s Q&A with Dr. Topol: Why Doctors Need to Embrace Their Digital Future Now

Slate featured the book on their blog “Future Tense”

And here, an interview on Keen On (TechCrunch TV): Why the Entrepreneurial Opportunities are Limitless


