
Archive for the ‘Cardiovascular Pharmacogenomics’ Category

Reporter and Curator: Dr. Sudipta Saha, Ph.D.

 

A heart-healthy diet has been the basis of atherosclerotic cardiovascular disease (ASCVD) prevention and treatment for decades. The potential cardiovascular (CV) benefits of specific individual components of the “food-ome” (defined as the vast array of foods and their constituents) are still incompletely understood, and nutritional science continues to evolve.

 

A rigorous scientific evidence base in nutrition has been difficult to establish because of the complex interplay between nutrients and the other healthy lifestyle behaviours that accompany changes in dietary habits. Meanwhile, several controversial dietary patterns, foods, and nutrients have received significant media exposure and remain mired in hype.

 

Decades of research have significantly advanced our understanding of the role of diet in the prevention and treatment of ASCVD. The totality of evidence includes randomized controlled trials (RCTs), cohort studies, case-control studies, and case series/reports as well as systematic reviews and meta-analyses. Although a robust body of evidence from RCTs testing nutritional hypotheses is available, it is not feasible to obtain meaningful RCT data for all diet and health relationships.

 

Studying the preventive effects of diet on ASCVD outcomes requires many years of follow-up, because atherosclerosis develops over decades, and such trials may be cost-prohibitive. Most RCTs are of relatively short duration and have limited sample sizes. Dietary RCTs are also limited by frequent lack of blinding to the intervention and by confounding resulting from imperfect diet control (replacing 1 nutrient or food with another affects other aspects of the diet).

 

In addition, some diet and health relationships cannot be ethically evaluated. For example, it would be unethical to study the effects of certain nutrients (e.g., sodium, trans fat) on cardiovascular disease (CVD) morbidity and mortality because they increase major risk factors for CVD. Epidemiological studies have suggested associations among diet, ASCVD risk factors, and ASCVD events. Prospective cohort studies yield the strongest observational evidence because the measurement of dietary exposure precedes the development of the disease.

 

However, limitations of prospective observational studies include: imprecise exposure quantification; co-linearity among dietary exposures (e.g., dietary fiber tracks with magnesium and B vitamins); consumer bias, whereby consumption of a food or food category may be associated with non-dietary practices that are difficult to control (e.g., stress, sleep quality); residual confounding (some non-dietary risk factors are not measured); and effect modification (the dietary exposure varies according to individual/genetic characteristics).

 

It is important to highlight that many healthy nutrition behaviours occur with other healthy lifestyle behaviours (regular physical activity, adequate sleep, no smoking, among others), which may further confound results. Case-control studies are inexpensive, relatively easy to do, and can provide important insight about an association between an exposure and an outcome. However, the major limitation is how the study population is selected or how retrospective data are collected.

 

In nutrition studies that involve keeping a food diary or collecting food frequency information (i.e., recall or record), accurate memory and recording of food and nutrient intake over prolonged periods can be problematic and subject to error, especially before the diagnosis of disease.

 

The advent of mobile technology and food diaries may provide opportunities to improve the accuracy of recording dietary intake and may lead to more robust evidence. Finally, nutrition science has been further complicated by private-sector funding, which may influence nutrition policies and practices.

 

So, the future health of the global population largely depends on a shift to healthier dietary patterns. Green leafy vegetables and antioxidant-rich foods have significant cardioprotective properties when consumed daily, and plant-based proteins are significantly more heart-healthy than animal proteins.

 

However, in the search for the perfect dietary pattern and foods that provide miraculous benefits, consumers are vulnerable to unsubstantiated health benefit claims. As clinicians, it is important to stay abreast of the current scientific evidence to provide meaningful and effective nutrition guidance to patients for ASCVD risk reduction.

 

Available evidence supports CV benefits of nuts, olive oil and other liquid vegetable oils, plant-based diets and plant-based proteins, green leafy vegetables, and antioxidant-rich foods. Although juicing may be of benefit for individuals who would otherwise not consume adequate amounts of fresh fruits and vegetables, caution must be exercised to avoid excessive calorie intake. Juicing fruits/vegetables with pulp removal increases calorie intake, so portion control is necessary to avoid weight gain and its adverse effects on cardiovascular health.

 

There is currently no evidence to support regular intake of antioxidant dietary supplements. Gluten is an issue for those with gluten-related disorders, and it is important to be mindful of this in routine clinical practice; however, there is no evidence for CV or weight-loss benefits of gluten avoidance, apart from the potential caloric restriction associated with a gluten-free diet.

 

References:

 

https://www.ncbi.nlm.nih.gov/pubmed/28254181

 

https://www.sciencedirect.com/science/article/pii/S0735109713060294?via%3Dihub

 

http://circ.ahajournals.org/content/119/8/1161

 

http://refhub.elsevier.com/S0735-1097(17)30036-0/sref6

 

https://www.scopus.com/record/display.uri?eid=2-s2.0-0031709841&origin=inward&txGid=af40773f7926694c7f319d91efdcd40c

 

https://www.magonlinelibrary.com/doi/10.12968/hosp.2000.61.4.1875

 

https://jamanetwork.com/journals/jamainternalmedicine/article-abstract/2548255

 

https://pharmaceuticalintelligence.com/2018/05/31/supplements-offer-little-cv-benefit-and-some-are-linked-to-harm-in-j-am-coll-cardiol/


Regulatory MicroRNAs in Aberrant Cholesterol Transport and Metabolism

Curator: Marzan Khan, B.Sc

Aberrant levels of lipids and cholesterol accumulation in the body lead to cardiometabolic disorders such as atherosclerosis, one of the leading causes of death in the Western world(1). The physical manifestation of this condition is the build-up of plaque along the arterial endothelium, which narrows the arteries and impedes smooth blood flow(2). This obstructive deposition of plaque is merely the initiation of atherosclerosis; it is enriched in LDL cholesterol (LDL-C) as well as foam cells, macrophages carrying an overload of toxic, oxidized LDL(2). As the condition progresses, the plaque further obstructs blood flow and promotes blood clots, ultimately leading to myocardial infarction, stroke and other cardiovascular diseases(2). This is why LDL is referred to as “the bad cholesterol”(2).

To date, statins are the most widely prescribed lipid-lowering drugs; they inhibit the enzyme 3-hydroxy-3-methylglutaryl-CoA reductase (HMGCR), which catalyzes the rate-limiting step in de novo cholesterol biosynthesis(1). But some people cannot continue with the medication because of its harmful side effects(1). With the need to develop newer therapeutics to combat cardiovascular diseases, Harvard University researchers at Massachusetts General Hospital discovered 4 microRNAs that control cholesterol, triglyceride, and glucose homeostasis(3).

MicroRNAs are non-coding regulatory elements, approximately 22 nucleotides long, that control the post-transcriptional expression of genes(3). The liver is the center of carbohydrate and lipid metabolism. Stringent regulation of the endogenous LDL-receptor (LDL-R) pathway in the liver is crucial to maintain a minimal concentration of LDL particles in the blood(3). A mechanism whereby peripheral tissues and macrophages can get rid of their excess cholesterol is mediated by ATP-binding cassette, subfamily A, member 1 (ABCA1)(3). ABCA1 generates nascent HDL particles, dubbed “the good cholesterol,” which travel back to the liver, where their cargo of triglycerides and cholesterol is excreted(3).

A meta-analysis of genome-wide association studies (GWASs) carried out by the researchers revealed 4 microRNAs (miR-128-1, miR-148a, miR-130b, and miR-301b) lying close to single-nucleotide polymorphisms (SNPs) associated with abnormal metabolism and transport of lipids and cholesterol(3). Experimental analyses in relevant cell types, such as liver cells and macrophages, have shown that these microRNAs bind to the 3’ UTRs of both LDL-R and ABCA1 transcripts and silence their activity. Overexpression of miR-128-1 and miR-148a in mouse models caused circulating HDL-C to drop. Corroborating the theory under investigation further, their inhibition led to increased clearance of LDL from the blood and greater accumulation in the liver(3).

The finding that antisense inhibition of miR-128-1 increased insulin signaling in mice suggests that abnormal expression of miR-128-1 might cause insulin resistance in the metabolic syndrome, and defective insulin signaling in hepatic steatosis and dyslipidemia(3).

Further examination of miR-148a established that Liver-X-Receptor (LXR) activation of sterol regulatory element-binding protein 1c (SREBP1c), the transcription factor responsible for controlling fatty acid production and glucose metabolism, also mediates the expression of miR-148a(4,5). That the promoter region of miR-148a contains binding sites for SREBP1c was shown by chromatin immunoprecipitation combined with massively parallel sequencing (ChIP-seq)(4). More specifically, SREBP1c attaches to the E-box2, E-box3 and E-box4 elements in the miR-148a promoter to control its expression(4).

Earlier, the same group (Anders Näär and his team) had found another microRNA, miR-33, that blocks HDL generation; this blockage was reversed upon antisense targeting of miR-33(6).

These experimental data substantiate the theory that miRNAs are important regulators of lipoprotein receptors and transporter proteins, and underscore the potential of antisense technologies to reverse their gene-silencing effects on LDL-R and ABCA1(4). Such a therapeutic approach, which would lower LDL-C and raise HDL-C, seems a promising strategy for treating atherosclerosis and other cardiovascular diseases(4).

References:

1. Goedeke L, Wagschal A, Fernández-Hernando C, Näär AM. miRNA regulation of LDL-cholesterol metabolism. Biochim Biophys Acta. 2016 Dec;1861(12 Pt B):2047-2052.

https://www.ncbi.nlm.nih.gov/pubmed/26968099

2. MedicalNewsToday. Joseph Nordqvist. Atherosclerosis: Causes, Symptoms and Treatments. 13.08.2015.

http://www.medicalnewstoday.com/articles/247837.php

3. Wagschal A, Najafi-Shoushtari SH, Wang L, Goedeke L, Sinha S, deLemos AS, Black JC, Ramírez CM, Li Y, Tewhey R, Hatoum I, Shah N, Lu Y, Kristo F, Psychogios N, Vrbanac V, Lu YC, Hla T, de Cabo R, Tsang JS, Schadt E, Sabeti PC, Kathiresan S, Cohen DE, Whetstine J, Chung RT, Fernández-Hernando C, Kaplan LM, Bernards A, Gerszten RE, Näär AM. Genome-wide identification of microRNAs regulating cholesterol and triglyceride homeostasis. Nat Med. 2015 Nov;21(11):1290.

https://www.ncbi.nlm.nih.gov/pubmed/26501192

4. Goedeke L, Rotllan N, Canfrán-Duque A, Aranda JF, Ramírez CM, Araldi E, Lin CS, Anderson NN, Wagschal A, de Cabo R, Horton JD, Lasunción MA, Näär AM, Suárez Y, Fernández-Hernando C. MicroRNA-148a regulates LDL receptor and ABCA1 expression to control circulating lipoprotein levels. Nat Med. 2015 Nov;21(11):1280-9.

https://www.ncbi.nlm.nih.gov/pubmed/26437365

5. Eberlé D, Hegarty B, Bossard P, Ferré P, Foufelle F. SREBP transcription factors: master regulators of lipid homeostasis. Biochimie. 2004 Nov;86(11):839-48.

https://www.ncbi.nlm.nih.gov/pubmed/15589694

6. Harvard Medical School. News. MicroRNAs and Metabolism.

https://hms.harvard.edu/news/micrornas-and-metabolism

7. MGH – Four microRNAs identified as playing key roles in cholesterol, lipid metabolism

http://www.massgeneral.org/about/pressrelease.aspx?id=1862

 

Other related articles published in this Open Access Online Scientific Journal include the following:

 

  • Cardiovascular Diseases, Volume Three: Etiologies of Cardiovascular Diseases: Epigenetics, Genetics and Genomics,

on Amazon since 11/29/2015

http://www.amazon.com/dp/B018PNHJ84

 

HDL oxidation in type 2 diabetic patients

Larry H. Bernstein, MD, FCAP, Curator

https://pharmaceuticalintelligence.com/2015/11/27/hdl-oxidation-in-type-2-diabetic-patients/

 

HDL-C: Target of Therapy – Steven E. Nissen, MD, MACC, Cleveland Clinic vs Peter Libby, MD, BWH

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2014/11/07/hdl-c-target-of-therapy-steven-e-nissen-md-macc-cleveland-clinic-vs-peter-libby-md-bwh/

 

High-Density Lipoprotein (HDL): An Independent Predictor of Endothelial Function & Atherosclerosis, A Modulator, An Agonist, A Biomarker for Cardiovascular Risk

Curator: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2013/03/31/high-density-lipoprotein-hdl-an-independent-predictor-of-endothelial-function-artherosclerosis-a-modulator-an-agonist-a-biomarker-for-cardiovascular-risk/

 

Risk of Major Cardiovascular Events by LDL-Cholesterol Level (mg/dL): Among those treated with high-dose statin therapy, more than 40% of patients failed to achieve an LDL-cholesterol target of less than 70 mg/dL.

Reporter: Aviva Lev-Ari, PhD., RN

https://pharmaceuticalintelligence.com/2014/07/29/risk-of-major-cardiovascular-events-by-ldl-cholesterol-level-mgdl-among-those-treated-with-high-dose-statin-therapy-more-than-40-of-patients-failed-to-achieve-an-ldl-cholesterol-target-of-less-th/

 

LDL, HDL, TG, ApoA1 and ApoB: Genetic Loci Associated With Plasma Concentration of these Biomarkers – A Genome-Wide Analysis With Replication

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2013/12/18/ldl-hdl-tg-apoa1-and-apob-genetic-loci-associated-with-plasma-concentration-of-these-biomarkers-a-genome-wide-analysis-with-replication/

 

Two Mutations, in the PCSK9 Gene: Eliminates a Protein involved in Controlling LDL Cholesterol

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2013/04/15/two-mutations-in-a-pcsk9-gene-eliminates-a-protein-involve-in-controlling-ldl-cholesterol/

Artherogenesis: Predictor of CVD – the Smaller and Denser LDL Particles

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2012/11/15/artherogenesis-predictor-of-cvd-the-smaller-and-denser-ldl-particles/

 

A Concise Review of Cardiovascular Biomarkers of Hypertension

Curator: Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2016/04/25/a-concise-review-of-cardiovascular-biomarkers-of-hypertension/

 

Triglycerides: Is it a Risk Factor or a Risk Marker for Atherosclerosis and Cardiovascular Disease ? The Impact of Genetic Mutations on (ANGPTL4) Gene, encoder of (angiopoietin-like 4) Protein, inhibitor of Lipoprotein Lipase

Reporters, Curators and Authors: Aviva Lev-Ari, PhD, RN and Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2016/03/13/triglycerides-is-it-a-risk-factor-or-a-risk-marker-for-atherosclerosis-and-cardiovascular-disease-the-impact-of-genetic-mutations-on-angptl4-gene-encoder-of-angiopoietin-like-4-protein-that-in/

 

Excess Eating, Overweight, and Diabetic

Larry H Bernstein, MD, FCAP, Curator

https://pharmaceuticalintelligence.com/2015/11/15/excess-eating-overweight-and-diabetic/

 

Obesity Issues

Larry H. Bernstein, MD, FCAP, Curator

https://pharmaceuticalintelligence.com/2015/11/12/obesity-issues/

 


Etiologies of Cardiovascular Diseases: Epigenetics, Genetics and Genomics: Request for Book Review Writing on Amazon.com, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)



Series A: e-Books on Cardiovascular Diseases
 

Series A Content Consultant: Justin D Pearlman, MD, PhD, FACC

VOLUME THREE

Etiologies of Cardiovascular Diseases:

Epigenetics, Genetics and Genomics

http://www.amazon.com/dp/B018PNHJ84

 

by  

Larry H Bernstein, MD, FCAP, Senior Editor, Author and Curator

and

Aviva Lev-Ari, PhD, RN, Editor and Curator

Introduction to Volume Three 

PART 1
Genomics and Medicine

1.1  Genomics and Medicine: The Physician’s View

1.2  Ribozymes and RNA Machines – Work of Jennifer A. Doudna

1.3  Genomics and Medicine: Contributions of Genetics and Genomics to Cardiovascular Disease Diagnoses

1.4 Genomics Orientations for Individualized Medicine, Volume One

1.4.1 CVD Epidemiology, Ethnic subtypes Classification, and Medication Response Variability: Cardiology, Genomics and Individualized Heart Care: Framingham Heart Study (65 y-o study) & Jackson Heart Study (15 y-o study)

1.4.2 What comes after finishing the Euchromatic Sequence of the Human Genome?

1.5  Genomics in Medicine – Establishing a Patient-Centric View of Genomic Data

 

PART 2
Epigenetics – Modifiable Factors Causing Cardiovascular Diseases

2.1 Diseases Etiology

2.1.1 Environmental Contributors Implicated as Causing Cardiovascular Diseases

2.1.2 Diet: Solids, Fluid Intake and Nutraceuticals

2.1.3 Physical Activity and Prevention of Cardiovascular Diseases

2.1.4 Psychological Stress and Mental Health: Risk for Cardiovascular Diseases

2.1.5 Correlation between Cancer and Cardiovascular Diseases

2.1.6 Medical Etiologies for Cardiovascular Diseases: Evidence-based Medicine – Leading DIAGNOSES of Cardiovascular Diseases, Risk Biomarkers and Therapies

2.1.7 Signaling Pathways

2.1.8 Proteomics and Metabolomics

2.1.9 Sleep and Cardiovascular Diseases

2.2 Assessing Cardiovascular Disease with Biomarkers

2.2.1 Issues in Genomics of Cardiovascular Diseases

2.2.2 Endothelium, Angiogenesis, and Disordered Coagulation

2.2.3 Hypertension BioMarkers

2.2.4 Inflammatory, Atherosclerotic and Heart Failure Markers

2.2.5 Myocardial Markers

2.3  Therapeutic Implications: Focus on Ca(2+) signaling, platelets, endothelium

2.3.1 The Centrality of Ca(2+) Signaling and Cytoskeleton Involving Calmodulin Kinases and Ryanodine Receptors in Cardiac Failure, Arterial Smooth Muscle, Post-ischemic Arrhythmia, Similarities and Differences, and Pharmaceutical Targets

2.3.2 EMRE in the Mitochondrial Calcium Uniporter Complex

2.3.3 Platelets in Translational Research – 2: Discovery of Potential Anti-platelet Targets

2.3.4 The Final Considerations of the Role of Platelets and Platelet Endothelial Reactions in Atherosclerosis and Novel Treatments

2.3.5 Nitric Oxide Synthase Inhibitors (NOS-I)

2.3.6 Resistance to Receptor of Tyrosine Kinase

2.3.7 Oxidized Calcium Calmodulin Kinase and Atrial Fibrillation

2.3.8 Advanced Topics in Sepsis and the Cardiovascular System at its End Stage

2.4 Comorbidity of Diabetes and Aging

2.4.1 Heart and Aging Research in Genomic Epidemiology: 1700 MIs and 2300 coronary heart disease events among about 29 000 eligible patients

2.4.2 Pathophysiological Effects of Diabetes on Ischemic-Cardiovascular Disease and on Chronic Obstructive Pulmonary Disease (COPD)

2.4.3 Risks of Hypoglycemia in Diabetics with Chronic Kidney Disease (CKD)

2.4.4  Mitochondrial Mechanisms of Disease in Diabetes Mellitus

2.4.5 Mitochondria: More than just the “powerhouse of the cell”

2.4.6  Pathophysiology of GLP-1 in Type 2 Diabetes

2.4.7 Developments in the Genomics and Proteomics of Type 2 Diabetes Mellitus and Treatment Targets

2.4.8 CaMKII Inhibition in Obese, Diabetic Mice leads to Lower Blood Glucose Levels

2.4.9 Protein Target for Controlling Diabetes, Fractalkine: Mediator cell-to-cell Adhesion though CX3CR1 Receptor, Released from cells Stimulate Insulin Secretion

2.4.10 Peroxisome proliferator-activated receptor (PPAR-gamma) Receptors Activation: PPARγ transrepression for Angiogenesis in Cardiovascular Disease and PPARγ transactivation for Treatment of Diabetes

2.4.11 CABG or PCI: Patients with Diabetes – CABG Reigns Supreme

2.4.12 Reversal of Cardiac Mitochondrial Dysfunction

2.4.13  BARI 2D Trial Outcomes

2.4.14 Overview of new strategy for treatment of T2DM: SGLT2 inhibiting oral antidiabetic agents

2.5 Drug Toxicity and Cardiovascular Diseases

2.5.1 Predicting Drug Toxicity for Acute Cardiac Events

2.5.2 Cardiotoxicity and Cardiomyopathy Related to Drugs Adverse Effects

2.5.3 Decoding myocardial Ca2+ signals across multiple spatial scales: A role for sensitivity analysis

2.5.4. Leveraging Mathematical Models to Understand Population Variability in Response to Cardiac Drugs: Eric Sobie, PhD

2.5.5 Exploiting mathematical models to illuminate electrophysiological variability between individuals.

2.5.6 Clinical Effects and Cardiac Complications of Recreational Drug Use: Blood pressure changes, Myocardial ischemia and infarction, Aortic dissection, Valvular damage, and Endocarditis, Cardiomyopathy, Pulmonary edema and Pulmonary hypertension, Arrhythmias, Pneumothorax and Pneumopericardium

 

2.6 Male and Female Hormonal Replacement Therapy: The Benefits and the Deleterious Effects on Cardiovascular Diseases

2.6.1  Testosterone Therapy for Idiopathic Hypogonadotrophic Hypogonadism has Beneficial and Deleterious Effects on Cardiovascular Risk Factors

2.6.2 Heart Risks and Hormones (HRT) in Menopause: Contradiction or Clarification?

2.6.3 Calcium Dependent NOS Induction by Sex Hormones: Estrogen

2.6.4 Role of Progesterone in Breast Cancer Progression

PART 3
Determinants of Cardiovascular Diseases Genetics, Heredity and Genomics Discoveries

Introduction

3.1 Why cancer cells contain abnormal numbers of chromosomes (Aneuploidy)

3.1.1 Aneuploidy and Carcinogenesis

3.2 Functional Characterization of Cardiovascular Genomics: Disease Case Studies @ 2013 ASHG

3.3 Leading DIAGNOSES of Cardiovascular Diseases covered in Circulation: Cardiovascular Genetics, 3/2010 – 3/2013

3.3.1: Heredity of Cardiovascular Disorders

3.3.2: Myocardial Damage

3.3.3: Hypertension and Atherosclerosis

3.3.4: Ethnic Variation in Cardiac Structure and Systolic Function

3.3.5: Aging: Heart and Genetics

3.3.6: Genetics of Heart Rhythm

3.3.7: Hyperlipidemia, Hyper Cholesterolemia, Metabolic Syndrome

3.3.8: Stroke and Ischemic Stroke

3.3.9: Genetics and Vascular Pathologies and Platelet Aggregation, Cardiac Troponin T in Serum

3.3.10: Genomics and Valvular Disease

3.4  Commentary on Biomarkers for Genetics and Genomics of Cardiovascular Disease

PART 4
Individualized Medicine Guided by Genetics and Genomics Discoveries

4.1 Preventive Medicine: Cardiovascular Diseases

4.1.1 Personal Genomics for Preventive Cardiology Randomized Trial Design and Challenges

4.2 Gene-Therapy for Cardiovascular Diseases

4.2.1 Genetic Basis of Cardiomyopathy

4.3 Congenital Heart Disease/Defects

4.4 Cardiac Repair: Regenerative Medicine

4.4.1 A Powerful Tool For Repairing Damaged Hearts

4.4.2 Modified RNA Induces Vascular Regeneration After a Heart Attack

4.5 Pharmacogenomics for Cardiovascular Diseases

4.5.1 Blood Pressure Response to Antihypertensives: Hypertension Susceptibility Loci Study

4.5.2 Statin-Induced Low-Density Lipoprotein Cholesterol Reduction: Genetic Determinants in the Response to Rosuvastatin

4.5.3 SNPs in apoE are found to influence statin response significantly. Less frequent variants in PCSK9 and smaller effect sizes in SNPs in HMGCR

4.5.4 Voltage-Gated Calcium Channel and Pharmacogenetic Association with Adverse Cardiovascular Outcomes: Hypertension Treatment with Verapamil SR (CCB) vs Atenolol (BB) or Trandolapril (ACE)

4.5.5 Response to Rosuvastatin in Patients With Acute Myocardial Infarction: Hepatic Metabolism and Transporter Gene Variants Effect

4.5.6 Helping Physicians identify Gene-Drug Interactions for Treatment Decisions: New ‘CLIPMERGE’ program – Personalized Medicine @ The Mount Sinai Medical Center

4.5.7 Is Pharmacogenetic-based Dosing of Warfarin Superior for Anticoagulation Control?

Summary & Epilogue to Volume Three

 

 


Regulation of mesenchymal cell generation

 

Curator: Larry H. Bernstein, MD, FCAP

from Butyrov

LPBI

 

 

Controlling Mesenchymal Stem Cell Activity With Microparticles Loaded With Small Molecules

M. Butyrov      Beyond the Dish

Mesenchymal stem cells are the subject of many clinical trials and show a potent ability to down-regulate unwanted immune responses and quell inflammation. A genuine challenge with mesenchymal stem cells (MSCs) is controlling the genes they express and the proteins they secrete.

A new publication details the strategy of one enterprising laboratory to control MSC function. Jeffrey Karp from the Harvard Stem Cell Institute, Maneesha Inamdar from the Institute for Stem Cell Biology and Regenerative Medicine in Bangalore, India, and their colleagues used microparticles that are loaded with small molecules and are readily taken up by cultured MSCs.

In this paper, which appeared in Stem Cell Reports (DOI: http://dx.doi.org/10.1016/j.stemcr.2016.05.003), human MSCs were stimulated with a small signaling protein called Tumor Necrosis Factor-alpha (TNF-alpha). TNF-alpha makes MSCs “angry,” and they pour out pro-inflammatory molecules upon stimulation. However, to these TNF-alpha-stimulated MSCs, Karp and others added tiny microparticles loaded with a small molecule called TPCA-1. TPCA-1 inhibits the NF-κB signaling pathway, which is one of the major signal transduction pathways involved in inflammation.


Delivery of these TPCA-1-containing microparticles damped the production of pro-inflammatory molecules by TNF-alpha-treated MSCs for at least 6 days. When the culture medium from TPCA-1-loaded MSCs was given to different cell types, the secreted molecules reduced the recruitment of white blood cells called monocytes, indicative of the anti-inflammatory character of TPCA-1-treated MSCs. The culture medium from these cells also prevented the differentiation of human cardiac fibroblasts into collagen-making cells called “myofibroblasts.” Myofibroblasts lay down the collagen that produces the heart scar after a heart attack, so this is a further indication of the anti-inflammatory nature of the molecules made by these TPCA-1-treated MSCs.

These results are important because they show that MSC activities can be manipulated without gene therapy. It is possible that such non-gene-therapy-based approaches can be used to fine-tune MSC activity and the types of molecules secreted by implanted MSCs. Furthermore, given the effect of these cells on monocytes and cardiac fibroblasts, perhaps microparticle-treated MSCs can prevent the adverse remodeling that occurs in the heart after a heart attack.

 

Controlled Inhibition of the Mesenchymal Stromal Cell Pro-inflammatory Secretome via Microparticle Engineering

Sudhir H. Ranganath, Zhixiang Tong, Oren Levy, Keir Martyn, Jeffrey M. Karp, Maneesha S. Inamdar
Stem Cell Reports June 2016; Volume 6 (Issue 6): 926–939    http://dx.doi.org/10.1016/j.stemcr.2016.05.003
Mesenchymal stromal cells (MSCs) are promising therapeutic candidates given their potent immunomodulatory and anti-inflammatory secretome. However, controlling the MSC secretome post-transplantation is considered a major challenge that hinders their clinical efficacy. To address this, we used a microparticle-based engineering approach to non-genetically modulate pro-inflammatory pathways in human MSCs (hMSCs) under simulated inflammatory conditions. Here we show that microparticles loaded with TPCA-1, a small-molecule NF-κB inhibitor, when delivered to hMSCs can attenuate secretion of pro-inflammatory factors for at least 6 days in vitro. Conditioned medium (CM) derived from TPCA-1-loaded hMSCs also showed reduced ability to attract human monocytes and prevented differentiation of human cardiac fibroblasts to myofibroblasts, compared with CM from untreated or TPCA-1-preconditioned hMSCs. Thus, we provide a broadly applicable bioengineering solution to facilitate intracellular sustained release of agents that modulate signaling. We propose that this approach could be harnessed to improve control over MSC secretome post-transplantation, especially to prevent adverse remodeling post-myocardial infarction.
Mesenchymal stromal cells (MSCs; also known as bone marrow stromal cells and earlier known as mesenchymal stem cells) are being explored as therapeutics in over 550 clinical trials registered with the US Food and Drug Administration (www.FDA.gov) for the treatment of a wide range of diseases (Ankrum et al., 2014c). Their immune-evasive properties (Ankrum et al., 2014c) and safe transplant record, allowing allogeneic administration without an immunosuppressive regimen, position MSCs as an appealing candidate for a potential off-the-shelf product. One of the primary mechanisms exploited in MSC therapeutics is a secretome-based paracrine effect, as evidenced in many pre-clinical studies (Ranganath et al., 2012). However, controlling the MSC secretome post-transplantation is considered a major challenge that hinders their clinical efficacy. For instance, upon transplantation, MSCs are subjected to a complex inflammatory milieu (soluble mediators and immune cells) in most injury settings. MSCs not only secrete anti-inflammatory factors, but also produce pro-inflammatory factors that may compromise their therapeutic efficacy. Table S1 lists a few in vitro and in vivo conditions that demonstrate the complex microenvironment under which MSCs switch between anti-inflammatory and pro-inflammatory phenotypes.
Levels of pro- or anti-inflammatory cytokines are not always predictive of the response, possibly due to the dynamic cytokine combinations (and concentrations) present in the cell microenvironment (Table S1). For example, a relatively low inflammatory stimulus (<20 ng/ml tumor necrosis factor alpha [TNF-α] alone or along with interferon-γ [IFN-γ]) can polarize MSCs toward pro-inflammatory effects (Bernardo and Fibbe, 2013), resulting in increased inflammation characterized by T cell proliferation and transplant rejection. Conversely, exposure to high levels of the inflammatory cytokine TNF-α has been shown in certain studies to result in MSC-mediated anti-inflammatory effects via secretion of potent mediators such as TSG6, PGE2, STC-1, IL-1Ra, and sTNFR1, as demonstrated in multiple inflammation-associated disease models (Prockop and Oh, 2012; Ylostalo et al., 2012). These effects are mediated via molecular pathways such as NF-κB, PI3K, Akt, and JAK-STAT (Ranganath et al., 2012). However, it is not clear that low and high levels of TNF-α always exert the same effect on anti- versus pro-inflammatory MSC secretome. NF-κB is a central regulator of the anti-inflammatory secretome response in monolayer (Yagi et al., 2010) and spheroid MSCs (Bartosh et al., 2013; Ylostalo et al., 2012) and of TNF-α-mediated (20 ng/ml for 120 min) apoptosis (Peng et al., 2011). Given that NF-κB can promote secretion of pro-inflammatory components in the MSC secretome (Lee et al., 2010), we hypothesized that NF-κB inhibition via small molecules in MSCs subjected to a representative inflammatory stimulus (10 ng/ml TNF-α) would inhibit their pro-inflammatory responses.
Adverse remodeling, or cardiac fibrosis due to differentiation of cardiac fibroblasts (CF) into cardiac myofibroblasts (CMF) with a pro-inflammatory phenotype and collagen deposition, is the leading cause of heart failure. The secretome from exogenous MSCs has anti-fibrotic and angiogenic effects that can reduce scar formation (Preda et al., 2014) and improve ejection fraction when administered early or prior to adverse remodeling (Preda et al., 2014; Tang et al., 2010; Williams et al., 2013). Unfortunately, in many cases, due to poor prognosis, MSCs may not be administered in time to prevent adverse remodeling, to inhibit CF differentiation to CMF (Virag and Murry, 2003), or to prevent myocardial expression of TNF-α (Bozkurt et al., 1998; Mann, 2001). Also, when administered following an adverse remodeling event including CMF differentiation, MSCs may assume a pro-inflammatory phenotype and secretome (Naftali-Shani et al., 2013) under TNF-α (typically 5 pg/mg of total protein in myocardial infarction (MI) rat myocardium, which is not significantly higher than the 1–3 pg/mg protein in control rat myocardium) (Moro et al., 2007), resulting in impaired heart function. Hence, suppressing the pro-inflammatory response of hMSCs may maximize efficacy.
While the MSC phenotype can be controlled under regulated conditions in vitro, the in vivo response of MSCs post-transplantation is poorly controlled as it is dictated by highly dynamic and complex host microenvironments (Discher et al., 2009; Rodrigues et al., 2010). Factors including MSC tissue source (Melief et al., 2013; Naftali-Shani et al., 2013), donor and batch-to-batch variability with respect to cytokine secretion and response to inflammatory stimuli (Zhukareva et al., 2010), gender (Crisostomo et al., 2007), and age (Liang et al., 2013) also affect the response of MSCs. Thus, it is important to develop approaches to control the MSC secretome post-transplantation regardless of their source or expansion conditions. We hypothesized that engineering MSCs to induce a specific secretome profile under a simulated host microenvironment may maximize their therapeutic utility. Previously, the MSC secretome has been regulated via genetic engineering (Gnecchi et al., 2006; Wang et al., 2009) or cytokine/small-molecule preconditioning approaches (Crisostomo et al., 2008; Mias et al., 2008). Genetically engineered human MSCs (hMSCs) pose challenging long-term regulatory hurdles given that the potential tumorigenicity has not been well characterized, and while preconditioning hMSCs with cytokines/small molecules may be safer, the phenotype-altering effects are transient. …
While TPCA-1 pretreatment of hMSCs before TNF-α stimulation showed a significant reduction in CMF numbers, this was greatly enhanced when TPCA-1 was available intracellularly via the microparticles. CM from TNF + TPCA-mP-hMSCs likely prevented differentiation of CF to CMF due to the continued intracellular inhibition of IKK-mediated NF-κB activation, thus preventing the release of an hMSC pro-inflammatory secretome. Surprisingly, the CMF number was reduced by over 2-fold, suggesting that TPCA-1 may activate hMSC pathways that revert the CMF phenotype. We cannot rule out the possibility that other contributors in the hMSC secretome that were not profiled might also be contributing, and that their action is facilitated by intracellular TPCA-1. For instance, in an in vitro 3D model of cardiac fibrosis under hypoxic conditions, reversal of CMF to the CF phenotype was shown to be due to reduced MSC TGF-β levels (Galie and Stegemann, 2014). Our assay revealed a similar trend in terms of collagen production from CF. CF treated with CM from control-hMSCs, TPCApre + TNF-hMSCs, or mP-hMSCs secreted elevated levels of collagen into the media, suggesting that inhibited levels of pro-inflammatory mediators alone could not be implicated in the reduction in collagen secretion. The reduced number of α-SMA+ CMF in CM from TNF + TPCA-mP-hMSCs possibly contributed to the lower secretion level of collagen in the media (Figure 5B). Upon dedifferentiation or lowered α-SMA expression, it is possible that CMFs lose collagen-secretion ability. In regions of MI, CF switch to the myofibroblast phenotype due to stress from the infarct scar (Tomasek et al., 2002). The high expression of α-SMA typical in such CMF (Teunissen et al., 2007) has been implicated in remodeling due to their high contractility (Santiago et al., 2010). In addition, the collagen-secretion capacity of CMF is very high (Petrov et al., 2002). Overall, attenuation in the number of collagen-secreting α-SMA+ CMF could be beneficial in preventing pathological remodeling or irreversible scar formation and allowing cardiac regeneration.
Here we have demonstrated that the pro-inflammatory hMSC secretome could be inhibited using a microparticle engineering approach delivering an intracellular NF-κB inhibitor, TPCA-1. It is, however, important to note that the MSC secretome composition may change depending on the level of TNF-α encountered in vivo following transplantation. Nevertheless, a similar approach could be beneficial in inflammatory disease settings such as chronic inflammation and macrophage-mediated atherosclerosis. The approach of microparticle engineering of an exogenous cell population by modulating a central regulatory pathway may find application in other cell types and pathways, and could provide an attractive strategy for harnessing any cell secretome for therapy. This approach could also potentially be employed to modulate the composition of extracellular vesicles (exosomes) for therapy.
Stem Cell Reports, Vol. 6, 926–939, June 14, 2016


Arrhythmias: Disturbances of AV Conduction by Christine LaGrasta, MS, RN, CPNP PC/AC, OPENPediatrics

Reporter: Aviva Lev-Ari, PhD, RN

 

Watch Video

https://www.youtube.com/v/_yEkeetKqtg?fs=1&hl=fr_FR

Please visit: www.openpediatrics.org. OPENPediatrics™ is an interactive digital learning platform for healthcare clinicians sponsored by Boston Children’s Hospital.



Clinical Laboratory Challenges

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

CLINICAL LABORATORY NEWS   

The Lab and CJD: Safe Handling of Infectious Prion Proteins

Body fluids from individuals with possible Creutzfeldt-Jakob disease (CJD) present distinctive safety challenges for clinical laboratories. Sporadic, iatrogenic, and familial CJD (known collectively as classic CJD), along with variant CJD, kuru, Gerstmann-Sträussler-Scheinker, and fatal familial insomnia, are prion diseases, also known as transmissible spongiform encephalopathies. Prion diseases affect the central nervous system, and from the onset of symptoms follow a typically rapid progressive neurological decline. While prion diseases are rare, it is not uncommon for the most prevalent form—sporadic CJD—to be included in the differential diagnosis of individuals presenting with rapid cognitive decline. Thus, laboratories may deal with a significant number of possible CJD cases, and should have protocols in place to process specimens, even if a confirmatory diagnosis of CJD is made in only a fraction of these cases.

The Lab’s Role in Diagnosis

Laboratory protocols for handling specimens from individuals with possible, probable, and definitive cases of CJD are important to ensure timely and appropriate patient management. When the differential includes CJD, an attempt should be made to rule in or rule out other causes of rapid neurological decline. Laboratories should be prepared to process blood and cerebrospinal fluid (CSF) specimens in such cases for routine analyses.

Definitive diagnosis requires identification of prion aggregates in brain tissue, which can be achieved by immunohistochemistry, a Western blot for proteinase K-resistant prions, and/or by the presence of prion fibrils. Thus, confirmatory diagnosis is typically achieved at autopsy. A probable diagnosis of CJD is supported by elevated concentration of 14-3-3 protein in CSF (a non-specific marker of neurodegeneration), EEG, and MRI findings. Thus, the laboratory may be required to process and send CSF samples to a prion surveillance center for 14-3-3 testing, as well as blood samples for sequencing of the PRNP gene (in inherited cases).

Processing Biofluids

Laboratories should follow standard protective measures when working with biofluids potentially containing abnormally folded prions, such as donning standard personal protective equipment (PPE); avoiding or minimizing the use of sharps; using single-use disposable items; and processing specimens to minimize formation of aerosols and droplets. An additional safety consideration is the use of single-use disposal PPE; otherwise, re-usable items must be either cleaned using prion-specific decontamination methods, or destroyed.

Blood. In experimental models, infectivity has been detected in the blood; however, there have been no cases of secondary transmission of classical CJD via blood product transfusions in humans. As such, blood has been classified, on epidemiological evidence by the World Health Organization (WHO), as containing “no detectible infectivity,” which means it can be processed by routine methods. Similarly, except for CSF, all other body fluids contain no infectivity and can be processed following standard procedures.

In contrast to classic CJD, there have been four cases of suspected secondary transmission of variant CJD via transfused blood products in the United Kingdom. Variant CJD, the prion disease associated with mad cow disease, is unique in its distribution of prion aggregates outside of the central nervous system, including the lymph nodes, spleen, and tonsils. For regions where variant CJD is a concern, laboratories should consult their regulatory agencies for further guidance.

CSF. Relative to highly infectious tissues of the brain, spinal cord, and eye, infectivity has been identified less often in CSF and is considered to have “low infectivity,” along with kidney, liver, and lung tissue. Since CSF can contain infectious material, WHO has recommended that analyses not be performed on automated equipment due to challenges associated with decontamination. Laboratories should perform a risk assessment of their CSF processes, and, if deemed necessary, consider using manual methods as an alternative to automated systems.

Decontamination

The infectious agent in prion disease is unlike any other infectious pathogen encountered in the laboratory; it is formed of misfolded and aggregated prion proteins. This aggregated proteinacious material forms the infectious unit, which is incredibly resilient to degradation. Moreover, in vitro studies have demonstrated that disrupting large aggregates into smaller aggregates increases cytotoxicity. Thus, if the aim is to abolish infectivity, all aggregates must be destroyed. Disinfectant procedures used for viral, bacterial, and fungal pathogens such as alcohol, boiling, formalin, dry heat (<300°C), autoclaving at 121°C for 15 minutes, and ionizing, ultraviolet, or microwave radiation, are either ineffective or variably effective against aggregated prions.

The only means to ensure no risk of residual infectious prions is to use disposable materials. This is not always practical, as, for instance, a biosafety cabinet cannot be discarded if there is a CSF spill in the hood. Fortunately, there are several protocols considered sufficient for decontamination. For surfaces and heat-sensitive instruments, such as a biosafety cabinet, WHO recommends flooding the surface with 2N NaOH or undiluted NaClO, letting stand for 1 hour, mopping up, and rinsing with water. If the surface cannot tolerate NaOH or NaClO, thorough cleaning will remove most infectivity by dilution. Laboratories may derive some additional benefit by using one of the partially effective methods discussed previously. Non-disposable heat-resistant items preferably should be immersed in 1N NaOH, heated in a gravity displacement autoclave at 121°C for 30 min, cleaned and rinsed in water, then sterilized by routine methods. WHO has outlined several alternate decontamination methods. Using disposable cover sheets is one simple solution to avoid contaminating work surfaces and associated lengthy decontamination procedures.

With standard PPE—augmented by a few additional safety measures and prion-specific decontamination procedures—laboratories can safely manage biofluid testing in cases of prion disease.

 

The Microscopic World Inside Us  

Emerging Research Points to Microbiome’s Role in Health and Disease

Thousands of species of microbes—bacteria, viruses, fungi, and protozoa—inhabit every internal and external surface of the human body. Collectively, these microbes, known as the microbiome, outnumber the body’s human cells by about 10 to 1 and include more than 1,000 species of microorganisms and several million genes residing in the skin, respiratory system, urogenital, and gastrointestinal tracts. The microbiome’s complicated relationship with its human host is increasingly considered so crucial to health that researchers sometimes call it “the forgotten organ.”

Disturbances to the microbiome can arise from nutritional deficiencies, antibiotic use, and antiseptic modern life. Imbalances in the microbiome’s diverse microbial communities, which interact constantly with cells in the human body, may contribute to chronic health conditions, including diabetes, asthma and allergies, obesity and the metabolic syndrome, digestive disorders including irritable bowel syndrome (IBS), and autoimmune disorders like multiple sclerosis and rheumatoid arthritis, research shows.

While study of the microbiome is a growing research enterprise that has attracted enthusiastic media attention and venture capital, its findings are largely preliminary. But some laboratorians are already developing a greater appreciation for the microbiome’s contributions to human biochemistry and are considering a future in which they expect to measure changes in the microbiome to monitor disease and inform clinical practice.

Pivot Toward the Microbiome

Following the National Institutes of Health (NIH) Human Genome Project, many scientists noted the considerable genetic signal from microbes in the body and the existence of technology to analyze these microorganisms. That realization led NIH to establish the Human Microbiome Project in 2007, said Lita Proctor, PhD, its program director. In the project’s first phase, researchers studied healthy adults to produce a reference set of microbiomes and a resource of metagenomic sequences of bacteria in the airways, skin, oral cavities, and the gastrointestinal and vaginal tracts, plus a catalog of microbial genome sequences of reference strains. Researchers also evaluated specific diseases associated with disturbances in the microbiome, including gastrointestinal diseases such as Crohn’s disease, ulcerative colitis, IBS, and obesity, as well as urogenital conditions, those that involve the reproductive system, and skin diseases like eczema, psoriasis, and acne.

Phase 1 studies determined the composition of many parts of the microbiome, but did not define how that composition affects health or specific disease. The project’s second phase aims to “answer the question of what microbes actually do,” explained Proctor. Researchers are now examining properties of the microbiome including gene expression, protein, and human and microbial metabolite profiles in studies of pregnant women at risk for preterm birth, the gut hormones of patients at risk for IBS, and nasal microbiomes of patients at risk for type 2 diabetes.

Promising Lines of Research

Cystic fibrosis and microbiology investigator Michael Surette, PhD, sees promising microbiome research not just in terms of evidence of its effects on specific diseases, but also in what drives changes in the microbiome. Surette is Canada research chair in interdisciplinary microbiome research in the Farncombe Family Digestive Health Research Institute at McMaster University in Hamilton, Ontario.

One type of study on factors driving microbiome change examines how alterations in composition and imbalances in individual patients relate to improving or worsening disease. “IBS, cystic fibrosis, and chronic obstructive pulmonary disease all have periods of instability or exacerbation,” he noted. Surette hopes that one day, tests will provide clinicians the ability to monitor changes in microbial composition over time and even predict when a patient’s condition is about to deteriorate. Monitoring perturbations to the gut microbiome might also help minimize collateral damage to the microbiome during aggressive antibiotic therapy for hospitalized patients, he added.

Monitoring changes to the microbiome also might be helpful for “culture negative” patients, who now may receive multiple, unsuccessful courses of different antibiotics that drive antibiotic resistance. Frustration with standard clinical biology diagnosis of lung infections in cystic fibrosis patients first sparked Surette’s investigations into the microbiome. He hopes that future tests involving the microbiome might also help asthma patients with neutrophilia, community-acquired pneumonia patients who harbor complex microbial lung communities lacking obvious pathogens, and hospitalized patients with pneumonia or sepsis. He envisions microbiome testing that would look for short-term changes indicating whether or not a drug is effective.

Companion Diagnostics

Daniel Peterson, MD, PhD, an assistant professor of pathology at Johns Hopkins University School of Medicine in Baltimore, believes the future of clinical testing involving the microbiome lies in companion diagnostics for novel treatments, and points to companies that are already developing and marketing tests that will require such assays.

Examples of microbiome-focused enterprises abound, including Genetic Analysis, based in Oslo, Norway, with its high-throughput test that uses 54 probes targeted to specific bacteria to measure intestinal gut flora imbalances in inflammatory bowel disease and irritable bowel syndrome patients. Paris, France-based Enterome is developing both novel drugs and companion diagnostics for microbiome-related diseases such as IBS and some metabolic diseases. Second Genome, based in South San Francisco, has developed an experimental drug, SGM-1019, that the company says blocks damaging activity of the microbiome in the intestine. Cambridge, Massachusetts-based Seres Therapeutics has received Food and Drug Administration orphan drug designation for SER-109, an oral therapeutic intended to correct microbial imbalances to prevent recurrent Clostridium difficile infection in adults.

One promising clinical use of the microbiome is fecal transplantation, which both prospective and retrospective studies have shown to be effective in patients with C. difficile infections who do not respond to front-line therapies, said James Versalovic, MD, PhD, director of Texas Children’s Hospital Microbiome Center and professor of pathology at Baylor College of Medicine in Houston. “Fecal transplants and other microbiome replacement strategies can radically change the composition of the microbiome in hours to days,” he explained.

But NIH’s Proctor discourages too much enthusiasm about fecal transplant. “Natural products like stool can have [side] effects,” she pointed out. “The [microbiome research] field needs to mature and we need to verify outcomes before anything becomes routine.”

Hurdles for Lab Testing

While he is hopeful that labs someday will use the microbiome to produce clinically useful information, Surette pointed to several problems that must be solved beforehand. First, molecular methods commonly used right now should be more quantitative and accurate. Additionally, research on the microbiome encompasses a wide variety of protocols, some of which are better at extracting particular types of bacteria and therefore can give biased views of communities living in the body. Also, tests may need to distinguish between dead and live microbes. Another hurdle is that labs using varied bioinformatic methods may produce different results from the same sample, a problem that Surette sees as ripe for a solution from clinical laboratorians, who have expertise in standardizing robust protocols and in automating tests.

One way laboratorians can prepare for future, routine microbiome testing is to expand their notion of clinical chemistry to include both microbial and human biochemistry. “The line between microbiome science and clinical science is blurring,” said Versalovic. “When developing future assays to detect biochemical changes in disease states, we must consider the contributions of microbial metabolites and proteins and how to tailor tests to detect them.” In the future, clinical labs may test for uniquely microbial metabolites in various disease states, he predicted.

 

Automated Review of Mass Spectrometry Results  

Can We Achieve Autoverification?

Authors: Katherine Alexander and Andrea R. Terrell, PhD // Date: NOV.1.2015 // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/november/automated-review-of-mass-spectrometry-results-can-we-achieve-autoverification

 

Paralleling the upswing in prescription drug misuse, clinical laboratories are receiving more requests for mass spectrometry (MS) testing as physicians rely on its specificity to monitor patient compliance with prescription regimens. However, as volume has increased, reimbursement has declined, forcing toxicology laboratories both to increase capacity and lower their operational costs—without sacrificing quality or turnaround time. Now, new solutions are available that bring automation to MS testing and help laboratories meet the growing demand for toxicology and other testing.

What is the typical MS workflow?

A typical workflow includes a long list of manual steps. By the time a sample is loaded onto the mass spectrometer, it has been collected, logged into the lab information management system (LIMS), and prepared for analysis using a variety of wet chemistry techniques.

Most commercial clinical laboratories receive enough samples for MS analysis to batch analyze those samples. A batch consists of calibrator(s), quality control (QC) samples, and patient/donor samples. Historically, the method would be selected (e.g., “analysis of opiates”), sample identification information would be entered manually into the MS software, and the instrument would begin analyzing each sample. Upon successful completion of the batch, the MS operator would view all of the analytical data, ensure the QC results were acceptable, and review each patient/donor specimen, looking at characteristics such as peak shape, ion ratios, retention time, and calculated concentration.

The operator would then post acceptable results into the LIMS manually or through an interface, and unacceptable results would be rescheduled or dealt with according to lab-specific protocols. In our laboratory we perform a final certification step for quality assurance by reviewing all information about the batch again, prior to releasing results for final reporting through the LIMS.
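The manual review criteria described above (retention time, ion ratios, calculated concentration) are exactly what autoverification software encodes as machine-checkable rules. The following is a minimal Python sketch of such a rule check; the analyte, tolerances, and field names are hypothetical illustrations, not ASCENT’s actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Peak:
    analyte: str
    retention_time: float  # minutes
    ion_ratio: float       # qualifier/quantifier ion area ratio
    concentration: float   # ng/mL

# Hypothetical acceptance criteria; a real laboratory derives these
# values from its own validated SOP for each analyte.
CRITERIA = {
    "morphine": {"rt": 2.45, "rt_tol": 0.10, "ratio": 0.55, "ratio_tol": 0.20},
}

def auto_release(peak: Peak) -> bool:
    """Return True if the peak passes all rule-based checks and can be
    posted to the LIMS without manual review; otherwise it is flagged."""
    c = CRITERIA[peak.analyte]
    rt_ok = abs(peak.retention_time - c["rt"]) <= c["rt_tol"]
    # Ion ratio must fall within a relative tolerance of the expected ratio.
    ratio_ok = abs(peak.ion_ratio - c["ratio"]) / c["ratio"] <= c["ratio_tol"]
    return rt_ok and ratio_ok

# Example: this peak's ion ratio deviates by ~27% from the expected 0.55,
# so it is held for manual review rather than auto-released.
print(auto_release(Peak("morphine", 2.48, 0.40, 125.0)))  # False
```

Rules of this kind release only results that pass every check; everything else lands in an exception queue for a human, which is the division of labor the rest of this article advocates.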

What problems are associated with this workflow?

The workflow described above results in too many highly trained chemists performing manual data entry and reviewing perfectly acceptable analytical results. Lab managers would prefer that MS operators and certifying scientists focus on troubleshooting problem samples rather than reviewing mounds of good data. Not only is the current process inefficient, it is mundane work prone to user errors. This risks fatigue, disengagement, and complacency by our highly skilled scientists.

Importantly, manual processes also take time. In most clinical lab environments, turnaround time is critical for patient care and industry competitiveness. Lab directors and managers are looking for solutions to automate mundane, error-prone tasks to save time and costs, reduce staff burnout, and maintain high levels of quality.

How can software automate data transfer from MS systems to LIMS?

Automation is not a new concept in the clinical lab. Labs have automated processes in shipping and receiving, sample preparation, liquid handling, and data delivery to the end user. As more labs implement MS, companies have begun to develop special software to automate data analysis and review workflows.

In July 2011, AIT Labs incorporated ASCENT into our workflow, eliminating the initial manual peak review step. ASCENT is an algorithm-based peak picking and data review system designed specifically for chromatographic data. The software employs robust statistical and modeling approaches to the raw instrument data to present the true signal, which often can be obscured by noise or matrix components.

The system also uses an exponentially modified Gaussian (EMG) equation to apply a best-fit model to integrated peaks through what is often a noisy signal. In our experience, applying the EMG yields cleaner data from what might otherwise appear to be poor chromatography, ultimately allowing us to reduce the number of samples we would otherwise rerun.
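
For readers unfamiliar with the EMG, the short sketch below fits one to a simulated noisy chromatographic peak using SciPy. This is a generic illustration of the model, not ASCENT's proprietary implementation, and every numeric value is invented.

    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import erfc

    def emg(t, area, mu, sigma, tau):
        # Exponentially modified Gaussian: a Gaussian peak (center mu,
        # width sigma) convolved with an exponential tail (time constant tau).
        lam = 1.0 / tau
        return (area * lam / 2.0
                * np.exp((lam / 2.0) * (2.0 * mu + lam * sigma**2 - 2.0 * t))
                * erfc((mu + lam * sigma**2 - t) / (np.sqrt(2.0) * sigma)))

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 10.0, 500)                  # retention time, min
    clean = emg(t, area=100.0, mu=3.0, sigma=0.15, tau=0.4)
    noisy = clean + rng.normal(0.0, 0.5, t.size)     # simulated detector noise

    popt, _ = curve_fit(emg, t, noisy, p0=[80.0, 2.8, 0.2, 0.3])
    print("fitted area %.1f, retention time %.2f min" % (popt[0], popt[1]))

The fitted area, rather than a raw sum of noisy intensities, is what would feed a calibration curve, which is why a good model fit can rescue an apparently poor-looking trace.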

How do you validate the quality of results?

We’ve developed a robust validation protocol to ensure that results are, at minimum, equivalent to results from our manual review. We begin by building the assay in ASCENT, entering assay-specific information from our internal standard operating procedure (SOP). Once the assay is configured, validation proceeds with parallel batch processing to compare results between software-reviewed data and staff-reviewed data. For new implementations we run eight to nine batches of 30–40 samples each; when we are modifying or upgrading an existing implementation we run a smaller number of batches. The parallel batches should contain multiple positive and negative results for all analytes in the method, preferably spanning the analytical measurement range of the assay.

The next step is to compare the results and calculate the percent difference between the data review methods. We require that two-thirds of the automated results fall within 20% of the manually reviewed result. In addition to validating patient sample correlation, we also test numerous quality assurance rules that should initiate a flag for further review.
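
As a worked example of that acceptance rule, the hypothetical snippet below computes the per-sample percent difference and applies the two-thirds-within-20% criterion; the concentrations are invented.

    def passes_validation(manual, automated, tolerance=0.20, required=2.0 / 3.0):
        # Fraction of samples whose automated result falls within
        # `tolerance` (20%) of the manually reviewed result.
        pairs = [(m, a) for m, a in zip(manual, automated) if m != 0]
        within = sum(1 for m, a in pairs if abs(a - m) / abs(m) <= tolerance)
        return within / len(pairs) >= required

    manual    = [12.1, 55.0, 8.4, 230.0, 19.9]    # ng/mL, staff-reviewed
    automated = [11.8, 54.2, 10.9, 228.5, 20.3]   # ng/mL, software-reviewed
    print(passes_validation(manual, automated))   # True: 4 of 5 within 20%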

What are the biggest challenges during implementation and continual improvement initiatives?

On the technological side, our largest hurdle was loading the sequence files into ASCENT. We had created an in-house mechanism for our chemists to upload the 96-well plate map for their batch into the MS software. We had some difficulty transferring this information to ASCENT, but once we resolved this issue, the technical workflow proceeded fairly smoothly.

The greater challenge was changing our employees’ mindset from one of fear that automation would displace them, to a realization that learning this new technology would actually make them more valuable. Automating a non-mechanical process can be a difficult concept for hands-on scientists, so managers must be patient and help their employees understand that this kind of technology leverages the best attributes of software and people to create a powerful partnership.

We recommend that labs considering automated data analysis engage staff in the validation and implementation to spread the workload and the knowledge. As is true with most technology, it is best not to rely on just one or two super users. We also found it critical to add supervisor-level controls on data file manipulation, such as removing from the sequence table a sample that wasn’t run. Such controls prevent the inadvertent deletion of a file, which would require reinjection of the entire batch.

 

Understanding Fibroblast Growth Factor 23

Author: Damien Gruson, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/understanding-fibroblast-growth-factor-23

What is the relationship of FGF-23 to heart failure?

Heart failure (HF) is an increasingly common syndrome associated with high morbidity, elevated hospital readmission rates, and high mortality. Improving diagnosis, prognosis, and treatment of HF requires a better understanding of its different sub-phenotypes. As researchers gained a comprehensive understanding of neurohormonal activation, one of the hallmarks of HF, they discovered several biomarkers, including natriuretic peptides, which now are playing an important role in sub-phenotyping HF and in driving more personalized management of this chronic condition.

Like the natriuretic peptides, fibroblast growth factor 23 (FGF-23) could become important in risk-stratifying and managing HF patients. Produced by osteocytes, FGF-23 is a key regulator of phosphorus homeostasis. It binds to renal and parathyroid FGF-Klotho receptor heterodimers, resulting in phosphate excretion, decreased 1-α-hydroxylation of 25-hydroxyvitamin D, and decreased parathyroid hormone (PTH) secretion. The relationship to PTH is important because impaired homeostasis of cations and decreased glomerular filtration rate might contribute to the rise of FGF-23. The amino-terminal portion of FGF-23 (amino acids 1-24) serves as a signal peptide allowing secretion into the blood, and the carboxyl-terminal portion (aa 180-251) participates in its biological action.

How might FGF-23 improve HF risk assessment?

Studies have shown that FGF-23 is related to the risk of cardiovascular diseases and mortality. It was first demonstrated that FGF-23 levels were independently associated with left ventricular mass index and hypertrophy as well as mortality in patients with chronic kidney disease (CKD). FGF-23 also has been associated with left ventricular dysfunction and atrial fibrillation in coronary artery disease subjects, even in the absence of impaired renal function.

FGF-23 and FGF receptors are both expressed in the myocardium, so it is possible that FGF-23 acts directly on the heart and participates in the pathophysiology of cardiovascular diseases and HF. Experiments have shown that in cultured rat cardiomyocytes, FGF-23 stimulates pathological hypertrophy by activating the calcineurin-NFAT pathway, and that in wild-type mice, intramyocardial or intravenous injection of FGF-23 results in left ventricular hypertrophy. As such, FGF-23 appears to be a potential stimulus of myocardial hypertrophy, and increased levels may contribute to the worsening of heart failure and long-term cardiovascular death.

Researchers have documented that HF patients have elevated FGF-23 circulating levels. They have also found a significant correlation between plasma levels of FGF-23 and B-type natriuretic peptide, a biomarker related to ventricular stretch and cardiac hypertrophy, in patients with left ventricular hypertrophy. As such, measuring FGF-23 levels might be a useful tool to predict long-term adverse cardiovascular events in HF patients.

Interestingly, researchers have documented a significant relationship between FGF-23 and PTH in both CKD and HF patients. As PTH stimulates FGF-23 expression, it could be that in HF patients, increased PTH levels increase the bone expression of FGF-23, which enhances its effects on the heart.

 

The Past, Present, and Future of Western Blotting in the Clinical Laboratory

Author: Curtis Balmer, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/the-past-present-and-future-of-western-blotting-in-the-clinical-laboratory

Much of the discussion about Western blotting centers around its performance as a biological research tool. This isn’t surprising. Since its introduction in the late 1970s, the Western blot has been adopted by biology labs of virtually every stripe, and become one of the most widely used techniques in the research armamentarium. However, Western blotting has also been employed in clinical laboratories to aid in the diagnosis of various diseases and disorders—an equally important and valuable application. Yet there has been relatively little discussion of its use in this context, or of how advances in Western blotting might affect its future clinical use.

Highlighting the clinical value of Western blotting, Stanley Naides, MD, medical director of Immunology at Quest Diagnostics, observed that, “Western blotting has been a very powerful tool in the laboratory and for clinical diagnosis. It’s one of many various methods that the laboratorian brings to aid the clinician in the diagnosis of disease, and the selection and monitoring of therapy.” Indeed, Western blotting has been used at one time or another to aid in the diagnosis of infectious diseases including hepatitis C (HCV), HIV, Lyme disease, and syphilis, as well as autoimmune disorders such as paraneoplastic disease and myositis conditions.

However, Naides was quick to point out that the choice of assays to use clinically is based on their demonstrated sensitivity and performance, and that the search for something better is never-ending. “We’re constantly looking for methods that improve detection of our target [protein],” Naides said. “There have been a number of instances where we’ve moved away from Western blotting because another method proves to be more sensitive.” But this search can also lead back to Western blotting. “We’ve gone away from other methods because there’s been a Western blot that’s been developed that’s more sensitive and specific. There’s that constant movement between methods as new tests are developed.”

In recent years, this quest has been leading clinical laboratories away from Western blotting toward more sensitive and specific diagnostic assays, at least for some diseases. Using confirmatory diagnosis of HCV infection as an example, Sai Patibandla, PhD, director of the immunoassay group at Siemens Healthcare Diagnostics, explained that movement away from Western blotting for confirmatory diagnosis of HCV infection began with a technical modification called Recombinant Immunoblotting Assay (RIBA). RIBA streamlines the conventional Western blot protocol by spotting recombinant antigen onto strips which are used to screen patient samples for antibodies against HCV. This approach eliminates the need to separate proteins and transfer them onto a membrane.

The RIBA HCV assay was initially manufactured by Chiron Corporation (acquired by Novartis Vaccines and Diagnostics in 2006). It received Food and Drug Administration (FDA) approval in 1999 and was marketed as the Chiron RIBA HCV 3.0 Strip Immunoblot Assay. Patibandla explained that, at the time, the Chiron assay “…was the only FDA-approved confirmatory testing for HCV.” In 2013 the assay was discontinued and withdrawn from the market due to reports that it was producing false-positive results.

Since then, clinical laboratories have continued to move away from Western blot-based assays for confirmation of HCV in favor of the more sensitive technique of nucleic acid testing (NAT). “The migration is toward NAT for confirmation of HCV [diagnosis]. We don’t use immunoblots anymore. We don’t even have a blot now to confirm HCV,” Patibandla said.

Confirming HIV infection has followed a similar path. Indeed, in 2014 the Centers for Disease Control and Prevention issued updated recommendations for HIV testing that, in part, replaced Western blotting with NAT. This change was in response to the recognition that the HIV-1 Western blot assay was producing false-negative or indeterminate results early in the course of HIV infection.

At this juncture it is difficult to predict if this trend away from Western blotting in clinical laboratories will continue. One thing that is certain, however, is that clinicians and laboratorians are infinitely pragmatic, and will eagerly replace current techniques with ones shown to be more sensitive, specific, and effective. This raises the question of whether any of the many efforts currently underway to improve Western blotting will produce an assay that exceeds the sensitivity of currently employed techniques such as NAT.

Some of the most exciting and groundbreaking work in this area is being done by Amy Herr, PhD, a professor of bioengineering at University of California, Berkeley. Herr’s group has taken on some of the most challenging limitations of Western blotting, and is developing techniques that could revolutionize the assay. For example, the Western blot is semi-quantitative at best. This weakness dramatically limits the types of answers it can provide about changes in protein concentrations under various conditions.

To make Western blotting more quantitative, Herr’s group is, among other things, identifying losses of protein sample mass during the assay protocol. About this, Herr explains that the conventional Western blot is an “open system” that involves lots of handling of assay materials, buffers, and reagents that makes it difficult to account for protein losses. Or, as Kevin Lowitz, a senior product manager at Thermo Fisher Scientific, described it, “Western blot is a [simple] technique, but a really laborious one, and there are just so many steps and so many opportunities to mess it up.”

Herr’s approach is to reduce the open aspects of Western blot. “We’ve been developing these more closed systems that allow us at each stage of the assay to account for [protein mass] losses. We can’t do this exactly for every target of interest, but it gives us a really good handle [on protein mass losses],” she said. One of the major mechanisms Herr’s lab is using to accomplish this is to secure proteins to the blot matrix with covalent bonding rather than with the much weaker hydrophobic interactions that typically keep the proteins in place on the membrane.

Herr’s group also has been developing microfluidic platforms that allow Western blotting to be done on single cells: “In our system we’re doing thousands of independent Westerns on single cells in four hours. And, hopefully, we’ll cut that down to one hour over the next couple years.”

Other exciting modifications that stand to dramatically increase the sensitivity, quantitation, and throughput of Western blotting are also being developed and explored. For example, capillary electrophoresis, in which proteins are conveyed through a small electrolyte-filled tube and separated according to size and charge before being deposited onto a blotting membrane, dramatically reduces the amount of protein required for Western blot analysis, and thereby allows Westerns to be run on proteins from rare cells or from samples of extremely limited quantity.

Jillian Silva, PhD, an associate specialist at the University of California, San Francisco Helen Diller Family Comprehensive Cancer Center, explained that advances in detection are also extending the capabilities of Western blotting. “With the advent of fluorescence detection we have a way to quantitate Westerns, and it is now more quantitative than it’s ever been,” said Silva.

Whether or not these advances produce an assay that is adopted by clinical laboratories remains to be seen. The emphasis on Western blotting as a research rather than a clinical tool may bias advances in favor of the needs and priorities of researchers rather than clinicians, and as Patibandla pointed out, “In the research world Western blotting has a certain purpose. [Researchers] are always coming up with new things, and are trying to nail down new proteins, so you cannot take Western blotting away.” In contrast, she suggested that for now, clinical uses of Western blotting remain “limited.”

 

Adapting Next Generation Technologies to Clinical Molecular Oncology Service

Author: Ronald Carter, PhD, DVM  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/adapting-next-generation-technologies-to-clinical-molecular-oncology-service

Next generation technologies (NGT) deliver huge improvements in cost efficiency, accuracy, robustness, and the amount of information they provide. Microarrays, high-throughput sequencing platforms, droplet digital PCR, and other technologies all offer unique combinations of desirable performance.

As stronger evidence of genetic testing’s clinical utility influences patterns of patient care, demand for NGT testing is increasing. This presents several challenges to clinical laboratories, including increased urgency, clinical importance, and breadth of application in molecular oncology, as well as greater integration of genetic tests into synoptic reporting. Laboratories need to add NGT-based protocols while still providing older tests, and the pace of change is increasing. What follows is one viewpoint on the major challenges in adopting NGTs into diagnostic molecular oncology service.

Choosing a Platform

Instrument selection is a critical decision that has to align with intended test applications, sequencing chemistries, and analytical software. Although multiple platforms are available, a mainstream standard has not emerged. Depending on their goals, laboratories might set up NGTs for improved accuracy of mutation detection, massively higher sequencing capacity per test, massively more targets combined in one test (multiplexing), greater range in sequencing read length, much lower cost per base pair assessed, and economy of specimen volume.

When high-throughput instruments first made their appearance, laboratories paid more attention to the accuracy of base-reading: Less accurate sequencing meant more data cleaning and resequencing (1). Now, new instrument designs have narrowed the differences, and test chemistry can have a comparatively large impact on analytical accuracy (Figure 1). The robustness of technical performance can also vary significantly depending upon specimen type. For example, Life Technologies’ sequencing platforms appear to be comparatively more tolerant of low DNA quality and concentration, which is an important consideration for fixed and processed tissues.

https://www.aacc.org/~/media/images/cln/articles/2015/october/carter_fig1_cln_oct15_ed.jpg

Figure 1 Comparison of Sequencing Chemistries

Sequence pile-ups of the same target sequence (2 large genes), all performed on the same analytical instrument. Results are shown for 4 different chemistries, as designed and supplied by reagent manufacturers prior to optimization in the laboratory. Red lines represent the limits of exons. The height of the blue columns is proportional to depth of coverage. In this case, the intent of the test design was to provide high depth of coverage so that reflex Sanger sequencing would not be necessary. Courtesy B. Sadikovic, U. of Western Ontario.

 

In addition, batching, robotics, workload volume patterns, maintenance contracts, software licenses, and platform lifetime affect the cost per analyte and per specimen considerably. Royalties and reagent contracts also factor into the cost of operating NGT: In some applications, fees for intellectual property can represent more than 50% of the bench cost of performing a given test, and increase substantially without warning.

Laboratories must also deal with the problem of obsolescence. Investing in a new platform brings the angst of knowing that better machines and chemistries are just around the corner. Laboratories are buying bigger pieces of equipment with shorter service lives. Before NGTs, major instruments could confidently be expected to remain current for at least 6 to 8 years. Now, a major instrument is obsolete much sooner, often within 2 to 3 years. This means that keeping it in service might cost more than investing in a new platform. Lease-purchase arrangements help mitigate year-to-year fluctuations in capital equipment costs, and maximize the value of old equipment at resale.
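
A rough, entirely invented illustration of why a shorter service life changes the economics: straight-line amortization of the instrument plus per-test consumables, with intellectual-property royalties taken as a fraction of bench cost, as described above.

    def cost_per_specimen(capital_cost, service_years, specimens_per_year,
                          reagent_cost, royalty_fraction):
        # Amortize the instrument over its useful life, add consumables,
        # then apply royalties as a fraction of the bench cost.
        amortized = capital_cost / (service_years * specimens_per_year)
        bench = amortized + reagent_cost
        return bench * (1.0 + royalty_fraction)

    # The same $500,000 instrument amortized over 8 years vs. 3 years:
    print(cost_per_specimen(500_000, 8, 5_000, 40.0, 0.5))  # 78.75
    print(cost_per_specimen(500_000, 3, 5_000, 40.0, 0.5))  # 110.00

Compressing the amortization window from 8 years to 3 raises the cost per specimen by roughly 40% in this toy example, before any royalty increases.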

One Size Still Does Not Fit All

Laboratories face numerous technical considerations to optimize sequencing protocols, but the test has to be matched to the performance criteria needed for the clinical indication (2). For example, measuring response to treatment depends first upon the diagnostic recognition of mutation(s) in the tumor clone; the marker(s) then have to be quantifiable and indicative of tumor volume throughout the course of disease (Table 1).

As a result, diagnostic tests need to cover many different potential mutations, yet accurately identify any clinically relevant mutations actually present. On the other hand, tests for residual disease need to provide standardized, sensitive, and accurate quantification of a selected marker mutation against the normal background. A diagnostic panel might need 1% to 3% sensitivity across many different mutations. But quantifying early response to induction, and later assessment of minimal residual disease, needs a test that is reliably accurate to the 10^-4 or 10^-5 range for a specific analyte.
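
To give a feel for what 10^-5 sensitivity demands, here is a back-of-envelope illustration (my own, not from the article) that uses binomial sampling to ask how deep an assay must sequence before a variant present at one molecule in 100,000 is reliably seen; the 5-read calling threshold is an arbitrary assumption that ignores sequencing error.

    from scipy.stats import binom

    variant_fraction = 1e-5   # tumor-derived fraction at the MRD limit above
    min_reads = 5             # assumed minimum supporting reads to call a variant

    for depth in (100_000, 500_000, 1_000_000, 2_000_000):
        p_detect = 1.0 - binom.cdf(min_reads - 1, depth, variant_fraction)
        print(f"depth {depth:>9,}: P(detect) = {p_detect:.3f}")

On the order of a million informative reads per locus is needed before detection becomes dependable, which is one reason residual-disease assays pair deep resequencing with enrichment techniques.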

Covering all types of mutations in one diagnostic test is not yet possible. For example, subtyping of acute myeloid leukemia is both old school (karyotype, fluorescent in situ hybridization, and/or PCR-based or array-based testing for fusion rearrangements, deletions, and segmental gains) and new school (NGT-based panel testing for molecular mutations).

Chemistries that cover both structural variants and copy number variants are not yet in general use, but the advantages of NGTs compared to traditional methods are becoming clearer, such as in colorectal cancer (3). Researchers are also using cell-free DNA (cfDNA) to quantify residual disease and detect resistance mutations (4). Once a clinically significant clone is identified, enrichment techniques help enable extremely sensitive quantification of residual disease (5).

Validation and Quality Assurance

Beyond choosing a platform, two distinct challenges arise in bringing NGTs into the lab. The first is assembling the resources for validation and quality assurance. The second is keeping tests up-to-date as new analytes are needed. Even if a given test chemistry has the flexibility to add analytes without revalidating the entire panel, keeping up with clinical advances is a constant priority.

Due to their throughput and multiplexing capacities, NGT platforms typically require considerable upfront investment to adopt, and training staff to perform testing takes even more time. Proper validation is harder to document: Assembling positive controls, documenting test performance criteria, developing quality assurance protocols, and conducting proficiency testing are all demanding. Labs meet these challenges in different ways. Laboratory-developed tests (LDTs) allow self-determined choice in design, innovation, and control of the test protocol, but can be very expensive to set up.

Food and Drug Administration (FDA)-approved methods are attractive but not always an option. More FDA-approved methods will be marketed, but FDA approval itself brings other trade-offs. There is a cost premium compared to LDTs, and the test methodologies are locked down and not modifiable. This is particularly frustrating for NGTs, whose specific attractions are extensive multiplexing capacity and the ability to accommodate new analytes.

IT and the Evolution of Molecular Oncology Reporting Standards

The options for information technology (IT) pipelines for NGTs are improving rapidly. At the same time, recent studies still show significant inconsistencies and lack of reproducibility when it comes to interpreting variants in array comparative genomic hybridization, panel testing, tumor expression profiling, and tumor genome sequencing. It can be difficult to duplicate published performances in clinical studies because of a lack of sufficient information about the protocol (chemistry) and software. Building bioinformatics capacity is a key requirement, yet skilled people are in short supply and the qualifications needed to work as a bioinformatician in a clinical service are not yet clearly defined.

Tumor biology brings another level of complexity. Bioinformatic analysis must distinguish tumor-specific (somatic) variants from germline variants. Sequencing of paired normal tissue is often performed as a control, but virtual normal controls may have intriguing advantages (6). One of the biggest challenges is to reproducibly interpret the clinical significance of interactions between different mutations, even with commonly known, well-defined mutations (7). For multiple-analyte panels, such as predictive testing for breast cancer, only the performance of the whole panel in a population of patients can be compared; individual patients may be scored into different risk categories by different tests, all for the same test indication.

In large-scale sequencing of tumor genomes, which types of mutations are most informative in detecting, quantifying, and predicting the behavior of the tumor over time? The amount and complexity of mutation varies considerably across different tumor types, and while some mutations are more common, stable, and clinically informative than others, the utility of a given tumor marker varies in different clinical situations. And, for a given tumor, treatment effect and metastasis lead to retesting for changes in drug sensitivities.

These complexities mean that IT must be designed into the process from the beginning. Like robotics, IT represents a major ancillary decision. One approach many labs choose is licensed technologies with shared databases that are updated in real time. These are attractive, despite their cost and licensing fees. New tests that incorporate proprietary IT with NGT platforms link the genetic signatures of tumors to clinically significant considerations like tumor classification, recommended methodologies for monitoring response, predicted drug sensitivities, eligible clinical trials, and prognostic classifications. In-house development of such solutions will be difficult, so licensing platforms from commercial partners is more likely to be the norm.

The Commercial Value of Health Records and Test Data

The future of cancer management likely rests on large-scale databases that link hereditary and somatic tumor testing with clinical outcomes. Multiple centers have such large studies underway, and data extraction and analysis are providing increasingly refined interpretations of clinical significance.

Extracting health outcomes to correlate with molecular test results is commercially valuable, as the pharmaceutical, insurance, and healthcare sectors focus on companion diagnostics, precision medicine, and evidence-based health technology assessment. Laboratories that can develop tests based on large-scale integration of test results to clinical utility will have an advantage.

NGTs do offer opportunities for net reductions in the cost of healthcare. But the lag between availability of a test and peer-evaluated demonstration of clinical utility can be considerable. Technical developments arise faster than evidence of clinical utility. For example, immunohistochemistry, estrogen receptor/progesterone receptor status, HER2/neu, and histology are still the major pathological criteria for prognostic evaluation of breast cancer at diagnosis, even though multiple-analyte tumor profiling has been described for more than 15 years. Healthcare systems need a more concerted assessment of clinical utility if they are to take advantage of the promises of NGTs in cancer care.

Disruptive Advances

Without a doubt, “disruptive” is an appropriate buzzword in molecular oncology, and new technical advances are about to change how, where, and for whom testing is performed.

• Predictive Testing

Besides cost per analyte, one of the drivers for taking up new technologies is that they enable multiplexing many more analytes with less biopsy material. Single-analyte sequential testing for epidermal growth factor receptor (EGFR), anaplastic lymphoma kinase, and other targets on small biopsies is not sustainable when many more analytes are needed, and even now, a significant proportion of test requests cannot be completed due to lack of suitable biopsy material. Large panels incorporating all the mutations needed to cover multiple tumor types are replacing individual tests in companion diagnostics.

• Cell-Free Tumor DNA

Challenges of cfDNA include standardizing the collection and processing methodologies, timing sampling to minimize the effect of therapeutic toxicity on analytical accuracy, and identifying the most informative sample (DNA, RNA, or protein). But for more and more tumor types, it will be possible to differentiate benign versus malignant lesions, perform molecular subtyping, predict response, monitor treatment, or screen for early detection—all without a surgical biopsy.

cfDNA technologies can also be integrated into core laboratory instrumentation. For example, blood-based EGFR analysis for lung cancer is being developed on the Roche cobas 4800 platform, which will be a significant change from the current standard of testing based upon single tests of DNA extracted from formalin-fixed, paraffin-embedded sections selected by a pathologist (8).

• Whole Genome and Whole Exome Sequencing

Whole genome and whole exome tumor sequencing approaches provide a wealth of biologically important information, and will replace individual or multiple gene test panels as the technical cost of sequencing declines and interpretive accuracy improves (9). Laboratories can apply informatics selectively or broadly to extract much more information at relatively little increase in cost, and the interpretation of individual analytes will be improved by the context of the whole sequence.

• Minimal Residual Disease Testing

Massive resequencing and enrichment techniques can be used to detect minimal residual disease, and will provide an alternative to flow cytometry as costs decline. The challenge is to develop robust analytical platforms that can reliably produce results in a high proportion of patients with a given tumor type, despite using post-treatment specimens with therapy-induced degradation, and a very low proportion of target (tumor) sequence to benign background sequence.

The tumor markers should remain informative for the burden of disease despite clonal evolution over the course of multiple samples taken during progression of the clinical course and treatment. Quantification needs to be accurate and sensitive down to the 10^-5 range, and cost competitive with flow cytometry.

• Point-of-Care Test Methodologies

Small, rapid, cheap, and single use point-of-care (POC) sequencing devices are coming. Some can multiplex with analytical times as short as 20 minutes. Accurate and timely testing will be possible in places like pharmacies, oncology clinics, patient service centers, and outreach programs. Whether physicians will trust and act on POC results alone, or will require confirmation by traditional laboratory-based testing, remains to be seen. However, in the simplest type of application, such as a patient known to have a particular mutation, the advantages of POC-based testing to quantify residual tumor burden are clear.

Conclusion

Molecular oncology is moving rapidly from an esoteric niche of diagnostics to a mainstream, required component of integrated clinical laboratory services. While NGTs are markedly reducing the cost per analyte and per specimen, and will certainly broaden the scope and volume of testing performed, the resources required to choose, install, and validate these new technologies are daunting for smaller labs. More rapid obsolescence and increased regulatory scrutiny for LDTs also present significant challenges. Aligning test capacity with approved clinical indications will require careful and constant attention to ensure competitiveness.

References

1. Liu L, Li Y, Li S, et al. Comparison of next-generation sequencing systems. J Biomed Biotechnol 2012; doi:10.1155/2012/251364.

2. Brownstein CA, Beggs AH, Homer N, et al. An international effort towards developing standards for best practices in analysis, interpretation and reporting of clinical genome sequencing results in the CLARITY Challenge. Genome Biol 2014;15:R53.

3. Haley L, Tseng LH, Zheng G, et al. Performance characteristics of next-generation sequencing in clinical mutation detection of colorectal cancers. Modern Pathol 2015; Epub ahead of print July 31, 2015, doi:10.1038/modpathol.2015.86.

4. Butler TM, Johnson-Camacho K, Peto M, et al. Exome sequencing of cell-free DNA from metastatic cancer patients identifies clinically actionable mutations distinct from primary disease. PLoS One 2015;10:e0136407.

5. Castellanos-Rizaldos E, Milbury CA, Guha M, et al. COLD-PCR enriches low-level variant DNA sequences and increases the sensitivity of genetic testing. Methods Mol Biol 2014;1102:623–39.

6. Hiltemann S, Jenster G, Trapman J, et al. Discriminating somatic and germline mutations in tumor DNA samples without matching normals. Genome Res 2015;25:1382–90.

7. Lammers PE, Lovly CM, Horn L. A patient with metastatic lung adenocarcinoma harboring concurrent EGFR L858R, EGFR germline T790M, and PIK3CA mutations: The challenge of interpreting results of comprehensive mutational testing in lung cancer. J Natl Compr Canc Netw 2015;12:6–11.

8. Weber B, Meldgaard P, Hager H, et al. Detection of EGFR mutations in plasma and biopsies from non-small cell lung cancer patients by allele-specific PCR assays. BMC Cancer 2014;14:294.

9. Vogelstein B, Papadopoulos N, Velculescu VE, et al. Cancer genome landscapes. Science 2013;339:1546–58.

10. Heitzer E, Auer M, Gasch C, et al. Complex tumor genomes inferred from single circulating tumor cells by array-CGH and next-generation sequencing. Cancer Res 2013;73:2965–75.

11. Healy B. BRCA genes — Bookmaking, fortunetelling, and medical care. N Engl J Med 1997;336:1448–9.

 

 

 

Read Full Post »

Praluent FDA Approved

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

PRALUENT® (alirocumab) is now approved for additional LDL-C lowering on top of maximally tolerated statin therapy in patients with HeFH or clinical ASCVD [1]
INDICATIONS AND USAGE
PRALUENT is a PCSK9 (Proprotein Convertase Subtilisin/Kexin Type 9) inhibitor antibody indicated as adjunct to diet and maximally tolerated statin therapy for the treatment of adults with heterozygous familial hypercholesterolemia or clinical atherosclerotic cardiovascular disease, who require additional lowering of LDL-C.

The effect of PRALUENT on cardiovascular morbidity and mortality has not been determined.

DOSING INFORMATION
The recommended starting dose of PRALUENT is 75 mg administered subcutaneously once every 2 weeks, since the majority of patients achieve sufficient LDL-C reduction with this dosage. If the LDL-C response is inadequate, the dosage may be increased to the maximum dosage of 150 mg administered every 2 weeks.

Measure LDL-C levels within 4 to 8 weeks of initiating or titrating PRALUENT to assess response and adjust the dose, if needed.
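
A minimal sketch of that titration rule follows, purely as an illustration of the label's stated logic rather than clinical software; the LDL-C goal and the judgment that a response is “inadequate” are clinical decisions, reduced here to a hypothetical threshold.

    def next_praluent_dose(current_dose_mg, ldl_c_mg_dl, ldl_c_goal_mg_dl):
        # Start at 75 mg SC every 2 weeks; if the LDL-C response measured
        # 4 to 8 weeks after initiation is inadequate, escalate to the
        # 150 mg every-2-weeks maximum; otherwise keep the current dose.
        if current_dose_mg == 75 and ldl_c_mg_dl > ldl_c_goal_mg_dl:
            return 150
        return current_dose_mg

    print(next_praluent_dose(75, ldl_c_mg_dl=95, ldl_c_goal_mg_dl=70))  # 150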

PRALUENT is a human monoclonal antibody that binds to PCSK9 [1]
PRALUENT efficacy was investigated in 5 double-blind, placebo-controlled trials with 3499 patients enrolled: 36% with HeFH and 54% non-FH with clinical ASCVD.
• All patients were receiving a maximally tolerated dose of statin with or without other lipid-modifying therapies
• 3 studies used an initial dose of 75 mg Q2W as part of an up-titration regimen, with criteria-based up-titration to 150 mg Q2W at week 12 for patients who did not achieve their prespecified target LDL-C at week 8
• 2 studies used the 150 mg Q2W dose only
Clinical ASCVD is defined in the ACC/AHA guidelines [2] as acute coronary syndromes or a history of any of the following: myocardial infarction, stable or unstable angina, coronary or other arterial revascularization, transient ischemic attack or stroke, or peripheral arterial disease presumed to be of atherosclerotic origin.
All studies met their primary efficacy endpoint measured at week 24 [1]
All trials were at least 52 weeks in duration with the primary efficacy endpoint measured at week 24 (mean percent change in LDL-C from baseline)
The first and only FDA-approved PCSK9 inhibitor with 2 doses, allowing you to adjust the dose based on your patients’ LDL-C-lowering needs [1]
MyPRALUENT™: Comprehensive support for you and your patients
MyPRALUENT is designed to help meet your needs and your patients’ needs
Speak with a MyPRALUENT Care Specialist at 1-844-PRALUENT (1-844-772-5836), option 1
IMPORTANT SAFETY INFORMATION
PRALUENT is contraindicated in patients with a history of a serious hypersensitivity reaction to PRALUENT. Reactions have included hypersensitivity vasculitis and hypersensitivity reactions requiring hospitalization.

Hypersensitivity reactions (e.g., pruritus, rash, urticaria), including some serious events (e.g., hypersensitivity vasculitis and hypersensitivity reactions requiring hospitalization), have been reported with PRALUENT treatment. If signs or symptoms of serious allergic reactions occur, discontinue treatment with PRALUENT, treat according to the standard of care, and monitor until signs and symptoms resolve.

The most commonly occurring adverse reactions (≥5% of patients treated with PRALUENT and occurring more frequently than with placebo) are nasopharyngitis, injection site reactions, and influenza.

Local injection site reactions including erythema/redness, itching, swelling, and pain/tenderness were reported more frequently in patients treated with PRALUENT (7.2% versus 5.1% for PRALUENT and placebo, respectively). Few patients discontinued treatment because of these reactions (0.2% versus 0.4% for PRALUENT and placebo, respectively), but patients receiving PRALUENT had a greater number of injection site reactions, had more reports of associated symptoms, and had reactions of longer average duration than patients receiving placebo.

Neurocognitive events were reported in 0.8% of patients treated with PRALUENT and 0.7% of patients treated with placebo. Confusion or memory impairment were reported more frequently by those treated with PRALUENT (0.2% for each) than in those treated with placebo (<0.1% for each).

Liver-related disorders (primarily related to abnormalities in liver enzymes) were reported in 2.5% of patients treated with PRALUENT and 1.8% of patients treated with placebo, leading to treatment discontinuation in 0.4% and 0.2% of patients, respectively. Increases in serum transaminases to greater than 3 times the upper limit of normal occurred in 1.7% of patients treated with PRALUENT and 1.4% of patients treated with placebo.

The most common adverse reactions leading to treatment discontinuation in patients treated with PRALUENT were allergic reactions (0.6% versus 0.2% for PRALUENT and placebo, respectively) and elevated liver enzymes (0.3% versus <0.1%).

PRALUENT is a human monoclonal antibody. As with all therapeutic proteins, there is a potential for immunogenicity with PRALUENT.

Please see full Prescribing Information.

 

Read Full Post »

Eric Topol, M.D., Gary & Mary West Endowed Chair of Innovative Medicine, Scripps Research, Executive VP, Scripps Research, Ex-Chairman of Cardiovascular Medicine at Cleveland Clinic and Founder of the Cleveland Clinic Lerner College of Medicine

Curators: Larry H Bernstein, MD, FCAP and Aviva Lev-Ari, PhD, RN

Eric Topol, M.D. is professor of genomics and holds the Scripps endowed chair in innovative medicine. He is the director of the Scripps Translational Science Institute in La Jolla, California. Previously, he led the Cleveland Clinic to its #1 ranking in heart care, started a new medical school, and led key discoveries in heart disease.

Professor of Genomics
Department of Molecular and Experimental Medicine
California Campus
Laboratory Website
etopol@scripps.edu
(858) 554-5708

Scripps Research Joint Appointments

Director, Scripps Translational Science Institute
Faculty, Graduate Program

Other Joint Appointments

Chief Academic Officer, Scripps Health
Senior Consultant, Scripps Clinic, Division of Cardiovascular Diseases

Research Focus

My research is on individualized medicine, using the genome and digital technologies to understand each person at a granular biologic and physiologic level in order to determine appropriate therapies and prevention. An example is the use of pharmacogenomics and our research on clopidogrel (Plavix). By determining why such a large proportion of people do not respond to this medication, we can use alternative treatment strategies to prevent blood clots.

 

Education

M.D., University of Rochester, New York, 1979
B.A., Biomedicine, University of Virginia, Charlottesville, 1975

Professional Experience

University of Virginia, B.A. With Highest Distinction, 1975
University of Rochester, M.D. With Honor, 1979
University of California, San Francisco, Internal Medicine Residency, 1979-1982
Johns Hopkins, Cardiology Fellowship, 1982-1985
University of Michigan, Professor with Tenure, Department of Internal Medicine, 1985-1991
Cleveland Clinic, Chairman of the Department of Cardiovascular Medicine, 1991-2006
Cleveland Clinic, Chief Academic Officer, 2000-2005
Cleveland Clinic Lerner College of Medicine, Founder and Provost
Case Western Reserve University, Professor of Genetics, 2003-2006

Awards & Professional Activities

Elected to Institute of Medicine, National Academy of Sciences
Simon Dack Award, American College of Cardiology
American Heart Association, Top 10 Research Advances (2001, 2004)
Top 10 Most Cited Researchers in Medicine, Institute for Scientific Information
Doctor of the Decade, Thompson Scientific Award

Selected References

Goetz L, Bethel K, Topol EJ. Rebooting cancer tissue handling in the sequencing era. JAMA 309: in press, 2013

Harper AR, Topol EJ. Pharmacogenomics in clinical practice and drug development. Nature Biotechnology 30(11):1117-24, 2012. [PMID: 23138311]

Komatireddy R, Topol EJ. Medicine Unplugged: The Future of Laboratory Medicine. Clin Chem. 2012 Oct 15. [PMID: 23071365]

Harismendy O, Notani D, Song X, Rahim NG, Tanasa B, Heintzman N, Ren B, Fu X-D, Topol EJ, Rosenfeld MG, Frazer KA. 9p21 DNA variants associated with coronary artery disease impair interferon-γ signalling response. Nature 470(7333):264-268, 2011. [PMID 21307941]

Bloss CS, Schork NJ, Topol EJ. Effect of direct-to-consumer genomewide profiling to assess disease risk. New England Journal of Medicine 364(6):524-534, 2011. [PMID 21226570]

Topol EJ, Schork NJ. Catapulting clopidogrel pharmacogenomics forward. Nature Medicine 17(1):40-41, 2011. [PMID 21217678]

Rosenberg S, et al, Topol EJ; PREDICT Investigators. Multicenter validation of the diagnostic accuracy of a blood-based gene expression test for assessing coronary artery disease in nondiabetic patients. Annals of Internal Medicine 153(7):425-434, 2010. [PMID 20921541]

Topol EJ. Transforming Medicine via Digital Innovation. Science Translational Medicine 2(16):16cm4, 2010. [PMID 20371472]

http://creativedestructionofmedicine.com/?p=3

The wireless future of medicine

http://www.ted.com/talks/eric_topol_the_wireless_future_of_medicine

Financial Post’s “Digital revolution in antiquated health-care industry a major operation”

Listen to Dr. Topol’s podcast interview with Knowledge@Wharton

In his new book, The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Health Care, Eric Topol argues that medicine is set to undergo its biggest shakeup in history, pushed by demanding consumers and the availability of game-changing technology. Topol — a cardiologist, director of the Scripps Translational Science Institute and co-founder of the West Wireless Health Institute in La Jolla, Calif. — was recently interviewed for Knowledge@Wharton by C. William Hanson, III, a professor of anesthesiology and critical care, and director, surgical intensive care, at the Hospital of the University of Pennsylvania. Hanson’s latest book is titled, Smart Medicine: How the Changing Role of Doctors Will Revolutionize Health Care, published in 2011.

Below is an edited transcript of the conversation.

William Hanson: I thought it might be worthwhile to quickly give you a sense of who I am and where I’m coming from [in this interview]. I’m an anesthesiologist and an intensivist, primarily a surgical intensivist, and serve as chief medical information officer at Penn. So I have some interests that will skew in that direction.

I love the title of your book. There are many people on both sides of this question. Some would say that the creative destruction of medicine is a pretty scary concept, and I think there would be plenty of us who would agree that something drastic needs to happen. You’re obviously in the latter category.

Eric Topol: I’m in the [group that feels] something drastic needs to happen. I think it can happen, it will happen and I’m hoping that we can help facilitate or catalyze that.

Hanson: You have been in a prominent role in terms of questioning traditional medical concepts. Maybe you could describe for the audience what your personal practice is and some of the issues in which you have engaged the traditional medical establishment in the past.

Topol: What I’ve done to try to change medicine in many different ways [includes] research on how to come up with better therapies. These were in large trials, as large as 40,000 patients with heart attacks, but also in [other] ways, such as starting a new medical school with a very innovative curriculum and challenging a drug safety issue which was really important for the public. So I’ve had different experiences over the years.

But what was changing for me was that four or five years ago, we recognized we had this new emerging capability of digitizing human beings, which we’ve never had before. Everybody is used to digitizing books, movies and newspapers, whatever. But when you digitize human beings by knowing the sequence of their DNA, all their physiologic metrics, like their vital signs, their anatomy [and so forth], this comes together as a unique kairos — this supreme opportune moment in medicine.

Hanson: That’s a nice lead in. I want to return to that digitization because it is something I’m dealing with in our IT systems, as I’m sure you are — what to do with the digitized information, how much of it to keep, how to analyze it. But I wanted to come back to your book title and to one of the gentlemen who endorsed the book, Clayton Christensen. He has written a couple of books as you know, including The Innovator’s Dilemma and The Innovator’s Prescription.

Topol: He has written three books on innovation and is renowned for his insights and leadership in [that] space. But there’s a little bit of a difference between us.

Hanson: Maybe you could elaborate on that.

Topol: Yes. I look to him as one of the real guiding lights of innovation. He’s not a physician. In the book, Innovator’s Prescription, he worked with a young physician, Jason Hwang, who is now out at Stanford.

Hanson: Yes, I have met him.

Topol: The difference, though, is that I am coming at it from almost three decades in the medical profession, and I’m not calling for an innovation. He calls it disruptive innovation. I’m looking at a much more radical thing. This is like taking what Clayton has popularized [and making it much bigger] … in terms of how transformative this can be, this whole ability to digitize human beings.

Hanson: In his work, he has talked about the digitization of the music industry, for example, and the film industry. Recognizing that you’re dealing at a much deeper level with the medical side of things, [it has to do with] how products enter the market at the low end and disrupt and take over higher-end products. For me and you at academic medical centers, where we think we’re providing state of the art care, I wonder to what extent we are likely to be made irrelevant by radical disruptions of the kind you’re talking about. What do you think about that?

Topol: I think that the medical community has been incredibly resistant to change. That’s across not just academic medical centers, but the whole continuum of medicine, [including] practicing physicians. But there is a consumer-driven health care revolution out there where each individual has access to their smart phone, all their vital signs and relevant data. There’s an ability to tap into their DNA sequence and all of their genomics. And of course that’s superimposed on this digital infrastructure that each person has now with a social network, with broadband Internet access and pervasive connectivity.

The Atlantic’s Q&A with Dr. Topol:

Destroying Medicine to Rebuild It: Eric Topol on Patients Using Data

The emergency announcement on the transcontinental flight was terse and urgent: “Is there a doctor on board?” A passenger in distress was feeling intense pressure in his chest.

Eric Topol strode down the aisle to examine the passenger to see if he was having a heart attack, a diagnosis that normally would be tough at 35,000 feet. But Topol was armed with a prototype device that can take a person’s electrocardiogram (ECG) using a smartphone. The director of the Scripps Translational Science Institute near San Diego, he had just demonstrated how it worked during a lecture in Washington, D.C.

“It’s a case that fits over your iPhone with two built-in sensors connected to an app,” says Topol, showing me the device, made by Oklahoma City-based AliveCor. “You put your fingers on the sensors, or put them up to your chest, and it works like an ECG that you read in real-time on your phone.”

Dr. Topol’s guest blog for Forbes, “The Power of Digitizing Human Beings,” and his Q&A with Salon, “The Coming Medical Revolution”

Read Wired’s Q&A with Dr. Topol: Why Doctors Need to Embrace Their Digital Future Now

Slate featured the book on their blog “Future Tense”

And an interview on Keen On (TechCrunch TV): “Why the Entrepreneurial Opportunities Are Limitless”

Read Full Post »

Heroes in Basic Medical Research – Robert J. Lefkowitz

Author & Curator: Larry H Bernstein, MD, FCAP

Robert J. Lefkowitz, MD

Robert J. Lefkowitz MD, a Howard Hughes Medical Institute investigator who has spent his entire 39-year research career at the Duke University Medical Center, is sharing the 2012 Nobel Prize in Chemistry with Brian K. Kobilka of Stanford University School of Medicine, who was a post-doctoral fellow in Lefkowitz’s lab in the 1980s.

They are being recognized for their work on a class of cell surface receptors that have become the target of prescription drugs, including antihistamines, ulcer drugs and beta blockers to relieve hypertension, angina and coronary disease.

The receptors catch chemical signals from the outside and transmit their messages into the cell, providing the cell with information about changes occurring within the body. These particular receptors are called seven-transmembrane G protein-coupled receptors, or just “G-coupled receptors” for short. Serpentine in appearance, G-coupled receptors weave through the surface of the cell seven times.

The human genome contains code to make at least 1,000 different forms of these trans-membrane receptors, all of which are quite similar. The receptors also bear a strong resemblance to receptors that detect light in the eyes, smells in the nose and taste on the tongue.

“Bob’s seminal discoveries related to G-protein coupled receptors ultimately became the basis for a great many medications that are in use today across many disease areas,” said Victor J. Dzau, MD, Chancellor for Health Affairs and CEO, Duke University Health System.  “He is an outstanding example of a physician-scientist whose impact can be seen in the lives of the countless patients who have benefited from his scientific discoveries. We are very proud of his magnificent achievements and grateful for his many contributions to Duke Medicine.”

After attending public elementary and junior high schools, I entered The Bronx High School of Science (10th grade) in the autumn of 1956, graduating at age 16 in 1959. “Bronx Science” is one of several public high schools in New York City which admits students on the basis of a competitive examination. The student body, representing approximately the top 5% based on the exam, is gifted and interested in science and math. The accomplishments of graduates of this high school are quite remarkable. For example, I am the 8th Nobel Laureate to have graduated from this school, the 7 previous ones having received their prizes in Physics. For me, attending this school was a formative experience. Whereas in elementary and junior high school I was not greatly challenged, here I was among a group of remarkably bright, interesting and stimulating classmates. The curriculum featured many advanced classes at the college level. I was particularly drawn to chemistry and, as a result of taking these college-level classes, I was able to receive full credit for two years of chemistry when I entered Columbia College in 1959. Thus I began as a college freshman with organic chemistry, a course generally taken by juniors.

The level of scholarship maintained by the student body was such that even with an average of about 94% my final class rank was about 100th out of 800. A classmate and friend at the time and at present, the famous geneticist David Botstein, had an almost identical average, a fact we tease each other about to this day.

Along with dozens of classmates, I moved on to Columbia University where I enrolled as a pre-medical student majoring in chemistry. The two year core curriculum in “Contemporary Civilization” was required of all students. With an emphasis on reading classic texts in history, philosophy, sociology and the political sciences and discussing these in small seminars, it was for me an opening to a whole new world. In addition, I took courses with and was exposed to, such intellectual giants as the literary critic Lionel Trilling, the cultural historian Jacques Barzun and the sociologist Daniel Bell, among others. I have very fond memories from this period of spending many hours in the public reading room at the 42nd Street New York Public Library, researching papers for those classes.

I also studied advanced Organic Chemistry with Cheves Walling and Physical Chemistry in a department which was strongly influenced by the then recently retired prominent physical organic chemist, Louis Hammett. However, the chemistry professor who had the most profound influence on me was actually a young Assistant Professor of Chemistry, Ronald Breslow. As a college senior I took an advanced seminar in biochemistry which he taught single handedly. This introduction to the chemistry of processes in living organisms really excited me in part, I suspect, because of his very lively teaching style. None of this, however, in any way diverted me from my goal of studying to become a practicing physician.

I greatly enjoyed my four years in medical school. I had dreamed about becoming a physician since grade school and now I was finally doing it. As a freshman immersed in the basic medical sciences I was able to deepen my interest in, and fascination with, biochemistry. Our biochemistry professors included a remarkable array of scholars (not that any of us appreciated that at the time). We heard lectures on metabolism from David Rittenberg, Chair of the Department; from David Shemin on porphyrins; from Irwin Chargaff on nucleic acids; and from David Nachmansohn on cholinergic neurotransmission.

One young professor left a lasting impression on me. Paul Marks was then a young academic hematologist who taught the Introduction to Clinical Medicine course in which we studied clinical problems for the first time, examined case histories, and looked at blood specimens. Not only was he a good clinician but he assigned readings from the basic science literature that were relevant in a very meaningful way to the cases we studied. This showed me how scientific information could be brought to bear on clinical problems. Among my classmates and friends in medical school was Harold Varmus, who was the co-recipient of the 1989 Nobel Prize for the discovery of oncogenes.

On July 1, 1968 I moved my family (now including the recently born Cheryl) to Rockville, Maryland to begin my research career at the NIH in nearby Bethesda, Maryland. I had been assigned, through a matching program, to work with Drs. Jesse Roth and Ira Pastan in the Clinical Endocrinology Branch of the National Institute of Arthritis and Metabolic Diseases (NIAMD), now known as NIDDK, the National Institute of Diabetes and Digestive and Kidney Diseases. I was a Clinical Associate, meaning that in addition to doing full time research ten months out of the year, for two months I also supervised a clinical endocrinology in-patient service. Because of this, I gained a remarkable exposure to unusual endocrine diseases which were under study at the time. An example of this was acromegaly.

It was the heyday of interest in second messenger signaling after the discovery of cAMP by Earl Sutherland. He would receive the Nobel Prize in Medicine and Physiology for this in 1971. One hormone after another was being shown to stimulate the enzyme adenylate cyclase thus increasing intracellular levels of cAMP. The idea that these different hormones might work through distinct receptors was talked about but was controversial. Moreover, at the time there were no direct methods for studying the receptors. I was assigned the challenging task of developing a radioligand binding method to study the putative receptors for adrenocorticotropic hormone (ACTH) in plasma membranes derived from an ACTH responsive adrenocortical carcinoma passaged in nude mice.

Recently, two Nobel Laureates, Mike Brown and Joe Goldstein, published a brief essay discussing the remarkable number of Nobel Laureates (9 so far) who have in common the fact that they came to the NIH as physicians during the brief span between 1964 and 1972 for postdoctoral research training. (1)

They dissect the unique convergence of circumstances which may have been responsible for this extraordinary result, including the quality of basic science mentors on the full time NIH staff, the competitiveness of “the best and the brightest” to obtain these positions during the Vietnam War years, and the now bygone emphasis on teaching of basic sciences in medical schools in the 1960s.

Lineages among Nobel Laureates are often commented upon. In my case, Jesse Roth had trained with Solomon Berson and Rosalyn Yalow, whose development of radioimmunoassay led to the Nobel Prize in Physiology or Medicine for Yalow (1977) after Berson’s untimely death in 1972. Moreover, training in Ira Pastan’s laboratory contemporaneously with me was my medical school and house staff classmate and future Nobel Laureate, Harold Varmus. Ira had himself trained in the lab of another NIH career scientist, Earl Stadtman, who also trained a future Nobel Laureate, Mike Brown.

Dr. Edgar Haber, the Chief of Cardiology and a prominent immunochemist, allowed me to begin working in his lab. I was fascinated by receptors and what I saw as their potential to form the basis for a whole new field of research just waiting to be explored. I spent a great deal of time analyzing which receptor I should attempt to study. As an aspiring academic cardiologist I wanted to work on something related to the cardiovascular system. I also wanted a receptor known to be coupled to adenylate cyclase. I initially focused on two models, the cardiac glucagon and β-adrenergic receptors. However, my attention quickly became focused on the latter, for very practical reasons. Unlike the case for peptide hormones such as glucagon or ACTH, literally dozens, if not hundreds, of analogs of adrenaline and noradrenaline, as well as their antagonists, were available that could be chemically modified into the new tools needed to study the receptors, including radioligands, photoaffinity probes, affinity chromatography matrices and the like. Moreover, the first β-adrenergic receptor blocker (“β-blocker”) had recently been approved for clinical use in the United States, adding further to the attractiveness of this target to me.

So in the early months of 1971 I began the quest to prove the existence of β-adrenergic receptors, to study their properties, to learn about their chemical nature, how they were regulated and how they functioned. This work has consumed me for the past forty years. Over the next several years in Boston, working mostly with membrane fractions derived from canine myocardium, I sought to develop radioligand binding approaches to tag the β-adrenergic receptors. I focused initially on the use of [3H]labeled catecholamines such as norepinephrine, which are agonists for the receptor. Specific saturable binding could be demonstrated, and I thought initially that we had developed a valid approach to label the receptors. However, it became increasingly clear over the next few years that the sites being labeled lacked many of the properties that would be expected for true physiological receptor binding sites. Coming to this realization was difficult.

During this time I also published some of the very first studies demonstrating GTP regulation of β-adrenergic receptor-stimulated adenylate cyclase, following the work of Martin Rodbell on GTP regulation of glucagon-sensitive adenylate cyclase. I was now a cardiology fellow. As at the NIH, nights on call were often spent in the lab doing experiments while hoping that my on-call beeper would remain quiet. During these years, I had many stimulating and profitable discussions with Geoffrey Sharpe, a faculty member in the Nephrology Division with an interest in cell signaling and adenylate cyclase.

In work with postdoc Marc Caron in the spring of 1974, we succeeded in developing the radioligand [3H]dihydroalprenolol. Contemporaneously, Gerald Aurbach at the NIH and Alex Levitzki at the Hebrew University in Jerusalem developed similar approaches using different radioligands. This was a watershed event because it finally opened the door to direct study of the receptors. Together with M.D./Ph.D. student Rusty Williams, we developed comparable assays for the α-adrenergic receptors shortly thereafter.

Brian Kent Kobilka is an American physiologist and a co-recipient of the 2012 Nobel Prize in Chemistry with Robert Lefkowitz for discoveries that reveal the inner workings of an important family of receptors: the G protein-coupled receptors.

Read Full Post »

Delineating a Role for CRISPR-Cas9 in Pharmaceutical Targeting

Author & Curator: Larry H. Bernstein, MD, FCAP

 

Chief Scientific Officer, Leaders in Pharmaceutical Intelligence, Boston, MA

http://pharmaceuticalintelligence.com

Correspondence:
larry.bernstein@gmail.com

2.1.2.2

Delineating a Role for CRISPR-Cas9 in Pharmaceutical Targeting, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 2: CRISPR for Gene Editing and DNA Repair

Abstract

The recent development of advanced methods for genome engineering has superseded the methods in use during the early years of the 21st century.  The CRISPR-Cas9 application for genome editing has real potential for pharmaceutical development, and perhaps also for diagnostics.  The importance of conjoint development of diagnostics and therapeutics cannot be overstressed. Further, the limitations of the method have to be viewed in the light of the historical development of the study of inborn errors of human metabolism, and the current understanding of complex polygenic and environmental risk factors.

Key words: classic model, CRISPR-Cas9, DNA, genome, genome editing, genetic diseases, Hardy-Weinberg equilibrium, inborn errors of metabolism, polygenic diseases, RNA, RNAi, translation.

Abbreviations: CRISPR-Cas9; DNA; HWE; RNA.

Introduction

Genome editing technologies enable the deletion, insertion or correction of DNA at specific targeted sites within an organism’s genome. The power of the technology lies in its ability to specifically target any site in the genome and to alter the DNA sequence at that site. This has opened the door to potentially curing diseases caused by genetic defects, whether inherited or acquired.

Genome editing can be applied across many diverse fields of science. It has allowed researchers to gain a much deeper understanding of the role played by individual genes. Researchers working in the biomedical field use these techniques to address diseases that are known to have a genetic origin.

Early genome-editing research focused on the use of zinc finger nucleases and transcription activator-like effector nucleases (TALENs), which laid important foundations in establishing genome engineering as a potential approach for treating human diseases.

The recent discovery of CRISPR-Cas9, followed by work demonstrating its advantages over traditional approaches, promises a step-change in the use of genome editing to develop transformative medicines for serious human diseases.

Cas9* is an endonuclease (an enzyme) that can be easily programmed with RNA to cut DNA at targeted sites within the genome, enabling the deletion, insertion or correction of target genes, including those that cause diseases, with surgical precision. By using CRISPR-Cas9* genome-editing technology, scientists and clinicians are conducting pioneering research with a view to tackling both recessive and dominant genetic defects.

In order to find a place for CRISPR-Cas9 in gene therapy, it becomes necessary to consider inborn errors of metabolism and the evolution of traditional approaches to genetic diseases. Traditional gene therapy approaches to date have only been useful in correcting some recessive genetic disorders. Thanks to its ease of use and broad applicability, CRISPR-Cas9 has truly democratized genome editing and transformed many areas of research. Thousands of academic laboratories across the world are carrying out research using the technology. To this point, the technology known as CRISPR-Cas9 has been a science project, a research tool with enormous potential.

Genetic Disorders

A genetic disorder is a disease caused by one or more abnormalities in the genome, especially a condition that is present from birth (congenital). Most genetic disorders are quite rare, affecting one person in every several thousand or million.

Genetic disorders may or may not be heritable, i.e., passed down from the parents’ genes. In non-heritable genetic disorders, defects may be caused by new mutations or changes to the DNA. In such cases, the defect will only be heritable if it occurs in the germ line. The same disease, such as some forms of cancer, may be caused by an inherited genetic condition in some people, by new mutations in other people, and mainly by environmental causes in still other people. Whether, when, and to what extent a person with the genetic defect or abnormality will actually suffer from the disease is almost always affected by environmental factors and events in the person’s development.

A single-gene disorder is the result of a single mutated gene. Over 4,000 human diseases are caused by single-gene defects.[4] Single-gene disorders can be passed on to subsequent generations in several ways. Genomic imprinting and uniparental disomy, however, may affect inheritance patterns. The divisions between recessive and dominant types are not “hard and fast”, although the divisions between autosomal and X-linked types are (since the latter are distinguished purely by the chromosomal location of the gene). For example, achondroplasia is typically considered a dominant disorder, but children with two genes for achondroplasia have a severe skeletal disorder of which achondroplasics could be viewed as carriers. Sickle-cell anemia is considered a recessive condition, but heterozygous carriers have increased resistance to malaria in early childhood, which could be described as a related dominant condition.[5] When one or both partners in a couple are sufferers or carriers of a single-gene disorder and they wish to have a child, they can do so through in vitro fertilization, which enables preimplantation genetic diagnosis to check whether the embryo has the genetic disorder.[6]

Prevalence of some single-gene disorders (approximate)

Autosomal dominant
Familial hypercholesterolemia: 1 in 500
Polycystic kidney disease: 1 in 1,250
Neurofibromatosis type I: 1 in 2,500
Hereditary spherocytosis: 1 in 5,000
Marfan syndrome: 1 in 4,000[2]
Huntington’s disease: 1 in 15,000[3]

Autosomal recessive
Sickle cell anaemia: 1 in 625
Cystic fibrosis: 1 in 2,000
Tay-Sachs disease: 1 in 3,000
Phenylketonuria: 1 in 12,000
Mucopolysaccharidoses: 1 in 25,000
Lysosomal acid lipase deficiency: 1 in 40,000
Glycogen storage diseases: 1 in 50,000
Galactosemia: 1 in 57,000
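
The key words above include the Hardy-Weinberg equilibrium, which connects prevalence figures like those in this table to carrier frequencies. As a minimal illustrative sketch (not from the source), assuming a randomly mating population at Hardy-Weinberg equilibrium, the prevalence of an autosomal recessive disorder equals q², so the carrier frequency 2pq can be estimated directly from the table’s values:

```python
# Hardy-Weinberg sketch (illustrative assumptions: random mating, equilibrium):
# for an autosomal recessive disorder, prevalence = q^2, so the carrier
# (heterozygote) frequency is 2pq. Prevalence values are from the table above.
from math import sqrt

def carrier_frequency(prevalence):
    """Estimate the heterozygote (carrier) frequency 2pq from q^2."""
    q = sqrt(prevalence)   # recessive allele frequency
    p = 1.0 - q            # other allele frequency
    return 2.0 * p * q

for disorder, prev in [("Cystic fibrosis", 1 / 2000),
                       ("Tay-Sachs disease", 1 / 3000),
                       ("Phenylketonuria", 1 / 12000)]:
    cf = carrier_frequency(prev)
    print(f"{disorder}: carrier frequency ~1 in {round(1 / cf)}")
```

For cystic fibrosis at the table’s 1-in-2,000 prevalence, this gives a carrier frequency of roughly 1 in 23.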

Heritable Diseases and Normal Variants

Identification of Genes for Childhood Heritable Diseases

Annual Review of Medicine Jan 2014; 65: 19-31

Boycott KM, Dyment DA, Sawyer SL, Vanstone MR, and Beaulieu CL.

Children’s Hospital of Eastern Ontario Research Institute, University of Ottawa, Ottawa, Ontario, K1H 8L1 Canada

http://dx.doi.org/10.1146/annurev-med-101712-122108

Genes causing rare heritable childhood diseases are being discovered at an accelerating pace driven by the decreasing cost and increasing accessibility of next-generation DNA sequencing combined with the maturation of strategies for successful gene identification. The findings are shedding light on the biological mechanisms of childhood disease and broadening the phenotypic spectrum of many clinical syndromes. Still, thousands of childhood disease genes remain to be identified, and given their increasing rarity, this will require large-scale collaboration that includes mechanisms for sharing phenotypic and genotypic data sets. Nonetheless, genomic technologies are poised for widespread translation to clinical practice for the benefit of children and families living with these rare diseases.

Single gene defects result in abnormalities in the synthesis or catabolism of proteins, carbohydrates, fats, or complex molecules. Most are due to a defect in an enzyme or transport protein, which results in a block in a metabolic pathway. Effects are due to toxic accumulations of substrates before the block, intermediates from alternative metabolic pathways, defects in energy production and use caused by a deficiency of products beyond the block, or a combination of these metabolic deviations. Nearly every metabolic disease has several forms that vary in age of onset, clinical severity, and, often, mode of inheritance.
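
As a toy illustration of the block-in-a-pathway logic just described (invented rate constants, not a model of any specific disease), the sketch below follows a substrate S converted to an intermediate I and then a product P; setting the second enzyme’s activity to zero makes the intermediate accumulate upstream of the block while the product is never formed.

```python
# Toy two-step pathway S -> I -> P with first-order kinetics.
# A deficiency of the second enzyme (e2 = 0) causes accumulation of the
# intermediate I before the block, as described in the text above.
def simulate(e1, e2, steps=200, dt=0.1):
    S, I, P = 10.0, 0.0, 0.0
    for _ in range(steps):
        v1 = e1 * S * dt        # flux through enzyme 1 (S -> I)
        v2 = e2 * I * dt        # flux through enzyme 2 (I -> P)
        S, I, P = S - v1, I + v1 - v2, P + v2
    return round(S, 2), round(I, 2), round(P, 2)

print("normal enzyme 2:    S, I, P =", simulate(e1=0.5, e2=0.5))
print("deficient enzyme 2: S, I, P =", simulate(e1=0.5, e2=0.0))
```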

There is a large number of inborn errors of metabolism.

A few examples are:

Fructose intolerance
Galactosemia
Maple syrup urine disease (MSUD)
Phenylketonuria (PKU)

Newborn screening tests can identify some of these disorders.

Categories of inborn errors of metabolism, or IEMs, are as follows:

  • Disorders that result in toxic accumulation
    • Disorders of protein metabolism (eg, amino acidopathies, organic acidopathies, urea cycle defects)
    • Disorders of carbohydrate intolerance
    • Lysosomal storage disorders
  • Disorders of energy production, utilization
    • Fatty acid oxidation defects
    • Disorders of carbohydrate utilization, production (ie, glycogen storage disorders, disorders of gluconeogenesis and glycogenolysis)
    • Mitochondrial disorders
    • Peroxisomal disorders


Giants in the 20th century study of genetic medicine

  1. Victor Almon McKusick

Victor McKusick
Known for: Mendelian Inheritance in Man, OMIM, and McKusick–Kaufman syndrome
Notable awards: William Allan Award (1977); Lasker Award (1997); Japan Prize (2008)


Victor Almon McKusick (October 21, 1921 – July 22, 2008), an internist and medical geneticist, was University Professor of Medical Genetics and Professor of Medicine at the Johns Hopkins Hospital, Baltimore, MD, USA.[1] He was a proponent of the mapping of the human genome because of its use for studying congenital diseases. He is well known for his studies of the Amish and of what he called “little people”. He was the original author and, until his death, remained chief editor of Mendelian Inheritance in Man (MIM) and its online counterpart, Online Mendelian Inheritance in Man (OMIM). He is widely known as the “father of medical genetics”.[2]

McKusick traveled to Copenhagen to speak about the heritable disorders of connective tissue at the first international congress of human genetics, a meeting now regarded as the birthplace of the medical genetics field.[2] In the following decades, McKusick created and chaired a new Division of Medical Genetics at Hopkins, beginning in 1957. In 1973, he became Physician-in-Chief, William Osler Professor of Medicine, and Chairman of the Department of Medicine at Johns Hopkins Hospital and School of Medicine.[6] He held concurrent appointments as University Professor of Medical Genetics at the McKusick–Nathans Institute of Genetic Medicine, Professor of Medicine at the Johns Hopkins School of Medicine, Professor of Epidemiology at the Johns Hopkins Bloomberg School of Public Health, and Professor of Biology at Johns Hopkins University.[5] He co-founded the journal Genomics in 1987 with Dr. Frank Ruddle and served as an editor.[6] He was a lead investigator in determining whether Abraham Lincoln had Marfan syndrome.[8]

  2. Elizabeth F. Neufeld

Born in France, Elizabeth Neufeld immigrated to the United States in 1940. She obtained a BS from Queens College, New York, and a Ph.D. from the University of California, Berkeley. After postdoctoral training, she moved to the NIH in Bethesda, MD, where she began her studies of a rare group of genetic diseases. She moved back to California in 1984 as Chair of the Department of Biological Chemistry at UCLA, a position she occupied until 2004.

Figure (not shown): The brain in a mouse model of a genetic lysosomal disorder, Sanfilippo syndrome type B.

Our interests have long been the cause, consequences and treatment of human genetic diseases due to deficiency of lysosomal enzymes. The disease currently under investigation is Sanfilippo syndrome type B (MPS III B). It is caused by mutation in the NAGLU gene, with resulting deficiency of the lysosomal enzyme alpha-N-acetylglucosaminidase and accumulation of its substrate (heparan sulfate). The disease manifests itself in childhood by severe mental retardation and intractable behavioral problems. The neurologic deterioration progresses to dementia, with death usually in the second decade. We use a mouse knockout model (Naglu -/-) in order to study the pathophysiology of the disease and to develop therapy. Because of the special cell biology of lysosomal enzymes, which can be taken up by receptor-mediated endocytosis, exogenous administration of the enzyme could theoretically cure the disease. Unfortunately, the blood-brain barrier (BBB) prevents the therapeutic enzyme from reaching the brain. Part of our current research is to develop a novel technology to get lysosomal enzymes across the BBB. We also study changes in gene and protein expression in specific parts of the brain, in which there is accumulation of certain lipids and proteins that seem unrelated biochemically to each other or to the primary defect. We try to understand the cause and consequences of these accumulations. Although they are secondary defects, they may be relevant to the pathophysiology of the disease and may represent targets for pharmacologic intervention.

Neufeld began her scientific studies at a time when few women chose science as a career. The historical bias against women in science, compounded with an influx of men coming back from the Second World War and going to college, made positions for women rare; few women could be found in the science faculties of colleges and universities.

When she first began working on Hurler syndrome in 1967, she initially thought the problem might stem from faulty regulation of the sugars, but experiments showed the problem was in fact the abnormally slow rate at which the sugars were broken down. Working with fellow scientist Joseph Fratantoni, Neufeld attempted to isolate the problem by tagging mucopolysaccharides with radioactive sulfate, as well as mixing normal cells with MPS patient cells. Fratantoni inadvertently mixed cells from a Hurler patient and a Hunter patient—and the result was a nearly normal cell culture. The two cultures had essentially “cured” each other.

In 1973 Neufeld was named chief of NIH’s Section of Human Biochemical Genetics, and in 1979 she was named chief of the Genetics and Biochemistry Branch of the National Institute of Arthritis, Diabetes, and Digestive and Kidney Diseases (NIADDK). She served as deputy director in NIADDK’s Division of Intramural Research from 1981 to 1983. In 1984 Neufeld went back to the University of California, this time the Los Angeles campus, as chair of the biological chemistry department.

Neufeld’s research opened the way for prenatal diagnosis of such life-threatening fetal disorders as Hurler syndrome. Neufeld chaired the Scientific Advisory Board of the National MPS Society and was president of the American Society for Biochemistry and Molecular Biology from 1992 to 1993. She was elected to both the National Academy of Sciences (USA) and the American Academy of Arts and Sciences in 1977 and named a fellow of the American Association for the Advancement of Science in 1988. In 1990 she was named California Scientist of the Year. She received the Wolf Prize and the Albert Lasker Award for Clinical Medical Research, and in 1994 was awarded the National Medal of Science “for her contributions to the understanding of the lysosomal storage diseases, demonstrating the strong linkage between basic and applied scientific investigation.”[3]

  3. Jarvis “Jay” Edwin Seegmiller, M.D.

“Jay Seegmiller was one of the giants of American medicine,” said Edward Holmes, M.D., Vice Chancellor of Health Sciences and dean of the School of Medicine at UCSD. “He and his trainees have made innumerable contributions to our understanding of the pathogenesis of many human disorders.” Seegmiller was one of the country’s leading researchers in intermediary metabolism, with a focus on purine metabolism and inherited metabolic disease.  He worked in the field of human biochemical genetics, with a special interest in the mechanisms by which genetically determined defects of metabolism lead to various forms of arthritis.  His laboratory identified a wide range of primary metabolic defects responsible for the development of gout.

He is perhaps best known for his discovery of the enzyme defect in Lesch-Nyhan Syndrome, a fatal disorder of the nervous system causing severe mental retardation and self-mutilation impulses.  As Director of the Human Biochemical Genetics Program at UCSD, Seegmiller’s investigations into the translation of genetic research and methods of prevention, detection and treatment of hereditary diseases led to Congressional testimony on the possibility of controlling genetic disease in the United States.  As a result, genetic referral centers have been established throughout the country.

He joined the newly established UCSD School of Medicine in 1969 as head of the Arthritis Division of the Department of Medicine. There, he directed a research program in human biochemical genetics involving senior faculty from five departments within the School of Medicine.  While a professor at UCSD, he served as a Macy Scholar both at Oxford University and at the Basel Institute in Switzerland, as well as a Guggenheim Fellow at the Swiss Institute for Experimental Cancer Research in Lausanne.

In 1983, he became the founding director of what is today UCSD’s Stein Institute for Research on Aging (SIRA). Even after his retirement, he continued to serve as Associate Director of SIRA from 1990 until his death.

“He had the foresight of proposing the formation of and then establishing a new Institute on Aging at UCSD before there was any such institute in the entire UC system,” said Dilip Jeste, M.D., the Estelle and Edgar Levi Chair in Aging, Professor of Psychiatry and Neurosciences and current Director of SIRA. “He was himself a role model of successful aging, and continued working in SIRA until his very last days.”

Seegmiller received his Doctor of Medicine with honors from the University of Chicago in 1948.  After completing his internship at Johns Hopkins Hospital in Baltimore, Maryland, he trained with Bernard Horecker at the National Institute of Arthritis and Metabolic Diseases of the National Institutes of Health.

Seegmiller was appointed Senior Investigator of the National Institute of Arthritis and Metabolic Diseases in 1954, where he carried out biochemical and clinical studies of human hereditary disease, with a special interest in those causing various forms of arthritis.  He became Assistant Scientific Director of the Institute in 1960 and was appointed Chief of the Section on Human Biochemical Genetics in 1966, becoming one of several NIH leaders recruited to help launch UC San Diego’s new medical school.

Seegmiller’s clinical activities included studies of longevity in South America.  In 1974, he joined a team of notable scientists who traveled to the remote village of Vilcabamba in Ecuador to find out what role genetic factors played among the Andean villagers, who were among the longest-living people in the world.  His later work on free radicals and their damaging effects on the human ability to withstand disease opened new investigations on human aging at SIRA.

Seegmiller was a member of the National Academy of Sciences, the American Academy of Arts and Sciences, and was the recipient of numerous prizes and awards in honor of his extraordinary achievements in science and medicine.  He received the United States Public Health Distinguished Service Award in 1969; and was honored as Master of the American College of Rheumatology (ACR) in 1992. He was on the advisory boards for the National Genetics Foundation, the City of Hope Medical Center in Duarte, California, the Task Force on Endocrinology and Metabolism for NIH, the Executive Editorial Board for Analytical Biochemistry, and was President of the Western Association of Physicians in 1979.

What has changed?

The 21st century has seen the mapping of the human genome. The huge focus on the genome came after the Watson and Crick discovery put the genome at the center of the translational network, with its central hypothesis: DNA is transcribed into RNA, and RNA is translated into protein, one amino acid placed per codon (DNA → RNA → protein).  However, RNAi and non-translational RNA are now also known to be important.  RNA has a role in suppressing translation, as do proteins through allosteric effects. In addition, the most common diseases involved in age-related change are strongly responsive to extracellular matrix effects, ionic fluxes, and effects on the cellular matrix, and they involve multicentric genome expression. This mode of expression leads one to think hard about the therapeutic target, or targets. The effect of RNA or of protein interacting with the genome is not an element of the classic construct.
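
As a minimal sketch of this classic construct (the coding sequence and the abbreviated codon table below are invented for illustration), transcription and translation can be expressed in a few lines; the RNAi and non-translational RNA layers discussed above are exactly what this simple picture omits:

```python
# Central-dogma sketch: DNA -> RNA (transcription) -> protein (translation).
# Hypothetical coding sequence; abbreviated standard genetic code only.
CODON_TABLE = {"AUG": "M", "UUU": "F", "GGC": "G", "AAA": "K",
               "GAU": "D", "UAA": "*", "UAG": "*", "UGA": "*"}

def transcribe(dna):
    """Transcription: coding-strand DNA to mRNA (T becomes U)."""
    return dna.replace("T", "U")

def translate(mrna):
    """Translation: read codons from the start codon to a stop codon."""
    start = mrna.find("AUG")
    protein = []
    for i in range(start, len(mrna) - 2, 3):
        aa = CODON_TABLE.get(mrna[i:i + 3], "?")
        if aa == "*":           # stop codon terminates translation
            break
        protein.append(aa)
    return "".join(protein)

dna = "ATGTTTGGCAAAGATTAA"          # hypothetical gene fragment
print(translate(transcribe(dna)))   # prints MFGKD
```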

Identifying a part of the problem

Type 2 diabetes mellitus, hypertension, arrhythmias, atherosclerotic plaque development, cancer, congestive heart disease, pulmonary hypertension, pulmonary interstitial sclerosis, and renovascular disease are among the common diseases that develop during a lifetime. The phenotypic presentations may have genomic associations, and there may also be population variants.  There is also cross-talk between these phenotypic expressions. Classically, medical terminology has been based on signs and symptoms of disease.  As clinical experience has grown more complex, the laboratory has played an increasing role in diagnosis as well as prognostication. The laboratory experience with respect to the practice of medicine has relied heavily on proteins, enzymes, or the products of chemical reactions.  The use of genomic profiling has rapidly emerged in the laboratory armamentarium, but has had a slow ascent in practice.

Case in point. Pompe’s disease

William Canfield is a glycobiologist, chief scientific officer and founder of an Oklahoma City-based biotechnology company, Novazyme, which was acquired by Genzyme in August 2001 and developed, among other things, an enzyme that can stabilize (but not cure) Pompe disease, based on Canfield’s ongoing research since 1998.[1][2]   

John Crowley became CEO of Novazyme after leaving Bristol-Myers Squibb in March 2000 and, together with Dr. Y. T. Chen[4] at Duke University, pushed for expedited approval by the U.S. Food and Drug Administration (FDA) of a new drug compound, NZ-1001, under orphan drug designation for the treatment of glycogen storage disease type II in October 2005. The FDA stated: “We have determined that Novazyme’s recombinant human highly phosphorylated acid alpha-glucosidase (rhHPGAA) qualifies for orphan designation for enzyme replacement therapy in patients with all subtypes of glycogen storage disease type II (Pompe’s disease).”[5][6] Subsequent research at Genzyme on NZ-1001, along with three other potential compounds, brought approval of the first enzyme replacement therapy for Pompe’s disease, alglucosidase alfa (Myozyme or Lumizyme, Genzyme Inc.), in 2006.[7]

William Canfield’s work with Pompe’s disease was fictionalized as the subject of the 2010 movie Extraordinary Measures, in which he is called Dr. Robert Stonehill and played by Harrison Ford.[8]

Case in point. Polymorphisms in the long non-coding RNA CDKN2B-AS1

Hypertension (HT) is a complex disorder influenced by both genetic and environmental factors. Recent genome-wide association studies have identified a major risk locus for atherosclerosis on chromosome 9p21.3 (chr9p21.3). SNPs within the genes encoding the CDKN2A/B proteins and the long non-coding RNA CDKN2B-AS1 could potentially contribute to HT development, and such a study has now been done. The findings suggest that SNPs rs10757274, rs2383207, rs10757278, and rs1333049, particularly those within the CDKN2B-AS1 gene, and related haplotypes may confer increased susceptibility to HT development (unpublished).

Case in point. Lipoprotein Lipase and Atherosclerosis

Lipoprotein lipase (LPL) plays a pivotal role in lipid and lipoprotein metabolism. Dysfunctions of LPL have been found to be associated with dyslipidemia, obesity and insulin resistance, all of which are risk factors for atherosclerosis. Japanese investigators have hypothesized that elevating LPL activity would protect against atherosclerosis (unpublished).

Case in point. Holocaust survivors pass on stress.

Descendants of Holocaust Survivors Have Altered Stress Hormones

Parents’ traumatic experience may hamper their offspring’s ability to bounce back from trauma

Case in point. Genome engineering with CRISPR-Cas9

The new frontier of genome engineering with CRISPR-Cas9

GENOME EDITING

Jennifer A. Doudna and Emmanuelle Charpentier
Science Nov 2014; 346(6213): 1258096.
http://dx.doi.org/10.1126/science.1258096

BACKGROUND: Technologies for making and manipulating DNA have enabled advances in biology ever since the discovery of the DNA double helix. But introducing site-specific modifications in the genomes of cells and organisms remained elusive. Early approaches relied on the principle of site-specific recognition of DNA sequences by oligonucleotides, small molecules, or self-splicing introns. More recently, the site-directed zinc finger nucleases (ZFNs) and TAL effector nucleases (TALENs), using the principles of DNA–protein recognition, were developed. However, difficulties of protein design, synthesis, and validation remained a barrier to widespread adoption of these engineered nucleases for routine use.

SUMMARY

The field of biology is now experiencing a transformative phase with the advent of facile genome engineering in animals and plants using RNA-programmable CRISPR-Cas9. The CRISPR-Cas9 technology originates from type II CRISPR-Cas systems, which provide bacteria with adaptive immunity to viruses and plasmids. The CRISPR-associated protein Cas9 is an endonuclease that uses a guide sequence within an RNA duplex, tracrRNA:crRNA, to form base pairs with DNA target sequences, enabling Cas9 to introduce a site-specific double-strand break in the DNA. The dual tracrRNA:crRNA was engineered as a single guide RNA (sgRNA) that retains two critical features: a sequence at the 5′ side that determines the DNA target site by Watson-Crick base-pairing and a duplex RNA structure at the 3′ side that binds to Cas9. This finding created a simple two-component system in which changes in the guide sequence of the sgRNA program Cas9 to target any DNA sequence of interest. The simplicity of CRISPR-Cas9 programming, together with a unique DNA cleaving mechanism, the capacity for multiplexed target recognition, and the existence of many natural type II CRISPR-Cas system variants, has enabled remarkable developments using this cost-effective and easy-to-use technology to precisely and efficiently target, edit, modify, regulate, and mark genomic loci of a wide array of cells and organisms.
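
As an illustration of how this two-component logic translates into guide selection (a hedged sketch: the sequence below is invented, and real guide design also weighs off-target matches, GC content, and chromatin context), candidate Cas9 target sites can be enumerated by scanning for 20-nucleotide protospacers lying immediately 5′ of an NGG PAM:

```python
# Enumerate candidate Cas9 target sites: 20-nt protospacer + NGG PAM.
# Hypothetical sequence; a sketch of the targeting rule, not a design tool.
import re

def find_cas9_sites(seq):
    """Yield (protospacer, pam, position) for each 20-mer followed by NGG."""
    # A lookahead regex allows overlapping candidate sites to be reported.
    for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", seq):
        yield m.group(1), m.group(2), m.start()

sequence = "TTACGATCGATTACGCTAGCTAGGCTAGCATCGAGGTACG"
for protospacer, pam, pos in find_cas9_sites(sequence):
    print(f"position {pos}: guide 5'-{protospacer}-3', PAM {pam}")
```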

Figure (not shown)

The Cas9 enzyme (blue) generates breaks in double-stranded DNA by using its two catalytic centers (blades) to cleave each strand of a DNA target site (gold) next to a PAM sequence (red) and matching the 20-nucleotide sequence (orange) of the single guide RNA (sgRNA). The sgRNA includes a dual-RNA sequence derived from CRISPR RNA (light green) and a separate transcript (tracrRNA, dark green) that binds and stabilizes the Cas9 protein. Cas9-sgRNA–mediated DNA cleavage produces a blunt double-stranded break that triggers repair enzymes to disrupt or replace DNA sequences at or near the cleavage site. Catalytically inactive forms of Cas9 can also be used for programmable regulation of transcription and visualization of genomic loci.

This Review illustrates the power of the technology to systematically analyze gene functions in mammalian cells, study genomic rearrangements and the progression of cancers or other diseases, and potentially correct genetic mutations responsible for inherited disorders. CRISPR-Cas9 is having a major impact on functional genomics conducted in experimental systems. Its application in genome-wide studies will enable large-scale screening for drug targets and other phenotypes and will facilitate the generation of engineered animal models that will benefit pharmacological studies and the understanding of human diseases. CRISPR-Cas9 applications in plants and fungi also promise to change the pace and course of agricultural research. Future research directions to improve the technology will include engineering or identifying smaller Cas9 variants with distinct specificity that may be more amenable to delivery in human cells. Understanding the homology-directed repair mechanisms that follow Cas9-mediated DNA cleavage will enhance insertion of new or corrected sequences into genomes. The development of specific methods for efficient and safe delivery of Cas9 and its guide RNAs to cells and tissues will also be critical for applications of the technology in human gene therapy.

Case in point.

ZFN, TALEN and CRISPR/Cas-based methods for genome engineering

Thomas Gaj, Charles A. Gersbach, and Carlos F. Barbas III. The Skaggs Institute for Chemical Biology, Department of Molecular Biology, and Department of Chemistry, The Scripps Research Institute, La Jolla, CA, USA; Department of Biomedical Engineering and Institutes for Genome Sciences and Policy, Duke University, Durham, NC, USA.

Trends Biotechnol. 2013 July; 31(7): 397–405. http://dx.doi.org/10.1016/j.tibtech.2013.04.004

Abstract

Zinc-finger nucleases (ZFNs) and transcription activator-like effector nucleases (TALENs) comprise a powerful class of tools that are redefining the boundaries of biological research. These chimeric nucleases are composed of programmable, sequence-specific DNA-binding modules linked to a non-specific DNA cleavage domain. ZFNs and TALENs enable a broad range of genetic modifications by inducing DNA double-strand breaks that stimulate error-prone nonhomologous end joining or homology-directed repair at specific genomic locations. Here, we review achievements made possible by site-specific nuclease technologies and discuss applications of these reagents for genetic analysis and manipulation. In addition, we highlight the therapeutic potential of ZFNs and TALENs and discuss future prospects for the field, including the emergence of CRISPR/Cas-based RNA-guided DNA endonucleases.

Keywords: zinc-finger; TALE; CRISPR; nuclease; genome engineering

Classical and contemporary approaches for establishing gene function

With the development of new and affordable methods for whole-genome sequencing, and the design and implementation of large-scale genome annotation projects, scientists are poised to deliver upon the promises of the Genomic Revolution to transform basic science and personalized medicine. The resulting wealth of information presents researchers with a new primary challenge: converting this enormous amount of data into functionally and clinically relevant knowledge. Central to this problem is the need for efficient and reliable methods that enable investigators to determine how genotype influences phenotype. Targeted gene inactivation via homologous recombination is a powerful method capable of providing conclusive information for evaluating gene function.

Several factors impede the use of these methods:

  • the low efficiency at which engineered constructs are correctly inserted into the chromosomal target site,
  • the need for time-consuming and labor-intensive selection/screening strategies, and
  • the potential for adverse mutagenic effects.

Targeted gene knockdown by RNA interference (RNAi) has provided researchers with a rapid, inexpensive and high-throughput alternative to homologous recombination. However, knockdown by RNAi is incomplete, varies between experiments and laboratories, has unpredictable off-target effects, and provides only temporary inhibition of gene function. These restrictions impede researchers’ ability to directly link phenotype to genotype and limit the practical application of RNAi technology.

In the past decade, a new approach has emerged that enables investigators to directly manipulate virtually any gene in a diverse range of cell types and organisms. This core technology – commonly referred to as “genome editing” – is based on the use of engineered nucleases composed of sequence-specific DNA-binding domains fused to a non-specific DNA cleavage module. These chimeric nucleases enable efficient and precise genetic modifications by inducing targeted DNA double-strand breaks (DSBs) that stimulate the cellular DNA repair mechanisms, including error-prone non-homologous end joining (NHEJ) and homology-directed repair (HDR).

Case in point.

CRISPR/Cas9 and Targeted Genome Editing: A New Era in Molecular Biology

The development of efficient and reliable ways to make precise, targeted changes to the genome of living cells is a long-standing goal for biomedical researchers. Recently, a new tool based on a bacterial CRISPR-associated protein-9 nuclease (Cas9) from Streptococcus pyogenes has generated considerable excitement. This follows several attempts over the years to manipulate gene function, including homologous recombination and RNA interference (RNAi).

RNAi, in particular, became a laboratory staple enabling inexpensive and high-throughput interrogation of gene function, but it is hampered by providing only temporary inhibition of gene function and unpredictable off-target effects. Other recent approaches to targeted genome modification – zinc-finger nucleases (ZFNs) and transcription activator-like effector nucleases (TALENs) – enable researchers to generate permanent mutations by introducing double-stranded breaks to activate repair pathways. These approaches are costly and time-consuming to engineer, limiting their widespread use, particularly for large-scale, high-throughput studies.

The Biology of Cas9

The functions of CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) and CRISPR-associated (Cas) genes are essential in adaptive immunity in select bacteria and archaea, enabling the organisms to respond to and eliminate invading genetic material. These repeats were initially discovered in the 1980s in E. coli, but their function wasn’t confirmed until 2007 by Barrangou and colleagues, who demonstrated that S. thermophilus can acquire resistance against a bacteriophage by integrating a genome fragment of an infectious virus into its CRISPR locus.

Three types of CRISPR mechanisms have been identified, of which type II has been the most studied. In this case, invading DNA from viruses or plasmids is cut into small fragments and incorporated into a CRISPR locus amidst a series of short repeats (around 20 bps). The loci are transcribed, and transcripts are then processed to generate small RNAs (crRNA – CRISPR RNA), which are used to guide effector endonucleases that target invading DNA based on sequence complementarity (Figure 1) (not shown).

In the acquisition phase, foreign DNA is incorporated into the bacterial genome at the CRISPR loci. The CRISPR locus is then transcribed and processed into crRNA during crRNA biogenesis. During interference, Cas9 endonuclease complexed with a crRNA and separate tracrRNA cleaves foreign DNA containing a 20-nucleotide crRNA-complementary sequence adjacent to the PAM sequence.
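
The acquisition/interference cycle described above can be caricatured in a few lines of code (a toy sketch with invented sequences; it ignores PAM requirements, crRNA:tracrRNA processing, and Cas9 biochemistry): spacers stored in the "locus" after a first infection let the cell recognize the same invader later.

```python
# Toy CRISPR adaptive-immunity sketch: store spacers on first exposure,
# then flag re-invading DNA that matches a stored spacer. Invented
# sequences; PAM checks and crRNA processing are omitted for brevity.
SPACER_LEN = 20

def acquire(locus, invader_fragment):
    """Acquisition: integrate a new spacer into the CRISPR locus."""
    locus.append(invader_fragment[:SPACER_LEN])

def interference(locus, invader):
    """Interference: report (spacer, position) hits in incoming DNA."""
    return [(spacer, invader.find(spacer))
            for spacer in locus if spacer in invader]

locus = []                                    # naive cell: no spacers yet
phage = "ATGCGTACGTTAGCCTAGGATCCGATCGATTGCA"  # hypothetical phage DNA
acquire(locus, phage[5:])                     # survive the first infection
print(interference(locus, phage))             # re-infection is recognized
```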

Investment in CRISPR technology

CRISPR Therapeutics is a biopharmaceutical company created to translate CRISPR-Cas9, a breakthrough genome-editing technology, into transformative medicines for serious human diseases. The company describes itself as uniquely positioned to do so thanks to its multi-disciplinary team of world-renowned academics, clinicians and drug developers.

CRISPR Therapeutics’ vision is to cure serious human diseases at the molecular level using CRISPR-Cas9. The company is headquartered in Basel, Switzerland and has operations in London, UK and Cambridge, Massachusetts.

The company, focused on translating CRISPR-Cas9 gene-editing technology into transformative medicines for serious human diseases, congratulates its scientific founder, Dr. Emmanuelle Charpentier, on being named to TIME Magazine’s TIME 100 Most Influential People in the World alongside fellow CRISPR-Cas9 discoverer Dr. Jennifer Doudna. In addition, Dr. Charpentier was awarded the Louis-Jeantet Prize for Medicine, considered the most prestigious European award for researchers in the life sciences, for her discovery of the CRISPR-Cas9 gene-editing tool. She will receive the award in a ceremony in Geneva, Switzerland, on April 22, 2015.

Dr. Charpentier has received numerous additional awards for her research, including, in 2014, the Alexander von Humboldt Professorship, the Dr. Paul Janssen Award, the Grand Prix Jean-Pierre Lecocq (French Academy of Sciences) and the Göran Gustafsson Prize (Royal Swedish Academy of Sciences), and, in 2015, the Breakthrough Prize in Life Sciences. She was also selected as one of Foreign Policy magazine’s 100 Leading Global Thinkers for 2014.

Cambridge-based Editas Medicine announced a $120 million Series B round led by Bill Gates’s chief advisor for science and technology, Boris Nikolic. The list of financiers teaming with Nikolic reads like a rolodex of so-called crossover investors. Nikolic, who joined Editas’ board, made the investment through what’s been called “bng0,” a new U.S.-based investment company backed by “large family offices with a global presence and long-term investment horizon” and formed specifically to invest in Editas. CEO Katrine Bosley confirmed that Gates is one of the individuals investing in Editas alongside Nikolic. Editas has become the first of the group not only to attract crossover backers, but to begin discussing the diseases that it is targeting.

Caribou Biosciences, one of the biotech startups working to advance a much-watched new technology for precise gene editing, has raised an $11 million Series A round from venture capital firms and Swiss drug giant Novartis.

The money will help Berkeley, CA-based Caribou speed up its efforts to adapt a versatile genome editing technique co-discovered by one of its founders, UC Berkeley professor Jennifer Doudna, for a range of uses, including drug research and development, and industrial technology.

Doudna and her collaborator, Emmanuelle Charpentier of the Helmholtz Center for Infection Research in Braunschweig, Germany, and Umeå University in Sweden, figured out how to transform a bacterial defense against viral infection into a tool to edit out abnormal sections of genes, such as those that cause hereditary diseases.

Caribou’s gene editing platform is based on two elements of that bacterial molecular machinery: a guiding mechanism called CRISPR (clustered, regularly interspaced palindromic repeats), and an enzyme called Cas9, or CRISPR-associated protein 9, molecular scissors that cut a segment of DNA. Caribou was founded in 2011 to commercialize the work from Doudna’s lab.


Read Full Post »
