
Archive for the ‘Population Health Management, Genetics & Pharmaceutical’ Category

New Drug-Eluting Stent Works Well in STEMI

Reporter: Aviva Lev-Ari, PhD, RN

UPDATED on 8/8/2013

Meta-analysis makes case for drug-eluting stents in STEMI

AUGUST 7, 2013 

New York, NY – Newer-generation drug-eluting stents, particularly the everolimus-eluting stent (Xience V, Abbott; Promus, Boston Scientific), significantly reduce the risk of target vessel revascularization (TVR) in patients with ST-segment-elevation MI (STEMI) without increasing the risk of adverse safety outcomes, including rates of stent thrombosis, when compared with bare-metal stents [1].

These are the principal findings of a new meta-analysis of 28 randomized, controlled clinical trials involving more than 34,000 patient-years of follow-up.

Published online August 6, 2013 in Circulation: Cardiovascular Interventions, the analysis showed that compared with the sirolimus-eluting stent (Cypher, Cordis), the paclitaxel-eluting stent (Taxus, Boston Scientific), and bare-metal stents, the use of an everolimus-eluting stent reduced the relative risk of stent thrombosis by 62%, 61%, and 58%, respectively.

“I would make a strong argument to say that the current guidelines should change,” lead investigator Dr Sripal Bangalore (New York University School of Medicine) told heartwire. “The reduction in TVR is not surprising. We know that drug-eluting stents compared with bare-metal stents reduce TVR, but the biggest thing we were able to show was that stent thrombosis is also reduced when compared with a bare-metal stent, as well as compared with first-generation drug-eluting stents.”

The current American College of Cardiology/American Heart Association (ACC/AHA) and European Society of Cardiology (ESC) clinical guidelines state that drug-eluting-stent implantation is a class IIa indication in STEMI patients. The recommendations are based on concerns about an increased risk of stent thrombosis with the drug-eluting stents compared with their bare-metal counterparts. Bangalore said concerns have also been raised about the risk of stent thrombosis beyond one year with the first-generation drug-eluting stents, a time point when dual antiplatelet therapy is stopped.

The newer-generation drug-eluting stents, however, have been shown in various studies to be as safe as bare-metal stents in the STEMI setting. For this reason, Bangalore and colleagues conducted a meta-analysis of randomized, controlled trials comparing the sirolimus-, paclitaxel-, everolimus-, and zotarolimus-eluting stents against each other and against bare-metal stents.

When compared with bare-metal stents, the sirolimus-, paclitaxel-, and everolimus-eluting stents reduced the relative risk of TVR by 53%, 31%, and 57%, respectively. The sirolimus-eluting stent was significantly more effective than the paclitaxel-eluting stent at reducing TVR, as was the everolimus-eluting stent. Overall, there was a 67% probability that the Endeavor Resolute zotarolimus-eluting stent (Medtronic) had the lowest risk of TVR, although the data are based on just one trial with 281 patients, the investigators note.

Median rates of efficacy (TVR) and definite/probable stent thrombosis, per 1,000 patient-years of follow-up:

Stent type              TVR rate  Definite/probable stent thrombosis
Bare metal                 64.00                               16.60
Sirolimus                  28.93                               15.75
Paclitaxel                 44.38                               18.46
Everolimus                 26.55                                6.54
Zotarolimus                59.01                               11.41
Zotarolimus (Resolute)     14.76                                 NA*

*No data on stent thrombosis were available for the Resolute stent.
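As a rough illustration of how such relative-risk figures relate to the median rates in the table, the crude rate ratios can be computed directly. This is only a sketch: the published estimates come from a network meta-analysis, so the crude figures will not match the reported values exactly.

```python
# Approximate relative risk reductions implied by the median event rates
# above (per 1,000 patient-years). These crude ratios only roughly track
# the network meta-analysis estimates reported in the article.

RATES = {
    # stent: (TVR rate, definite/probable stent thrombosis rate)
    "bare metal": (64.00, 16.60),
    "sirolimus": (28.93, 15.75),
    "paclitaxel": (44.38, 18.46),
    "everolimus": (26.55, 6.54),
}

def relative_risk_reduction(treatment_rate, reference_rate):
    """Percent reduction in event rate versus the reference."""
    return (1 - treatment_rate / reference_rate) * 100

bms_tvr = RATES["bare metal"][0]
for stent, (tvr, _) in RATES.items():
    if stent == "bare metal":
        continue
    print(f"{stent}: TVR reduction vs bare metal = "
          f"{relative_risk_reduction(tvr, bms_tvr):.0f}%")
```

The crude figures (about 55%, 31%, and 59%) land close to the reported 53%, 31%, and 57% reductions, as expected for this kind of back-of-the-envelope check.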

When compared with bare-metal stents, the everolimus-eluting stent reduced the risk of any stent thrombosis by 58%. The Xience stent was also associated with statistically significant 62% and 61% reductions in the risk of stent thrombosis compared with the first-generation Cypher and Taxus stents, respectively.

Bangalore said that a previous patient-level analysis conducted by Dr Giuseppe De Luca (Ospedale Maggiore della Carità, Novara, Italy), reported by heartwire at that time, showed there was a significant 50% increase in the risk of late (more than one year) reinfarction with drug-eluting stents and an almost doubling of very late stent thrombosis with first-generation stents. In this newest meta-analysis, however, the researchers did not observe a similarly increased risk of very late stent thrombosis with the everolimus-eluting stent.

“Based on the totality of data, I would say that it’s time the guidelines make drug-eluting stents and especially the everolimus-eluting stent a class I indication in STEMI patients who can take dual antiplatelet therapy,” said Bangalore.

Source

  1. Bangalore S, Amoroso N, Fusaro M, Kumar S, Feit F. Outcomes with various drug-eluting or bare-metal stents in patients with ST-segment elevation myocardial infarction. Circ Cardiovasc Interv 2013; DOI:10.1161/CIRCINTERVENTIONS.113.000415. Available at: http://circinterventions.ahajournals.org.

 

New Drug-Eluting Stent Works Well in STEMI

By Michael Smith, North American Correspondent, MedPage Today

Published: August 21, 2012

Reviewed by Robert Jasmer, MD; Associate Clinical Professor of Medicine, University of California, San Francisco and Dorothy Caputo, MA, BSN, RN, Nurse Planner


 A new-generation biodegradable drug-eluting stent had a lower rate of major cardiac events than similar bare-metal devices, researchers reported.

In a randomized trial, patients with ST-segment elevation myocardial infarction (STEMI) needed fewer revascularization procedures and had a lower risk of a new heart attack in the target blood vessel, according to Stephan Windecker, MD, of Bern University Hospital in Bern, Switzerland, and colleagues.

On the other hand, rates of cardiac death were not significantly different, Windecker and colleagues reported in the Aug. 22/29 issue of the Journal of the American Medical Association.

Drug-eluting stents have been shown to reduce the need for repeat revascularization, compared with bare-metal stents, but at the cost of delayed healing, chronic inflammation, and late stent thrombosis, the researchers noted.

The long-term effects result from the persistence of the polymer, Windecker and colleagues noted — something that might be avoided by using a biodegradable polymer.

The biodegradable BioMatrix Flex stent, which delivers the immunosuppressant drug biolimus, was non-inferior in a 4-year trial to the sirolimus-eluting Cypher stent, which does not break down over time.

But it had not been tested against bare-metal stents. To help fill the gap, Windecker and colleagues studied 1-year outcomes in 1,161 STEMI patients randomly assigned to get either the biolimus-eluting biodegradable stent or a similar bare-metal device.

The primary endpoint of the trial was the 1-year rate of major adverse cardiac events — a composite of cardiac death, target vessel-related re-infarction, and ischemia-driven target-lesion revascularization.

Windecker and colleagues found that 24 patients (4.3%) with biodegradable stents had a major adverse cardiac event at 1 year, compared with 49 (8.7%) who were given the bare-metal devices (HR 0.49, 95% CI 0.30 to 0.80, P=0.004).

The difference was driven by a lower risk of two of the elements of the combined endpoint: target vessel-related reinfarction and ischemia-driven target-lesion revascularization. Specifically:

  • Three patients getting the biodegradable stent (0.5%) had a re-infarction related to the target vessel, compared with 15 (2.7%) of those with bare-metal devices (HR 0.20, 95% CI 0.06 to 0.69, P=0.01).
  • Nine patients (1.6%) with biodegradable stents and 32 (5.7%) with bare-metal devices needed target-lesion revascularization (HR 0.28, 95% CI 0.13 to 0.59, P<0.001).
  • Rates of cardiac death were numerically lower, but not significantly so, in the biodegradable stent patients — 16 deaths, or 2.9%, versus 20, or 3.5%.
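A quick sanity check, under the assumption that crude risk ratios from the reported 1-year percentages approximate the hazard ratios (the published HRs come from time-to-event models, so exact agreement is not guaranteed):

```python
# Crude risk ratios implied by the reported 1-year event percentages
# (biodegradable-polymer drug-eluting stent vs. bare metal).
events = {
    # endpoint: (drug-eluting %, bare-metal %)
    "major adverse cardiac events": (4.3, 8.7),
    "target vessel reinfarction": (0.5, 2.7),
    "target-lesion revascularization": (1.6, 5.7),
}

for endpoint, (des, bms) in events.items():
    print(f"{endpoint}: crude risk ratio = {des / bms:.2f}")
```

Here the crude ratios (0.49, 0.19, and 0.28) happen to sit very close to the reported hazard ratios of 0.49, 0.20, and 0.28.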

Definite stent thrombosis occurred in five patients treated with the drug-eluting stents and 12 patients with bare-metal stents, but the difference did not reach significance.

The findings should be “reassuring” to both doctors and patients, Windecker said in a video released by the journal.

The study is “a well-done trial with convincing results regarding its primary end point,” commented Salvatore Cassese, MD, and Adnan Kastrati, MD, both of the Technische Universität in Munich, Germany.

But, in an accompanying editorial, they argued that it still may not settle the question of long-term complications.

Despite “positive signals,” they wrote, the study has “neither the required sample size nor the sufficient length of follow-up to provide the definitive answer about the long-term safety” of the new biodegradable drug-eluting stents.

The researchers cautioned that the biodegradable drug-eluting stent is not yet approved in the U.S., although European authorities have given it the nod.

They also noted that the study, while demonstrating superiority on the overall endpoint, did not have sufficient statistical power to address the individual components definitively.

The study had support from the Swiss National Science Foundation and Biosensors Europe SA. Windecker reported financial links through his institution with Abbott, Boston Scientific, Biosensors, Biotronik, Cordis, Medtronic, and St. Jude Medical.

The editorial authors reported support from the European Commission. Kastrati reported holding a patent related to polymer-free sirolimus and probucol coating, as well as financial links with Abbott, Biosensors, Cordis, and Medtronic.


 

How Genome Sequencing is Revolutionizing Clinical Diagnostics, from the ISMB Conference

Reporter: Aviva Lev-Ari, PhD, RN

 

Wednesday, August 8, 2012

Written by Filipe J. Ribeiro

Filipe J. Ribeiro is a Bioinformatics Scientist at the New York Genome Center.

Recently, I attended the 20th Annual Conference of Intelligent Systems for Molecular Biology (July 15-17, 2012), organized by the International Society for Computational Biology. The conference focuses on the application of computational, statistical, and mathematical methods to biological systems. I also attended the High Throughput Sequencing Methods and Applications (HiTSeq) satellite meeting (July 13-14, 2012). There, the speakers addressed the opportunities and challenges presented by the availability of increasingly large genomic datasets from next-generation sequencing.

Many topics were discussed during the two days of HiTSeq, such as new data-analysis methods for RNA sequencing data, methods for improving de novo assemblies, and sequencing-data compression. What impressed me the most were the keynote addresses given by Dr. Stanley Nelson, from the Jonsson Comprehensive Cancer Center at UCLA, and Dr. Gholson Lyon, from Cold Spring Harbor Laboratory. Both speakers focused on how whole-exome and whole-genome sequencing are on the verge of revolutionizing clinical diagnosis of genetic disorders and what challenges need to be addressed before sequencing penetrates the clinic.

Dr. Nelson’s talk centered on the use of exome sequencing in the clinical diagnosis of genetic conditions. He presented a few case studies of young children with various rare developmental delays. Rare conditions can be hard to diagnose, and often numerous tests need to be performed before a conclusion is reached, if a conclusion is reached at all. Also, some conditions are caused by a variety of different mutations to a single gene. These are harder to detect with conventional targeted genetic testing, which relies on known mutations. With exome sequencing, a single test is performed; that one test identifies all coding mutations, known and unknown, simple and complex. Even when there is no smoking gun in the large set of mutations typically found in any single individual, the genotype can be reanalyzed at a later point, in light of new research findings.

However, challenges in genomics-based diagnosis remain. Dr. Nelson reported that in roughly 50 percent of cases studied clinically at UCLA, a known causal mutation is found. In 25 percent of cases, a novel genetic mutation is identified that is potentially causal, and in the remaining 25 percent of cases no conclusion can be drawn. Because of the large number of novel mutations present in any single individual’s genome, establishing causality of novel variations is often very hard, and care must be taken when interpreting results in order to avoid false positives. To minimize the risk of misdiagnosis in a clinical setting, it is essential to have a board of scientists and clinicians review the conclusions of sequencing tests to ensure their validity.

Another challenge is what to do with secondary or unrelated findings—for example when a patient comes in with a set of symptoms indicative of one condition, and the genetic test finds a different one that is unrelated and asymptomatic. Some conditions (like Huntington’s disease) have no cure, and the patient might not want to learn about any diagnoses that are not actionable. A great deal of care must be taken both before and after genetic testing takes place so that patients understand the risks and the meaning of results.

On a slightly different note, Dr. Lyon focused on the ethical difficulties of returning research-grade results on genetic disorders to study participants. As an example he presented the case of a family that carries a genetic mutation that is fatal in boys at a very early age. A mutation was identified and shown to be causal in a research setting. The ethical dilemma for the researcher is: if one of the women in the family is pregnant with a boy, should she be informed of her carrier status? Research standards are not at the same level as clinical ones, and research results can at times be wrong.

It is not an easy question. Dr. Lyon’s suggestion is that research-grade whole-genome and whole-exome sequencing of study participants should be conducted under the same CLIA-certified standards as clinical tests, with the goal of returning research results to the study participants. Again, counseling and education of study participants regarding the risks and benefits of genetic testing are critical.

One barrier to the adoption of sequencing in a clinical setting is the fact that insurance companies do not cover the costs of whole genome sequencing as they are not yet convinced of the benefits. But that attitude will hopefully change as sequencing costs keep decreasing, and success stories abound. Soon it will be clear that genome sequencing is cost effective in disease diagnosis, prevention, and treatment. Also, for the most part genome sequencing is done only once in a lifetime, and therefore it is not a repetitive cost. (Cancer is an exception; one might want to sequence the cancer cells to identify which specific mutations are driving the tumor and to what drugs the tumor might respond.)

In summary, both speakers painted a picture of how whole-genome and whole-exome sequencing is quickly proving itself as a revolutionary tool in the clinic. Clearly, challenges remain: test interpretation must be done carefully, ideally by a board of both scientists and clinicians, and strict CLIA standards should be in place, even in a research setting. But it is certainly clear that next-generation sequencing will play an increasingly significant role in the clinic and, most importantly, in our health.

 

http://blog.nygenome.org/

 


Reporter and Curator: Dr. Sudipta Saha, Ph.D.

Coronary artery bypass grafting (CABG) or percutaneous transluminal coronary angioplasty (PTCA) is used to reconstitute flow into post-stenotic, chronically underperfused myocardium. This post-stenotic myocardium consists of connective tissue scars, dying cardiomyocytes, and hypo-active or chronically hibernating myocardium with microvascular disturbances due to microthrombi and decreased capillary density. When bulk flow is successfully reconstituted by CABG or PTCA, the gradual recovery of microcirculation is considered decisive for the recovery of mechanical function of surviving post-stenotic myocardium.

Traditionally, neovascularization of disturbed microcirculation was considered to result exclusively from the proliferation, migration, and remodeling of fully differentiated endothelial cells (ECs) derived from pre-existing blood vessels. Recently, however, it was demonstrated that circulating, bone marrow-derived endothelial progenitor cells (EPCs) may home to sites of postnatal neovascularization and differentiate into ECs in situ, a process called “vasculogenesis”.

Vascular trauma, as it occurs during surgical procedures, or inflammation leads to a cascade of events that results in the chemoattraction of inflammatory cells or other cell types to the site of injury. These blood-borne cells produce pro-angiogenic factors that, in turn, attract other cell types such as circulating EPCs. Systemic inflammatory responses have been described after cardiac surgery with cardiopulmonary bypass (CPB). Contact of the blood components with the artificial surface of the extracorporeal circuit, ischemia-reperfusion injury, endotoxemia, and operative trauma are possible causes for this phenomenon. Trauma has been considered the critical stimulus for the mobilization of EPCs and proangiogenic vascular endothelial growth factor (VEGF) during CABG.
It has been speculated that this mobilization may contribute to the revascularization of injured tissue, which would be of great clinical relevance for a successful outcome of CABG. Presently, there is a strong trend to perform CABG in patients of advanced age. However, experimental data indicate impaired neoangiogenesis in ischemic tissues and impaired re-endothelialization of vascular lesions as a function of advanced age. The mechanisms for this age-dependent impairment of vascular repair are largely unknown. Therefore, we analyzed the influence of age on CABG-induced mobilization of EPCs and cytochemokines with angiogenesis-modulating potential in a cohort of consecutive patients with stable coronary artery disease (CAD) scheduled for elective CABG. Probably, several types of endothelial precursor or progenitor cells have angiogenic potential after homing into traumatic tissue. Therefore, we used two phenotypic markers (CD34 and AC133 or CD133), which are expressed in all EPC types, but in the case of AC133, not in differentiated ECs. This was done to exclude from our analysis any mature or dying ECs with doubtful angiogenic capacity, potentially released from damaged vessels in old patients. In this analysis, we demonstrate that the preoperative number of circulating EPCs in patients with stable CAD is reduced with increasing age, together with decreased plasma VEGF levels. During CABG, mobilization of circulating EPCs could be detected in all patients, but this mobilization remained on a persistently lower level in the older patient group, suggesting that the responsiveness for mobilization of EPCs is impaired with age. Optimized strategies for ex-vivo expansion of those cells might be especially required in the elderly, if transplantation of these cells into poststenotic tissue will develop as a future co-therapy to existing interventions of revascularization.

This study demonstrated that the basal number of circulating EPCs in patients with stable CAD is decreased with increasing age. Furthermore, plasma VEGF levels are reduced with increasing age. This age-associated decrease could not be explained by higher prevalences of other risk factors, such as male gender, diabetes mellitus, hypertension, or hyperlipoproteinemia at older ages, nor by any differences in left ventricular function or New York Heart Association classes. The operative trauma of complex cardiac surgery with CPB induced a mobilization in EPCs/lymphocytes and EPCs/blood in all patients, but this mobilization remained at a persistently lower level in the older patient group, which could not be explained by any differences in the operative procedure (time on CPB, cross-clamping time, or number of grafts) or in the operation-induced increase in cytochemokines with a reported potency for modulation of angiogenesis (IL-6, IL-8, and IL-10). Similar age-associated losses in the number of circulating endothelial-related progenitor cells in patients undergoing CABG have not been reported so far.

In 45 male subjects without a history of cardiovascular disease, the Framingham risk score and impairment of endothelium-mediated, flow-dependent brachial artery dilation were strong predictors of depressed numbers of circulating progenitor cells with colony-forming capacity. In a mixed group of healthy probands and CAD patients, Vasa et al. reported age-associated losses in circulating cells positive for CD34 and KDR, which may include progenitor cells and mobilized ECs. In their cohort, smoking was a strong predictor of lowered values of CD34+/KDR+ cells, independent of age. In the patients, self-reported smoking status, which is notoriously unreliable before cardiac surgery, could not be verified by interrogations of spouses or relatives. This may explain why we could not detect an effect of smoking on circulating EPCs.
The reasons for the age-associated losses in circulating EPCs remain unknown at present. In this study, there was an age-independent correlation of circulating EPCs with plasma VEGF levels. Circulating or transplanted EPCs contribute to post-ischemic neovascularization in animal experiments and patients, and angiogenic factors like VEGF and PlGF are involved in this neovascularization. In animals, advanced age is associated with attenuated post-ischemic neovascularization and attenuated local induction of VEGF. Similarly, arterial re-endothelialization after vascular trauma is attenuated in old animals in which trauma-induced local VEGF expression is lower and local VEGF supplementation rescues vascular healing. Experimental elevation of plasma VEGF in mice by inoculation with adenoviral vectors induced rapid mobilization of endothelial precursor cells. Therefore, it is tempting to propose that lowered VEGF levels in the elderly patients are the reason for lowered circulating EPCs. However, this causality remains to be proven for basal steady-state levels, and the cause of depressed circulating VEGF levels in elderly patients remains unknown. In experimental studies, hypoxia-inducible factor-1 stabilization by hypoxia, which mediates hypoxic VEGF expression, is attenuated in cells from old animals. This might be relevant for the attenuated and retarded CABG-induced activation of plasma VEGF in older patients. However, this may be less relevant for the age-associated lowering in basal VEGF levels before surgery. Local and systemic inflammation by vascular trauma is considered an important contributor of post-ischemic neovascularization. In the patients, the operative trauma resulted in a substantial mobilization of cytochemokines with angiogenesis-modulating potential, except for IL-18 and PlGF. These observations are in agreement with previous reports. 
Although none of these activated factors could be directly correlated with the individual increase in EPCs during and after the operation, it is reasonable to assume that the complex spectrum of inflammatory activation contributes to the mobilization of surgery-induced EPCs. Similar conclusions have been derived from observations on transient mobilization of KDR+/AC133+ cells in patients after burns or CABG. The kinetics of mobilization in that study differed somewhat from our observations, but the two studies are not directly comparable owing to differences in progenitor cell analysis. It is remarkable that the substantial inflammatory activation during surgery in our study could not abolish the age-associated differences in EPC levels. The decline in the fraction of lymphocytes/leukocytes after CPB, down to one-quarter of the baseline value at 12 h after CPB, most likely reflects substantial homing of lymphocytes into tissues in response to the systemic inflammatory activation induced by CPB. Homing must also contribute to the decline of circulating EPCs/blood during this time. Therefore, circulating EPC levels underestimate the amount of EPC mobilization. However, quantification of lymphocyte or EPC homing could not be obtained in our patients.

Application of different populations of EPCs or other mononuclear bone marrow cells improves post-ischemic organ function and microcirculation in animals and patients. However, it is not clear whether the surgery-induced mobilization of EPCs is sufficient for such a contribution. Furthermore, it is unclear whether the EPCs in elderly patients have the same angiogenic potential as those of younger patients. EPCs collected from the circulation can be amplified in vitro. The co-application of amplified EPCs, together with native artery recanalization or bypass grafting, will probably develop as a therapeutic option in the future.
The experimental data suggested that such co-therapy is especially desirable in elderly patients. The data suggested that mobilization of such cells for therapeutic application might be more difficult with an increasing age of patients. The inflammatory activation by complex CABG does not offset this age-associated lowering. Therefore, further studies are required for a better understanding of optimized strategies for recruitment, ex-vivo expansion, and retransplantation strategies involving EPCs in aging patients.

Abbreviations and Acronyms:

APC = allophycocyanin
CABG = coronary artery bypass grafting
CAD = coronary artery disease
CPB = cardiopulmonary bypass
EC = endothelial cell
EPC = endothelial progenitor cell
KDR = kinase insert domain containing receptor
IL = interleukin
PlGF = placental growth factor
PTCA = percutaneous transluminal coronary angioplasty
VEGF = vascular endothelial growth factor

Source reference:

http://www.sciencedirect.com/science/article/pii/S0735109703012725


Reporter: Aviva Lev-Ari, PhD, RN

 

Synthetic Biology

This collection aims to highlight PLOS ONE’s role in the emerging interdisciplinary field of synthetic biology. The collection has its roots in PLOS ONE’s very first issue, which included two publications from the field; since then, the number of synthetic biology articles published by the journal has grown steadily. As the field continues to develop, this collection will be updated to include new publications, thereby tracking the evolution of this dynamic research area.

Synthetic biology occurs at the intersection of a number of traditional disciplines, including biology, chemistry, and engineering. It aims to create biological systems that can be programmed to do useful things such as producing drugs and biofuel. The interdisciplinary nature of synthetic biology can make it difficult to publish in traditional journals. PLOS ONE’s broad scope, however, allows for the publication of work crossing many traditional research boundaries, making it an ideal venue for many different types of synthetic biology publications. In addition, the journal’s focus on rigorous peer review without considering impact has made it possible to publish a body of articles that truly reflects the multifaceted nature of this research area.

One overarching theme of synthetic biology is standardization, which can only be achieved through concerted community effort. To this end, each article published in PLOS ONE can be the start of a lively conversation. The ability to comment on articles provides the community with a means to engage in a dialogue focused on specific articles, and the “Share this Article” feature allows readers to quickly send an article they find interesting to their entire networks, because all the content is openly accessible.

Articles in the Synthetic Biology Collection are presented in order of publication date and new articles will be added as they are published. PLOS ONE welcomes submissions in this field.

Collection Citation: Synthetic Biology (2012) PLOS Collections: http://www.ploscollections.org/syntheticbiology

Image Credit: Ivan Morozov (Virginia Bioinformatics Institute)

SOURCE

http://www.ploscollections.org/article/browseIssue.action?issue=info:doi/10.1371/issue.pcol.v02.i18

PLOS ONE Launches Synthetic Biology Collection

By Rachel Bernstein
Posted: August 15, 2012

Today PLOS ONE is happy to announce the launch of the Synthetic Biology Collection, including over 50 papers published in the last six years that illustrate the many facets of this dynamically evolving research area.

Synthetic biology is an innovative emerging field at the intersection of many traditional disciplines, including biology, chemistry, and engineering; it aims to create biological systems that can be programmed to do useful things like produce drugs or biofuels, among other applications. Despite its potential, the heavily interdisciplinary nature of the research can make it difficult to publish in traditional discipline-specific journals.

However, PLOS ONE’s broad scope allows for the publication of work crossing many traditional research boundaries, making it an ideal venue for many different types of synthetic biology research. For example, the papers in the collection cover topics including DNA synthesis and assembly, standardized biological “parts” akin to interchangeable mechanical parts, protein engineering, and complex network and pathway analysis and modeling, as described in the Collection Overview written by collection editors Jean Peccoud of Virginia Tech and Mark Isalan of the Centre for Genomic Regulation.

The Collection has roots in PLOS ONE’s very first issue, which included two publications from the field. Since then, the number of synthetic biology articles published in the journal has grown steadily. The collection launched today highlights selected synthetic biology articles published in PLOS ONE since 2006, and it is intended to be a growing resource that will be updated regularly with new papers as the field continues to grow and develop.

Collection Citation: Synthetic Biology (2012) PLOS Collections: http://www.ploscollections.org/syntheticbiology

Image Credit: Ivan Morozov (Virginia Bioinformatics Institute)

SOURCE

http://blogs.plos.org/everyone/2012/08/15/plos-one-launches-synthetic-biology-collection/

The PLOS ONE Synthetic Biology Collection: Six Years and Counting 

Jean Peccoud (1,2,*), Mark Isalan (3)

1 Virginia Bioinformatics Institute, Virginia Tech, Blacksburg, Virginia, United States of America, 2 Center for Systems Biology of Engineered Tissues, Institute for Critical Technologies and Applied Science, Virginia Tech, Blacksburg, Virginia, United States of America, 3 EMBL/CRG Systems Biology Research Unit, Centre for Genomic Regulation (CRG) and UPF, Barcelona, Spain

Abstract 

Since it was launched in 2006, PLOS ONE has published over fifty articles illustrating the many facets of the emerging field of synthetic biology. This article reviews these publications by organizing them into broad categories focused on DNA synthesis and assembly techniques, the development of libraries of biological parts, the use of synthetic biology in protein engineering applications, and the engineering of gene regulatory networks and metabolic pathways. Finally, we review articles that describe enabling technologies such as software and modeling, along with new instrumentation. In order to increase the visibility of this body of work, the papers have been assembled into the PLOS ONE Synthetic Biology Collection (www.ploscollections.org/synbio). Many of the innovative features of the PLOS ONE web site will help make this collection a resource that will support a lively dialogue between readers and authors of PLOS ONE synthetic biology papers. The content of the collection will be updated periodically by including relevant articles as they are published by the journal. Thus, we hope that this collection will continue to meet the publishing needs of the synthetic biology community.

Citation: Peccoud J, Isalan M (2012) The PLOS ONE Synthetic Biology Collection: Six Years and Counting. PLoS ONE 7(8): e43231. doi:10.1371/journal.pone.0043231

Editor: Wei Ning Chen, Nanyang Technological University, Singapore

 

Received: May 23, 2012; Accepted: July 16, 2012; Published: August 15, 2012

Copyright: © 2012 Peccoud, Isalan. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: JP is supported by National Science Foundation Awards 0850100 and 0963988 and by grants R01-GM078989 and R01-GM095955 from the National Institutes of Health. MI is funded by FP7 ERC 201249 ZINC-HUBS, Ministerio de Ciencia e Innovacion grant MICINN BFU2010-17953 and the MEC-EMBL agreement. The funders had no role in the preparation of the manuscript.

Competing interests: The authors have declared that no competing interests exist.

* E-mail: peccoud@vt.edu

Introduction

Synthetic biology is an emerging transdisciplinary field at the intersection of many engineering and scientific disciplines, such as biology, chemical engineering, chemistry, electrical engineering, and computer science. The scientific milestone that inspired the development of synthetic biology is often regarded as the description of two artificial gene networks in the same issue of Nature in 2000 [1][2]. However, the year 2004 marks the emergence of synthetic biology as a scientific community. This is the year of the first synthetic biology conference, the first iGEM competition, where students compete to build biological systems (http://igem.org/), and the creation of the synthetic biology page on Wikipedia. Two years later, the first issue of PLOS ONE included two synthetic biology articles [3][4], marking the beginning of a trend. Since then, PLOS ONE has published a large number of articles covering all aspects of the field. Synthetic biologists resolutely push the limits of their specialties in ways that few established journals have been able to appreciate. Since the result is often more "how to build something that works" than primary biological insight, the papers can be hard to place in classical journals. Many synthetic biology authors have benefited from the innovative PLOS ONE editorial policy of publishing scientifically sound research irrespective of its anticipated significance.

The purpose of this article is to introduce the PLOS ONE Synthetic Biology Collection (www.ploscollections.org/synbio/). The collection highlights selected synthetic biology articles published in PLOS ONE since 2006, putting them together in one place for easy perusal. The website is intended to be a growing resource that will be updated regularly.

We review the collection here by organizing it into some broad categories: DNA synthesis and assembly, Biological parts, Protein engineering, Networks and pathways, Synthetic life, Software and modeling, and Instruments. The classification is our own; since many synthetic biology papers cited in this review span more than one category, it was sometimes difficult to assign them to one category rather than another. Nonetheless, this structure should aid in navigating the 50+ papers currently included in the collection.

Summary of Papers Included in the Collection 

DNA Synthesis and Assembly

Synthetic biology projects often begin with the assembly of complicated, multi-component gene constructs. Therefore, both DNA assembly and cloning technologies are critical enablers of synthetic biology. Not surprisingly, many recent PLOS ONE papers propose methods to improve the efficiency of the fabrication step of synthetic biology projects. For example, Golden Gate Cloning [5] is a one-step DNA assembly protocol that can join at least nine distinct DNA fragments into one plasmid vector. The technique employs type IIs restriction enzymes that cut DNA at some distance from their cognate DNA-binding site, thus allowing flexibility and uniqueness in the compatible sticky ends that are generated. A related technique is GoldenBraid Assembly [6], that also uses type IIs restriction enzymes, but applies them iteratively to standardized DNA parts (see the ‘Biological parts’ section below). This allows the indefinite growth of reusable gene modules. Similarly, type IIs restriction enzymes have been used to make a hierarchical modular cloning system aimed at making eukaryotic multigene constructs [7].
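Golden Gate assembly works because each type IIs digestion exposes a designer 4-nt overhang, and fragments ligate in a fixed order only if every overhang in the set is unique and non-self-complementary. As a rough illustration of that design constraint (this is a sketch, not code from [5]), a short script can screen a candidate overhang set:

```python
# Illustrative sketch: check that a set of 4-nt Golden Gate overhangs is
# usable, i.e. every overhang is unique and no overhang equals the reverse
# complement of itself or of another overhang (either case would allow
# fragments to ligate in an unintended order).

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq: str) -> str:
    return seq.translate(COMPLEMENT)[::-1]

def overhangs_compatible(overhangs: list[str]) -> bool:
    seen = set()
    for oh in overhangs:
        rc = reverse_complement(oh)
        # A palindromic overhang (oh == rc) can ligate to itself; a clash
        # with a previously seen overhang or its complement breaks ordering.
        if oh == rc or oh in seen or rc in seen:
            return False
        seen.add(oh)
    return True

print(overhangs_compatible(["AATG", "GCTT", "CGAA"]))  # True: a safe set
print(overhangs_compatible(["GATC", "AATG"]))          # False: GATC is palindromic
```

Real overhang design also weighs ligation fidelity between near-matching ends, but uniqueness and non-palindromicity are the minimal requirements sketched here.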

‘One-pot’ assembly and cloning systems are being developed by many groups, and the ideal systems use as few standardized components as possible. Circular polymerase extension cloning (CPEC) fits into this category, using a single polymerase to assemble and clone multiple inserts with any vector in a one-step in vitro reaction [8]. Alternatively, successive hybridization assembly (SHA) also employs a single in vitro reaction [9].

As well as cloning one desired multi-component construct, many projects require degenerate cloning or mutagenesis to make combinatorial libraries of gene variants. The OmniChange technique has therefore been developed to generate full-length gene libraries in which five independent codons are simultaneously saturated with degenerate NNK codons, while avoiding PCR amplification [10]. Large libraries of genetic sequences can also be derived from oligonucleotides synthesized on a microarray and later pooled into libraries from which more complex sequences can be derived [11]. By combining linear DNA amplification and PCR, DNA libraries with hundreds to thousands of members can be synthesized.
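The NNK degeneracy used in such libraries (N = A/C/G/T, K = G/T) is popular for saturation mutagenesis because its 32 codons encode all 20 amino acids while admitting only a single stop codon. A quick sketch against the standard genetic code makes this explicit:

```python
# Illustrative sketch: enumerate the 32 NNK codons (N = A/C/G/T, K = G/T)
# and verify, via the standard genetic code, that they cover all 20 amino
# acids while admitting only one stop codon (TAG).
from itertools import product

# Standard genetic code, codon -> one-letter amino acid ('*' = stop),
# listed in the conventional T/C/A/G table order.
BASES = "TCAG"
AMINO = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
         "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODE = {a + b + c: aa for (a, b, c), aa in
        zip(product(BASES, BASES, BASES), AMINO)}

nnk = [a + b + c for a in "ACGT" for b in "ACGT" for c in "GT"]
residues = {CODE[codon] for codon in nnk}

print(len(nnk))                             # 32 codons
print(len(residues - {"*"}))                # 20 amino acids covered
print([c for c in nnk if CODE[c] == "*"])   # ['TAG'] — the only stop
```

Restricting the third base to G/T is what removes the TAA and TGA stops while still reaching every residue, which is why NNK libraries waste far fewer clones than fully random NNN libraries.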

PCR methods themselves can have certain limitations, such as difficulties in amplifying GC-rich DNA targets. One study optimized polymerase chain assembly (PCA) and ligase chain reaction (LCR) methods for the construction of two GC-rich gene fragments implicated in tumorigenesis, IGF2R and BRAF [12]. They found that LCR was superior and benefited from the addition of DMSO and betaine.

The many synthesis and assembly methods presented in the collection can be combined to streamline the fabrication steps of synthetic biology projects, by producing collections of standardized biological parts. Standard parts are themselves a distinctive feature of synthetic biology, as reviewed below.

Biological Parts

The Registry of Standard Biological Parts (www.partsregistry.org), based on the original vision of Tom Knight, is providing a rich collection of components for synthetic biology projects. Several articles in the PLOS ONE collection reflect the importance of this resource. For example, a global analysis of the Registry clone collection [13] helped identify certain discrepancies between the sequences recorded in the database and the physical sequences of some clones in the collection. These results prompted a change in the quality control of the submissions to the Registry that has greatly improved the overall quality of the collection. Moreover, the analysis of parts usage patterns led to organizational guidelines that may help design and manage these new types of scientific resources. As most parts in the registry are for prokaryotes, a eukaryotic collection of 52 parts was developed and is available for distribution [14]. This includes multiple cloning sites (MCS), common protein tags, protein reporters and selection markers, amongst others. Furthermore, most of the parts were designed in a format to allow fusions that maintain the reading frame.

As well as standardized coding regions, synthetic biology projects require well-characterized promoters to achieve desired expression strengths. In one study, a single yeast promoter was mutated to make a promoter library with a finely graded range of output strengths [15]. Synthetic Transcription Activator-Like orthogonal repressors were then developed to control expression from these promoters. Such orthogonality or ‘non-cross-reactivity’ is necessary for engineering larger synthetic gene circuits that do not interfere with the physiology of the biological chassis in which they operate. Mammalian synthetic promoters have also been developed by analyzing motifs found in highly active human promoters. Thus, by modulating the content of GC- and CpG-rich sequences, custom-designed promoters were obtained [16].

Finally, entirely de novo parts that are found nowhere in nature have been engineered to slot into biological systems. Using E. coli lacking conditionally essential genes, entirely new functional proteins were obtained from scaffolds of randomized 4-helix bundles, rescuing stalled growth [17]. Similarly, a synthetic ATP-binding protein, evolved entirely from non-natural sequences, was expressed in E. coli, altering the levels of intracellular ATP [18]. Protein engineering approaches are thus a potential source of many new parts, as well as forming a branch of synthetic biology in their own right.

Protein Engineering

Protein engineering can take many forms, from directed evolution methods to protein design. The PLOS ONE Synthetic Biology Collection includes a wide range of studies in this broad field.

Phage display is one of the classic tools of protein engineering, allowing combinatorial libraries of randomized proteins to be selected from the surface of bacteriophages. Phage display was used to generate a new class of binding proteins targeted to the pointed-end of actin [19]. These proteins, called synthetic antigen binders (sABs), were based on an antibody-like scaffold where sequence diversity is introduced into the binding loops using a new “reduced genetic code” phage display library.

An example of targeted protein design was the dual reporter Gemini [20]. Here, the β-galactosidase (β-gal) α-fragment was fused to GFP, resulting in increased β-gal activity and somewhat decreased GFP sensitivity. GFP was also modified in a study where the ten proline residues of enhanced green fluorescent protein (EGFP) were replaced by (4R)- and (4S)-fluoroprolines (FPro) [21]. In this way, protein folding and stability could be tuned.

A promising advance in the field of engineering custom sequence-specific DNA-binding proteins is the use of Transcription Activator-Like (TAL) proteins. Modular TAL units specify A, C, G or T and can be concatenated to make long designer DNA-binding domains. Thus, Golden TAL Technology [22] has adapted Golden Gate Cloning [5] for engineering new TAL proteins. These were shown to function in human and plant cells and to target activation of both exogenous and endogenous genes, after fusion with a VP16 activation domain.

As well as single proteins, entire pathways can nowadays be engineered. Computational redesign was used to create new periplasmic binding proteins in plants, to act as biosensors in combination with a histidine kinase signaling cascade [23]. This resulted in transcription factor activation and ‘de-greening’ of plants in response to small-molecule stimuli. As can be seen from this example and the ones below, the move from single protein engineering to network engineering is one of the main driving forces in synthetic biology.

Networks and Pathways

One of the first, and now most-cited, synthetic biology papers in PLOS ONE was the study on fitness-induced attractor selection [3]. Here, a synthetic mutual inhibition gene network was built in E. coli, with two states, green (GFP) and red (RFP), that were mutually exclusive. By attaching a fitness pressure to one of the states (i.e. a gene required for growth in the absence of glutamine), the authors demonstrated that the cells switched stochastically into the fittest state, restoring growth. In other words, by changing to a glutamine-free medium, the red cells switched to green, even in the absence of formal signaling machinery. This work has important messages for potential new mechanisms in gene regulation, where underlying fitness pressures can ultimately determine how much a gene is expressed, simply according to need.
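The mutual-exclusion behavior described above is the hallmark of a bistable toggle. A deliberately simplified ODE sketch (this is not the model used in [3], which involves fitness-driven stochastic switching) shows how mutual repression with cooperativity yields two stable states:

```python
# Illustrative sketch: a mutual repression toggle switch,
#   du/dt = a / (1 + v**n) - u,   dv/dt = a / (1 + u**n) - v.
# With cooperative repression (n > 1) the circuit is bistable: the same
# network settles into a "u-high" or "v-high" state depending on where
# it starts, mirroring the mutually exclusive green/red states.

def simulate(u, v, a=10.0, n=2.0, dt=0.01, steps=5000):
    """Integrate the toggle ODEs with simple forward Euler steps."""
    for _ in range(steps):
        du = a / (1 + v**n) - u
        dv = a / (1 + u**n) - v
        u, v = u + dt * du, v + dt * dv
    return u, v

u1, v1 = simulate(u=5.0, v=0.1)  # start biased toward the "u" state
u2, v2 = simulate(u=0.1, v=5.0)  # start biased toward the "v" state
print(u1 > v1, u2 < v2)  # opposite stable states from opposite starts
```

In the attractor-selection experiment, a fitness pressure plus noise is what lets cells hop between such basins; the deterministic sketch above only shows that the two basins exist.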

Other small bacterial networks have been built to include a heritable sequential memory switch, using the fim and hin inversion recombination systems [24], and an E. coli strain for use as a ‘chemical recording device’ [25]. In the latter, the authors created a synthetic chemically sensitive genetic toggle switch that activates the appropriate fluorescent protein indicator (GFP or RFP) along with a cell division inhibitor (minC). Moving to yeast, one example of network engineering was the reconstruction of a human p53-Mdm2 negative feedback module in S. cerevisiae [26]. In this example, many aspects of p53 regulation in mammals were maintained, such as Mdm2-dependent targeting of p53 for degradation, sumoylation at lysine 386 and further regulation of this process by p14ARF. In mammalian systems, a synthetic tetracycline regulator positive feedback loop was stably integrated and yielded a bimodal expression response, because such cells can only be “OFF” or “ON” [27].

One unusual work in synthetic biology aimed to rewire and control cell shape in yeast, by changing the inputs into the α-factor pathway [28]. This pathway can give rise to multiple mating projections, upon prolonged activation. The authors tested genetic manipulations that ultimately gave rise to single or multiple projections, in the absence of the natural input, α-factor.

A group of papers in the collection explore ‘synthetic ecology’, where consortia of different cells interact to give patterns at a population level. For example, by engineering two strains of E. coli, one study was able to achieve synthetic biofilms with spatial self-organization [29]. The consortia achieved defined layered structures and had unexpected growth advantages. A second paper describes a system composed of two quorum-sensing signal transduction circuits that allowed the authors to build a synthetic ecosystem where the population dynamics could be tuned by varying the environmental signals [30]. Third, quorum components were also used in a study which generated robust but unexpected oscillations in E. coli by building synthetic suicide circuits [31]. In fact, the quorum components proved to be unnecessary to achieve oscillations: there was a density-dependent plasmid amplification that gave rise to population-level negative feedback, ultimately resulting in the cycles. As in other areas of synthetic biology, the process of building systems often leads to surprises which can result in useful new engineering tools, or to a better understanding of the underlying biological processes [32].

Pathway engineering for the synthesis of useful chemicals and products is a major field within synthetic biology. For example, an engineered yeast that efficiently secretes penicillin was built by transplanting synthesis pathway components into a host that is better suited for pharmaceutical production [33]. Artemisinin derivatives are key components of malaria therapies, and their synthesis is a high-profile goal of synthetic biology because extraction from slow-growing plants currently limits supply. Consequently, one study achieved high-level production of an artemisinin precursor in E. coli [34]. Another striking synthesis paper demonstrates a synthetic enzymatic pathway consisting of 13 enzymes for high-yield hydrogen production from starch and water [35]. Building such large systems is extremely challenging; as a result, these articles have received a lot of attention.

Synthetic Life

Synthetic life is among the most controversial of synthetic biology aims, and has received a lot of attention, even in the mainstream press. Public concerns of possible biological threats resulting from the misuse of these technologies prompted the development of new biosecurity policies [36].

One branch of this field is the de novo chemical synthesis and assembly of whole plasmids, viruses and genomes, which are then transplanted into host cells. The pX1.0 plasmid is an example of a fully chemically synthesized plasmid designed by calculating consensus sequences from 8 plasmids [37], while removing genes involved in antibiotic resistance and virulence. The plasmid not only replicated in E. coli, but could also self-transfer by conjugation into two other enterobacterial species. A chemical synthesis approach was also used to construct whole genomes of bacteriophage G4 (around 10 kilobases in length), resulting in infectious viruses that could pass from one strain of E. coli to another [38].

One group has the ambitious long-term aim of building a synthetic chloroplast, and has begun by transplanting photosynthetic bacteria into eukaryotic cells to see whether they can achieve synthetic symbiosis [39]. Remarkably, the authors showed that some cyanobacteria were relatively harmless in zebrafish embryos, compared to E. coli. Furthermore, by engineering invasins into the cyanobacteria, they were able to invade and divide inside mammalian macrophages. Synthetic biology is only limited by our imagination, and one can speculate that entire free-living synthetic lifeforms could find their place in the collection in the not-too-distant future.

Software and Modeling

As the number of biological parts for synthetic biology increases, databases and design methods must evolve. For example, to help researchers search and retrieve biological parts, the Knowledgebase of Standard Biological Parts (SBPkb) is a Semantic Web resource for synthetic biology [40].

The collection also includes two articles presenting Computer Assisted Design software tools. Eugene is a human readable language to specify synthetic biological designs based on biological parts. It also provides a very expressive constraint system to drive the automatic creation of composite parts or devices from a collection of individual parts [41]. Alternatively, the Proto platform also provides a high-level biologically-oriented programming language [42]. Specifications are compiled from regulatory motifs, optimized, then converted into computational simulations for numerical verification.
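The idea behind such constraint systems can be illustrated with a toy rule checker: a device is a typed list of parts, and declarative rules prune invalid orderings from the combinatorial design space before any DNA is assembled. This sketch is purely illustrative and is not Eugene or Proto syntax:

```python
# Illustrative sketch of constraint-driven design in the spirit of
# rule-based tools such as Eugene (not actual Eugene syntax): structural
# rules filter candidate composite devices built from typed parts.
from itertools import permutations

def valid_device(parts: list[str]) -> bool:
    # Rule 1: a device starts with a promoter and ends with a terminator.
    if parts[0] != "promoter" or parts[-1] != "terminator":
        return False
    # Rule 2: every coding sequence is immediately preceded by an RBS.
    return all(parts[i - 1] == "rbs"
               for i, p in enumerate(parts) if p == "cds")

parts = ["promoter", "rbs", "cds", "terminator"]
designs = {p for p in permutations(parts) if valid_device(list(p))}
print(designs)  # only the correctly ordered device survives the rules
```

Of the 24 possible orderings of these four parts, the two rules leave exactly one, which is the essence of using an expressive constraint system to drive automatic composition of devices from individual parts.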

Ultimately the design tools are only as good as the underlying mathematical models they rely on to make predictions of design behaviors. The collection includes a number of articles applying mathematical modeling approaches rooted in various engineering specialties to the design of synthetic genetic constructs.

Modeling gene networks is at the interface of systems and synthetic biology, and many PLOS ONE modeling papers aim to guide bioengineering projects. A recent example of adapting modeling for re-engineering properties into a system used a standardized synthetic yeast network from the In-vivo Reverse-engineering and Modeling Assessment (IRMA) [43]. Reverse engineering itself was used in a study that ultimately provided guidelines for chemotaxis pathway redesign [44]. Statecharts are used to describe dynamical systems but had not previously been applied to gene networks; by doing so explicitly, one study was able to model network motifs and combine them into a complicated interlocked feed-forward loop network [45].

Two-component systems are common regulatory motifs in bacteria, and comprise a kinase that senses environmental signals together with a regulator that mediates the cell response. A recent study asked the question, “what happens if you add a third component that interacts with either of the other two?” [46]. Estimating the parameter space associated with a particular function is very valuable for guiding synthetic engineering approaches, as is determining whether a function is theoretically possible at all. For example, using a geometric argument, it was shown that, surprisingly, even monomer regulators can achieve bistability. This demonstrates the possibility of switch-like behavior in feedback autoloops without resorting to multimer regulators [47].

Figure 1. Historical distribution of synthetic biology articles published by PLOS ONE.

This figure reports the number of articles in the collection published between 2006 and 2011. It shows a rapid growth of synthetic biology that reflects the growth of the journal and the increased familiarity of synthetic biologists with PLOS ONE.

doi:10.1371/journal.pone.0043231.g001

By combining experiments and computation, one study was able to derive design algorithms for altering synonymous codons in proteins, resulting in drastic expression differences of the same protein sequence [48]. For example, with DNA polymerase and single chain antibodies, expression could be predictably tuned to obtain concentrations ranging from undetectable to 30% of cellular protein. Importantly, using partial least squares regression, the authors noticed that favorable codons were predominantly those read by tRNAs that are most highly charged during amino acid starvation, not codons that are most abundant in highly expressed E. coli proteins. This is an important discovery for building genetic constructs that express appropriately inside the target cells.

Computation is a key function of biological networks and several studies in the collection present schemes to achieve this. The first is implemented at the level of chemical reactions and describes functions such as an inverter, an incrementer, a decrementer, a copier, a comparator, a multiplier, an exponentiator, a raise-to-a-power operation, and a logarithm in base two [49]. A key simplification is that the scheme uses only two reaction rates (“fast” and “slow”). A second study models a synthetic gene network to perform frequency multiplication [50]. Both of these studies assume deterministic relationships between inputs and outputs. Recently, the deterministic assumption has been challenged by experimental and theoretical works analyzing the importance of noise in the dynamics of gene networks [51]. This trend is illustrated in the collection by an article demonstrating that reliable timing of decision-making processes (choosing between multistable states) can be accomplished for large enough population sizes, as long as cells are globally coupled by chemical means [52]. Modeling can often reveal subtle non-intuitive designs, and, as a means of guiding synthetic biology, is likely to become an even larger field in the future.

Figure 2. Relationships between article-level metrics.

For articles published between 2006 and 2009, there is a positive correlation between the number of times an article is cited in the scientific literature and the number of times it is viewed (A). For articles published between 2010 and 2012, there is a positive relationship between the number of views and the number of citations in the Mendeley social network (B). Metrics, such as number of views and citations in social media, give readers and authors an estimate of the scientific impact of individual articles well before they receive citations in scientific literature.

doi:10.1371/journal.pone.0043231.g002

Instruments

Nowadays, new technology and machinery are important driving forces for both primary biological discovery and synthetic biology. A neat example is the use of inkjet printer technology to make low-cost, high-resolution tools: a bacterial piezoelectric inkjet printer was designed to print out different strains of bacteria or chemicals in small droplets onto a flat surface at high resolution [53]. Another group used an inkjet for continuous dosing of diffusible regulators to a gel culture of E. coli, allowing 2D spatiotemporal regulation [54]. Precise spatiotemporal control of cells can also be achieved with microfluidics, and a recent report grew dividing yeast cells in a remarkable planar array [55]. Transient pulses of gene expression could be triggered by briefly inducing the GAL1 or MET3 promoters, resulting in coherent induction of cell division across the cell cluster. Other novel culture systems presented in the collection include a 3-D cell culture system using a self-assembling designer peptide nanofiber scaffold [4]. The peptide could be linked to functional motifs for cell adhesion, differentiation, and bone marrow homing for use with mouse adult neural stem cells.

The Synthetic Biology Collection: A Dynamic Community Resource

It is remarkable that the collection includes several articles originating from engineers and computer scientists who traditionally publish their work in conference proceedings rather than the journals available to life scientists. PLOS ONE’s indifference to subject matter made it possible to publish an unprecedented body of articles that reflects the multi-faceted nature of synthetic biology. No less remarkable is the observation that PLOS ONE published several articles originating from iGEM projects [13][41][56].

Since 2006, the number of synthetic biology articles published by the journal has been growing steadily (Figure 1). This evolution is consistent with the social trends in synthetic biology that have been mapped in an interesting bibliometric analysis included in the collection [57]. This is an indication that the synthetic biology community is becoming more aware of the services provided by the journal. Looking forward, the collection will make it easier to identify synthetic biology articles among the quickly growing volume of articles published by the journal each day. The content of the collection will be updated periodically as new synthetic biology articles are published by the journal.

Although Journal Impact Factors are a widely discredited means of evaluating the quality of individual papers, all too often they are still used. Thus, it is imperative to find a better alternative. One of the most exciting features of the PLOS ONE web site is the Metrics tab, displaying article-based metrics that can be used to assess the impact of individual articles. These metrics naturally include traditional indicators, such as the number of citations. The two articles of the collection published in 2006 have been cited 70 and 84 times so far. Almost all the articles published in 2007 and 2008 have received more than 10 citations. The lag between the publication of an article and its citation by others is well known. Fortunately, the Metrics tab also includes more innovative indicators that give authors and readers alike a real-time estimate of the ‘impact’ of an article. The number of times an article is viewed is an important indicator. Since PLOS ONE is an online journal, all readers view articles online in one way or another. As a result, we hypothesized that the number of times an article was viewed should be a good predictor of the number of citations it will receive. Using data reported in Table S1, we analyzed the relationship between views and citation numbers for articles included in the collection that were published between 2006 and 2009. Figure 2 shows that there is a positive correlation between the two metrics. That relationship does not hold when including more recent articles because of a difference in timing between viewing and citing activities. Articles typically receive a substantial number of views in the first few months after publication, but it takes a few years before they are cited. The 20 articles of the collection published in 2011 have recorded a lot of views, but have not had time to be cited in the literature yet.
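The view-to-citation comparison underlying Figure 2 amounts to a correlation analysis. A minimal sketch with invented numbers (the real data are in the paper's Table S1) shows how a Pearson coefficient captures that positive relationship:

```python
# Illustrative sketch: Pearson correlation between article views and
# citations. These numbers are invented for illustration only; the
# actual article-level data are reported in Table S1.
from math import sqrt

views     = [1200, 3400, 5600, 8000, 15000, 22000]   # hypothetical
citations = [   3,   10,   14,   25,    40,    70]   # hypothetical

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(views, citations)
print(round(r, 3))  # strongly positive for this toy data
```

The same computation applied to older articles only (where citations have had time to accumulate) versus recent ones is what reveals the timing effect the authors describe.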

A non-conventional form of citations displayed in the Metrics tab is the number of times an article is bookmarked in social media. We have reported the Mendeley (www.mendeley.com) data in Table S1. Figure 2 shows that there is a positive relationship between the number of views and the number of times articles are bookmarked in this network, at least for the most recent articles of the collection. Older articles are under-represented in Mendeley because this network was not available at the time these articles were published. It will be interesting to see if citations of the collection articles in social media will be a better predictor of citations in the scientific literature than the number of views.

One overarching theme of synthetic biology is standardization [58][59], which can only be achieved through concerted efforts by members of the community. The field has therefore been deeply influenced by the development of resources such as the Registry of Standard Biological Parts (www.partsregistry.org). More recently, the development of SBOL, the Synthetic Biology Open Language (www.sbolstandard.org), illustrates the need to agree on data formats suitable for the development of the software tool chains necessary to support experimental efforts. Each article published in PLOS ONE can be the start of a lively conversation. The journal web site provides authors and readers alike with a detailed vision of community connections. The “Share this article” feature allows readers to quickly send an article they find interesting to their networks. The comments tab of the articles provides the community with a means to engage in a dialogue focused on specific articles [5][35][48][55]. This feature can also be used by authors to provide updated information about the work presented in the article [13].

When working at its best, science should be an active conversation that keeps refining ideas. We believe that PLOS ONE provides the ideal venue to achieve this, and we hope that the collection will inspire further progress in synthetic biology. Ultimately, we hope that having a clear repository in PLOS ONE should further increase its attractiveness as a home for publishing synthetic biology.

Table S1.

Article-level statistics for the Synthetic Biology Collection.

(XLSX)

Author Contributions

Wrote the paper: JP MI.

References

  1. Elowitz MB, Leibler S (2000) A synthetic oscillatory network of transcriptional regulators. Nature 403: 335–338.
  2. Gardner TS, Cantor CR, Collins JJ (2000) Construction of a genetic toggle switch in Escherichia coli. Nature 403: 339–342.
  3. Kashiwagi A, Urabe I, Kaneko K, Yomo T (2006) Adaptive Response of a Gene Network to Environmental Changes by Fitness-Induced Attractor Selection. PLoS ONE 1: e49.
  4. Gelain F, Bottai D, Vescovi A, Zhang S (2006) Designer Self-Assembling Peptide Nanofiber Scaffolds for Adult Mouse Neural Stem Cell 3-Dimensional Cultures. PLoS ONE 1: e119.
  5. Engler C, Gruetzner R, Kandzia R, Marillonnet S (2009) Golden Gate Shuffling: A One-Pot DNA Shuffling Method Based on Type IIs Restriction Enzymes. PLoS ONE 4: e5553.
  6. Sarrion-Perdigones A, Falconi E, Zandalinas S, Juárez P, Fernández-del-Carmen A, et al. (2011) GoldenBraid: An Iterative Cloning System for Standardized Assembly of Reusable Genetic Modules. PLoS ONE 6: e21622.
  7. Weber E, Engler C, Gruetzner R, Werner S, Marillonnet S (2011) A Modular Cloning System for Standardized Assembly of Multigene Constructs. PLoS ONE 6: e16765.
  8. Quan J, Tian J (2009) Circular Polymerase Extension Cloning of Complex Gene Libraries and Pathways. PLoS ONE 4: e6441.
  9. Jiang X, Yang J, Zhang H, Zou H, Wang C, et al. (2012) In Vitro Assembly of Multiple DNA Fragments Using Successive Hybridization. PLoS ONE 7: e30267.
  10. Dennig A, Shivange A, Marienhagen J, Schwaneberg U (2011) OmniChange: The Sequence Independent Method for Simultaneous Site-Saturation of Five Codons. PLoS ONE 6: e26222.
  11. Svensen N, Díaz-Mochón JJ, Bradley M (2011) Microarray Generation of Thousand-Member Oligonucleotide Libraries. PLoS ONE 6: e24906.
  12. Jensen M, Fukushima M, Davis R (2010) DMSO and Betaine Greatly Improve Amplification of GC-Rich Constructs in De Novo Synthesis. PLoS ONE 5: e11024.
  13. Peccoud J, Blauvelt MF, Cai Y, Cooper KL, Crasta O, et al. (2008) Targeted Development of Registries of Biological Parts. PLoS ONE 3: e2671.
  14. Constante M, Grünberg R, Isalan M (2011) A Biobrick Library for Cloning Custom Eukaryotic Plasmids. PLoS ONE 6: e23685.
  15. Blount BA, Weenink T, Vasylechko S, Ellis T (2012) Rational Diversification of a Promoter Providing Fine-Tuned Expression and Orthogonal Regulation for Synthetic Biology. PLoS ONE 7: e33279.
  16. Grabherr M, Pontiller J, Mauceli E, Ernst W, Baumann M, et al. (2011) Exploiting Nucleotide Composition to Engineer Promoters. PLoS ONE 6: e20136.
  17. Fisher M, McKinley K, Bradley L, Viola S, Hecht M (2011) De Novo Designed Proteins from a Library of Artificial Sequences Function in Escherichia Coli and Enable Cell Growth. PLoS ONE 6: e15364.
  18. Stomel J, Wilson J, León M, Stafford P, Chaput J (2009) A Man-Made ATP-Binding Protein Evolved Independent of Nature Causes Abnormal Growth in Bacterial Cells. PLoS ONE 4: e7385.
  19. Brawley C, Uysal S, Kossiakoff A, Rock R (2010) Characterization of Engineered Actin Binding Proteins That Control Filament Assembly and Structure. PLoS ONE 5: e13960.
  20. Martin L, Che A, Endy D (2009) Gemini, a Bifunctional Enzymatic and Fluorescent Reporter of Gene Expression. PLoS ONE 4: e7569.
  21. Steiner T, Hess P, Bae JH, Wiltschi B, Moroder L, et al. (2008) Synthetic Biology of Proteins: Tuning GFPs Folding and Stability with Fluoroproline. PLoS ONE 3: e1680.
  22. Weber E, Gruetzner R, Werner S, Engler C, Marillonnet S (2011) Assembly of Designer TAL Effectors by Golden Gate Cloning. PLoS ONE 6: e19722.
  23. Antunes M, Morey K, Smith J, Albrecht K, Bowen T, et al. (2011) Programmable Ligand Detection System in Plants through a Synthetic Signal Transduction Pathway. PLoS ONE 6: e16292.
  24. Ham T, Lee S, Keasling J, Arkin A (2008) Design and Construction of a Double Inversion Recombination Switch for Heritable Sequential Genetic Memory. PLoS ONE 3: e2815. FIND THIS ARTICLE ONLINE
  25. Bhomkar P, Materi W, Wishart D (2011) The Bacterial Nanorecorder: Engineering E. coli to Function as a Chemical Recording Device. PLoS ONE 6: e27559. FIND THIS ARTICLE ONLINE
  26. Di Ventura B, Funaya C, Antony C, Knop M, Serrano L (2008) Reconstitution of Mdm2-Dependent Post-Translational Modifications of p53 in Yeast. PLoS ONE 3: e1507. FIND THIS ARTICLE ONLINE
  27. May T, Eccleston L, Herrmann S, Hauser H, Goncalves J, et al. (2008) Bimodal and Hysteretic Expression in Mammalian Cells from a Synthetic Gene Circuit. PLoS ONE 3: e2372. FIND THIS ARTICLE ONLINE
  28. Tanaka H, Yi T-M (2009) Synthetic Morphology Using Alternative Inputs. PLoS ONE 4: e6946.FIND THIS ARTICLE ONLINE
  29. Brenner K, Arnold F (2011) Self-Organization, Layered Structure, and Aggregation Enhance Persistence of a Synthetic Biofilm Consortium. PLoS ONE 6: e16791. FIND THIS ARTICLE ONLINE
  30. Hu B, Du J, Zou R-y, Yuan Y-j (2010) An Environment-Sensitive Synthetic Microbial Ecosystem. PLoS ONE 5: e10619. FIND THIS ARTICLE ONLINE
  31. Marguet P, Tanouchi Y, Spitz E, Smith C, You L (2010) Oscillations by Minimal Bacterial Suicide Circuits Reveal Hidden Facets of Host-Circuit Physiology. PLoS ONE 5: e11909. FIND THIS ARTICLE ONLINE
  32. Elowitz M, Lim WA (2010) Build life to understand it. Nature 468: 889–890. FIND THIS ARTICLE ONLINE
  33. Gidijala L, Kiel J, Douma R, Seifar R, van Gulik W, et al. (2009) An Engineered Yeast Efficiently Secreting Penicillin. PLoS ONE 4: e8317. FIND THIS ARTICLE ONLINE
  34. Tsuruta H, Paddon C, Eng D, Lenihan J, Horning T, et al. (2009) High-Level Production of Amorpha-4,11-Diene, a Precursor of the Antimalarial Agent Artemisinin, in Escherichia coli. PLoS ONE 4: e4489. FIND THIS ARTICLE ONLINE
  35. Zhang P, Evans B, Mielenz J, Hopkins R, Adams M (2007) High-Yield Hydrogen Production from Starch and Water by a Synthetic Enzymatic Pathway. PLoS ONE 2: e456. FIND THIS ARTICLE ONLINE
  36. Adam L, Kozar M, Letort G, Mirat O, Srivastava A, et al. (2011) Strengths and limitations of the federal guidance on synthetic DNA. Nat Biotechnol 29: 208–210. FIND THIS ARTICLE ONLINE
  37. Hansen L, Bentzon-Tilia M, Bentzon-Tilia S, Norman A, Rafty L, et al. (2011) Design and Synthesis of a Quintessential Self-Transmissible IncX1 Plasmid, pX1.0. PLoS ONE 6: e19912. FIND THIS ARTICLE ONLINE
  38. Yang R, Han Y, Ye Y, Liu Y, Jiang Z, et al. (2011) Chemical Synthesis of Bacteriophage G4. PLoS ONE 6: e27062. FIND THIS ARTICLE ONLINE
  39. Agapakis C, Niederholtmeyer H, Noche R, Lieberman T, Megason S, et al. (2011) Towards a Synthetic Chloroplast. PLoS ONE 6: e18877. FIND THIS ARTICLE ONLINE
  40. Galdzicki M, Rodriguez C, Chandran D, Sauro H, Gennari J (2011) Standard Biological Parts Knowledgebase. PLoS ONE 6: e17005. FIND THIS ARTICLE ONLINE
  41. Bilitchenko L, Liu A, Cheung S, Weeding E, Xia B, et al. (2011) Eugene – A Domain Specific Language for Specifying and Constraining Synthetic Biological Parts, Devices, and Systems. PLoS ONE 6: e18882. FIND THIS ARTICLE ONLINE
  42. Beal J, Lu T, Weiss R (2011) Automatic Compilation from High-Level Biologically-Oriented Programming Language to Genetic Regulatory Networks. PLoS ONE 6: e22490. FIND THIS ARTICLE ONLINE
  43. Marucci L, Barton D, Cantone I, Ricci M, Cosma M, et al. (2009) How to Turn a Genetic Circuit into a Synthetic Tunable Oscillator, or a Bistable Switch. PLoS ONE 4: e8083. FIND THIS ARTICLE ONLINE
  44. Luo J, Wang J, Ma T, Sun Z (2010) Reverse Engineering of Bacterial Chemotaxis Pathway via Frequency Domain Analysis. PLoS ONE 5: e9182. FIND THIS ARTICLE ONLINE
  45. Shin Y-J, Nourani M (2010) Statecharts for Gene Network Modeling. PLoS ONE 5: e9376. FIND THIS ARTICLE ONLINE
  46. Salvado B, Vilaprinyo E, Karathia H, Sorribas A, Alves R (2012) Two Component Systems: Physiological Effect of a Third Component. PLoS ONE 7: e31095. FIND THIS ARTICLE ONLINE
  47. Widder S, Macía J, Solé R (2009) Monomeric Bistability and the Role of Autoloops in Gene Regulation. PLoS ONE 4: e5399. FIND THIS ARTICLE ONLINE
  48. Welch M, Govindarajan S, Ness J, Villalobos A, Gurney A, et al. (2009) Design Parameters to Control Synthetic Gene Expression in Escherichia coli. PLoS ONE 4: e7002. FIND THIS ARTICLE ONLINE
  49. Senum P, Riedel M (2011) Rate-Independent Constructs for Chemical Computation. PLoS ONE 6: e21414. FIND THIS ARTICLE ONLINE
  50. Purcell O, di Bernardo M, Grierson C, Savery N (2011) A Multi-Functional Synthetic Gene Network: A Frequency Multiplier, Oscillator and Switch. PLoS ONE 6: e16140. FIND THIS ARTICLE ONLINE
  51. Balazsi G, van Oudenaarden A, Collins JJ (2011) Cellular decision making and biological noise: from microbes to mammals. Cell 144: 910–925. FIND THIS ARTICLE ONLINE
  52. Koseska A, Zaikin A, Kurths J, García-Ojalvo J (2009) Timing Cellular Decision Making Under Noise via Cell–Cell Communication. PLoS ONE 4: e4872. FIND THIS ARTICLE ONLINE
  53. Merrin J, Leibler S, Chuang JS (2007) Printing multistrain bacterial patterns with a piezoelectric inkjet printer. PLoS ONE 2: e663. FIND THIS ARTICLE ONLINE
  54. Cohen D, Morfino R, Maharbiz M (2009) A Modified Consumer Inkjet for Spatiotemporal Control of Gene Expression. PLoS ONE 4: e7086. FIND THIS ARTICLE ONLINE
  55. Charvin G, Cross FR, Siggia ED (2008) A microfluidic device for temporally controlled gene expression and long-term fluorescent imaging in unperturbed dividing yeast cells. PLoS One 3: e1468. FIND THIS ARTICLE ONLINE
  56. Hesselman MC, Odoni DI, Ryback BM, de Groot S, van Heck RG, et al. (2012) A multi-platform flow device for microbial (co-) cultivation and microscopic analysis. PLoS One 7: e36982. FIND THIS ARTICLE ONLINE
  57. Oldham P, Hall S, Burton G (2012) Synthetic biology: mapping the scientific landscape. PLoS One 7: e34368. FIND THIS ARTICLE ONLINE
  58. Endy D (2005) Foundations for engineering biology. Nature 438: 449–453. FIND THIS ARTICLE ONLINE
  59. Canton B, Labno A, Endy D (2008) Refinement and standardization of synthetic biological parts and devices. Nat Biotechnol 26: 787–793. FIND THIS ARTICLE ONLINE
SOURCE

Research Articles

A Multi-Platform Flow Device for Microbial (Co-) Cultivation and Microscopic Analysis

Matthijn C. Hesselman, Dorett I. Odoni, Brendan M. Ryback, Suzette de Groot, Ruben G. A. van Heck, Jaap Keijsers, Pim Kolkman, David Nieuwenhuijse, Youri M. van Nuland, Erik Sebus, Rob Spee, Hugo de Vries, Marten T. Wapenaar, Colin J. Ingham, Karin Schroën, Vítor A. P. Martins dos Santos, Sebastiaan K. Spaans, Floor Hugenholtz, Mark W. J. van Passel

PLoS ONE:
Published 14 May 2012 | info:doi/10.1371/journal.pone.0036982

Synthetic Biology: Mapping the Scientific Landscape

Paul Oldham, Stephen Hall, Geoff Burton

PLoS ONE:
Published 23 Apr 2012 | info:doi/10.1371/journal.pone.0034368

Rational Diversification of a Promoter Providing Fine-Tuned Expression and Orthogonal Regulation for Synthetic Biology

Benjamin A. Blount, Tim Weenink, Serge Vasylechko, Tom Ellis

PLoS ONE:
Published 19 Mar 2012 | info:doi/10.1371/journal.pone.0033279

Two Component Systems: Physiological Effect of a Third Component

Baldiri Salvado, Ester Vilaprinyo, Hiren Karathia, Albert Sorribas, Rui Alves

PLoS ONE:
Published 17 Feb 2012 | info:doi/10.1371/journal.pone.0031095

In Vitro Assembly of Multiple DNA Fragments Using Successive Hybridization

Xinglin Jiang, Jianming Yang, Haibo Zhang, Huibin Zou, Cong Wang, Mo Xian

PLoS ONE:
Published 26 Jan 2012 | info:doi/10.1371/journal.pone.0030267

The Bacterial Nanorecorder: Engineering E. coli to Function as a Chemical Recording Device

Prasanna Bhomkar, Wayne Materi, David S. Wishart

PLoS ONE:
Published 23 Nov 2011 | info:doi/10.1371/journal.pone.0027559

Chemical Synthesis of Bacteriophage G4

Ruilin Yang, Yonghua Han, Yiwang Ye, Yuchen Liu, Zhimao Jiang, Yaoting Gui, Zhiming Cai

PLoS ONE:
Published 16 Nov 2011 | info:doi/10.1371/journal.pone.0027062

OmniChange: The Sequence Independent Method for Simultaneous Site-Saturation of Five Codons

Alexander Dennig, Amol V. Shivange, Jan Marienhagen, Ulrich Schwaneberg

PLoS ONE:
Published 19 Oct 2011 | info:doi/10.1371/journal.pone.0026222

Microarray Generation of Thousand-Member Oligonucleotide Libraries

Nina Svensen, Juan José Díaz-Mochón, Mark Bradley

PLoS ONE:
Published 23 Sep 2011 | info:doi/10.1371/journal.pone.0024906

A Biobrick Library for Cloning Custom Eukaryotic Plasmids

Marco Constante, Raik Grünberg, Mark Isalan

PLoS ONE:
Published 25 Aug 2011 | info:doi/10.1371/journal.pone.0023685

Automatic Compilation from High-Level Biologically-Oriented Programming Language to Genetic Regulatory Networks

Jacob Beal, Ting Lu, Ron Weiss

PLoS ONE:
Published 05 Aug 2011 | info:doi/10.1371/journal.pone.0022490

GoldenBraid: An Iterative Cloning System for Standardized Assembly of Reusable Genetic Modules

Alejandro Sarrion-Perdigones, Erica Elvira Falconi, Sara I. Zandalinas, Paloma Juárez, Asun Fernández-del-Carmen, Antonio Granell, Diego Orzaez

PLoS ONE:
Published 07 Jul 2011 | info:doi/10.1371/journal.pone.0021622

Rate-Independent Constructs for Chemical Computation

Phillip Senum, Marc Riedel

PLoS ONE:
Published 30 Jun 2011 | info:doi/10.1371/journal.pone.0021414

Assembly of Designer TAL Effectors by Golden Gate Cloning

Ernst Weber, Ramona Gruetzner, Stefan Werner, Carola Engler, Sylvestre Marillonnet

PLoS ONE:
Published 19 May 2011 | info:doi/10.1371/journal.pone.0019722

Design and Synthesis of a Quintessential Self-Transmissible IncX1 Plasmid, pX1.0

Lars H. Hansen, Mikkel Bentzon-Tilia, Sara Bentzon-Tilia, Anders Norman, Louise Rafty, Søren J. Sørensen

PLoS ONE:
Published 18 May 2011 | info:doi/10.1371/journal.pone.0019912

Exploiting Nucleotide Composition to Engineer Promoters

Manfred G. Grabherr, Jens Pontiller, Evan Mauceli, Wolfgang Ernst, Martina Baumann, Tara Biagi, Ross Swofford, Pamela Russell, Michael C. Zody, Federica Di Palma, Kerstin Lindblad-Toh, Reingard M. Grabherr

PLoS ONE:
Published 18 May 2011 | info:doi/10.1371/journal.pone.0020136

Eugene – A Domain Specific Language for Specifying and Constraining Synthetic Biological Parts, Devices, and Systems

Lesia Bilitchenko, Adam Liu, Sherine Cheung, Emma Weeding, Bing Xia, Mariana Leguia, J. Christopher Anderson, Douglas Densmore

PLoS ONE:
Published 29 Apr 2011 | info:doi/10.1371/journal.pone.0018882

Towards a Synthetic Chloroplast

Christina M. Agapakis, Henrike Niederholtmeyer, Ramil R. Noche, Tami D. Lieberman, Sean G. Megason, Jeffrey C. Way, Pamela A. Silver

PLoS ONE:
Published 20 Apr 2011 | info:doi/10.1371/journal.pone.0018877

Standard Biological Parts Knowledgebase

Michal Galdzicki, Cesar Rodriguez, Deepak Chandran, Herbert M. Sauro, John H. Gennari

PLoS ONE:
Published 24 Feb 2011 | info:doi/10.1371/journal.pone.0017005

A Modular Cloning System for Standardized Assembly of Multigene Constructs

Ernst Weber, Carola Engler, Ramona Gruetzner, Stefan Werner, Sylvestre Marillonnet

PLoS ONE:
Published 18 Feb 2011 | info:doi/10.1371/journal.pone.0016765

A Multi-Functional Synthetic Gene Network: A Frequency Multiplier, Oscillator and Switch

Oliver Purcell, Mario di Bernardo, Claire S. Grierson, Nigel J. Savery

PLoS ONE:
Published 17 Feb 2011 | info:doi/10.1371/journal.pone.0016140

Self-Organization, Layered Structure, and Aggregation Enhance Persistence of a Synthetic Biofilm Consortium

Katie Brenner, Frances H. Arnold

PLoS ONE:
Published 09 Feb 2011 | info:doi/10.1371/journal.pone.0016791

Programmable Ligand Detection System in Plants through a Synthetic Signal Transduction Pathway

Mauricio S. Antunes, Kevin J. Morey, J. Jeff Smith, Kirk D. Albrecht, Tessa A. Bowen, Jeffrey K. Zdunek, Jared F. Troupe, Matthew J. Cuneo, Colleen T. Webb, Homme W. Hellinga, June I. Medford

PLoS ONE:
Published 25 Jan 2011 | info:doi/10.1371/journal.pone.0016292

De Novo Designed Proteins from a Library of Artificial Sequences Function in Escherichia coli and Enable Cell Growth

Michael A. Fisher, Kara L. McKinley, Luke H. Bradley, Sara R. Viola, Michael H. Hecht

PLoS ONE:
Published 04 Jan 2011 | info:doi/10.1371/journal.pone.0015364

Characterization of Engineered Actin Binding Proteins That Control Filament Assembly and Structure

Crista M. Brawley, Serdar Uysal, Anthony A. Kossiakoff, Ronald S. Rock

PLoS ONE:
Published 12 Nov 2010 | info:doi/10.1371/journal.pone.0013960

Oscillations by Minimal Bacterial Suicide Circuits Reveal Hidden Facets of Host-Circuit Physiology

Philippe Marguet, Yu Tanouchi, Eric Spitz, Cameron Smith, Lingchong You

PLoS ONE:
Published 30 Jul 2010 | info:doi/10.1371/journal.pone.0011909

DMSO and Betaine Greatly Improve Amplification of GC-Rich Constructs in De Novo Synthesis

Michael A. Jensen, Marilyn Fukushima, Ronald W. Davis

PLoS ONE:
Published 11 Jun 2010 | info:doi/10.1371/journal.pone.0011024

An Environment-Sensitive Synthetic Microbial Ecosystem

Bo Hu, Jin Du, Rui-yang Zou, Ying-jin Yuan

PLoS ONE:
Published 12 May 2010 | info:doi/10.1371/journal.pone.0010619

Reverse Engineering of Bacterial Chemotaxis Pathway via Frequency Domain Analysis

Junjie Luo, Jun Wang, Ting Martin Ma, Zhirong Sun

PLoS ONE:
Published 09 Mar 2010 | info:doi/10.1371/journal.pone.0009182

Statecharts for Gene Network Modeling

Yong-Jun Shin, Mehrdad Nourani

PLoS ONE:
Published 23 Feb 2010 | info:doi/10.1371/journal.pone.0009376

An Engineered Yeast Efficiently Secreting Penicillin

Loknath Gidijala, Jan A. K. W. Kiel, Rutger D. Douma, Reza M. Seifar, Walter M. van Gulik, Roel A. L. Bovenberg, Marten Veenhuis, Ida J. van der Klei

PLoS ONE:
Published 15 Dec 2009 | info:doi/10.1371/journal.pone.0008317

How to Turn a Genetic Circuit into a Synthetic Tunable Oscillator, or a Bistable Switch

Lucia Marucci, David A. W. Barton, Irene Cantone, Maria Aurelia Ricci, Maria Pia Cosma, Stefania Santini, Diego di Bernardo, Mario di Bernardo

PLoS ONE:
Published 07 Dec 2009 | info:doi/10.1371/journal.pone.0008083

Gemini, a Bifunctional Enzymatic and Fluorescent Reporter of Gene Expression

Lance Martin, Austin Che, Drew Endy

PLoS ONE:
Published 04 Nov 2009 | info:doi/10.1371/journal.pone.0007569

A Man-Made ATP-Binding Protein Evolved Independent of Nature Causes Abnormal Growth in Bacterial Cells

Joshua M. Stomel, James W. Wilson, Megan A. León, Phillip Stafford, John C. Chaput

PLoS ONE:
Published 08 Oct 2009 | info:doi/10.1371/journal.pone.0007385

A Modified Consumer Inkjet for Spatiotemporal Control of Gene Expression

Daniel J. Cohen, Roberto C. Morfino, Michel M. Maharbiz

PLoS ONE:
Published 18 Sep 2009 | info:doi/10.1371/journal.pone.0007086

Design Parameters to Control Synthetic Gene Expression in Escherichia coli

Mark Welch, Sridhar Govindarajan, Jon E. Ness, Alan Villalobos, Austin Gurney, Jeremy Minshull, Claes Gustafsson

PLoS ONE:
Published 14 Sep 2009 | info:doi/10.1371/journal.pone.0007002

Synthetic Morphology Using Alternative Inputs

Hiromasa Tanaka, Tau-Mu Yi

PLoS ONE:
Published 10 Sep 2009 | info:doi/10.1371/journal.pone.0006946

Circular Polymerase Extension Cloning of Complex Gene Libraries and Pathways

Jiayuan Quan, Jingdong Tian

PLoS ONE:
Published 30 Jul 2009 | info:doi/10.1371/journal.pone.0006441

Golden Gate Shuffling: A One-Pot DNA Shuffling Method Based on Type IIs Restriction Enzymes

Carola Engler, Ramona Gruetzner, Romy Kandzia, Sylvestre Marillonnet

PLoS ONE:
Published 14 May 2009 | info:doi/10.1371/journal.pone.0005553

Monomeric Bistability and the Role of Autoloops in Gene Regulation

Stefanie Widder, Javier Macía, Ricard Solé

PLoS ONE:
Published 30 Apr 2009 | info:doi/10.1371/journal.pone.0005399

Timing Cellular Decision Making Under Noise via Cell–Cell Communication

Aneta Koseska, Alexey Zaikin, Jürgen Kurths, Jordi García-Ojalvo

PLoS ONE:
Published 13 Mar 2009 | info:doi/10.1371/journal.pone.0004872

High-Level Production of Amorpha-4,11-Diene, a Precursor of the Antimalarial Agent Artemisinin, in Escherichia coli

Hiroko Tsuruta, Christopher J. Paddon, Diana Eng, Jacob R. Lenihan, Tizita Horning, Larry C. Anthony, Rika Regentin, Jay D. Keasling, Neil S. Renninger, Jack D. Newman

PLoS ONE:
Published 16 Feb 2009 | info:doi/10.1371/journal.pone.0004489

Design and Construction of a Double Inversion Recombination Switch for Heritable Sequential Genetic Memory

Timothy S. Ham, Sung K. Lee, Jay D. Keasling, Adam P. Arkin

PLoS ONE:
Published 30 Jul 2008 | info:doi/10.1371/journal.pone.0002815

Targeted Development of Registries of Biological Parts

Jean Peccoud, Megan F. Blauvelt, Yizhi Cai, Kristal L. Cooper, Oswald Crasta, Emily C. DeLalla, Clive Evans, Otto Folkerts, Blair M. Lyons, Shrinivasrao P. Mane, Rebecca Shelton, Matthew A. Sweede, Sally A. Waldon

PLoS ONE:
Published 16 Jul 2008 | info:doi/10.1371/journal.pone.0002671

Bimodal and Hysteretic Expression in Mammalian Cells from a Synthetic Gene Circuit

Tobias May, Lee Eccleston, Sabrina Herrmann, Hansjörg Hauser, Jorge Goncalves, Dagmar Wirth

PLoS ONE:
Published 04 Jun 2008 | info:doi/10.1371/journal.pone.0002372

Synthetic Biology of Proteins: Tuning GFPs Folding and Stability with Fluoroproline

Thomas Steiner, Petra Hess, Jae Hyun Bae, Birgit Wiltschi, Luis Moroder, Nediljko Budisa

PLoS ONE:
Published 27 Feb 2008 | info:doi/10.1371/journal.pone.0001680

Reconstitution of Mdm2-Dependent Post-Translational Modifications of p53 in Yeast

Barbara Di Ventura, Charlotta Funaya, Claude Antony, Michael Knop, Luis Serrano

PLoS ONE:
Published 30 Jan 2008 | info:doi/10.1371/journal.pone.0001507

A Microfluidic Device for Temporally Controlled Gene Expression and Long-Term Fluorescent Imaging in Unperturbed Dividing Yeast Cells

Gilles Charvin, Frederick R. Cross, Eric D. Siggia

PLoS ONE:
Published 23 Jan 2008 | info:doi/10.1371/journal.pone.0001468

Printing Multistrain Bacterial Patterns with a Piezoelectric Inkjet Printer

Jack Merrin, Stanislas Leibler, John S. Chuang

PLoS ONE:
Published 25 Jul 2007 | info:doi/10.1371/journal.pone.0000663

High-Yield Hydrogen Production from Starch and Water by a Synthetic Enzymatic Pathway

Y.-H. Percival Zhang, Barbara R. Evans, Jonathan R. Mielenz, Robert C. Hopkins, Michael W.W. Adams

PLoS ONE:
Published 23 May 2007 | info:doi/10.1371/journal.pone.0000456

Designer Self-Assembling Peptide Nanofiber Scaffolds for Adult Mouse Neural Stem Cell 3-Dimensional Cultures

Fabrizio Gelain, Daniele Bottai, Angelo Vescovi, Shuguang Zhang

PLoS ONE:
Published 27 Dec 2006 | info:doi/10.1371/journal.pone.0000119

Adaptive Response of a Gene Network to Environmental Changes by Fitness-Induced Attractor Selection

Akiko Kashiwagi, Itaru Urabe, Kunihiko Kaneko, Tetsuya Yomo

PLoS ONE:
Published 20 Dec 2006 | info:doi/10.1371/journal.pone.0000049

SOURCE

Read Full Post »

 

Tufts Health Plan to Cover Sequenom’s MaterniT21, Pathwork’s Tissue of Origin Tests

Reporter: Aviva Lev-Ari, PhD, RN

http://www.genomeweb.com/mdx/tufts-health-plan-cover-sequenoms-maternit21-pathworks-tissue-origin-tests

NEW YORK (GenomeWeb News) – Tufts Health Plan will begin covering Sequenom’s MaterniT21 Plus trisomy 21 test and Pathwork Diagnostics‘ Tissue of Origin test starting Oct. 1.

In an update to providers posted on its website, the health plan said it may authorize coverage of the MaterniT21 test for plan members who are at least 35 years old when they give birth; who have a fetal aneuploidy screening result, including maternal serum screening and/or ultrasound evaluation, that indicates the possibility of trisomy 21; or who have a family history or prior pregnancy involving aneuploidy.
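The criteria above amount to a simple any-of rule. As a purely illustrative sketch (the field names below are hypothetical, not Tufts Health Plan's actual claims schema):

```python
def maternit21_covered(maternal_age_at_delivery: int,
                       positive_aneuploidy_screen: bool,
                       family_history_of_aneuploidy: bool) -> bool:
    """Any-of check mirroring the stated coverage criteria:
    member is at least 35 at delivery, OR a serum/ultrasound screen
    suggests trisomy 21, OR there is a family history or prior
    pregnancy involving aneuploidy. Illustrative only."""
    return (maternal_age_at_delivery >= 35
            or positive_aneuploidy_screen
            or family_history_of_aneuploidy)

print(maternit21_covered(36, False, False))  # True: advanced maternal age alone qualifies
print(maternit21_covered(29, False, False))  # False: none of the criteria are met
```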

In a research note Oppenheimer analyst David Ferreiro said that Tufts Health Plan has approximately 1 million lives under coverage and a network of 90 hospitals and 25,000 healthcare providers.

“We view this decision as an incremental positive for [Sequenom] and as validation of the value proposition MaterniT21 presents to payors,” he said. “The adoption rate is encouraging and could positively impact payor decisions, further entrenching,” the company.

Two weeks ago, Sequenom said that in the second quarter revenues from its Sequenom Center for Molecular Medicine diagnostic services rose five-fold to $8.1 million, driven by the MaterniT21 Plus test, which was launched in the fall. The test also detects trisomies 18 and 13 (T18 and T13).

As adoption of the test continues to ramp up, the San Diego-based company raised its internal goal for billed MaterniT21 Plus tests in 2012 to 50,000 from an earlier goal of 40,000.

The company has stopped announcing coverage decisions by individual plans following an incident in the spring in which Coventry Health Care National Network terminated a coverage decision for MaterniT21 Plus one week after Sequenom said that Coventry would cover the test. Sequenom said at the time that Coventry’s decision was without cause and was not a judgment on the company, Sequenom CMM, or its products.

In a statement today to GenomeWeb Daily News, a Sequenom spokesperson declined to disclose the terms of the contract with Tufts Health Plan. She said that Sequenom CMM has more than 26 million lives under contract, and “we operate as an out-of-network laboratory where we are not yet contracted and bill payors accordingly.”

Tufts Health Plan also said that it will begin coverage of Pathwork Diagnostics’ Pathwork Tissue of Origin test, beginning on Oct. 1. The test is for the identification of challenging tumors, including poorly differentiated, undifferentiated, and metastatic cancers.

The plan said it may authorize coverage of the test if it is ordered by an oncologist and the plan member is diagnosed with metastatic cancer; the clinical evaluation has not identified the primary site of the cancer; the pathology report is submitted to Tufts Health Plan for review; and the pathology examination is unable to conclusively identify the primary site, or has identified two or more possible primary sites.

Use of the test to confirm a diagnosis will not be covered by the health plan.

 

 

Read Full Post »

Reporter: Aviva Lev-Ari, PhD, RN

Regulus Therapeutics and UC San Diego to Collaborate on Angiogenic Disease Research Utilizing microRNA Technology

http://www.fiercebiotech.com/press-releases/regulus-therapeutics-and-uc-san-diego-collaborate-angiogenic-disease-resear-0

– UC Discovery Grant award to support collaborative research –

La Jolla, Calif., April 14, 2011 – Regulus Therapeutics Inc., a biopharmaceutical company leading the discovery and development of innovative new medicines targeting microRNAs, today announced it is collaborating with researchers at the University of California, San Diego (UCSD) School of Medicine seeking novel treatments for angiogenic diseases using microRNA therapeutics. The research will combine Regulus’ leading microRNA platform with UCSD’s expertise in animal models of angiogenesis to discover anti-angiogenic microRNA-targeted therapies that could be rapidly translated for treatment of human disease.  The collaborative research program was the recent recipient of a UC Discovery Grant that promotes collaborations between the university’s researchers and industry partners.  Financial terms of the grant were not disclosed.

“We are pleased to collaborate with leading scientific institutes like UCSD and to provide industry support for programs such as the UC Discovery Grant,” said Hubert C. Chen, M.D., Regulus’ vice president of translational medicine. “Regulus continues to demonstrate a leadership position in the field of microRNA therapeutics and is committed to forging partnerships with leading academic and clinical laboratories to advance microRNA biology and therapeutic discovery.  Our network of nearly 30 academic collaborations assists us with the investigation of new microRNAs and supports microRNA discovery efforts that feed the Company’s pipeline.”

Angiogenesis, which is the formation of new blood vessels, is an important event that contributes to the severity of cancer, diabetes, macular degeneration, inflammatory disease and arthritis.  microRNAs have been implicated in regulating biological networks involved in angiogenesis.

“Our research published last year in Nature Medicine demonstrated that microRNA-132 functions as a novel angiogenic switch that turns on angiogenesis in quiescent endothelial cells, and that targeting with an anti-miR-132 decreases blood vessel formation,” said David A. Cheresh, Ph.D., professor of pathology in the UCSD School of Medicine, associate director for translational research at UCSD Moores Cancer Center and principal investigator on the grant. “The objective of our collaborative work with Regulus is to advance these initial discoveries and discover additional microRNAs involved in angiogenic diseases.”

The UC Discovery Grant program promotes collaborations between the university’s researchers and industry partners in the interest of supporting cutting-edge research, strengthening the state’s economy and serving the public good.

About microRNAs

The discovery of microRNA in humans during the last decade is one of the most exciting scientific breakthroughs in recent history. microRNAs are small RNA molecules, typically 20 to 25 nucleotides in length, that do not encode proteins but instead regulate gene expression. More than 700 microRNAs have been identified in the human genome, and over one-third of all human genes are believed to be regulated by microRNAs. A single microRNA can regulate entire networks of genes; as such, these molecules are considered master regulators of the human genome. microRNAs have been shown to play an integral role in numerous biological processes, including the immune response, cell-cycle control, metabolism, viral replication, stem cell differentiation and human development. Most microRNAs are conserved across multiple species, indicating the evolutionary importance of these molecules as modulators of critical biological pathways. Indeed, microRNA expression or function has been shown to be significantly altered in many disease states, including cancer, heart failure and viral infections. Targeting microRNAs with anti-miRs (antisense oligonucleotide inhibitors of microRNAs) or miR-mimics (double-stranded oligonucleotides that restore microRNA function) opens the potential for a novel class of therapeutics and offers a unique approach to treating disease by modulating entire biological pathways. To learn more about microRNAs, please visit http://www.regulusrx.com/microrna/microrna-explained.php.
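Anti-miRs work by Watson–Crick base pairing with a target microRNA, and a microRNA finds its mRNA targets largely through its “seed” region (nucleotides 2–8). As a purely illustrative sketch (the sequences below are made up, and real target prediction weighs many additional features such as site context and conservation), a seed match can be located by searching an mRNA 3′ UTR for the reverse complement of the seed:

```python
# RNA base-pairing table: A<->U, G<->C
COMPLEMENT = str.maketrans("AUGC", "UACG")

def seed_sites(mirna: str, utr: str) -> int:
    """Count occurrences of the seed match (reverse complement of
    miRNA nucleotides 2-8) in an mRNA 3' UTR. Illustrative only."""
    seed = mirna[1:8]                          # nucleotides 2-8 (0-based slice)
    match = seed.translate(COMPLEMENT)[::-1]   # reverse complement
    return utr.count(match)

# Hypothetical sequences, for illustration only
mirna = "UGGACGACAGAGCUAAGGUCA"
utr = "AAUGUCGUCCAAGUCGUCCAUU"
print(seed_sites(mirna, utr))  # 2 seed-match sites in this made-up UTR
```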

About Regulus Therapeutics Inc.

Regulus Therapeutics is a biopharmaceutical company leading the discovery and development of innovative new medicines targeting microRNAs. Regulus is using a mature therapeutic platform based on technology that has been developed over 20 years and tested in more than 5,000 humans. In addition, Regulus works with a broad network of academic collaborators and leverages the oligonucleotide drug discovery and development expertise of its founding companies, Alnylam Pharmaceuticals (NASDAQ:ALNY) and Isis Pharmaceuticals (NASDAQ:ISIS). Regulus is advancing microRNA therapeutics towards the clinic in several key areas including hepatitis C infection, immuno-inflammatory diseases, fibrosis, oncology and cardiovascular/metabolic diseases. Regulus’ intellectual property estate contains both the fundamental and core patents in the field and includes over 600 patents and more than 300 pending patent applications pertaining primarily to chemical modifications of oligonucleotides targeting microRNAs for therapeutic applications. In April 2008, Regulus formed a major alliance with GlaxoSmithKline to discover and develop microRNA therapeutics for immuno-inflammatory diseases. In February 2010, Regulus and GlaxoSmithKline entered into a new collaboration to develop and commercialize microRNA therapeutics targeting microRNA-122 for the treatment of hepatitis C infection. In June 2010, Regulus and sanofi-aventis entered into the largest-to-date strategic alliance for the development of microRNA therapeutics. This alliance is focused initially on fibrosis. For more information, please visit http://www.regulusrx.com.

Forward-Looking Statements

This press release includes forward-looking statements regarding the future therapeutic and commercial potential of Regulus’ business plans, technologies and intellectual property related to microRNA therapeutics being discovered and developed by Regulus. Any statement describing Regulus’ goals, expectations, financial or other projections, intentions or beliefs is a forward-looking statement and should be considered an at-risk statement. Such statements are subject to certain risks and uncertainties, particularly those inherent in the process of discovering, developing and commercializing drugs that are safe and effective for use as human therapeutics, and in the endeavor of building a business around such products. Such forward-looking statements also involve assumptions that, if they never materialize or prove correct, could cause the results to differ materially from those expressed or implied by such forward-looking statements. Although these forward-looking statements reflect the good faith judgment of Regulus’ management, these statements are based only on facts and factors currently known by Regulus. As a result, you are cautioned not to rely on these forward-looking statements. These and other risks concerning Regulus’ programs are described in additional detail in each of Alnylam’s and Isis’ annual report on Form 10-K for the year ended December 31, 2010, which are on file with the SEC. Copies of these and other documents are available from either Alnylam or Isis.

Read Full Post »

Obstructive Coronary Artery Disease diagnosed by RNA levels of 23 genes – CardioDx, a Pioneer in the Field of Cardiovascular Genomic Diagnostics

Curator: Aviva Lev-Ari, PhD, RN

UPDATED on 11/15/2013

CardioDx, Inc. Nixes IPO, Cites Unfavorable Market Conditions

11/15/2013 10:31:01 AM

 

CardioDx postpones its initial public offering, citing ‘unfavorable market conditions.’ California molecular diagnostics company CardioDx spiked its initial public offering, citing “unfavorable market conditions,” according to news reports. The 5.8-million-share offering by Palo Alto-based CardioDx was slated to raise $92 million at a share price of $14-$16 apiece. The IPO, originally scheduled for yesterday, would have seen CardioDx shares trade under the “CDX” symbol.

SOURCE

http://www.devicespace.com/news_story.aspx?NewsEntityId=315972&type=email&source=DS_111513

CardioDx had planned to use some of the funds to expand its commercial efforts, including its sales and marketing workforce; to fund operations as the company pursues more insurance coverage and reimbursement; to “conduct additional clinical and marketing activities” for the company’s Corus CAD blood-based gene expression test; to fund R&D activity; and for “general corporate purposes.” CardioDx will later specify just how much it plans to put toward each of those activities.

Investors in the company include V-Sciences Investments, Longitude Venture Partners, Artiman Ventures, Kleiner Perkins Caufield & Byers, JP Morgan and Mohr Davidow Ventures.

SOURCE

http://www.massdevice.com/news/cardiodx-spikes-ipo

CardioDX pulls IPO, citing poor market conditions

CardioDX, led by David Levison, was one of three medical technology companies to postpone their IPOs on Thursday due to poor market conditions.

Senior Technology Reporter-Silicon Valley Business Journal
CardioDX postponed an IPO on Thursday after deciding that the market is unfavorable at this time.

 

The Palo Alto company led by CEO David Levison was one of three medical tech companies that postponed planned IPOs on Thursday. San Diego-based Celladon and Monrovia-based Xencor also decided to hold off due to poor market conditions.

Redwood City pharmaceutical developer Relypsa, meanwhile, went ahead with a drastically reduced IPO that raised about half of what had been projected for it.

CardioDX, which sells diagnostic tests for cardiovascular disease, reported total revenue in 2012 of $2.5 million and a net loss of $25.6 million. The company expects to continue to show losses for the next several years and has an accumulated deficit through June totaling $165.9 million. As of June 30, it had $46.8 million in cash, equivalents and investments.

The company’s biggest existing stakeholder is V-Sciences Investments, a wholly owned subsidiary of Temasek Life Sciences Private Ltd., which holds 19.9 percent of outstanding shares.

Other big stakeholders are Longitude Venture Partners, with a 17.9 percent stake; Artiman Ventures, 13.9 percent; Kleiner Perkins Caufield & Byers, 9.5 percent; JP Morgan, 6.4 percent; and Mohr Davidow Ventures, 5.8 percent.

SOURCE

http://www.bizjournals.com/sanjose/news/2013/11/15/cardiodx-pulls-ipo-citing-poor-market.html

Cardiovascular MDx Firm CardioDx Files to Go Public

UPDATED on 10/14/2013

October 14, 2013

NEW YORK (GenomeWeb News) – Cardiovascular molecular diagnostics firm CardioDx has filed with the US Securities and Exchange Commission to go public with an intended offering of up to $86.3 million of common stock.

The Palo Alto, Calif.-based firm has not priced its offering yet or said how many shares it plans on offering. Bank of America Merrill Lynch and Jefferies are listed as joint book-running managers on the offering, while Piper Jaffray and William Blair are co-managers.

The company plans on listing on the Nasdaq Global Market under ticker symbol “CDX.”

In its Form S-1, CardioDx said that its tests provide healthcare professionals with “critical, actionable information to improve patient care and management,” with an initial focus on coronary artery disease (CAD), arrhythmia, and heart failure.

Its flagship product is the Corus CAD, a gene expression-based test for assessing non-diabetic patients who display symptoms suggestive of obstructive CAD. The test was launched in 2009 and through June 30, CardioDx delivered results for more than 40,000 tests, it said.

Corus CAD received Medicare Part B coverage in August 2012, making it a covered benefit for about 48 million Medicare beneficiaries, the company added.

In 2012, CardioDx posted $2.5 million in revenues with a net loss of $25.6 million. Through the first six months of 2013, the firm had revenues of $2.9 million and a net loss of $18.4 million.

It had $46.8 million in cash, cash equivalents, and investments as of June 30, it said.

In August 2012, CardioDx raised $58 million in private financing. Before that, it raised $60 million in a financing round. In 2010, GE Healthcare invested $5 million in the company as part of a Series D financing round.

David Levison heads the firm as President and CEO. Other members of the management team include CFO Andrew Guggenhime; Chief Scientific Officer Steven Rosenberg; Chief Medical Officer Mark Monane; and Chief Commercial Officer Deborah Kilpatrick.

CardioDx is the latest in a recent string of omics-related companies that have gone public or have filed to go public in the US. Cancer Genetics, NanoString Technologies, and Foundation Medicine launched their IPOs earlier this year. Meanwhile, Veracyte, Biocept, and Evogene have filed to float.

UPDATED on 2/25/2013

CardioDx Announces Publication of COMPASS Study Demonstrating the Corus CAD Test Outperforms Myocardial Perfusion Imaging in Overall Diagnostic Accuracy for Obstructive Coronary Artery Disease

February 24, 2013

Tue Feb 19, 2013 8:30am EST

– Study Highlights the Validity of Corus CAD as a First-Line Test to Help Clinicians Exclude Obstructive CAD as a Cause of the Patient’s Symptoms – PALO ALTO, Calif.,  Feb. 19, 2013

/PRNewswire/ — CardioDx, Inc., a pioneer in the field of cardiovascular genomic diagnostics, today announced the publication of the COMPASS (Coronary Obstruction Detection by Molecular Personalized Gene Expression) study in Circulation: Cardiovascular Genetics, a journal of the American Heart Association.

Results of the prospective, multi-center U.S. study showed that  Corus®  CAD, a blood-based  gene expression test, demonstrated high accuracy with both a high negative predictive value (96 percent) and high sensitivity (89 percent) for assessing  obstructive coronary artery disease  (CAD) in a population of patients referred for stress testing with myocardial perfusion imaging (MPI).  The study’s authors conclude that using Corus CAD earlier in the diagnostic algorithm could reduce the number of invasive cardiac tests by more accurately evaluating the presence of obstructive coronary artery disease compared to the traditional algorithm of stress myocardial perfusion imaging (MPI) in these patients.

COMPASS enrolled stable patients with symptoms suggestive of CAD who had been referred for MPI at 19 U.S. sites.  A blood sample was obtained in all 431 patients prior to MPI and Corus CAD gene expression testing was performed with study investigators blinded to Corus CAD test results. Following MPI, patients underwent either invasive coronary angiography or coronary CT angiography, gold-standard anatomical tests for the diagnosis of coronary artery disease. 

The study was designed to provide additional independent validation of the Corus CAD test in a real-world intended use population of patients presenting for MPI, a common noninvasive test for CAD, and builds on the results of the previous PREDICT validation study. Corus CAD requires only a simple blood draw for testing, making it safe, convenient, and easy to administer. The study evaluated results in stable non-diabetic patients with typical or atypical symptoms suggestive of CAD and found that Corus CAD surpassed the accuracy of MPI, a test that was administered more than 10 million times in the U.S. in 2010.[1]

“The evaluation of stable patients with chest pain and other symptoms suggestive of CAD is a common challenge for clinicians, accounting for as many as 10,000 outpatient visits each day,” said the publication’s lead author,  Gregory S. Thomas, M.D., M.P.H., Medical Director of the MemorialCare Heart & Vascular Institute at Long Beach Memorial Medical Center and Clinical Professor of Medicine and Director of Nuclear Cardiology Education at the  University of California-Irvine  School of Medicine. “In the U.S., MPI testing is often performed in these patients and is followed by referral to invasive coronary angiography. Based on the results of this study of the Corus CAD gene expression test, we now have a reliable diagnostic approach for evaluating patients with symptoms of obstructive CAD.  With its high sensitivity and negative predictive value, Corus CAD may help clinicians accurately and efficiently exclude the diagnosis of obstructive CAD early in the diagnostic pathway, so they can assess for other causes of their patients’ symptoms.”

The pre-specified primary endpoint of the COMPASS study was the receiver-operator characteristics (ROC) analysis to evaluate the ability of Corus CAD to identify coronary arterial blockages of 50 percent or greater by quantitative coronary angiography.  Corus CAD outperformed MPI in overall diagnostic accuracy for assessing obstructive CAD, with an area under the curve (AUC) of 0.79 for the Corus CAD test compared to MPI site and core-lab read AUCs of 0.59 and 0.63 respectively (p<0.001).  In addition, Corus CAD performed better than MPI in sensitivity (89 percent vs. 27 percent, p<0.001) and negative predictive value (96 percent vs. 88 percent, p<0.001) parameters, thus demonstrating excellent performance for excluding obstructive CAD as the cause of a patient’s symptoms.  The COMPASS results corroborated earlier findings from the PREDICT multicenter U.S. validation study[2] demonstrating that the Corus CAD score is proportional to coronary artery stenosis severity.
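The sensitivity and negative-predictive-value comparisons above follow directly from the standard confusion-matrix definitions. The short Python sketch below illustrates the arithmetic using approximate patient counts reconstructed from the reported COMPASS figures (431 patients, roughly 15% prevalence, 89% sensitivity, 52% specificity); the individual counts are illustrative back-calculations, not numbers taken from the paper.

```python
def sensitivity(tp, fn):
    """Fraction of truly diseased patients the test correctly flags positive."""
    return tp / (tp + fn)

def npv(tn, fn):
    """Negative predictive value: fraction of negative results that are truly disease-free."""
    return tn / (tn + fn)

# Approximate counts reconstructed from the reported percentages
# (~65 diseased and ~366 non-diseased of 431 patients):
tp, fn = 58, 7      # diseased patients: test-positive vs. test-negative
tn, fp = 190, 176   # non-diseased patients: test-negative vs. test-positive

print(f"sensitivity: {sensitivity(tp, fn):.0%}")  # ~89%, matching the report
print(f"NPV:         {npv(tn, fn):.0%}")          # ~96%, matching the report
```

A high NPV at modest specificity is exactly the profile of a rule-out test: a low score makes disease unlikely, while a high score still needs confirmatory imaging.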

“Corus CAD can help solve an enormous unmet need in healthcare by providing clinicians with a safe, convenient and reliable tool to help evaluate common patient symptoms and triage them more appropriately for subsequent therapy or additional testing,” said  David Levison, President and CEO of CardioDx.  “In addition to its higher diagnostic accuracy, Corus CAD holds potential to reduce a major healthcare expense category – unnecessary noninvasive imaging and/or invasive coronary angiography procedures and their associated risks and side effects. We have worked closely with leading clinicians to build a solid clinical and economic foundation for Corus CAD, leading to its growing acceptance in the medical and payer communities as evidenced by the more than 35,000 tests performed to date and Medicare’s decision to cover the test.”

 SOURCE:

http://www.fiercemedicaldevices.com/press-releases/cardiodx-announces-publication-compass-study-demonstrating-corus-cad-test-o

CardioDx is promoting yet another post-marketing study whose data may help the company’s gene expression test for obstructive coronary artery disease reach more patients, better compete with the standard of care and also build vital market share.

Executives at the California-based 2012 Fierce 15 company say they wanted more data on Corus CAD‘s real-world use, building on its previous PREDICT validation trial as a result. The test has been on sale commercially since 2009 and won crucial Medicare reimbursement last fall. Chief Scientific Officer Steven Rosenberg told FierceMedicalDevices via email that the results from the latest study pointed in a number of positive directions.

“It demonstrates performance at least as good as that seen in the PREDICT study, but in the population the Corus CAD is indicated for,” Rosenberg said, “It shows significantly higher performance for obstructive CAD than MPI, which is the most common non-invasive imaging test used in this regard.”

A 431-patient clinical study of the blood diagnostic showed a 96% negative predictive value and 89% sensitivity in assessing the condition in patients who were referred for stress testing with myocardial perfusion imaging (MPI). (Last November, CardioDx heralded similar results from another study using Corus CAD on 98 geriatric patients.) Details are published in the journal Circulation: Cardiovascular Genetics.

The blood test, conducted at 19 U.S. sites through multiple academic institutions, determined that using Corus CAD earlier in the diagnostic process better assessed the presence of coronary artery disease versus MPI. This might encourage doctors to cut back on invasive, more expensive cardiac tests by ruling out obstructive CAD sooner. In other words, determining a patient doesn’t have obstructive CAD eliminates the need for diagnostic procedures such as coronary angiography or coronary CT angiography, the company explains.

Post-marketing studies are increasingly important in today’s health care market, with the need to demonstrate the utility of a device or diagnostic in as much detail as possible. And it’s not just about boosting the standard of care; the Affordable Care Act means value matters, too, more than ever before. Success with this mission can help broaden market share and also increase the chance of private as well as government insurance coverage. Additionally, new post-marketing trials can set the stage for expanded indications down the line.

SOURCE:

http://www.fiercemedicaldevices.com/story/cardiodx-cad-dx-passes-another-post-marketing-test/2013-02-24?utm_medium=nl&utm_source=internal

A Blood Based Gene Expression Test for Obstructive Coronary Artery Disease Tested in Symptomatic Non-Diabetic Patients Referred for Myocardial Perfusion Imaging: The COMPASS Study

Gregory S. Thomas,1,* Szilard Voros,2 John A. McPherson,3 Alexandra J. Lansky,4 Mary E. Winn,5 Timothy M. Bateman,6 Michael R. Elashoff,7 Hsiao D. Lieu,7 Andrea M. Johnson,7 Susan E. Daniels,7 Joseph A. Ladapo,8 Charles E. Phelps,9 Pamela S. Douglas,10 and Steven Rosenberg7

Author Affiliations

1. Long Beach Memorial Medical Center, Long Beach & University of California, Irvine, CA
2. Stony Brook University Medical Center, Stony Brook, NY
3. Vanderbilt University, Nashville, TN
4. Yale University School of Medicine, New Haven, CT
5. Scripps Translational Science Institute, La Jolla, CA
6. University of Missouri, Kansas City, MO
7. CardioDx, Inc., Palo Alto, CA
8. New York University School of Medicine, New York, NY
9. University of Rochester, Rochester, NY
10. Duke Clinical Research Institute, Duke University, Durham, NC

* Corresponding author: MemorialCare Heart and Vascular Institute, Long Beach Memorial Medical Center, 2801 Atlantic Avenue, Long Beach, CA 90806; gthomas@mimg.com

Abstract

Background—Obstructive coronary artery disease (CAD) diagnosis in symptomatic patients often involves non-invasive testing before invasive coronary angiography (ICA). A blood-based gene expression score (GES) was previously validated in non-diabetic patients referred for ICA but not in symptomatic patients referred for myocardial perfusion imaging (MPI).

Methods and Results—This prospective multi-center study obtained peripheral blood samples for GES before MPI in 537 consecutive patients. Patients with abnormal MPI usually underwent ICA; all others had research coronary CT-angiography (CTA), with core laboratories defining coronary anatomy. A total of 431 patients completed GES, coronary imaging (ICA or CTA), and MPI. Mean age was 56±10 (48% women). The pre-specified primary endpoint was GES receiver-operator characteristics (ROC) analysis to discriminate ≥50% stenosis (15% prevalence by core laboratory analysis). ROC curve area (AUC) for GES was 0.79 (95% CI 0.73-0.84, p<.001), with sensitivity, specificity, and negative predictive value (NPV) of 89%, 52%, and 96%, respectively, at a pre-specified threshold of ≤15 with 46% of patients below this score. The GES outperformed clinical factors by ROC and reclassification analysis and also showed significant correlation with maximum percent stenosis. Six-month follow-up on 97% of patients showed that 27/28 patients with adverse cardiovascular events or revascularization had GES >15. Site and core-lab MPI had AUCs of 0.59 and 0.63, respectively, significantly less than GES.

Conclusions—A GES has high sensitivity and NPV for obstructive CAD. In this population clinically referred for MPI, the GES outperformed clinical factors and MPI.

Clinical Trial Registration Information—www.clinicaltrials.gov; Identifier: NCT01117506.

  • Received June 6, 2012.
  • Revision received January 15, 2013.
  • Accepted February 5, 2013.

http://circgenetics.ahajournals.org/content/early/2013/02/15/CIRCGENETICS.112.964015.abstract?sid=74741525-8453-460e-8407-f11022fe9a24

http://www.bizjournals.com/sanfrancisco/blog/biotech/2012/08/cardiodx-corus-medicare-heart-disease.html

CardioDx heart disease test wins Medicare coverage

San Francisco Business Times by Ron Leuty, Reporter

Date: Wednesday, August 8, 2012, 4:00am PDT

CardioDx's test for obstructive heart disease will be covered by Medicare retroactive to Jan. 1.

A key national Medicare contractor will cover the cost of a coronary artery disease test developed by CardioDx Inc.

The move is important for Palo Alto-based CardioDx because private insurers tend to follow the federal government’s Medicare health insurance program. The company has had to seek reimbursement on a case-by-case basis with those private insurers since its Corus CAD gene expression test hit the market in June 2009.

The decision disclosed Tuesday by Palmetto GBA, a national contractor based in Columbia, S.C., that administers Medicare benefits, means that Medicare will cover the test for as many as 40 million enrollees. Coverage is retroactive to Jan. 1.

Corus CAD is a shoebox-size kit that uses a simple blood draw to measure the RNA levels of 23 genes. Using an algorithm, it then creates a score that determines the likelihood that a patient has obstructive coronary artery disease.
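In principle, a test of this kind combines the measured RNA levels of the 23 genes into a single score via a weighted algorithm and compares it against a published cutoff (per the COMPASS abstract, scores at or below 15 on the test's scale indicate low likelihood of obstructive CAD). The sketch below is purely illustrative: the actual Corus CAD algorithm, gene set, weights, and score scaling are proprietary, so the gene names, weights, and logistic mapping here are all hypothetical.

```python
import math

# Hypothetical weights -- the real gene panel and coefficients are proprietary.
GENE_WEIGHTS = {"GENE_A": 0.8, "GENE_B": -0.5, "GENE_C": 1.2}

def expression_score(rna_levels, weights, lo=1, hi=40):
    """Combine normalized RNA levels into one integer score on [lo, hi].

    A weighted sum is squashed through a logistic function and mapped onto
    the score scale (the 1-40 range and this mapping are assumptions).
    """
    raw = sum(weights[g] * rna_levels[g] for g in weights)
    return round(lo + (hi - lo) / (1 + math.exp(-raw)))

def low_likelihood(score, threshold=15):
    """Scores at or below the published threshold of 15 suggest low likelihood
    of obstructive CAD (threshold per the COMPASS abstract)."""
    return score <= threshold

# Hypothetical normalized expression values for one patient:
score = expression_score({"GENE_A": -1.0, "GENE_B": 0.5, "GENE_C": -0.8},
                         GENE_WEIGHTS)
print(score, low_likelihood(score))
```

The point of the sketch is the structure, not the numbers: a deterministic score computed from a blood draw, with a pre-specified cutoff that lets clinicians rule out obstructive CAD before ordering imaging.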

“By providing Medicare beneficiaries access to Corus CAD, this coverage decision enables patients to avoid unnecessary procedures and risks associated with cardiac imaging and elective invasive angiography, while helping payers address an area of significant healthcare spending,” CardioDx President and CEO David Levison said in a press release.

The decision represents the latest Medicare-coverage win for Bay Area diagnostic test makers. Palmetto earlier this year opted to cover the Afirma gene expression test from South San Francisco’s Veracyte Inc. to diagnose thyroid nodules, and last summer Palmetto said it would cover Redwood City-based Genomic Health Inc.’s (NASDAQ: GHDX) colon cancer recurrence test.

Read Full Post »

Doctors, Patients, and Lawyers — Two Centuries of Health Law

Reporter: Aviva Lev-Ari, PhD, RN

George J. Annas, J.D., M.P.H.

N Engl J Med 2012; 367:445-450   August 2, 2012

Medical care in 2012 is unrecognizable as compared with what it was in 1812, and no 19th-century physician would be at home in a modern hospital. A 19th-century lawyer, however, would be completely at home in a contemporary courtroom, as would a present-day lawyer transported back to the early 19th century. Although slavery was still legal and women did not yet have the right to vote, the U.S. Supreme Court was the highest court in the land and the U.S. Constitution and its Bill of Rights would be familiar, as would the jury and the common law system adopted from England.

Physicians and lawyers did not necessarily get along better in 1812 than they do today, primarily because of medical malpractice litigation. Herman Melville’s 1851 metaphoric Massachusetts masterpiece, Moby-Dick, symbolizes the view of many physicians, then and now, that medical malpractice litigation is the white whale: evil, ubiquitous, and seemingly immortal (Figure 1. The Whale). Medicine and law were nonetheless often viewed as the two major professions, and for the leading physicians at that time, including Walter Channing (Figure 2. Portrait of Dr. Walter Channing; William Franklin Draper, after Joseph Alexander Ames, 1946), editor-in-chief from 1825 to 1835 of what is now the New England Journal of Medicine, the relationship between medicine and law was of great intellectual and practical interest.1

Over the past two centuries, the discipline of medical jurisprudence — the application of medical knowledge to the needs of justice — has been renamed legal medicine (including forensic science), and applying the law to medicine has expanded from medical law to health law. Legal procedures and courtrooms have changed little, but there have been almost as many changes in the application of law to medicine over the past 200 years as there have been changes in the practice of medicine. Health law’s intimate relationship with medical ethics also has a strong precedent. Thomas Percival’s original title for his 1803 Medical Ethics text, which has been described as “the most influential treatise on medical ethics in the past two centuries,”2 was Medical Jurisprudence.3 More than half of Percival’s text specifically addresses “professional duties . . . which require a knowledge of law.”3

Walter Channing’s almost-poetic academic title was Professor of Midwifery and Medical Jurisprudence.1 In his lectures on the latter subject at what would become Harvard Medical School, he relied primarily on the 1823 text by Theodoric Beck, Elements of Medical Jurisprudence.1,4 The major areas of medical jurisprudence in the early and mid-19th century were forensic pathology (determination of the cause of death in criminal cases, especially when poisoning was suspected) and forensic psychiatry (determination, for example, of whether a defendant was “sane” at the time he committed a crime). In 1854, the year Channing retired from teaching, his course was entitled “Obstetrics and Medical Jurisprudence.”1 Insight into the medical jurisprudence of Channing’s times can be found in a remarkable three-part book review, spanning 24 journal pages, which he wrote 6 years later.5 The book he reviewed, by physician–lawyer John J. Elwell, A Medico-Legal Treatise on Malpractice and Medical Evidence: Comprising the Elements of Medical Jurisprudence, was also published in 1860.6 Both the book and Channing’s review can help us see how medical jurisprudence evolved into health law in much the way that midwifery evolved into obstetrics.

PHYSICIANS AND THE LAW

Apart from the many areas of the law that directly affected the practice of obstetrics in the 19th century (most notably, abortion, feticide, and infanticide), medical jurisprudence was not Channing’s main subject.1 Nonetheless, primarily on the basis of the importance of medical testimony in both civil and criminal cases and on the basis of his own courtroom experiences as an expert witness, Channing strongly believed that physicians should know enough law to be useful and credible witnesses in court. He made this conviction a core of his medical school lectures on the subject. Channing believed that medicine and law, “two of the most diverse callings may act in perfect harmony, and for the equal benefit of both.”5 He also quoted medicolegal expert David Paul Brown: “A doctor who knows nothing of law, and a lawyer who knows nothing of medicine, are deficient in essential requisites of their respective professions.”5 Two cases dealt with in some detail by Elwell illustrate the standards to which courts held physicians (and quacks) in the late 18th and early 19th centuries.

MEDICAL MALPRACTICE AND MEDICAL LICENSURE

The first is the celebrated case of Slater v. Baker and Stapleton, decided in England in 1767.7 Slater had broken his leg, it had not healed well, and he had sought treatment from another physician, a surgeon named Baker (and apothecary Stapleton). They broke the leg again and set it in “a heavy steel thing that had teeth” to stretch it, with a poor result. Slater sued them, and three surgeons testified that the “steel thing” should not have been used.7 The jury awarded Slater £500 (approximately £60,000 today), and the defendants appealed. The appeals court affirmed the award, saying that a radical experiment could itself be considered malpractice, at least in the absence of the patient’s consent. In the court’s words,

this was the first experiment made with this new instrument; and although the defendants in general may be as skillful in their respective professions as any two gentlemen in England, yet the Court cannot help saying that in this particular case they have acted ignorantly and unskillfully, contrary to the known rule and usage of surgeons.7

Elwell reasonably objected to the court’s conclusion that if a physician is engaging in a unique experiment, then that fact alone makes the physician “guilty of rashness and recklessness.”6 He noted that the “recklessness” standard “points strongly to criminal intent or of foolhardiness and culpable rashness [which would make the physician] actually guilty of a crime.”6 Later in his text, Elwell described just such a case, which he termed “the leading American case on criminal malpractice” and which I call “the case of the coffee quack.”8

The coffee quack was charged with murder in the death of his patient. He had come to Beverly, Massachusetts, in 1807 and announced himself as a physician with “the ability to cure all fevers.” He used several concoctions, including drugs he called “coffee,” “well-my-gristle,” and “ram-cats.” He administered these drugs, together with heat and blankets, for approximately 1 week to a patient who had employed him to cure a severe cold. The patient vomited frequently, became exhausted, and within days suffered a series of convulsions from which he died. There was testimony that in high doses the “coffee” drug could act as a poison. The jury was instructed that to find the coffee quack guilty of murder they must find that the killing was done with malice, and there was no evidence of this. A finding of manslaughter required that the killing be “the consequence of some unlawful act,”8 but there was no legal requirement at the time for either licensure or education in order to call oneself a physician. The judge summed up his instructions to the jury:

It is to be exceedingly lamented, that people are so easily persuaded to put confidence in these itinerant quacks . . . If this astonishing infatuation should continue, there seems to be no adequate remedy by a criminal prosecution, without the interference of the legislature, if the quack . . . should prescribe, with honest intentions and expectations of relieving his patients.8

The jury accordingly found the defendant not guilty. At least partially as a result of this verdict, the Massachusetts legislature passed its first physician-licensing law in 1818. That law prohibited unlicensed healers from using the courts to collect payment. It was not until the end of that century that practicing medicine without a license was made a crime.9

MEDICAL MALPRACTICE AND LAY JURIES

Historian Michael Bliss argues that in the 19th century, “much of the therapeutic power of medicine stemmed from surgery.”10 Whether or not it was therapeutic, Channing noted in the mid-19th century that malpractice was “almost exclusively charged on surgical practice.”5 Elwell catalogued and provided specific examples of the most common surgical malpractice cases, those involving amputation and the treatment of fractures.6 Beck also appropriately devoted, in Channing’s words, “much of his work” (15 of 42 chapters, and 232 of 582 pages) to the issue of medical malpractice, which “gives to his volume a great value, and makes him a large benefactor to the profession.”4,5

Although Channing thought the jury a wonderful institution, he did not think it was appropriate for medical malpractice cases. He argued that medicine was inherently difficult to understand and not suited to lay juries, which he thought were mostly influenced by dueling expert witnesses whose testimony they could not fathom.5 Channing asked, in words that find common expression today, “What shall be done to remedy so glaring a defect in our jurisprudence — a defect involving so much evil to the accused, and to a profession?”5 His own response was to suggest that, like military officers, physicians should be tried by their “peers” because “there is no other way it is possible for them to get justice.”5

His view was not unique at the time. It has been independently reported that “between 1845 and 1861 physicians were truly alarmed at the increase of malpractice claims,” and an 1850 communication to the Massachusetts Medical Society referred to the “alarmingly frequent” prosecutions for malpractice and the belief that some surgeons were closing their practices because of this.11 The Massachusetts Medical Society “recommended that a disinterested physician be engaged to adjudicate a threat of malpractice by a disgruntled patient.”11

A century and a half of “malpractice reforms” has not changed the medical profession’s views on medical malpractice litigation, which is still seen as unnecessarily adversarial, shaming, and unfair.12-14 To many physicians, medical malpractice litigation remains the dangerous white whale. Lawyers themselves are not uncommonly viewed as sharks or vultures, bringing to mind Melville’s description of the sharks that harass the whale boats, “seemingly rising from out the dark waters . . . maliciously snap[ping] at the oars . . . following them in the same prescient way that vultures hover.”

CONTEMPORARY HEALTH LAW AND THE SUPREME COURT

Law and medicine have been intimately associated for at least the past two centuries, but it was not until 1964 that the Journal inaugurated a regular feature on the subject (then called “medicolegal relations”) and William J. Curran began writing his “Law–Medicine Notes.”15 Like Elwell and Beck before him, Curran devoted a significant number of his articles to medical malpractice (including hospital liability), forensic medicine (including abortion), and forensic psychiatry, but he also addressed new topics, including the physician’s changing roles in capital punishment, torture, care of the dying, fetal research, and determining death according to brain criteria.15

In 1991, I began writing a Journal feature called “Legal Issues in Medicine” (now “Health Law, Ethics, and Human Rights”). Of the 60 articles that I have written under these two rubrics, approximately 20% have dealt with the power of government over physicians and medical practice; 20% with abortion, pregnancy, and childbirth; 20% with public health issues; and the remainder with research, care of the dying, patient rights, forensic medicine, and forensic psychiatry. What is perhaps most noteworthy, however, is the number of health law cases that have been decided by the U.S. Supreme Court.

Health law — that is, law applied to the health care field — has expanded far beyond anything Channing could have imagined. The recognition of patients’ rights and the expansion of regulatory-oversight rules and mechanisms, for both medical practice and financing, have vastly enlarged the field. Patients’ rights, especially the doctrine of informed consent, were furthered by such judgments as that at the trial of the Nazi doctors at Nuremberg (1946–1947)16 and the Supreme Court’s decision on abortion in Roe v. Wade (1973).17 Informed consent is the core of the Nuremberg Code, as it could have been the core of Slater v. Baker nearly 250 years ago. On its face, Roe v. Wade overturned most state laws that made abortion a crime, but its impact on medical care goes far beyond abortion. The Court ruled that the rights of both the physician and the patient have a constitutional dimension that limits the state’s power to interfere in the physician–patient relationship.17 The politics of abortion have led the Court to decide more than 3 dozen cases on state abortion laws in the past 40 years. The evolving structures of health care financing and practice would also be unrecognizable to 19th-century medical practitioners, including private health insurance plans, Medicare and Medicaid, managed care, the health insurance exchanges and accountable care organizations encouraged by the Affordable Care Act, antitrust regulations, measures to prevent fraud and abuse, and financial disclosure requirements.

A third development is also noteworthy — the application of health law to the field of international human rights, including the right to health, the regulation of research on human subjects, and the physician’s role in war and civil conflict. Physicians and lawyers now work together in U.S.-based organizations such as Physicians for Human Rights and Global Lawyers and Physicians. Working separately, medical associations, including the British Medical Association and the World Medical Association, rather than legal associations, deserve much of the credit for the growth of the international “health and human rights” arena.18 Both law and medicine are critical tools for improving health and well-being on a global level, and each profession is more effective when the two work together.

Law remains interwoven with the practice of medicine, as it was in the 19th century. Physicians who do not have a basic understanding of the law are, as Channing recognized, at a distinct disadvantage when practicing medicine. The evolution of medical jurisprudence into health law over the past two centuries has been dramatic (Table 1). But equally consequential are the ways in which health law issues are framed and the legal forums in which they are resolved. State laws governing medical practice (including abortion and end-of-life care) are now challenged as unconstitutional infringements of individual rights, with the final determination made by the Supreme Court. The Court has also become active in determining the constitutionality of federal health-related legislation and in interpreting the meaning of federal statutes in the health field, ranging from regulation of tobacco and drugs to gun control. The fate of the Affordable Care Act, the major “health law” of the past decade, has also been decided by the Supreme Court — unthinkable in Channing’s day.

The changes in substance and emphasis in health law from the publication of Moby-Dick can be appreciated by reading a contemporary nonfiction best seller about an event that occurred in 1951, which was 100 years after Melville published his masterpiece: the taking of cells that would later be called “HeLa” cells from Henrietta Lacks.19 Although malpractice remains a concern, more central legal issues in contemporary medical practice include the fiduciary nature of the doctor–patient relationship, patient rights and patient safety, informed consent, privacy, commercialization, the regulation of medical research and biobanking, the patenting of genes and cell lines, the application of genomic information to medical practice, racial disparities, and equitable access to quality medical care.20,21 The author of The Immortal Life of Henrietta Lacks, Rebecca Skloot, opens her book with the words of Elie Wiesel that almost all physicians and lawyers would agree should apply to all patients, not least because of the “fiduciary duty” that physicians owe patients under the law (and medical ethics): “We must not see any person as an abstraction. Instead, we must see in every person a universe with its own secrets, with its own sources of anguish, and with some measure of triumph.”19

Disclosure forms provided by the author are available with the full text of this article at NEJM.org.

I thank my health law colleagues Leonard Glantz and Wendy Mariner for their thoughtful comments on early drafts of this article.

SOURCE INFORMATION

From the Department of Health Law, Bioethics, and Human Rights, Boston University School of Public Health, Boston.

Address reprint requests to Dr. Annas at the Department of Health Law, Bioethics, and Human Rights, Boston University School of Public Health, Boston, MA 02118, or at annasgj@bu.edu.

Read Full Post »

The Role of Informatics in The Laboratory

Larry H. Bernstein, M.D.

Introduction

The clinical laboratory industry, as part of a larger healthcare enterprise, is in the midst of large changes that can be traced to the mid-1980s and that have accelerated in the last decade.   These changes are associated with a host of dramatic events that require accelerated readjustments in the workforce, scientific endeavors, education, and the healthcare enterprise.   They are highlighted by the following (not unrelated) events:  globalization; a postindustrial information explosion driven by advances in computers and telecommunications networks; genomics and proteomics in drug discovery; and consolidation in the retail, communication, transportation, healthcare, and pharmaceutical industries.   Let us consider some of these events.   Globalization is driven by the principle that a manufacturer may seek to purchase labor, parts, or supplies from sources that are less expensive than those available at home.   The changes in the airline industry have been characterized by growth in travel, reductions in force, and the ability of customers to find the best fares.   The discoveries in genetics, which evolved from asking questions about replication, transcription, and translation of the genetic code, have moved to functional genomics and the elucidation of cell signaling pathways.   All of these changes would have been impossible without the information explosion.

The Laboratory as a Production Environment

The clinical laboratory produces about 60 percent of the information used by nurses and physicians to make decisions about patient care.   In addition, the actual cost of the laboratory is only about 3 – 4 percent of the cost of the enterprise.   The result is that the requirements for the support of the laboratory do not receive attention without a proactive argument showing how the laboratory contributes to realizing the goals of the organization.   The key issues affecting laboratory performance are:  staffing requirements, instrument configuration, workflow, what to send out, what to move to point-of-care, how to reconfigure workstations, and how to manage the information generated by the laboratory.

Staffing requirements, instrument configuration and workflow are being addressed by industry automation.   The first attempt was based on connecting instruments by tracks.   This system proved unable to handle STAT specimens without noticeably degrading turnaround time.   The consequence of that failure was to drive the creation of a parallel system of point-of-care devices, connected in a network with a RALS (remote automated laboratory system).  Another adjustment was to build an infrastructure for pneumatic tube delivery of specimens and to redesign the laboratory. This had some success, but required capitalization.   The pneumatic tube system could be justified on the basis of its value to the organization in supporting services besides the laboratory. The industry is moving in the direction of connected modules that share an automated pipettor and reduce the amount of specimen splitting.   These are primarily PREANALYTICAL refinements.

There are other improvements that affect quality and cost that are not yet standard, and should be.   These are:  autoverification, embedded quality control rules and algorithms, and incorporation of the X-bar into standard quality monitoring.   This can be accomplished using middleware between the enterprise computer and the instruments, designed to do more than just connect instruments with the medical information system.   The most common problem encountered when installing a medical repository is the repeated slowdown of the system as more users are connected to it.   The laboratory has to be protected from this phenomenon, which can be relieved considerably by an open architecture.   Another function of middleware is to keep track of productivity by instrument and to establish the cost per reportable result.

The Laboratory and Informatics

A few informatics requirements for the processing of tests are:

  1. Reject release of runs that fail Quality Control rules
  2. Flag results that fail clinical rules for automatic review
  3. Ability to construct a report that has correlated information for physician review, regardless of where the test is produced (RBC, MCV, reticulocytes and ferritin)
  4. Ability to present critical information in a production environment without technologist intervention (platelet count or hemoglobin in preparation of transfusion request)
  5. Ability to download 20,000 patients from an instrument for review of reference ranges
  6. Ability to look at quality control of results on more than one test on more than one instrument at a time
  7. Ability to present risks in a report for physicians for medical decisions as an alternative to a traditional cutoff value
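
A minimal sketch of how requirements 1 and 2 might be expressed as a middleware rule: results inside clinically acceptable limits are released without technologist intervention, and the rest are held for review. The analytes, limits, and function names here are hypothetical illustrations, not any vendor's actual rule syntax.

```python
# Hypothetical middleware autoverification sketch. Results that pass the
# clinical rule are released automatically; the rest are held for manual
# review. Analytes, limits, and field names are illustrative only.

CLINICAL_LIMITS = {
    "K":   (2.8, 6.2),   # potassium, mmol/L (hypothetical limits)
    "HGB": (7.0, 20.0),  # hemoglobin, g/dL (hypothetical limits)
}

def autoverify(result):
    """Return 'release' or 'hold' for a dict like {'test': 'K', 'value': 4.1}."""
    low, high = CLINICAL_LIMITS[result["test"]]
    if low <= result["value"] <= high:
        return "release"
    return "hold"  # fails the clinical rule -> flag for technologist review

print(autoverify({"test": "K", "value": 4.1}))  # release
print(autoverify({"test": "K", "value": 7.0}))  # hold
```

In a real installation these rules would live in the middleware layer between the instruments and the medical information system, as described above.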

I list the essential steps of the workload processing sequence and identify where informatics can enhance the process:

Prelaboratory (ER) 1:

Nurse draws specimens from patient (without specimen ID) and places tubes in bag labeled with name

Nurse prints labels after patient is entered.

Labels put on tubes

Orders entered into computer and labels put on tubes

Tubes sent to laboratory

Lab test  is shown as PENDING

Prelaboratory 2:

Tubes in bags sent to lab (by pneumatic tube)

Time of arrival is not same as time of order entry (10 minutes later)

If order entry is not done prior to sending specimen – entry is done in front processing area –

Sent to lab area 10 minutes later after test is entered into computer

Preanalytical:

Centrifugation

Delivery to workareas (bins)

Aliquoting for serological testing

Workstation assignment

Dating and amount of reagents

Blood gas or co-oximetry – no centrifugation

Hematology – CBC – no centrifugation
Send specimen for Hgb A1c

Send specimen for Hgb electrophoresis and Hgb F/Hgb A2

Specimen to Aeroset and then to Centaur

Analytical:

Use of bar code to encode information

Check alignment of bar code

Quality control and calibration at required interval – check before run

Run tests

Manual:

2 hrs per run

enter accession #

enter results 1 accession at a time

Post analytical:

Return to racks or send to another workarea

Verify results

Enter special comments

Special  problems:

Calling results

Add-on tests

Misaligned bar code label

Inability to find specimen

Coagulation

Manual differentials

Informatics and Information Technology

The traditional view of the laboratory environment has been that it is a manufacturing center, but the main product of the laboratory is information, and the environment is a knowledge business.   This will require changes in the education of clinical laboratory professionals.   Biomedical Informatics has been defined as the scientific field that deals with the storage, retrieval, sharing, and optimal use of biomedical information, data, and knowledge for problem solving and decision making. It touches on all basic and applied fields in biomedical science and is closely tied to modern information technologies, notably in the areas of computing and communication.   The services supported by an informatics architecture include operations and quality management, clinical monitoring, data acquisition and management, and statistics supported by information technology.

The importance of a network architecture is clear.   We are moving from computer-centric processing to a data-centric environment. We will soon manage a wide array of complex and inter-related decision-making resources. These resources, commonly referred to as objects and contents, can now include voice, video, text, data, images, 3D models, photos, drawings, graphics, audio and compound documents.  The architectural features required to achieve this are shown in Fig 1.

According to Coiera and Dowton (Coiera E and Dowton SB. Reinventing ourselves: How innovations such as on-line ‘just-in-time’ CME may help bring about a genuinely evidence-based clinical practice. Medical Journal of Australia 2000;173:343-344), echoing Lawrence Weed, “Clinicians in the past were trained to master clinical knowledge and become experts in knowing why and how. Today’s clinicians have no hope of mastering any substantial portion of the medical knowledge base.  Every time we make a clinical decision, we should stop to consider whether we need to access the clinical evidence-base. Sometimes that will be in the form of on-line guidelines, systematic reviews or the primary clinical literature.”

Fig 1

Interoperability across environments

Define representation for storage that is independent of  implementation

Define a representation of collection that is independent of the database – schema, table structures

Informatics and the Education of Laboratory Professionals

The increasing dependence on laboratory information and the incorporation of laboratory information into Evidence-Based Guidelines necessitates a significant component of education in informatics.   The public health service has mandated informatics as a component of competencies for health services professionals (“Core Competencies for Public Health Professionals” compendium developed by the Council on Linkages Between Academia and Public Health Practice.), and nursing informatics competencies have already been written.   Coiera (E. Coiera, Medical informatics meets medical education: There’s more to understanding information than technology, Medical Journal of Australia 1998; 168: 319-320) has suggested 10 essential informatics skills for physicians.

I have put together a list below with items taken from Coiera and the Public Health Service competencies for elaboration of competencies for Clinical Laboratory Sciences.
A.   Personal Maintenance

1.  Understands the dynamic and uncertain nature of medical knowledge and knows how to keep personal knowledge and skills up-to-date
2.  Searches for and assesses knowledge according to the statistical basis of scientific evidence
3.  Understands some of the logical and statistical models of the diagnostic process
4.  Interprets uncertain clinical data and deals with artefact and error
5.  Evaluates clinical outcomes in terms of risks and benefits

B.   Effective Use of Information

Analytic Assessment Skills

6.  Identifies and retrieves current relevant scientific evidence
7.  Identifies the limitations of research
8.  Determines appropriate uses and limitations of both quantitative and qualitative data
9.  Evaluates the integrity and comparability of data and identifies gaps in data sources
10.  Applies ethical principles to the collection, maintenance, use, and dissemination of data and information
11.  Makes relevant inferences from quantitative and qualitative data
12.  Applies data collection processes, information technology applications, and computer systems storage/retrieval strategies
13.  Manages information systems for collection, retrieval, and use of data for decision-making
14.  Conducts cost-effectiveness, cost-benefit, and cost-utility analyses

C.   Effective Use of Information Technology

15.  Selects and utilizes the most appropriate communication method for a given task (e.g., face-to-face conversation, telephone, e-mail, video, voice-mail, letter)
16.  Structures and communicates messages in a manner most suited to the recipient, task, and chosen communication medium
17.  Utilizes personal computers and other office information technologies for working with documents and other computerized files
18.  Utilizes modern information technology tools for the full range of electronic communication appropriate to one’s duties and programmatic area
19.  Utilizes information technology so as to ensure the integrity and protection of electronic files and computer systems
20.  Applies all relevant procedures (policies) and technical means (security) to ensure that confidential information is appropriately protected

I expand on these recommended standards.   The first item is personal maintenance.   This requires continuing education to keep pace with the profession's expanding knowledge base and with access to knowledge that demands critical evaluation.   The profession has historically been paid in recognition of the laboratory's technical, task-oriented contributions, but not for its contribution as knowledge workers.   This can change, but it cannot be realized through the usual baccalaureate educational requirement alone.   Most technologists want to get out into the workforce, but once they are there, what next?   In many institutions it falls to the laboratory to provide the expertise that drives the organization's computing and information restructuring, drawing staff from the transfusion service, microbiology, and elsewhere.   The laboratory is recognized for its information expertise, yet there is still reason to do more.   The mind-set of laboratory staff has been one of manufacturing productivity related to test production, but the data that production represents is information.   We have quality control of the test process, but we are required to manage the total process, including the quality of the information we generate.   Another consideration is that this information is used in clinical trials, where large variation in how it is used is problematic.

The first category for discussion is personal maintenance.  These items concern keeping up with advances in medical knowledge, being critical about the quality of the evidence for current knowledge, and being aware of the statistical underpinnings of that thinking (1-5).   It is not enough to keep up with changes in medical thinking using only the professional laboratory literature. A systematic review of problem topics using PubMed as a guide is also essential.  This requires that the clinical laboratory scientist know how to access the internet and search for key studies concerning the questions being asked.   The reading of abstracts and papers also requires an education in methods of statistical analysis, contingency tables, study design, and critical thinking.   The most common methods used in clinical laboratory evaluation are linear regression, linear regression, and yes, linear regression.  A discussion of distance learning among members of the American Statistical Association reveals that much of statistical education for biologists, chemists, and engineers now comes from *software*.  Knowledge workers in drug development and in molecular diagnostics are increasingly challenged with larger, more complicated data sets, and there is a need to interpret and report results quickly. This need is not confined to basic research or the clinical setting, and it may have to be met without consulting statisticians.  Category A slides into category B, effective use of information.

Effective use of information requires skills that support the design of evaluations of laboratory tests, methods of statistical analysis, and the critical assessment of published work (6-9), and the processes for collecting data, using information technology application, and interpreting the data (10-12).   Items 13 and 14 address management issues.

There is a vocabulary that has to be mastered and certain questions that have to be answered whenever a topic is being investigated.   I identify a number of these at this point in the discussion.

Contingency Table:  A table of frequencies, usually two-way, with event type in columns and test results as positive or negative in rows.   A multi-way table can be used for multivalued categorical analysis.   The conventional 2X2 contingency table is shown below –

                 No disease   Disease
Test negative    A (TN)       B (FN)       A+B
Test positive    C (FP)       D (TP)       C+D
                 A+C          B+D          A+B+C+D

PVN = TN/(FN+TN)

PVP = TP/(TP+FP)

Specificity = TN/(FP+TN)

Sensitivity = TP/(TP+FN)

Type I error:  There is a finding when none actually exists (false positive error).

Type II error:  There is no finding when one actually exists (missed diagnosis)(false negative error).

Sensitivity:  Percentage of true positive results.  D/(B + D)

Specificity:  Percentage of true negative results. A/(A + C)

False positive error rate:  The percentage of results that are positive in the absence of disease (1 – specificity).  C/(A + C)

ROC curve:  Receiver operator characteristic curve is a plot of sensitivity vs 1 – specificity.  Two methods can be compared in ROC analysis by the area under the curve.   The optimum decision point can be identified as within a narrow range of coordinates on the curve.

Predictive value (+)(PVP):  Probability there is disease when a test is positive, D/(C + D), or the percentage of patients with disease given a positive test.   The observed and expected probability may be the same or different.

Predictive value (-)(PVN):  Probability of absence of disease given a negative test result, A/(A + B), or the percentage of patients without disease given a negative test.  The observed and expected probability may be the same or different.
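
The four measures defined here can be computed directly from the 2X2 cell counts (A = TN, B = FN, C = FP, D = TP). A small sketch in Python, using made-up counts for illustration:

```python
# Sensitivity, specificity, and predictive values from the 2X2 cell counts
# (A = TN, B = FN, C = FP, D = TP). The counts below are made up.

def table_metrics(tn, fn, fp, tp):
    return {
        "sensitivity": tp / (tp + fn),  # TP/(TP+FN)
        "specificity": tn / (tn + fp),  # TN/(TN+FP)
        "pvp":         tp / (tp + fp),  # TP/(TP+FP)
        "pvn":         tn / (tn + fn),  # TN/(TN+FN)
    }

m = table_metrics(tn=99, fn=1, fp=35, tp=65)
print(round(m["pvn"], 2), round(m["sensitivity"], 3))  # 0.99 0.985
```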

Power:   When a statement is made that there is no effect, or a test fails to predict the finding of disease, are there enough patients included in the study to see the effect if it exists?   This applies to randomized controlled drug studies as well as to studies of tests. Power protects against the error of finding no effect when one exists.

Selection Bias:  It is common to find a high performance claimed for a test that is not later substantiated when it is introduced and widely used.   Why does this occur?   A common practice in experimental design is to define inclusion and exclusion criteria so that the effect is very specific for the condition and to eliminate interference by “confounders”, unanticipated effects that are not intended.   A common example is the removal of patients with acute renal failure and chronic renal insufficiency because of delayed clearance of analytes from the circulation.   The result is that the test is introduced into a population different from the trial population, with claims based on performance in a limited population.   The error introduced could be the prediction of disease in an individual in whom the effect does not hold.   This error is reduced by eliminating selection bias, which may require multiple studies using patients who have the confounding conditions (renal insufficiency, myxedema).   Unanticipated effects often are not designed into a study.   In many studies of cardiac markers, the design included only patients who had Acute Coronary Syndrome (ACS).  This is an example of selection bias.   Patients who have ACS have chest pain of anginal nature that lasts at least 30 minutes, and usually have more than a single episode in 24 hours.   That is not how the majority of patients suspected of having a myocardial infarct present to the emergency department.   How then is one to evaluate the effectiveness of a cardiac marker?

Randomization:   Randomization is the assignment of participants to either placebo (no treatment) or treatment.   The investigator and the participant enrolled in the study are blinded.   The analyst might also be blinded.   A potential problem is selection bias from dropouts who skew the characteristics of the population.

Critical questions:

What is the design of the study that you are reading?   Is there sufficient power or is there selection bias?  What are the conclusions of the authors?   Are the conclusions in line with the study design, or overstated?

Statistical tests and terms:   

Normal distribution:  Symmetrical bell-shaped curve (Gaussian distribution).   The ±2 standard deviation limits approximate the 95% confidence interval.

Chi square test:  Has a chi square distribution.   Used for measuring probability from a contingency table.   Non-parametric test.

Student’s t-test:  Parametric measure of difference between two population means.

F-test:  An F-test (Snedecor and Cochran, 1983) is used to test whether the standard deviations of two populations are equal.  In comparing two independent samples of size N1 and N2, the F-test provides a measure of the probability that they have the same variance. The estimators of the variance are s1² and s2². We define as the test statistic their ratio T = s1²/s2², which follows an F distribution with f1 = N1 – 1 and f2 = N2 – 1 degrees of freedom.

F Distribution: The F distribution is the ratio of two chi-square distributions with f1 and f2 degrees of freedom, respectively, where each chi-square has first been divided by its degrees of freedom.
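
A trivial sketch of computing the F statistic as defined above, with illustrative sample variances (assessing significance would additionally require the tail probability of the F distribution, which is omitted here):

```python
# The F statistic as defined above: the ratio of two sample variances,
# with N1-1 and N2-1 degrees of freedom. Values are illustrative.

def f_statistic(s1_sq, s2_sq, n1, n2):
    """Return (F, f1, f2) for sample variances s1_sq and s2_sq."""
    return s1_sq / s2_sq, n1 - 1, n2 - 1

F, f1, f2 = f_statistic(4.0, 1.0, 25, 30)
print(F, f1, f2)  # 4.0 24 29
```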

Z scores:  Z scores are sometimes called “standard scores”. The z score transformation is especially useful when seeking to compare the relative standings of items from distributions with different means and/or different standard deviations.
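
The z-score transformation in code, applied to an illustrative potassium value (the population mean and SD below are made up for the example):

```python
# The z-score transformation: how many standard deviations a value lies
# from its distribution's mean. Mean and SD below are illustrative.

def z_score(x, mean, sd):
    return (x - mean) / sd

# A potassium of 5.0 against an illustrative population (mean 4.2, SD 0.4):
print(round(z_score(5.0, 4.2, 0.4), 2))  # 2.0
```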

Analysis of variance:  Parametric measure of two or more population means by the comparison of variances between the populations.   Probability is measured by the F-test.

Linear Regression:  A classic statistical problem is to try to determine the relationship between two random variables X and Y. For example, we might consider height and weight of a sample of adults.  Linear regression attempts to explain this relationship with a straight line fit to the data.  The simplest case, one dependent and one independent variable (which can be visualized in a scatterplot), is simple linear regression.   The linear regression model is the most commonly used model in Clinical Chemistry.
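
A minimal least-squares fit for simple linear regression, in pure Python with toy data, to make the "straight line fit to the data" concrete:

```python
# Least-squares fit of a straight line y = a + b*x to paired data,
# illustrating simple linear regression on toy data.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b  # intercept, slope

a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])  # points on y = 2x + 1
print(a, b)  # 1.0 2.0
```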

Multiple Regression:  The general purpose of multiple regression (the term was first used by Pearson, 1908) is to learn more about the relationship between several independent or predictor variables and a dependent or criterion variable.  The general computational problem that needs to be solved in multiple regression analysis is to fit a straight line to a number of points.  A multiple regression fits a line using two or more predictors to the dependent variable by a model of the form Y = a1X1 + a2X2 + … + b + e, where e is an error term.

Discriminant function:  Discriminant analysis is a technique for classifying a set of observations into predefined classes. The purpose is to determine the class of an observation based on a set of variables known as predictors or input variables. The model is built from a set of observations for which the classes are known, sometimes referred to as the training set. Based on the training set, the technique constructs a set of linear functions of the predictors, known as discriminant functions, such that

L = b1x1 + b2x2 + … + bnxn + c , where the b’s are discriminant coefficients, the x’s are the input variables or predictors and c is a constant.

These discriminant functions are used to predict the class of a new observation with unknown class. For a k class problem k discriminant functions are constructed. Given a new observation, all the k discriminant functions are evaluated and the observation is assigned to class i if the ith discriminant function has the highest value.
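
A sketch of the classification step just described: evaluate all k discriminant functions and assign the class whose function has the highest value. The coefficients below are invented for illustration:

```python
# Evaluate k linear discriminant functions L = b1*x1 + ... + bn*xn + c and
# assign the class with the highest value. Coefficients are invented.

def classify(x, discriminants):
    """discriminants: list of (coefficients, constant); returns class index."""
    scores = [sum(b * xi for b, xi in zip(bs, x)) + c
              for bs, c in discriminants]
    return scores.index(max(scores))

funcs = [([1.0, 0.5], 0.0),   # discriminant for class 0
         ([0.2, 2.0], -1.0)]  # discriminant for class 1
print(classify([3.0, 1.0], funcs))  # 0
print(classify([0.0, 2.0], funcs))  # 1
```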

Nonparametric Methods:

Logistic Regression: Researchers often want to analyze whether some event occurred or not.  The outcome is binary.  Logistic regression is a type of regression analysis where the dependent variable is a dummy variable (coded 0, 1).   The linear probability model, expressed as Y = a + bX + e, is problematic because

  1. The variance of the dependent variable is dependent on the values of the independent variables.
  2. e, the error term, is not normally distributed.
  3. The predicted probabilities can be greater than 1 or less than 0.

The “logit” model has the form:

ln[p/(1-p)] = a + BX + e or

[p/(1-p)] = exp(a)·exp(BX)·exp(e)

where:

  • ln is the natural logarithm, log to the base e, where e = 2.71828…
  • p is the probability that the event Y occurs, p(Y=1)
  • p/(1-p) is the “odds ratio”
  • ln[p/(1-p)] is the log odds ratio, or “logit”

The logistic regression model is simply a non-linear transformation of the linear regression. The logit distribution constrains the estimated probabilities to lie between 0 and 1.
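
A short sketch showing the logit and its inverse (the logistic function), and how the inverse transformation keeps estimated probabilities strictly between 0 and 1 no matter the value of a + BX:

```python
# The logit ln[p/(1-p)] and its inverse (the logistic function). The inverse
# maps any real value of a + BX into a probability strictly between 0 and 1.

import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(z):
    return 1 / (1 + math.exp(-z))

for z in (-10, 0, 10):
    print(round(inv_logit(z), 4))  # prints values near 0, then 0.5, then near 1
```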

Graphical Ordinal Logit Regression:  The logistic regression fits a non-parametric solution to a two-valued event.   The outcome in question might have 3 or more values.

For example, scaled values of a test – low, normal, and high – might have different meanings.   This type of behavior occurs in certain classification problems.  For example, the model has to deal with anemia, normal, and polycythemia, or similarly, neutropenia, normal, and systemic inflammatory response (sepsis).   This model fits the data quite readily.

Clustering methods:  There are a number of methods to classify data when the dependent variable is not known, but is presumed to exist.   A commonly used method classifies data using geometric distance of the average point coordinates.   A very powerful method used is Latent Class Cluster analysis.

Data Extraction:

Data can be extracted from databases, but then have to be worked on in a flat-file format.   The easiest and most commonly used methods are to collect data in a relational database, such as Access (if the format is predefined), or to convert the data into an Excel format.   A common problem is the inability to extract certain data because it is not in an extractable or usable format.

Let us examine how these methods are actually used in a clinical laboratory setting.

The first example is a test introduced almost 30 years ago into quality control in hematology by Brian Bull at Loma Linda University called the X-bar function (also the Bull algorithm).   The method looks at the means of runs of the population data on the assumption that the mean MCV of a stable population does not vary from day to day.  This is a very useful method that can be applied to the evaluation of laboratory performance.   It is a standard quality control technique used in industrial processes since the 1930s.
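
A much-simplified X-bar check in the spirit of the approach described above: compare each run's mean with the grand mean and flag runs that drift beyond a tolerance. This is not Bull's actual smoothing algorithm; the MCV values and the 5% tolerance are illustrative only.

```python
# Simplified X-bar check: flag any run whose mean drifts from the grand
# mean by more than a tolerance. Not Bull's actual smoothing algorithm;
# the MCV values and 5% tolerance are illustrative.

def xbar_flags(runs, tolerance=0.05):
    grand = sum(sum(r) for r in runs) / sum(len(r) for r in runs)
    return [abs(sum(r) / len(r) - grand) / grand > tolerance for r in runs]

runs = [[88, 87, 89], [88, 86, 87], [95, 96, 94]]  # third run has drifted
print(xbar_flags(runs))  # [False, False, True]
```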

We next examine the Chi Square distribution.  Review the formula for calculating chi square and calculations of expected frequencies.  Take a two-by-two table of the type

Effect               No effect          Sum Column

Predictor positive          87                     12                     99

Predictor negative         18                     93                    111

Sum Rows                    105                   105                   210

Experiment with the recalculation of chi square by changing the frequencies in the columns for effect and no effect, keeping the total frequencies the same.  The result is a decrease in the chi square as predictor negative – effect and predictor positive – no effect both increase.  The exercise can be carried out on the chi square calculator using Google to find the site.   The chi square can be used to test the contingency table that is used to indicate the effectiveness of fetal fibronectin for assessing low risk of preterm delivery.

For example,

                  No Preterm Labor   Yes Preterm Labor   Sum Row
FFN – neg                99                  1              100
FFN – pos                35                 65              100
Sum Column              134                 66              200

PVN = 99/100 = 99%

99% observed probability that there will not be preterm delivery with a negative test.

Chi square goodness of fit:

Degrees of freedom: 1
Chi-square = 92.6277702397105
p is less than or equal to 0.001.
The distribution is significant.
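
The chi-square above can be reproduced from the observed table by computing expected cell frequencies from the marginal totals; a short sketch:

```python
# Recompute the chi-square for the fetal fibronectin table above from
# expected cell frequencies derived from the marginal totals.

def chi_square_2x2(a, b, c, d):
    """Cells: a,b = first row; c,d = second row."""
    n = a + b + c + d
    r1, r2 = a + b, c + d
    c1, c2 = a + c, b + d
    observed = (a, b, c, d)
    expected = (r1 * c1 / n, r1 * c2 / n, r2 * c1 / n, r2 * c2 / n)
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

print(round(chi_square_2x2(99, 1, 35, 65), 2))  # 92.63
```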

Examine the effects of scaling continuous data from a heart attack study to obtain ordered intervals.  Look at the chi square test for the heart attack data in an Nx2 table with the table columns as heart attack or no heart attack.  This allows us to determine the significance of the test in predicting heart attack.   Look at the Student t-test for comparing the continuous values of the test between the heart attack and non-heart attack populations.  The t-test is like a one-way analysis of variance with only two values for the factor variable; both compare the means of two populations.  If the result is significant, the null hypothesis that the data are drawn from the same population is rejected, in favor of the alternative hypothesis that they differ.

One can visualize the difference by plotting the means and confidence intervals for the two groups.


We can plot a frequency distribution before we calculate the means and check the distribution around the means.   The simplest way to do this is the histogram.   The histogram for a large sample of potassium values is used to illustrate this.   The mean is 4.2.

We can use a method for quality control called the X-bar (Beckman Coulter has it on the hematology analyzer) to test the deviation from the means of runs.   I illustrate the validity of the X-bar by comparing the means of a series of runs.

Sample size                   =       958

Lowest value                  =        84.0000

Highest value                 =        90.7000

Arithmetic mean               =        87.8058

Median                        =        87.8000

Standard deviation            =         0.9362

————————————————————

Kolmogorov-Smirnov test

for Normal distribution       :   accept Normality (P=0.353)
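A sketch of the same normality check on simulated data with the run summary's mean and SD (the original values are not reproduced here).  Note that testing against a normal distribution whose parameters are estimated from the same sample makes the Kolmogorov-Smirnov p-value approximate; a Lilliefors-type correction would be stricter.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated data with the run summary's mean and SD: mean 87.8058, SD 0.9362, n = 958
x = rng.normal(loc=87.8058, scale=0.9362, size=958)

# Kolmogorov-Smirnov test against a normal with the sample's own mean and SD
d, p = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1)))
print(f"D = {d:.4f}, p = {p:.3f}")   # large p: no evidence against normality
```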

If I compare two means by the t-test, I am testing whether the samples are drawn from the same population or from different ones.   When we introduce a third group, we ask whether all the samples come from a single population, or whether to reject that hypothesis in favor of the alternative that they differ.   This is illustrated by sampling from a group of patients with stable cardiac disease and a group with normal cardiac status, neither of which has acute myocardial infarction.   The output is shown below:

Two-sample t-test on CKMB grouped by OTHER against Alternative = ‘not equal’

Group      N      Mean      SD
    0    660     1.396   3.085
    1     90     4.366   4.976

Separate variance:

t                         =       -5.518

df                        =         98.5

p-value                   =        0.000

Bonferroni adj p-value    =        0.000

Pooled variance:

t                         =       -7.851

df                        =          748

p-value                   =        0.000

Bonferroni adj p-value    =        0.000

Two-sample t-test on TROP grouped by OTHER against Alternative = ‘not equal’

Group      N      Mean      SD
    0    661     0.065   0.444
    1     90     1.072   3.833

Separate variance:

t                         =       -2.489

df                        =         89.3

p-value                   =        0.015

Bonferroni adj p-value    =        0.029

Pooled variance:

t                         =       -6.465

df                        =          749

p-value                   =        0.000

Bonferroni adj p-value    =        0.000

Another example illustrates the application of this significance test.   Beta thalassemia is characterized by an increase in hemoglobin A2.   Thalassemia becomes more complicated when we consider delta-beta deletion and alpha thalassemia.   Nevertheless, we measure hemoglobin A2 by liquid chromatography on the Bio-Rad Variant II.   The comparison of hemoglobin A2 in affected and unaffected subjects is shown below (with random resampling):

Two-sample t-test on A2 grouped by THALASSEMIA DIAGNOSIS against Alternative = ‘not equal’

Group      N      Mean      SD
    0    257     3.250   1.131
    1     61     6.305   2.541

Separate variance:

t                         =       -9.177

df                        =         65.7

p-value                   =        0.000

Bonferroni adj p-value    =        0.000

Pooled variance:

t                         =      -14.263

df                        =          316

p-value                   =        0.000

Bonferroni adj p-value    =        0.000

When we do a paired comparison of the Variant hemoglobin A2 against quantitation by Helena isoelectric focusing, the t-test shows no significant difference.

Paired samples t-test on A2 vs A2E with 130 cases

Alternative = ‘not equal’

Mean A2                   =        3.638

Mean A2E                  =        3.453

Mean difference           =        0.185

SD of difference          =        1.960

t                         =        1.074

df                        =          129

p-value                   =        0.285

Bonferroni adj p-value    =        0.285
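A paired comparison of two methods can be sketched as follows.  The data here are simulated, with no systematic bias between the methods (hypothetical parameters patterned on the output above), so the paired t-test should again find no significant difference:

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(2)
# Hypothetical paired hemoglobin A2 measurements by two methods on 130 samples
a2  = rng.normal(3.6, 1.4, size=130)          # HPLC (Variant)
a2e = a2 + rng.normal(0.0, 1.9, size=130)     # isoelectric focusing, no systematic bias

t, p = ttest_rel(a2, a2e)
print(f"mean difference = {np.mean(a2 - a2e):.3f}, t = {t:.3f}, p = {p:.3f}")
```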

Consider overlay box plots of the troponin I means for normal, stable cardiac patients and AMI patients:

The means of two subgroups may be close, and the confidence intervals around them wide, so that it is not clear whether to accept or reject the null hypothesis.  I illustrate this by comparing the two groups with normal cardiac status and stable cardiac disease, neither having myocardial infarction.   I use the nonparametric Kruskal-Wallis analysis of ranks between the two groups, and I increase the sample size to roughly 100,000 patients with a resampling algorithm.   The results for CKMB and for troponin I are:

Kruskal-Wallis One-Way Analysis of Variance for 93538 cases

Dependent variable is CKMB

Grouping variable is OTHER

Group     Count       Rank Sum
    0     83405    3.64937E+09
    1     10133    7.25351E+08

Mann-Whitney U test statistic =  1.71136E+08

Probability is        0.000

Chi-square approximation =     9619.624 with 1 df

Kruskal-Wallis One-Way Analysis of Variance for 93676 cases

Dependent variable is TROP

Grouping variable is OTHER

Group     Count       Rank Sum
    0     83543    3.59446E+09
    1     10133    7.93180E+08

Mann-Whitney U test statistic =  1.04705E+08

Probability is        0.000

Chi-square approximation =    21850.251 with 1 df
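A sketch of the same comparison with scipy; the data here are simulated lognormal values (hypothetical parameters), not the study's resampled data.  With two groups, the Kruskal-Wallis H statistic is the chi-square approximation reported above, and the test is equivalent to the Mann-Whitney U test:

```python
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(3)
# Hypothetical skewed biomarker values (troponin-like) for two groups
no_mi = rng.lognormal(mean=-2.5, sigma=1.0, size=500)
mi    = rng.lognormal(mean=-0.5, sigma=1.5, size=80)

u, p_u = mannwhitneyu(no_mi, mi, alternative="two-sided")
h, p_h = kruskal(no_mi, mi)

# With two groups, Kruskal-Wallis is equivalent to the Mann-Whitney U test
print(f"U = {u:.0f}, p = {p_u:.3g}")
print(f"H (chi-square approximation) = {h:.1f}, p = {p_h:.3g}")
```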

Examine a unique data set in which a test is done on amniotic fluid to determine whether there is adequate surfactant activity so that fetal lung compliance is good at delivery.  If surfactant activity is inadequate, there is a risk of respiratory distress syndrome (RDS) of the newborn soon after delivery.  The data include the measure of surfactant activity, gestational age, and fetal status at delivery.  This study emphasized the calculation of the odds ratio and probability of RDS from the surfactant measurement, with and without gestational age, for infants delivered within 72 hours of the test.  The statistical method (GOLDminer) has a graphical display with the factor variable on the abscissa and the scaled predictor and odds ratio on the ordinate.  The data acquisition required a multicenter study of the National Academy of Clinical Biochemistry led by John Chapman (Chapel Hill, NC) and Lawrence Kaplan (Bellevue Hospital, New York, NY), published in Clinica Chimica Acta (2002).

The table generated is as follows:

Probability and Odds-Ratios for Regression of S/A on Respiratory Outcomes

S/A interval    Probability of RDS    Odds ratio
0 - 10                 0.87              713
11 - 20                0.69              239
21 - 34                0.43               80
35 - 44                0.20               27
45 - 54                0.08                9
55 - 70                0.03                3
> 70                   0.01                1
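The odds-ratio column can be roughly recovered from the probabilities, taking the > 70 interval as the reference: the odds are p/(1 − p), and each odds ratio is the odds relative to the reference.  Small discrepancies from the published column reflect rounding and the fact that the published values come from the fitted regression itself.

```python
# Probabilities of RDS by S/A interval, as in the table above,
# with the highest interval (> 70) as the reference category
p_rds = {"0-10": 0.87, "11-20": 0.69, "21-34": 0.43, "35-44": 0.20,
         "45-54": 0.08, "55-70": 0.03, ">70": 0.01}

def odds(p):
    """Convert a probability to odds."""
    return p / (1.0 - p)

ref = odds(p_rds[">70"])
for interval, p in p_rds.items():
    print(f"S/A {interval:>6}: P(RDS) = {p:.2f}, odds ratio ~ {odds(p) / ref:.0f}")
```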

There is a plot corresponding to the table above, produced by GOLDminer (graphical ordinal logit display), a patented method.  As the risk increases, the odds ratio (and the probability of an event) increases.  The calculation is an advantage when the factor variable has more than two values, such as heart attack, no heart attack, and something else.  We look at the use of the GOLDminer algorithm again, this time using the acute myocardial infarction and troponin T example.   The ECG finding is scaled as normal (0), nonspecific ST-T changes (1), ST depression or T-wave inversion, and ST elevation.   The troponin T is scaled to: 0.03, 0.031-0.06, 0.061-0.085, 0.086-0.1, 0.11-0.2, and > 0.20 ug/L.   The GOLDminer output is shown below, with troponin T as the second predictor.

X-profile    Joint-Y average score    P(DXSCALE = 0)    P(DXSCALE = 4)

4,5                  3.64                  0.00              0.68
4,4                  3.51                  0.00              0.59
4,3                  3.35                  0.00              0.48
3,5                  3.07                  0.01              0.34
4,1                  2.87                  0.02              0.27
3,4                  2.79                  0.02              0.24
4,0                  2.54                  0.04              0.17
3,3                  2.43                  0.06              0.15
3,2                  2.00                  0.12              0.08
2,5                  1.88                  0.15              0.07
3,1                  1.55                  0.23              0.04
2,4                  1.42                  0.26              0.03
3,0                  1.12                  0.36              0.01
2,3                  1.02                  0.40              0.01
2,2                  0.70                  0.53              0.00
2,1                  0.47                  0.65              0.00
2,0                  0.32                  0.74              0.00
1,3                  0.29                  0.77              0.00
1,2                  0.20                  0.83              0.00
1,1                  0.13                  0.88              0.00
1,0                  0.09                  0.91              0.00

The table above lists the probabilities from the GOLDminer program.   Diagnosis scale 4 is MI; diagnosis 0 is baseline normal.

We return to a comparison of CKMB and troponin I.   CKMB may be used as a surrogate test for examining the use of troponin I.   We scale the CKMB to 3 intervals and the troponin to 6, and construct the 6-by-3 table shown below, with its chi-square analysis.

Frequencies

TNISCALE (rows) by CKMBSCALE (columns)

               0      1      2    Total
     0       709     12      9      730
     1        14      0      2       16
     2         3      0      0        3
     3         2      0      0        2
     4         4      0      0        4
     5        22      5     17       44
Total        754     17     28      799

Expected values

TNISCALE (rows) by CKMBSCALE (columns)

             0          1          2
  0      688.886     15.532     25.582
  1       15.099      0.340      0.561
  2        2.831      0.064      0.105
  3        1.887      0.043      0.070
  4        3.775      0.085      0.140
  5       41.522      0.936      1.542

Test statistic             Value        df      Prob
Pearson chi-square       198.580    10.000     0.000
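The chi-square analysis above can be reproduced directly from the frequency table:

```python
import numpy as np
from scipy.stats import chi2_contingency

# TNISCALE (rows) by CKMBSCALE (columns) frequencies from the table above
table = np.array([[709, 12,  9],
                  [ 14,  0,  2],
                  [  3,  0,  0],
                  [  2,  0,  0],
                  [  4,  0,  0],
                  [ 22,  5, 17]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"Pearson chi-square = {chi2:.3f}, df = {dof}, p = {p:.3g}")
print(expected.round(3))   # matches the expected-value table above
```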

How do we select the best cutoff value for a test?  The standard accepted method is the ROC plot.  We have seen how to calculate sensitivity, specificity, and error rates.  The false-positive error rate is 1 − specificity, and the ROC curve plots sensitivity against 1 − specificity.  The ROC plot requires that the “disease” variable be determined by some means other than the test being evaluated.   What if the true diagnosis is not accurately known?   That question introduces the concept of latent class models.
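A sketch of a ROC analysis on simulated data (hypothetical biomarker values, higher in the diseased group).  The cutoff rule shown, maximizing Youden's J, is one common choice, not the only one:

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(4)
# Hypothetical biomarker: higher on average in the diseased group
y_true  = np.concatenate([np.zeros(300), np.ones(100)])
y_score = np.concatenate([rng.normal(1.0, 1.0, 300), rng.normal(2.5, 1.2, 100)])

fpr, tpr, thresholds = roc_curve(y_true, y_score)   # fpr = 1 - specificity
auc = roc_auc_score(y_true, y_score)
print(f"AUC = {auc:.3f}")

# One common rule for a "best" cutoff: maximize Youden's J = sensitivity - (1 - specificity)
j = tpr - fpr
best = thresholds[np.argmax(j)]
print(f"cutoff maximizing Youden's J: {best:.2f}")
```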

 

A special nutritional study data set was used in which the definition of the effect is not as clear as that for heart attack.  The risk of malnutrition is assessed at the bedside by a dietitian using observed features (presence of a wound, a malnutrition-related condition, and poor oral intake) and by laboratory tests: serum albumin, red cell hemoglobin, and lymphocyte count.  The composite score takes a value of 1 to 4.  Data were collected by Linda Brugler, RD, MBA, at St. Francis Hospital (Wilmington, DE) on 62 patients to determine whether a better model could be developed using new predictors.

The new predictors were laboratory tests not used in the definition of the risk level, since using tests that define the risk level would be circular.  Albumin, lymphocyte count, and hemoglobin were expected to be highly correlated with the risk level because they were used in its definition.  Prealbumin, but not retinol-binding protein or C-reactive protein, was correlated with the risk score and improved the prediction model.

The crosstable for risk level versus albumin is significant at p < 0.0001.

A GOLDminer plot showed scaled prealbumin versus risk levels 3 and 4.  A prealbumin value under 5 indicates severe malnutrition and over 19 indicates no malnutrition; mild and moderate malnutrition fall between these values.

A method called latent class cluster analysis is used to classify the data.   A latent class is invoked when the true classification isn't accurately known.   The result of the analysis is shown in Table 4.   The proportions of the variable subclasses are shown within each class and total 1.00 (100%).

                 Cluster 1    Cluster 2    Cluster 3
Cluster size       0.5545       0.3304       0.1151

PAB1COD   1        0.6841       0.0383       0.0454
          2        0.3134       0.6346       0.6662
          3        0.0024       0.1781       0.1656
          4        0.0001       0.1490       0.1227

ALB0COD   1        0.9491       0.4865       0.1013
          2        0.0389       0.1445       0.0869
          3        0.0117       0.3167       0.5497
          4        0.0003       0.0523       0.2621

LCCOD     1        0.1229       0.0097       0.7600
          2        0.3680       0.0687       0.2381
          4        0.2297       0.2383       0.0016
          5        0.2793       0.6832       0.0002

There are other aspects of informatics that are essential for the educational design of the laboratory professional of the future.  These include preparation of PowerPoint presentations, use of the internet to obtain current information, quality control designed into the process of handling laboratory testing, evaluation of data from correlated workstations, and instrument integration.   An integrated open architecture will be essential for financial management of the laboratory as well.  Continued improvement of the laboratory's technology base will become routine over the next few years.   The education of the CLS for a professional career in medical technology will require an individual who is adaptive and well prepared for a changing technology environment.   The next section of this document describes the information structure needed just to carry out the day-to-day operations of the laboratory.

Cost linkages important to define value

Traditional accounting methods do not capture the cost relationships that are essential for economic survival in a competitive environment: the only items on the ledger are materials and supplies, labor and benefits, and indirect costs.   This is the description of the business set forth in the NCCLS cost manual, but it is not sufficient to account for the dimensions of the business in relation to its activities.   The emergence of spreadsheets and, as importantly, of relational database structures has transformed how we can look at the costing of organizations in relation to how individuals and groups within the organization carry out the business plan and realize the mission set forth by the governing body.   In this sense the traditional model was incomplete, because it accounted only for the costs incurred by departments in a structure that allocates resources to each department based on its assessed use of resources in providing services.   The model has to account for the allocation of resources to product lines of services (as in the DRG model developed by Dr. Eleanor Travers).   A revised model has to take into account two new dimensions.   The first is the allocation of resources to services that are distinct medical and clinical activities: in the laboratory service business there may be distinctive services as well as market sectors.   That is, health care organizations view their markets as defined by service Zip codes, which delineate the lines drawn between their market and the competition (in the absence of clear overlap).

We have to keep in mind that there are service groups, defined by John Thompson and Robert Fetter in the development of the DRGs (Diagnosis Related Groups), that have a real relationship to resource requirements for pediatric, geriatric, obstetric, gynecologic, hematologic, oncologic, cardiologic, medical, and surgical services.   These groups are derived from bundles of ICD codes (International Classification of Diseases) that have comparable within-group use of laboratory, radiology, nutrition, pharmacy, and other resources.   There was an early concern that there was too much variability within DRGs, which was addressed by severity-of-illness adjustment (Susan Horn).   It is now clear that ICDs don't capture a significant part of the content of the medical record; Kaiser and Mayo are devising a method to correct this problem using the SNOMED codes as a starting point.   The point is that the activities, the resources required, and the payment must be aligned for the payment system to be valid.   Of some interest is the association of severity of illness with more than two comorbidities, and with critical values of a few laboratory tests, e.g., albumin, sodium, potassium, hemoglobin, and white cell count.   The actual linkage of these resources to the cost of the 10 or 20 most common diagnostic categories is only a recent development.   As a rule the top 25 categories account for a substantial share of the costs that it is of great interest to control.   The improvement of database technology makes it conceivable that 100 categories of disease classification could be controlled without difficulty in the next ten years.

Quality cost synergism

What is traditionally described is only one dimension of the operation.   It is the business of the organization, but it is only one-third of the description of the organization and the costs that drive it.   The second dimension of the organization's cost profile is obtained only by cost-accounting how the organization creates value.   Value is simply the ratio of outputs to inputs.   The traditional cost accounting model looks only at business value added, but the value generated by an organization is attributable to a service or good produced that a customer is willing to purchase.   We have to measure that value through some variable highly correlated with the value created; transaction times account for part of that measure.  We can borrow from the models used in other industries, transportation being an example.   A colleague has designed a surgical pathology information system on the premise that a report sitting in the pathology office, or a phone inquiry by a surgeon, is a failure of the service.   This is analogous to Southwest Airlines' mission to have the lowest time on the ground in the industry.   The growing complexity of service needs, the capital requirements to support those needs, and contractual requirements are driving redesign of services in a constantly changing environment.

 

Technology requirements

We have gone from predominantly batch and large scale production to predominantly random access and a growing point-of-care application with pneumatic tube delivery systems in the acute care setting in the last 15 years.   The emphasis on population-based health and increasing shift from acute care to ambulatory care has increased the pressure for point-of-care testing to reduce second visits for adjustment of medication.   The laboratory, radiology and imaging services, and pharmacy information have to be directed to a medical record that may be accessed in acute care or ambulatory setting.   We not only have the proposition that faster is better, but access is from anyplace and almost anytime – connectivity.

There has been a strategic discussion about the configuration of information services that is resolving itself through the needs of the marketplace.   Large, self-contained organizations are short-lived, and with the emergence of networked provider organizations there will be no compelling interest in systems that are not tailored to the variety of applications and environments served.   The migration from minicomputers to microcomputer client-server networks will proceed rapidly to N-tiered systems with distributed object-oriented features.   The need for laboratory information systems as a separate application can be seriously challenged by the new paradigm.

 

Utilization and Cost Linkages

Laboratory utilization has to be looked at from more than one perspective in relation to costs and revenues.   The redefinition of panels cuts the marginal cost of producing an additional test, but it doesn't cut the largest cost, that of obtaining and processing the specimen.   There is a fixed cost of operations that has to be covered, which drives the formation of laboratory consolidations to achieve sufficient volume.    If one looks at the capital and labor required to support a minimum volume of testing, the marginal cost of added tests decreases at large volume.   The problem with the consolidation argument is that one has to remove testing from the local site in order to increase the volume, with an anticipated effect on processing cycle time; there is also a significant resource cost for courier service, specimen handling, and reporting.   Let's look at the reverse: what is the effect of decreasing utilization?   One increases the marginal cost per unit of testing on specimens or accessions.   The same basic fixed cost remains, and if the volume of testing needed to break even is barely met, the advantage of additional volume is lost.   Fixing the expected cost per patient or per accession becomes problematic if there is a requirement to reduce utilization.
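The volume argument can be made concrete with a toy cost model (the numbers are illustrative, not from the text): unit cost is the fixed cost spread over volume plus the marginal cost per test, so unit cost rises as utilization falls.

```python
# Illustrative cost model: cost per test = fixed cost / volume + variable cost per test
fixed_cost = 500_000.0      # hypothetical annual cost of instruments, space, minimum staffing
variable_cost = 2.50        # hypothetical reagent and consumable cost per test

def cost_per_test(volume):
    """Unit cost at a given annual test volume."""
    return fixed_cost / volume + variable_cost

for v in (50_000, 100_000, 250_000, 500_000):
    print(f"{v:>7} tests/yr -> ${cost_per_test(v):6.2f} per test")
```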

The key volume in the service sense is the number of specimens processed, which has an enormous impact on processing requirements (the number of tests adds to reagent costs and turnaround time per accession).   One might therefore consider reducing testing that monitors critical patients’ status more frequently than is needed.   One can examine the frequency of the CBC, PT/APTT, panels, electrolytes, glucose, and blood gases in the ICUs.   Use of the laboratory is expected to be more intense in this setting, reflecting severity of illness.   On the other hand, excess redundancy may reflect testing that makes no meaningful contribution to patient care, as suggested by repeated testing with no significant variation in the lab results.

Intangible elements

Competitive advantage may have marginal costs with enormous value enhancement.   This is in the manner of reporting the results.   My colleagues have proposed the importance of a scale-free representation of the laboratory data for presentation to the provider and the patients.   This can be extended further by the scaling of the normalized data into intervals associated with expected risks for outcomes.   This would move the laboratory into the domain of assisting in the management of population adjusted health outcomes.

Blume P. Design of a clinical laboratory computer system. Laboratory and  Hospital Information Systems. In Clinics Lab Med 1991;11:83-104.

Didner RS. Back-to-front systems design: a guns and butter approach. Proc Intl Ergonomics Assoc 1982;–

Didner RS, Butler KA. Information requirements for user decision support: designing systems from back to front. Proc Intl Conf on Cybernetics and Society. IEEE. 1982;–:415-419.

Bernstein LH. An LIS is not all pluses. MLO 1986;18:75-80.

Bernstein LH, Sachs B. Selecting an automated chemistry analyzer: cost analysis. Amer Clin Prod Rev 1988;–:16-19.

Bernstein L, Sachs E, Stapleton V, Gorton J. Replacement of a laboratory instrument system based on workflow design. Amer Clin Prod Rev 1988; –: 22-24.

Bernstein LH. Computer-assisted restructuring services. Amer Clin Prod Rev 1986;9:–.

Bernstein LH, Sachs B, Stapleton V, Gorton J, Lardas O. Implementing a laboratory information management system and verifying its performance. Informatics in Pathol 1986;1:224-233.

Bernstein LH. Selecting a laboratory computer system: the importance of auditing laboratory performance. Amer Clin Prod Rev 1985;–:30-33.

Castaneda-Mendez K, Bernstein LH. Linking costs and quality improvement to clinical outcomes through added value. J Healthcare Qual 1997;19:11-16.

Bernstein LH. The contribution of laboratory information systems to quality assurance. Amer Clin Prod Rev 1987;18:10-15.

Bernstein LH. Predicting the costs of laboratory testing. Pathologist 1985;39:–

Bernstein LH, Davis G, Pelton T. Managing and reducing lab costs. MLO 1984;16:53-56.

Bernstein LH, Brouillette R. The negative impact of untimely data in the diagnosis of acute myocardial infarction. Amer Clin Lab 1990;__:38-40.

Bernstein LH, Spiekerman AM, Qamar A, Babb J. Effective resource management using a clinical and laboratory algorithm for chest pain triage. Clin Lab Management Rev 1996;–:143-152.

Shaw-Stiffel TA, Zarny LA, Pleban WE, Rosman DD, Rudolph RA, Bernstein LH. Effect of nutrition status and other factors on length of hospital stay after major gastrointestinal surgery. Nutrition (Intl) 1993;9:140-145.

Bernstein LH. Relationship of nutritional markers to length of hospital stay. Nutrition (Intl)(suppl) 1995;11:205-209.

Bernstein LH, Coles M, Granata A. The Bridgeport Hospital experience with autologous transfusion in orthopedic surgery. Orthopedics 1997;20:677-680.

Bernstein LH. Realization of the projected impact of a chemistry workflow management system at Bridgeport Hospital. In Quality and Statistics: Total Quality Management. Kowalewski MJ, Ed. 1994;120-133. ASTM STP 1209. Phila, PA.

Bernstein LH, Kleinman GM, Davis GL, Chiga M. Part A reimbursement: what is your role in medical quality assurance? Pathologist 1986;40:–.

Bernstein LH. What constitutes a laboratory quality monitoring program? Amer J Qual Util Rev 1990;5:95-99.

Mozes B, Easterling J, Sheiner LB, Melmon KL, Kline R, Goldman ES, Brown AN. Case-mix adjustment using objective measures of severity: the case for laboratory data. Health Serv Res 1994;28:689-711.

Bernstein LH, Shaw-Stiffel T, Zarny L, Pleban W. An informational approach to likelihood of malnutrition. Nutr (Intl) 1996;12:772-226.


Reporter and Curator: Dr. Sudipta Saha, Ph.D.

Gametogenesis is a biological process by which precursor cells undergo cell division and differentiation to form mature haploid gametes. Human gametogenesis occurs by mitotic division of gametogonia, followed by meiotic division of gametocytes into various gametes. During this process, the gamete genome experiences both programmed and spontaneous changes, among which meiotic recombination shuffles the two haploid somatic genomes to create a unique hybrid haploid genome for each gamete cell, while accumulated replication errors contribute point mutations that may affect the gametes’ functionality. This results in an enormous variety of new genomes being created in the gametes, thereby enabling one’s children to add to the genetic diversity of the human race in a more complex manner than by simply mixing and matching entire parental chromosomes. The genome-wide recombination activity and de novo mutation rate have been directly characterized in many model organisms. However, it has been unclear how an individual human’s genome is edited during gametogenesis. Despite the advances in personal genomics, gamete genome variation within individuals, especially fine-scale personal recombination activity and germline mutation rates, has been as yet generally inaccessible.

An important feature of single-molecule multiple displacement amplification (MDA) is its repetitive use of the genuine originating template molecule. Even if an amplification error happens in the initial stage, a large fraction of products will still preserve the correct base information from the original template, and the statistical power of multiple coverage discriminates these errors from true genomic variation. Using this microfluidic MDA approach, the first genome-wide single-cell analysis of human sperm was reported. A personal recombination map was created for an individual, and the rate of de novo mutations in that individual’s germline was measured. Sampling a large set of meioses from a single individual for fine-scale analysis made it possible to uncover individual-specific features potentially buried in population data. It was proposed that this partially overlapping feature is also the general pattern in individuals: while some recombination hot spots are dying out in some people, new recombination activity evolves to refill the hot spot pool. Support for this theory comes from single-cell analysis. Recombination data from 91 single sperm cells presented a comprehensive landscape of personal recombination activity, and genome-wide meiotic drive and gene conversion were also directly tested. Single-cell whole-genome sequencing further revealed primary information about human sperm genome instability and mutation rate. In this study, microfluidics was applied to single-cell whole-genome amplification; this not only enabled great parallelization but also improved amplification performance, since MDA is sensitive to environmental contamination and traditional bench-top whole-genome amplification requires extensive sample purification.

The data from this study suggest that the germline mutation rate can vary greatly among different individuals, but not among different cells from the same individual. This may explain why the male mutation rate is not always higher than the female. DNA methylation also affects genome instability and C/T point mutation levels, but in opposite ways; a fine-tuned methylation level is therefore required for a high-quality sperm genome. The ability to study a large number of single sperm cells has offered several new insights into meiosis. Studying the germline genome is but one application of single-cell genomics, and the method is expected to find applications in many other fields, including cancer, aging, immunology, and developmental biology.

Source References:

Genome-wide Single-Cell Analysis of Recombination Activity and De Novo Mutation Rates in Human Sperm.

(http://www.ncbi.nlm.nih.gov/pubmed?term=Genome-wide%20Single-Cell%20Analysis%20of%20Recombination%20Activity%20and%20De%20Novo%20Mutation%20Rates%20in%20Human%20Sperm)

Personal Recombination Map from Individual’s Sperm Cell and its Importance

(http://pharmaceuticalintelligence.com/2012/07/23/personal-recombination-map-from-individuals-sperm-cell-and-its-importance/).

