
Archive for the ‘Drug Toxicity’ Category

Healthcare conglomeration to access Big Data and lower costs, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)

Healthcare conglomeration to access Big Data and lower costs

Curator: Larry H. Bernstein, MD, FCAP 

 

 

UPDATED on 3/17/2019

https://www.medpagetoday.com/cardiology/prevention/78202

Medicare Advantage plans may be driving up quality of care in terms of preventive treatment for coronary artery disease patients, but that has had little impact on outcomes compared with fee-for-service Medicare, researchers reported in JAMA Cardiology.

 

The expected benefits have not been realized as easily as anticipated. Gaining access to data sources is less difficult than assembling content suitable for evaluation.

 

Healthcare Big Data Drives a New Round of Collaborations between Hospitals, Health Systems, and Care Management Companies

DARK DAILY

 

January 13, 2016

Recently announced partnerships aim to use big data to improve patient outcomes and lower costs; clinical laboratory test data will have a major role in these efforts

In the race to use healthcare big data to improve patient outcomes, several companies are using acquisitions and joint ventures to beef up and gain access to bigger pools of data. Pathologists and clinical laboratory managers have an interest in this trend, because medical laboratory test data will be a large proportion of the information that resides in these huge healthcare databases.

For health systems that want to be players in the healthcare big data market, one strategy is to enter a risk-sharing venture with third-party care-management companies. This allows the health systems to leverage their extensive amounts of patient data while benefiting from the expertise of their venture partners.

Cardinal Health Acquires 71% Interest in naviHealth

One company that wants to work with hospitals and health systems in these types of arrangements is Cardinal Health. It recently acquired a 71% interest in Nashville-based naviHealth. This company partners with health plans, health systems, physicians, and post-acute providers to manage the entire continuum of post-acute care (PAC), according to a news release on the naviHealth website. NaviHealth’s business model involves sharing the financial risk with its clients and leveraging big data to predict best outcomes and lower costs.

“We created an economic model to take on the entire post-acute-care episode,” declared naviHealth CEO and President Clay Richards in a company news release. “It’s leveraging the technology and analytics to create individual care protocols.”


“The most basic, and the most important, thing is … they [Cardinal Health] share the same core values as we do, which is to be on the right side of healthcare,” naviHealth CEO Clay Richards told The Tennessean. “It’s about how you deliver better outcomes for patients with lower costs: How do you solve the problems [with growing costs]? That’s what we and Cardinal define as being on the right side of healthcare.” (Caption and image copyright: The Tennessean.)

Provider Investments Signal Continuation of Trend

Cardinal Health intends to combine its ability to reduce costs while providing effective care with naviHealth’s evidence-based, personalized post-acute-care plans. This is one approach to harnessing the power of big data to improve patient care. One goal is to focus this expertise on post-acute care, which is one of Medicare’s quality measures.

Patients and their families often are unsure of what to expect after being discharged. And, according to an article published in Kaiser Health News, a 2013 Institute of Medicine (IOM) report noted a link between the quality of post-acute care and healthcare spending following the discharge of Medicare patients.

However, maximizing the use of healthcare big data requires the participation of multiple stakeholders. Information scientists, hospital administrators, software developers, insurers, clinicians, and patients themselves must all perform a role in order for big data to reach its full potential. No single sector will be able to bring the benefits of big data to fruition; rather, collaboration and partnerships will be necessary.

Other Collaborations and Alliances Target Healthcare Big Data

Two other organizations engaged in a similar collaboration are the Mayo Clinic and Optum360, a revenue management services company that focuses on simplifying and streamlining the revenue cycle process. In a press release, the companies announced that they were partnering to “develop new revenue management services capabilities aimed at improving patient experiences and satisfaction while reducing administrative costs for healthcare providers.” (See Dark Daily, “When It Comes to Mining Healthcare Big Data, Including Medical Laboratory Test Results, Optum Labs Is the Company to Watch,” December 14, 2015.)

In order to accomplish this, Mayo will have to share its revenue cycle management (RCM) data with Optum360, which will use the data to devise improved revenue cycle processes and systems.

“What we’re trying to find out, if we can, is what does healthcare cost, and what of that spend really adds value to a patient’s outcome over time, especially with these high-impact diseases,” stated Mayo Clinic President and CEO John Noseworthy, MD, in a story published by the Star Tribune. He was referencing another big data project Mayo is engaged in with UnitedHealth Group. “Ultimately, we as a country have to figure this out, so people can have access to high-quality care and it doesn’t bankrupt them or the country.”


Mayo Clinic President and CEO John Noseworthy, MD, believes big data may be the key to transforming healthcare costs by informing clinical decision-making and altering patient outcomes. (Photo copyright: Mayo Clinic.)

Another interesting healthcare big data partnership is the Pittsburgh Health Data Alliance (The Alliance). It involves a collaboration between Carnegie Mellon University (CMU), the University of Pittsburgh (PITT), and the University of Pittsburgh Medical Center (UPMC). The aim of The Alliance is to take raw data from wearable devices, insurance records, medical appointments, as well as other common sources, and develop ways to improve the health of individuals and the wider community.

The common thread among all these collaborative efforts is a desire to improve outcomes while reducing costs. This is the promise of healthcare big data. And no matter which direction the effort takes, clinical laboratories, which generate a vast amount of critical health data, are in a good position to play important roles involving the contribution of lab test data and identifying ways to use healthcare big data projects to improve patient care.
—Dava Stewart

Read Full Post »

PI3K delta-gamma anticancer agent

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

RP 6350, Rhizen Pharmaceuticals S.A. and Novartis tie-up for Rhizen’s inhaled dual PI3K-delta/gamma inhibitor

by Dr. Anthony Melvin Crasto, Ph.D.

 

[Chemical structure images: compounds (A), (A1), and (A2)]

(S)-2-(1-(9H-purin-6-ylamino)propyl)-3-(3-fluorophenyl)-4H-chromen-4-one (Compound A1 is RP 6350).

 


 

RP 6350, RP6350, RP-6350

(S)-2-(1-(9H-purin-6-ylamino)propyl)-3-(3-fluorophenyl)-4H-chromen-4-one

mw 415

Rhizen Pharmaceuticals is developing RP-6530, a PI3K delta and gamma dual inhibitor, for the potential oral treatment of cancer and inflammation. In November 2013, a phase I trial in patients with hematologic malignancies was initiated in Italy. In September 2015, a phase I/Ib study was initiated in the US in patients with relapsed and refractory T-cell lymphoma. At that time, the study was expected to complete in December 2016.

PATENTS: WO 11/055215, WO 12/151525.

  • Antineoplastics; Small molecules
  • Mechanism of Action Phosphatidylinositol 3 kinase delta inhibitors; Phosphatidylinositol 3 kinase gamma inhibitors
  • Phase I Haematological malignancies
  • Preclinical Multiple myeloma

 

Swaroop K. V. S. Vakkalanka
COMPANY: Rhizen Pharmaceuticals SA

https://clinicaltrials.gov/ct2/show/NCT02017613

 

PI3K delta/gamma inhibitor RP6530: an orally active, highly selective, small molecule inhibitor of the delta and gamma isoforms of phosphoinositide-3 kinase (PI3K) with potential immunomodulating and antineoplastic activities. Upon administration, PI3K delta/gamma inhibitor RP6530 inhibits the PI3K delta and gamma isoforms and prevents the activation of the PI3K/AKT-mediated signaling pathway. This may lead to a reduction in cellular proliferation in PI3K delta/gamma-expressing tumor cells. In addition, this agent modulates inflammatory responses through various mechanisms, including the inhibition of both the release of reactive oxygen species (ROS) from neutrophils and tumor necrosis factor (TNF)-alpha activity. Unlike other isoforms of PI3K, the delta and gamma isoforms are overexpressed primarily in hematologic malignancies and in inflammatory and autoimmune diseases. By selectively targeting these isoforms, PI3K signaling in normal, non-neoplastic cells is minimally impacted or not affected at all, which minimizes the side effect profile for this agent. (NCI Thesaurus)

Company Rhizen Pharmaceuticals S.A.
Description Dual phosphoinositide 3-kinase (PI3K) delta and gamma inhibitor
Molecular Target Phosphoinositide 3-kinase (PI3K) delta ; Phosphoinositide 3-kinase (PI3K) gamma
Mechanism of Action Phosphoinositide 3-kinase (PI3K) delta inhibitor; Phosphoinositide 3-kinase (PI3K) gamma inhibitor
Therapeutic Modality Small molecule

 

Dual PI3Kδ/γ Inhibition By RP6530 Induces Apoptosis and Cytotoxicity In B-Lymphoma Cells
 Swaroop Vakkalanka, PhD*,1, Srikant Viswanadha, Ph.D.*,2, Eugenio Gaudio, PhD*,3, Emanuele Zucca, MD4, Francesco Bertoni, MD5, Elena Bernasconi, B.Sc.*,3, Davide Rossi, MD, Ph.D.*,6, and Anastasios Stathis, MD*,7
 1Rhizen Pharmaceuticals S A, La Chaux-de-Fonds, Switzerland, 2Incozen Therapeutics Pvt. Ltd., Hyderabad, India, 3Lymphoma & Genomics Research Program, IOR-Institute of Oncology Research, Bellinzona, Switzerland, 4IOSI Oncology Institute of Southern Switzerland, Bellinzona, Switzerland, 5Lymphoma Unit, IOSI-Oncology Institute of Southern Switzerland, Bellinzona, Switzerland, 6Italian Multiple Myeloma Network, GIMEMA, Italy, 7Oncology Institute of Southern Switzerland, Bellinzona, Switzerland

RP6530 is a potent and selective dual PI3Kδ/γ inhibitor that inhibited growth of B-cell lymphoma cell lines with a concomitant reduction in the downstream biomarker, pAKT. Additionally, the compound showed cytotoxicity in a panel of lymphoma primary cells. Findings provide a rationale for future clinical trials in B-cell malignancies.

POSTER SESSIONS
Blood 2013 122:4411; published ahead of print December 6, 2013

Introduction Activation of the PI3K pathway triggers multiple events including cell growth, cell cycle entry, cell survival and motility. While α and β isoforms are ubiquitous in their distribution, expression of δ and γ is restricted to cells of the hematopoietic system. Because these isoforms contribute to the development, maintenance, transformation, and proliferation of immune cells, dual targeting of PI3Kδ and γ represents a promising approach in the treatment of lymphomas. The objective of the experiments was to explore the therapeutic potential of RP6530, a novel, small molecule PI3Kδ/γ inhibitor, in B-cell lymphomas.

Methods Activity and selectivity of RP6530 for PI3Kδ and γ isoforms and subsequent downstream activity was determined in enzyme and cell-based assays. Additionally, RP6530 was tested for potency in viability, apoptosis, and Akt phosphorylation assays using a range of immortalized B-cell lymphoma cell lines (Raji, TOLEDO, KG-1, JEKO, OCI-LY-1, OCI-LY-10, MAVER, and REC-1). Viability was assessed using the colorimetric MTT reagent after incubation of cells for 72 h. Inhibition of pAKT was estimated by Western Blotting and bands were quantified using ImageJ after normalization with Actin. Primary cells from lymphoid tumors [1 chronic lymphocytic leukemia (CLL), 2 diffuse large B-cell lymphomas (DLBCL), 2 mantle cell lymphoma (MCL), 1 splenic marginal zone lymphoma (SMZL), and 1 extranodal MZL (EMZL)] were isolated, incubated with 4 µM RP6530, and analyzed for apoptosis or cytotoxicity by Annexin V/PI staining.
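The pAKT quantification step described above (Western blot bands quantified in ImageJ and normalized to Actin) can be sketched as follows. This is a hypothetical illustration: the band intensities below are invented, not taken from the study.

```python
# Hypothetical sketch of the pAKT quantification described above: band
# intensities (arbitrary densitometry units, invented for illustration)
# are normalized to Actin, then expressed as % inhibition vs the DMSO control.
bands = {
    "DMSO":   {"pAKT": 1200.0, "actin": 1000.0},
    "RP6530": {"pAKT":  240.0, "actin":  980.0},
}

def normalized(sample: str) -> float:
    """pAKT band intensity normalized to the Actin loading control."""
    return bands[sample]["pAKT"] / bands[sample]["actin"]

inhibition = 100 * (1 - normalized("RP6530") / normalized("DMSO"))
print(f"pAKT inhibition vs control: {inhibition:.0f}%")
```

Normalizing to Actin corrects for unequal protein loading between lanes before treated and control samples are compared.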

Results RP6530 demonstrated high potency against PI3Kδ (IC50=24.5 nM) and γ (IC50=33.2 nM) enzymes with selectivity over α (>300-fold) and β (>100-fold) isoforms. Cellular potency was confirmed in target-specific assays, namely anti-FcεR1-(EC50=37.8 nM) or fMLP (EC50=39.0 nM) induced CD63 expression in human whole blood basophils, LPS-induced CD19+ cell proliferation in human whole blood (EC50=250 nM), and LPS-induced CD45R+ cell proliferation in mouse whole blood (EC50=101 nM). RP6530 caused a dose-dependent inhibition (>50% @ 2-7 μM) of the growth of immortalized (Raji, TOLEDO, KG-1, JEKO, REC-1) B-cell lymphoma cells. The effect was more pronounced in the DLBCL cell lines OCI-LY-1 and OCI-LY-10 (>50% inhibition @ 0.1-0.7 μM), and the reduction in viability was accompanied by corresponding inhibition of pAKT, with EC50 values of 6 and 70 nM, respectively. Treatment of patient-derived primary cells with 4 µM RP6530 caused an increase in cell death. The fold-increase in cytotoxicity, as evident from PI+ staining, was 1.6 for CLL, 1.1 for DLBCL, 1.2 for MCL, 2.2 for SMZL, and 2.3 for EMZL. Cells in early apoptosis (Annexin V+/PI-) were not different between the DMSO blank and RP6530 samples.
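The selectivity margins quoted above are simple IC50 ratios. A minimal sketch, using the reported δ/γ IC50s (24.5 and 33.2 nM) and hypothetical α/β IC50 values chosen only to be consistent with the reported >300-fold (α) and >100-fold (β) margins:

```python
# Fold-selectivity as an IC50 ratio: IC50(off-target) / IC50(on-target).
# The delta and gamma IC50s are the reported values; the alpha and beta
# IC50s are HYPOTHETICAL placeholders consistent with the reported
# >300-fold (alpha) and >100-fold (beta) selectivity margins.
ic50_nM = {"delta": 24.5, "gamma": 33.2, "alpha": 11000.0, "beta": 3500.0}

def fold_selectivity(off_target: str, on_target: str) -> float:
    """How many-fold weaker the inhibition of the off-target isoform is."""
    return ic50_nM[off_target] / ic50_nM[on_target]

for off in ("alpha", "beta"):
    for on in ("delta", "gamma"):
        print(f"{off}/{on}: {fold_selectivity(off, on):.0f}-fold")
```

A higher ratio means the compound must reach a much higher concentration to inhibit the off-target isoform, which is the basis of the claimed sparing of PI3Kα/β-dependent normal cells.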

Conclusions RP6530 is a potent and selective dual PI3Kδ/γ inhibitor that inhibited growth of B-cell lymphoma cell lines with a concomitant reduction in the downstream biomarker, pAKT. Additionally, the compound showed cytotoxicity in a panel of lymphoma primary cells. Findings provide a rationale for future clinical trials in B-cell malignancies.

Disclosures: Vakkalanka: Rhizen Pharmaceuticals, S.A.: Employment, Equity Ownership; Incozen Therapeutics Pvt. Ltd.: Employment, Equity Ownership. Viswanadha: Incozen Therapeutics Pvt. Ltd.: Employment. Bertoni: Rhizen Pharmaceuticals SA: Research Funding.

 

PI3K Dual Inhibitor (RP-6530)


Therapeutic Area: Respiratory, Oncology – Liquid Tumors, Rheumatology
Molecule Type: Small Molecule
Indication: Peripheral T-cell lymphoma (PTCL), Non-Hodgkin’s Lymphoma, Asthma, Chronic Obstructive Pulmonary Disease (COPD), Rheumatoid Arthritis
Development Phase: Phase I
Route of Administration: Oral

Description

Rhizen is developing dual PI3K gamma/delta inhibitors for liquid tumors and inflammatory conditions.

Situation Overview

Dual PI3K inhibition is strongly implicated as an intervention strategy in allergic and non-allergic inflammation of the airways and in autoimmune diseases, manifested by a reduction in neutrophilia and TNF in response to LPS. Scientific evidence for PI3-kinase involvement in various cellular processes underlying asthma and COPD stems from inhibitor studies and gene-targeting approaches, which makes it a potential target for the treatment of respiratory disease. Resistance to conventional therapies such as corticosteroids in several patients has been attributed to an up-regulation of the PI3K pathway; thus, disruption of PI3K signaling provides a novel strategy aimed at counteracting the immuno-inflammatory response. Given the established criticality of these isoforms in immune surveillance, inhibitors specifically targeting the delta and gamma isoforms would be expected to attenuate the progression of the immune response encountered in most variations of airway inflammation and arthritis.

Mechanism of Action

While alpha and beta isoforms are ubiquitous in their distribution, expression of delta and gamma is restricted to circulating hematogenous cells and endothelial cells. Unlike PI3K-alpha or beta, mice lacking expression of gamma or delta do not show any adverse phenotype indicating that targeting of these specific isoforms would not result in overt toxicity. Dual delta/gamma inhibition is strongly implicated as an intervention strategy in allergic and non-allergic inflammation of the airways and other autoimmune diseases. Scientific evidence for PI3K-delta and gamma involvement in various cellular processes underlying asthma and COPD stems from inhibitor studies and gene-targeting approaches. Also, resistance to conventional therapies such as corticosteroids in several COPD patients has been attributed to an up-regulation of the PI3K delta/gamma pathway. Disruption of PI3K-delta/gamma signalling therefore provides a novel strategy aimed at counteracting the immuno-inflammatory response. Due to the pivotal role played by PI3K-delta and gamma in mediating inflammatory cell functionality such as leukocyte migration and activation, and mast cell degranulation, blocking these isoforms may also be an effective strategy for the treatment of rheumatoid arthritis as well.

Given the established criticality of these isoforms in immune surveillance, inhibitors specifically targeting the delta and gamma isoforms would be expected to attenuate the progression of immune response encountered in airway inflammation and rheumatoid arthritis.

 

http://www.rhizen.com/images/backgrounds/pi3k%20delta%20gamma%20ii.png


Clinical Trials

Rhizen has identified an orally active Lead Molecule, RP-6530, that has an excellent pre-clinical profile. RP-6530 is currently in non-GLP Tox studies and is expected to enter Clinical Development in H2 2013.

In December 2013, Rhizen announced the start of a Phase I clinical trial. The study entitled A Phase-I, Dose Escalation Study to Evaluate Safety and Efficacy of RP6530, a dual PI3K delta /gamma inhibitor, in patients with Relapsed or Refractory Hematologic Malignancies is designed primarily to establish the safety and tolerability of RP6530. Secondary objectives include clinical efficacy assessment and biomarker response to allow dose determination and potential patient stratification in subsequent expansion studies.

 

Partners by Region

Rhizen’s pipeline consists of internally discovered (with 100% IP ownership) novel small-molecule programs aimed at the high-value markets of Oncology, Immuno-inflammation and Metabolic Disorders. Rhizen has been successful in securing critical IP space in these areas, and efforts are under way for further expansion into several indications. Rhizen seeks partnerships with global pharmaceutical partners to unlock the potential of these valuable assets for further development. At present, global rights on all programs are available, and Rhizen is flexible in considering suitable business models for licensing/collaboration.

In 2012, Rhizen announced a joint venture collaboration with TG Therapeutics for global development and commercialization of Rhizen’s Novel Selective PI3K Kinase Inhibitors. The selected lead RP5264 (hereafter, to be developed as TGR-1202) is an orally available, small molecule, PI3K specific inhibitor currently being positioned for the treatment of hematological malignancies.

PATENT
WO2014195888, DUAL SELECTIVE PI3 DELTA AND GAMMA KINASE INHIBITORS

This scheme provides a synthetic route for the preparation of the compound of the formula, wherein all the variables are as described hereinabove.


REFERENCES
In April 2015, preclinical data were presented at the 106th AACR Meeting in Philadelphia, PA; RP-6530 showed GI50 values of 17,028 and 22,014 nM.
In December 2014, data were presented at the 56th ASH Meeting in San Francisco, CA.
In December 2013, preclinical data were presented at the 55th ASH Meeting in New Orleans, LA.
In June 2013, preclinical data were presented at the 18th Annual EHA Congress in Stockholm, Sweden; RP-6530 inhibited the PI3K delta and gamma isoforms with IC50 values of 24.5 and 33.2 nM, respectively.
  • 01 Sep 2015 Phase-I clinical trials in Hematological malignancies (Second-line therapy or greater) in USA (PO) (NCT02567656)
  • 18 Nov 2014 Preclinical trials in Multiple myeloma in Switzerland (PO) prior to November 2014
  • 18 Nov 2014 Early research in Multiple myeloma in Switzerland (PO) prior to November 2014

 

WO2011055215A2 Nov 3, 2010 May 12, 2011 Incozen Therapeutics Pvt. Ltd. Novel kinase modulators
WO2012151525A1 May 4, 2012 Nov 8, 2012 Rhizen Pharmaceuticals Sa Novel compounds as modulators of protein kinases
WO2013164801A1 May 3, 2013 Nov 7, 2013 Rhizen Pharmaceuticals Sa Process for preparation of optically pure and optionally substituted 2-(1-hydroxy-alkyl)-chromen-4-one derivatives and their use in preparing pharmaceuticals
US20110118257 May 19, 2011 Rhizen Pharmaceuticals Sa Novel kinase modulators
US20120289496 May 4, 2012 Nov 15, 2012 Rhizen Pharmaceuticals Sa Novel compounds as modulators of protein kinases

 

 

Read Full Post »

New antibiotics to address anti-microbial resistance

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

WCK 5107 in Phase 1 from Wockhardt

Dr. Anthony Melvin Crasto

 


WCK 5107

Wockhardt Limited

Useful for treating bacterial infections

CAS 1436861-97-0

disclosed in PCT International Patent Application No. PCT/IB2012/054290

trans-Sulphuric acid mono-[2-(N’-[(R)-piperidin-3-carbonyl]-hydrazinocarbonyl)-7-oxo-1,6-diaza-bicyclo[3.2.1]oct-6-yl] ester

(2S,5R)-Sulphuric acid mono-[2-(N’-[(R)-piperidin-3-carbonyl]-hydrazinocarbonyl)-7-oxo-1,6-diaza-bicyclo[3.2.1]oct-6-yl] ester

(1R,2S,5R)-1,6-Diazabicyclo[3.2.1]octane-2-carboxylic acid, 7-oxo-6-(sulfooxy)-, 2-[2-[(3R)-3-piperidinylcarbonyl]hydrazide]


 

In September 2015, the drug was reported to be in a phase I clinical trial. One of the family members, US09132133, claims a combination of sulbactam and WCK-5107.

Bacterial infections continue to remain one of the major causes contributing towards human diseases. One of the key challenges in treatment of bacterial infections is the ability of bacteria to develop resistance to one or more antibacterial agents over time. Examples of such bacteria that have developed resistance to typical antibacterial agents include: Penicillin-resistant Streptococcus pneumoniae, Vancomycin-resistant Enterococci, and Methicillin-resistant Staphylococcus aureus. The problem of emerging drug-resistance in bacteria is often tackled by switching to newer antibacterial agents, which can be more expensive and sometimes more toxic. Additionally, this may not be a permanent solution as the bacteria often develop resistance to the newer antibacterial agents as well in due course. In general, bacteria are particularly efficient in developing resistance, because of their ability to multiply very rapidly and pass on the resistance genes as they replicate.

Treatment of infections caused by resistant bacteria remains a key challenge for the clinician community. One example of such challenging pathogen is Acinetobacter baumannii (A. baumannii), which continues to be an increasingly important and demanding species in healthcare settings. The multidrug resistant nature of this pathogen and its unpredictable susceptibility patterns make empirical and therapeutic decisions more difficult. A. baumannii is associated with infections such as pneumonia, bacteremia, wound infections, urinary tract infections and meningitis.

Therefore, there is a need for the development of newer ways to treat infections that are becoming resistant to known therapies and methods. Surprisingly, it has been found that compositions comprising cefepime and certain nitrogen-containing bicyclic compounds (disclosed in PCT/IB2012/054290) exhibit unexpectedly synergistic antibacterial activity, even against highly resistant bacterial strains.

PATENT

http://www.google.com/patents/WO2013030733A1?cl=en

 

 

……

REFERENCES

Study to Evaluate the Safety, Tolerability, and Pharmacokinetics of WCK-5107 Alone and in Combination With Cefepime (NCT02532140)  https://clinicaltrials.gov/show/NCT02532140
ClinicalTrials.gov Web Site, September 1, 2015: To evaluate the safety, tolerability and pharmacokinetics of single intravenous doses of WCK 5107 alone and in combination with cefepime in healthy adult human subjects.

WO2013030733A1* Aug 24, 2012 Mar 7, 2013 Wockhardt Limited 1,6- diazabicyclo [3,2,1] octan-7-one derivatives and their use in the treatment of bacterial infections
WO2014135931A1* Oct 12, 2013 Sep 12, 2014 Wockhardt Limited A process for preparation of (2s, 5r)-7-oxo-6-sulphooxy-2-[((3r)-piperidine-3-carbonyl)-hydrazino carbonyl]-1,6-diaza-bicyclo [3.2.1]- octane

Read Full Post »

Chemotherapy in AML

Curator: Larry H. Bernstein, MD, FCAP

 

 

Sorafenib Showed Efficacy as Chemotherapy Add-On in AML
Results from the phase II SORAML trial indicated that adding sorafenib to standard chemotherapy for younger patients with acute myeloid leukemia was effective, but also resulted in increased toxicity

Reduced-Intensity HSCT Extends Remission in Older AML Patients
The use of reduced-intensity conditioning HSCT as a method to maintain remission was effective in a select group of older patients with acute myeloid leukemia

 

Sorafenib Showed Efficacy as Chemotherapy Add-On in AML

– See more at: http://www.cancernetwork.com/leukemia-lymphoma/sorafenib-showed-efficacy-chemotherapy-add-aml

 

Results from the phase II SORAML trial indicated that adding sorafenib to standard chemotherapy for younger patients with acute myeloid leukemia (AML) was effective, but also resulted in increased toxicity.

 

The drug increased event-free survival and reduced need for salvage therapy and allogeneic stem cell transplantation, but also produced worse grade 3 or higher fever, diarrhea, bleeding, cardiac events, and rash compared with placebo.

“After a decade of assessing the potential of kinase inhibitors in acute myeloid leukemia, their use in combination with standard treatment is becoming an important option for newly diagnosed younger patients,” wrote Christoph Röllig, MD, of Medizinische Klinik und Poliklinik I, Universitätsklinikum der Technischen Universität in Dresden, Germany, and colleagues in Lancet Oncology.

Patients age 18 to 60 years were enrolled in the phase II study between 2009 and 2011. All patients had to have newly diagnosed, treatment-naive AML and a performance status of 0–2. Patients were randomly assigned to 2 cycles of induction daunorubicin plus cytarabine followed by 3 cycles of high-dose cytarabine consolidation therapy plus either sorafenib 400 mg twice daily (n = 134) or placebo (n = 133).

With a median follow-up of 3 years, the researchers found that adding sorafenib to standard chemotherapy significantly improved event-free survival, from a median of 9 months with placebo to a median of 21 months with sorafenib. Patients assigned sorafenib had a 3-year event-free survival rate of 40% compared with 22% for patients assigned placebo (P = .013).
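As an illustrative aside (this calculation is not from the article), the reported 3-year event-free survival rates imply an absolute risk difference and a number-needed-to-treat of roughly:

```python
# Illustrative sketch: absolute difference in 3-year event-free survival
# (EFS) and the implied number-needed-to-treat (NNT), derived from the
# SORAML rates reported above (40% with sorafenib vs 22% with placebo).
efs_sorafenib = 0.40
efs_placebo = 0.22

arr = efs_sorafenib - efs_placebo  # absolute difference in 3-year EFS
nnt = 1 / arr                      # patients treated per additional event-free patient
print(f"Absolute difference: {arr:.0%}, NNT ~ {nnt:.1f}")
```

By this back-of-the-envelope estimate, about six patients would need to receive sorafenib for one additional patient to remain event-free at 3 years, which is one way to read the trial’s event-free survival benefit.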

“The improvement in event-free survival and relapse-free survival is significant and clinically relevant since salvage treatment with or without allogeneic stem cell transplantation could be prevented or substantially delayed by sorafenib treatment,” the researchers wrote.

At 3 years, 63% of patients assigned sorafenib and 56% of patients assigned placebo were still alive, and the median overall survival was not reached in either group. Patients assigned sorafenib had fewer relapses after complete remission than those assigned placebo (34 vs 54) and, therefore, fewer allogeneic stem cell transplantations were required in the sorafenib group (18 vs 31).

Finally, withdrawal from the trial due to adverse events was more common among patients assigned sorafenib (24% vs 12%).

In an editorial published with the study, Naval Daver, MD, and Marina Konopleva, MD, PhD, of the University of Texas MD Anderson Cancer Center in Houston, pointed out that these results contrast findings by Serve et al who found that “the addition of sorafenib to standard chemotherapy in patients older than 60 years with acute myeloid leukemia resulted in increased toxicity and early mortality,” without improved antileukemic efficacy compared with placebo, suggesting that older patients were unable to tolerate the toxicities associated with the addition of sorafenib to standard chemotherapy.

Daver and Konopleva agreed with Röllig and colleagues, writing that the lack of improvement in overall survival despite an improvement in event-free survival requires “further investigation to develop future strategies that will improve overall survival.”

 

Sorafenib and novel multikinase inhibitors in AML

Naval Daver, Marina Konopleva

The Lancet Oncology 2015.          DOI: http://dx.doi.org/10.1016/S1470-2045(15)00454-4

Induction chemotherapy can produce complete remission in most (50–70%) patients with acute myeloid leukaemia [1]. However, between 50% and 80% of patients relapse, and only 20–30% achieve long-term disease-free survival.

 

Reduced-Intensity HSCT Extends Remission in Older AML Patients

– See more at: http://www.cancernetwork.com/leukemia-lymphoma/reduced-intensity-hsct-extends-remission-older-aml-patients

 

The use of reduced-intensity conditioning hematopoietic stem cell transplantation (HSCT) as a method to maintain remission was effective in a select group of older patients with acute myeloid leukemia (AML), resulting in a nonrelapse mortality (NRM) similar to that seen in younger patients, according to the results of the phase II Cancer and Leukemia Group B 100103/Blood and Marrow Transplant Clinical Trial Network 0502 trial. 

 

“Of critical importance, for the first time (to the best of our knowledge), favorable results in transplantation of older patients have been obtained in a multicenter cooperative group setting, which makes the results more likely to be generalizable,” wrote Steven M. Devine, MD, of the Ohio State University in Columbus, Ohio, and colleagues in the Journal of Clinical Oncology.

According to the study, although patients aged older than 60 have complete remission rates of 50% to 60%, many will ultimately relapse. HSCT is associated with lower rates of relapse compared with chemotherapy in younger patients, but has been considered too toxic for older patients.

This study looked at the use of reduced-intensity conditioning HSCT in an older patient population aged 60 to 74 years. It included 114 patients with AML who were in first complete remission. The median age of patients was 65 years. A little more than half of the patients received transplants from unrelated donors and were given rabbit antithymocyte globulin (ATG) for graft-versus-host disease (GVHD) prophylaxis.

At follow-up, 71 patients had died. The median follow-up of the 43 surviving patients was 1,602 days. At 2 years, the rate of disease-free survival (DFS) was 42% and overall survival (OS) was 48%. Among patients who had unrelated donors, the 2-year DFS was 40% and the OS was 50%.

“The 2-year DFS and OS rates in this group compare favorably to those in studies of conventional chemotherapy–based approaches to remission consolidation in which DFS and OS rates beyond 2 years are typically below 20%,” the researchers wrote.

The NRM at 2 years was 15% and was not different among those patients with related vs unrelated donors. Forty-four percent of patients relapsed at 2 years.

“The 44% relapse rate at 2 years was high, although relapse rates approaching 80% to 90% have been observed in older patients after conventional chemotherapy, suggesting a potential graft-versus-leukemia effect,” the researchers wrote. “Interpretation of our trial results is limited somewhat by lack of consistent knowledge of the mutational status of the patients at diagnosis or of disease burden at complete remission by minimal residual disease assessment.”

At 100 days, the cumulative incidence of grade 2 to 4 GVHD was 9.6% and of grade 3 to 4 GVHD was 2.6%. The incidence of GVHD did not vary by donor type. Chronic GVHD occurred in 28% of patients.

Devine and colleagues noted that these rates were lower than they anticipated.

“The incorporation of rabbit ATG into the conditioning regimen for all patients, including recipients with matched sibling donors, may have contributed to the relatively low rates of GVHD and NRM, as has been observed in previous studies,” they wrote.

 

Read Full Post »

Computer Aided Design, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)

Computer Aided Design

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

IBM’s Watson shown to enhance human-computer co-creativity, support biologically inspired design

Watson Engagement Advisor AI system was trained to “learn” about biologically inspired design from biology articles, then answer questions
November 13, 2015   http://www.kurzweilai.net/ibms-watson-shown-to-enhance-human-computer-co-creativity-support-biologically-inspired-design

http://www.kurzweilai.net/images/Enhancing-Human-Computer-Co-Creativity.jpg

Georgia Institute of Technology researchers, working with student teams, trained a cloud-based version of IBM’s Watson called the Watson Engagement Advisor to provide answers to questions about biologically inspired design (biomimetics), a design paradigm that uses biological systems as analogues for inventing technological systems.

 

Ashok Goel, a professor at Georgia Tech’s School of Interactive Computing who conducts research on computational creativity, used this version of Watson as an “intelligent research assistant” to support teaching about biologically inspired design and computational creativity in the Georgia Tech CS4803/8803 class on Computational Creativity in Spring 2015. Goel found that Watson’s ability to retrieve natural language information could allow a novice to quickly “train up” on complex topics and better determine whether their idea or hypothesis is worth pursuing.

An intelligent research assistant

In the form of a class project, the students fed Watson several hundred biology articles from Biologue, an interactive biology repository, and 1,200 question-answer pairs. The teams then posed questions to Watson about the research it had “learned” regarding big design challenges in areas such as engineering, architecture, systems, and computing.

Examples of questions:

“How do you make a better desalination process for consuming sea water?” (Animals have a variety of answers for this, such as how seagulls filter out seawater salt through special glands.)

“How can manufacturers develop better solar cells for long-term space travel?” One answer: Replicate how plants in harsh climates use high-temperature fibrous insulation material to regulate temperature.

Watson effectively acted as an intelligent sounding board to steer students through what would otherwise be a daunting task of parsing a wide volume of research that may fall outside their expertise.

This version of Watson also prompts users with alternate ways to ask questions for better results. Those results are packaged as a “treetop” where each answer is a “leaf” that varies in size based on its weighted importance. This was intended to allow the average user to navigate results more easily on a given topic.

 

http://www.kurzweilai.net/images/GT-Watson-Plus-Concept-Results.png

Results from training the Watson AI system were packaged as a “treetop” where each answer is a “leaf” that varies in size based on its weighted importance. Each leaf is the starting point for a Q&A with Watson. (credit: Georgia Tech)
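The "treetop" ranking described above can be sketched in a few lines: answers are sorted by weighted importance and each "leaf" is sized in proportion to its weight. The answers, weights, and function name below are invented for illustration; they are not part of the Georgia Tech system.

```python
# Minimal sketch of the "treetop" visualization logic: rank candidate
# answers by weighted importance and size each "leaf" proportionally.
# All names, answers, and weights here are hypothetical.

def build_treetop(answers, max_size=100):
    """Return (answer, leaf_size) pairs, largest weight first."""
    ranked = sorted(answers.items(), key=lambda kv: kv[1], reverse=True)
    peak = ranked[0][1]
    return [(answer, round(max_size * w / peak)) for answer, w in ranked]

# Hypothetical weighted answers to the desalination question above
answers = {
    "seagull salt glands": 0.92,
    "mangrove root filtration": 0.61,
    "fish kidney osmoregulation": 0.34,
}
for answer, size in build_treetop(answers):
    print(f"{answer}: leaf size {size}")
```

Each leaf would then serve as the starting point for a follow-up Q&A, as in the screenshot below.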

 

“Imagine if you could ask Google a complicated question and it immediately responded with your answer — not just a list of links to manually open,” says Goel. “That’s what we did with Watson. Researchers are provided a quickly digestible visual map of the concepts relevant to the query and the degree to which they are relevant. We were able to add more semantic and contextual meaning to Watson to give some notion of a conversation with the AI.”

 

http://www.kurzweilai.net/images/Watson-Screenshot.png

Georgia Tech’s Watson Engagement Advisor (credit: Georgia Tech)

 

Goel believes this approach to using Watson could assist professionals in a variety of fields by allowing them to ask questions and receive answers as quickly as in a natural conversation. He plans to investigate other areas with Watson such as online learning and healthcare.

The work was presented at the Association for the Advancement of Artificial Intelligence (AAAI) 2015 Fall Symposium on Cognitive Assistance in Government, Nov. 12–14, in Arlington, Va. and was published in Procs. AAAI 2015 Fall Symposium on Cognitive Assistance (open access).

 

Abstract of Using Watson for Enhancing Human-Computer Co-Creativity

We describe an experiment in using IBM’s Watson cognitive system to teach about human-computer co-creativity in a Georgia Tech Spring 2015 class on computational creativity. The project-based class used Watson to support biologically inspired design, a design paradigm that uses biological systems as analogues for inventing technological systems. The twenty-four students in the class self-organized into six teams of four students each, and developed semester-long projects that built on Watson to support biologically inspired design. In this paper, we describe this experiment in using Watson to teach about human-computer co-creativity, present one project in detail, and summarize the remaining five projects. We also draw lessons on building on Watson for (i) supporting biologically inspired design, and (ii) enhancing human-computer co-creativity.

sjwilliams

Interesting; however, Google had just announced a big AI venture of their own. It is curious why they needed such a defined training set. It seems, as was said in the EmTech MIT lectures, that AI is still in its infancy and is nowhere near a true AI system. It is also interesting to note how rapidly China is expanding its supercomputing power (growth of supercomputers in China is dwarfing that in the US; in fact, the US has 20 fewer supercomputers).

Read Full Post »

Predict Toxicity from Structure

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

chemTox

http://www.cyprotex.com/insilico/physiological_modelling/chemtox

Toxicity Prediction Directly from Chemical Structure

  • Predicts key toxicity parameters (Ames mutagenicity, rat acute dose LD50 following iv or po administration) and aqueous solubility directly from structure.
  • No in vitro physicochemical or toxicity data required.
  • Save money and time by allowing toxicity to be assessed virtually (no synthesis required).
  • Provides early stage filter for directing chemistry and prioritizing screening.

A virtual screening tool for predicting toxicity from chemical structure alone.

 

chemTox Input Requirements: Chemical structure, e.g. SMILES, mol, or sdf
chemTox Data Delivery: Predicted toxicity measures (Ames mutagenicity, rat acute dose LD50 following iv or po administration) and predicted aqueous solubility

 

chemTox is implemented as a node for the KNIME analytics platform which executes a model of the workflow illustrated in Figure 1 below.

Figure 1
chemTox workflow.
KNIME can be downloaded easily and for free. Cyprotex then provide the bespoke toxicity modules (chemTox) for the KNIME platform.
Properties
The following properties are reported:

  1. Ames mutagenicity classification (mutagenic/non-mutagenic).
  2. Ames mutagenicity probability (probability of being a mutagen).
  3. Rat LD50 (mmol/kg) following acute administration by intravenous route.
  4. Rat LD50 (mmol/kg) following acute administration by oral route.
  5. Aqueous solubility (mol/l).

Output is a KNIME data table facilitating saving to a file or database, or using the predictions as inputs to subsequent workflow steps.
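The five predicted properties above can be pictured as a single record per input structure. The sketch below is purely illustrative; the field names and values are assumptions, not Cyprotex's actual KNIME table schema.

```python
# Sketch of the kind of per-compound record a chemTox-style node might
# emit: one row per input structure carrying the five predicted
# properties listed above. Field names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class ToxPrediction:
    smiles: str                      # input chemical structure
    ames_mutagenic: bool             # classification: mutagenic / non-mutagenic
    ames_probability: float          # probability of being a mutagen
    rat_ld50_iv_mmol_kg: float       # acute LD50, intravenous route
    rat_ld50_po_mmol_kg: float       # acute LD50, oral route
    aqueous_solubility_mol_l: float  # predicted aqueous solubility

# Hypothetical prediction for ethanol (SMILES "CCO")
row = ToxPrediction("CCO", False, 0.08, 5.1, 12.4, 1.6)
print(row)
```

A table of such rows maps naturally onto a KNIME data table, where each field becomes a column consumed by downstream workflow steps.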

Model Development
  • Models are quantitative structure–property relationships (QSPR) that calculate the toxicity properties of interest in terms of a compound’s structural descriptors.
  • Models have been trained using large, well-validated datasets.
  • Training was performed using up-to-date, rigorous statistical pattern recognition methods.
  • Repeated 10-fold cross-validation has been used to generate the most robust statistics for prediction performance.
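The repeated 10-fold cross-validation named above can be sketched as a plain-Python index generator. This is illustrative only and is not Cyprotex's actual model-training code.

```python
# Pure-Python sketch of repeated k-fold cross-validation: the data are
# reshuffled for each repeat, then split into k disjoint folds, each
# fold serving once as the held-out test set.
import random

def repeated_kfold(n_samples, k=10, repeats=3, seed=0):
    """Yield (train_idx, test_idx) pairs: k folds per repeat."""
    rng = random.Random(seed)
    for _ in range(repeats):
        idx = list(range(n_samples))
        rng.shuffle(idx)
        for fold in range(k):
            test = idx[fold::k]              # every k-th shuffled index
            held = set(test)
            train = [i for i in idx if i not in held]
            yield train, test

# 2 repeats of 10-fold CV over 100 compounds -> 20 train/test splits
splits = list(repeated_kfold(100, k=10, repeats=2))
print(len(splits))  # 20
```

Averaging performance over all such splits is what yields the robust prediction statistics the bullet list refers to.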

Data from chemTox

Figure 2
Receiver operating characteristic (ROC) plots for classification of rat acute LD50 following oral administration: (a) LD50 < 5 mg/kg; (b) LD50 < 50 mg/kg; (c) LD50 < 300 mg/kg.

Statistics
  • Area under the ROC curve: 0.905
  • Sensitivity: 0.85
  • Specificity: 0.82

Table 1
Statistics for predicted Ames mutagenicity based on 4336 compounds (1935 non-mutagenic and 2401 mutagenic).

Model variants can be generated having different balance of sensitivity and specificity to suit different screening requirements.
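The sensitivity and specificity reported in Table 1 follow directly from confusion-matrix counts, which is also where the sensitivity/specificity trade-off mentioned above is tuned. The counts below are hypothetical, chosen only to be consistent with the reported dataset (2401 mutagens, 1935 non-mutagens) and performance.

```python
# Sensitivity and specificity computed from confusion-matrix counts.
# The TP/FN/TN/FP values are hypothetical, not Cyprotex's actual data.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

tp, fn = 2041, 360   # hypothetical split of the 2401 known mutagens
tn, fp = 1587, 348   # hypothetical split of the 1935 known non-mutagens

sens, spec = sensitivity_specificity(tp, fn, tn, fp)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")  # 0.85, 0.82
```

Shifting the classifier's decision threshold moves compounds between these cells, which is how model variants with a different sensitivity/specificity balance are produced.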

 

SJ Williams

Tempting, but there have been other databases like ToxNet which have used SAR (structure–activity relationships) for Ames prediction, and usually that is as far as it can go. This is usually a first screen, but long-term carcinogenicity studies are still required for obvious reasons. In addition, no system can predict IDILI (idiosyncratic drug-induced liver injury), which has been and still is plaguing the drug development industry. This is where relationships need to be worked out, and this problem will need a more in-depth ‘omic strategy, as there are no SARs which correlate with these “idiosyncratic” toxicities. However, be that as it may, there have been chemicals which failed the Ames test yet were not carcinogenic in subchronic or chronic tests, so it is still good to conduct those studies anyway, even though most regulatory bodies give a failed Ames test a thumbs down.

Read Full Post »

Adenosine Receptor Agonist Increases Plasma Homocysteine

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

The Adenosine Receptor Agonist 5’-N-Ethylcarboxamide-Adenosine Increases Mouse Serum Total Homocysteine Levels, Which Is a Risk Factor for Cardiovascular Diseases

Spring Zhou Editor at Scientific Research Publishing

I would like to share this paper with you. Any comments on this article are welcome.

 

An increase in total homocysteine (Hcy) levels (protein-bound and free Hcy in the serum) has been identified as a risk factor for vascular diseases. Hcy is a product of the methionine cycle and is a precursor of glutathione in the transsulfuration pathway. The methionine cycle mainly occurs in the liver, with Hcy being exported out of the liver and subsequently bound to serum proteins. When the non-specific adenosine receptor agonist 5’-N-ethylcarboxamide-adenosine (NECA; 0.1 or 0.3 mg/kg body weight) was intraperitoneally administered to mice that had been fasted for 16 h, total Hcy levels in the serum significantly increased 1 h after its administration. The NECA treatment may have inhibited transsulfuration because glutathione levels were significantly decreased in the liver. After the intraperitoneal administration of a high dose of NECA (0.3 mg/kg body weight), elevations in total Hcy levels in the serum continued for up to 10 h. The mRNA expression of methionine metabolic enzymes in the liver was significantly reduced 6 h after the administration of NECA. NECA-induced elevations in total serum Hcy levels may be maintained in the long term through the attenuated expression of methionine metabolic enzymes.

 

Comments:

  1.  Is level of protein consumption a factor?
  2. Is reliance on plant food products a factor?
  3. What are the levels of transthyretin?
  4. Is there a concomitant decrease in vitamin A or vitamin D?

 

 

The Adenosine Receptor Agonist 5’-N-Ethylcarboxamide-Adenosine Increases Mouse Serum Total Homocysteine Levels, Which Is a Risk Factor for Cardiovascular Diseases

Shigeko Fujimoto Sakata*, Koichi Matsuda, Yoko Horikawa, Yasuto Sasaki     Faculty of Nutrition, Kobe Gakuin University, Kobe, Japan.

http://www.scirp.org/journal/PaperInformation.aspx    DOI: 10.4236/pp.2015.610048

Cite this paper

Sakata, S. , Matsuda, K. , Horikawa, Y. and Sasaki, Y. (2015) The Adenosine Receptor Agonist 5’-N-Ethylcarboxamide-Adenosine Increases Mouse Serum Total Homocysteine Levels, Which Is a Risk Factor for Cardiovascular Diseases. Pharmacology & Pharmacy, 6, 461-470. doi: 10.4236/pp.2015.610048.
An increase in total serum homocysteine levels (total Hcy: serum protein-bound and free Hcy) has been identified as a risk factor for cardiovascular disease [1] [2] and liver fibrosis [3]. The normal range of total Hcy in adults is typically 5 – 15 μM, with the mean level being approximately 10 μM [2]. Plasma Hcy concentrations were previously found to be strongly associated with the presence and number of small infarctions, or infarction of the putamen in elderly diabetic patients [4]. High levels of Hcy have been shown to induce endoplasmic reticulum (ER) stress and increase the production of reactive oxygen species (ROS) [5]. Hcy has strong reducibility and modifies disulfide bonds in proteins. Only 1% to 2% of Hcy occurs as thiol homocysteine in the serum; 75% of Hcy has been suggested to bind to proteins through disulfide bonds with protein cysteines [6]. Hcy is formed as an intermediary in methionine metabolism [7] [8]. Methionine metabolism mainly occurs in the livers of mammals. Methionine receives an adenosine group from ATP to become S-adenosylmethionine (AdoMet) in the methionine cycle. This reaction is catalyzed in the liver by liver-specific methionine adenosyltransferase I/III (MAT I/III), which is encoded by the methionine adenosyltransferase 1A (MAT1A) gene [9]. AdoMet then transfers its methyl group to a large number of compounds, a process that is catalyzed by various methyltransferases (e.g., glycine N-methyltransferase: GNMT; DNA methyltransferase; phosphatidylethanolamine N-methyl- transferase), to produce S-adenosylhomocysteine (AdoHcy). Hcy is formed from AdoHcy by AdoHcy hydrolase (SAHH). The reaction that generates Hcy from AdoHcy is reversible, and AdoHcy from Hcy is shown to be thermodynamically favored over the synthesis of Hcy [10]. A previous study reported that Hcy levels were very low in the liver [11]. 
This reaction then proceeds toward the synthesis of Hcy when the products (Hcy and adenosine) are removed by further metabolism [12]. Three enzymes metabolize Hcy, with the betaine-homocysteine S-methyltransferase (BHMT) and methionine synthase (MS) reactions both yielding methionine. A large proportion of Hcy in the liver is remethylated by BHMT [3]. The third enzyme, cystathionine β-synthase (CBS) catalyzes Hcy to cystathionine in the transsulfuration pathway. Previous studies of whole body methionine kinetics demonstrated that 62% of Hcy was converted to cystathionine during each cycle in males fed a basal diet, resulting in the production of glutathione (GSH), while 38% of Hcy was remethylated to methionine [13]. Hcy is located at an important regulatory branch point: remethylation to methionine; conversion to cystathionine; export from the cells.
A decrease in intracellular ATP levels, accompanied by the accumulation of 5’-AMP and subsequently adenosine, is known to follow ischemia. Adenosine levels in interstitial fluids were shown to increase 100 – 1000- fold from basal levels (10 – 300 nM) with ischemia [14]. Furthermore, adenosine levels in hepatocytes were increased by a hypoxic challenge, with excess amounts of adenosine being exported out of cells [14]. Adenosine levels were also found to increase 10-fold due to hypoxia, stress, and inflammation [15]. Adenosine has been shown to activate A1, A2a, and A3 receptors with EC50 values in the range of 0.2 – 0.7 μM, and also A2b receptors with an EC50 of 24 μM [16]. A1 and A3 receptors have been classified as adenylate cyclase inhibitory receptors, and A2a and A2b receptors as adenylate cyclase-activating receptors [17]. The activation of adenosine receptors accompanied by ischemia may increase total Hcy levels in the serum because hepatic ischemia is known to decrease the content of GSH and activity of MAT [18].
We previously reported that the non-specific adenosine receptor agonist 5’-N-ethylcarboxamide-adenosine (NECA) increased serum glucose levels and the expression of a glucogenic enzyme (glucose 6-phosphatase) in the liver [19] [20]. Based on the dose of NECA administered in these studies and plasma concentrations after the administration of other adenosine agonists [21], it was inferred that the serum NECA concentration was in the μM range and also that NECA activated adenosine A2b receptors. In the present study, we measured methionine metabolites, including Hcy, in NECA-treated mice in order to determine whether the activation of adenosine receptors increased total Hcy levels in the serum. The results obtained clearly demonstrated that NECA increased total Hcy levels in the serum.
Measurement of Methionine Metabolites

AdoMet and AdoHcy levels in the liver were measured using an HPLC method [25] and total GSH in the liver was measured using a microtiter plate assay [26], as described previously [23]. Total Hcy and total cysteine levels (total Cys: free and protein-bound cysteine) in the serum were measured using an HPLC method [27]. Briefly, a mixture of 50 μL of serum, 25 μL of an internal standard, and 25 μL of phosphate-buffered saline (PBS, pH 7.4) was incubated with 10 μL of 100 mg/mL TCEP for 30 min at room temperature in order to reduce and release protein-bound thiols. After this incubation, 90 μL of 100 mg/mL trichloroacetic acid containing 1 mmol/L EDTA was added for deproteinization, centrifuged at 15,000 ×g for 10 min, and 50 μL of the supernatant was added to a tube containing 10 μL of 1.55 mol/L NaOH; 125 μL of 0.125 mol/L borate buffer containing 4 mmol/L EDTA, pH 9.5; and 50 μL of 1 mg/mL SBD-F in the borate buffer. The sample was then incubated for 60 min at 60˚C. HPLC was performed on a Waters M-600 pump equipped with a Waters 2475 Multi λ Fluorescence Detector (385 nm excitation, 515 nm emission). The separation of SBD-derivatized thiols was performed on a μ-BONDASPHERE C18 column (Waters, 5 μm, 100 A, 150 × 3.9 mm) with a 20-μL injection volume and 0.1 mol/L acetate buffer, pH 5.5, containing 30 ml/L methanol as the mobile phase at a flow rate of 1.0 mL/min and column temperature of 29˚C.
3.1. Effects of NECA on Total Hcy and Total Cys Levels in the Serum

As shown in Table 1, serum total Hcy and total Cys levels significantly increased after 16 h of fasting. The administration of a low dose of NECA (NECA0.1 group) to mice fasted for 16 h resulted in higher serum total Hcy levels than those in the control group at 1 h (Experiment 1). Serum total Hcy levels were also significantly elevated at 3 h (Experiment 2), but were not significantly different from those in the control group at 6 h (Experiment 3). The administration of a high dose of NECA (NECA0.3 group) resulted in significantly higher serum total Hcy levels than those in the control group at 1 h, 3 h, 6 h, and 10 h (Experiments 4, 5, 6, and 7), gradually increasing Hcy levels to 19.7 μM. The effects of NECA on serum total Cys levels were the same as those on total Hcy levels.
Table 1. Effects of NECA on the content of total homocysteine and total cysteine in the serum.

3.2. Effects of NECA on Other Methionine Metabolite Levels in the Liver

We previously reported that fasting for 16 h decreased AdoMet and GSH levels, and increased AdoHcy levels in the livers of mice [23]. In the present study, as shown in Table 2, the administration of a low dose of NECA (NECA0.1 group) to mice fasted for 16 h resulted in lower liver GSH levels than those in the control group at 1 h (Experiment 1). Liver GSH levels were also significantly lower at 3 h (Experiment 2), while GSH levels were not significantly different from those in the control group at 6 h (Experiment 3). The administration of a high dose of NECA (NECA0.3 group) resulted in liver GSH levels that were significantly lower than those in the control group at 1 h, 6 h, and 10 h (Experiments 4, 6, and 7). The effects of NECA on total Hcy levels in the serum and GSH levels in the liver were similar at each dose and time. Furthermore, the low and high doses of NECA both led to significantly higher AdoMet levels than those in the control group at 1 h (Experiments 1 and 4). AdoMet levels at 3 h, 6 h, and 10 h were not significantly different from those in the control group (Experiments 2, 3, 5, 6, and 7). AdoHcy levels were significantly lower in the NECA0.3 group than in the control group 6 h and 10 h after the administration of NECA (Experiments 6 and 7), while the administration of a low dose of NECA had less of an impact on AdoHcy levels.

Table 2. Effects of NECA on the content of methionine metabolites in the liver.

3.3. Effects of NECA on mRNA Expression of Methionine Cycle Enzymes in the Liver

Figure 1 shows changes in the mRNA expression of methionine cycle enzymes in Experiments 4, 5, and 6. The expression of methionine cycle enzymes did not significantly change 1 h after the administration of NECA. The expression of MAT1A mRNA was significantly decreased in the liver 6 h after the NECA treatment, while that of MAT2A was increased. The changes observed in the expression of MAT in the present study were consistent with previous findings obtained in ischemic livers [18] or with liver regeneration [28]. The expression of GNMT, which eliminates excess AdoMet, was significantly decreased 6 h after the NECA treatment. The expression of CBS, which converts Hcy to cystathionine through the transsulfuration pathway, and BHMT, which converts Hcy to methionine, was also decreased at 6 h.

Figure 1. Effects of NECA on the mRNA expression of methionine cycle enzymes in the mouse liver. Northern hybridization was performed on the liver RNA of mice in experiments 4, 5, and 6. The mean ± SEM of the ratio of each enzyme mRNA to the level of the 18S rRNA signal is shown as an arbitrary unit. Unpaired Student’s t-tests were used to compare NECA- treated groups with the control groups. *p < 0.05, **p < 0.01: significantly different from each control.
4. Discussion

In the present study, an increase in total Hcy levels and AdoMet levels, and a decrease in GSH levels, occurred 1 h after the NECA treatment. These results were not due to changes in the expression of methionine metabolic enzymes, which remained unchanged 1 h after the NECA treatment (Figure 1). The effects of NECA on methionine metabolism are summarized in Figure 2. No previous study has demonstrated that adenosine has the ability to directly affect CBS; however, the overproduction of carbon monoxide (CO), which is generated by heme oxygenase (HO), is found to inhibit transsulfuration [11]. CO has been shown to inhibit CBS activity and increase AdoMet concentrations [11]. Adenosine and NECA were previously reported to markedly induce HO in macrophages [29]. Hcy, which is a substrate of CBS, may be increased by NECA via the CO-induced inhibition of CBS, and GSH may be decreased by the CO-induced inhibition of transsulfuration. However, the mechanism by which NECA affects transsulfuration in the short term has not yet been elucidated.
Figure 2. Effects of NECA on the methionine metabolic pathway. MAT: methionine adenosyltransferase, GNMT: glycine N-methyltransferase, CBS: cystathionine β-synthase, BHMT: betaine-homocysteine S-methyltransferase, MS: methionine synthase (Map is based on Sakata SF 2005).
GSH was maintained at a low level for up to 10 h by the NECA0.3 treatment and transsulfuration may have been continuously inhibited by the NECA0.3 treatment. Total Hcy levels were also continuously increased for up to 10 h by the NECA0.3 treatment, and decreased AdoHcy levels were observed 6 h and 10 h after the NECA0.3 treatment. Long-term elevations in serum total Hcy levels by NECA may be maintained by attenuating the expression of methionine metabolic enzymes via the following mechanisms: The expression of methionine metabolic enzymes in the liver was reduced 6 h after the NECA0.3 treatment (Figure 1); the flow of the methionine cycle may have been decreased by changes in the expression of MAT (decreased liver-specific MAT1A expression and increased non-liver type MAT2A expression) because MATIII (Km for methionine: 215 μM – 7 mM) is the true liver-specific isoform responsible for methionine metabolism [30] and the generation rate of AdoMet by MATII (non-liver type enzyme) was modest with a low Km (80 μM for methionine) [31]; inhibition of the methyltransferases, BHMT [32] and GNMT [33], induces hyperhomocysteinemia; decreases in AdoHcy levels may be caused by reductions in methyltransferase levels. However, the mechanisms by which NECA continuously increased total Hcy levels have not yet been elucidated in detail.

5. Conclusion

The present study confirmed that the non-specific adenosine receptor agonist NECA continuously increased total Hcy levels in the serum. The inhibition of adenosine receptors may decrease the risk of cardiovascular diseases because an increase in serum total Hcy levels is a known risk factor.

References

[1] Antoniades, C., Antonopoulos, A.S., Tousoulis, D., Marinou, K. and Stefanadis, C. (2009) Homocysteine and Coronary Atherosclerosis: from Folate Fortification to the Recent Clinical Trials. European Heart Journal, 30, 6-15.
http://dx.doi.org/10.1093/eurheartj/ehn515
[2] Refsum, H., Ueland, P.M., Nygard, O. and Vollset, S.E. (1998) Homocysteine and Cardiovascular Disease. Annual Review of Medicine, 49, 31-62.
http://dx.doi.org/10.1146/annurev.med.49.1.31
[3] Garcia-Tevijano, E.R., Berasain, C., Rodriguez, J.A., Corrales, F.J., Arias, R., Martin-Duce, A., Caballeria, J., Mato, J.M. and Avila, M.A. (2001) Hyperhomocysteinemia in Liver Cirrhosis: Mechanisms and Role in Vascular and Hepatic Fibrosis. Hypertension, 38, 1217-1221.
http://dx.doi.org/10.1161/hy1101.099499
[4] Araki, A., Ito, H., Majima, Y., Hosoi, T. and Orimo, H. (2003) Association between Plasma Homocysteine Concentrations and Asymptomatic Cerebral Infarction or Leukoaraiosis in Elderly Diabetic Patients. Geriatrics & Gerontology International, 3, 15-23.
http://dx.doi.org/10.1046/j.1444-1586.2003.00051.x
[5] Elanchezhian, R., Palsamy, P., Madson, C.J., Lynch, D.W. and Shinohara, T. (2012) Age-Related Cataracts: Homocysteine Coupled Endoplasmic Reticulum Stress and Suppression of Nrf2-Dependent Antioxidant Protection. Chemico-Biological Interactions, 200, 1-10.
http://dx.doi.org/10.1016/j.cbi.2012.08.017
[6] Mudd, S.H., Finkelstein, J.D., Refsum, H., Ueland, P.M., Malinow, M.R., Lentz, S.R., Jacobsen, D.W., Brattstrom, L., Wilcken, B., Wilcken, D.E., Blom, H.J., Stabler, S.P., Allen, R.H., Selhub, J. and Rosenberg, I.H. (2000) Homocysteine and Its Disulfide Derivatives: A Suggested Consensus Terminology. Arteriosclerosis Thrombosis and Vascular Biology, 20, 1704-1706.
http://dx.doi.org/10.1161/01.ATV.20.7.1704
[7] Finkelstein, J.D. (1990) Methionine Metabolism in Mammals. The Journal of Nutritional Biochemistry, 1, 228-237.
http://dx.doi.org/10.1016/0955-2863(90)90070-2
[8] Stipanuk, M.H. (2004) Sulfur Amino Acid Metabolism: Pathways for Production and Removal of Homocysteine and Cysteine. Annual Review of Nutrition, 24, 539-577.
http://dx.doi.org/10.1146/annurev.nutr.24.012003.132418
[9] Chou, J.Y. (2000) Molecular Genetics of Hepatic Methionine Adenosyltransferase Deficiency. Pharmacology & Therapeutics, 85, 1-9.
http://dx.doi.org/10.1016/s0163-7258(99)00047-9
[10] De La Haba, G. and Cantoni, G.L. (1959) The Enzymatic Synthesis of S-Adenosyl-L-Homocysteine from Adenosine and Homocysteine. The Journal of Biological Chemistry, 234, 603-608.
http://www.jbc.org/content/234/3/603.short

…. more

 

 

Read Full Post »

Clinical Laboratory Challenges

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

CLINICAL LABORATORY NEWS   

The Lab and CJD: Safe Handling of Infectious Prion Proteins

Body fluids from individuals with possible Creutzfeldt-Jakob disease (CJD) present distinctive safety challenges for clinical laboratories. Sporadic, iatrogenic, and familial CJD (known collectively as classic CJD), along with variant CJD, kuru, Gerstmann-Sträussler-Scheinker, and fatal familial insomnia, are prion diseases, also known as transmissible spongiform encephalopathies. Prion diseases affect the central nervous system and, from the onset of symptoms, follow a typically rapid, progressive neurological decline. While prion diseases are rare, it is not uncommon for the most prevalent form—sporadic CJD—to be included in the differential diagnosis of individuals presenting with rapid cognitive decline. Thus, laboratories may deal with a significant number of possible CJD cases, and should have protocols in place to process specimens, even if a confirmatory diagnosis of CJD is made in only a fraction of these cases.

The Lab’s Role in Diagnosis

Laboratory protocols for handling specimens from individuals with possible, probable, and definitive cases of CJD are important to ensure timely and appropriate patient management. When the differential includes CJD, an attempt should be made to rule in or rule out other causes of rapid neurological decline. Laboratories should be prepared to process blood and cerebrospinal fluid (CSF) specimens in such cases for routine analyses.

Definitive diagnosis requires identification of prion aggregates in brain tissue, which can be achieved by immunohistochemistry, a Western blot for proteinase K-resistant prions, and/or by the presence of prion fibrils. Thus, confirmatory diagnosis is typically achieved at autopsy. A probable diagnosis of CJD is supported by elevated concentration of 14-3-3 protein in CSF (a non-specific marker of neurodegeneration), EEG, and MRI findings. Thus, the laboratory may be required to process and send CSF samples to a prion surveillance center for 14-3-3 testing, as well as blood samples for sequencing of the PRNP gene (in inherited cases).

Processing Biofluids

Laboratories should follow standard protective measures when working with biofluids potentially containing abnormally folded prions, such as donning standard personal protective equipment (PPE); avoiding or minimizing the use of sharps; using single-use disposable items; and processing specimens to minimize formation of aerosols and droplets. An additional safety consideration is the use of single-use disposable PPE; otherwise, re-usable items must be either cleaned using prion-specific decontamination methods or destroyed.

Blood. In experimental models, infectivity has been detected in blood; however, there have been no cases of secondary transmission of classic CJD via blood product transfusions in humans. As such, blood has been classified by the World Health Organization (WHO), on epidemiological evidence, as containing "no detectable infectivity," which means it can be processed by routine methods. Similarly, all other body fluids except CSF contain no detectable infectivity and can be processed following standard procedures.

In contrast to classic CJD, there have been four cases of suspected secondary transmission of variant CJD via transfused blood products in the United Kingdom. Variant CJD, the prion disease associated with mad cow disease, is unique in its distribution of prion aggregates outside of the central nervous system, including the lymph nodes, spleen, and tonsils. For regions where variant CJD is a concern, laboratories should consult their regulatory agencies for further guidance.

CSF. Relative to highly infectious tissues of the brain, spinal cord, and eye, infectivity has been identified less often in CSF and is considered to have “low infectivity,” along with kidney, liver, and lung tissue. Since CSF can contain infectious material, WHO has recommended that analyses not be performed on automated equipment due to challenges associated with decontamination. Laboratories should perform a risk assessment of their CSF processes, and, if deemed necessary, consider using manual methods as an alternative to automated systems.

Decontamination

The infectious agent in prion disease is unlike any other infectious pathogen encountered in the laboratory; it is formed of misfolded and aggregated prion proteins. This aggregated proteinaceous material forms the infectious unit, which is incredibly resilient to degradation. Moreover, in vitro studies have demonstrated that disrupting large aggregates into smaller aggregates increases cytotoxicity. Thus, if the aim is to abolish infectivity, all aggregates must be destroyed. Disinfection procedures used for viral, bacterial, and fungal pathogens (alcohol, boiling, formalin, dry heat (<300°C), autoclaving at 121°C for 15 minutes, and ionizing, ultraviolet, or microwave radiation) are either ineffective or only variably effective against aggregated prions.

The only means to ensure no risk of residual infectious prions is to use disposable materials. This is not always practical, as, for instance, a biosafety cabinet cannot be discarded if there is a CSF spill in the hood. Fortunately, there are several protocols considered sufficient for decontamination. For surfaces and heat-sensitive instruments, such as a biosafety cabinet, WHO recommends flooding the surface with 2N NaOH or undiluted NaClO, letting stand for 1 hour, mopping up, and rinsing with water. If the surface cannot tolerate NaOH or NaClO, thorough cleaning will remove most infectivity by dilution. Laboratories may derive some additional benefit by using one of the partially effective methods discussed previously. Non-disposable heat-resistant items preferably should be immersed in 1N NaOH, heated in a gravity displacement autoclave at 121°C for 30 min, cleaned and rinsed in water, then sterilized by routine methods. WHO has outlined several alternate decontamination methods. Using disposable cover sheets is one simple solution to avoid contaminating work surfaces and associated lengthy decontamination procedures.

With standard PPE—augmented by a few additional safety measures and prion-specific decontamination procedures—laboratories can safely manage biofluid testing in cases of prion disease.

 

The Microscopic World Inside Us  

Emerging Research Points to Microbiome’s Role in Health and Disease

Thousands of species of microbes—bacteria, viruses, fungi, and protozoa—inhabit every internal and external surface of the human body. Collectively, these microbes, known as the microbiome, outnumber the body’s human cells by about 10 to 1 and include more than 1,000 species of microorganisms and several million genes residing in the skin, respiratory system, urogenital, and gastrointestinal tracts. The microbiome’s complicated relationship with its human host is increasingly considered so crucial to health that researchers sometimes call it “the forgotten organ.”

Disturbances to the microbiome can arise from nutritional deficiencies, antibiotic use, and antiseptic modern life. Imbalances in the microbiome’s diverse microbial communities, which interact constantly with cells in the human body, may contribute to chronic health conditions, including diabetes, asthma and allergies, obesity and the metabolic syndrome, digestive disorders including irritable bowel syndrome (IBS), and autoimmune disorders like multiple sclerosis and rheumatoid arthritis, research shows.

While study of the microbiome is a growing research enterprise that has attracted enthusiastic media attention and venture capital, its findings are largely preliminary. But some laboratorians are already developing a greater appreciation for the microbiome’s contributions to human biochemistry and are considering a future in which they expect to measure changes in the microbiome to monitor disease and inform clinical practice.

Pivot Toward the Microbiome

Following the National Institutes of Health (NIH) Human Genome Project, many scientists noted the considerable genetic signal from microbes in the body and the existence of technology to analyze these microorganisms. That realization led NIH to establish the Human Microbiome Project in 2007, said Lita Proctor, PhD, its program director. In the project’s first phase, researchers studied healthy adults to produce a reference set of microbiomes and a resource of metagenomic sequences of bacteria in the airways, skin, oral cavities, and the gastrointestinal and vaginal tracts, plus a catalog of microbial genome sequences of reference strains. Researchers also evaluated specific diseases associated with disturbances in the microbiome, including gastrointestinal diseases such as Crohn’s disease, ulcerative colitis, IBS, and obesity, as well as urogenital conditions, those that involve the reproductive system, and skin diseases like eczema, psoriasis, and acne.

Phase 1 studies determined the composition of many parts of the microbiome, but did not define how that composition affects health or specific disease. The project’s second phase aims to “answer the question of what microbes actually do,” explained Proctor. Researchers are now examining properties of the microbiome including gene expression, protein, and human and microbial metabolite profiles in studies of pregnant women at risk for preterm birth, the gut hormones of patients at risk for IBS, and nasal microbiomes of patients at risk for type 2 diabetes.

Promising Lines of Research

Cystic fibrosis and microbiology investigator Michael Surette, PhD, sees promising microbiome research not just in terms of evidence of its effects on specific diseases, but also in what drives changes in the microbiome. Surette is Canada research chair in interdisciplinary microbiome research in the Farncombe Family Digestive Health Research Institute at McMaster University in Hamilton, Ontario.

One type of study on factors driving microbiome change examines how alterations in composition and imbalances in individual patients relate to improving or worsening disease. “IBS, cystic fibrosis, and chronic obstructive pulmonary disease all have periods of instability or exacerbation,” he noted. Surette hopes that one day, tests will provide clinicians the ability to monitor changes in microbial composition over time and even predict when a patient’s condition is about to deteriorate. Monitoring perturbations to the gut microbiome might also help minimize collateral damage to the microbiome during aggressive antibiotic therapy for hospitalized patients, he added.

Monitoring changes to the microbiome also might be helpful for "culture negative" patients, who now may receive multiple, unsuccessful courses of different antibiotics that drive antibiotic resistance. Frustration with standard clinical microbiology diagnosis of lung infections in cystic fibrosis patients first sparked Surette's investigations into the microbiome. He hopes that future tests involving the microbiome might also help asthma patients with neutrophilia, community-acquired pneumonia patients who harbor complex microbial lung communities lacking obvious pathogens, and hospitalized patients with pneumonia or sepsis. He envisions microbiome testing that would look for short-term changes indicating whether or not a drug is effective.

Companion Diagnostics

Daniel Peterson, MD, PhD, an assistant professor of pathology at Johns Hopkins University School of Medicine in Baltimore, believes the future of clinical testing involving the microbiome lies in companion diagnostics for novel treatments, and points to companies that are already developing and marketing tests that will require such assays.

Examples of microbiome-focused enterprises abound, including Genetic Analysis, based in Oslo, Norway, with its high-throughput test that uses 54 probes targeted to specific bacteria to measure intestinal gut flora imbalances in inflammatory bowel disease and irritable bowel syndrome patients. Paris, France-based Enterome is developing both novel drugs and companion diagnostics for microbiome-related diseases such as IBS and some metabolic diseases. Second Genome, based in South San Francisco, has developed an experimental drug, SGM-1019, that the company says blocks damaging activity of the microbiome in the intestine. Cambridge, Massachusetts-based Seres Therapeutics has received Food and Drug Administration orphan drug designation for SER-109, an oral therapeutic intended to correct microbial imbalances to prevent recurrent Clostridium difficile infection in adults.

One promising clinical use of the microbiome is fecal transplantation, which both prospective and retrospective studies have shown to be effective in patients with C. difficile infections who do not respond to front-line therapies, said James Versalovic, MD, PhD, director of Texas Children’s Hospital Microbiome Center and professor of pathology at Baylor College of Medicine in Houston. “Fecal transplants and other microbiome replacement strategies can radically change the composition of the microbiome in hours to days,” he explained.

But NIH’s Proctor discourages too much enthusiasm about fecal transplant. “Natural products like stool can have [side] effects,” she pointed out. “The [microbiome research] field needs to mature and we need to verify outcomes before anything becomes routine.”

Hurdles for Lab Testing

While he is hopeful that labs someday will use the microbiome to produce clinically useful information, Surette pointed to several problems that must be solved beforehand. First, the molecular methods in common use need to become more quantitative and accurate. Additionally, research on the microbiome encompasses a wide variety of protocols, some of which are better at extracting particular types of bacteria and therefore can give biased views of the communities living in the body. Also, tests may need to distinguish between dead and live microbes. Another hurdle is that labs using varied bioinformatic methods may produce different results from the same sample, a problem that Surette sees as ripe for a solution from clinical laboratorians, who have expertise in standardizing robust protocols and in automating tests.

One way laboratorians can prepare for future, routine microbiome testing is to expand their notion of clinical chemistry to include both microbial and human biochemistry. “The line between microbiome science and clinical science is blurring,” said Versalovic. “When developing future assays to detect biochemical changes in disease states, we must consider the contributions of microbial metabolites and proteins and how to tailor tests to detect them.” In the future, clinical labs may test for uniquely microbial metabolites in various disease states, he predicted.

 

Automated Review of Mass Spectrometry Results  

Can We Achieve Autoverification?

Author: Katherine Alexander and Andrea R. Terrell, PhD  // Date: NOV.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/november/automated-review-of-mass-spectrometry-results-can-we-achieve-autoverification

 

Paralleling the upswing in prescription drug misuse, clinical laboratories are receiving more requests for mass spectrometry (MS) testing as physicians rely on its specificity to monitor patient compliance with prescription regimens. However, as volume has increased, reimbursement has declined, forcing toxicology laboratories both to increase capacity and to lower their operational costs—without sacrificing quality or turnaround time. Now, new solutions are available that enable laboratories to bring automation to MS testing and help them meet the growing demand for toxicology and other testing.

What is the typical MS workflow?

A typical workflow includes a long list of manual steps. By the time a sample is loaded onto the mass spectrometer, it has been collected, logged into the lab information management system (LIMS), and prepared for analysis using a variety of wet chemistry techniques.

Most commercial clinical laboratories receive enough samples for MS analysis to batch analyze those samples. A batch consists of calibrators, quality control (QC) samples, and patient/donor samples. Historically, the method would be selected (e.g., "analysis of opiates"), sample identification information would be entered manually into the MS software, and the instrument would begin analyzing each sample. Upon successful completion of the batch, the MS operator would view all of the analytical data, ensure the QC results were acceptable, and review each patient/donor specimen, looking at characteristics such as peak shape, ion ratios, retention time, and calculated concentration.

The operator would then post acceptable results into the LIMS manually or through an interface, and unacceptable results would be rescheduled or dealt with according to lab-specific protocols. In our laboratory we perform a final certification step for quality assurance by reviewing all information about the batch again, prior to releasing results for final reporting through the LIMS.
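The per-sample review described above (retention time, ion ratios, and so on) is essentially rule evaluation, which is why it lends itself to automation. The sketch below is a minimal, hypothetical illustration of such checks; the field names, tolerances, and data are invented for this example and are not taken from any particular LIMS or MS vendor.

```python
# Hypothetical per-sample review rules, expressed as code.
# All thresholds and field names are invented for illustration.
RT_TOLERANCE = 0.05         # max deviation (min) from expected retention time
ION_RATIO_TOLERANCE = 0.20  # max fractional deviation from expected ion ratio

def sample_acceptable(sample, expected_rt, expected_ion_ratio):
    """Return True if a sample's peak passes the basic review checks."""
    rt_ok = abs(sample["rt"] - expected_rt) <= RT_TOLERANCE
    ratio_ok = (
        abs(sample["ion_ratio"] - expected_ion_ratio) / expected_ion_ratio
        <= ION_RATIO_TOLERANCE
    )
    return rt_ok and ratio_ok

# A toy batch: one QC sample and one patient sample with an out-of-range
# ion ratio that a reviewer (or the software) would flag for follow-up.
batch = [
    {"id": "QC1", "rt": 2.31, "ion_ratio": 1.02},
    {"id": "P001", "rt": 2.30, "ion_ratio": 0.70},
]
results = {s["id"]: sample_acceptable(s, 2.30, 1.00) for s in batch}
```

In a real system these rules would be far more extensive (peak shape metrics, calibration checks, carryover rules), but the structure is the same: acceptable results flow through, and only flagged samples reach a human.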

What problems are associated with this workflow?

The workflow described above results in too many highly trained chemists performing manual data entry and reviewing perfectly acceptable analytical results. Lab managers would prefer that MS operators and certifying scientists focus on troubleshooting problem samples rather than reviewing mounds of good data. Not only is the current process inefficient, it is mundane work prone to user errors. This risks fatigue, disengagement, and complacency by our highly skilled scientists.

Importantly, manual processes also take time. In most clinical lab environments, turnaround time is critical for patient care and industry competitiveness. Lab directors and managers are looking for solutions to automate mundane, error-prone tasks to save time and costs, reduce staff burnout, and maintain high levels of quality.

How can software automate data transfer from MS systems to LIMS?

Automation is not a new concept in the clinical lab. Labs have automated processes in shipping and receiving, sample preparation, liquid handling, and data delivery to the end user. As more labs implement MS, companies have begun to develop special software to automate data analysis and review workflows.

In July 2011, AIT Labs incorporated ASCENT into our workflow, eliminating the initial manual peak review step. ASCENT is an algorithm-based peak picking and data review system designed specifically for chromatographic data. The software employs robust statistical and modeling approaches to the raw instrument data to present the true signal, which often can be obscured by noise or matrix components.

The system also uses an exponentially modified Gaussian (EMG) equation to apply a best-fit model to integrated peaks through what is often a noisy signal. In our experience, applying the EMG yields cleaner data from what might appear to be poor chromatography, which ultimately allows us to reduce the number of samples we might otherwise rerun.
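For readers unfamiliar with the model: the EMG is a Gaussian peak convolved with an exponential decay, the standard description of a tailing chromatographic peak. The sketch below shows the generic function with invented parameter values; it is an illustration of the EMG itself, not of ASCENT's proprietary implementation.

```python
import math

def emg(t, mu, sigma, tau, area=1.0):
    """Exponentially modified Gaussian: a Gaussian (center mu, width sigma)
    convolved with an exponential decay (time constant tau), the classic
    model for a tailing chromatographic peak."""
    z = (sigma / tau + (mu - t) / sigma) / math.sqrt(2.0)
    return (area / (2.0 * tau)
            * math.exp(sigma ** 2 / (2.0 * tau ** 2) + (mu - t) / tau)
            * math.erfc(z))

# Tailing behavior: the signal decays more slowly after the apex than it
# rose before it, so the point 0.3 min after the peak center sits higher
# than the point 0.3 min before it. Parameter values are illustrative.
leading = emg(4.7, mu=5.0, sigma=0.1, tau=0.3)   # before the apex
trailing = emg(5.3, mu=5.0, sigma=0.1, tau=0.3)  # after the apex
```

A fitting routine would adjust mu, sigma, tau, and area to minimize the residual against the measured chromatogram, which is what lets a best-fit model recover a clean peak from a noisy trace.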

How do you validate the quality of results?

We’ve developed a robust validation protocol to ensure that results are, at minimum, equivalent to results from our manual review. We begin by building the assay in ASCENT, entering assay-specific information from our internal standard operating procedure (SOP). Once the assay is configured, validation proceeds with parallel batch processing to compare results between software-reviewed data and staff-reviewed data. For new implementations we run eight to nine batches of 30–40 samples each; when we are modifying or upgrading an existing implementation we run a smaller number of batches. The parallel batches should contain multiple positive and negative results for all analytes in the method, preferably spanning the analytical measurement range of the assay.

The next step is to compare the results and calculate the percent difference between the data review methods. We require that two-thirds of the automated results fall within 20% of the manually reviewed result. In addition to validating patient sample correlation, we also test numerous quality assurance rules that should initiate a flag for further review.
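The acceptance criterion described above is straightforward to express in code. In this sketch, only the 20% limit and the two-thirds rule come from the text; the paired result values are invented for illustration.

```python
# Sketch of the validation comparison: automated vs. manually reviewed
# concentrations, requiring at least two-thirds to agree within 20%.

def percent_difference(auto, manual):
    """Percent difference of the automated result relative to the
    manually reviewed result (assumed nonzero)."""
    return abs(auto - manual) / manual * 100.0

def validation_passes(paired_results, limit_pct=20.0, required_fraction=2/3):
    """paired_results: list of (automated, manual) result pairs."""
    within = sum(
        1 for auto, manual in paired_results
        if percent_difference(auto, manual) <= limit_pct
    )
    return within / len(paired_results) >= required_fraction

# Invented example: five of six pairs agree within 20%, so the
# comparison passes the two-thirds requirement.
pairs = [(102, 100), (95, 100), (130, 100), (48, 50), (210, 250), (75, 74)]
ok = validation_passes(pairs)
```

A real validation would apply this per analyte and per batch, alongside the quality assurance flag checks mentioned above.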

What are the biggest challenges during implementation and continual improvement initiatives?

On the technological side, our largest hurdle was loading the sequence files into ASCENT. We had created an in-house mechanism for our chemists to upload the 96-well plate map for their batch into the MS software. We had some difficulty transferring this information to ASCENT, but once we resolved this issue, the technical workflow proceeded fairly smoothly.

The greater challenge was changing our employees’ mindset from one of fear that automation would displace them, to a realization that learning this new technology would actually make them more valuable. Automating a non-mechanical process can be a difficult concept for hands-on scientists, so managers must be patient and help their employees understand that this kind of technology leverages the best attributes of software and people to create a powerful partnership.

We recommend that labs considering automated data analysis engage staff in the validation and implementation to spread the workload and the knowledge. As is true with most technology, it is best not to rely on just one or two super users. We also found it critical to add supervisor-level controls on data file manipulation, such as removing a sample that wasn't run from the sequence table. This can prevent inadvertent deletion of a file, which would require reinjection of the entire batch!

 

Understanding Fibroblast Growth Factor 23

Author: Damien Gruson, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/understanding-fibroblast-growth-factor-23

What is the relationship of FGF-23 to heart failure?

Heart failure (HF) is an increasingly common syndrome associated with high morbidity, elevated hospital readmission rates, and high mortality. Improving diagnosis, prognosis, and treatment of HF requires a better understanding of its different sub-phenotypes. As researchers gained a comprehensive understanding of neurohormonal activation—one of the hallmarks of HF—they discovered several biomarkers, including natriuretic peptides, which now are playing an important role in sub-phenotyping HF and in driving more personalized management of this chronic condition.

Like the natriuretic peptides, fibroblast growth factor 23 (FGF-23) could become important in risk-stratifying and managing HF patients. Produced by osteocytes, FGF-23 is a key regulator of phosphorus homeostasis. It binds to renal and parathyroid FGF-Klotho receptor heterodimers, resulting in phosphate excretion, decreased 1-α-hydroxylation of 25-hydroxyvitamin D, and decreased parathyroid hormone (PTH) secretion. The relationship to PTH is important because impaired homeostasis of cations and decreased glomerular filtration rate might contribute to the rise of FGF-23. The amino-terminal portion of FGF-23 (amino acids 1-24) serves as a signal peptide allowing secretion into the blood, and the carboxyl-terminal portion (amino acids 180-251) participates in its biological action.

How might FGF-23 improve HF risk assessment?

Studies have shown that FGF-23 is related to the risk of cardiovascular diseases and mortality. It was first demonstrated that FGF-23 levels were independently associated with left ventricular mass index and hypertrophy as well as mortality in patients with chronic kidney disease (CKD). FGF-23 also has been associated with left ventricular dysfunction and atrial fibrillation in coronary artery disease subjects, even in the absence of impaired renal function.

FGF-23 and FGF receptors are both expressed in the myocardium. It is possible that FGF-23 has direct effects on the heart and participates in the pathophysiology of cardiovascular diseases and HF. Experiments have shown that FGF-23 stimulates pathological hypertrophy in cultured rat cardiomyocytes by activating the calcineurin-NFAT pathway, and that intramyocardial or intravenous injection of FGF-23 in wild-type mice resulted in left ventricular hypertrophy. As such, FGF-23 appears to be a potential stimulus of myocardial hypertrophy, and increased levels may contribute to the worsening of heart failure and long-term cardiovascular death.

Researchers have documented that HF patients have elevated FGF-23 circulating levels. They have also found a significant correlation between plasma levels of FGF-23 and B-type natriuretic peptide, a biomarker related to ventricular stretch and cardiac hypertrophy, in patients with left ventricular hypertrophy. As such, measuring FGF-23 levels might be a useful tool to predict long-term adverse cardiovascular events in HF patients.

Interestingly, researchers have documented a significant relationship between FGF-23 and PTH in both CKD and HF patients. As PTH stimulates FGF-23 expression, it could be that in HF patients, increased PTH levels increase the bone expression of FGF-23, which enhances its effects on the heart.

 

The Past, Present, and Future of Western Blotting in the Clinical Laboratory

Author: Curtis Balmer, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/the-past-present-and-future-of-western-blotting-in-the-clinical-laboratory

Much of the discussion about Western blotting centers around its performance as a biological research tool. This isn’t surprising. Since its introduction in the late 1970s, the Western blot has been adopted by biology labs of virtually every stripe, and become one of the most widely used techniques in the research armamentarium. However, Western blotting has also been employed in clinical laboratories to aid in the diagnosis of various diseases and disorders—an equally important and valuable application. Yet there has been relatively little discussion of its use in this context, or of how advances in Western blotting might affect its future clinical use.

Highlighting the clinical value of Western blotting, Stanley Naides, MD, medical director of Immunology at Quest Diagnostics, observed that, "Western blotting has been a very powerful tool in the laboratory and for clinical diagnosis. It's one of many various methods that the laboratorian brings to aid the clinician in the diagnosis of disease, and the selection and monitoring of therapy." Indeed, Western blotting has been used at one time or another to aid in the diagnosis of infectious diseases including hepatitis C (HCV), HIV, Lyme disease, and syphilis, as well as autoimmune disorders such as paraneoplastic disease and myositis conditions.

However, Naides was quick to point out that the choice of assays to use clinically is based on their demonstrated sensitivity and performance, and that the search for something better is never-ending. “We’re constantly looking for methods that improve detection of our target [protein],” Naides said. “There have been a number of instances where we’ve moved away from Western blotting because another method proves to be more sensitive.” But this search can also lead back to Western blotting. “We’ve gone away from other methods because there’s been a Western blot that’s been developed that’s more sensitive and specific. There’s that constant movement between methods as new tests are developed.”

In recent years, this quest has been leading clinical laboratories away from Western blotting toward more sensitive and specific diagnostic assays, at least for some diseases. Using confirmatory diagnosis of HCV infection as an example, Sai Patibandla, PhD, director of the immunoassay group at Siemens Healthcare Diagnostics, explained that movement away from Western blotting for confirmatory diagnosis of HCV infection began with a technical modification called Recombinant Immunoblotting Assay (RIBA). RIBA streamlines the conventional Western blot protocol by spotting recombinant antigen onto strips which are used to screen patient samples for antibodies against HCV. This approach eliminates the need to separate proteins and transfer them onto a membrane.

The RIBA HCV assay was initially manufactured by Chiron Corporation (acquired by Novartis Vaccines and Diagnostics in 2006). It received Food and Drug Administration (FDA) approval in 1999 and was marketed as the Chiron RIBA HCV 3.0 Strip Immunoblot Assay. Patibandla explained that, at the time, the Chiron assay "…was the only FDA-approved confirmatory testing for HCV." In 2013 the assay was discontinued and withdrawn from the market due to reports that it was producing false-positive results.

Since then, clinical laboratories have continued to move away from Western blot-based assays for confirmation of HCV in favor of the more sensitive technique of nucleic acid testing (NAT). “The migration is toward NAT for confirmation of HCV [diagnosis]. We don’t use immunoblots anymore. We don’t even have a blot now to confirm HCV,” Patibandla said.

Confirming HIV infection has followed a similar path. Indeed, in 2014 the Centers for Disease Control and Prevention issued updated recommendations for HIV testing that, in part, replaced Western blotting with NAT. This change was in response to the recognition that the HIV-1 Western blot assay was producing false-negative or indeterminate results early in the course of HIV infection.

At this juncture it is difficult to predict if this trend away from Western blotting in clinical laboratories will continue. One thing that is certain, however, is that clinicians and laboratorians are infinitely pragmatic, and will eagerly replace current techniques with ones shown to be more sensitive, specific, and effective. This raises the question of whether any of the many efforts currently underway to improve Western blotting will produce an assay that exceeds the sensitivity of currently employed techniques such as NAT.

Some of the most exciting and groundbreaking work in this area is being done by Amy Herr, PhD, a professor of bioengineering at University of California, Berkeley. Herr’s group has taken on some of the most challenging limitations of Western blotting, and is developing techniques that could revolutionize the assay. For example, the Western blot is semi-quantitative at best. This weakness dramatically limits the types of answers it can provide about changes in protein concentrations under various conditions.

To make Western blotting more quantitative, Herr’s group is, among other things, identifying losses of protein sample mass during the assay protocol. About this, Herr explains that the conventional Western blot is an “open system” that involves lots of handling of assay materials, buffers, and reagents that makes it difficult to account for protein losses. Or, as Kevin Lowitz, a senior product manager at Thermo Fisher Scientific, described it, “Western blot is a [simple] technique, but a really laborious one, and there are just so many steps and so many opportunities to mess it up.”

Herr’s approach is to reduce the open aspects of Western blot. “We’ve been developing these more closed systems that allow us at each stage of the assay to account for [protein mass] losses. We can’t do this exactly for every target of interest, but it gives us a really good handle [on protein mass losses],” she said. One of the major mechanisms Herr’s lab is using to accomplish this is to secure proteins to the blot matrix with covalent bonding rather than with the much weaker hydrophobic interactions that typically keep the proteins in place on the membrane.

Herr’s group also has been developing microfluidic platforms that allow Western blotting to be done on single cells: “In our system we’re doing thousands of independent Westerns on single cells in four hours. And, hopefully, we’ll cut that down to one hour over the next couple years.”

Other exciting modifications that stand to dramatically increase the sensitivity, quantitation, and through-put of Western blotting also are being developed and explored. For example, the use of capillary electrophoresis—in which proteins are conveyed through a small electrolyte-filled tube and separated according to size and charge before being dropped onto a blotting membrane—dramatically reduces the amount of protein required for Western blot analysis, and thereby allows Westerns to be run on proteins from rare cells or for which quantities of sample are extremely limited.

Jillian Silva, PhD, an associate specialist at the University of California, San Francisco Helen Diller Family Comprehensive Cancer Center, explained that advances in detection are also extending the capabilities of Western blotting. “With the advent of fluorescence detection we have a way to quantitate Westerns, and it is now more quantitative than it’s ever been,” said Silva.

Whether or not these advances produce an assay that is adopted by clinical laboratories remains to be seen. The emphasis on Western blotting as a research rather than a clinical tool may bias advances in favor of the needs and priorities of researchers rather than clinicians, and as Patibandla pointed out, “In the research world Western blotting has a certain purpose. [Researchers] are always coming up with new things, and are trying to nail down new proteins, so you cannot take Western blotting away.” In contrast, she suggested that for now, clinical uses of Western blotting remain “limited.”

 

Adapting Next Generation Technologies to Clinical Molecular Oncology Service

Author: Ronald Carter, PhD, DVM  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/adapting-next-generation-technologies-to-clinical-molecular-oncology-service

Next generation technologies (NGT) deliver huge improvements in cost efficiency, accuracy, robustness, and the amount of information they provide. Microarrays, high-throughput sequencing platforms, digital droplet PCR, and other technologies each offer unique combinations of desirable performance.

As stronger evidence of genetic testing’s clinical utility influences patterns of patient care, demand for NGT testing is increasing. This presents several challenges to clinical laboratories, including increased urgency, clinical importance, and breadth of application in molecular oncology, as well as greater integration of genetic tests into synoptic reporting. Laboratories need to add NGT-based protocols while still providing established tests, and the pace of change is accelerating. What follows is one viewpoint on the major challenges in adopting NGTs into a diagnostic molecular oncology service.

Choosing a Platform

Instrument selection is a critical decision that has to align with intended test applications, sequencing chemistries, and analytical software. Although multiple platforms are available, a mainstream standard has not emerged. Depending on their goals, laboratories might set up NGTs for improved accuracy of mutation detection, massively higher sequencing capacity per test, massively more targets combined in one test (multiplexing), greater range in sequencing read length, much lower cost per base pair assessed, and economy of specimen volume.

When high-throughput instruments first made their appearance, laboratories paid more attention to the accuracy of base-reading: less accurate sequencing meant more data cleaning and resequencing (1). Now, new instrument designs have narrowed the differences, and test chemistry can have a comparatively large impact on analytical accuracy (Figure 1). The robustness of technical performance can also vary significantly depending upon specimen type. For example, Life Technologies’ sequencing platforms appear to be comparatively more tolerant of low DNA quality and concentration, which is an important consideration for fixed and processed tissues.
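Base-calling accuracy is conventionally reported on the Phred scale, where a quality score Q corresponds to an error probability of 10^(-Q/10). As a generic illustration (not tied to any particular platform discussed here), the conversion can be sketched in a few lines of Python:

```python
import math

def phred_to_error(q):
    """Probability that a base call with Phred quality q is wrong."""
    return 10 ** (-q / 10)

def error_to_phred(p):
    """Phred quality score corresponding to error probability p."""
    return -10 * math.log10(p)

# A Q30 base call has a 1-in-1,000 chance of being wrong (99.9% accuracy);
# Q20 corresponds to 1-in-100 (99% accuracy).
print(phred_to_error(30))  # 0.001
print(phred_to_error(20))  # 0.01
```

A run averaging Q30 thus miscalls roughly one base per thousand, which is why lower-quality output translates directly into more data cleaning and resequencing.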

https://www.aacc.org/~/media/images/cln/articles/2015/october/carter_fig1_cln_oct15_ed.jpg

Figure 1 Comparison of Sequencing Chemistries

Sequence pile-ups of the same target sequence (two large genes), all performed on the same analytical instrument, showing results from four different chemistries as designed and supplied by the reagent manufacturers prior to optimization in the laboratory. Red lines represent exon limits; the height of the blue columns is proportional to depth of coverage. In this case, the intent of the test design was to provide high depth of coverage so that reflex Sanger sequencing would not be necessary. Courtesy B. Sadikovic, University of Western Ontario.

 

In addition, batching, robotics, workload volume patterns, maintenance contracts, software licenses, and platform lifetime affect the cost per analyte and per specimen considerably. Royalties and reagent contracts also factor into the cost of operating NGT: In some applications, fees for intellectual property can represent more than 50% of the bench cost of performing a given test, and increase substantially without warning.

Laboratories must also deal with the problem of obsolescence. Investing in a new platform brings the angst of knowing that better machines and chemistries are just around the corner. Laboratories are buying bigger pieces of equipment with shorter service lives. Before NGTs, major instruments could confidently be expected to remain current for at least 6 to 8 years. Now, a major instrument is obsolete much sooner, often within 2 to 3 years. This means that keeping it in service might cost more than investing in a new platform. Lease-purchase arrangements help mitigate year-to-year fluctuations in capital equipment costs, and maximize the value of old equipment at resale.

One Size Still Does Not Fit All

Laboratories face numerous technical considerations to optimize sequencing protocols, but the test has to be matched to the performance criteria needed for the clinical indication (2). For example, measuring response to treatment depends first upon the diagnostic recognition of mutation(s) in the tumor clone; the marker(s) then have to be quantifiable and indicative of tumor volume throughout the course of disease (Table 1).

As a result, diagnostic tests need to cover many different potential mutations, yet accurately identify any clinically relevant mutations actually present. On the other hand, tests for residual disease need to provide standardized, sensitive, and accurate quantification of a selected marker mutation against the normal background. A diagnostic panel might need 1% to 3% sensitivity across many different mutations. But quantifying early response to induction—and later assessment of minimal residual disease—needs a test that is reliably accurate in the 10^-4 to 10^-5 range for a specific analyte.
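The depth of coverage implied by a given sensitivity target can be estimated with a simple binomial model. The sketch below is an illustrative simplification (it assumes error-free reads and unbiased sampling, and is not the article’s methodology): it asks how deep one must sequence to observe a variant at a given allele fraction in at least five supporting reads, 95% of the time.

```python
from math import comb

def detect_prob(depth, vaf, min_reads=5):
    """P(at least min_reads variant reads) at a given read depth and
    variant allele fraction (VAF), assuming binomial sampling."""
    p_below = sum(comb(depth, k) * vaf**k * (1 - vaf) ** (depth - k)
                  for k in range(min_reads))
    return 1 - p_below

def min_depth(vaf, min_reads=5, target=0.95):
    """Smallest depth (searched in steps of 10) meeting the detection target."""
    depth = min_reads
    while detect_prob(depth, vaf, min_reads) < target:
        depth += 10
    return depth

# Under these assumptions, a 1% variant needs on the order of 900x coverage,
# while a 10% variant is reliably seen at well under 100 reads.
print(min_depth(0.01))
print(min_depth(0.10))
```

This is why a panel quoted at 1%–3% sensitivity implies several-hundred-fold coverage per target, while quantification in the 10^-4 to 10^-5 range pushes the required depth, and the required input DNA, orders of magnitude higher.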

Covering all types of mutations in one diagnostic test is not yet possible. For example, subtyping of acute myeloid leukemia is both old school (karyotype, fluorescent in situ hybridization, and/or PCR-based or array-based testing for fusion rearrangements, deletions, and segmental gains) and new school (NGT-based panel testing for molecular mutations).

Chemistries that cover both structural variants and copy number variants are not yet in general use, but the advantages of NGTs compared to traditional methods are becoming clearer, such as in colorectal cancer (3). Researchers are also using cell-free DNA (cfDNA) to quantify residual disease and detect resistance mutations (4). Once a clinically significant clone is identified, enrichment techniques help enable extremely sensitive quantification of residual disease (5).

Validation and Quality Assurance

Beyond choosing a platform, two distinct challenges arise in bringing NGTs into the lab. The first is assembling the resources for validation and quality assurance. The second is keeping tests up-to-date as new analytes are needed. Even if a given test chemistry has the flexibility to add analytes without revalidating the entire panel, keeping up with clinical advances is a constant priority.

Due to their throughput and multiplexing capacities, NGT platforms typically require considerable upfront investment to adopt, and training staff to perform testing takes even more time. Proper validation is harder to document: Assembling positive controls, documenting test performance criteria, developing quality assurance protocols, and conducting proficiency testing are all demanding. Labs meet these challenges in different ways. Laboratory-developed tests (LDTs) allow self-determined choice in design, innovation, and control of the test protocol, but can be very expensive to set up.

Food and Drug Administration (FDA)-approved methods are attractive but not always an option. More FDA-approved methods will be marketed, but FDA approval itself brings other trade-offs. There is a cost premium compared to LDTs, and the test methodologies are locked down and not modifiable. This is particularly frustrating for NGTs, which have the specific attraction of extensive multiplexing capacity and accommodating new analytes.

IT and the Evolution of Molecular Oncology Reporting Standards

The options for information technology (IT) pipelines for NGTs are improving rapidly. At the same time, recent studies still show significant inconsistencies and lack of reproducibility when it comes to interpreting variants in array comparative genomic hybridization, panel testing, tumor expression profiling, and tumor genome sequencing. It can be difficult to duplicate published performances in clinical studies because of a lack of sufficient information about the protocol (chemistry) and software. Building bioinformatics capacity is a key requirement, yet skilled people are in short supply and the qualifications needed to work as a bioinformatician in a clinical service are not yet clearly defined.

Tumor biology brings another level of complexity. Bioinformatic analysis must distinguish tumor-specific variants from genomic variants. Sequencing of paired normal tissue is often performed as a control, but virtual normal controls may have intriguing advantages (6). One of the biggest challenges is to reproducibly interpret the clinical significance of interactions between different mutations, even with commonly known, well-defined mutations (7). For multiple analyte panels, such as predictive testing for breast cancer, only the performance of the whole panel in a population of patients can be compared; individual patients may be scored into different risk categories by different tests, all for the same test indication.

In large-scale sequencing of tumor genomes, which types of mutations are most informative for detecting, quantifying, and predicting the behavior of the tumor over time? The amount and complexity of mutation varies considerably across different tumor types, and while some mutations are more common, stable, and clinically informative than others, the utility of a given tumor marker varies in different clinical situations. And, for a given tumor, treatment effect and metastasis lead to retesting for changes in drug sensitivities.

These complexities mean that IT must be designed into the process from the beginning. Like robotics, IT represents a major ancillary decision. One approach many labs choose is licensed technologies with shared databases that are updated in real time. These are attractive, despite their cost and licensing fees. New tests that incorporate proprietary IT with NGT platforms link the genetic signatures of tumors to clinically significant considerations like tumor classification, recommended methodologies for monitoring response, predicted drug sensitivities, eligible clinical trials, and prognostic classifications. In-house development of such solutions will be difficult, so licensing platforms from commercial partners is more likely to be the norm.

The Commercial Value of Health Records and Test Data

The future of cancer management likely rests on large-scale databases that link hereditary and somatic tumor testing with clinical outcomes. Multiple centers have such large studies underway, and data extraction and analysis are providing increasingly refined interpretations of clinical significance.

Extracting health outcomes to correlate with molecular test results is commercially valuable, as the pharmaceutical, insurance, and healthcare sectors focus on companion diagnostics, precision medicine, and evidence-based health technology assessment. Laboratories that can develop tests based on large-scale integration of test results to clinical utility will have an advantage.

NGTs do offer opportunities for net reductions in the cost of healthcare. But the lag between availability of a test and peer-evaluated demonstration of clinical utility can be considerable. Technical developments arise faster than evidence of clinical utility. For example, immunohistochemistry, estrogen receptor/progesterone receptor status, HER2/neu, and histology are still the major pathological criteria for prognostic evaluation of breast cancer at diagnosis, even though multiple analyte tumor profiling has been described for more than 15 years. Healthcare systems need a more concerted assessment of clinical utility if they are to take advantage of the promises of NGTs in cancer care.

Disruptive Advances

Without a doubt, “disruptive” is an appropriate buzzword in molecular oncology, and new technical advances are about to change how, where, and for whom testing is performed.

• Predictive Testing

Besides cost per analyte, one of the drivers for taking up new technologies is that they enable multiplexing many more analytes with less biopsy material. Single-analyte sequential testing for epidermal growth factor receptor (EGFR), anaplastic lymphoma kinase, and other targets on small biopsies is not sustainable when many more analytes are needed, and even now, a significant proportion of test requests cannot be completed due to lack of suitable biopsy material. Large panels incorporating all the mutations needed to cover multiple tumor types are replacing individual tests in companion diagnostics.

• Cell-Free Tumor DNA

Challenges of cfDNA include standardizing the collection and processing methodologies, timing sampling to minimize the effect of therapeutic toxicity on analytical accuracy, and identifying the most informative sample (DNA, RNA, or protein). But for more and more tumor types, it will be possible to differentiate benign versus malignant lesions, perform molecular subtyping, predict response, monitor treatment, or screen for early detection—all without a surgical biopsy.

cfDNA technologies can also be integrated into core laboratory instrumentation. For example, blood-based EGFR analysis for lung cancer is being developed on the Roche cobas 4800 platform, which will be a significant change from the current standard of testing based upon single tests of DNA extracted from formalin-fixed, paraffin-embedded sections selected by a pathologist (8).

• Whole Genome and Whole Exome Sequencing

Whole genome and whole exome tumor sequencing approaches provide a wealth of biologically important information, and will replace individual or multiple gene test panels as the technical cost of sequencing declines and interpretive accuracy improves (9). Laboratories can apply informatics selectively or broadly to extract much more information at relatively little increase in cost, and the interpretation of individual analytes will be improved by the context of the whole sequence.

• Minimal Residual Disease Testing

Massive resequencing and enrichment techniques can be used to detect minimal residual disease, and will provide an alternative to flow cytometry as costs decline. The challenge is to develop robust analytical platforms that can reliably produce results in a high proportion of patients with a given tumor type, despite using post-treatment specimens with therapy-induced degradation, and a very low proportion of target (tumor) sequence to benign background sequence.

The tumor markers should remain informative for the burden of disease despite clonal evolution across the multiple samples taken during the clinical course and treatment. Quantification needs to be accurate and sensitive down to the 10^-5 range, and cost-competitive with flow cytometry.
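At MRD-level dilutions the binomial sampling model is well approximated by a Poisson model, and the limiting factor becomes the sheer number of genomes sampled. The back-of-the-envelope sketch below (an illustrative model, not a validated assay design) estimates the probability of catching a clone present at a 10^-5 fraction:

```python
from math import exp, factorial

def p_detect(depth, tumor_fraction, min_reads=3):
    """Poisson approximation of P(at least min_reads tumor-derived reads)
    when the tumor fraction is very small."""
    lam = depth * tumor_fraction  # expected number of tumor-derived reads
    p_below = sum(exp(-lam) * lam**k / factorial(k) for k in range(min_reads))
    return 1 - p_below

# Catching a 1-in-100,000 clone in >= 3 reads, 95% of the time,
# requires sampling on the order of 600,000 informative reads/genomes;
# tenfold fewer reads and the clone is almost always missed.
print(round(p_detect(630_000, 1e-5), 3))
print(round(p_detect(63_000, 1e-5), 3))
```

Since a nanogram of human genomic DNA contains only roughly 300 genome equivalents, input quantity rather than raw sequencing capacity is often the binding constraint at 10^-5 sensitivity, especially with degraded post-treatment specimens.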

• Point-of-Care Test Methodologies

Small, rapid, cheap, single-use point-of-care (POC) sequencing devices are coming. Some can multiplex, with analytical times as short as 20 minutes. Accurate and timely testing will be possible in places like pharmacies, oncology clinics, patient service centers, and outreach programs. Whether physicians will trust and act on POC results alone, or will require confirmation by traditional laboratory-based testing, remains to be seen. However, in the simplest type of application, such as a patient known to have a particular mutation, the advantages of POC-based testing to quantify residual tumor burden are clear.

Conclusion

Molecular oncology is moving rapidly from an esoteric niche of diagnostics to a mainstream, required component of integrated clinical laboratory services. While NGTs are markedly reducing the cost per analyte and per specimen, and will certainly broaden the scope and volume of testing performed, the resources required to choose, install, and validate these new technologies are daunting for smaller labs. More rapid obsolescence and increased regulatory scrutiny for LDTs also present significant challenges. Aligning test capacity with approved clinical indications will require careful and constant attention to ensure competitiveness.

References

1. Liu L, Li Y, Li S, et al. Comparison of next-generation sequencing systems. J Biomed Biotechnol 2012; doi:10.1155/2012/251364.

2. Brownstein CA, Beggs AH, Homer N, et al. An international effort towards developing standards for best practices in analysis, interpretation and reporting of clinical genome sequencing results in the CLARITY Challenge. Genome Biol 2014;15:R53.

3. Haley L, Tseng LH, Zheng G, et al. Performance characteristics of next-generation sequencing in clinical mutation detection of colorectal ­cancers. [Epub ahead of print] Modern Pathol July 31, 2015 as doi:10.1038/modpathol.2015.86.

4. Butler TM, Johnson-Camacho K, Peto M, et al. Exome sequencing of cell-free DNA from metastatic cancer patients identifies clinically actionable mutations distinct from primary ­disease. PLoS One 2015;10:e0136407.

5. Castellanos-Rizaldos E, Milbury CA, Guha M, et al. COLD-PCR enriches low-level variant DNA sequences and increases the sensitivity of genetic testing. Methods Mol Biol 2014;1102:623–39.

6. Hiltemann S, Jenster G, Trapman J, et al. Discriminating somatic and germline mutations in tumor DNA samples without matching normals. Genome Res 2015;25:1382–90.

7. Lammers PE, Lovly CM, Horn L. A patient with metastatic lung adenocarcinoma harboring concurrent EGFR L858R, EGFR germline T790M, and PIK3CA mutations: The challenge of interpreting results of comprehensive mutational testing in lung cancer. J Natl Compr Canc Netw 2015;12:6–11.

8. Weber B, Meldgaard P, Hager H, et al. Detection of EGFR mutations in plasma and biopsies from non-small cell lung cancer patients by allele-specific PCR assays. BMC Cancer 2014;14:294.

9. Vogelstein B, Papadopoulos N, Velculescu VE, et al. Cancer genome landscapes. Science 2013;339:1546–58.

10. Heitzer E, Auer M, Gasch C, et al. Complex tumor genomes inferred from single circulating tumor cells by array-CGH and next-generation sequencing. Cancer Res 2013;73:2965–75.

11. Healy B. BRCA genes — Bookmaking, fortunetelling, and medical care. N Engl J Med 1997;336:1448–9.


Read Full Post »

Pain Management

Larry H Bernstein, MD, FCAP, Curator

LPBI

 

Pain Management Health Center

http://www.webmd.com/pain-management/

 

Pain Management Overview

Pain management is important for ongoing pain control, especially if you suffer from long-term or chronic pain. After getting a pain assessment, your doctor can prescribe pain medicine, other pain treatments, or psychotherapy to help with pain relief.

Nearly any part of your body is vulnerable to pain. Acute pain warns us that something may be wrong. Chronic pain can rob us of our daily life, making it difficult and even unbearable. Many people with chronic pain can be helped by understanding the causes, symptoms, and treatments for pain – and how to cope with the frustrations.

You know your pain better than anyone — and as hard as it’s been to handle it, your experience holds the key to making a plan to treat it.

Each person and their pain are unique. The best way to manage your case could be very different from what works for someone else. Your treatment will depend upon things such as:

  • The cause
  • How intense it is
  • How long it’s lasted
  • What makes it worse or better

It can be a process to find your best plan. You can try a combination of things and then report back to your doctor about how your pain is doing. Together, you can tweak your program based on what’s working and what needs more help.

All Pain Is Not the Same

In order to make your pain management plan, your doctor will first consider whether you have sudden (“acute”) or long-term (“chronic”) pain.

Acute pain starts suddenly and usually feels sharp. Broken bones, burns, or cuts are classic examples. So is pain after surgery or giving birth.

Acute pain may be mild and last just a moment. Or it may be severe and last for weeks or months. In most cases, acute pain does not last longer than 6 months, and it stops when its underlying cause has been treated or has healed.

If the problem that causes short-term pain isn’t treated, it may lead to long-term, or “chronic” pain.

Chronic pain lasts longer than 3 months, often despite the fact that an injury has healed. It could even last for years. Some examples include:

  • Headache
  • Low back pain
  • Cancer pain
  • Arthritis pain
  • Pain caused by nerve damage

It can cause tense muscles, problems with moving, a lack of energy, and changes in appetite. It can also affect your emotions. Some people feel depressed, angry, or anxious about the pain and injury coming back.

Chronic pain doesn’t always have an obvious physical cause.

What Can I Do to Feel Better?

1. Keep moving. You might think it’s best to rest on the sidelines. But being active is a good idea. You’ll get stronger and move better.

The key is knowing what’s OK for you to do to get stronger and challenge your body, without doing too much, too soon.

Your doctor can let you know what changes to make. For instance, if you used to run and your joints can’t take that now because you have a chronic condition like osteoarthritis, you might be able to switch to something like biking or swimming.

2. Physical and occupational therapy. Take your recovery to the next level with these treatments. In PT, you’ll focus on the exact muscles you need to strengthen, stretch, and recover from injury. Your doctor may also recommend “occupational therapy,” which focuses on how to do specific tasks, like walking up and down stairs, opening a jar, or getting in and out of a car, with less pain.

3. Counseling. If pain gets you down, reach out. A counselor can help you get back to feeling like yourself again. You can say anything, set goals, and get support. Even a few sessions are a good idea. Look for a counselor who does “cognitive behavioral therapy,” in which you learn ways that your thinking can support you as you work toward solutions.

4. Massage therapy. It’s not a cure, but it can help you feel better temporarily and ease tension in your muscles. Ask your doctor or physical therapist to recommend a massage therapist. At your first appointment, tell them about the pain you have. And be sure to let them know if the massage feels too intense.

5. Relaxation. Meditation and deep breathing are two techniques to try. You could also picture a peaceful scene, do some gentle stretching, or listen to music you love. Another technique is to scan your body slowly in your mind, and consciously try to relax each part of your body, one by one, from head to toe. Any healthy activity that helps you unwind is good for you and can help you feel better prepared to manage your pain.

6. Consider complementary treatments such as acupuncture, biofeedback, and spinal manipulation. In acupuncture, a trained practitioner briefly inserts very thin needles in certain places on your skin to tap into your “chi,” which is an inner energy noted in traditional Chinese medicine. It doesn’t hurt.

Biofeedback trains you to control how your body responds to pain. During a session, you’ll wear electrodes hooked up to a machine that tracks your heart rate, breathing, and skin temperature, so you can see the results.

When you get spinal manipulation, a medical professional uses their hands or a device to adjust your spine so that you can move better and have less pain. Some MDs do this. So do chiropractors, osteopathic doctors (they have “DO” after their name instead of “MD”), and some physical therapists.

Are There Devices That Help?

Although there are no products that take pain away completely, there are some that you and your doctor could consider.

TENS and ultrasound. Transcutaneous electrical nerve stimulation, or TENS, uses a device to send an electric current to the skin over the area where you have pain. Ultrasound sends sound waves to the places you have pain. Both may offer relief by blocking the pain messages sent to your brain.

Spinal cord stimulation. An implanted device delivers low-voltage electricity to the spine to block pain. If your doctor thinks it’s an option, you would use it for a trial period before you get surgery to have it permanently implanted. In most cases, you can go home the same day as the procedure.

What About Medicine?

Your doctor will consider what’s causing your pain, how long you’ve had it, how intense it is, and what medications will help. They may recommend one or more of the following:

Options include over-the-counter pain relievers such as acetaminophen, aspirin, ibuprofen, or naproxen. Or you may need stronger medications that require a prescription, such as steroids, morphine, codeine, or anesthetics.

Some are pills or tablets. Others are shots. There are also sprays or lotions that go on your skin.

Other drugs, like muscle relaxers and some antidepressants, are also used for pain. Some people may need anesthetic drugs to block pain.

Will I Need Surgery?

It depends on why you’re in pain. If you’ve had a sudden injury or accident, you might need surgery right away.

But if you have chronic pain, you may or may not need an operation or another procedure, such as a nerve block (done with anesthetics or other types of prescription drugs to halt pain signals) or a spinal injection (such as a shot of cortisone or an anesthetic drug).

Talk with your doctor about what results you can expect and any side effects, so you can weigh the risks and the benefits. Also ask how many times the doctor has done the procedure they recommend and what their patients have said about how much relief they’ve gotten.

WebMD Medical Reference

Reviewed by Jennifer Robinson, MD on September 20, 2015

Read Full Post »

Amyloid-Targeting Immunotherapy

Curator: Larry H. Bernstein, MD, FCAP

Possible Reasons Found for Failure of Alzheimer’s Treatment

By Staff Editor

http://www.healthnewsdigest.com/news/Alzheimer_Issues_680/Possible-Reasons-Found-for-Failure-of-Alzheimer-s-Treatment.shtml

(HealthNewsDigest.com) – Agglutinated proteins in the brain, known as amyloid-β plaques, are a key characteristic of Alzheimer’s. One treatment option uses special antibodies to break down these plaques. This approach yielded good results in the animal model, but for reasons that are not yet clear, it has so far been unsuccessful in patient studies. Scientists at the Technical University of Munich (TUM) have now discovered one possible cause: they noticed that, in mice that received one antibody treatment, nerve cell disorders did not improve and were even exacerbated.

Immunotherapies with antibodies that target amyloid-β were long considered promising for treating Alzheimer’s. Experiments with animals showed that they reduced plaques and reversed memory loss. In clinical studies on patients, however, it has not yet been possible to confirm these results. A team of researchers working with Dr. Dr. Marc Aurel Busche, a scientist at the TUM hospital Klinikum rechts der Isar Klinik und Poliklinik für Psychiatrie und Psychotherapie and at the TUM Institute of Neuroscience, and Prof. Arthur Konnerth from the Institute of Neuroscience has now clarified one possible reason for this. The findings were published in Nature Neuroscience.

Immunotherapy Increases Number of Hyperactive Nerve Cells

The researchers used Alzheimer’s mouse models for their study. These animals carry a transgene for the amyloid-β precursor protein, which, as in humans, leads to the formation of amyloid-β plaques in the brain and causes memory disorders. The scientists treated the animals with immunotherapy antibodies and then analyzed nerve cell activity using high-resolution two-photon microscopy. They found that, while the plaques disappeared, the number of abnormally hyperactive neurons rose sharply.

“Hyperactive neurons can no longer perform their normal functions and, after some time, wear themselves out. They then fall silent and, later, possibly die off,” says Busche, explaining the significance of their discovery. “This could explain why patients who received the immunotherapy experienced no real improvement in their condition despite the decrease in plaques,” he adds.

Released Oligomers Potential Reason for Hyperactivity

Even in young Alzheimer’s mice, when no plaques were yet detectable in the brain, the antibody treatment led to increased development of hyperactive nerve cells. “Looking at these findings, even using the examined immunotherapies at an early stage, before the plaques appear, would offer little chance of success,” the scientist explains, since the treatment already exhibits these side effects at this stage, too.

“We suspect that the mechanism is as follows: The antibodies used in treatment release increasing numbers of soluble oligomers. These are precursors of the plaques and have been considered problematic for some time now. This could cause the increase in hyperactivity,” says Busche.

The work was funded by an Advanced ERC grant to Prof. Arthur Konnerth, the EU FP7 program (Project Corticonic) and the Deutsche Forschungsgemeinschaft (IRTG 1373 and SFB870). Marc Aurel Busche was supported by the Hans und Klementia Langmatz Stiftung.

Publication
Marc Aurel Busche, Christine Grienberger, Aylin D. Keskin, Beomjong Song, Ulf Neumann, Matthias Staufenbiel, Hans Förstl and Arthur Konnerth, Decreased amyloid-β and increased neuronal hyperactivity by immunotherapy in Alzheimer’s models, Nature Neuroscience, November 9, 2015.
DOI: 10.1038/nn.4163
http://www.nature.com/neuro/journal/vaop/ncurrent/full/nn.4163.html

Amyloid-Targeting Immunotherapy Disrupts Neuronal Function

Some antibodies designed to eliminate the plaques prominent in Alzheimer’s disease can aggravate neuronal hyperactivity in mice.

By Karen Zusi | November 9, 2015  http://www.the-scientist.com//?articles.view/articleNo/44435/title/Amyloid-Targeting-Immunotherapy-Disrupts-Neuronal-Function/

http://www.the-scientist.com/images/News/November2015/10_alzheimerbrain_b.jpg

Removing built-up plaques of amyloid-β in the brain is a long-sought therapy for patients with Alzheimer’s disease, but for a variety of reasons, few treatments have succeeded in alleviating symptoms once they reach clinical trials. In a study published today (November 9) in Nature Neuroscience, an international team examined the effects of two amyloid-β antibodies on neuronal activity in a mouse model, finding that the antibodies in fact led to an increase in neuronal dysfunction.

Decreased amyloid-β and increased neuronal hyperactivity by immunotherapy in Alzheimer’s models

Marc Aurel Busche, Christine Grienberger, Aylin D Keskin, Beomjong Song, Ulf Neumann, Matthias Staufenbiel, Hans Förstl & Arthur Konnerth
Nature Neuroscience (2015)
    http://dx.doi.org/10.1038/nn.4163

Among the most promising approaches for treating Alzheimer’s disease is immunotherapy with amyloid-β (Aβ)-targeting antibodies. Using in vivo two-photon imaging in mouse models, we found that two different antibodies to Aβ used for treatment were ineffective at repairing neuronal dysfunction and caused an increase in cortical hyperactivity. This unexpected finding provides a possible cellular explanation for the lack of cognitive improvement by immunotherapy in human studies.

Marc Busche, a psychiatrist at Technical University of Munich in Germany, and others had previously found that neuronal hyperactivity is common in mouse models of Alzheimer’s disease. The chronically rapid-firing neurons can interfere with normal brain function in mice. “There’s evidence from human fMRI [functional magnetic resonance imaging] studies that humans will show hyperactivation early in the disease, followed by hypoactivation later on,” Busche told The Scientist. “It’s an early stage of neuronal dysfunction that can later turn into neural silencing.”

To investigate whether certain antibodies would alleviate this Alzheimer’s disease-associated phenotype, Busche and his colleagues first turned to bapineuzumab—a human monoclonal antibody that initially showed promise in treating mice modeling Alzheimer’s disease, but failed in human clinical trials. The dominant hypothesis for bapineuzumab’s failure is that it was administered too late in the disease progression, said Busche. “But it’s still a hypothesis,” he added. “There’s no real explanation for why these antibodies failed.”

The team’s latest experiments used mice with a genetic mutation that caused them to overexpress the human amyloid-β protein; these engineered mice also displayed neuronal hyperactivity. The researchers injected 3D6, the mouse version of bapineuzumab, into the engineered mice, as well as into wild-type mice that had normal expression levels of the mouse amyloid-β protein. The team observed the effects using two-photon calcium imaging in a blinded study.

As expected, 3D6 decreased the amount of amyloid-β plaques in the engineered mice, while the control mice displayed no reaction to the injected antibodies. However, the mice engineered to overexpress human amyloid-β showed increased neuronal hyperactivity in response to the antibody, regardless of what stage of plaque development they were in. Even mice too young to have developed plaques showed aggravated hyperactive neurons. The team observed the same phenomenon when it tested a second antibody, β1, which went through early stages of drug development but was never used in human clinical trials.

The results surprised Busche. “When it turned out that the antibody group was worse than the control group, it was unbelievable. But we checked many times and there was no mistake,” he said. “We don’t see this effect in wild-type mice so it must be dependent on the interaction between the antibody and amyloid-β.”

Busche was quick to point out that the mouse model is not the same as a human Alzheimer’s patient. However, he said, “it gives a sense that we don’t understand the antibody’s action, and this might go on in the human brain as well.”

“I fully believe in their results, but I have some hesitation in saying that this result explains the failed clinical trials for amyloid-β immunotherapy,” said Cynthia Lemere, a neurologist and Alzheimer’s disease researcher at the Brigham and Women’s Hospital in Boston. “I think the major reason for clinical trials failing for immunotherapy is that up until now, they’ve been done in people with moderate-to-severe Alzheimer’s disease, and then mild-to-moderate. Now the studies are going further to include people with very early stages of clinical symptoms—and to my knowledge, they haven’t been stopped because patients are getting worse.”

Thomas Wisniewski, a cognitive neurologist at New York University, voiced a similar perspective. “I don’t think this is an explanation for why immunotherapy isn’t working—I think there are other more plausible reasons for that,” he said, citing clinical trials that treated patients during later stages of Alzheimer’s disease progression, as well as those that haven’t addressed tau-related pathologies, or didn’t target the key types of amyloid-β. “[The neuronal hyperactivity] is an interesting phenomenon to be studied,” he added, “but I think it’s a separate issue.”

M.A. Busche et al., “Decreased amyloid-β and increased neuronal hyperactivity by immunotherapy in Alzheimer’s models,” Nature Neuroscience, doi:10.1038/nn.4163, 2015.

Figure 2: Worsening of neuronal dysfunction by anti-Aβ antibodies can occur independently of the effects on Aβ pathology. (a) Top, representative in vivo activity maps in WT (left) as well as isotype-treated (middle) and β1-treated (right) Tg2576 mice. Bottom, Ca2+ transients of the neurons indicated above.

http://www.nature.com/neuro/journal/vaop/ncurrent/carousel/nn.4163-F2.jpg

Anti-Aβ treatment aggravates abnormal brain activity in a mouse model of Alzheimer’s disease

Nature Neuroscience   Nov 10, 2015

http://www.natureasia.com/en/research/highlight/10316

Therapies that reduce deposits of amyloid-β (Aβ) in the brain are ineffective at repairing neuronal impairment in mice and actually increase it, finds a study published online in Nature Neuroscience. Aβ aggregates into clumps in the brain that are a pathological hallmark of Alzheimer’s disease.

Expression of mutant human amyloid protein in animals results in deposits of Aβ plaques that induce abnormal increases in neuronal activity and impair the normal function of neuronal circuits.

Arthur Konnerth, Marc Busche and colleagues explored whether they could reverse these impairments by treating mice that overexpress the human mutant amyloid precursor protein with either of two different antibodies targeting Aβ (14 mice) or a control antibody (19 mice). They found that, although treatment with the Aβ targeting antibodies reduced the amount of plaques in the animals’ brains, it also increased the amount of hyperactive neurons.

This was true whether the treatment was given to older mice (14 treated, 19 control) or younger mice in which the accumulation of Aβ had yet to occur (10 treated, 13 control). The same therapies had no effect on neuronal activity in a group of normal mice (5 treated, 3 control), suggesting that the observed exacerbation in mutant mice is dependent on the presence of Aβ and cannot be explained by incidental effects of inflammation in response to the antibodies.

The authors note that, although other research has shown that anti-Aβ treatment can prevent the weakening of neuronal connections and memory impairments in animal models of Alzheimer’s disease, these benefits are not enough to repair neuronal dysfunction.

They suggest that their findings provide a cellular mechanism that may explain, in part, why treatments targeting Aβ in human clinical trials have failed to improve cognitive deficits. However, the authors point out that future studies are needed to determine whether the increase in abnormal neural activity seen in their animal models is related to the poor efficacy of Aβ therapy in patients.

 

ANAVEX™ 2-73

ANAVEX™ 2-73 is an orally available drug candidate developed to potentially modify Alzheimer’s disease rather than temporarily address its symptoms. It has a clean Phase 1 data profile and shows reversal of memory loss (anti-amnesic properties) and neuroprotection in several models of Alzheimer’s disease.

Successful Phase 1 Clinical Trial

A Phase 1 single ascending dose human clinical trial of ANAVEX 2-73 was successfully completed in healthy human volunteers. It was a randomized, placebo-controlled study. Healthy male volunteers aged 18 to 55 received single, ascending oral doses over the course of the trial. The trial objectives were to define the maximum tolerated dose, assess pharmacokinetics (PK), and evaluate clinical and laboratory safety.

Results:

  • Dosing from 1-60 mg.
  • Maximum tolerated dose 55-60 mg; above the equivalent dose shown to have positive effects in mouse models of Alzheimer’s disease.
  • Well tolerated below the 55-60 mg dose with only mild adverse events in some volunteers.
  • Observed adverse events at doses above the maximum tolerated single dose included headache and dizziness, which were moderate in severity and reversible. These side effects are often seen with drugs that target central nervous system (CNS) conditions, including Alzheimer’s disease.
  • No significant changes in blood safety measurements.
  • No changes in ECG.
  • Favorable PK profile.
    • Rapid absorption into blood.
    • Dose proportional kinetics.

The trial was conducted in Germany by ABX-CRO in collaboration with the Technical University of Dresden. ABX-CRO and the Technical University of Dresden are well regarded for their experience with clinical trials and CNS compounds.

 

ANAVEX 2-73,

Clinical-stage biopharmaceutical company Anavex Life Sciences Corp. is working on an investigational oral treatment for Alzheimer’s disease called ANAVEX 2-73, with full PART A data and preliminary PART B data from its ongoing Phase 2a clinical trial to be presented during the Clinical Trials on Alzheimer’s Disease (CTAD) conference, November 5 and 7 in Barcelona, Spain.

The trial’s Principal Investigator, Stephen Macfarlane, who also serves as director and associate professor of Aged Psychiatry at Caulfield Hospital in Melbourne, Australia, will represent the company and host a late-breaking oral session entitled “New Exploratory Alzheimer’s Drug ANAVEX 2-73: Assessment of Safety and Cognitive Performance in a Phase 2a Study in mild-to-moderate Alzheimer’s Patients.” During the presentation, which will take place Saturday, November 7, at 9:45 a.m. CET, at the Gran Hotel Princesa Sofia in Barcelona, Macfarlane will focus on the multicenter Phase 2a clinical trial of ANAVEX 2-73. The study comprises two separate phases and enrolls 32 mild-to-moderate Alzheimer’s patients. While PART A is a simple randomized, open-label, two-period, cross-over, adaptive trial of up to 36 days, PART B is an open-label extension trial for an additional 52 weeks.

The research intends to assess the maximum dose of treatment tolerated by patients, and to explore cognitive efficacy using Mini-Mental State Examination (MMSE) score, dose response, bioavailability, Cogstate and electroencephalographic (EEG) activity, including event-related potentials (EEG/ERP), as well as the performance of ANAVEX 2-73 as an add-on therapy to donepezil (Aricept).

ANAVEX 2-73 is Anavex’s lead investigational treatment for Alzheimer’s disease, in line with the company’s goal of finding effective therapies for Alzheimer’s disease, other central nervous system (CNS) diseases, pain, and various types of cancer. The novel drug targets sigma-1 and muscarinic receptors, which are thought to decrease protein misfolding, beta-amyloid and tau pathology, and inflammation through upstream actions.

Last November, the biopharmaceutical company presented encouraging results from their phase 1 clinical trial for Anavex 2-73, during the CNS Summit 2014 in Boca Raton, Florida. The phase 1 study demonstrated that the treatment is safe and well tolerated, suggesting a favorable pharmacokinetics profile. During the randomized, double-blind, placebo-controlled study no severe adverse events were registered, while the adverse events reported included moderate and reversible headache and dizziness, which are common symptoms associated with drugs that target central nervous system (CNS) conditions, such as Alzheimer’s.

New Exploratory Alzheimer’s Drug ANAVEX 2-73: Assessment of Safety and Cognitive Performance in a Phase 2a Study in mild-to-moderate Alzheimer’s Patients

Steve Macfarlane, MD (1), Paul Maruff, PhD (2), Marco Cecchi, PhD (3), Dennis Moore, PhD (3), Anastasios Zografidis, PhD (4), Christopher Missling, PhD (4)

(1) Caulfield Hospital, Melbourne, Australia; (2) Cogstate, Melbourne, Australia; (3) Neuronetrix, KY, USA; (4) Anavex Life Sciences, Corp., New York, NY, USA

Background: Despite major efforts aimed at finding a treatment for Alzheimer’s disease (AD), progress in developing compounds that can relieve cognitive deficits associated with the disease has been slow. ANAVEX 2-73 is a sigma-1 and muscarinic receptor agonist that in preclinical studies has shown memory-preserving and neuroprotective effects. In our ongoing phase 2a clinical study we are assessing ANAVEX 2-73 safety in subjects with mild-to-moderate AD, and measuring drug effects on MMSE, EEG and event-related potential (ERP) cognitive measures, and Cogstate test batteries to optimize dosing.

Methods: Thirty-two subjects who meet NINCDS-ADRDA criteria for probable AD are being recruited at up to seven clinical sites in Melbourne, Australia. Subjects are between 55 and 85 years of age, and have an MMSE score of 16 to 28. In PART A of the study, participants are administered ANAVEX 2-73 orally and IV in an open-label, 2-period, cross-over trial with adaptive study design lasting up to 36 days for each participant. In PART B of the study, all participants are administered ANAVEX 2-73 orally, daily. MMSE, EEG/ERP (P300) and Cogstate tests are performed at baseline and subsequently at weeks 12, 26, 38 and 52 of the PART B open-label extension.

Results: The primary outcome of the study is safety, and ANAVEX 2-73 was well tolerated. Among the secondary outcome endpoints, preliminary analysis shows an average improvement in MMSE score at week 5; a majority of all patients tested so far improved their respective MMSE scores. The average EEG/ERP (P300 amplitude) signal also improved, as did average Cogstate performance across the test batteries.

Conclusions: Data collected so far indicate that ANAVEX 2-73 is safe and well tolerated. Interim results also show improved cognitive performance after drug administration in subjects with mild-to-moderate AD. The current results seem to justify a prospective comparison with current standard of care in a larger clinical trial study. A more complete set of results will be available at the time of the conference.

Read Full Post »
