Archive for the ‘Explanatory’ Category

Article Title, Author/Curator’s Name and Article Views >1,000, 4/2012 – 1/2019 @pharmaceuticalintelligence.com

 

Reporter: Aviva Lev-Ari, PhD, RN

 

Expert, Author, Writer’s Initials — Name & Bio — Roles @LPBI Group

LHB — Larry Bernstein, MD, FACP
Member of the Board
Expert, Author, Writer – All Specialties of Medicine & Pathology
Content Consultant to Series B, C, D, E
Editor, Series D, Vol. 1; Series E, Vols. 2, 3
Co-Editor – BioMed e-Series, 13 of the 16 Vols.

JDP — Justin D. Pearlman, AB, MD, ME, PhD, MA, FACC
Expert, Author, Writer – All Specialties of Medicine, Cardiology and Cardiac Imaging
Content Consultant for Series A, Cardiovascular Diseases; Co-Editor: Vols. 2, 3, 4, 5, 6

ALA — Aviva Lev-Ari, PhD, RN
Ex-SRI Int’l; Ex-MITRE; Ex-McGraw-Hill
Director and Founder
Editor-in-Chief, @pharmaceuticalintelligence.com
Methodologies Developer:
  • Journal Platform Architect
  • CURATION of Scientific Findings Modules
  • REAL TIME eProceedings Digital 1-Click Publishing
Expert, Author, Writer:
  • Analytics
  • Molecular Cardiology
  • Vascular Biology

TB — Tilda Barliya, PhD, @BIU
Expert, Author, Writer: Nanotechnology for Drug Delivery
Co-Editor, Series C, Vols. 1, 2

DN — Dror Nir, PhD
Expert, Author, Writer: Cancer & Medical Imaging Algorithms

ZR — Ziv Raviv, PhD, @Technion
Expert, Author, Writer: Biological Sciences, Cancer

ZS — Zohi Sternberg, PhD
Expert, GUEST Author, Writer: Neurological Sciences

SJW — Stephen J. Williams, PhD Pharmacology, BSc Toxicology, Ex-Fox Chase
Expert, Author, Writer – Cancer Biology
Co-Editor, Series A, Vol. 1
Co-Editor, Series B, Genomics: Vols. 1, 2
Co-Editor, Series C, Cancer: Vols. 1, 2

DS — Demet Sag, PhD, CRA, GCP
Expert, Author, Writer: Genome Biology, Immunology, Biological Sciences: Cancer

SS — Sudipta Saha, PhD
Expert, Author, Writer: Reproductive Biology, Endocrinology, Bio-Instrumentation
Co-Editor, Series D, Vol. 2, Infectious Diseases

AV — Aviral Vatsa, PhD, MBBS
Expert, Author, Writer: Medical Sciences, Bone Disease, Human Sensation and Cellular Transduction: Physiology and Therapeutics

RS — Ritu Saxena, PhD
Expert, Author, Writer: Biological Sciences, Bone Disease, Cancer (Lung, Liver)

GST — Gail S. Thornton, PhD(c), Ex-Merck
Contributing Editor, Author and Medical Writer
Co-Editor, Series E, Vol. 1, Voices of Patients

RN — Raphael Nir, PhD, MSM, MSc, Ex-Schering-Plough
Expert, Author, Writer – Member of the Cancer Research Team: Brain Cancer, Liver Cancer, Cytokines
CSO, SBH Sciences, Inc.

MB — Michael R. Briggs, PhD, Ex-Pfizer
Expert, Author, Writer – Member of the Cancer Research Team: NASH
CSO, Woodland Biosciences

AK — Alan F. Kaul, R.Ph., Pharm.D., M.Sc., M.B.A., FCCP, Ex-Director, BWH Pharmacy
Expert, Author, Writer: Pharmacology – all aspects of drug development and dispensation; Policy Analyst

AS — Anamika Sarkar, PhD
Expert, Author, Writer: Computational Biology & Bioinformatics

MWF — Marcus Feldman, PhD, Stanford University, Biological Sciences, Center for Genomics
(751 Research items; 51,402 Reads; 39,126 Citations)
Member of the Board
Scientific Counsel: Life Sciences
Content Consultant, Series B, Genomics, Vols. 1, 2
Co-Editor, Vol. 2, NGS

 

Article Title and Views >1,000, 4/2012 – 1/2018

 

 

 

 

Article Title — Authors, Curators, Reporters (by Name) — Views by eReaders

Home page / Archives 600,145

Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View? LHB 16,720
Do Novel Anticoagulants Affect the PT/INR? The Cases of XARELTO (rivaroxaban) and PRADAXA (dabigatran) JDP, ALA 13,225
Paclitaxel vs Abraxane (albumin-bound paclitaxel) TB 11,872
Recent comprehensive review on the role of ultrasound in breast cancer management DN 11,715
Clinical Indications for Use of Inhaled Nitric Oxide (iNO) in the Adult Patient Market: Clinical Outcomes after Use, Therapy Demand and Cost of Care ALA 7,045
Apixaban (Eliquis): Mechanism of Action, Drug Comparison and Additional Indications ALA 6,435
Mesothelin: An early detection biomarker for cancer (By Jack Andraka) TB 6,309
Our TEAM ALA 6,213
Akt inhibition for cancer treatment, where do we stand today? ZR 4,744
Biochemistry of the Coagulation Cascade and Platelet Aggregation: Nitric Oxide: Platelets, Circulatory Disorders, and Coagulation Effects LHB 4,508
Newer Treatments for Depression: Monoamine, Neurotrophic Factor & Pharmacokinetic Hypotheses ZS 4,188
AstraZeneca’s WEE1 protein inhibitor AZD1775 Shows Success Against Tumors with a SETD2 mutation SJW 4,128
Confined Indolamine 2, 3 dioxygenase (IDO) Controls the Hemeostasis of Immune Responses for Good and Bad DS 3,678
The Centrality of Ca(2+) Signaling and Cytoskeleton Involving Calmodulin Kinases and Ryanodine Receptors in Cardiac Failure, Arterial Smooth Muscle, Post-ischemic Arrhythmia, Similarities and Differences, and Pharmaceutical Targets LHB 3,652
FDA Guidelines For Developmental and Reproductive Toxicology (DART) Studies for Small Molecules SJW 3,625
Cardiovascular Diseases, Volume One: Perspectives on Nitric Oxide in Disease Mechanisms Multiple Authors 3,575
Interaction of enzymes and hormones SS 3,546
AMPK Is a Negative Regulator of the Warburg Effect and Suppresses Tumor Growth In Vivo SJW 3,403
Causes and imaging features of false positives and false negatives on 18F-PET/CT in oncologic imaging DN 3,399
Introduction to Transdermal Drug Delivery (TDD) system and nanotechnology TB 3,371
Founder ALA 3,363
BioMed e-Series ALA 3,246
Signaling and Signaling Pathways LHB 3,178
Sexed Semen and Embryo Selection in Human Reproduction and Fertility Treatment SS 3,044
Alternative Designs for the Human Artificial Heart: Patients in Heart Failure – Outcomes of Transplant (donor)/Implantation (artificial) and Monitoring Technologies for the Transplant/Implant Patient in the Community JDP, LHB, ALA 3,034
The mechanism of action of the drug ‘Acthar’ for Systemic Lupus Erythematosus (SLE) Dr. Karra 3,016
VISION ALA 2,988
Targeting the Wnt Pathway [7.11] LHB 2,961
Bone regeneration and nanotechnology AV 2,922
Pacemakers, Implantable Cardioverter Defibrillators (ICD) and Cardiac Resynchronization Therapy (CRT) ALA 2,892
The History and Creators of Total Parenteral Nutrition LHB 2,846
Funding, Deals & Partnerships ALA 2,708
Paclitaxel: Pharmacokinetic (PK), Pharmacodynamic (PD) and Pharmacogenomics (PG) TB 2,700
LIK 066, Novartis, for the treatment of type 2 diabetes LHB 2,693
FDA Adds Cardiac Drugs to Watch List – TOPROL-XL® ALA 2,606
Mitochondria: Origin from oxygen free environment, role in aerobic glycolysis, metabolic adaptation LHB 2,579
Nitric Oxide and Platelet Aggregation Dr. Karra 2,550
Treatment Options for Left Ventricular Failure – Temporary Circulatory Support: Intra-aortic balloon pump (IABP) – Impella Recover LD/LP 5.0 and 2.5, Pump Catheters (Non-surgical) vs Bridge Therapy: Percutaneous Left Ventricular Assist Devices (pLVADs) and LVADs (Surgical) LHB 2,549
Isoenzymes in cell metabolic pathways LHB 2,535
“The Molecular pathology of Breast Cancer Progression” TB 2,491
In focus: Circulating Tumor Cells RS 2,465
Nitric Oxide Function in Coagulation – Part II LHB 2,444
Monoclonal Antibody Therapy and Market DS 2,443
Update on FDA Policy Regarding 3D Bioprinted Material SJW 2,410
Journal PharmaceuticalIntelligence.com ALA 2,340
A Primer on DNA and DNA Replication LHB 2,323
Pyrroloquinoline quinone (PQQ) – an unproved supplement LHB 2,294
Integrins, Cadherins, Signaling and the Cytoskeleton LHB 2,265
Evolution of Myoglobin and Hemoglobin LHB 2,251
DNA Structure and Oligonucleotides LHB 2,187
Lipid Metabolism LHB 2,176
Non-small Cell Lung Cancer drugs – where does the Future lie? RS 2,143
Biosimilars: CMC Issues and Regulatory Requirements ALA 2,101
The SCID Pig: How Pigs are becoming a Great Alternate Model for Cancer Research SJW 2,092
About ALA 2,076
Sex Hormones LHB 2,066
CD47: Target Therapy for Cancer TB 2,041
Peroxisome proliferator-activated receptor (PPAR-gamma) Receptors Activation: PPARγ transrepression for Angiogenesis in Cardiovascular Disease and PPARγ transactivation for Treatment of Diabetes ALA 2,017
Swiss Paraplegic Centre, Nottwil, Switzerland – A World-Class Clinic for Spinal Cord Injuries GST 1,989
Introduction to Tissue Engineering; Nanotechnology applications TB 1,964
Problems of vegetarianism SS 1,940
The History of Infectious Diseases and Epidemiology in the late 19th and 20th Century LHB 1,817
The top 15 best-selling cancer drugs in 2022 & Projected Sales in 2020 of World’s Top Ten Oncology Drugs ALA 1,816
Nanotechnology: Detecting and Treating metastatic cancer in the lymph node TB 1,812
Unique Selling Proposition (USP) — Building Pharmaceuticals Brands ALA 1,809
Wnt/β-catenin Signaling [7.10] LHB 1,777
The role of biomarkers in the diagnosis of sepsis and patient management LHB 1,766
Neonatal Pathophysiology LHB 1,718
Nanotechnology and MRI imaging TB 1,672
Cardiovascular Complications: Death from Reoperative Sternotomy after prior CABG, MVR, AVR, or Radiation; Complications of PCI; Sepsis from Cardiovascular Interventions JDP, ALA 1,659
Ultrasound-based Screening for Ovarian Cancer DN 1,655
Justin D. Pearlman, AB, MD, ME, PhD, MA, FACC, Expert, Author, Writer, Editor & Content Consultant for e-SERIES A: Cardiovascular Diseases JDP 1,653
Scientific and Medical Affairs Chronological CV ALA 1,619
Competition in the Ecosystem of Medical Devices in Cardiac and Vascular Repair: Heart Valves, Stents, Catheterization Tools and Kits for Open Heart and Minimally Invasive Surgery (MIS) ALA 1,609
Stenting for Proximal LAD Lesions ALA 1,603
Mitral Valve Repair: Who is a Patient Candidate for a Non-Ablative Fully Non-Invasive Procedure? JDP, ALA 1,602
Nitric Oxide, Platelets, Endothelium and Hemostasis (Coagulation Part II) LHB 1,597
Outcomes in High Cardiovascular Risk Patients: Prasugrel (Effient) vs. Clopidogrel (Plavix); Aliskiren (Tekturna) added to ACE or added to ARB LHB 1,588
Diet and Diabetes LHB 1,572
Clinical Trials Results for Endothelin System: Pathophysiological role in Chronic Heart Failure, Acute Coronary Syndromes and MI – Marker of Disease Severity or Genetic Determination? ALA 1,546
Dealing with the Use of the High Sensitivity Troponin (hs cTn) Assays LHB 1,540
Biosimilars: Intellectual Property Creation and Protection by Pioneer and by Biosimilar Manufacturers ALA 1,534
Altitude Adaptation LHB 1,527
Baby’s microbiome changing due to caesarean birth and formula feeding SS 1,498
Interview with the co-discoverer of the structure of DNA: Watson on The Double Helix and his changing view of Rosalind Franklin ALA 1,488
Triple Antihypertensive Combination Therapy Significantly Lowers Blood Pressure in Hard-to-Treat Patients with Hypertension and Diabetes ALA 1,476
IDO for Commitment of a Life Time: The Origins and Mechanisms of IDO, indolamine 2, 3-dioxygenase DS 1,469
CRISPR/Cas9: Contributions on Endoribonuclease Structure and Function, Role in Immunity and Applications in Genome Engineering LHB 1,468
Cancer Signaling Pathways and Tumor Progression: Images of Biological Processes in the Voice of a Pathologist Cancer Expert LHB 1,452
Signaling transduction tutorial LHB 1,443
Diagnostic Evaluation of SIRS by Immature Granulocytes LHB 1,440
UPDATED: PLATO Trial on ACS: BRILINTA (ticagrelor) better than Plavix® (clopidogrel bisulfate): Lowering chances of having another heart attack ALA 1,426
Cardio-oncology and Onco-Cardiology Programs: Treatments for Cancer Patients with a History of Cardiovascular Disease ALA 1,424
Nanotechnology and Heart Disease TB 1,419
Aviva Lev-Ari, PhD, RN, Director and Founder ALA 1,416
Cardiotoxicity and Cardiomyopathy Related to Drugs Adverse Effects LHB 1,415
Nitric Oxide and its impact on Cardiothoracic Surgery TB 1,405
A New Standard in Health Care – Farrer Park Hospital, Singapore’s First Fully Integrated Healthcare/Hospitality Complex GST 1,402
Mitochondrial Damage and Repair under Oxidative Stress LHB 1,398
Ovarian Cancer and fluorescence-guided surgery: A report TB 1,395
Sex determination vs. Sex differentiation SS 1,393
LPBI Group ALA 1,372
Closing the Mammography gap DN 1,368
Cytoskeleton and Cell Membrane Physiology LHB 1,367
Crucial role of Nitric Oxide in Cancer RS 1,364
Medical 3D Printing ALA 1,332
Survivals Comparison of Coronary Artery Bypass Graft (CABG) and Percutaneous Coronary Intervention (PCI) / Coronary Angioplasty LHB 1,325
The Final Considerations of the Role of Platelets and Platelet Endothelial Reactions in Atherosclerosis and Novel Treatments LHB 1,310
Disruption of Calcium Homeostasis: Cardiomyocytes and Vascular Smooth Muscle Cells: The Cardiac and Cardiovascular Calcium Signaling Mechanism LHB, JDP, ALA 1,301
Mitochondrial Dynamics and Cardiovascular Diseases RS 1,284
Nitric Oxide and Immune Responses: Part 2 AV 1,282
Liver Toxicity halts Clinical Trial of IAP Antagonist for Advanced Solid Tumors SJW 1,269
Inactivation of the human papillomavirus E6 or E7 gene in cervical carcinoma cells using a bacterial CRISPR/Cas ALA 1,261
Autophagy LHB 1,255
Mitochondrial fission and fusion: potential therapeutic targets? RS 1,246
Summary of Lipid Metabolism LHB 1,239
Nitric Oxide has a Ubiquitous Role in the Regulation of Glycolysis – with a Concomitant Influence on Mitochondrial Function LHB 1,233
Future of Calcitonin…? Dr. Karra 1,211
Transcatheter Aortic Valve Implantation (TAVI): FDA approves expanded indication for two transcatheter heart valves for patients at intermediate risk for death or complications associated with open-heart surgery ALA 1,197
Gamma Linolenic Acid (GLA) as a Therapeutic tool in the Management of Glioblastoma RN, MB 1,193
Nanotechnology and HIV/AIDS Treatment TB 1,181
Patiromer – New drug for Hyperkalemia ALA 1,179
‘Gamifying’ Drug R&D: Boehringer Ingelheim, Sanofi, Eli Lilly ALA 1,177
A Patient’s Perspective: On Open Heart Surgery from Diagnosis and Intervention to Recovery Guest Author: Ferez S. Nallaseth, Ph.D. 1,173
Assessing Cardiovascular Disease with Biomarkers LHB 1,167
Development Of Super-Resolved Fluorescence Microscopy LHB 1,166
Ubiquitin-Proteosome pathway, Autophagy, the Mitochondrion, Proteolysis and Cell Apoptosis: Part III LHB 1,162
Atrial Fibrillation contributing factor to Death, Autopsy suggests CEO Dave Goldberg had heart arrhythmia before death ALA 1,159
Linus Pauling: On Lipoprotein(a) Patents and On Vitamin C ALA 1,156
Bystolic’s generic Nebivolol – Positive Effect on circulating Endothelial Progenitor Cells Endogenous Augmentation ALA 1,154
The History of Hematology and Related Sciences LHB 1,151
Heroes in Medical Research: Barnett Rosenberg and the Discovery of Cisplatin SJW 1,146
Overview of New Strategy for Treatment of T2DM: SGLT2 Inhibiting Oral Antidiabetic Agents AV 1,143
Imatinib (Gleevec) May Help Treat Aggressive Lymphoma: Chronic Lymphocytic Leukemia (CLL) ALA 1,140
Issues in Personalized Medicine in Cancer: Intratumor Heterogeneity and Branched Evolution Revealed by Multiregion Sequencing SJW 1,137
New England Compounding Center: A Family Business AK 1,120
EpCAM [7.4] LHB 1,113
Amyloidosis with Cardiomyopathy LHB 1,110
Can Mobile Health Apps Improve Oral-Chemotherapy Adherence? The Benefit of Gamification. SJW 1,095
Acoustic Neuroma, Neurinoma or Vestibular Schwannoma: Treatment Options ALA 1,089
Treatment of Refractory Hypertension via Percutaneous Renal Denervation ALA 1,088
Proteomics – The Pathway to Understanding and Decision-making in Medicine LHB 1,085
Low Bioavailability of Nitric Oxide due to Misbalance in Cell Free Hemoglobin in Sickle Cell Disease – A Computational Model AS 1,085
Pancreatic Cancer: Genetics, Genomics and Immunotherapy TB 1,083
A NEW ERA OF GENETIC MANIPULATION   DS 1,075
Targeting Mitochondrial-bound Hexokinase for Cancer Therapy ZR 1,074
Normal and Anomalous Coronary Arteries: Dual Source CT in Cardiothoracic Imaging JDP, ALA 1,062
Transdermal drug delivery (TDD) system and nanotechnology: Part II TB 1,057
Lung Cancer (NSCLC), drug administration and nanotechnology TB 1,046
Pharma World: The Pharmaceutical Industry in Southeast Asia – Pharma CPhI 20-22 March, 2013, Jakarta International Expo, Jakarta, Indonesia ALA 1,045
Nitric Oxide and Sepsis, Hemodynamic Collapse, and the Search for Therapeutic Options LHB 1,044
Targeted delivery of therapeutics to bone and connective tissues: current status and challenges- Part I AV 1,044
Press Coverage ALA 1,036
Carbohydrate Metabolism LHB 1,036
Open Abdominal Aortic Aneurysm (AAA) repair (OAR) vs. Endovascular AAA Repair (EVAR) in Chronic Kidney Disease Patients – Comparison of Surgery Outcomes LHB, ALA 1,032
In focus: Melanoma Genetics RS 1,018
Cholesteryl Ester Transfer Protein (CETP) Inhibitor: Potential of Anacetrapib to treat Atherosclerosis and CAD ALA 1,015
Medical Devices Start Ups in Israel: Venture Capital Sourced Locally – Rainbow Medical (GlenRock) & AccelMed (Arkin Holdings) ALA 1,007
The Development of siRNA-Based Therapies for Cancer ZR 1,003

Other related articles published in this Open Access Online Scientific Journal include the following:

FIVE years of e-Scientific Publishing @pharmaceuticalintellicence.com, Top Articles by Author and by e-Views >1,000, 4/27/2012 to 1/29/2018

https://pharmaceuticalintelligence.com/2017/04/28/five-years-of-e-scientific-publishing-pharmaceuticalintellicence-com-top-articles-by-author-and-by-e-views-1000-4272012-to-4272017/

Electronic Scientific AGORA: Comment Exchanges by Global Scientists on Articles published in the Open Access Journal @pharmaceuticalintelligence.com – Four Case Studies

Curator and Editor-in-Chief: Journal and BioMed e-Series, Aviva Lev-Ari, PhD, RN

 

Introduction

Case Study #1: 40 Responses

  • Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View?

Author: Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2012/10/17/is-the-warburg-effect-the-cause-or-the-effect-of-cancer-a-21st-century-view/

Case Study #2: 26 Responses

  • Knowing the tumor’s size and location, could we target treatment to THE ROI by applying…..

Author: Dror Nir, PhD

https://pharmaceuticalintelligence.com/2012/10/16/knowing-the-tumors-size-and-location-could-we-target-treatment-to-the-roi-by-applying-imaging-guided-intervention/

Case Study #3: 24 Responses

  • Personalized Medicine: Cancer Cell Biology and Minimally Invasive Surgery (MIS)

Curator: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2012/12/01/personalized-medicine-cancer-cell-biology-and-minimally-invasive-surgery-mis/

Case Study #4: 13 Responses

  • Judging the ‘Tumor response’-there is more food for thought

https://pharmaceuticalintelligence.com/2012/12/04/judging-the-tumor-response-there-is-more-food-for-thought/

Conclusions

 

Introduction

Members of our Team published 5,295 articles between 4/2012 and 4/10/2018 and engaged in Comment Exchanges with Global Scientists Online. 1,412,106 eReaders have viewed our articles, and 7,283 scientific comments are included in the Journal Archive.

Team Members’ Profile

Team Profile: DrugDiscovery @LPBI Group – A BioTech Start Up submitted for Funding Competition to MassChallenge Boston 2016 Accelerator

In our Scientific Agora, scientific comment exchanges take place between Global eReader Scientists and LPBI’s Scientists/Experts/Authors/Writers. In this curation I present four articles that generated dozens of scientific comments and multifaceted exchanges.

The Voice of Aviva Lev-Ari, PhD, RN:

It is my strongest conviction that the following features of Global SHARING of the Scientific product, aka “An Article written by a Scientist,” have merit in the Digital Scientific Publishing Age:

  • Every new article published in Open Access Journals helps mitigate the most acute challenge of the e-Scientific Publishing industry today: Information Obsolescence – the newness of findings.
  • Every new article published in Open Access Journals AND in Subscription-based Journals contributes to the second most acute challenge of the e-Scientific Publishing industry today: Information Explosion – the volume of findings.
  • The Scientific Agora, as presented below in four Case Studies, is an optimal means for Global SHARING, in Real Time, of the scientific knowledge deriving from the clinical expertise and lab experience of all the participants in the Agora. REAL TIME means minimizing the negative impact of the most acute challenge of the e-Scientific Publishing industry today: Information Obsolescence.
  • Knowledge SHARING of our Scientists’ articles occurs across two FORUMS:

Forum One is the Scientists who joined the comment exchanges between the Article Author and other members of our Team on a given Scientific product, aka “An Article written by a Scientist.”

Forum Two is the Global Universe of Scientists that (a) are e-mail Followers who opted in to our Open Access Journal’s free subscription and (b) eReaders of our Journal that have not yet opted to follow the Journal by e-mail – a robust crowd of over 1.4 Million Scientists.

  • We mitigate the negative impact of the second most acute challenge of the e-Scientific Publishing industry today, Information Explosion, through our own advanced achievements in the practice of:
  1. Development of the Methodology for Curation of Scientific Findings, Curation of Scientific Content @Leaders in Pharmaceutical Business Intelligence (LPBI) Group, Boston
  2. Application of the Methodology for Curation of Scientific Findings in a BioMed e-Series of 16-Volumes in Medicine and Life Sciences on Amazon.com

electronic Table of Contents (eTOCs) of each Volume in the SIXTEEN Volume BioMed e-Series

WE ARE ON AMAZON.COM

https://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Ddigital-text&field-keywords=Aviva+Lev-Ari&rh=n%3A133140011%2Ck%3AAviva+Lev-Ari

Commentaries on each Volume’s Contribution to Medical Education by L.H. Bernstein, MD, FCAP and by Aviva Lev-Ari, PhD, RN – BioMedical e-Books e-Series: Multiple Volumes in Five e-Series

https://pharmaceuticalintelligence.com/biomed-e-books/commentaries-on-each-volumes-contribution-to-medical-education-by-l-h-bernstein-md-fcap-and-aviva-lev-ari-phd-rn-biomedical-e-books-e-series-multiple-volumes-in-five-e-series/

In 2016, LPBI’s BioMed e-Series was Submitted for Nomination for 2016 COMMUNICATION AWARD FOR EXCELLENCE IN REPORTING SCIENCE, MEDICINE AND ENGINEERING – Reference #: 9076095, on 1/27/2016

https://pharmaceuticalintelligence.com/biomed-e-books/

  • Lastly, it is my strong belief that the Methodology of Curation will become a major tool used in Content creation for Curriculum Development in Medical Schools, in the Life Sciences and in the Healthcare Allied professions.
  • We have pioneered and shown the way BY EXAMPLE: over 5,200 Scientific products, aka “An Article written by a Scientist,” constitute our Journal Archive created by content curation.
  • More New e-Book Titles are coming in 2018-2019 in LPBI’s BioMed e-Series.
  • More e-Scientific Publishers will use the Methodology of Creation of electronic Tables of Contents of e-Books by combing Archives, carried out by very experienced subject-matter Editors.
  • Global SHARING of Information has become best practice for Academic Course Contents in the last ten years.
  • On-Line Degrees are spreading in many disciplines and are offered by very many colleges, including the Ivy League.
  • Open Access Scientific Journals are the FUTURE of the e-Scientific Publishing Industry.

 

Case Study #1:

  • Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View?

Author: Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2012/10/17/is-the-warburg-effect-the-cause-or-the-effect-of-cancer-a-21st-century-view/

40 Responses

  1. This is OUTSTANDING.

    Now we need a “shortcliff” post to follow one chart that traces the dynamic process, no reader shall get lost inside any of the process boxes.

  2. Really nice overview and very interesting metabolic changes.
    However, related to the title, the cancerous changes- event always comes first before lactate preferred metabolism comes into place. Right?

  3. This is what has been inferred. So if that is the premise, then the mutation would be the first event. That position has been successfully challenged and also poses a challenge to the proper view of genomic discovery. The real event may very well be the ongoing oxidative stress with aging, and decreased physiochemical reserve.

    I haven’t developed the whole picture. Nitric oxide and nitrosylation contribute to both vascular relaxation and vasoconstriction, which is also different in major organs. The major carriers of H+ are NADH and FADH2. Electron transport is in the ETC in mitochondria. I called attention to the “escape” of energy in aerobic glycolysis. As disease ensues, it appears that lactate generation is preferential as the mitochondrion takes up substrate from gluconeogenesis. Whether it is an endotoxic shock or a highly malignant fast growing tumor, the body becomes trapped in “autocatabolism”. So the tumor progresses, apoptosis is suppressed, and there is a loss of lean body mass.
    All of this is tied to genetic instability.

    We see the genetic instability as first because of the model DNA–RNA–protein. We don’t have a map.

  4. It is a very nice report. I did work for a short time to develop compounds to block the glucose uptake especially using glucose-mimics. I wonder is there any research on this area going on now?

  5. Thanks. I have been researching this exhaustively. There are even many patents trying to damp this down. You were on the right track. The biggest problem has been multidrug resistance and tumor progression.

  6. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? (pharmaceuticalintelligence.com) […]

  7. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? (pharmaceuticalintelligence.com) […]

  8. Martin Canizales • Warburg effect (http://www.cellsignal.com/reference/pathway/warburg_effect.html), is responsible of overactivation of the PI3K… the produced peroxide via free radicals over activate the cyclooxigenase and consequently the PI3K pathway activating there, the most important protein-kinase ever described in the last mmmh, 60-70 years? maybe… to broke the Warburg effect, will stop the PI3K activation (http://www.cellsignal.com/reference/pathway/Akt_PKB.html) then all the cancer protein related with the generation of tumor (pAKT,pP70S6K, Cyclin D1, HIF1, VEGF, EGFrc, GSK, Myc, etc, etc, etc), will get down regulation. That is what happen, when I knock down the new protein-kinase in pancreatic cancer cell lines… stable KD of pancreatic cancer cell lines divide very-very-veeeery slow (by Western blotting, cyclin D1 disapear, VEGF, HIF1a, MyC, pAKT, pP70S6K, GSK, and more and more also has, very-very few consume of glucose [diabetes and cancer]. Stable cells can be without change the media for 3 weeks and the color doesn’t change, cells divide but VERY slow and are alive [longevity]) are not able to generate xenograft tumors related, to scramble shRNA stable cell lines. When, we broke the warburg effect, the protein kinase get’s down as well all the others. Is the same, with bacteria infections…. bacteria infections, has many things to teach us about cancer and cell proliferation (http://www.ncbi.nlm.nih.gov/pubmed/22750098)

  9. hijoprodigoendistancia (November 12, 2012 at 5:41 PM):

    research paper, should be ready (writing) very soon and must be submmited before end this year. Hee hee! you know… end of the world is in December 21 2012

    • The emphasis on p13 and the work on pancreatic cancer is very interesting. I’ll check the references you give. The Warburg effect is still metabolic, and it looks like you are able to suppress the growth of either cancer cells or bacteria. The outstanding question is whether you can get a head start on the SIR transition to sepsis to severe sepsis to MODS, to shock.

      It looks like an article will be necessary after your work is accepted for publication. Thanks a lot for the response.

  10. hijoprodigoendistancia (November 12, 2012 at 8:52 PM):

    Also, when this protein-kinase is over expressed… UCP1 get down..then, less mitochondria, consequently less aerobic cell functions…in adipose tissue, less mitochondria promote the differentiation of BAT (Brown Adipose Tissue) to, WAT (White Agipose Tissue). Has relation with AS160 phosphorylation, Glut4 membrane translocation, promote the GABA phosphorylation (schizophrenia-autism), neuronal differentiation (NPCs:Neural Progenitor Cells), dopaminergic cell differentiation….

  11. hijoprodigoendistancia (November 12, 2012 at 8:55 PM):

    Larry, all comments are part of the second paper.

  12. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? […]

  13. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? […]

  14. Larry please take a look at Gonzalez et al. The Bioenergetic theory of Carcinogenesis. Med Hypotheses 2012; 79: 433-439 and let me know your thoughts.

  15. […] The Initiation and Growth of Molecular Biology and Genomics, Part I […]

  16. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? […]

  17. Aashir Awan, PhD (May 22, 2013 at 11:36 PM):

    Informative article especially concerning activation of HIF under normoxic conditions. Recently, a paper has come out showing patients showing symptoms of mood disorder having increased expression of Hif1a. Also, there are reports that Hif1a is important in development of certain tissue types.

  18. COLOURS AND LIFE. The basic idea of this theory is that the oxidation of hydrogen and carbon atoms, arising from the degradation of carbohydrates, is by two distinct processes based on oxidation-reduction electron transfer and photochemical process of energy release on the basis of color complementary, predominance of one or another depending on intracellular acid-base balance. I can not understand why nobody wants to do this experiment. I’m sure this assumption hides a truth. Before considering it a fiction to be checked experimentally. I would like to present a research project that concerns me for a long time that I can not experience myself.
    Involuntarily, after many years of searching, I have concluded that in the final biological oxidation, in addition to the oxidation-reduction electron transfer occurs photo-chemical process, accordance to the principle of color complementary energy transfer. I imagine an experiment that might be relevant (sure it can be improved). In my opinion, if this hypothesis proves true, one can control the energy metabolism of the cell by chromotherapy, as the structures involved are photosensitive and colorful. I would be very happy if this experiment were done under your leadership. Sincerely yours Dr. Viorel Bungau

    INNER LIGHT – LIGHT OF LIFE.
    CHROMOTHERAPY AND ITS IMPLICATIONS IN THE METABOLISM OF THE NORMAL AND NEOPLASTIC CELL. “Chlorophyll and hemoglobin, the pigments of life, differ in their porphyrin structure only in that chlorophyll is green because of the magnesium atom in its structure, while hemoglobin is red because of the iron atom in its structure. This is evidence of the common origin of life.” (Heilmeyer) We propose an experiment to prove that in the final biological oxidation, in addition to the oxidation-reduction process that forms H2O and CO2, there is a photochemical effect by which energy is transferred from the H or C atom. This transfer is selective by colour, on the basis of complementary colours, because the structures involved are coloured (hemoglobin with Fe, red; chlorophyll with Mg, green; ceruloplasmin with Cu, blue; cytochrome oxidase with Fe, red; cytochrome oxidase with Cu, green; etc.). The basic idea is that if the pigments of life (chlorophyll, hemoglobin, the cytochromes), which provide the energy metabolism of the cell, are coloured, then we can control their activity through chromotherapy, on the basis of complementary colours, and rebalance the body energetically, guided by a figurative coloured “X-ray” of the body.
    In my opinion, at the basis of malignant transformation is a disturbance of energy metabolism that has reached a level the cell can no longer correct (after having succeeded many times before), a disturbance that affects the whole body to different degrees and requires correction from outside, starting from the idea that the final biological oxidation takes place through a photochemical process of releasing and receiving energy. “Duality of cytochrome oxidase: cell Proliferation (growth) and Differentiation (maturation).” Cytochrome oxidase is present in two forms, depending on the acid-base context of the internal environment: 1. The acidic form (acidosis), which contains two iron atoms, will be red and will absorb, by complementarity, the green energy of the hydrogen atom derived from carbohydrates, with formation of H2O, a metabolic context that promotes cell proliferation. 2. The alkaline form (alkalosis), which contains two copper atoms, will be green and will absorb, by complementarity, the red energy of the carbon atom derived from carbohydrates, with formation of CO2, a metabolic context that promotes cell differentiation. The structure of cytochrome oxidase has two copper atoms. It is known that in conditions of acidosis (oxidative potential), by the principle of metal electronegativity, copper is displaced from its combinations by iron. Cytochrome oxidase will then contain two iron atoms instead of the copper atoms, which changes its oxidation-reduction potential but also, most importantly, its colour. Where the copper was green, the iron is red, which radically changes its absorption spectrum, on the principle of complementary colours.
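For background, the one piece of uncontroversial chemistry invoked here is the overall oxidation of glucose to CO2 and H2O. A minimal sketch checking the atom balance of that textbook equation (the partitioning into separate "hydrogen" and "carbon" energy channels is the author's hypothesis, not part of the stoichiometry):

```python
# Standard stoichiometry check, included only as background: complete
# oxidation of glucose yields the CO2 and H2O the letter attributes to
# "oxidative decarboxylation" (C -> CO2) and "oxidative dehydrogenation"
# (H -> H2O). Atom counts are given per molecule of each species.
glucose = {"C": 6, "H": 12, "O": 6}
o2, co2, h2o = {"O": 2}, {"C": 1, "O": 2}, {"H": 2, "O": 1}

def total(side):
    """Sum atom counts over (coefficient, species) pairs."""
    out = {}
    for coeff, sp in side:
        for atom, n in sp.items():
            out[atom] = out.get(atom, 0) + coeff * n
    return out

left = total([(1, glucose), (6, o2)])
right = total([(6, co2), (6, h2o)])
print(left == right)   # True: C6H12O6 + 6 O2 -> 6 CO2 + 6 H2O balances
```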
    “Inner Light- Light of Life. Endogenous monochromatic irradiation. Red ferment of Warburg – Green ferment of Warburg.”
    In my opinion, at the basis of malignant transformation is a disturbance of energy metabolism that has reached a level the cell can no longer correct (after having succeeded many times before), a disturbance that affects the whole body to different degrees and requires correction from outside, starting from the idea that the final biological oxidation takes place through a photochemical process of releasing and receiving energy. If the structures involved in the final biological oxidation are coloured, then their energy absorption follows the principle of complementary colours. If we can determine the absorption spectrum at the different levels, we can control energy metabolism by chromotherapy: EXOGENOUS MONOCHROMATIC IRRADIATION. The energy absorption within the biological oxidation process itself, based on complementary colours, by the structures involved (the cytochromes), porphyrins that become coloured in combination with a metal and absorb the complementary colour corresponding to a specific absorption spectrum, would be ENDOGENOUS MONOCHROMATIC IRRADIATION.
    This entitles us to believe that: in photosynthesis, the absorption of light and its storage in the form of carbohydrates is selective by colour, and in cellular energy metabolism the absorption of the energy released by the degradation of carbohydrates is likewise selective, based on complementary colours. In the final biological oxidation, in addition to an oxidation-reduction process, a photochemical process based on complementary colours also takes place, the first in the electron transfer, the second in the energy transfer. So in the mitochondria there is a process of oxidation of the C and H atoms derived from carbohydrates, with release of energy and its selective absorption (by colour) by the structures involved, which are porphyrins, photosensitive and coloured, if we accept that the coenzymes involved contain a metal atom that gives them a certain colour depending on the state of oxidation or reduction (the red ferment of Warburg with iron; ceruloplasmin with copper, blue; chlorophyll with magnesium, green; hemoglobin with iron, red; cytochrome oxidase with copper, green; etc.).
    According to the principle of metal electronegativity, under certain conditions of acid-base imbalance (acidosis), iron will replace copper in its combinations; cytochrome oxidase becomes inactive, which changes its oxidation-reduction potential BUT, ABOVE ALL, ITS COLOUR, FROM GREEN TO RED, blocking the final biological oxidation and producing the appearance of aerobic glycolysis. Hence my research proposal: to prove that in the final biological oxidation, in addition to an oxidation-reduction process, a photochemical process also takes place, the first in the electron transfer, the second in the energy transfer.
    I SUGGEST AN EXPERIMENT:

    TWO PLANTS: ONE UNDER RED (CORAL) LIGHT ONLY, IN A BASIC MEDIUM WITH ADDED COPPER, WILL GROW, FLOWER AND FRUIT IN A SHORT TIME; THE OTHER, UNDER GREEN (TURQUOISE) LIGHT ONLY, IN AN ACID MEDIUM WITH AN ADDED COPPER CHELATOR, WILL KEEP GROWING BUT WILL PRODUCE NEITHER FLOWERS NOR FRUIT.

    A CULTURE OF NEOPLASTIC TISSUE, IRRADIATED WITH MONOCHROMATIC GREEN (TURQUOISE) LIGHT, IN AN ALKALINE MEDIUM WITH ADDED COPPER, WILL SHOW REGRESSION OF THE TISSUE CULTURE.

    A CULTURE OF NEOPLASTIC TISSUE, IRRADIATED WITH RED (CORAL) LIGHT, IN AN ACID MEDIUM WITH AN ADDED COPPER CHELATOR, WILL SHOW EXAGGERATED AND ANARCHIC MULTIPLICATION.
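For the reader's convenience, the capitalized predictions above amount to four experimental arms. Restated as data (the field names are editorial labels, and the "predicted" outcomes are the author's conjectures, not established results):

```python
# The letter's proposed conditions, restated as data for clarity.
# Field names are editorial labels; "predicted" records the author's
# conjectured outcome for each arm, not an established result.
arms = [
    {"system": "plant", "light": "red (coral)", "medium": "basic",
     "copper": "added", "predicted": "rapid flowering and fruiting"},
    {"system": "plant", "light": "green (turquoise)", "medium": "acid",
     "copper": "chelated", "predicted": "growth, no flowers or fruit"},
    {"system": "neoplastic tissue culture", "light": "green (turquoise)",
     "medium": "alkaline", "copper": "added", "predicted": "regression"},
    {"system": "neoplastic tissue culture", "light": "red (coral)",
     "medium": "acid", "copper": "chelated",
     "predicted": "exaggerated, anarchic multiplication"},
]

for a in arms:
    print(f'{a["system"]:<26} | {a["light"]:<17} | {a["medium"]:<8} | {a["predicted"]}')
```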
    If in photosynthesis the effect of monochromatic irradiation is direct, in the final biological oxidation the effect is reversed: exogenous irradiation with green induces endogenous irradiation with red, and vice versa. A body with cancer will be chemically “red”, acid (by pH, rH, pCO2, alkaline reserve), and energetically green (on the figurative coloured X-ray of the body). A healthy body will be chemically “green”, alkaline (as evidenced by laboratory findings), and energetically red (visible on the figurative coloured X-ray of the body). Sincerely yours, Dr. Viorel Bungau

    -In addition-
    “Life balance: Darkness and Light – Water and Fire – Yin and Yang.”

    The structure of cytochrome oxidase has two copper atoms. It is known that in conditions of acidosis (oxidative potential), by the principle of metal electronegativity, copper is displaced from its combinations by iron. Cytochrome oxidase will then contain two iron atoms instead of the copper atoms, which changes its oxidation-reduction potential but also, most importantly, its colour: where the copper was green, the iron is red, which radically changes its absorption spectrum, on the principle of complementary colours. If in neoplastic cells, because of acidosis, the acid form of cytochrome oxidase (red, with iron atoms) is overactive, absorbing (exclusively) the complementary green energy of the hydrogen atom, with production of H2O, so that water prevails, then in schizophrenia the alkaline intracellular environment of the neuron will promote the basic form of cytochrome oxidase (green, with copper atoms), which will oxidize only carbon atoms, absorbing the complementary red energy and producing CO2, so that fire prevails. From this theory is drawn an interdependent relationship between water and fire, between hydrogen (H2O) and carbon (CO2), in a controlled relationship with oxygen (O2). If photosynthesis is a process of reduction of carbon oxide (CO2) and hydrogen oxide (H2O), increasing the electronegativity of the C and H atoms and returning the electrons to oxygen, which is released, then in the mitochondria there is a process of oxidation of the C and H atoms derived from carbohydrates, with release of energy and its selective absorption (by colour) by the structures involved, which are porphyrins, photosensitive and coloured. It would mean that matter and energy in the universe stand in a relationship based on complementary colours, each colour of energy corresponding to a certain chemical structure.
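Where the letter invokes acidosis and alkalosis, the standard quantitative link between the parameters it cites (pH, pCO2, the "alkaline reserve", i.e. bicarbonate) is the Henderson-Hasselbalch equation for the bicarbonate buffer. A background sketch with typical arterial reference values (the numbers are illustrative, not data from the letter):

```python
import math

# Background sketch only: pH, pCO2 and bicarbonate ("alkaline reserve")
# are related by the standard Henderson-Hasselbalch equation for the
# bicarbonate buffer system. The arguments below are typical arterial
# reference values, not measurements from the letter.

def blood_ph(hco3_mmol_l, pco2_mmhg, pk=6.1, sol=0.03):
    """pH = pK + log10([HCO3-] / (sol * pCO2))."""
    return pk + math.log10(hco3_mmol_l / (sol * pco2_mmhg))

print(round(blood_ph(24, 40), 2))   # 7.4: normal arterial pH
```

Lowering bicarbonate at constant pCO2 lowers the computed pH, which is the metabolic acidosis the letter repeatedly appeals to.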
In my opinion, at the basis of malignant transformation is a disturbance of energy metabolism that has reached a level the cell can no longer correct (after having succeeded many times before), a disturbance that affects the whole body to different degrees and requires correction from outside, starting from the idea that the final biological oxidation takes place through a photochemical process of releasing and receiving energy. The final biological oxidation is achieved through an oxidation-reduction process and, at the same time, a photochemical process based on the principle of complementary colours, if we accept that the coenzymes involved contain a metal atom that gives them a certain colour depending on the state of oxidation or reduction (the red ferment of Warburg with iron; ceruloplasmin with copper, blue; chlorophyll with magnesium, green; hemoglobin with iron, red; etc.). If it is confirmed that the final biological oxidation is achieved by a photochemical mechanism (besides the oxidation-reduction one), in which energy is released on the basis of complementary colours, it means that we can control the final biological oxidation mechanism, irreversibly disrupted in cancer, by chromotherapy and by correction of the acid-base imbalance that underlies this disorder. We reached these conclusions while studying the final biological oxidation in order to understand the biochemical mechanism of aerobic glycolysis in cancer. We found that in the cancer cell energy metabolism relies almost exclusively on hydrogen, by oxidative dehydrogenation, because excessive acidosis leaves the coenzymes that carry out carbon oxidation dormant (these coenzymes have become inactive). If we accept the coloured nature of these coenzymes (see the red ferment of Warburg), they could be reactivated by correcting the acidosis (because of which they became leuco-derivatives) and by chromotherapy, on the basis of complementary colours.
According to the principle of metal electronegativity, under certain conditions of acid-base imbalance (acidosis), iron will replace copper in its combinations; cytochrome oxidase (which contains two copper atoms) becomes inactive, which changes its oxidation-reduction potential BUT, ABOVE ALL, ITS COLOUR, FROM GREEN TO RED, blocking the final biological oxidation and producing the appearance of aerobic glycolysis.

    Malignant transformation occurs through an imbalance of energy metabolism in which energy is generated predominantly (or exclusively) from the hydrogen atom, oxidation of carbon being impossible. At the cellular level this produces exaggerated (exclusive) multiplication (growth), since energy from hydrogen favours growth and multiplication at the expense of differentiation (maturation). Differentiation, which is achieved with energy obtained by oxidation of the carbon atom, cannot take place, and this leads to carcinogenesis. In the energy metabolism of the cell, the energy source is the degradation of carbohydrates, which proceeds by OXIDATIVE DEHYDROGENATION AND OXIDATIVE DECARBOXYLATION, yielding energy together with CO2 and H2O. In normal cells there is a balance between the two energy sources. In cancer cells, oxidation of the carbon atom is not possible, the cell being forced to rely on the only energy source available, hydrogen. This disorder underlies the malignant transformation of cells and affects the whole body to various degrees; the body often manages to rebalance the process, until at some point it becomes irreversible. Exclusive production of energy from hydrogen will cause excessive multiplication of immature cells, without functional differentiation. Exclusive production of energy from carbon would lead to hyperdifferentiation and hyperfunction, multiplication being impossible. The normal cell sits between these two extremes, within limits set by the regulating factors of homeostasis. The energy from energy metabolism is vital for the cell (and the body). If the energy comes predominantly (or exclusively) from oxidation of the hydrogen atom, green energy, there occurs at the structural (biochemical) level an acidification of the cellular structures, which turn red; so WE HAVE “RED” MORPHOLOGICAL AND CHEMICAL STRUCTURES WITH “GREEN” ENERGY. This background predisposes to accelerated growth without differentiation, which can become uncontrolled, anarchic. THE ENERGY STRUCTURE OF THE CELL (BODY) WOULD BE YIN.
If the energy the cell needs derives mainly from oxidation of the carbon atom, red energy, the cellular structures will be coloured green and will be alkaline (basic); so WE HAVE “GREEN” MORPHOLOGICAL AND CHEMICAL STRUCTURES WITH “RED” ENERGY, on the same principle of complementarity. This context leads to hyperdifferentiation, hyperfunction and maturation, and growth stops. THE ENERGY STRUCTURE OF THE CELL (BODY) WOULD BE YANG. In photosynthesis, the porphyrins, a chemical group whose first feature is photosensitivity, show a great affinity for metals, forming chelates and becoming coloured (the pigments of life), able to absorb complementary monochromatic light. These pigments, which constitute the group of the chromoproteins, achieve in photosynthesis the reduction of CO2 and H2O, the recovery of C and H respectively, and the release of O; the reduced, energy-laden C and H atoms, in the form of carbohydrates, are the storage form of solar energy. In cellular energy metabolism, the energy needed for the processes of life comes from the degradation of the substances produced in photosynthesis, the carbohydrates, by oxidative dehydrogenation and oxidative decarboxylation, through similar substances which form chelates with metals and are coloured, the metals being contained in the form of oxides of various colours (Mg green, Fe red, Cu blue, etc.). These undergo, by complementary-colour absorption, a process of reduction: with H, in the case of oxidative dehydrogenation, when the chelated metal pigment is red and becomes a leuco-derivative (colourless) by absorbing the complementary colour (green) of hydrogen, with formation of H2O; or with C, in the case of oxidative decarboxylation, when the chelated metal pigment is green and absorbs the complementary red energy of the C atom, with production of CO2; the process is identical.
The process that lies at the base of cellular energy metabolism takes place in the final biological oxidation: the O atom, in the form of a metal oxide in combination with a photosensitive, coloured porphyrin absorbing the complementary colour, is reduced by H and C, with production of H2O and CO2. The release of the green energy of the H atom in the oxidative dehydrogenation process is a process of ENDOGENOUS MONOCHROMATIC IRRADIATION WITH GREEN, and the release of the red energy of the C atom in the oxidative decarboxylation process constitutes ENDOGENOUS MONOCHROMATIC IRRADIATION WITH RED. In photosynthesis the porphyrin-metal combination, in chelated form, by absorbing light in the visible spectrum, is able to reduce C and H in turn from the oxide state (CO2 and H2O), with release of O. In the final biological oxidation, the metal-porphyrin combination, under aerobic conditions and in the absence of light, is found in the oxidized state, that is, as porphyrin and metal oxide; it will oxidize the C and H atoms of the carbohydrates, with formation of CO2 and H2O, or rather it will be reduced by the C and H atoms of the carbohydrates, with formation of CO2 and H2O, absorbing the energy produced by photosynthesis. If we can control the final biological oxidation, we can control cellular growth, hence multiplication, and on the other hand maturation, hence differentiation. Green energy will prevail in a cell (body) that multiplies (during growth); in the adult (functional) cell red energy will prevail. There are two types of energy: that obtained by oxidative dehydrogenation, which causes cell multiplication without differentiation, and that obtained by oxidative decarboxylation, which stops proliferation and determines differentiation (maturity, functionality). This process is carried out on the basis of complementary colours, since the coenzymes of oxidative dehydrogenation and oxidative decarboxylation are coloured.
This reveals the importance of the acid-base balance, of the predominance of the acidic or the basic, since an acid (red) structure not only cannot gain energy from the red carbon atom (by the principle of complementarity) but cannot assimilate it either (under the same principle). The acid-base balance of the internal environment must therefore be restored, and the body alkalinized by an intake of electron-donating organic substances. Alkalinization (addition of electrons) will neutralize the acid, red structures, which become leuco-derivatives, colourless and inactive, while the basic structures, which because of acidosis had become neutral, colourless and inactive, will become alkaline again with the contribution of electrons, will turn green, and will absorb the red energy of the carbon atom. So, with two kinds of vital energy, there is a clear correlation between the chemical structure of the cell (body) and the type of energy it can produce and use. Thus a cell with an acidic chemical structure can produce energy only by oxidative dehydrogenation (green energy), because in an acid environment only coenzymes with an acid chemical structure, red, can be active, and by complementarity they absorb only the green energy of hydrogen. The basic structures, which should absorb the red energy of carbon, are inactive because of the acid environment, which turns them chemically into leuco-derivatives, colourless, inactive structures. Returning these structures to normal operation by alkalinization could be a long process; therefore we would use chromotherapy in parallel, based on the fact that THE COENZYMES INVOLVED IN THE FINAL BIOLOGICAL OXIDATION ARE COLOURED AND PHOTOSENSITIVE. Thus, exogenous irradiation with monochromatic green will neutralize, by complementarity, the red, acidic coenzymes, and will reactivate the alkaline coenzymes, which because of acidosis had become leuco-derivatives, colourless and inactive. Without production of CO2, carbonic anhydrase cannot form H2CO3, dissociable and thus transferable through the mitochondrial membrane.
OH groups will accumulate in the respiratory flavins, leading to excessive hydroxylation, followed by consecutive inclusion of amino groups (NH2). There thus arises an imbalance between hydrogenation-carboxylation and hydroxylation-amination, in favour of the latter: AMINATION and HYDROXYLATION will predominate at the expense of CARBOXYLATION and HYDROGENATION, leading to THE CONVERSION OF STRUCTURAL PROTEINS INTO NUCLEIC ACIDS. Meanwhile, following chemical rather than genetic criteria, the remaining unoxidized carbon atoms are used to synthesize nucleic bases “de novo” by the same hydroxylation-amination process, leading to THE SYNTHESIS OF NUCLEIC ACIDS “DE NOVO”. Sincerely yours, Dr. Viorel Bungau viorelbungau20@yahoo.com

    • Dr. Viorel Bungau,

      Your comment is beautiful, colorful, insightful, majestic.

      This article has drawn 3007 views

      Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Total
      2012 242 362 247 851
      2013 283 330 465 390 288 208 187 164 255 274 163 3,007

  19. Dear Professor, please join me in this research proposal, as its leader, because I cannot go forward alone.
    The basic idea of this theory is that the oxidation of hydrogen and carbon atoms, arising from the degradation of carbohydrates, is by two distinct processes based on oxidation-reduction electron transfer and photochemical process of energy release on the basis of color complementary, predominance of one or another depending on intracellular acid-base balance. I can not understand why nobody wants to do this experiment. I’m sure this assumption hides a truth. Before considering it a fiction to be checked experimentally. I would like to present a research project that concerns me for a long time that I can not experience myself.
    Involuntarily, after many years of searching, I have concluded that in the final biological oxidation, in addition to the oxidation-reduction electron transfer occurs photo-chemical process, accordance to the principle of color complementary energy transfer. I imagine an experiment that might be relevant (sure it can be improved). In my opinion, if this hypothesis proves true, one can control the energy metabolism of the cell by chromotherapy, as the structures involved are photosensitive and colorful. I would be very happy if this experiment were done under your leadership. Sincerely yours, Dr. Viorel Bungau

    INNER LIGHT – LIGHT OF LIFE.
    CHROMOTHERAPY AND THE IMPLICATIONS IN THE METABOLISM OF THE NORMAL AND NEOPLASTIC CELL. “Chlorophyll and hemoglobin pigments of life porphyrin structure differs only in that chlorophyll is green because of magnesium atoms in the structure, and hemoglobin in red because of iron atoms in the structure. This is evidence of the common origin of life.” (Heilmeyer) We propose an experiment to prove that the final biological oxidation, in addition to its oxidation-reduction, with formation of H2O and CO2, there is a photochemical effect, by which energy is transferred from the H atom, or C, process is done selct, the colors, complementary colors on the basis of the structures involved are colored (red hemoglobin Fe, Mg chlorophyll green, blue ceruloplasmin Cu, Fe cytochrome oxidase red, green cytochrome oxidase with Cu etc.). The basic idea is that if life pigments (chlorophyll, hemoglobin, cytochromes), which provides energy metabolism of the cell, are colored, we can control their activities through chromotherapy, on the basis of complementary color and energy rebalance the body, with a figured X- body-colored-ray.
    In my opinion, at the basis of malign transformation is a disturbance of energetical metabolism, which reached a level that cell can not correct (after having succeeded before, many times), disturbance that affects the whole body in different degrees and requires corection from outside starting from the ideea that the final biological oxidizing takes place through photochemical process with releasing and receieving energy. “Duality of cytochrome oxidase. Proliferation (growth) and Differentiation (maturation) cell.” Cytochrome oxidase is present in two forms, depending on the context of acid-base internal environment : 1.- Form acidic (acidosis), which contains two Iron atoms, will be red, will absorb the additional green energy of the hydrogen atom, derived from carbohydrates, with formation of H2O, metabolic context that will promote cell proliferation. 2.-Form alkaline (alkalosis), containing two copper atoms, will be green, will absorb the additional red energy of the carbon atom, derived from carbohydrates, with formation of CO2, metabolic context that will promote cell differentiation. Cytochrome oxidase structure has two atoms of copper. It is known that in conditions of acidosis (oxidative potential), the principle electronegativity metals, copper is removed from combinations of the Iron. So cytochrome oxidase will contain two atoms of iron instead of copper atoms, which changes its oxidation-reduction potential, but (most important), and color. If the copper was green, the iron is red, which radically change its absorption spectrum, based on the principle of complementary colors.
    “Inner Light- Light of Life. Endogenous monochromatic irradiation. Red ferment of Warburg – Green ferment of Warburg.”
    In my opinion, at the basis of malign transformation is a disturbance of energetical metabolism, which reached a level that cell can not correct (after having succeeded before, many times), disturbance that affects the whole body in different degrees and requires corection from outside starting from the ideea that the final biological oxidizing takes place through photochemical process with releasing and receieving energy. If the structures involved in biological oxidation finals are colored, then their energy absorption is made based on the principle of complementary colors. If we can determine the absorption spectrum at different levels, we can control energy metabolism by chromotherapy – EXOGENOUS MONOCHROMATIC IRRADIATION . Energy absorption in biological oxidation process itself, based on complementary colors, the structures involved (cytochromes), is the nature of porphyrins, in combination with a metal becomes colored, will absorb the complementary color, corresponding to a specific absorption spectrum, it will be in – ENDOGENOUS MONOCHROMATIC IRRADIATION.
    This entitles us to believe that: In photosynthesis, light absorption and its storage form of carbohydrates, are selected, the colors, as in cellular energy metabolism, absorption of energy by the degradation of carbohydrates, is also done selectively, based on complementary colors. In the final biological oxidation, in addition to an oxidation-reduction process takes place and a photo-chemical process,based on complementary colors, the first in the electron transfer, the second in the energy transfer. So, in the mitochondria is a process of oxidation of atoms C and H, derived from carbohydrates, with energy release and absorption of its selection (the color), by the structures involved, which is the nature of porphyrins, are photosensitive and colorful, if we accept as coenzymes involved, containing a metal atom gives them a certain color, depending on the state of oxidation or reduction (red ferment of Warburg with iron, all copper cerloplasmin blue, green chlorophyll magnesium, red iron hemoglobin, green cytochrome oxidase with copper, etc.)
    According to the principle electronegativity metals, under certain conditions the acid-base imbalance (acidosis), iron will replace copper in combination , cytocromoxidase became inactive, leading to changing oxidation-reduction potential, BUT THE COLOR FROM GREEN, TO REED, to block the final biological oxidation and the appearance of aerobic glycolysis. In connection with my research proposal, to prove that the final biological oxidation, in addition to an oxidation-reduction process takes place and a photo-chemical process, the first in the electron transfer, the second in the energy transfer.
    I SUGGEST TO YOU AN EXPERIMENT:

    TWO PLANTS, A RED (CORAILLE) LIGHT ONLY, IN BASIC MEDIUM, WITH ADDED COPPER, WILL GROW, FLOWER AND FRUIT WILL SHORT TIME, AND THE OTHER ONLY GREEN LIGHT (TOURQUOISE), IN AN ACID MEDIUM, WITH ADDED COPPER CHELATOR , WHICH GROWS THROUGHOUT WILL NOT GROW FLOWERS AND FRUIT WILL DO.

    CULTURE OF NEOPLASTIC TISSUE, IRRADIATED WITH MONOCHROMATIC GREEN ( TOURQUOISE) LIGHT, IN AN ALKALINE MEDIUM, WITH ADDED COPPER, WILL IN REGRESSION OF THE TISSUE CULTURE.

    CULTURE OF NEOPLASTIC TISSUE, IRRADIATED WITH RED ( CORAILLE) LIGHT, IN AN ACID MEDIUM, WITH ADDED COPPER CHELATOR, WILL LEAD TO EXAGERATED AND ANARCHICAL MULTIPLICATION.
    If in photosynthesis is the direct effect of monochromatic irradiation, in the final biological oxidation effect is reversed. Exogenous irradiation with green, induces endogenous irradiation with red, and vice versa. A body with cancer disease will become chemically color “red”- Acid -(pH, Rh, pCO2, alkaline reserve), and in terms of energy, green (X-body-colored-ray). A healthy body will become chemically color “green”-Alkaline – (as evidenced by laboratory), and in terms of energy, red (visible by X-body-colored-ray). Sincerely yours, Dr. Viorel Bungau

    -In addition-
    Life balance: Darkness and Light – Water and Fire – Inn and Yang.

    Cytochrome oxidase structure has two atoms of copper. It is known that in conditions of acidosis (oxidative potential), the principle electronegativity metals, copper is removed from combinations of the Iron. So cytochrome oxidase will contain two atoms of iron instead of copper atoms, which changes its oxidation-reduction potential, but (most important), and color. If the copper was green, the iron is red, which radically change its absorption spectrum, based on the principle of complementary colors. If neoplastic cells, because acidosis is overactive acid form of cytochrome oxidase (red with iron atoms), which will absorb the additional green energy hydrogen atom (exclusively), the production of H20 , so water will prevail, in Schizophrenia , neuronal intracellular alkaline environment, will promote the basic form of cytochrome oxidase (green with copper atoms), which will oxidize only carbon atoms, the energy absorption of red (complementary) and production of CO2, so the fire will prevail. Drawn from this theory interdependent relationship between water and fire, of hydrogen(H2O) and carbon(CO2) ,in a controlled relationship with oxygen (O2). If photosynthesis is a process of reducing carbon oxide(CO2) and hydrogen oxide(H2O), by increasing electronegativity of C and H atoms, with the electrons back to oxygen, which will be released in the mitochondria is a process of oxidation of atoms C and H, derived from carbohydrates, with energy release and absorption of its selection (the color), by the structures involved, which is the nature of porphyrins, are photosensitive and colorful. It means that matter and energy in the universe are found in a relationship based on complementary colors, each color of energy, corresponding with a certain chemical structure. 
In my opinion, at the basis of malign transformation is a disturbance of energetical metabolism, which reached a level that cell can not correct (after having succeeded before, many times), disturbance that affects the whole body in different degrees and requires corection from outside starting from the ideea that the final biological oxidizing takes place through photochemical process with releasing and receieving energy. The final biological oxidation is achieved through a process of oxidation-reduction, while a photochemical process, based on the principle of complementary colors, if we accept as coenzymes involved, containing a metal atom gives them a certain color, depending on the state of oxidation or reduction (red ferment of Warburg with copper, all copper cerloplasmin blue, green chlorophyll magnesium, red iron hemoglobin,etc. If satisfied, the final biological oxidation is achieved by a photochemical mechanism (besides the oxidation-reduction), that energy is released based on complementary colors, means that we can control the final biological oxidation mechanism, irreversibly disrupted in cancer, by chromotherapy and correction of acid-base imbalance that underlies this disorder.We reached this conclusions studying the final biological oxidation, for understanding the biochemical mechanism of aerobic glycolysis in cancer. We found that cancer cell, energy metabolism is almost exclusively on hydrogen by oxidative dehydrogenation, due to excessive acidosis , coenzymes which makes carbon oxidation, as dormant (these coenzymes have become inactive). If we accept the nature of these coenzymes chloride (see Warburg ferment red), could be rectivate, by correcting acidosis (because that became leucoderivat), and by chromoterapie, on the basis of complementary colors. 
According to the principle electronegativity metals, under certain conditions the acid-base imbalance (acidosis), iron will replace copper in combination , cytocromoxidase became inactive (it contains two copper atoms) leading to changing oxidation-reduction potential, BUT THE COLOR FROM GREEN, TO REED, to block the final biological oxidation and the appearance of aerobic glycolysis.

    Malignant transformation occurs by energy metabolism imbalance in power generation purposes in the predominantly (exclusively) of the hydrogen atom of carbon oxidation is impossible. Thus at the cellular level will produce a multiplication (growth) exaggerated (exclusive), energy from hydrogen favoring growth, multiplication, at the expense of differentiation (maturation). Differentiation is achieved by energy obtained by oxidation of the carbon atom can not take, leading to carcinogenesis . The energy metabolism of the cell, an energy source is carbohydrate degradation, which is done by OXIDATIVE DEHYDROGENATION AND OXIDATIVE DECARBOXYLATION , to obtain energy and CO2 and H2O. In normal cells there is a balance between the two energy sources. If cancer cells, oxidation of the carbon atom is not possible, the cell being forced to summarize the only energy source available, of hydrogen. This disorder underlying malignant transformation of cells and affect the whole body, in various degrees, often managing to rebalance process, until at some point it becomes irreversible. The exclusive production of hydrogen energy will cause excessive multiplication, of immature cells, without functional differentiation. Exclusive carbon energy production will lead to hyperdifferentiation, hyperfunctional, multiplication is impossible. Normal cell is between two extremes, between some limits depending on the adjustment factors of homeostasis. Energy from energy metabolism is vital for cell (body). If the energy comes predominantly (or exclusively) by oxidation of the hydrogen atom, green energy, will occur at the structural level (biochemical), acidification of the cellular structures that will turn red, so WE HAVE MORPHOLOGICAL AND CHEMICAL STRUCTURES “RED”, WITH “GREEN” ENERGY. This background predisposes to accelerated growth, without differentiation, reaching up uncontrolled, anarchical. ENERGY STRUCTURE OF THE CELL BODY WOULD BE INN. 
If the cell derives its energy mainly from oxidation of the carbon atom, red energy, the cellular structures will be colored green and will be alkaline (basic), so WE HAVE MORPHOLOGICAL AND CHEMICAL STRUCTURES “GREEN”, WITH “RED” ENERGY, on the same principle of complementarity. This context leads to hyperdifferentiation, hyperfunction and maturation, and growth stops. THE ENERGY STRUCTURE OF THE CELL (BODY) WOULD BE YANG. In photosynthesis, the porphyrins, a chemical group whose first feature is photosensitivity, also show a great affinity for metals, forming chelates and becoming colored (the “pigments of life”); they can absorb complementary monochromatic light. These pigments, which constitute the chromoprotein group, achieve in photosynthesis the reduction of CO2 and H2O, recovering C and H respectively and releasing O. The reduced C and H atoms, carrying the energy load, represent carbohydrates, a form of stored solar energy. In cellular energy metabolism, the processes necessary for life draw their energy from the degradation of the substances produced in photosynthesis, the carbohydrates, by oxidative dehydrogenation and oxidative decarboxylation. The substances involved form chelates with metals and are colored; the metals are contained in the form of oxides of various colors (green for Mg, red for Fe, blue for Cu, etc.) and absorb the complementary color during reduction. With H, in oxidative dehydrogenation, the chelated metal pigment is red and becomes a leucoderivative (colorless) by absorbing the complementary color (green) of hydrogen, with formation of H2O; with C, in oxidative decarboxylation, the chelated metal pigment is green and absorbs the complementary, red, energy of the carbon atom, with production of CO2; the process is identical.
The process at the base of cellular energy metabolism takes place in the final biological oxidation: the O atom, in the form of a metal oxide in combination with the photosensitive, colored substance porphyrin, absorbs the complementary color and is reduced by H and C, with production of H2O and CO2. The release of the green energy of the H atom in oxidative dehydrogenation is a process of “ENDOGENOUS MONOCHROMATIC IRRADIATION WITH GREEN”, and the release of the red energy of the C atom in oxidative decarboxylation consists of “ENDOGENOUS MONOCHROMATIC IRRADIATION WITH RED”. In photosynthesis the porphyrin-metal combination, in chelated form, absorbs light in the visible spectrum and is thereby able to reduce C and H from their oxide states (CO2 and H2O), releasing O. In the final biological oxidation, the metal-porphyrin combination, under aerobic conditions and in the absence of light, is found in the oxidized state, as metal oxide and porphyrin; it will oxidize the C and H atoms of carbohydrates, with formation of CO2 and H2O, or rather it will be reduced by them, absorbing the energy produced in photosynthesis. If we can control the final biological oxidation, we can control cellular growth, and thus multiplication, and on the other hand maturation, and thus differentiation. Green energy prevails in a cell (body) that is multiplying (during growth); in the adult (functional) cell, red energy prevails. Of the two types of energy, that obtained by oxidative dehydrogenation causes cell multiplication without differentiation, while that obtained by oxidative decarboxylation stops proliferation and determines differentiation (maturity, functionality). This process is carried out on the basis of complementary colors, the coenzymes of oxidative dehydrogenation and oxidative decarboxylation being colored.
This reveals the importance of acid-base balance and of the predominance of the acidic or the basic: an acid structure (red) not only cannot gain energy from the red carbon atom (by the principle of complementarity), it cannot assimilate it either (by the same principle). The acid-base balance of the internal environment must therefore be restored, with alkalinization through the intake of electron-donating organic substances. By alkalinization (addition of electrons) the acid, red structures are neutralized; they become leucoderivatives, colorless and inactive. The basic structures, which because of acidosis had become neutral, colorless and inactive, become alkaline again with the contribution of electrons, turn green, and absorb the red energy of the carbon atom. So, with two kinds of vital energy, there is a clear correlation between the chemical structure of the cell (body) and the type of energy it can produce and use. A cell with an acidic chemical structure can produce energy only by oxidative dehydrogenation (green energy), because in an acid environment only coenzymes with an acidic chemical structure, red, can be active, and by complementarity these absorb only the green energy of hydrogen. The basic structures, which should absorb the red energy of carbon, are inactive in the acid environment, which turns them chemically into leucoderivatives, colorless and inactive. Restoring these structures to normal operation by alkalinization could be a long-lasting process; we therefore use chromotherapy in parallel, based on the fact that these COENZYMES INVOLVED IN THE FINAL BIOLOGICAL OXIDATION ARE COLORED AND PHOTOSENSITIVE. Thus, exogenous irradiation with monochromatic green will neutralize, by complementarity, the red, acidic coenzymes, and will reactivate the alkaline coenzymes that had become leucoderivatives, colorless and inactive, because of acidosis. Without production of CO2, carbonic anhydrase cannot form H2CO3, which is dissociable and thus transferable through the mitochondrial membrane.
OH groups will accumulate in the respiratory flavins, leading to excessive hydroxylation, followed by consecutive inclusion of amino groups (NH2). An imbalance thus arises between hydrogenation-carboxylation and hydroxylation-amination, in favor of the latter. AMINATION and HYDROXYLATION will predominate at the expense of CARBOXYLATION and HYDROGENATION, leading to the CONVERSION OF STRUCTURAL PROTEINS INTO NUCLEIC ACIDS. Meanwhile, following chemical rather than genetic criteria, the remaining unoxidized carbon atoms are used to synthesize nucleic bases “de novo” by the same hydroxylation-amination process, leading to THE SYNTHESIS OF NUCLEIC ACIDS “DE NOVO”. Sincerely yours, Dr. Viorel Bungau viorelbungau20@yahoo.com

  20. […] Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View? Author: Larry H. Bernstein, MD, FCAP https://pharmaceuticalintelligence.com/2012/10/17/is-the-warburg-effect-the-cause-or-the-effect-of-ca… […]

Case Study #2:

·      Knowing the tumor’s size and location, could we target treatment to THE ROI by applying…..

Author: Dror Nir, PhD

https://pharmaceuticalintelligence.com/2012/10/16/knowing-the-tumors-size-and-location-could-we-target-treatment-to-the-roi-by-applying-imaging-guided-intervention/

26 Responses

  1. GREAT work.

    I’ll read and comment later on

  2. Highlights of The 2012 Johns Hopkins Prostate Disorders White Paper include:

    A promising new treatment for men with frequent nighttime urination.
    Answers to 8 common questions about sacral nerve stimulation for lower urinary tract symptoms.
    Surprising research on the link between smoking and prostate cancer recurrence.
    How men who drink 6 cups of coffee a day or more may reduce their risk of aggressive prostate cancer.
    Should you have a PSA screening test? Answers to important questions on the controversial USPSTF recommendation.
    Watchful waiting or radical prostatectomy for men with early-stage prostate cancer? What the research suggests.
    A look at state-of-the-art surveillance strategies for men on active surveillance for prostate cancer.
    Locally advanced prostate cancer: Will you benefit from radiation and hormones?
    New drug offers hope for men with metastatic castrate-resistant prostate cancer.
    Behavioral therapy for incontinence: Why it might be worth a try.

    You’ll also get the latest news on benign prostatic enlargement (BPE), also known as benign prostatic hyperplasia (BPH) and prostatitis:
    What’s your Prostate Symptom Score? Here’s a quick quiz you can take right now to determine if you should seek treatment for your enlarged prostate.
    Your surgical choices: a close look at simple prostatectomy, transurethral prostatectomy and open prostatectomy.
    New warnings about 5-alpha-reductase inhibitors and aggressive prostate cancer.

  3. Promising technique.

    INCORE pointed out in detail the general problem of judging response and the still-missing standardization of quality:

    http://www.futuremedicine.com/doi/abs/10.2217/fon.12.78?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub%3dwww.ncbi.nlm.nih.gov

    I have done research in response evaluation and prediction for about 15 years now and, being honest, neither the clinical nor the molecular biological data proved significant benefit in changing a strategy in patient diagnosis and/or treatment. I would state: this brings us back onto the ground, not up into the sky. Additionally it means we have to work harder on this, and the WHO has to take responsibility: clinicians use a response classification without knowing that it is related to just ONE experiment from the 70s, and that this experiment has never been re-scrutinized (please read the Editorial I provided). We have used one clinical response classification worldwide for more than 30 years (Miller et al., Cancer 1981), but it is useless!

  4. Dr. BB

    Thank you for your comment.
    Dr. Nir will reply to your comment.
    Regarding the Response Classification in use, it seems that the College of Oncology should champion a task force to revisit the Best Practice in this domain and issue either a revised version or a new classification system for clinical response to treatment in cancer.

  5. I’m sorry that I was looking for this paper again earlier and didn’t find it. I gave my view on your article earlier.

    This is a method demonstration, but not a proof of concept by any means. It adds to the cacophony of approaches and, in a much larger study, might prove beneficial in treatment, but not a cure for serious prostate cancer, because it is unlikely that it can get beyond the margin, and also because there is overtreatment at the PSA cutoff of 4.0. A proven prediction model went to press some 4 months ago. I think that the pathologist has to see the tissue, and the standard in pathology now is that for any result that is cancer, two pathologists, or a group sitting together, should see it. It’s not an easy diagnosis.

    Björn LDM Brücher, Anton Bilchik, Aviram Nissan, Itzhak Avital, & Alexander Stojadinovic. Tumor response criteria: are they appropriate? Future Oncol. (2012) 8(8), 903–906. 10.2217/FON.12.78. ISSN 1479-6694.

    ..Tumor heterogeneity is a ubiquitous phenomenon. In particular, there are important differences among the various types of gastrointestinal (GI) cancers in terms of tumor biology, treatment response and prognosis.

    ..This forms the principal basis for targeted therapy directed by tumor-specific testing at either the gene or protein level. Despite rapid advances in our understanding of targeted therapy for GI cancers, the impact on cancer survival has been marginal.

    ..Can tumor response to therapy be predicted, thereby improving the selection of patients for cancer treatment?

    ..In 2000 the NCI, with the European Organization for Research and Treatment of Cancer, proposed replacing 2D measurement with a 30% decrease in the largest tumor diameter in one dimension. Tumor response so defined translates into roughly a 50% decrease in the bidimensional product for a spherical lesion.

    ..We must rethink how we may better determine treatment response in a reliable, reproducible way that is aimed at individualizing the therapy of cancer patients.

    ..we must change the tools we use to assess tumor response. The new modality should be based on empirical evidence that translates into relevant and meaningful clinical outcome data.

    ..This becomes a conundrum of sorts in an era of ‘minimally invasive treatment’.

    ..integrated multidisciplinary panel of international experts – not sure that that will do it
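
The 30%-in-one-dimension criterion quoted above has a simple geometric reading: for a spherical lesion, a 30% diameter decrease corresponds to roughly a 50% decrease in the older bidimensional (product-of-diameters) measure. A quick sketch of the arithmetic (generic thresholds only, not values from any particular study):

```python
# A 30% decrease in the largest diameter of a spherical lesion.
shrink = 0.30
d = 1.0 - shrink                 # remaining diameter fraction, 0.70
area_ratio = d ** 2              # bidimensional product (WHO-style measure)
volume_ratio = d ** 3            # true volume of the sphere

print(f"decrease in diameter product: {1 - area_ratio:.0%}")   # 51%
print(f"decrease in volume:           {1 - volume_ratio:.0%}")  # 66%
```

So the "50%" and "30%" figures describe the same shrinkage measured in different dimensions, and the volume change is larger still.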

    Several years ago I heard Stamey present the totality of his work at Stanford, with great disappointment over the hsPSA they pioneered. The outcomes were disappointing.

    I had published a review of all of our cases reviewed for 1 year with Marguerite Pinto.
    There’s a reason that the physicians line up outside of her office for her opinion.
    The review showed that a PSA over 24 ng/ml is predictive of bone metastasis. Any result over 10 was as likely to be prostatitis, BPH or cancer.

    In the next study, with Gustave Davis, I used a bivariate ordinal regression to predict lymph node metastasis from the PSA and the Gleason score. It was better than any univariate model, but there was no follow-up.
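
A minimal sketch of the idea behind that bivariate model, on entirely synthetic numbers (plain logistic regression stands in for the ordinal regression actually used; the PSA and Gleason distributions and coefficients below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Hypothetical predictor distributions.
psa = rng.lognormal(mean=2.0, sigma=0.8, size=n)        # ng/ml, skewed
gleason = rng.integers(4, 11, size=n).astype(float)     # scores 4..10
# Hypothetical outcome: lymph-node metastasis driven by both predictors.
logit = -9.0 + 0.08 * psa + 0.9 * gleason
y = rng.random(n) < 1 / (1 + np.exp(-logit))

def fit_logistic(X, y, steps=2000, lr=0.5):
    """Plain gradient-ascent logistic regression; returns linear scores."""
    X = (X - X.mean(axis=0)) / X.std(axis=0)            # standardize
    X = np.column_stack([np.ones(len(X)), X])           # intercept
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-X @ w))
        w += lr * X.T @ (y - p) / len(y)
    return X @ w

def auc(scores, y):
    """AUC via the rank-sum (Mann-Whitney) statistic."""
    ranks = np.empty(len(scores))
    ranks[scores.argsort()] = np.arange(1, len(scores) + 1)
    n_pos = y.sum()
    n_neg = len(y) - n_pos
    return (ranks[y].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

auc_uni = auc(fit_logistic(psa[:, None], y), y)
auc_bi = auc(fit_logistic(np.column_stack([psa, gleason]), y), y)
print(f"PSA alone:     AUC = {auc_uni:.3f}")
print(f"PSA + Gleason: AUC = {auc_bi:.3f}")
```

With both predictors carrying independent signal, the bivariate score ranks cases better (higher AUC) than PSA alone, which is the pattern the comment describes.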

    I reviewed a paper for Clin Biochemistry (Elsevier) on a new method for PSA, very different than what we are familiar with. It was the most elegant paper I have seen in the treatment of the data. The model could predict post procedural time to recurrence to 8 years.

    • I hope we are in agreement on the fact that imaging guided interventions are needed for better treatment outcome. The point I’m trying to make in this post is that people are investing in developing imaging guided intervention and it is making progress.

      Overdiagnosis and overtreatment are another issue altogether. I think that many of my other posts deal with that.

  6. Tumor response criteria: are they appropriate?
    Future Oncology 2012; 8(8): 903-906 , DOI 10.2217/fon.12.78 (doi:10.2217/fon.12.78)
    Björn LDM Brücher, Anton Bilchik, Aviram Nissan, Itzhak Avital & Alexander Stojadinovic
    Tumor heterogeneity is problematic because of metabolic differences among the types of gastrointestinal (GI) cancers, confounding treatment response and prognosis.
    This is in response to … a group of investigators from Sunnybrook Health Sciences Centre, University of Toronto, Ontario, Canada, who evaluated the feasibility and safety of magnetic resonance (MR) imaging–controlled transurethral ultrasound therapy for prostate cancer in humans. Their study’s objective was to prove that real-time MRI guidance of HIFU treatment is possible and that it ensures the ablated tissue corresponds to the locations planned for treatment.
    1. Esophageal and gastric neoplasms differ both biologically and in expected response, even given variability within a class. The expected time to recurrence is usually longer in the latter case, but the confounders are age at the time of discovery, biological time of detection, presence of lymph node and/or distant metastasis, and microscopic vascular invasion.
    2. There is a long latent period in abdominal cancers before discovery, unless a lesion is found incidentally in surgery for another reason.
    3. The undeniable reality is that it is not difficult to identify the main lesion, but it is difficult to identify adjacent epithelium that is at risk (transitional or pretransitional). Pathologists have a very good idea about precancerous cervical neoplasia.

    The heterogeneity rests within each tumor and between the primary and metastatic sites, which is expected to be improved by targeted therapy directed by tumor-specific testing. Despite rapid advances in our understanding of targeted therapy for GI cancers, the impact on cancer survival has been marginal.

    The heterogeneity is a problem that will take at least another decade to unravel because of the number of signaling pathways and the crosstalk that is specifically at issue.

    I must refer back to the work of Frank Dixon, Herschel Sidransky, and others, who did much to develop a concept of neoplasia occurring in several stages – minimal deviation and fast growing. These have differences in growth rates, anaplasia, and biochemical. This resembles the multiple “hit” theory that is described in “systemic inflammatory” disease leading to a final stage, as in sepsis and septic shock.
    In 1931, Otto Warburg received the Nobel Prize for his work on respiration. He postulated that cancer cells become anaerobic compared with their normal counterparts, which use aerobic respiration to meet most energy needs. He attributed this to “mitochondrial dysfunction.” In fact, we now think that in response to oxidative stress the mitochondrion relies on the Lynen cycle to make more cells, and the major source of energy becomes glycolytic, at the expense of the lean body mass (muscle), which produces gluconeogenic precursors from muscle proteolysis (cancer cachexia). There is a loss of about 26 ATP ~Ps in the transition.
    The mitochondrial gene expression system includes the mitochondrial genome, mitochondrial ribosomes, and the transcription and translation machinery needed to regulate and conduct gene expression as well as mtDNA replication and repair. Machinery involved in energetics includes the enzymes of the Kreb’s citric acid or TCA (tricarboxylic acid) cycle, some of the enzymes involved in fatty acid catabolism (β-oxidation), and the proteins needed to help regulate these systems. The inner membrane is central to mitochondrial physiology and, as such, contains multiple protein systems of interest. These include the protein complexes involved in the electron transport component of oxidative phosphorylation and proteins involved in substrate and ion transport.
    Mitochondrial roles in, and effects on, cellular homeostasis extend far beyond the production of ATP, but the transformation of energy is central to most mitochondrial functions. Reducing equivalents are also used for anabolic reactions. The energy produced by mitochondria is most commonly thought to come from the pyruvate that results from glycolysis, but it is important to keep in mind that the chemical energy contained in both fats and amino acids can also be converted into NADH and FADH2 through mitochondrial pathways. The major mechanism for harvesting energy from fats is β-oxidation; the major mechanism for harvesting energy from amino acids and pyruvate is the TCA cycle. Once the chemical energy has been transformed into NADH and FADH2 (also discovered by Warburg and the basis for a second Nobel nomination in 1934), these compounds are fed into the mitochondrial respiratory chain.
    The hydroxyl free radical is extremely reactive. It will react with most, if not all, compounds found in the living cell (including DNA, proteins, lipids and a host of small molecules). The hydroxyl free radical is so aggressive that it will react within 5 (or so) molecular diameters from its site of production. The damage caused by it, therefore, is very site specific. The reactions of the hydroxyl free radical can be classified as hydrogen abstraction, electron transfer, and addition.
    The formation of the hydroxyl free radical can be disastrous for living organisms. Unlike superoxide and hydrogen peroxide, which are mainly controlled enzymatically, the hydroxyl free radical is far too reactive to be restricted in such a way – it will even attack antioxidant enzymes. Instead, biological defenses have evolved that reduce the chance that the hydroxyl free radical will be produced and, as nothing is perfect, to repair damage.
    Currently, some endogenous markers are being proposed as useful measures of total “oxidative stress” e.g., 8-hydroxy-2’deoxyguanosine in urine. The ideal scavenger must be non-toxic, have limited or no biological activity, readily reach the site of hydroxyl free radical production (i.e., pass through barriers such as the blood-brain barrier), react rapidly with the free radical, be specific for this radical, and neither the scavenger nor its product(s) should undergo further metabolism.
    Nitric oxide has a single unpaired electron in its π*2p antibonding orbital and is therefore paramagnetic. This unpaired electron also weakens the overall bonding seen in diatomic nitrogen molecules so that the nitrogen and oxygen atoms are joined by only 2.5 bonds. The structure of nitric oxide is a resonance hybrid of two forms.
    In living organisms nitric oxide is produced enzymatically. Microbes can generate nitric oxide by the reduction of nitrite or oxidation of ammonia. In mammals nitric oxide is produced by stepwise oxidation of L-arginine catalyzed by nitric oxide synthase (NOS). Nitric oxide is formed from the guanidino nitrogen of the L-arginine in a reaction that consumes five electrons and requires flavin adenine dinucleotide (FAD), flavin mononucleotide (FMN) tetrahydrobiopterin (BH4), and iron protoporphyrin IX as cofactors. The primary product of NOS activity may be the nitroxyl anion that is then converted to nitric oxide by electron acceptors.
    The thiol-disulfide redox couple is very important to oxidative metabolism. GSH is a reducing cofactor for glutathione peroxidase, an antioxidant enzyme responsible for the destruction of hydrogen peroxide. Thiols and disulfides can readily undergo exchange reactions, forming mixed disulfides. Thiol-disulfide exchange is biologically very important. For example, GSH can react with protein cystine groups and influence the correct folding of proteins, and GSH may play a direct role in cellular signaling through thiol-disulfide exchange reactions with membrane-bound receptor proteins (e.g., the insulin receptor complex), transcription factors (e.g., nuclear factor κB), and regulatory proteins in cells. Conditions that alter the redox status of the cell can have important consequences on cellular function.
    So the complexity of life is not yet unraveled.

    Can tumor response to therapy be predicted, thereby improving the selection of patients for cancer treatment?
    The goal is not just complete response. Histopathological response is judged by post-treatment histopathological assessment, but that assessment is not free from the challenge of accurately determining treatment response, as the method cannot delineate whether or not residual cancer cells remain. Functional imaging to assess metabolic response by 18-fluorodeoxyglucose PET also has its limits, as the results are impacted significantly by several variables:

    • tumor type
    • sizing
    • doubling time
    • anaplasia?
    • extent of tumor necrosis
    • type of antitumor therapy and the time when response was determined.
    The new modality should be based on individualized histopathology as well as tumor molecular, genetic and functional characteristics, and individual patients’ characteristics, a greater challenge in an era of ‘minimally invasive treatment’.
    This listing suggests that for every cancer the preceding data have to be collected (except doubling time). If there are five variables, the classification space based on these alone would be very sizable, following Eugene Rypka’s feature extraction and classification. Looking forward, time to remission and disease-free survival are also important. Treatment for cure is not the endpoint; the best that can be done is to extend survival toward a realistic long-term goal while retaining quality of life.
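
To see how “very sizable” the classification space gets, assume (purely for illustration) that each of the five variables listed above is graded on a 4-point scale; the number of distinct feature combinations is then 4^5:

```python
# Hypothetical grading: 4 levels per variable, 5 variables.
levels_per_variable = 4   # assumed scale, purely illustrative
n_variables = 5
combinations = levels_per_variable ** n_variables
print(combinations)  # 1024
```

Even a coarse grading yields over a thousand cells, which is why classification from these features alone quickly outgrows any manageable lookup table.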

    Brücher BLDM, Piso P, Verwaal V et al. Peritoneal carcinomatosis: overview and basics. Cancer Invest. 30(3), 209–224 (2012).
    Brücher BLDM, Swisher S, Königsrainer A et al. Response to preoperative therapy in upper gastrointestinal cancers. Ann. Surg. Oncol. 16(4), 878–886 (2009).
    Miller AB, Hoogstraten B, Staquet M, Winkler A. Reporting results of cancer treatment. Cancer 47(1), 207–214 (1981).
    Therasse P, Arbuck SG, Eisenhauer EA et al. New guidelines to evaluate the response to treatment in solid tumors. European Organization for Research and Treatment of Cancer, National Cancer Institute of the United States, National Cancer Institute of Canada. J. Natl Cancer Inst. 92(3), 205–216 (2000).
    Brücher BLDM, Becker K, Lordick F et al. The clinical impact of histopathological response assessment by residual tumor cell quantification in esophageal squamous cell carcinomas. Cancer 106(10), 2119–2127 (2006).

    • Dr. Larry,

      Thank you for this comment.

      Please carry it as a stand-alone post. Dr. Ritu will refer to it and reference it in her FORTHCOMING post on Tumor Response, which will integrate multiple sources.

      Please execute my instruction

      Thank you

    • Thank you Larry for this educating comment. It explains very well why the Canadian investigators did not try to measure therapy response!

      What they have demonstrated is the technological feasibility of coupling a treatment device to an imaging device and use that in order to guide the treatment to the right place.

      the issue of “choice of treatment” to which you are referring is not in the scope of this publication.
      The point is: if one treatment modality can be guided, others can as well! This should encourage others to try and develop imaging-based treatment-guidance systems.

  7. The crux of the matter in terms of capability is that the cancer tissue, adjacent tissue, and the fibrous matrix are all in transition to the cancerous state. It is taught to resect leaving “free margin”, which is better aesthetically, and has had success in breast surgery. The dilemma is that the patient may return, but how soon?

    • Correct. The philosophy behind lumpectomy is preserving quality of life. It was Prof. Veronesi (IEO) who introduced this method 30 years ago, noticing that in the majority of cases the patient will die from something else before presenting recurrence of breast cancer.

      It is well established that when the resection margins are declared by a pathologist (as good as he/she could be) as “free of cancer”, the probability of recurrence is much lower than otherwise.

  8. Dr. Larry,

    To assist Dr. Ritu, PLEASE carry ALL your comments above into a stand alone post and ADD to it your comment on my post on MIS

    Thank you

  9. Great post! Dr. Nir, can the ultrasound be used in conjunction with PET scanning as well, to determine a spatial and functional map of the tumor? With a disease like serous ovarian cancer we typically see intraperitoneal carcinomatosis, and it appears that clinicians want to use fluorogenic probes and fiber optics to visualize the numerous nodules located within the cavity. Also, is the technique being used mainly for surgery or image-guided radiotherapy, or can you use it for detecting response to various chemotherapeutics, including immunotherapy?

    • Ultrasound can be, and actually is, used in conjunction with PET scanning in many cases. The choice of using ultrasound is always left to the practitioner! Being a non-invasive, low-cost procedure makes the use of ultrasound a non-issue. The downside is that because it is so easy to access and operate, nobody bothers to develop rigorous guidelines about using it, and the benefits remain the property of individuals.

      In regards to the possibility of screening for ovarian cancer and characterising pelvic masses using ultrasound I can refer you to scientific work in which I was involved:

      1. VAES (E.), MANCHANDA (R), AUTIER, NIR (R), NIR (D.), BLEIBERG (H.), ROBERT (A.), MENON (U.). Differential diagnosis of adnexal masses: Sequential use of the Risk of Malignancy Index and a novel computer aided diagnostic tool. Published in Ultrasound in Obstetrics & Gynecology. Issue 1 (January). Vol. 39. Page(s): 91-98.

      2. VAES (E.), MANCHANDA (R), NIR (R), NIR (D.), BLEIBERG (H.), AUTIER (P.), MENON (U.), ROBERT (A.). Mathematical models to discriminate between benign and malignant adnexal masses: potential diagnostic improvement using Ovarian HistoScanning. Published in International Journal of Gynecologic Cancer (IJGC). Issue 1. Vol. 21. Page(s): 35-43.

      3. LUCIDARME (O.), AKAKPO (J.-P.), GRANBERG (S.), SIDERI (M.), LEVAVI (H.), SCHNEIDER (A.), AUTIER (P.), NIR (D.), BLEIBERG (H.). A new computer aided diagnostic tool for non-invasive characterisation of malignant ovarian masses: Results of a multicentre validation study. Published in European Radiology. Issue 8. Vol. 20. Page(s): 1822-1830.

      Dror Nir, PhD
      Managing partner

      BE: +32 (0) 473 981896
      UK: +44 (0) 2032392424

      web: http://www.radbee.com/
      blogs: http://radbee.wordpress.com/ ; http://www.MedDevOnIce.com

       

  10. Totally true, and I am very thankful for these brilliant comments.

    Remember: 10 years ago every cancer researcher stated: “look at the tumor cells only – forget the stroma”. The era of laser-captured tumor-cell dissection started. Now everyone knows it is a system we are looking at, and viewing and analyzing tumor cells alone is really not enough.

    So if we were honest, we would have to declare that all the data produced 13 to 8 years ago with laser-capture microdissection need re-scrutinization, because the influence of the stroma was “forgotten”. I’d better not think about the wasted millions of dollars.

    If we keep on being honest: the surgeon looks at the “free margin” in a kind of reductionist model; the pathologist is more the control instance. I personally see the pathologist as “the control instance” of surgical quality. Therefore it is not the wish of the surgeon that is important, but the objective way of looking into problems or challenges. Can a pathologist always state whether an R0 resection has been performed?

    On the use of the resectability classification:
    There have been many, many surrogate-marker analyses – nothing new. But a really substantial, well-thought-through, structured analysis has never been done: mm by mm by mm, afterwards analyzed by a ROC analysis. But against which gold standard? If you perform a ROC analysis statistically, you need a gold standard to compare to. So what is the real R0 resection? It has not been proven. It has simply been stated, in this or that tumor entity, that this or that margin, with this or that mm of margin-free distance, is enough, and it has been declared “the real R0 classification”. In some organs it is very, very difficult, and we all (surgeons, pathologists, clinicians) reach the limit when we try to interpret the R classification in the third dimension. Often it is just declared and stated.
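
The point about needing a gold standard can be made concrete: a ROC analysis of margin distance is only defined once each case carries a reference label (here, recurrence) to compare against. A hedged sketch on synthetic data (the distributions, the 3 mm inflection and the recurrence model are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data: resection-margin distance in mm, plus a gold
# standard per case (here: recurrence within follow-up, True = recurred).
margin_mm = rng.gamma(shape=2.0, scale=3.0, size=500)
p_recur = 1 / (1 + np.exp(0.8 * (margin_mm - 3.0)))  # closer margin -> more recurrence
recurred = rng.random(500) < p_recur

# ROC over candidate margin cut-offs: "test positive" = margin below cut-off.
cutoffs = np.linspace(0.1, 15.0, 150)
tpr = np.array([((margin_mm < c) & recurred).sum() / recurred.sum() for c in cutoffs])
fpr = np.array([((margin_mm < c) & ~recurred).sum() / (~recurred).sum() for c in cutoffs])
youden = tpr - fpr
best = cutoffs[youden.argmax()]
print(f"cut-off maximising Youden's J: {best:.1f} mm (J = {youden.max():.2f})")
```

Without the `recurred` column there is nothing to rank the cut-offs against, which is exactly the comment's objection to declaring an R0 margin by fiat.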

    Otherwise: if lymph nodes are negative, it does not mean the lymph nodes are really negative, because up to 38%, for example in upper GI cancers, have histologically negative but immunohistochemically positive lymph nodes. This has also been shown by Stojadinovic et al., analyzing ultrastaging in colorectal cancer. So the 4th dimension of cancer – the lymph nodes / the lymphatic vessel invasion are much more important than just a TNM classification, which unfortunately does often not reflect real tumor biology.

    As we see, cancer has multifactorial causes, and it is necessary to take up the challenge of performing highly sophisticated research in a multifactorial and multidisciplinary manner.

    Again, my deep and heartfelt thanks for this productive and excellent discussion!

    • Dr. BB,

      Thank you for your comment.

      Multidisciplinary perspectives have illuminated the discussion on the pages of this Journal.

      Eager to review Dr. Ritu’s forthcoming paper – the topic has a life of its own and is embodied in your statement:

      “the 4th dimension of cancer – the lymph nodes / the lymphatic vessel invasion are much more important than just a TNM classification, which unfortunately does often not reflect real tumor biology.”

    • Thank you BB for your comment. You have touched the core limitation of healthcare professionals: how do we know that we know!

      Do we have a reference for each of the tests we perform?

      Do we have objective and standardised quality measures?

      Do we see what is out there, or are we imagining?

      The good news: every day we can “think” that we have learned something new. We should be happy with that, even if it means learning that yesterday’s truth is not true anymore, and even if we are likely to be wrong again… :)

      But still, in the last decades, lots of progress was made….

  11. Dr. Nir,
    I thoroughly enjoyed reading your post as well as the comments that your post has attracted. There were different points of view and each one has been supported with relevant examples in the literature. Here are my two cents on the discussion:
    The paper that you have discussed had the objective of finding out whether real-time MRI guidance of treatment was even possible and, if so, whether the treatment could be performed at the accurate location of the ROI. The data reveal that they were quite successful in accomplishing their objective, and of course that gives hope to imaging-based targeted therapies.
    Whether the ROI is defined properly, and whether it accounts for real tumor cure, is a different question. The role of pathologists and the histological analysis they bring to the table cannot be ruled out, and the absence of a defined line between the tumor and the stromal region in its vicinity is well documented. However, that cannot rule out the value and scope of imaging-based detection and targeted therapy. After all, it is seminal in guiding minimally invasive surgery. As another arm of personalized medicine-based care for cancer, molecular biologists at MD Anderson have suggested molecular and genetic profiling of the tumor to determine genetic aberrations, on the basis of which matched therapy could be recommended to patients. When a phase I trial was conducted, the results obtained were encouraging, and the survival rate was better in matched-therapy patients compared with unmatched patients. Therefore, every time there is more to consider when treating a cancer patient, and who knows – a combination of the views of oncologists, pathologists, molecular biologists, geneticists and surgeons may devise improved protocols for diagnosis and treatment. It is always going to be complicated, and generalizations will never give an answer. Smart interpretation of therapies – imaging-based or otherwise – will always be required!

    Ritu

    • Dr. Nir,
      One of your earlier comments, mentioned the non invasiveness of ultrasound, thus, it’s prevalence in use for diagnosis.

      This may be true for other or even all areas, with the exception of mammography screening. In this field, an ultrasound is performed only if a suspected area of calcification or a lump has been detected in routine mammography, or in ad hoc mammography initiated at the patient’s request secondary to a complaint of pain or a report of a suspected lump.

      Ultrasound in this field represents escalation and a review by two radiologists.

      It is in routine use for breast biopsy.

    • Thanks, Ritu, for this supportive comment. The worst enemy of finding solutions is doing nothing while using the excuse of looking for the “ultimate solution”. Personally, I believe in combining methods and improving clinical assessment based on information fusion. Being able to predict, and then timely track, the response to treatment is a major issue that affects survival and costs!

  12. […] Dror Nir authored a post on October 16th titled “Knowing the tumor’s size and location, could we target treatment to THE ROI by applying imaging-gu…” The article attracted a lot of comments from readers including researchers and oncologists and […]

  13. […] ted in this area; New clinical results supports Imaging-guidance for targeted prostate biopsy and Knowing the tumor’s size and location, could we target treatment to THE ROI by applying imagin… Today I report on recent publication presenting the advantage of using targeted trans-perineal […]

  14. […] ted in this area; New clinical results supports Imaging-guidance for targeted prostate biopsy and Knowing the tumor’s size and location, could we target treatment to THE ROI by applying imaging-gu… Today I report on recent publication presenting the advantage of using targeted trans-perineal […]

  15. […] Knowing the tumor’s size and location, could we target treatment to THE ROI by applying imaging-gu… […]

Case Study #3:

  • Personalized Medicine: Cancer Cell Biology and Minimally Invasive Surgery (MIS)

Curator: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2012/12/01/personalized-medicine-cancer-cell-biology-and-minimally-invasive-surgery-mis

 

This article generated a Scientific Exchange of 24 Comments; some scholarly comments are quite lengthy.

24 Responses

  1. GREAT work.

    I’ll read and comment later on

  2. Highlights of The 2012 Johns Hopkins Prostate Disorders White Paper include:

    A promising new treatment for men with frequent nighttime urination.
    Answers to 8 common questions about sacral nerve stimulation for lower urinary tract symptoms.
    Surprising research on the link between smoking and prostate cancer recurrence.
    How men who drink 6 cups of coffee a day or more may reduce their risk of aggressive prostate cancer.
    Should you have a PSA screening test? Answers to important questions on the controversial USPSTF recommendation.
    Watchful waiting or radical prostatectomy for men with early-stage prostate cancer? What the research suggests.
    A look at state-of-the-art surveillance strategies for men on active surveillance for prostate cancer.
    Locally advanced prostate cancer: Will you benefit from radiation and hormones?
    New drug offers hope for men with metastatic castrate-resistant prostate cancer.
    Behavioral therapy for incontinence: Why it might be worth a try.

    You’ll also get the latest news on benign prostatic enlargement (BPE), also known as benign prostatic hyperplasia (BPH), and prostatitis:
    What’s your Prostate Symptom Score? Here’s a quick quiz you can take right now to determine if you should seek treatment for your enlarged prostate.
    Your surgical choices: a close look at simple prostatectomy, transurethral prostatectomy and open prostatectomy.
    New warnings about 5-alpha-reductase inhibitors and aggressive prostate cancer.

  3. Promising technique.

    INCORE pointed out in detail the general problem of judging response and the still-missing quality in standardization:

    http://www.futuremedicine.com/doi/abs/10.2217/fon.12.78?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub%3dwww.ncbi.nlm.nih.gov

    I have done research in response evaluation and prediction for about 15 years now and, being honest, neither the clinical nor the molecular biological data proved a significant benefit in changing a strategy in patient diagnosis and/or treatment. I would state: this brings us back onto the ground and not up into the sky. Additionally, it means we have to work harder on this, and the WHO has to take responsibility: clinicians use a response classification without knowing that it is related to just ONE experiment from the 1970s and that this experiment has never been re-scrutinized (please read the editorial I provided) – we have used a clinical response classification worldwide for more than 30 years (Miller et al., Cancer 1981), but it is useless!

  4. Dr. BB

    Thank you for your comment.
    Dr. Nir will reply to your comment.
    Regarding the response classification in use, it seems that the College of Oncology should champion a task force to revisit the best practice in this domain and issue a revised version, or make a new effort toward a new classification system for clinical response to treatment in cancer.

  5. I’m sorry that I was looking for this paper again earlier and didn’t find it. I gave my view on your article earlier.

    This is a method demonstration, but not a proof of concept by any means. It adds to the cacophony of approaches and, in a much larger study, would prove to be beneficial in treatment – but not a cure for serious prostate cancer, because it is unlikely that it can get beyond the margin, and also because there is overtreatment at the PSA cutoff of 4.0. A proven prediction model went to press some 4 months ago. I think that the pathologist has to see the tissue, and the standard in pathology now is that for any result that is cancer, two pathologists, or a group sitting together, should see it. It’s not an easy diagnosis.

    Björn LDM Brücher, Anton Bilchik, Aviram Nissan, Itzhak Avital, & Alexander Stojadinovic. Tumor response criteria: are they appropriate? Future Oncol. (2012) 8(8), 903–906. 10.2217/FON.12.78. ISSN 1479-6694.

    ..Tumor heterogeneity is a ubiquitous phenomenon. In particular, there are important differences among the various types of gastrointestinal (GI) cancers in terms of tumor biology, treatment response and prognosis.

    ..This forms the principal basis for targeted therapy directed by tumor-specific testing at either the gene or protein level. Despite rapid advances in our understanding of targeted therapy for GI cancers, the impact on cancer survival has been marginal.

    ..Can tumor response to therapy be predicted, thereby improving the selection of patients for cancer treatment?

    ..In 2000, the NCI, together with the European Organization for Research and Treatment of Cancer, proposed a replacement of 2D measurement with a decrease in the largest tumor diameter by 30% in one dimension. Tumor response as so defined would translate into a 50% decrease for a spherical lesion.

    ..We must rethink how we may better determine treatment response in a reliable, reproducible way that is aimed at individualizing the therapy of cancer patients.

    ..we must change the tools we use to assess tumor response. The new modality should be based on empirical evidence that translates into relevant and meaningful clinical outcome data.

    ..This becomes a conundrum of sorts in an era of ‘minimally invasive treatment’.

    ..integrated multidisciplinary panel of international experts – not sure that that will do it
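The geometry behind the 2000 proposal quoted above can be checked directly: for an idealized spherical lesion, shrinking one diameter by 30% reduces the bidimensional (WHO-style) product by about half and the volume by about two thirds. A quick arithmetic check:

```python
# Worked check of the unidimensional response criterion for a sphere:
# a 30% decrease in diameter scales the 2-D product by 0.7**2 and the
# volume by 0.7**3.
shrink = 0.70  # fraction of diameter retained after a 30% decrease

area_decrease   = 1 - shrink ** 2   # bidimensional (WHO) product
volume_decrease = 1 - shrink ** 3   # spherical volume

print(f"2-D product decrease: {area_decrease:.0%}")    # ~51%
print(f"volume decrease:      {volume_decrease:.0%}")  # ~66%
```

So the "50% decrease" in the text corresponds to the bidimensional product; the volume change is considerably larger, which is part of why the criteria are debated.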

    Several years ago I heard Stamey present the totality of his work at Stanford, with great disappointment over the hsPSA that they had pioneered. The outcomes were disappointing.

    I had published a review of all of our cases reviewed for 1 year with Marguerite Pinto.
    There’s a reason that the physicians line up outside of her office for her opinion.
    The review showed that a PSA over 24 ng/ml is predictive of bone metastasis. Any result over 10 was as likely to be prostatitis or BPH as cancer.

    In the next study, with Gustave Davis, I used a bivariate ordinal regression to predict lymph node metastasis from the PSA and the Gleason score. It was better than any univariate model, but there was no follow-up.
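The shape of such a bivariate model can be sketched in a few lines. This is a hypothetical illustration on synthetic data – a plain logistic regression rather than the ordinal regression the comment describes, and none of the numbers come from the original analysis:

```python
import numpy as np

# Hypothetical sketch: predicting nodal metastasis from PSA and Gleason
# score together. Synthetic data only; coefficients below are invented.
rng = np.random.default_rng(0)

n = 400
psa = rng.uniform(2, 40, n)        # ng/ml (synthetic)
gleason = rng.integers(6, 11, n)   # Gleason 6..10 (synthetic)
# Synthetic ground truth: risk rises with both predictors.
true_logit = -9.0 + 0.12 * psa + 0.8 * gleason
y = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Fit by simple gradient ascent on the logistic log-likelihood.
X = np.column_stack([np.ones(n), psa, gleason])
w = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.001 * X.T @ (y - p) / n

def predict(psa_val, gleason_val):
    """Estimated probability of nodal metastasis for one patient."""
    return 1 / (1 + np.exp(-(w[0] + w[1] * psa_val + w[2] * gleason_val)))

print(predict(30, 9), predict(5, 6))  # high- vs low-risk profile
```

The point the comment makes survives even in this toy version: using both predictors jointly separates risk profiles better than either variable alone could.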

    I reviewed a paper for Clinical Biochemistry (Elsevier) on a new method for PSA, very different from what we are familiar with. It was the most elegant paper I have seen in its treatment of the data. The model could predict post-procedural time to recurrence out to 8 years.

    • I hope we are in agreement on the fact that imaging-guided interventions are needed for better treatment outcomes. The point I’m trying to make in this post is that people are investing in developing imaging-guided intervention, and it is making progress.

      Overdiagnosis and overtreatment are another issue altogether. I think that many of my other posts deal with that.

  6. Tumor response criteria: are they appropriate?
    Future Oncology 2012; 8(8): 903-906 , DOI 10.2217/fon.12.78 (doi:10.2217/fon.12.78)
    Björn LDM Brücher, Anton Bilchik, Aviram Nissan, Itzhak Avital & Alexander Stojadinovic
    Tumor heterogeneity is problematic because of the metabolic variety among types of gastrointestinal (GI) cancers, confounding treatment response and prognosis.
    This is in response to … a group of investigators from Sunnybrook Health Sciences Centre, University of Toronto, Ontario, Canada, who evaluated the feasibility and safety of magnetic resonance (MR) imaging–controlled transurethral ultrasound therapy for prostate cancer in humans. Their study’s objective was to prove that using real-time MRI guidance of HIFU treatment is possible and that it guarantees that the location of ablated tissue indeed corresponds to the locations planned for treatment.
    1. There is a difference between esophageal and gastric neoplasms, both biologically and in expected response, even given variability within a class. The expected time to recurrence is usually longer in the latter case, but the confounders are: age at time of discovery, biological time of detection, presence of lymph node and/or distant metastasis, and microscopic vascular invasion.
    2. There is a long latent period in abdominal cancers before discovery, unless a lesion is found incidentally in surgery for another reason.
    3. The undeniable reality is that it is not difficult to identify the main lesion, but it is difficult to identify adjacent epithelium that is at risk (transitional or pretransitional). Pathologists have a very good idea about precancerous cervical neoplasia.

    The heterogeneity rests within each tumor and between the primary and metastatic sites; it is expected to be addressed by targeted therapy directed by tumor-specific testing. Despite rapid advances in our understanding of targeted therapy for GI cancers, the impact on cancer survival has been marginal.

    The heterogeneity is a problem that will take at least another decade to unravel because of the number of signaling pathways and the crosstalk that is specifically at issue.

    I must refer back to the work of Frank Dixon, Herschel Sidransky, and others, who did much to develop a concept of neoplasia occurring in several stages – minimal deviation and fast growing. These stages differ in growth rate, anaplasia, and biochemistry. This resembles the multiple-“hit” theory that is described in “systemic inflammatory” disease leading to a final stage, as in sepsis and septic shock.
    In 1931, Otto Warburg received the Nobel Prize for his work on respiration. He postulated that cancer cells become anaerobic, compared with their normal counterparts, which use aerobic respiration to meet most energy needs. He attributed this to “mitochondrial dysfunction”. In fact, we now think that in response to oxidative stress the mitochondrion relies on the Lynen cycle to make more cells, and the major source of energy becomes glycolytic, at the expense of the lean body mass (muscle), which produces gluconeogenic precursors through muscle proteolysis (cancer cachexia). There is a loss of about 26 ATP ~Ps in the transition.
    The mitochondrial gene expression system includes the mitochondrial genome, mitochondrial ribosomes, and the transcription and translation machinery needed to regulate and conduct gene expression as well as mtDNA replication and repair. Machinery involved in energetics includes the enzymes of the Krebs citric acid or TCA (tricarboxylic acid) cycle, some of the enzymes involved in fatty acid catabolism (β-oxidation), and the proteins needed to help regulate these systems. The inner membrane is central to mitochondrial physiology and, as such, contains multiple protein systems of interest. These include the protein complexes involved in the electron transport component of oxidative phosphorylation and proteins involved in substrate and ion transport.
    Mitochondrial roles in, and effects on, cellular homeostasis extend far beyond the production of ATP, but the transformation of energy is central to most mitochondrial functions. Reducing equivalents are also used for anabolic reactions. The energy produced by mitochondria is most commonly thought of to come from the pyruvate that results from glycolysis, but it is important to keep in mind that the chemical energy contained in both fats and amino acids can also be converted into NADH and FADH2 through mitochondrial pathways. The major mechanism for harvesting energy from fats is β-oxidation; the major mechanism for harvesting energy from amino acids and pyruvate is the TCA cycle. Once the chemical energy has been transformed into NADH and FADH2 (also discovered by Warburg and the basis for a second Nobel nomination in 1934), these compounds are fed into the mitochondrial respiratory chain.
    The hydroxyl free radical is extremely reactive. It will react with most, if not all, compounds found in the living cell (including DNA, proteins, lipids and a host of small molecules). The hydroxyl free radical is so aggressive that it will react within 5 (or so) molecular diameters from its site of production. The damage caused by it, therefore, is very site specific. The reactions of the hydroxyl free radical can be classified as hydrogen abstraction, electron transfer, and addition.
    The formation of the hydroxyl free radical can be disastrous for living organisms. Unlike superoxide and hydrogen peroxide, which are mainly controlled enzymatically, the hydroxyl free radical is far too reactive to be restricted in such a way – it will even attack antioxidant enzymes. Instead, biological defenses have evolved that reduce the chance that the hydroxyl free radical will be produced and, as nothing is perfect, to repair damage.
    Currently, some endogenous markers are being proposed as useful measures of total “oxidative stress” e.g., 8-hydroxy-2’deoxyguanosine in urine. The ideal scavenger must be non-toxic, have limited or no biological activity, readily reach the site of hydroxyl free radical production (i.e., pass through barriers such as the blood-brain barrier), react rapidly with the free radical, be specific for this radical, and neither the scavenger nor its product(s) should undergo further metabolism.
    Nitric oxide has a single unpaired electron in its π*2p antibonding orbital and is therefore paramagnetic. This unpaired electron also weakens the overall bonding seen in diatomic nitrogen molecules so that the nitrogen and oxygen atoms are joined by only 2.5 bonds. The structure of nitric oxide is a resonance hybrid of two forms.
    In living organisms nitric oxide is produced enzymatically. Microbes can generate nitric oxide by the reduction of nitrite or oxidation of ammonia. In mammals nitric oxide is produced by stepwise oxidation of L-arginine catalyzed by nitric oxide synthase (NOS). Nitric oxide is formed from the guanidino nitrogen of the L-arginine in a reaction that consumes five electrons and requires flavin adenine dinucleotide (FAD), flavin mononucleotide (FMN), tetrahydrobiopterin (BH4), and iron protoporphyrin IX as cofactors. The primary product of NOS activity may be the nitroxyl anion that is then converted to nitric oxide by electron acceptors.
    The thiol-disulfide redox couple is very important to oxidative metabolism. GSH is a reducing cofactor for glutathione peroxidase, an antioxidant enzyme responsible for the destruction of hydrogen peroxide. Thiols and disulfides can readily undergo exchange reactions, forming mixed disulfides. Thiol-disulfide exchange is biologically very important. For example, GSH can react with protein cystine groups and influence the correct folding of proteins, and GSH may play a direct role in cellular signaling through thiol-disulfide exchange reactions with membrane-bound receptor proteins (e.g., the insulin receptor complex), transcription factors (e.g., nuclear factor κB), and regulatory proteins in cells. Conditions that alter the redox status of the cell can have important consequences on cellular function.
    So the complexity of life is not yet unraveled.

    Can tumor response to therapy be predicted, thereby improving the selection of patients for cancer treatment?
    The goal is not just complete response. Histopathological response seems related to post-treatment histopathological assessment, but it is not free from the challenge of accurately determining treatment response, as this method cannot delineate whether or not there are residual cancer cells. Functional imaging to assess metabolic response by 18-fluorodeoxyglucose PET also has its limits, as the results are impacted significantly by several variables:

    • tumor type
    • sizing
    • doubling time
    • anaplasia?
    • extent of tumor necrosis
    • type of antitumor therapy and the time when response was determined.
    The new modality should be based on individualized histopathology as well as tumor molecular, genetic and functional characteristics, and individual patients’ characteristics, a greater challenge in an era of ‘minimally invasive treatment’.
    This listing suggests that for every cancer the following data have to be collected (except doubling time). If there are five variables, the classification based on these alone would be very sizable, following Eugene Rypka’s feature extraction and classification. But looking forward, time to remission and disease-free survival are additionally important. Treatment for cure is not the endpoint; the best that can be done is to extend survival toward a realistic long-term goal while retaining quality of life.
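How "sizable" that classification becomes can be made concrete with a back-of-envelope count: discretizing each of the five variables into k levels (a hypothetical binning, not taken from Rypka's work) yields k to the 5th power joint classes, each of which would need enough cases to characterize:

```python
# Hypothetical back-of-envelope count: with five prognostic variables each
# binned into k levels, a feature-space classification has k**5 cells.
counts = {k: k ** 5 for k in (2, 3, 4, 5)}
for k, cells in counts.items():
    print(f"{k} levels per variable -> {cells} distinct classes")
# 2 -> 32, 3 -> 243, 4 -> 1024, 5 -> 3125
```

Even a coarse three-level binning already yields hundreds of classes, which is why adding endpoints such as time to remission and disease-free survival makes the data demands grow so quickly.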

    Brücher BLDM, Piso P, Verwaal V et al. Peritoneal carcinomatosis: overview and basics. Cancer Invest. 30(3), 209–224 (2012).
    Brücher BLDM, Swisher S, Königsrainer A et al. Response to preoperative therapy in upper gastrointestinal cancers. Ann. Surg. Oncol. 16(4), 878–886 (2009).
    Miller AB, Hoogstraten B, Staquet M, Winkler A. Reporting results of cancer treatment. Cancer 47(1), 207–214 (1981).
    Therasse P, Arbuck SG, Eisenhauer EA et al. New guidelines to evaluate the response to treatment in solid tumors. European Organization for Research and Treatment of Cancer, National Cancer Institute of the United States, National Cancer Institute of Canada. J. Natl Cancer Inst. 92(3), 205–216 (2000).
    Brücher BLDM, Becker K, Lordick F et al. The clinical impact of histopathological response assessment by residual tumor cell quantification in esophageal squamous cell carcinomas. Cancer 106(10), 2119–2127 (2006).

    • Dr. Larry,

      Thank you for this comment.

      Please carry it as a stand-alone post; Dr. Ritu will refer to it and reference it in her FORTHCOMING post on tumor response, which will integrate multiple sources.

      Please execute my instruction

      Thank you

    • Thank you, Larry, for this educational comment. It explains very well why the Canadian investigators did not try to measure therapy response!

      What they have demonstrated is the technological feasibility of coupling a treatment device to an imaging device and using that to guide the treatment to the right place.

      The issue of “choice of treatment” to which you are referring is not in the scope of this publication.
      The point is: if one treatment modality can be guided, others can be as well! This should encourage others to try to develop imaging-based treatment guidance systems.

  7. The crux of the matter in terms of capability is that the cancer tissue, the adjacent tissue, and the fibrous matrix are all in transition to the cancerous state. Surgeons are taught to resect leaving a “free margin”, which is better aesthetically and has had success in breast surgery. The dilemma is that the patient may return – but how soon?

    • Correct. The philosophy behind lumpectomy is preserving quality of life. It was Prof. Veronesi (IEO) who introduced this method 30 years ago, noticing that in the majority of cases the patient will die of something else before presenting a recurrence of breast cancer.

      It is well established that when the resection margins are declared by a pathologist (as good as he/she could be) as “free of cancer”, the probability of recurrence is much lower than otherwise.

  8. Dr. Larry,

    To assist Dr. Ritu, PLEASE carry ALL your comments above into a stand-alone post and ADD to it your comment on my post on MIS.

    Thank you

  9. Great post! Dr. Nir, can ultrasound be used in conjunction with PET scanning to determine a spatial and functional map of the tumor? With a disease like serous ovarian cancer we typically see intraperitoneal carcinomatosis, and it appears that clinicians want to use fluorogenic probes and fiberoptics to visualize the numerous nodules located within the cavity. Also, is the technique being used mainly for surgery or image-guided radiotherapy, or can you use it for detecting response to various chemotherapeutics, including immunotherapy?

    • Ultrasound can be, and actually is, used in conjunction with PET scanning in many cases. The choice of using ultrasound is always left to the practitioner! Being a non-invasive, low-cost procedure makes the use of ultrasound a non-issue. The downside is that, because it is so easy to access and operate, nobody bothers to develop rigorous guidelines for its use, and the benefits remain the property of individuals.

      In regard to the possibility of screening for ovarian cancer and characterising pelvic masses using ultrasound, I can refer you to scientific work in which I was involved:

      1. VAES (E.), MANCHANDA (R), AUTIER, NIR (R), NIR (D.), BLEIBERG (H.), ROBERT (A.), MENON (U.). Differential diagnosis of adnexal masses: Sequential use of the Risk of Malignancy Index and a novel computer aided diagnostic tool. Published in Ultrasound in Obstetrics & Gynecology. Issue 1 (January). Vol. 39. Page(s): 91-98.

      2. VAES (E.), MANCHANDA (R), NIR (R), NIR (D.), BLEIBERG (H.), AUTIER (P.), MENON (U.), ROBERT (A.). Mathematical models to discriminate between benign and malignant adnexal masses: potential diagnostic improvement using Ovarian HistoScanning. Published in International Journal of Gynecologic Cancer (IJGC). Issue 1. Vol. 21. Page(s): 35-43.

      3. LUCIDARME (O.), AKAKPO (J.-P.), GRANBERG (S.), SIDERI (M.), LEVAVI (H.), SCHNEIDER (A.), AUTIER (P.), NIR (D.), BLEIBERG (H.). A new computer aided diagnostic tool for non-invasive characterisation of malignant ovarian masses: Results of a multicentre validation study. Published in European Radiology. Issue 8. Vol. 20. Page(s): 1822-1830.
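For context on the first two references: the Risk of Malignancy Index (RMI) they use as a comparator is classically computed as a simple product of three scores (a Jacobs-style formulation; the values below are purely illustrative, not clinical guidance):

```python
# Sketch of the classical Risk of Malignancy Index:
#   RMI = U * M * CA125
#   U:     ultrasound score (0 for no suspicious features, 1 for one, 3 for >= 2)
#   M:     menopausal status (1 premenopausal, 3 postmenopausal)
#   CA125: serum CA-125 level in U/ml
def rmi(ultrasound_score, menopausal_score, ca125):
    return ultrasound_score * menopausal_score * ca125

# A commonly cited referral threshold is RMI > 200.
print(rmi(3, 3, 100))  # -> 900, well above the 200 cutoff
print(rmi(1, 1, 35))   # -> 35, below the cutoff
```

The cited papers evaluate whether a computer-aided tool can improve on this kind of simple multiplicative score for discriminating benign from malignant adnexal masses.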

      Dror Nir, PhD
      Managing partner

      BE: +32 (0) 473 981896
      UK: +44 (0) 2032392424

      web: http://www.radbee.com/
      blogs: http://radbee.wordpress.com/ ; http://www.MedDevOnIce.com

  10. Totally true, and I am very thankful for these brilliant comments.

    Remember: 10 years ago, every cancer researcher stated: “look at the tumor cells only – forget the stroma”. The era of laser-captured tumor-cell dissection started. Now everyone knows: it is a system we are looking at, and viewing and analyzing tumor cells only is really not enough.

    So if we are honest, we would have to declare that all the data produced 8–13 years ago with laser-capture microdissection need re-scrutinization, because the influence of the stroma was “forgotten”. I’d better not think about the wasted millions of dollars.

    If we keep on being honest: the surgeon looks at the “free margin” in a kind of reductionist model, and the pathologist is more the control instance. I personally see the pathologist as “the control instance” of surgical quality. Therefore, it is not the wish of the surgeon that matters, but the objective way of looking into problems or challenges. Can a pathologist always state whether an R0 resection has been performed?

    The use of the Resectability Classification:
    There have been many surrogate-marker analyses – nothing new. But a truly substantial, well-thought-through, structured analysis has never been done: millimeter by millimeter, and afterwards analyzing that with an ROC analysis. But against which gold standard? To perform an ROC analysis statistically, you need a gold standard to compare against. So what is the real R0 resection? It has never been proven. It has simply been stated, for this or that tumor entity, that this or that tumor-free margin of so many millimeters is enough, and that has been declared “the real R0 classification”. In some organs it is very difficult, and we all (surgeons, pathologists, clinicians) know that we reach our limits when we try to interpret the R classification in the 3rd dimension. Often it is simply declared and stated.

    Otherwise: if lymph nodes are negative, it does not mean the lymph nodes are really negative, because up to 38% of patients, for example in upper GI cancers, have histologically negative but immunohistochemically positive lymph nodes. This has also been shown by Stojadinovic et al., analyzing ultrastaging in colorectal cancer. So the 4th dimension of cancer – the lymph nodes / the lymphatic vessel invasion are much more important than just a TNM classification, which unfortunately does often not reflect real tumor biology.

    As we see, cancer has multifactorial causes, and it is necessary to take on the challenge of performing highly sophisticated research in a multifactorial and multidisciplinary manner.

    Again, my deep and heartfelt thanks for this productive and excellent discussion!

    • Dr. BB,

      Thank you for your comment.

      Multidisciplinary perspectives have illuminated the discussion on the pages of this Journal.

      Eager to review Dr. Ritu’s forthcoming paper – the topic has a life of its own and is embodied in your statement:

      “the 4th dimension of cancer – the lymph nodes / the lymphatic vessel invasion are much more important than just a TNM classification, which unfortunately does often not reflect real tumor biology.”

    • Thank you BB for your comment. You have touched the core limitation of healthcare professionals: how do we know that we know!

      Do we have a reference for each of the tests we perform?

      Do we have objective and standardised quality measures?

      Do we see what is out there, or are we imagining?

      The good news: every day we can “think” that we have learned something new. We should be happy with that, even if it means learning that yesterday’s truth is not true anymore, and even if we are likely to be wrong again… :)

      But still, in the last decades, lots of progress was made….

  11. Dr. Nir,
    I thoroughly enjoyed reading your post as well as the comments that your post has attracted. There were different points of view and each one has been supported with relevant examples in the literature. Here are my two cents on the discussion:
    The paper that you have discussed had the objective of finding out whether real-time MRI guidance of treatment was even possible and, if so, whether the treatment could be performed at the accurate location of the ROI. The data reveal that they were quite successful in accomplishing their objective, and of course that gives hope to imaging-based targeted therapies.
    Whether the ROI is defined properly, and whether it accounts for real tumor cure, is a different question. The role of pathologists and the histological analysis they bring to the table cannot be ruled out, and the absence of a defined line between the tumor and the stromal region in its vicinity is well documented. However, that cannot rule out the value and scope of imaging-based detection and targeted therapy. After all, it is seminal in guiding minimally invasive surgery. As another arm of personalized medicine-based care for cancer, molecular biologists at MD Anderson have suggested molecular and genetic profiling of the tumor to determine genetic aberrations, on the basis of which matched therapy could be recommended to patients. When a phase I trial was conducted, the results obtained were encouraging, and the survival rate was better in matched-therapy patients compared with unmatched patients. Therefore, every time there is more to consider when treating a cancer patient, and who knows – a combination of the views of oncologists, pathologists, molecular biologists, geneticists and surgeons may devise improved protocols for diagnosis and treatment. It is always going to be complicated, and generalizations will never give an answer. Smart interpretation of therapies – imaging-based or otherwise – will always be required!

    Ritu

    • Dr. Nir,
      One of your earlier comments mentioned the noninvasiveness of ultrasound and, thus, its prevalent use in diagnosis.

      This may be true for all other areas, with the exception of mammography screening. In this field, an ultrasound is performed only if a suspected area of calcification or a lump has been detected, either in routine mammography or in an ad hoc mammogram requested secondary to a patient complaint of pain or a patient report of a suspected lump.

      Ultrasound in this field represents escalation and review by two radiologists.

      It is in routine use for breast biopsy.

    • Thanks, Ritu, for this supporting comment. The worst enemy of finding solutions is doing nothing while using the excuse of looking for the “ultimate solution”. Personally, I believe in combining methods and improving clinical assessment based on information fusion. Being able to predict, and then to timely track, the response to treatment is a major issue that affects survival and costs!

Case Study #4:

  • Judging the ‘Tumor response’-there is more food for thought

https://pharmaceuticalintelligence.com/2012/12/04/judging-the-tumor-response-there-is-more-food-for-thought/

13 Responses

  1. Dr. Sanexa,
    you have brought up two interesting and very clinically relevant points: 1) what is the best measurement of response, and 2) how perspectives among oncologists and other professionals differ on these issues given their expertise in their respective subspecialties (immunologist versus oncologist). The advent of functional measurements of tumors (PET, etc.) seems extremely important both in therapeutic use AND in the development of these types of compounds, since a response usually presents (in cases of solid tumors) as either a lack of growth of the tumor or tumor shrinkage. Did the authors include an in-depth discussion of the rapidity of onset of resistance with these types of compounds?
    Thanks for the posting.

  2. Dr. Williams,
    Thanks for your comment on the post. The editorial brings to attention the view that although PET and other imaging methods provide vital information on tumor growth and shrinkage in response to therapy, there are more aspects to consider, including the genetic and molecular characteristics of the tumor.
    It was an editorial review and the authors did not include any in-depth discussion on the rapidity of onset of resistance with these types of compounds as the focus was primarily on interpreting tumor response.
    I am glad you found the contents of the write-up informative.
    Thanks again!
    Ritu

  3. Thank you for your wonderful comment and interpretation. Dr. Sanexa made a brilliant comment.

    May I allow myself to put my finger deeper into this wound? Cancer patients deserve it.

    It had already been pointed out by international experts from Munich, Tokyo, Hong Kong, and Houston dealing with upper GI cancer that the current response criteria are not appropriate and, moreover, that the clinical response criteria in use seem to function as an alibi rather than helping to differentiate and/or discriminate tumor biology (Ann Surg Oncol 2009):

    http://www.ncbi.nlm.nih.gov/pubmed/19194759

    The response data in a phase-II trial (one tumor entity, one histology, one treatment, one group) revealed that clinical response evaluation according to the WHO criteria is not appropriate for determining response:

    http://www.ncbi.nlm.nih.gov/pubmed/15498642

    Of course, there was a time when it seemed to be useful, and this also has to be respected.

    There is another challenge: using a ROC statistically and deriving thresholds from it. The threshold was, is, and always will be “a clinical decision only” and not the decision of the statistician. The clinician tells the statistician what decision he wants to make – the responsibility is enormous. Getting back to the roots:
    After the main results of the Munich group had been published in 2001 (Ann Surg) and 2004 (J Clin Oncol):

    http://www.ncbi.nlm.nih.gov/pubmed/11224616

    http://www.ncbi.nlm.nih.gov/pubmed/14990646

    the first reaction in the community was: too difficult, can't be, not re-evaluated, etc. However, all the evaluated cut-offs/thresholds were later proven to be the real and best ones by the MD Anderson Cancer Center in Houston, Texas. Jaffer Ajani – a great and critical oncologist – pushed that together with Steve Swisher, and they found the same results. Then the upper GI stakeholders went an uncommon way in science: they re-scrutinized their findings. Meanwhile, the gold standard using histopathology as the basis criterion had been published in Cancer 2006.

    http://www.ncbi.nlm.nih.gov/pubmed/16607651

    Not every author who was on the author list in 2001 and 2004 wanted to be a part of this analysis and publication! Why? Everyone should judge that for himself.

    The data of this analysis were submitted to the New England Journal of Medicine. In the second stage of the review process, the manuscript was rejected. Ann Surg Oncol accepted the publication; the re-scrutinized data yielded another interesting finding: in the future, maybe “one PET scan” might be appropriate for predicting the patient's response.

    Where are we now?

    The level of evidence behind the response criteria is very low: Miller's publication (Cancer 1981) rested on “one single” experiment from Moertel (Cancer 1976). At that time there was no definition of “experienced” as opposed to simply “oncologists”; these terms were not in use then.

    Additionally, these resulted in a (scientifically weak) change of the classification, published by Therasse (J Natl Cancer Inst 2000). Targeted therapy has not resulted in a change so far. In 2009, the international upper GI experts sent their Ann Surg Oncol 2009 publication to the WHO, but without any kind of reaction.

    The molecular biological predictive markers investigated within the last 10 years all seem to have potential.

    http://www.ncbi.nlm.nih.gov/pubmed/20012971

    http://www.ncbi.nlm.nih.gov/pubmed/18704459

    http://www.ncbi.nlm.nih.gov/pubmed/17940507

    http://www.ncbi.nlm.nih.gov/pubmed/17354029

    But experts are aware that the real barrier-breaking step has not been performed so far. Additionally, when trying to evaluate and/or predict response, it is very important that different tumor entities with different survival and tumor biology are not mixed together. Such data are, from my perspective, not helpful – but maybe that is my own bias (!).

    INCORE, the International Consortium of Research Excellence of the Theodor-Billroth-Academy, was invited to publish the Editorial in Future Oncology in 2012. The consortium pointed out that, living in an era of ‘proof of principle’ and trying to work out levels of evidence in medicine, this is “the duty and responsibility” of every clinician, but also of the societies and institutions, and of the WHO.

    Complete remission is not the only goal, as experts dealing with ‘response research’ are aware. It is so frustrating for patients and clinicians: there is a fraction of patients with complete remission who develop early recurrence! This reflects that complete remission cannot function as the only criterion describing response!

    Again, my hearty thanks to Dr. Sanexa for discussing this issue in detail.
    I hope I have explained the development and evaluation of the response criteria properly and in a differentiated way. From the perspective of INCORE:

    “an interdisciplinary initiative with all key stakeholders and disciplines represented is imperative to make predictive and prognostic individualized tumor response assessment a modern-day reality. The integrated multidisciplinary panel of international experts need to define how to leverage existing data, tissue and testing platforms in order to predict individual patient treatment response and prognosis.”
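The ROC remark above – that a statistically derived cut-off is only a candidate until a clinician adopts it – can be illustrated with a short sketch. All scores below are synthetic, and Youden's J is just one common way to nominate a threshold; nothing here is taken from the cited trials.

```python
import numpy as np

# Synthetic "marker" scores for non-responders and responders (illustrative only).
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.0, 1.0, 200),   # non-responders
                         rng.normal(1.5, 1.0, 200)])  # responders
labels = np.concatenate([np.zeros(200), np.ones(200)])

def roc_points(scores, labels):
    """Compute (FPR, TPR) at every observed score used as a threshold."""
    thresholds = np.sort(np.unique(scores))[::-1]
    positives = labels.sum()
    negatives = len(labels) - positives
    tpr = np.array([((scores >= t) & (labels == 1)).sum() / positives
                    for t in thresholds])
    fpr = np.array([((scores >= t) & (labels == 0)).sum() / negatives
                    for t in thresholds])
    return fpr, tpr, thresholds

fpr, tpr, thresholds = roc_points(scores, labels)
youden_j = tpr - fpr                          # sensitivity + specificity - 1
candidate = thresholds[np.argmax(youden_j)]   # a statistical candidate cut-off
```

Whether `candidate` becomes the working threshold depends on the clinical cost of false positives versus false negatives – which is exactly the commenter's point that the decision belongs to the clinician, not the statistician.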

  4. Dr. Brucher,

    First of all, thanks for expressing your views on ‘tumor response’ in such a comprehensive way. You are the first author of the editorial review and one of the prominent people who have taken part in the process of defining tumor response, and I am glad that you decided to write a comment on the write-up.
    The topic has been explained in an immaculate manner, and it further clarifies the need for perfect markers that would be able to evaluate and predict tumor response. There are, as you mentioned, some molecular markers available, including VEGF and cyclins, that have been brought into focus in the context of squamous cell carcinoma.

    It would be great if you could be the guest author for our blog and we could publish your opinion (comment on this blog post) as a separate post. Please let us know if it is OK with you.

    Thanks again for your comment
    Ritu

  5. Thank you all for the compelling discussions above.

    Please review the two sources on the topic that I placed at the bottom of the post above, published as posts in this Scientific Journal.

    All comments made on both entries are part of this discussion. I am referring to Dr. Nir's post on tumor size, to BB's comment on Nir's post, to Larry's pathologist view on tumors, and to my post on remission and minimally invasive surgery (MIS).

    Great comments by Dr. Williams and BB, and a wonderful topic exposition by Dr. Ritu.

  6. Aviva,
    That's a great idea. I will combine all the sources referred to by you, the post on tumor imaging by Dr. Nir, and the comments made on these posts, including Dr. Brucher's comments, in a new post.
    Thanks
    Ritu

    • Great idea. Ask Larry – he has written two very long, important comments on this topic, one on Nir's post and another one (ask him where, if it is not on the MIS post). GREAT work, Ritu; integration is very important. Dr. Williams is one of our Gems.

    • Assessing tumour response is not an easy task! Tumours don't change, but happily our knowledge about them really does change – it is ever-changing (thank God!). In the past we had the RECIST criteria, then the modified RECIST criteria, because of GIST and other tumors. At this very moment, these are clearly insufficient. We need new, validated criteria facing the reality of today. A great, enormous post, Dr. Ritu! Congratulations!

 

Conclusions

The Voice of Aviva Lev-Ari, PhD, RN:

The relevance of the Scientific Agora to Medical Education is vast. The Open Access Journal gives EVERY Scientist on the internet GLOBAL reach and access to Open Access published scientific content, NOT only the subscription-paying base of journals. Without a HIGH-FEE subscription you get NO access to a journal's content and cannot participate in Multiple Comment Exchanges. In the Medical Education context, COMMENTS are the medium for debating with peers.

Multiple Comment Exchanges on the four articles in the Journal, above, demonstrate the vibrancy of the scientific discussion, the multiplicity of perspectives, the subjectivity of the contributions to the debate, and the unique expertise and clinical experience expressed by each Scientist.


Read Full Post »

Glycobiology advances

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

The Evolution of the Glycobiology Space

The Nascent Stage of another Omics Field with Biomarker and Therapeutic Potential

Enal Razvi, Ph.D. , Gary Oosta, Ph.D

http://www.genengnews.com/insight-and-intelligence/the-evolution-of-the-glycobiology-space/77900638/

 

The Evolution of the Glycobiology Space

 

Glycobiology is an important field of study with medical applications because it is known that tumor cells alter their glycosylation pattern, which may contribute to their metastatic potential as well as potential immune evasion. [iStock/© vjanez]    http://www.genengnews.com/media/images/AnalysisAndInsight/Apr12_2016_iStock_41612310_PlasmaMembraneOfACell1211657142.jpg

There is growing interest in the field of glycobiology given the fact that epitopes with physiological and pathological relevance carry glyco moieties. We believe that another “omics” revolution is on the horizon – the study of the glyco modifications on the surface of cells and their potential as biomarkers and therapeutic targets in many disease classes. Not much industry tracking of this field has taken place. Thus, we sought to map this landscape by examining the entire ensemble of academic publications in this space and teasing apart the trends operative in this field from a qualitative and quantitative perspective. We believe that this methodology of en masse capture and annotation of publications provides an effective approach to evaluating this early-stage field.

Identification and Growth of Glycobiology Publications

http://www.genengnews.com/Media/images/AnalysisAndInsight/thumb_April12_2016_SelectBiosciences_Figure11935315421.jpg

For this article, we identified 7,000 publications in the broader glycobiology space and analyzed them in detail. It is important to frame glycobiology in the context of genomics and proteomics as a means to assess the scale of the field. Figure 1 presents the relative sizes of these fields as assessed by publications from 1975 to 2015.

Note that the relative scale of genomics versus proteomics and glycobiology/glycomics in this graph strongly suggests that glycobiology is a nascent space, and thus a driver for us to map its landscape today and as it evolves over the coming years.
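The growth rates (slopes) compared across these figures are, in effect, fits to yearly publication counts. A minimal sketch, using invented counts rather than the actual series behind Figure 1, estimates an exponential growth rate from a log-linear fit:

```python
import numpy as np

# Hypothetical yearly publication counts for a nascent field (illustrative only;
# the real counts underlying Figure 1 are not reproduced here).
years = np.arange(2006, 2016)
counts = np.array([120, 140, 155, 180, 210, 250, 290, 340, 400, 470])

# Fit log(counts) = a*year + b; the slope a approximates the exponential growth rate.
a, b = np.polyfit(years, np.log(counts), 1)
annual_growth_pct = (np.exp(a) - 1) * 100  # roughly constant percent growth per year
```

Comparing such fitted slopes across genomics, proteomics, and glycobiology is one simple way to quantify which field is "nascent" and which is mature.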

Figure 2. (A) Segmentation of the glycobiology landscape. (B) Glycobiology versus glycomics publication growth.

 

http://www.genengnews.com/Media/images/AnalysisAndInsight/thumb_April12_2016_SelectBiosciences_Figure2ab1917013624.jpg

To examine closely the various components of the glycobiology space, we segmented the publications database, presented in Figure 2A. Note the relative sizes and growth rates (slopes) of the various segments.

Clearly, glycoconjugates currently are the majority of this space and account for the bulk of the publications.  Glycobiology and glycomics are small but expanding and therefore can be characterized as “nascent market segments.”  These two spaces are characterized in more detail in Figure 2B, which presents their publication growth rates.

Note the very recent increased attention directed at these spaces and hence our drive to initiate industry coverage of these spaces. Figure 2B presents the overall growth and timeline of expansion of these fields—especially glycobiology—but it provides no information about the qualitative nature of these fields.

Focus of Glycobiology Publications

http://www.genengnews.com/Media/images/AnalysisAndInsight/April12_2016_SelectBiosciences_Figure2c1827601892.jpg

Figure 2C. Word cloud based on titles of publications in the glycobiology and glycomics spaces.

To understand the focus of publications in this field, and indeed the nature of this field, we constructed a word cloud based on titles of the publications that comprise this space presented in Figure 2C.

There is a marked emphasis on terms such as oligosaccharides and an emphasis on cells (this is after all glycosylation on the surface of cells). Overall, a pictorial representation of the types and classes of modifications that comprise this field emerge in this word cloud, demonstrating the expansion of the glycobiology and to a lesser extent the glycomics spaces as well as the character of these nascent but expanding spaces.
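A word cloud like the one in Figure 2C is, at bottom, a term-frequency count over publication titles. A minimal sketch, with made-up titles and an assumed stop-word list standing in for the ~7,000 real titles:

```python
import re
from collections import Counter

# Made-up titles standing in for the analyzed publication set (illustrative only).
titles = [
    "Oligosaccharides on the surface of tumor cells",
    "Glycosylation of cell surface proteins in cancer cells",
    "Mass spectrometry of N-linked oligosaccharides",
]
STOPWORDS = {"of", "the", "on", "a", "an", "in", "for", "and", "to"}

# Tokenize, lower-case, drop stop words, and count term frequencies.
words = Counter(
    w
    for title in titles
    for w in re.findall(r"[a-z]+", title.lower())
    if w not in STOPWORDS
)
top_terms = words.most_common(5)  # the weights a word-cloud renderer would draw from
```

The font size of each term in the rendered cloud is then simply proportional to its count, which is why terms like "oligosaccharides" and "cells" dominate.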

Characterization of the Glycobiology Space in Journals

Figure 3A. Breakout of publications in the glycobiology/glycomics fields.   http://www.genengnews.com/Media/images/AnalysisAndInsight/April12_2016_SelectBiosciences_Figure3a_5002432117316.jpg
Having framed the overall growth of the glycobiology field, we wanted to understand its structure and the classes of researchers as well as publications that comprise this field. To do this, we segmented the publications that constitute this field into the various journals in which glycobiology research is published. Figure 3A presents the breakout of publications by journal to illustrate the “scope” of this field.

The distribution of glycobiology publications across the various journals suggests a very concentrated marketplace that is very technically focused. The majority of the publications segregate into specialized journals on this topic, a pattern very indicative of a field in the very early stages of development—a truly nascent marketplace.

http://www.genengnews.com/Media/images/AnalysisAndInsight/thumb_April12_2016_SelectBiosciences_Figure3b1012091061.jpg

Figure 3B. Origin of publications in the glycobiology/glycomics fields.
We also sought to understand the “origin” of these publications—the breakout between academic- versus industry-derived journals. Figure 3B presents this breakout and shows that these publications are overwhelmingly (92.3%) derived from the academic sector. This is again a testimonial to the early nascent nature of this marketplace without significant engagement by the commercial sector and therefore is an important field to characterize and track from the ground up.

Select Biosciences, Inc. further analyzed the growth trajectory of the glycobiology papers in Figure 3C as a means to examine closely the publications trajectory. Although there appears to be some wobble along the way, overall the trajectory is upward, and of late it is expanding significantly.

In Summary

Figure 3C. Trajectory of the glycobiology space.   http://www.genengnews.com/Media/images/AnalysisAndInsight/April12_2016_SelectBiosciences_Figure3c1236921793.jpg
Glycobiology is the study of what coats living cells—glycans, or carbohydrates, and glycoconjugates. This is an important field of study with medical applications because it is known that tumor cells alter their glycosylation pattern, which may contribute to their metastatic potential as well as potential immune evasion.

At this point, glycobiology is largely basic research and thus it pales in comparison with the field of genomics. But in 10 years, we predict the study of glycobiology and glycomics will be ubiquitous and in the mainstream.

We started our analysis of this space because we’ve been focusing on many other classes of analytes, such as microRNAs, long-coding RNAs, oncogenes, tumor suppressor genes, etc., whose potential as biomarkers is becoming established. Glycobiology, on the other hand, represents an entire new space—a whole new category of modifications that could be analyzed for diagnostic potential and perhaps also for therapeutic targeting.

Today, glycobiology and glycomics are where genomics was at the start of the Human Genome Project. They represent a nascent space with full headroom for growth. Select Biosciences will continue to track this exciting field for research developments as well as the development of biomarkers based on glyco-epitopes.

Enal Razvi, Ph.D., conducted his doctoral work on viral immunology and subsequent to receiving his Ph.D. went on to the Rockefeller University in New York to serve as Aaron Diamond Post-doctoral fellow under Professor Ralph Steinman [Nobel Prize Winner in 2011 for his discovery of dendritic cells in the early-70s with Zanvil Cohn]. Subsequently, Dr. Razvi completed his research fellowship at Harvard Medical School. For the last two decades Dr. Razvi has worked with small and large companies and consulted for more than 100 clients worldwide. He currently serves as Biotechnology Analyst and Managing Director of SelectBio U.S. He can be reached at enal@selectbio.us. Gary M. Oosta holds a Ph.D. in Biophysics from Massachusetts Institute of Technology and a B.A. in Chemistry from E. Mich. Univ. He has 25 years of industrial research experience in various technology areas including medical diagnostics, thin-layer coating, bio-effects of electromagnetic radiation, and blood coagulation. Dr. Oosta has authored 20 technical publications and is an inventor on 77 patents worldwide. In addition, he has managed research groups that were responsible for many other patented innovations. Dr. Oosta has a long-standing interest in using patents and publications as strategic technology indicators for future technology selection and new product development.


Read Full Post »

Problem of Science Doctorate Programs

Larry H. Bernstein, MD, FCAP, Curator

LPBI

The Problem in Biomedical Education

 

Henry Bourne (UCSF)

Dr. Henry Bourne has trained graduate students and postdocs at UCSF for over 40 years. In his iBiology talk, he discusses the imminent need for change in graduate education. With time to degree getting longer, the biomedical community needs to create experimental graduate programs to find more effective and lower-cost ways to train future scientists and run successful laboratories. If we don't start looking for solutions, the future of the biomedical enterprise will grow increasingly unstable.

Watch Henry Bourne’s iBioMagazine: The Problem in Biomedical Education

Henry Bourne is Professor Emeritus and former chair of the Department of Pharmacology at the University of California – San Francisco. His research focused on trimeric G-proteins, G-protein coupled receptors, and the cellular signals responsible for polarity and direction-finding of human leukocytes. He is the author of several books including a memoir, Ambition and Delight, and has written extensively about graduate training and biomedical workforce issues. Now Dr. Bourne’s research focuses on the organization and founding of US biomedical research in the early 20th century.

Related Talks

Read Full Post »

High blood pressure can damage the retina’s blood vessels and limit the retina’s function. It can also put pressure on the optic nerve.

Sourced through Scoop.it from: www.healthline.com

See on Scoop.itCardiovascular Disease: PHARMACO-THERAPY

Read Full Post »

Ischemia, Infarction, and the Waveforms Q through U, Part 1: How to Read an EKG Curriculum

Reporter: Aviva Lev-Ari, PhD, RN

 

Watch Video

https://www.youtube.com/v/Ajfk21k3Zmo?fs=1&hl=fr_FR

ECG Interpretation: Ischemia, Infarction, and the Waveforms Q through U, Part 1 Girish L. Kalra, MD Assistant Professor, Department of Medicine, Emory Univer…

Sourced through Scoop.it from: www.youtube.com

See on Scoop.itCardiovascular and vascular imaging

Read Full Post »

Need for Protophysiology?

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

 

Comparison of fundamental physical properties of the model cells (protocells) and the living cells reveals the need in protophysiology

V.V. Matveev

International Journal of Astrobiology 2016. pp1-8.   http://dx.doi.org:/10.1017/S1473550415000476

The hypothesis that potassium ponds were the cradles of life enriches the gamut of ideas about the possible conditions of pre-biological evolution on the primeval Earth, but does not bring us closer to solving the real problem of the origin of life. The gist of the matter lies in the mechanism of making a delimitation between two environments – the intracellular environment and the habitat of protocells. Since the sodium–potassium pump (Na+/K+-ATPase) was discovered, no molecular model has been proposed for a predecessor of the modern sodium pump. This has brought to life the idea of the potassium pond, wherein protocells would not need a sodium pump. However, current notions of the operation of living cells come into conflict with physical laws when one tries to use them to explain the origin and functioning of protocells. Thus, habitual explanations of the physical properties of living cells have become inapplicable to explaining the corresponding properties of Sidney Fox's microspheres. Likewise, existing approaches to solving the problem of the origin of life do not see the need for the comparative study of living cells and cell models – assemblies of biological and artificial small molecules and macromolecules – under physical conditions conducive to the origin of life. The time has come to conduct comprehensive research into the fundamental physical properties of protocells and create a new discipline – protocell physiology, or protophysiology – which should bring us much closer to solving the problem of the origin of life.

 

There is a statement we constantly come across in the scientific and popular-science literature: the ion composition of the internal environment of the body of humans and animals, in which all of its cells are immersed, is close to that of seawater. This observation appeared in the literature even 100 years ago, when it became possible to investigate the ion composition of biological liquids.

This similarity between the internal environment of the body and the sea is quite obvious: in both seawater and blood plasma there are one or two orders of magnitude more Na+ ions than K+. It is this composition that can make one think that life originated in the primeval ocean (the memory of which has since been sustained by the internal environment of the body), and the first cells delineated themselves from seawater using a weakly permeable membrane, so that their internal environment became special, suitable for chemical and physical processes needed to sustain life. Indeed, the ratio of the above cations in the cytoplasm is the exact reverse of their ratio in seawater: there is much more K+ in it than Na+ . In fact, physiological processes can only be possible in an environment where potassium prevails over sodium. Therefore, any theory of the origin of life must explain how such a deep delimitation (distinction) between the two environments could occur: the intracellular environment, wherein vitally important processes take course, and the external environment, which provides the cell with necessary materials and conditions.

For the protocell to separate from seawater, a mechanism must arise that creates and maintains the ion asymmetry between the primeval cell and its environs. We normally consider a mechanism of this kind as the isolating lipid membrane with a molecular ion pump, the Na+/K+ -ATPase, built into it. If life originated in seawater, the origin of the first cell inevitably comes down to the origin of the sodium pump and any structure supporting it – the lipid membrane – without which the work of any pump would make little sense. It seems that life is born in conditions that are really adverse to it and even ruinous.

 

The great basic question of science: Membrane compartment or non-membrane phase compartment (biophase) is a physical basis for origin of life?

1. If life originated in seawater, the origin of the first cell inevitably comes down to the origin of the sodium pump and any structure supporting it – the lipid membrane – without which the work of any pump would make little sense.

2. Since the sodium-potassium pump (Na+/K+-ATPase) was discovered, no molecular model has been proposed for a predecessor of the modern sodium pump. Neither Miller's electrical charges, nor Fox's amino-acid condensation, nor the building of ready-made biomolecules into coacervates – none of these has managed to lead to the self-origination of a progenitor of the ion pump, even under favourable lab conditions.

3. In 2007, we saw the simultaneous release of two articles in which it was posited that life originated not in seawater, as previously thought, but in smaller bodies of water with a K+/Na+ ratio suited to sustaining life. In these conditions a sodium pump is not needed, and the pump could originate later. But why would the pump be needed if the K+/Na+ ratio is already good? The origin of the sodium pump in conditions where there is no natural need for it may require the agency of Providence.

4. Potassium Big Bang on Earth instead of potassium ponds.

5. Fox’s microspheres do not need potassium ponds.

6. Despite the fact that Fox's microspheres have no fully functional membrane with sodium pumps and specific ion channels, they generate action potentials similar to those of nerve cells and, in addition, have ion channels that open and close spontaneously. This ability of the microspheres contradicts the generally accepted ideas about the mechanism of generation of biological electrical potentials.

7. The Hodgkin-Huxley model of action potentials is equally compatible with the nerve cell and with Fox's microsphere.

8. Biophase as the main subject of protophysiology. In the past, researchers considered the living cell a non-membrane phase compartment whose physical properties differ from those of the surrounding medium, this physical difference playing a key role in cell function. According to a new take on an old phase, non-membrane phase compartments play an important role in the functioning of the cell nucleus and nuclear envelope, and also of the cytoplasm. Some even see these compartments as temporary organelles. According to available data, phase compartments can play a key role in cell signaling. In this historical context, studies of recent years dedicated to non-membrane phase compartments in living cells sound sensational.

9. It is essentially a Protocell World that weaves the known RNA World, DNA World, and Protein World into unity.

10. In the view of the non-membrane phase approach, the use of liposomes and other membrane (non-biophase) cell models to solve the issue of the origin of life is a dead-end line of investigation. I would be grat
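The Hodgkin-Huxley point in item 7 can be made concrete with a minimal single-compartment sketch. The parameters are the classic squid-axon values; the forward-Euler integration, step size, and stimulus current are assumptions for illustration, not anything taken from Matveev's paper.

```python
import numpy as np

# Classic squid-axon Hodgkin-Huxley parameters.
C_M = 1.0                               # membrane capacitance, uF/cm^2
G_NA, G_K, G_L = 120.0, 36.0, 0.3       # peak conductances, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.387   # reversal potentials, mV

# Voltage-dependent gating rate constants (1/ms).
def alpha_m(v): return 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
def beta_m(v):  return 4.0 * np.exp(-(v + 65.0) / 18.0)
def alpha_h(v): return 0.07 * np.exp(-(v + 65.0) / 20.0)
def beta_h(v):  return 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
def alpha_n(v): return 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
def beta_n(v):  return 0.125 * np.exp(-(v + 65.0) / 80.0)

def simulate(i_ext=10.0, t_total=50.0, dt=0.01):
    """Forward-Euler integration of the HH equations; returns V(t) in mV."""
    v, m, h, n = -65.0, 0.05, 0.6, 0.32   # approximate resting state
    trace = []
    for _ in range(int(t_total / dt)):
        i_na = G_NA * m**3 * h * (v - E_NA)
        i_k = G_K * n**4 * (v - E_K)
        i_l = G_L * (v - E_L)
        v += dt * (i_ext - i_na - i_k - i_l) / C_M
        m += dt * (alpha_m(v) * (1.0 - m) - beta_m(v) * m)
        h += dt * (alpha_h(v) * (1.0 - h) - beta_h(v) * h)
        n += dt * (alpha_n(v) * (1.0 - n) - beta_n(v) * n)
        trace.append(v)
    return np.array(trace)

voltage = simulate()  # sustained 10 uA/cm^2 drives repetitive spiking
```

The equations are agnostic about what physical structure carries the currents, which is the point of item 7: the same formalism describes excitability whether the boundary is a nerve membrane or a microsphere surface.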

Read Full Post »

Mindful Discoveries

Larry H. Bernstein, MD, FCAP, Curator

LPBI

Schizophrenia and the Synapse

Genetic evidence suggests that overactive synaptic pruning drives development of schizophrenia.

By Ruth Williams | January 27, 2016

http://www.the-scientist.com/?articles.view/articleNo/45189/title/Schizophrenia-and-the-Synapse/


3.2.4   Mindful Discoveries, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 2: CRISPR for Gene Editing and DNA Repair

http://www.the-scientist.com/images/News/January2016/Schizophrenia.jpg

C4 (green) at synapses of human neurons

Compared to the brains of healthy individuals, those of people with schizophrenia have higher expression of a gene called C4, according to a paper published in Nature today (January 27). The gene encodes an immune protein that moonlights in the brain as an eradicator of unwanted neural connections (synapses). The findings, which suggest increased synaptic pruning is a feature of the disease, are a direct extension of genome-wide association studies (GWASs) that pointed to the major histocompatibility (MHC) locus as a key region associated with schizophrenia risk.

“The MHC [locus] is the first and the strongest genetic association for schizophrenia, but many people have said this finding is not useful,” said psychiatric geneticist Patrick Sullivan of the University of North Carolina School of Medicine who was not involved in the study. “The value of [the present study is] to show that not only is it useful, but it opens up new and extremely interesting ideas about the biology and therapeutics of schizophrenia.”

Schizophrenia has a strong genetic component—it runs in families—yet, because of the complex nature of the condition, no specific genes or mutations have been identified. The pathological processes driving the disease remain a mystery.

Researchers have turned to GWASs in the hope of finding specific genetic variations associated with schizophrenia, but even these have not provided clear candidates.

“There are some instances where genome-wide association will literally hit one base [in the DNA],” explained Sullivan. While a 2014 schizophrenia GWAS highlighted the MHC locus on chromosome 6 as a strong risk area, the association spanned hundreds of possible genes and did not reveal specific nucleotide changes. In short, any hope of pinpointing the MHC association was going to be “really challenging,” said geneticist Steve McCarroll of Harvard who led the new study.

Nevertheless, McCarroll and colleagues zeroed in on the particular region of the MHC with the highest GWAS score—the C4 gene—and set about examining how the area’s structural architecture varied in patients and healthy people.

The C4 gene can exist in multiple copies (from one to four) on each copy of chromosome 6, and has four different forms: C4A-short, C4B-short, C4A-long, and C4B-long. The researchers first examined the “structural alleles” of the C4 locus—that is, the combinations and copy numbers of the different C4 forms—in healthy individuals. They then examined how these structural alleles related to expression of both C4A and C4B messenger RNAs (mRNAs) in postmortem brain tissues. From this the researchers had a clear picture of how the architecture of the C4 locus affected expression of C4A and C4B.

Next, they compared DNA from roughly 30,000 schizophrenia patients with that from 35,000 healthy controls, and a correlation emerged: the alleles most strongly associated with schizophrenia were also those associated with the highest C4A expression. Measuring C4A mRNA levels in the brains of 35 schizophrenia patients and 70 controls then revealed that, on average, C4A levels in the patients’ brains were 1.4-fold higher.

C4 is an immune system “complement” factor—a small secreted protein that assists immune cells in the targeting and removal of pathogens. The discovery of C4’s association with schizophrenia, said McCarroll, “would have seemed random and puzzling if it wasn’t for work . . . showing that other complement components regulate brain wiring.” Indeed, complement protein C3 locates at synapses that are going to be eliminated in the brain, explained McCarroll, “and C4 was known to interact with C3 . . . so we thought well, actually, this might make sense.”

McCarroll’s team went on to perform studies in mice that revealed C4 is necessary for C3 to be deposited at synapses. They also showed that the more copies of the C4 gene present in a mouse, the more the animal’s neurons were pruned. Synaptic pruning is a normal part of development and is thought to reflect the process of learning, where the brain strengthens some connections and eradicates others.
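The copy-number logic above can be made concrete with a small sketch. This is an illustrative toy with hypothetical helper names, not code or a model from the study: each structural allele is represented as a list of C4 gene copies, each copy one of the four forms, and the C4A gene dosage is simply the count of C4A copies across both alleles.

```python
# Toy illustration of C4 "structural alleles": each allele on chromosome 6
# carries one to four copies of the C4 gene, each copy being one of four
# forms (A/B isotype, long/short). Hypothetical helper; counts only.
C4_FORMS = {"AL", "AS", "BL", "BS"}

def c4a_dosage(allele1, allele2):
    """Count C4A copies (long or short) across both structural alleles."""
    for copy in list(allele1) + list(allele2):
        if copy not in C4_FORMS:
            raise ValueError("unknown C4 form: " + copy)
    return sum(copy.startswith("A") for copy in list(allele1) + list(allele2))

# A BS allele contributes no C4A copies; an AL-AL allele contributes two.
print(c4a_dosage(["AL", "BL"], ["AL", "AL"]))  # 3 C4A copies
print(c4a_dosage(["BS"], ["BL"]))              # 0 C4A copies
```

Under the study’s finding, diplotypes with higher C4A dosage, and hence higher C4A expression, tended to carry greater schizophrenia risk.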
Interestingly, the brains of deceased schizophrenia patients exhibit reduced neuron density. The new results, therefore, “make a lot of sense,” said Cardiff University’s Andrew Pocklington who did not participate in the work. They also make sense “in terms of the time period when synaptic pruning is occurring, which sort of overlaps with the period of onset for schizophrenia: around adolescence and early adulthood,” he added.

“[C4] has not been on anybody’s radar for having anything to do with schizophrenia, and now it is and there’s a whole bunch of really neat stuff that could happen,” said Sullivan. For one, he suggested, “this molecule could be something that is amenable to therapeutics.”

A. Sekar et al., “Schizophrenia risk from complex variation of complement component 4,” Nature, https://doi.org/10.1038/nature16549, 2016.

Tags schizophrenia, neuroscience, gwas, genetics & genomics, disease/medicine and cell & molecular biology

Schizophrenia: From genetics to physiology at last

Ryan S. Dhindsa & David B. Goldstein

Nature (2016)  https://doi.org/10.1038/nature16874

The identification of a set of genetic variations that are strongly associated with the risk of developing schizophrenia provides insights into the neurobiology of this destructive disease.

http://www.nytimes.com/2016/01/28/health/schizophrenia-cause-synaptic-pruning-brain-psychiatry.html

Genetic study provides first-ever insight into biological origin of schizophrenia

Suspect gene may trigger runaway synaptic pruning during adolescence — NIH-funded study

NIH/NATIONAL INSTITUTE OF MENTAL HEALTH

IMAGE

http://media.eurekalert.org/multimedia_prod/pub/web/107629_web.jpg

The site in Chromosome 6 harboring the gene C4 towers far above other risk-associated areas on schizophrenia’s genomic “skyline,” marking its strongest known genetic influence. The new study is the first to explain how specific gene versions work biologically to confer schizophrenia risk.  CREDIT  Psychiatric Genomics Consortium

Versions of a gene linked to schizophrenia may trigger runaway pruning of the teenage brain’s still-maturing communications infrastructure, NIH-funded researchers have discovered. People with the illness show fewer such connections between neurons, or synapses. The gene switched on more in people with the suspect versions, who faced a higher risk of developing the disorder, characterized by hallucinations, delusions and impaired thinking and emotions.

“Normally, pruning gets rid of excess connections we no longer need, streamlining our brain for optimal performance, but too much pruning can impair mental function,” explained Thomas Lehner, Ph.D., director of the Office of Genomics Research Coordination of the NIH’s National Institute of Mental Health (NIMH), which co-funded the study along with the Stanley Center for Psychiatric Research at the Broad Institute and other NIH components. “It could help explain schizophrenia’s delayed age-of-onset of symptoms in late adolescence/early adulthood and shrinkage of the brain’s working tissue. Interventions that put the brakes on this pruning process-gone-awry could prove transformative.”

The gene, called C4 (complement component 4), sits in by far the tallest tower on schizophrenia’s genomic “skyline” (see graph below) of more than 100 chromosomal sites harboring known genetic risk for the disorder. Affecting about 1 percent of the population, schizophrenia is known to be as much as 90 percent heritable, yet discovering how specific genes work to confer risk has proven elusive, until now.

A team of scientists led by Steve McCarroll, Ph.D., of the Broad Institute and Harvard Medical School, Boston, leveraged the statistical power conferred by analyzing the genomes of 65,000 people, 700 postmortem brains, and the precision of mouse genetic engineering to discover the secrets of schizophrenia’s strongest known genetic risk. C4’s role represents the most compelling evidence, to date, linking specific gene versions to a biological process that could cause at least some cases of the illness.

“Since schizophrenia was first described over a century ago, its underlying biology has been a black box, in part because it has been virtually impossible to model the disorder in cells or animals,” said McCarroll. “The human genome is providing a powerful new way in to this disease. Understanding these genetic effects on risk is a way of prying open that black box, peering inside and starting to see actual biological mechanisms.”

McCarroll’s team, including Harvard colleagues Beth Stevens, Ph.D., Michael Carroll, Ph.D., and Aswin Sekar, report on their findings online Jan. 27, 2016 in the journal Nature.

A swath of chromosome 6 encompassing several genes known to be involved in immune function emerged as the strongest signal associated with schizophrenia risk in genome-wide analyses by the NIMH-funded Psychiatric Genomics Consortium over the past several years. Yet conventional genetics failed to turn up any specific gene versions there linked to schizophrenia.

To discover how the immune-related site confers risk for the mental disorder, McCarroll’s team mounted a search for “cryptic genetic influences” that might generate “unconventional signals.” C4, a gene with known roles in immunity, emerged as a prime suspect because it is unusually variable across individuals. It is not unusual for people to have different numbers of copies of the gene and distinct DNA sequences that result in the gene working differently.

The researchers dug deeply into the complexities of how such structural variation relates to the gene’s level of expression and how that, in turn, might relate to schizophrenia. They discovered structurally distinct versions that affect expression of two main forms of the gene in the brain. The more a version resulted in expression of one of the forms, called C4A, the more it was associated with schizophrenia. The more a person had the suspect versions, the more C4 switched on and the higher their risk of developing schizophrenia. Moreover, in the human brain, the C4 protein turned out to be most prevalent in the cellular machinery that supports connections between neurons.

Adapting mouse molecular genetics techniques for studying synaptic pruning and C4’s role in immune function, the researchers also discovered a previously unknown role for C4 in brain development. During critical periods of postnatal brain maturation, C4 tags a synapse for pruning by depositing a sister protein in it called C3. Again, the more C4 got switched on, the more synapses got eliminated.

In humans, such streamlining/pruning occurs as the brain develops to full maturity in the late teens/early adulthood – conspicuously corresponding to the age-of-onset of schizophrenia symptoms.

Future treatments designed to suppress excessive levels of pruning by counteracting runaway C4 in at-risk individuals might nip in the bud a process that could otherwise develop into psychotic illness, suggest the researchers. And thanks to the head start gained in understanding the role of such complement proteins in immune function, such agents are already in development, they note.

“This study marks a crucial turning point in the fight against mental illness. It changes the game,” added acting NIMH director Bruce Cuthbert, Ph.D. “Thanks to this genetic breakthrough, we can finally see the potential for clinical tests, early detection, new treatments and even prevention.”

###

VIDEO: Opening Schizophrenia’s Black Box https://youtu.be/s0y4equOTLg

Reference: Sekar A, Bialas AR, de Rivera H, Davis A, Hammond TR, Kamitaki N, Tooley K, Presumey J, Baum M, Van Doren V, Genovese G, Rose SA, Handsaker RE, Schizophrenia Working Group of the Psychiatric Genomics Consortium, Daly MJ, Carroll MC, Stevens B, McCarroll SA. Schizophrenia risk from complex variation of complement component 4. Nature. Jan 27, 2016. DOI: 10.1038/nature16549.

Schizophrenia risk from complex variation of complement component 4

Aswin Sekar, Allison R. Bialas, Heather de Rivera, Avery Davis, Timothy R. Hammond, …, Michael C. Carroll, Beth Stevens & Steven A. McCarroll

Nature (2016)   https://doi.org/10.1038/nature16549

Schizophrenia is a heritable brain illness with unknown pathogenic mechanisms. Schizophrenia’s strongest genetic association at a population level involves variation in the major histocompatibility complex (MHC) locus, but the genes and molecular mechanisms accounting for this have been challenging to identify. Here we show that this association arises in part from many structurally diverse alleles of the complement component 4 (C4) genes. We found that these alleles generated widely varying levels of C4A and C4B expression in the brain, with each common C4 allele associating with schizophrenia in proportion to its tendency to generate greater expression of C4A. Human C4 protein localized to neuronal synapses, dendrites, axons, and cell bodies. In mice, C4 mediated synapse elimination during postnatal development. These results implicate excessive complement activity in the development of schizophrenia and may help explain the reduced numbers of synapses in the brains of individuals with schizophrenia.

Figure 1: Structural variation of the complement component 4 (C4) gene.

http://www.nature.com/nature/journal/vaop/ncurrent/carousel/nature16549-f1.jpg

a, Location of the C4 genes within the major histocompatibility complex (MHC) locus on human chromosome 6. b, Human C4 exists as two paralogous genes (isotypes), C4A and C4B; the encoded proteins are distinguished at a key site

http://www.nature.com/nature/journal/vaop/ncurrent/carousel/nature16549-f3.jpg

http://www.nature.com/nature/journal/vaop/ncurrent/carousel/nature16549-sf8.jpg

Gene Study Points Toward Therapies for Common Brain Disorders

University of Edinburgh    http://www.dddmag.com/news/2016/01/gene-study-points-toward-therapies-common-brain-disorders

Scientists have pinpointed the cells that are likely to trigger common brain disorders, including Alzheimer’s disease, Multiple Sclerosis and intellectual disabilities.

It is the first time researchers have been able to identify the particular cell types that malfunction in a wide range of brain diseases.

Scientists say the findings offer a roadmap for the development of new therapies to target the conditions.

The researchers from the University of Edinburgh’s Centre for Clinical Brain Sciences used advanced gene analysis techniques to investigate which genes were switched on in specific types of brain cells.

They then compared this information with genes that are known to be linked to each of the most common brain conditions — Alzheimer’s disease, anxiety disorders, autism, intellectual disability, multiple sclerosis, schizophrenia and epilepsy.

Their findings reveal that for some conditions, the support cells rather than the neurons that transmit messages in the brain are most likely to be the first affected.

Alzheimer’s disease, for example, is characterised by damage to the neurons. Previous efforts to treat the condition have focused on trying to repair this damage.

The study found that a different cell type, the microglial cell, is responsible for triggering Alzheimer’s, and that damage to the neurons is a secondary symptom of disease progression.

Researchers say that developing medicines that target microglial cells could offer hope for treating the illness.

The approach could also be used to find new treatment targets for other diseases that have a genetic basis, the researchers say.

Dr Nathan Skene, who carried out the study with Professor Seth Grant, said: “The brain is the most complex organ made up from a tangle of many cell types and sorting out which of these cells go wrong in disease is of critical importance to developing new medicines.”

Professor Seth Grant said: “We are in the midst of scientific revolution where advanced molecular methods are disentangling the Gordian Knot of the brain and completely unexpected new pathways to solving diseases are emerging. There is a pressing need to exploit the remarkable insights from the study.”

Quantitative multimodal multiparametric imaging in Alzheimer’s disease

Qian Zhao, Xueqi Chen, Yun Zhou      Brain Informatics  http://link.springer.com/article/10.1007/s40708-015-0028-9

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder, causing changes in memory, thinking, and other brain functions. A growing number of people are suffering from the disease, and early neuroimaging techniques for AD need to be developed. This review provides a preliminary summary of the various neuroimaging techniques that have been explored for in vivo imaging of AD. Recent advances in magnetic resonance (MR) techniques, such as functional MR imaging (fMRI) and diffusion MRI, give opportunities to display not only anatomy and atrophy of the medial temporal lobe, but also microstructural alterations or perfusion disturbance within the AD lesions. Positron emission tomography (PET) imaging has become the subject of intense research for the diagnosis of AD and the facilitation of drug development in both animal models and human trials due to its non-invasive and translational characteristics. Fluorodeoxyglucose (FDG) PET and amyloid PET are applied in clinics and research departments. Amyloid beta (Aβ) imaging using PET has been recognized as one of the most important methods for the early diagnosis of AD, and numerous candidate compounds have been tested for Aβ imaging. Besides in vivo imaging methods, many ex vivo modalities are being used in AD research. Multiphoton laser scanning microscopy, neuroimaging of metals, and several metal bioimaging methods are also mentioned here. Multimodal and multiparametric neuroimaging techniques should further improve our understanding of brain function and open new insights into the pathophysiology of AD. We expect exciting results to emerge from new neuroimaging applications that will provide scientific and medical benefits.

Keywords: Alzheimer’s disease; Neuroimaging; PET; MRI; Amyloid beta; Multimodal

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder that gradually destroys brain cells, causing changes in memory, thinking, and other brain functions [1]. AD is considered to have a prolonged preclinical stage in which neuropathological changes precede the clinical symptoms [2]. An estimated 35 million people worldwide are living with this disease. If effective treatments are not discovered in a timely fashion, the number of AD cases is anticipated to rise to 113 million by 2050 [3].

Amyloid beta (Aβ) and tau are two of the major biomarkers of AD, and have important and distinct roles in the progression of AD pathophysiology. Jack et al. established hypothetical models of the major biomarkers of AD. By updating and modifying the models, they found that the two major proteinopathies underlying AD biomarker changes, Aβ and tau, may be initiated independently in late-onset AD, and they hypothesize that an incident Aβ pathophysiology can accelerate an antecedent limbic and brainstem tauopathy [4]. MRI was used in a related study, which revealed that the level of Aβ load was associated with a shorter time-to-progression of AD [5]. This warrants an urgent need to develop early neuroimaging techniques for AD neuropathology that can detect and predict the disease before the onset of dementia and monitor therapeutic efficacy in halting or slowing progression at earlier stages of the disease.

There have been various reports on imaging assessments of AD. Some measurements reflect the pathology of AD directly, including positron emission tomography (PET) amyloid imaging and cerebrospinal fluid (CSF) beta-amyloid 42 (Aβ42), while others reflect neuronal injury associated with AD indirectly, including CSF tau (total and phosphorylated tau), fluorodeoxy-d-glucose (FDG)-PET, and MRI. The goal of the AD Neuroimaging Initiative (ADNI) has been to establish the optimal panel of clinical assessments, MRI and PET imaging measures, and other biomarkers from blood and CSF, to inform clinical trial design for AD therapeutic development. At the same time, it has been highly productive in generating a wealth of data for elucidating disease mechanisms occurring during the early stages of preclinical and prodromal AD [6].

A single neuroimaging modality often provides limited information about AD. As a result, multimodal neuroimaging is widely used in neuroscience research, as it overcomes the limitations of individual modalities. Multimodal multiparametric imaging means the combination of different imaging techniques, such as PET and MRI, acquired simultaneously or separately. It enables the visualization and quantitative analysis of alterations in brain structure and function, as in PET/CT and PET/MRI [7]. In this review article, we summarize and discuss the main applications, findings, and perspectives, as well as the advantages and challenges, of different neuroimaging approaches in AD, especially MRI and PET imaging.

2 Magnetic resonance imaging

MRI demonstrates specific volume loss or cortical atrophy patterns with disease progression in AD patients [8–10]. There are several MRI techniques and analysis methods used in clinical and scientific research on AD. Recent advances in MR techniques, such as functional MRI (fMRI) and diffusion MRI, depict not only anatomy and atrophy of the medial temporal lobe (MTL), but also microstructural alterations or perfusion disturbance within this region.

2.1 Functional MRI

Because of cognitive reserve (CR), the severity of AD patients’ brain damage and the corresponding clinical symptoms do not always run in parallel [11, 12]. Recently, resting-state fMRI (RS-fMRI) has become popular for its ability to map brain functional connectivity non-invasively [13]. Using RS-fMRI, Bozzali et al. reported that CR plays a role in modulating the effect of AD pathology on default mode network functional connectivity, accounting for the variable clinical symptoms of AD [14]. Moreover, more highly educated AD patients were able to recruit compensatory neural mechanisms, which can be measured using RS-fMRI. Arterial spin-labeled (ASL) MRI is another functional brain imaging modality, which measures cerebral blood flow (CBF) using magnetically labeled arterial blood water flowing through the carotid and vertebral arteries as an endogenous contrast medium. Several studies have characterized the CBF changes in AD patients using ASL-MRI [15–17].

At some point, sufficient brain damage accumulates to result in cognitive symptoms and impairment. Mild cognitive impairment (MCI) is a condition in which subjects are usually only mildly impaired in memory, with relative preservation of other cognitive domains and functional activities, and do not meet the criteria for dementia [18]; it is also described as the prodromal state of AD [19]. MCI patients are at a higher risk of developing AD, and up to 15 % convert to AD per year [18]. Binnewijzend et al. reported that pseudocontinuous ASL could distinguish both MCI and AD from healthy controls and could be used in the early diagnosis of AD [20]. In a follow-up study, they used quantitative whole-brain pseudocontinuous ASL to compare regional CBF (rCBF) distribution patterns in different types of dementia, and concluded that ASL-MRI could be a non-invasive and easily accessible alternative to FDG-PET imaging in the assessment of CBF in AD patients [21].
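CBF quantification in ASL studies like these is commonly done with a single-compartment kinetic model; the version below follows the widely used ASL consensus recommendations (Alsop et al., 2015). This is a minimal sketch assuming consensus default parameter values; the exact pipelines in the cited studies may differ.

```python
import math

def pcasl_cbf(delta_m, m0, pld=1.8, tau=1.8, t1_blood=1.65,
              alpha=0.85, lam=0.9):
    """CBF in mL/100 g/min from a pCASL control-label difference signal,
    single-compartment model (illustrative sketch, consensus defaults).
    delta_m : control-label difference signal
    m0      : equilibrium magnetization of brain tissue
    pld     : post-labeling delay (s); tau: label duration (s)
    alpha   : labeling efficiency; lam: blood-brain partition coefficient
    """
    return (6000.0 * lam * delta_m * math.exp(pld / t1_blood)) / (
        2.0 * alpha * t1_blood * m0 * (1.0 - math.exp(-tau / t1_blood)))

# A ~1 % difference signal yields a gray-matter-like CBF value.
print(round(pcasl_cbf(0.01, 1.0), 1))
```

The exponential factor corrects for T1 decay of the label during the post-labeling delay; hypoperfused AD regions show a proportionally smaller difference signal and hence lower computed CBF.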

2.2 Structural MRI

Structural MRI (sMRI) has already become a reliable imaging method in the clinical diagnosis of AD, characterized by gray matter reduction and ventricular enlargement in standard T1-weighted sequences [9]. Locus coeruleus (LC) and substantia nigra (SN) degeneration is seen in AD. Using a new quantitative calculation method, Chen et al. presented a quantitative neuromelanin MRI approach for simultaneous measurement of the LC and SN of the brainstem in living human subjects [22]. Their approach demonstrated advantages in image acquisition, pre-processing, and quantitative analysis. Numerous transgenic animal models of amyloidosis are available, which reproduce many neuropathological features of AD progression arising from the deposition of β-amyloid [23]. Braakman et al. demonstrated the dynamics of amyloid plaque formation and development in a serial MRI study in a transgenic mouse model [24]. Increased iron accumulation in gray matter is frequently observed in AD. Because of the paramagnetic nature of iron, MRI shows great potential for investigating iron levels in AD [25]. Quantitative MRI has shown high sensitivity and specificity in mapping cerebral iron deposition, aiding research on AD diagnosis [26].

The imaging patterns are often associated with pathologic changes, such as specific protein markers. Spencer et al. demonstrated the relationship between quantitative T1 and T2 relaxation time changes and three immunohistochemical markers: β-amyloid, neuron-specific nuclear protein (a marker of neuronal cell load), and myelin basic protein (a marker of myelin load) in AD transgenic mice [27].

High-field MRI has been successfully applied to imaging plaques in transgenic mice for over a decade without contrast agents [24, 28–30]. Sillerud et al. devised a method using blood–brain barrier penetrating, amyloid-targeted, superparamagnetic iron oxide nanoparticles (SPIONs) for better imaging of amyloid plaque [31]. They then successfully used this SPION-MRI to assess drug efficacy on the 3D distribution of Aβ plaques in a transgenic AD mouse model [32].

2.3 Diffusion MRI

Diffusion-weighted imaging (DWI) is a sensitive tool that allows quantification of physiologic alterations in water diffusion that result from microscopic structural changes.

Diffusion tensor imaging (DTI) is a well-established and commonly employed diffusion MRI technique in clinical and research neuroimaging studies, based on a Gaussian model of diffusion processes [33]. In general, AD is associated with widespread reduced fractional anisotropy (FA) and increased mean diffusivity (MD) relative to healthy controls in several regions, most prominently in the frontal and temporal lobes, and along the cingulum, corpus callosum, uncinate fasciculus, superior longitudinal fasciculus, and MTL-associated tracts [34–37]. Acosta-Cabronero et al. reported increased axial diffusivity and MD in the splenium, which were the earliest abnormalities in AD [38]. FA and radial diffusivity (DR) differences in the corpus callosum, cingulum, and fornix were found to separate individuals with MCI who converted to AD from non-converters [39]. DTI was also found to be a better predictor of AD-specific MTL atrophy than CSF biomarkers [40]. These findings suggest the potential clinical utility of DTI metrics as early biomarkers of AD and its progression. However, an increase in MD and DR and a decrease in FA with advancing age in selective brain regions have been previously reported [41, 42]. Diffusion MRI can also be used to classify the various stages of AD. A multimodal classification method combining fMRI and DTI separated MCI from healthy controls better than single approaches did [43].
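The FA and MD measures used throughout these studies are simple functions of the three eigenvalues of the fitted diffusion tensor. A minimal sketch of the standard DTI definitions (illustrative, not code from the cited studies):

```python
import math

def dti_metrics(l1, l2, l3):
    """Mean diffusivity (MD) and fractional anisotropy (FA) from the
    three eigenvalues of a diffusion tensor (standard DTI definitions).
    MD is the eigenvalue mean; FA measures deviation from isotropy,
    ranging from 0 (isotropic) to 1 (diffusion along a single axis)."""
    md = (l1 + l2 + l3) / 3.0
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den) if den > 0 else 0.0
    return md, fa

# Isotropic diffusion (e.g., CSF-like): FA approaches 0.
# Strongly directional diffusion (intact white-matter tract): FA near 1.
print(dti_metrics(1.0, 1.0, 1.0))   # MD = 1.0, FA = 0.0
print(dti_metrics(1.7, 0.2, 0.2))   # high FA along a fiber bundle
```

Degeneration of white-matter tracts in AD makes diffusion less directional, which is why the cited studies observe reduced FA together with increased MD.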

In recent years, tau has emerged as a potential target for therapeutic intervention. Tau plays a critical role in the neurodegenerative process by forming neurofibrillary tangles, a major hallmark of AD that correlates with clinical disease progression. Wells et al. applied multiparametric MRI, comprising high-resolution structural MRI (sMRI), a novel chemical exchange saturation transfer (CEST) MRI, DTI, ASL, and glucose CEST, to measure changes of tau pathology in an AD transgenic mouse model [44].

Besides DWI, perfusion-weighted imaging (PWI) is another advanced MR technique, which can measure cerebral hemodynamics at the capillary level. Zimny et al. evaluated the correlation of MTL findings with both DWI and PWI in AD and MCI patients [45].

3 Positron emission tomography

PET is an imaging technique applied in research on brain function and neurochemistry in small animals, medium-sized animals, and human subjects [46–48]. PET imaging has become the subject of intense research for the diagnosis of AD and the facilitation of drug development in both animal models and human trials due to its non-invasive and translational characteristics. PET with various radiotracers is considered a standard non-invasive quantitative imaging technique to measure CBF, glucose metabolism, and β-amyloid and tau deposition.

3.1 FDG-PET

To date, 18F-FDG is one of the best and most widely used PET neuroimaging tracers, employed for research and clinical assessment of AD [49]. Typically, lower FDG metabolism is seen in the precuneus, posterior cingulate, and temporal and parietal cortex, progressing to whole-brain reductions as the disease advances [50, 51]. FDG-PET imaging reflects cerebral glucose metabolism and neuronal injury, providing indirect evidence on cognitive function and progression that cannot be provided by amyloid PET imaging.
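Regional tracer uptake in FDG (and amyloid) PET is commonly summarized as a standardized uptake value ratio (SUVR): mean uptake in a target region divided by mean uptake in a reference region. A minimal sketch; the region names and numbers below are hypothetical, and the cited studies may use different normalization schemes.

```python
def suvr(region_uptake, reference_uptake):
    """Standardized uptake value ratio: mean tracer uptake in a target
    region divided by mean uptake in a reference region (illustrative)."""
    region_mean = sum(region_uptake) / len(region_uptake)
    reference_mean = sum(reference_uptake) / len(reference_uptake)
    return region_mean / reference_mean

# Hypothetical voxel means: posterior cingulate vs. a pons reference.
# An SUVR well below a normative cohort's would suggest hypometabolism.
print(round(suvr([0.82, 0.79, 0.85], [1.10, 1.05, 1.15]), 3))
```

Normalizing to a reference region spared by the disease removes global scaling differences (injected dose, body weight) so that values are comparable across subjects and scans.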

Schraml et al. [52] identified a significant association between the hypometabolic convergence index and phenotypes using ADNI data. Some researchers have also used 18F-FDG-PET to analyze genetic information together with multiple biomarkers to classify AD status, predicting cognitive decline or MCI-to-AD conversion [53–55]. Trzepacz et al. [56] reported a multimodal AD neuroimaging study using MRI, 11C-PiB PET, and 18F-FDG-PET imaging to predict MCI conversion to AD along with APOE genotype. Zhang et al. [57] compared the genetic modality of single-nucleotide polymorphisms (SNPs) with sMRI, 18F-FDG-PET, and CSF biomarkers, which were used to differentiate healthy controls, MCI, and AD. They found FDG-PET to be the best modality in terms of accuracy.

3.2 Amyloid beta PET

Aβ, the primary constituent of senile plaques, and tau tangles are hypothesized to play a primary role in the pathogenesis of AD, but the fundamental mechanisms remain hard to identify [58–60]. Aβ plaque in the brain is one of the pathological hallmarks of AD [61, 62]. Accumulation of Aβ peptide in the cerebral cortex is considered one cause of dementia in AD [63]. Numerous studies have used in vivo PET imaging to assess cortical β-amyloid burden [64–66].

Aβ imaging using PET has been recognized as one of the most important methods for the early diagnosis of AD [67]. Numerous candidate compounds have been tested for Aβ imaging, such as 11C-PiB [68], 18F-FDDNP [69], 11C-SB-13 [70], 18F-BAY94-9172 [71], 18F-AV-45 [72], 18F-flutemetamol [73, 74], 11C-AZD2184 [75], and 18F-ADZ4694 [76], 11C-BF227 and 18F-FACT [77].

Several amyloid PET studies examined genotypes, phenotypes, or gene–gene interactions. Ramanan et al. [78] reported the GWAS results with 18F-AV-45 reflecting the cerebral amyloid metabolism in AD for the first time. Swaminathan et al. [79] revealed the association between plasma Aβ from peripheral blood and cortical amyloid deposition on 11C-PiB. Hohman et al. [80] reported the relationship between SNPs involved in amyloid and tau pathophysiology with 18F-AV-45 PET.

Among the PET tracers, 11C-PiB, which has a high affinity for fibrillar Aβ, is a reliable biomarker of underlying AD pathology [68, 81]; its cortical uptake parallels AD pathology well [82, 83]. Amyloid tracers have also entered clinical use: 18F-florbetapir was approved by the US Food and Drug Administration (FDA) in April 2012 and by the European Medicines Agency in January 2013, and 18F-GE-067 (flutemetamol) and 18F-BAY94-9172 (florbetaben) have also been approved by the US FDA in the last 2 years [84, 85].

18F-Florbetapir (also known as 18F-AV-45) exhibits high affinity specific binding to amyloid plaques. 18F-AV-45 labels Aβ plaques in sections from patients with pathologically confirmed AD [72].

Several research groups have reported that 18F-AV-45 PET imaging provides reliable qualitative and quantitative assessments in AD patients, and that Aβ positivity increased with diagnostic category (healthy control < MCI < AD) [82, 86, 87]. Johnson et al. used 18F-AV-45 PET imaging to evaluate amyloid deposition in both MCI and AD patients qualitatively and quantitatively, and found that amyloid burden increased with diagnostic category (MCI < AD), age, and APOE ε4 carrier status [88]. Payoux et al. reported that equivocal amyloid PET scans using 18F-AV-45 were associated with a specific pattern of clinical signs in a large population of non-demented adults over 70 years old [89].

More and more researchers are combining and comparing multiple PET tracers targeting amyloid plaque imaging. Bruck et al. compared the prognostic ability of 11C-PiB PET, 18F-FDG-PET, and quantitative hippocampal volumes measured with MR imaging in predicting MCI-to-AD conversion, and found that FDG-PET and 11C-PiB PET imaging were the better predictors [90]. Hatashita et al. used 11C-PiB and FDG-PET imaging to identify MCI due to AD; 11C-PiB showed a high sensitivity of 96.6 %, and FDG-PET added diagnostic value in predicting AD over a short period [91].

In addition, new Aβ imaging agents have been radiosynthesized. Yousefi et al. radiosynthesized a new Aβ imaging agent, 18F-FIBT, and compared three different Aβ-targeted radiopharmaceuticals for PET imaging: 18F-FIBT, 18F-florbetaben, and 11C-PiB [92]. 11C-AZD2184 is another new PET tracer developed for imaging amyloid senile plaques; its kinetic behavior is suitable for quantitative analysis, and it can be used in clinical examination without an input function [75, 93, 94].

4 Multimodality imaging: PET/MRI

Several diagnostic techniques, including MRI and PET, are employed for the diagnosis and monitoring of AD [95]. Multimodal imaging can provide more information on the formation and key molecular events of AD than any single method. It is driving progress in neuroimaging research, owing to the recognition of the clinical benefits of multimodal data [96] and to better access to hybrid devices such as PET/MRI [97].

Maier et al. evaluated the dynamics of 11C-PiB PET, 15O-H2O-PET, and ASL-MRI in transgenic AD mice and concluded that the AD-related decline of rCBF was caused by cerebral Aβ angiopathy [98]. Edison et al. systematically compared 11C-PiB PET and MRI in AD patients, MCI patients, and controls, concluding that 11C-PiB PET was adequate for clinical diagnostic purposes, while MRI remained more appropriate for clinical research [99]. Zhou et al. investigated the interactions between multimodal PET/MRI in elderly patients with MCI, AD, and healthy controls, and confirmed the invaluable application of amyloid PET and MRI in the early diagnosis of AD [100]. Kim et al. reported that Aβ-weighted cortical thickness, which incorporates data from both MRI and amyloid PET imaging, is a consistent and objective imaging biomarker in AD [101].

5 Other imaging modalities

Multiphoton non-linear optical microscope imaging systems using ultrafast lasers have powerful advantages such as label-free detection, deep penetration of thick samples, high sensitivity, subcellular spatial resolution, 3D optical sectioning, chemical specificity, and minimal sample destruction [102, 103]. Coherent anti-Stokes Raman scattering (CARS), two-photon excited fluorescence (TPEF), and second-harmonic generation (SHG) microscopy are the most widely used biomedical imaging techniques [104–106].

Quantitative electroencephalographic and neuropsychological investigation of an alternative measure of frontal lobe executive functions: the Figure Trail Making Test

Paul S. Foster, Valeria Drago, Brad J. Ferguson, Patti Kelly Harrison, David W. Harrison

Brain Informatics    http://dx.doi.org/10.1007/s40708-015-0025-z    http://link.springer.com/article/10.1007/s40708-015-0025-z/fulltext.html

The most frequently used measures of executive functioning are sensitive either to left frontal lobe functioning or to bilateral frontal functioning. Relatively little is known about right frontal lobe contributions to executive functioning, given the paucity of measures sensitive to right frontal functioning. The present investigation reports the development and initial validation of a new measure designed to be sensitive to right frontal lobe functioning, the Figure Trail Making Test (FTMT). The FTMT, the classic Trail Making Test (TMT), and the Ruff Figural Fluency Test (RFFT) were administered to 42 right-handed men. The results indicated a significant relationship between the FTMT and both the TMT and the RFFT. Performance on the FTMT was also related to high beta EEG over the right frontal lobe. Thus, the FTMT appears to be an equivalent measure of executive functioning that may be sensitive to right frontal lobe functioning. Applications for use in frontotemporal dementia, Alzheimer’s disease, and other patient populations are discussed.

Keywords – Frontal lobes, Executive functioning, Trail making test, Sequencing, Behavioral speed, Designs, Nonverbal, Neuropsychological assessment, Regulatory control, Effortful control

A recent survey indicated that the vast majority of neuropsychologists frequently assess executive functioning as part of their neuropsychological evaluations [1]. Surveys of neuropsychologists have indicated that the Trail Making Test (TMT), Controlled Oral Word Association Test (COWAT), Wisconsin Card Sorting Test (WCST), and the Stroop Color-Word Test (SCWT) are among the most commonly used instruments [1, 2]. Further, the Rabin et al. [1] survey indicated that these same tests are among those most frequently used by neuropsychologists when specifically assessing executive or frontal lobe functioning. The frequent use of the TMT, WCST, and the SCWT, as well as the assumption that they are measures of executive functioning, led Demakis (2003–2004) to conduct a series of meta-analyses to determine the sensitivity of these tests in detecting frontal lobe dysfunction, particularly lateralized frontal lobe dysfunction. The findings indicated that the SCWT and Part A of the TMT [3], as well as the WCST [4], were all sensitive to frontal lobe dysfunction. However, only the SCWT differentiated between left and right frontal lobe dysfunction, with the worst performance among those with left frontal lobe dysfunction [3].

The finding of the Demakis [4] meta-analysis, that the WCST was not sensitive to lateralized frontal lobe dysfunction, is not surprising given the equivocal findings that have been reported. Whereas performance on the WCST is sensitive to frontal lobe dysfunction [5, 6], demonstration of lateralized frontal dysfunction has been quite problematic. Unilateral left or right dorsolateral frontal dysfunction has been associated with impaired performance on the WCST [6]. Fallgatter and Strik [7] found bilateral frontal lobe activation during performance of the WCST. However, other imaging studies have found right lateralized frontal lobe activation [8] and left lateralized frontal activation [9] in response to performance on the WCST. Further, left frontal lobe alpha power is negatively correlated with performance on the WCST [10]. Finally, patients with left frontal lobe tumors exhibit more impaired performance on the WCST than those with right frontal tumors [11].

Unlike the data for the WCST, more consistent findings have been reported regarding lateralized frontal lobe functioning for the other commonly used measures of executive functioning. For instance, as with the Demakis [3] study, many investigations have found the SCWT to be sensitive to left frontal lobe functioning, although the precise localization within the left frontal lobe has varied. Impaired performance on the SCWT results from left frontal lesions [12] and specifically from lesions localized to the left dorsolateral frontal lobe [13, 14], though bilateral frontal lesions have also yielded impaired performance [13, 14]. Further, studies using neuroimaging to investigate the neural basis of performance on the SCWT have indicated involvement of the left anterior cingulate cortex [15], left lateral prefrontal cortex [16], left inferior precentral sulcus [17], and the left dorsolateral frontal lobe [18].

Wide agreement exists among investigations of the frontal lateralization of verbal or lexical fluency to confrontation. Specifically, patients with left frontal lobe lesions are known to exhibit impaired performance on lexical fluency to confrontation tasks, relative to either patients with right frontal lesions [12, 19, 20] or controls [21]. A recent meta-analysis also indicated that the largest deficits in performance on measures of lexical fluency are associated with left frontal lobe lesions [22]. Troster et al. [23] found that, relative to patients with right pallidotomy, patients with left pallidotomy exhibited more impaired lexical fluency. Several neuroimaging investigations have further supported the role of the left frontal lobe in lexical fluency tasks [15, 24–27]. Performance on lexical fluency tasks also varies as a function of lateral frontal lobe asymmetry, as assessed by electroencephalography [28].

The Trail Making Test is certainly among the most widely used tests [1] and perhaps the most widely researched. Various norms exist for the TMT (see [29]), with Tombaugh [30] providing the most recent comprehensive set of normative data. Different methods of analyzing and interpreting the data have also been proposed and used, including error analysis [13, 14, 31–33], subtraction scores [13, 14, 34], and ratio scores [13, 14, 35].

Several different language versions of the test have been developed and reported, including Arabic [36], Chinese [37, 38], Greek [39], and Hebrew [40]. Numerous alternative versions of the TMT have been developed to address perceived shortcomings of the original TMT. For instance, the Symbol Trail Making Test [41] was developed to reduce the cultural confounds associated with the use of the Arabic numeral system and English alphabet in the original TMT. The Color Trails Test (CTT; [42]) was also developed to control for cultural confounds, although mixed results have been reported regarding whether the CTT is indeed analogous to the TMT [43–45]. A version of the TMT for preschool children, the TRAILS-P, has also been reported [46].

Additionally, the Comprehensive Trail Making Test [47] was developed to control for perceived psychometric shortcomings of the original TMT (for a review, see [48]), and the Oral Trail Making Test (OTMT; [49]) was developed to reduce confounds associated with motor speed and visual search abilities, with research supporting the OTMT as an equivalent measure [50, 51]. Alternate forms of the TMT have also been developed to permit successive administrations [32, 52] and to assess the relative contributions of the requisite cognitive skills [53].

Delis et al. [54] stated that the continued development of new instrumentation for improving diagnosis and treatment is a critical undertaking in all health-related fields. Further, in their view, the field of neuropsychology has recognized the importance of continually striving to develop new clinical measures. Delis and colleagues developed the extensive Delis-Kaplan Executive Functioning System (D-KEFS; [55]) in the spirit of advancing the instrumentation of neuropsychology. The D-KEFS includes a Trail Making Test consisting of five separate conditions. The Number-Letter Switching condition involves a sequencing procedure similar to that of the classic TMT. The other four conditions are designed to assess the component processes involved in completing the Number-Letter Switching condition so that a precise analysis of the nature of any underlying dysfunction may be accomplished. Specifically, these additional components include Visual Scanning, Number Sequencing, Letter Sequencing, and Motor Speed.

Given that the TMT comprises numbers and letters and is a measure of executive functioning, it may preferentially involve the left frontal lobe. Although the literature is somewhat controversial, neuropsychological and neuroimaging studies seem to provide support for the sensitivity of the TMT to detect left frontal dysfunction [56]. Recent clinically oriented studies investigating frontal lobe involvement of the TMT using transcranial magnetic stimulation (TMS) and near-infrared spectroscopy (NIRS) also support this localization [57]. Performance on Part B of the TMT improved following repetitive TMS applied to the left dorsolateral frontal lobe [57].

Weber et al. [58] found that 9–13-year-old boys performing TMT Part B showed a left-lateralized prefrontal increase in deoxygenated hemoglobin, an indicator of increased oxygen consumption. Moll et al. [59] demonstrated increased activation specific to the prefrontal cortex, especially the left prefrontal region, in healthy controls performing Part B of the TMT. Foster et al. [60] found a significant positive correlation between performance on Part A of the TMT and low beta (13–21 Hz) magnitude (μV) at the left lateral frontal lobe, but not at the right lateral frontal lobe. Finally, Stuss et al. [13, 14] found that patients with left dorsolateral frontal dysfunction evidenced more errors than patients with lesions in other areas of the frontal lobes, and that patients with left frontal lesions were the slowest to complete the test.

Taken together, the possibility exists that the aforementioned tests are largely associated with left frontal lobe activity and that the TMT, in particular, provides information concerning mental processing speed as well as cognitive flexibility and set-shifting. While some studies have found that deficits in visuomotor set-shifting are specific to frontal lobe damage [61], other investigators have reported such impairment in patients with posterior brain lesions and widespread cerebral dysfunction, including cerebellar damage [62] and Alzheimer disease [63]. Thus, it remains unclear whether impairments in visuomotor set-shifting are specific to frontal lobe dysfunction or whether they are non-specific and can result from more posterior or widespread brain dysfunction.

Compared to the collective knowledge we have regarding the cognitive roles of the left frontal lobe, relatively little is known about right frontal lobe contributions to executive functioning. This is likely a result of the dearth of tests that are associated with right frontal activity. The Ruff Figural Fluency Test (RFFT; [64]) is among the few standardized tests of right frontal lobe functioning and was listed as the 14th most commonly used instrument to assess executive functioning in the Rabin et al. [1] survey. The RFFT is known to be sensitive to right frontal lobe functioning ([65, 66]; see also [67], pp. 297–298), as is a measure based on the RFFT [19].

The present investigation, with the same intent and spirit as that reported by Delis et al. [54], sought to develop and initially validate a measure of right frontal lobe functioning in an effort to attain a greater understanding of right frontal contributions to executive functioning and to advance the instrumentation of neuropsychology. To meet this objective, a version of the Trail Making Test comprising figures, as opposed to numbers and letters, was developed. The TMT was used as a model for the new test, referred to as the Figure Trail Making Test (FTMT), due to the high frequency of use, the volume of research conducted, and the ease of administration of the TMT. Given that the TMT and the FTMT are both measuring executive functioning, we felt that a moderate correlation would exist between these two measures. Specifically, we hypothesized that performance on the FTMT would be positively correlated with performance on the TMT, in terms of the total time required to complete each part of the tests, an additive and subtractive score, and a ratio score. The total time required to complete each part of the FTMT was also hypothesized to be negatively correlated with the total number of unique designs produced on the RFFT and positively correlated with the number of perseverative errors committed on the RFFT and the perseverative error ratio. We also sought to determine whether the TMT and the FTMT were measuring different constructs by conducting a factor analysis, anticipating that the two tests would load on separate factors.
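The correlational hypotheses above can be sketched in a few lines of analysis code. This is an illustration only: the data below are synthetic, and the variable names (tmt_b, ftmt_d) are assumptions introduced here, not the study's dataset.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 42                                        # sample size matching the study
tmt_b = rng.normal(75, 15, n)                 # hypothetical TMT Part B times (s)
ftmt_d = 0.8 * tmt_b + rng.normal(0, 10, n)   # hypothetical, correlated FTMT Part D times

# Pearson correlation between the two completion-time measures
r = np.corrcoef(tmt_b, ftmt_d)[0, 1]
print(f"r = {r:.2f}")
```

A positive r of this kind is what the authors' prediction of a moderate FTMT-TMT correlation amounts to; the study's actual coefficients are reported in its results section, not here.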

Additionally, we sought to obtain neurophysiological evidence that the FTMT is sensitive to right frontal lobe functioning. Specifically, we used quantitative electroencephalography (QEEG) to measure electrical activity over the left and right frontal lobes. A previous investigation we conducted found that performance on Part A of the TMT was related to left frontal lobe (F7) low beta magnitude [60]. For the present investigation, we predicted that significant negative correlations would exist between performance on Parts A and B of the TMT and both low and high beta magnitude at the F7 electrode site. We further predicted that significant negative correlations would exist between performance on Parts C and D of the FTMT and both low and high beta magnitude at the F8 electrode site.

3 Discussion

The need for additional measures of executive functions, especially instruments that may carry implications for cerebral laterality, is clear. In particular, a void remains for neuropsychological instruments in a TMT format that provide information about the functional integrity of the right frontal region. Consistent with the hypotheses advanced, significant correlations were found between performance on the TMT and the FTMT, in terms of the raw time required to complete each respective part of the tests as well as the additive and subtraction scores. The fact that the ratio scores were not significantly correlated is not surprising, given that research has generally indicated a lack of clinical utility for this score [13, 14, 35]. Given the present findings, the TMT and the FTMT appear to be equivalent measures of executive functioning. Further, the present findings not only suggest that the FTMT may be a measure of executive functioning but also extend the realm of executive functioning to the sequencing and set-shifting of nonverbal stimuli.

However, the finding of significant correlations between the TMT and the FTMT represents somewhat of a caveat, in that the TMT has been found to be sensitive to left frontal lobe functioning [13, 14, 57, 59]. This would seem to suggest the possibility that the FTMT is also sensitive to left frontal lobe functioning. That possibility is tempered, though, by the fact that many of the hypothesized correlations between performance on the RFFT and the FTMT were also significant. Performance on the RFFT is related to right frontal lobe functioning [65, 66]. Thus, the significant correlations between the RFFT and the FTMT suggest that the FTMT may also be sensitive to right frontal lobe functioning. Additionally, it should be noted that the TMT was not significantly correlated with performance on the RFFT, with the exception of the significant correlation between performance on TMT Part A and the total number of unique designs produced on the RFFT. Taken together, the results suggest that the FTMT may be a measure of right frontal executive functioning.

Additional support for the sensitivity of the FTMT to right frontal lobe functioning is provided by the finding of a significant negative correlation between performance on Part D of the FTMT and high beta magnitude. We have previously used QEEG to provide neurophysiological validation of the RFFT [65] and the Rey Auditory Verbal Learning Test [70] and the present findings provide further support for the use of QEEG in validating neuropsychological tests. The lack of significant correlations between the TMT and either low or high beta magnitude may be related to a restricted range of scores on the TMT. As a whole, performance on the FTMT was more variable than performance on the TMT and this relatively restricted range for the TMT may have impacted the obtained correlations. Given the present findings, together with those of the Foster et al. [65, 70] investigations, further support is also provided for the use of EEG in establishing neurophysiological validation for neuropsychological tests.

The results from the factor analysis support the contention that the FTMT may be a measure of right frontal lobe activity and also provide initial discriminant validity for the FTMT. Specifically, Parts C and D of the FTMT were found to load on the same factor as the number of designs generated on the RFFT, although the time required to complete Part A of the TMT also loaded on this factor. Additionally, the number of errors committed on Parts C and D of the FTMT comprised a single factor, separate from either the TMT or the RFFT. Although these results support the FTMT as a measure of nonverbal executive functioning, it would be helpful to conduct an additional factor analysis including further measures of right frontal functioning, and perhaps other measures of right hemisphere functioning, as marker variables.
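The factor-analytic step can be illustrated by principal-component factoring of a correlation matrix: loadings are eigenvectors scaled by the square roots of their eigenvalues. The 4 × 4 matrix below is invented for the sketch; it is not the study's correlation matrix.

```python
import numpy as np

# Hypothetical correlations among four scores:
# [TMT-A, TMT-B, FTMT-C, FTMT-D]
R = np.array([
    [1.0, 0.6, 0.3, 0.2],
    [0.6, 1.0, 0.2, 0.3],
    [0.3, 0.2, 1.0, 0.7],
    [0.2, 0.3, 0.7, 1.0],
])

eigval, eigvec = np.linalg.eigh(R)        # eigenvalues in ascending order
order = np.argsort(eigval)[::-1]          # re-sort descending
eigval, eigvec = eigval[order], eigvec[:, order]

# Loadings on the first two factors: eigenvector * sqrt(eigenvalue)
loadings = eigvec[:, :2] * np.sqrt(eigval[:2])
print(np.round(loadings, 2))
```

With a matrix like this, the TMT variables load together on one factor and the FTMT variables on another, which is the pattern of separate factors the discriminant-validity argument relies on.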

We sought to develop a measure sensitive to right frontal lobe functioning due to the paucity of such tests and the potentially important uses that right frontal lobe tests may have clinically. Tests of right frontal lobe functioning may, for instance, be useful in identifying and distinguishing left versus right frontotemporal dementia (FTD). Research has indicated that FTD is associated with cerebral atrophy at the right dorsolateral frontal and left premotor cortices [71]. Fukui and Kertesz [72] found right frontal lobe volume reduction in FTD relative to Alzheimer’s disease and progressive nonfluent aphasia. Some have suggested that FTD should not be considered as a unitary disorder and that neuropsychological testing may aid in differentially diagnosing left versus right FTD [73].

Whereas right FTD has been associated with more errors and perseverative responses on the Wisconsin Card Sorting Test (WCST), left FTD has been associated with significantly worse performance on the Boston Naming Test (BNT) and the Stroop Color-Word test [73]. Razani et al. [74] also distinguished between left and right FTD in finding that left FTD performed worse on the BNT and the right FTD patients performed worse on the WCST. However, as noted earlier, the WCST has been associated with left frontal activity [9], right frontal activation [8], and bilateral frontal activation [7]. Further, patients with left frontal tumors perform worse than those with right frontal tumors [11].

Patients with FTD that predominantly involves the right frontotemporal region show behavioral and emotional abnormalities, whereas those with predominantly left frontotemporal damage show a loss of lexical-semantic knowledge. Patients in whom neural degeneration begins on the left side often present to clinicians at an early stage of the disease because of language abnormalities, while their emotion processing abilities remain intact, the right anterior temporal lobe being preserved. As the disease advances, however, it may progress to the right frontotemporal regions. Tests sensitive to right frontal lobe functioning may be useful tools for anticipating the course of the disease, enabling prompt and specific treatment and informing caregivers about its likely trajectory.

A potentially more important use of tests sensitive to right frontal lobe functioning, though, may be in predicting which dementia patients will develop significant and disruptive behavioral deficits. Research has found that approximately 92% of right-sided FTD patients exhibit socially undesirable behaviors as their initial symptom, compared to only 11% of left-sided FTD patients [75]. Behavioral deficits in FTD are associated with gray matter loss in the dorsomedial frontal region, particularly on the right [76].

Alzheimer’s disease (AD) is also often associated with significant behavioral disturbances. Even AD patients with mild dementia are noted to exhibit behavioral deficits such as delusions, hallucinations, agitation, dysphoria, anxiety, apathy, and irritability [77]. Indeed, Shimabukuro et al. [77] found that regardless of dementia severity, over half of all AD patients exhibited apathy, delusions, irritability, dysphoria, and anxiety. Delusions in AD patients are associated with relative right frontal hypoperfusion as indicated by SPECT imaging [78, 79]. Further, positron emission tomography (PET) has indicated that AD patients with delusions show hypometabolism at the right superior dorsolateral frontal region and right inferior frontal pole [80].

Although research clearly implicates right frontal lobe dysfunction in the expression of behavioral deficits, data from neuropsychological testing are not as clear. Negative symptoms in patients with AD and FTD have been related to measures of nonverbal and verbal executive functioning as well as verbal memory [81]. Positive symptoms, in contrast, were related to constructional skills and attention. However, Staff et al. [78] failed to dissociate patients with delusions from those without delusions based on neuropsychological test performance, despite significant differences in right frontal and limbic functioning revealed by functional imaging. The inclusion of other measures of right frontal lobe functioning may result in improved neuropsychological differentiation of dementia patients with and without significant behavioral disturbances. Further, with improved measures of right frontal functioning, it may be possible to predict early in the disease process which patients will ultimately develop behavioral disturbances. Predicting who may develop behavioral problems will permit earlier treatment and will give families more time to prepare for the potential emergence of such difficulties. Certainly, future research needs to incorporate measures of right and left frontal lobe functioning in regression analyses to determine the plausibility of such prediction.

Tests sensitive to right frontal lobe functioning may also be useful in identifying more subtle right frontal lobe dysfunction and the cognitive and behavioral changes that follow. The right frontal lobe mediates language melody or prosody, forms cohesive discourse, interprets abstract communication in spoken and written language, and interprets the inferred relationships involved in communication. Subtle difficulties in interpreting abstract meaning in communication, comprehending metaphors, and even understanding jokes, which are often seen in right frontal lobe stroke patients, may not be detected by the family and may also be underdiagnosed by clinicians [82]. Further, patients with right frontal lobe lesions are generally more euphoric and unconcerned, often minimizing their symptoms [82] or denying the illness, which may delay referral to a clinician and diagnosis.

Attention deficit hyperactivity disorder (ADHD) is a neurological disorder characterized by deficits in motor inhibition, problems with cognitive flexibility, social disruption, and emotional disinhibition [83, 84]. Functional MRI studies reveal reduced right prefrontal activation during “frontal tasks,” such as go/no-go [85], Stroop [86], and attention task performance [87]. The right frontal lobe deficit hypothesis is further supported by structural studies [88, 89]. Tests of right frontal lobe functioning may be useful in further characterizing the nature of this deficit and in specifying the likely hemispheric locus of dysfunction.

To summarize, we feel that right frontal lobe functioning has been relatively neglected in neuropsychological assessment and that many uses for such tests exist. Our intent was to develop a test purportedly sensitive to right frontal functioning that would be easy and quick to administer in a clinical setting. However, we are certainly not meaning to assert that our FTMT would be applicable in all the aforementioned conditions. Additional research should be conducted to determine the precise clinical utility of the FTMT.

Further validation of the FTMT should also be undertaken. Establishing convergent validation may involve correlating tests measuring the same domain, such as executive functioning. This was initially accomplished in the present investigation through the significant correlations between the TMT and the FTMT. Additionally, convergent validation may also involve correlating tests that purportedly measure the same region of the brain. This was also initially accomplished in the present investigation through the significant correlations between the FTMT and the RFFT. However, additional convergent validation certainly needs to be obtained, as well as validation using patient populations and neurophysiological validation.

We are currently collecting data that hopefully will provide neurophysiological validation of the FTMT. Certainly, though, it is hoped that the present investigation will not only stimulate further research seeking to validate the FTMT and provide more comprehensive normative data, but also stimulate research investigating whether the FTMT or other measures of right frontal lobe functioning may be used to predict patients that will develop behavioral disturbances.

World’s Greatest Literature Reveals Multifractals, Cascades of Consciousness

http://www.scientificcomputing.com/news/2016/01/worlds-greatest-literature-reveals-multifractals-cascades-consciousness

http://www.scientificcomputing.com/sites/scientificcomputing.com/files/Worlds_Greatest_Literature_Reveals_Multifractals_Cascades_of_Consciousness_440.jpg

Multifractal analysis of Finnegans Wake by James Joyce. The ideal shape of the graph is virtually indistinguishable from the results for purely mathematical multifractals. The horizontal axis represents the degree of singularity, and the vertical axis shows the spectrum of singularity. Courtesy of IFJ PAN

Arthur Conan Doyle, Charles Dickens, James Joyce, William Shakespeare and JRR Tolkien. Regardless of the language they were working in, some of the world’s greatest writers appear to be, in some respects, constructing fractals. Statistical analysis, however, revealed something even more intriguing. The composition of works from within a particular genre was characterized by the exceptional dynamics of a cascading (avalanche) narrative structure. This type of narrative turns out to be multifractal. That is, fractals of fractals are created.

As far as many bookworms are concerned, advanced equations and graphs are the last things which would hold their interest, but there’s no escape from the math. Physicists from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ) in Cracow, Poland, performed a detailed statistical analysis of more than one hundred famous works of world literature, written in several languages and representing various literary genres. The books, tested for revealing correlations in variations of sentence length, proved to be governed by the dynamics of a cascade. This means that the construction of these books is, in fact, a fractal. In the case of several works, their mathematical complexity proved to be exceptional, comparable to the structure of complex mathematical objects considered to be multifractal. Interestingly, in the analyzed pool of all the works, one genre turned out to be exceptionally multifractal in nature.

Fractals are self-similar mathematical objects: when we expand one fragment or another, what emerges is a structure resembling the original object. Typical fractals, notably the widely known Sierpinski triangle and the Mandelbrot set, are monofractals, meaning that the pace of enlargement is the same everywhere and linear: if a fragment rescaled x times reveals a structure similar to the original, the same magnification applied anywhere else reveals a similar structure as well.

Multifractals are more highly advanced mathematical structures: fractals of fractals. They arise from fractals ‘interwoven’ with each other in an appropriate manner and in appropriate proportions. Multifractals are not simply the sum of fractals and cannot be divided to return back to their original components, because the way they weave is fractal in nature. The result is that, in order to see a structure similar to the original, different portions of a multifractal need to expand at different rates. A multifractal is, therefore, non-linear in nature.
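A concrete, verifiable instance of the monofractal scaling described above is the Sierpinski triangle. On a 2^n by 2^n grid, a cell (x, y) belongs to the pattern exactly when x & y == 0, so the occupied-cell count grows as 3^n and the box-counting slope gives the dimension log 3 / log 2 ≈ 1.585. This is a standard construction, not code from the study:

```python
import math

def occupied_cells(n):
    """Count filled cells of the Sierpinski pattern on a 2^n x 2^n grid."""
    size = 2 ** n
    return sum(1 for x in range(size) for y in range(size) if x & y == 0)

counts = [occupied_cells(n) for n in range(1, 7)]
print(counts)                      # [3, 9, 27, 81, 243, 729]

# Slope of log(count) vs. log(grid size) estimates the fractal dimension.
dim = math.log(counts[-1] / counts[0]) / math.log(2 ** 6 / 2 ** 1)
print(f"dimension = {dim:.3f}")    # 1.585
```

Because the scaling exponent is the same at every location and scale, a single number characterizes the object; a multifractal instead needs a whole spectrum of such exponents.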

“Analyses on multiple scales, carried out using fractals, allow us to neatly grasp information on correlations among data at various levels of complexity of tested systems. As a result, they point to the hierarchical organization of phenomena and structures found in nature. So, we can expect natural language, which represents a major evolutionary leap of the natural world, to show such correlations as well. Their existence in literary works, however, had not yet been convincingly documented. Meanwhile, it turned out that, when you look at these works from the proper perspective, these correlations appear to be not only common, but in some works they take on a particularly sophisticated mathematical complexity,” says Professor Stanislaw Drozdz, IFJ PAN, Cracow University of Technology.

The study involved 113 literary works written in English, French, German, Italian, Polish, Russian and Spanish by such famous figures as Honore de Balzac, Arthur Conan Doyle, Julio Cortazar, Charles Dickens, Fyodor Dostoevsky, Alexandre Dumas, Umberto Eco, George Eliot, Victor Hugo, James Joyce, Thomas Mann, Marcel Proust, Wladyslaw Reymont, William Shakespeare, Henryk Sienkiewicz, J.R.R. Tolkien, Leo Tolstoy and Virginia Woolf, among others. Each selected work was at least 5,000 sentences long, to ensure statistical reliability.

To convert the texts to numerical sequences, sentence length was measured by the number of words (an alternative method, counting characters, turned out to have no major impact on the conclusions). Dependences were then searched for in the data, beginning with the simplest, i.e. linear, kind: if a sentence of a given length is x times longer than sentences of another length, is the same ratio preserved when looking at correspondingly longer or shorter sentences?
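The first step of such an analysis, mapping a text to a series of sentence lengths, can be sketched in a few lines. This is a simplification for illustration only; the naive split rule and the example text are not the study's actual procedure:

```python
import re

def sentence_lengths(text):
    """Split text into sentences and return each sentence's length in words.

    A minimal sketch: real sentence segmentation must handle abbreviations,
    quotations, etc., but this captures the idea of turning a text into a
    numerical series suitable for fractal analysis.
    """
    # Naive split on sentence-ending punctuation.
    sentences = [s.strip() for s in re.split(r'[.!?]+', text) if s.strip()]
    return [len(s.split()) for s in sentences]

series = sentence_lengths(
    "Call me Ishmael. Some years ago, never mind how long precisely, "
    "I went to sea. It was a dark time!"
)
print(series)  # [3, 12, 5]
```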

“All of the examined works showed self-similarity in the organization of sentence lengths. In some it was more pronounced, with The Ambassadors by Henry James standing out, while in others it was far weaker, as in the case of the French seventeenth-century romance Artamene ou le Grand Cyrus. In every case, however, the correlations were evident, and these texts therefore have a fractal construction,” comments Dr. Pawel Oswiecimka (IFJ PAN), who also notes that the fractality of a literary text will, in practice, never be as perfect as in the world of mathematics: a mathematical fractal can be magnified to infinity, while the number of sentences in a book is finite, so at some stage of scaling there will always be a cut-off in the form of the end of the dataset.
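The kind of self-similarity measured here can be illustrated with a simplified detrended fluctuation analysis (DFA), which estimates a single scaling exponent from a series. The published study used the more elaborate multifractal variant (MFDFA), so this is only a sketch of the underlying idea, with window sizes chosen arbitrarily:

```python
import numpy as np

def dfa_exponent(x, scales=(4, 8, 16, 32)):
    """Estimate a self-similarity (Hurst-like) exponent via detrended
    fluctuation analysis. A monofractal sketch of the multiscale analysis
    applied to sentence-length series in the study."""
    x = np.asarray(x, dtype=float)
    profile = np.cumsum(x - x.mean())      # integrated (profile) series
    flucts = []
    for s in scales:
        n = len(profile) // s              # number of full windows of size s
        segs = profile[:n * s].reshape(n, s)
        t = np.arange(s)
        # Remove a linear trend from each window; keep the mean squared residual.
        f2 = [np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2)
              for seg in segs]
        flucts.append(np.sqrt(np.mean(f2)))
    # Scaling exponent = slope of log F(s) versus log s.
    slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return slope

rng = np.random.default_rng(0)
h = dfa_exponent(rng.normal(size=2000))
print(round(h, 2))   # close to 0.5 for uncorrelated noise
```

Long-range correlated series yield exponents above 0.5; a multifractal analysis would repeat this with a whole family of fluctuation orders instead of a single RMS.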

Things took a particularly interesting turn when physicists from IFJ PAN began tracking non-linear dependence, which in most of the studied works was present to a slight or moderate degree. However, more than a dozen works revealed a very clear multifractal structure, and almost all of these proved to be representative of one genre, that of stream of consciousness. The only exception was the Bible, specifically the Old Testament, which has, so far, never been associated with this literary genre.

“The absolute record in terms of multifractality turned out to be Finnegans Wake by James Joyce. The results of our analysis of this text are virtually indistinguishable from ideal, purely mathematical multifractals,” says Drozdz.

The most multifractal works also included A Heartbreaking Work of Staggering Genius by Dave Eggers, Rayuela by Julio Cortazar, the U.S.A. trilogy by John Dos Passos, The Waves by Virginia Woolf, 2666 by Roberto Bolano, and Joyce’s Ulysses. At the same time, many works usually regarded as stream of consciousness turned out to show little multifractality; it was hardly noticeable in books such as Atlas Shrugged by Ayn Rand and A la recherche du temps perdu by Marcel Proust.

“It is not entirely clear whether stream of consciousness writing actually reveals the deeper qualities of our consciousness, or rather the imagination of the writers. It is hardly surprising that ascribing a work to a particular genre is, for whatever reason, sometimes subjective. We see, moreover, the possibility of an interesting application of our methodology: it may someday help in a more objective assignment of books to one genre or another,” notes Drozdz.

Multifractal analyses of literary texts carried out by IFJ PAN have been published in Information Sciences, a computer science journal. The publication underwent rigorous review: given the interdisciplinary nature of the subject, the editors appointed as many as six reviewers.

Citation: “Quantifying origin and character of long-range correlations in narrative texts” S. Drożdż, P. Oświęcimka, A. Kulig, J. Kwapień, K. Bazarnik, I. Grabska-Gradzińska, J. Rybicki, M. Stanuszek; Information Sciences, vol. 331, 32–44, 20 February 2016; DOI: 10.1016/j.ins.2015.10.023

New Quantum Approach to Big Data could make Impossibly Complex Problems Solvable

David L. Chandler, MIT

http://www.scientificcomputing.com/news/2016/01/new-quantum-approach-big-data-could-make-impossibly-complex-problems-solvable

http://www.scientificcomputing.com/sites/scientificcomputing.com/files/New_Quantum_Approach_to_Big_Data_could_make_Impossibly_Complex_Problems_Solvable_440.jpg

This diagram demonstrates the simplified results that can be obtained by using quantum analysis on enormous, complex sets of data. Shown here are the connections between different regions of the brain in a control subject (left) and a subject under the influence of the psychedelic compound psilocybin (right). This demonstrates a dramatic increase in connectivity, which explains some of the drug’s effects (such as “hearing” colors or “seeing” smells). Such an analysis, involving billions of brain cells, would be too complex for conventional techniques, but could be handled easily by the new quantum approach, the researchers say. Courtesy of the researchers

From gene mapping to space exploration, humanity continues to generate ever-larger sets of data — far more information than people can actually process, manage or understand.

Machine learning systems can help researchers deal with this ever-growing flood of information. Some of the most powerful of these analytical tools are based on a strange branch of geometry called topology, which deals with properties that stay the same even when something is bent and stretched every which way.

Such topological systems are especially useful for analyzing the connections in complex networks, such as the internal wiring of the brain, the U.S. power grid, or the global interconnections of the Internet. But even with the most powerful modern supercomputers, such problems remain daunting and impractical to solve. Now, a new approach that would use quantum computers to streamline these problems has been developed by researchers at MIT, the University of Waterloo, and the University of Southern California.

The team describes their theoretical proposal this week in the journal Nature Communications. Seth Lloyd, the paper’s lead author and the Nam P. Suh Professor of Mechanical Engineering, explains that algebraic topology is key to the new method. This approach, he says, helps to reduce the impact of the inevitable distortions that arise every time someone collects data about the real world.

In a topological description, basic features of the data (How many holes does it have? How are the different parts connected?) are considered the same no matter how much they are stretched, compressed, or distorted. Lloyd explains that it is often these fundamental topological attributes “that are important in trying to reconstruct the underlying patterns in the real world that the data are supposed to represent.”

It doesn’t matter what kind of dataset is being analyzed, he says. The topological approach to looking for connections and holes “works whether it’s an actual physical hole, or the data represents a logical argument and there’s a hole in the argument. This will find both kinds of holes.”
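The simplest of these topological features is the count of connected components (the zeroth Betti number), which a classical program can compute directly for small point sets. This toy sketch (the point set and distance threshold are invented for illustration) shows the kind of feature a persistent-homology pipeline tracks and that the proposed quantum algorithm would extract at much larger scale:

```python
from itertools import combinations

def betti0(points, radius):
    """Count connected components (the zeroth Betti number) of the graph
    linking points closer than `radius`, using union-find.
    A classical toy of the simplest topological data-analysis feature."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i

    for (i, p), (j, q) in combinations(enumerate(points), 2):
        if sum((a - b) ** 2 for a, b in zip(p, q)) <= radius ** 2:
            parent[find(i)] = find(j)       # merge the two clusters

    return len({find(i) for i in range(len(points))})

# Two clusters far apart: two components at a small radius, one at a large one.
pts = [(0, 0), (1, 0), (0, 1), (10, 10), (11, 10)]
print(betti0(pts, 1.5))   # 2
print(betti0(pts, 20))    # 1
```

Higher Betti numbers (loops, voids) require building and reducing simplicial complexes, which is where the classical cost explodes.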

Using conventional computers, that approach is too demanding for all but the simplest situations. Topological analysis “represents a crucial way of getting at the significant features of the data, but it’s computationally very expensive,” Lloyd says. “This is where quantum mechanics kicks in.” The new quantum-based approach, he says, could exponentially speed up such calculations.

Lloyd offers an example to illustrate that potential speedup: If you have a dataset with 300 points, a conventional approach to analyzing all the topological features in that system would require “a computer the size of the universe,” he says. That is, it would take 2^300 (two to the 300th power) processing units — approximately the number of all the particles in the universe. In other words, the problem is simply not solvable in that way.
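The arithmetic behind the "computer the size of the universe" estimate is easy to sanity-check: each of the 300 points may or may not belong to a candidate feature, so brute-force enumeration touches 2^300 combinations, which exceeds the commonly cited ~10^80 particles in the observable universe:

```python
# Back-of-the-envelope check of the 2^300 claim quoted above.
n_subsets = 2 ** 300
print(len(str(n_subsets)))    # 91  (a 91-digit number)
print(n_subsets > 10 ** 80)   # True: exceeds the ~10^80-particle estimate
```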

“That’s where our algorithm kicks in,” he says. Solving the same problem with the new system, using a quantum computer, would require just 300 quantum bits — and a device this size may be achieved in the next few years, according to Lloyd.

“Our algorithm shows that you don’t need a big quantum computer to kick some serious topological butt,” he says.

There are many important kinds of huge datasets where the quantum-topological approach could be useful, Lloyd says, for example understanding interconnections in the brain. “By applying topological analysis to datasets gleaned by electroencephalography or functional MRI, you can reveal the complex connectivity and topology of the sequences of firing neurons that underlie our thought processes,” he says.

The same approach could be used for analyzing many other kinds of information. “You could apply it to the world’s economy, or to social networks, or almost any system that involves long-range transport of goods or information,” Lloyd says. But the limits of classical computation have prevented such approaches from being applied before.

While this work is theoretical, “experimentalists have already contacted us about trying prototypes,” he says. “You could find the topology of simple structures on a very simple quantum computer. People are trying proof-of-concept experiments.”

Ignacio Cirac, a professor at the Max Planck Institute of Quantum Optics in Munich, Germany, who was not involved in this research, calls it “a very original idea, and I think that it has a great potential.” He adds “I guess that it has to be further developed and adapted to particular problems. In any case, I think that this is top-quality research.”

The team also included Silvano Garnerone of the University of Waterloo in Ontario, Canada, and Paolo Zanardi of the Center for Quantum Information Science and Technology at the University of Southern California. The work was supported by the Army Research Office, Air Force Office of Scientific Research, Defense Advanced Research Projects Agency, Multidisciplinary University Research Initiative of the Office of Naval Research, and the National Science Foundation.

Beyond Chess: Computer Beats Human in Ancient Chinese Game

http://www.rdmag.com/news/2016/01/beyond-chess-computer-beats-human-ancient-chinese-game

http://www.rdmag.com/sites/rdmag.com/files/rd1601_chess.jpg

A player places a black stone while his opponent waits to place a white one as they play Go, a game of strategy, in the Seattle Go Center, Tuesday, April 30, 2002. The game, which originated in China more than 2,500 years ago, involves two players who take turns putting markers on a grid. The object is to surround more area on the board with the markers than one’s opponent, as well as capturing the opponent’s pieces by surrounding them. A paper released Wednesday, Jan. 27, 2016 describes how a computer program has beaten a human master at the complex board game, marking a significant advance for development of artificial intelligence. (AP Photo/Cheryl Hatch)

A computer program has beaten a human champion at the ancient Chinese board game Go, marking a significant advance for development of artificial intelligence.

The program had taught itself how to win, and its developers say its learning strategy may someday let computers help solve real-world problems like making medical diagnoses and pursuing scientific research.

The program and its victory are described in a paper released Wednesday by the journal Nature.

Computers previously have surpassed humans for other games, including chess, checkers and backgammon. But among classic games, Go has long been viewed as the most challenging for artificial intelligence to master.

Go, which originated in China more than 2,500 years ago, involves two players who take turns putting markers on a checkerboard-like grid. The object is to surround more area on the board with the markers than one’s opponent, as well as capturing the opponent’s pieces by surrounding them.
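The capture rule described above comes down to flood-filling a connected group of stones and counting its liberties (adjacent empty points); a group with no liberties is removed. A minimal sketch on a toy board, where the board layout and helper are illustrative and not any engine's actual code:

```python
def liberties(board, row, col):
    """Count the liberties (adjacent empty points) of the stone group at
    (row, col) via flood fill. '.' is empty; 'B'/'W' are stones."""
    color = board[row][col]
    seen, libs, stack = set(), set(), [(row, col)]
    while stack:
        r, c = stack.pop()
        if (r, c) in seen:
            continue
        seen.add((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < len(board) and 0 <= nc < len(board[0]):
                if board[nr][nc] == '.':
                    libs.add((nr, nc))          # empty neighbor = liberty
                elif board[nr][nc] == color:
                    stack.append((nr, nc))      # same-color stone: same group
    return len(libs)

# A white stone surrounded on three sides has one liberty left ("atari").
board = ["..B.",
         ".BW.",
         "..B.",
         "...."]
print(liberties(board, 1, 2))   # 1
```

What makes Go hard for machines is not this rule but the branching factor: evaluating which of the roughly 19x19 legal moves is best, hundreds of moves deep, defeats brute-force search.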

While the rules are simple, playing it well is not. It’s “probably the most complex game ever devised by humans,” Demis Hassabis of Google DeepMind in London, one of the study authors, told reporters Tuesday.

The new program, AlphaGo, defeated the European champion in all five games of a match in October, the Nature paper reports.

In March, AlphaGo will face legendary player Lee Sedol in Seoul, South Korea, for a $1 million prize, Hassabis said.

Martin Mueller, a computing science professor at the University of Alberta in Canada who has worked on Go programs for 30 years but didn’t participate in AlphaGo, said the new program “is really a big step up from everything else we’ve seen…. It’s a very, very impressive piece of work.”

Biological Origin of Schizophrenia

Excessive ‘pruning’ of connections between neurons in brain predisposes to disease

http://hms.harvard.edu/sites/default/files/uploads/news/McCarroll_C4_600x400.jpg

Imaging studies showed C4 (in green) located at the synapses of primary human neurons. Image: Heather de Rivera, McCarroll lab

Paul Goldsmith    http://hms.harvard.edu/news/biological-origin-schizophrenia

The risk of schizophrenia increases if a person inherits specific variants in a gene related to “synaptic pruning”—the elimination of connections between neurons—according to a study from Harvard Medical School, the Broad Institute and Boston Children’s Hospital. The findings were based on genetic analysis of nearly 65,000 people.

The study represents the first time that the origin of this psychiatric disease has been causally linked to specific gene variants and a biological process.


It also helps explain two decades-old observations: synaptic pruning is particularly active during adolescence, which is the typical period of onset for symptoms of schizophrenia, and the brains of schizophrenic patients tend to show fewer connections between neurons.

The gene, complement component 4 (C4), plays a well-known role in the immune system. It has now been shown to also play a key role in brain development and schizophrenia risk. The insight may allow future therapeutic strategies to be directed at the disorder’s roots, rather than just its symptoms.

The study, which appears online Jan. 27 in Nature, was led by HMS researchers at the Broad Institute’s Stanley Center for Psychiatric Research and Boston Children’s. They include senior author Steven McCarroll, HMS associate professor of genetics and director of genetics for the Stanley Center; Beth Stevens, HMS assistant professor of neurology at Boston Children’s and institute member at the Broad; Michael Carroll, HMS professor of pediatrics at Boston Children’s; and first author Aswin Sekar, an MD-PhD student at HMS.

The study has the potential to reinvigorate translational research on a debilitating disease. Schizophrenia afflicts approximately 1 percent of people worldwide and is characterized by hallucinations, emotional withdrawal and a decline in cognitive function. These symptoms most frequently begin in patients when they are teenagers or young adults.

First described more than 130 years ago, schizophrenia lacks highly effective treatments and has seen few biological or medical breakthroughs over the past half-century.

In the summer of 2014, an international consortium led by researchers at the Stanley Center identified more than 100 regions in the human genome that carry risk factors for schizophrenia.

The newly published study now reports the discovery of the specific gene underlying the strongest of these risk factors and links it to a specific biological process in the brain.

“Since schizophrenia was first described over a century ago, its underlying biology has been a black box, in part because it has been virtually impossible to model the disorder in cells or animals,” said McCarroll. “The human genome is providing a powerful new way in to this disease. Understanding these genetic effects on risk is a way of prying open that black box, peering inside and starting to see actual biological mechanisms.”

“This study marks a crucial turning point in the fight against mental illness,” said Bruce Cuthbert, acting director of the National Institute of Mental Health. “Because the molecular origins of psychiatric diseases are little-understood, efforts by pharmaceutical companies to pursue new therapeutics are few and far between. This study changes the game. Thanks to this genetic breakthrough we can finally see the potential for clinical tests, early detection, new treatments and even prevention.”

The path to discovery

The discovery involved the collection of DNA from more than 100,000 people, detailed analysis of complex genetic variation in more than 65,000 human genomes, development of an innovative analytical strategy, examination of postmortem brain samples from hundreds of people and the use of animal models to show that a protein from the immune system also plays a previously unsuspected role in the brain.

Over the past five years, Stanley Center geneticists and collaborators around the world collected more than 100,000 human DNA samples from 30 different countries to locate regions of the human genome harboring genetic variants that increase the risk of schizophrenia. The strongest signal by far was on chromosome 6, in a region of DNA long associated with infectious disease. This caused some observers to suggest that schizophrenia might be triggered by an infectious agent. But researchers had no idea which of the hundreds of genes in the region was actually responsible or how it acted.

Based on analyses of the genetic data, McCarroll and Sekar focused on a region containing the C4 gene. Unlike most genes, C4 has a high degree of structural variability. Different people have different numbers of copies and different types of the gene.

McCarroll and Sekar developed a new molecular technique to characterize the C4 gene structure in human DNA samples. They also measured C4 gene activity in nearly 700 post-mortem brain samples.

They found that the C4 gene structure (DNA) could predict the C4 gene activity (RNA) in each person’s brain. They then used this information to infer C4 gene activity from genome data from 65,000 people with and without schizophrenia.

These data revealed a striking correlation. People who had particular structural forms of the C4 gene showed higher expression of that gene and, in turn, had a higher risk of developing schizophrenia.
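The study's two-step inference chain (gene structure predicts expression; inferred expression correlates with disease risk) can be mimicked on synthetic data. Every number below is invented purely for illustration and bears no relation to the real effect sizes or sample data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1 (synthetic): brain samples with known C4-like copy number and
# measured expression; learn the copy-number -> expression relationship.
copies = rng.integers(1, 5, size=200)
expression = 1.5 * copies + rng.normal(0, 0.5, size=200)   # invented slope
slope, intercept = np.polyfit(copies, expression, 1)

# Step 2 (synthetic): cohort genotypes -> inferred expression -> simulated
# case/control status via an arbitrary logistic risk model.
cohort_copies = rng.integers(1, 5, size=10_000)
inferred = slope * cohort_copies + intercept
risk = 1 / (1 + np.exp(-(0.4 * inferred - 3)))
cases = rng.random(10_000) < risk

# If the chain works, cases are enriched at high inferred expression.
print(inferred[cases].mean() > inferred[~cases].mean())
```

The real analysis of course involved far messier genotype structure and covariates; the sketch only shows why predicting expression from DNA lets a case/control cohort without brain tissue speak to gene activity.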

Connecting cause and effect through neuroscience

But how exactly does C4—a protein known to mark infectious microbes for destruction by immune cells—affect the risk of schizophrenia?

Answering this question required synthesizing genetics and neurobiology.

Stevens, a recent recipient of a MacArthur Foundation “genius grant,” had found that other complement proteins in the immune system also played a role in brain development. These results came from studying an experimental model of synaptic pruning in the mouse visual system.

Carroll had long studied C4 for its role in immune disease, and developed mice with different numbers of copies of C4.

The three labs set out to study the role of C4 in the brain.

They found that C4 played a key role in pruning synapses during maturation of the brain. In particular, they found that C4 was necessary for another protein—a complement component called C3—to be deposited onto synapses as a signal that the synapses should be pruned. The data also suggested that the more C4 activity an animal had, the more synapses were eliminated in its brain at a key time in development.

The findings may help explain the longstanding mystery of why the brains of people with schizophrenia tend to have a thinner cerebral cortex (the brain’s outer layer, responsible for many aspects of cognition) with fewer synapses than do brains of unaffected individuals. The work may also help explain why the onset of schizophrenia symptoms tends to occur in late adolescence.

The human brain normally undergoes widespread synapse pruning during adolescence, especially in the cerebral cortex. Excessive synaptic pruning during adolescence and early adulthood, due to increased complement (C4) activity, could lead to the cognitive symptoms seen in schizophrenia.

“Once we had the genetic findings in front of us we started thinking about the possibility that complement molecules are excessively tagging synapses in the developing brain,” Stevens said.

“This discovery enriches our understanding of the complement system in brain development and in disease, and we could not have made that leap without the genetics,” she said. “We’re far from having a treatment based on this, but it’s exciting to think that one day we might be able to turn down the pruning process in some individuals and decrease their risk.”

Opening a path toward early detection and potential therapies

Beyond providing the first insights into the biological origins of schizophrenia, the work raises the possibility that therapies might someday be developed that could turn down the level of synaptic pruning in people who show early symptoms of schizophrenia.

This would be a dramatically different approach from current medical therapies, which address only a specific symptom of schizophrenia—psychosis—rather than the disorder’s root causes, and which do not stop cognitive decline or other symptoms of the illness.

The researchers emphasize that therapies based on these findings are still years down the road. Still, the fact that much is already known about the role of complement proteins in the immune system means that researchers can tap into a wealth of existing knowledge to identify possible therapeutic approaches. For example, anticomplement drugs are already under development for treating other diseases.

“In this area of science, our dream has been to find disease mechanisms that lead to new kinds of treatments,” said McCarroll. “These results show that it is possible to go from genetic data to a new way of thinking about how a disease develops—something that has been greatly needed.”

This work was supported by the Broad Institute’s Stanley Center for Psychiatric Research and by the National Institutes of Health (grants U01MH105641, R01MH077139 and T32GM007753).

Adapted from a Broad Institute news release.

 

Scientists open the ‘black box’ of schizophrenia with dramatic genetic discovery

Amy Ellis Nutt    https://www.washingtonpost.com/news/speaking-of-science/wp/2016/01/27/scientists-open-the-black-box-of-schizophrenia-with-dramatic-genetic-finding/

Scientists Prune Away Schizophrenia’s Hidden Genetic Mechanisms

http://www.genengnews.com/gen-news-highlights/scientists-prune-away-schizophrenia-s-hidden-genetic-mechanisms/81252297/

https://youtu.be/s0y4equOTLg

A landmark study has revealed that a person’s risk of schizophrenia is increased if they inherit specific variants in a gene related to “synaptic pruning”—the elimination of connections between neurons. The findings represent the first time that the origin of this devastating psychiatric disease has been causally linked to specific gene variants and a biological process.

http://www.genengnews.com/Media/images/GENHighlight/thumb_107629_web2209513618.jpg

The site in Chromosome 6 harboring the gene C4 towers far above other risk-associated areas on schizophrenia’s genomic “skyline,” marking its strongest known genetic influence. The new study is the first to explain how specific gene versions work biologically to confer schizophrenia risk. [Psychiatric Genomics Consortium]


The investigators discovered that versions of a gene commonly thought to be involved in immune function might trigger a runaway pruning of an adolescent brain’s still-maturing communications infrastructure. The researchers described a scenario where patients with schizophrenia show fewer such connections between neurons or synapses.

“Normally, pruning gets rid of excess connections we no longer need, streamlining our brain for optimal performance, but too much pruning can impair mental function,” explained Thomas Lehner, Ph.D., director of the Office of Genomics Research Coordination at the NIH’s National Institute of Mental Health (NIMH), which co-funded the study along with the Stanley Center for Psychiatric Research at the Broad Institute and other NIH components. “It could help explain schizophrenia’s delayed age-of-onset of symptoms in late adolescence and early adulthood and shrinkage of the brain’s working tissue. Interventions that put the brakes on this pruning process-gone-awry could prove transformative.”

The gene the research team called into question, dubbed C4 (complement component 4), was associated with the largest risk for the disorder. C4’s role represents some of the most compelling evidence, to date, linking specific gene versions to a biological process that could cause at least some cases of the illness.

The findings from this study were published recently in Nature through an article entitled “Schizophrenia risk from complex variation of complement component 4.”

“Since schizophrenia was first described over a century ago, its underlying biology has been a black box, in part because it has been virtually impossible to model the disorder in cells or animals,” noted senior study author Steven McCarroll, Ph.D., director of genetics for the Stanley Center and an associate professor of genetics at Harvard Medical School. “The human genome is providing a powerful new way into this disease. Understanding these genetic effects on risk is a way of prying open that black box, peering inside and starting to see actual biological mechanisms.”

Dr. McCarroll and his colleagues found that a stretch of chromosome 6 encompassing several genes known to be involved in immune function emerged as the strongest signal associated with schizophrenia risk in genome-wide analyses. Yet conventional genetics failed to turn up any specific gene versions there that were linked to schizophrenia.

In order to uncover how the immune-related site confers risk for the mental disorder, the scientists mounted a search for cryptic genetic influences that might generate unconventional signals. C4, a gene with known roles in immunity, emerged as a prime suspect because it is unusually variable across individuals.

Upon further investigation into the complexities of how such structural variation relates to the gene’s level of expression and how that, in turn, might link to schizophrenia, the team discovered structurally distinct versions that affect expression of two main forms of the gene within the brain. The more a version resulted in expression of one of the forms, called C4A, the more it was associated with schizophrenia. The greater number of copies an individual had of the suspect versions, the more C4 switched on and the higher their risk of developing schizophrenia. Furthermore, the C4 protein turned out to be most prevalent within the cellular machinery that supports connections between neurons.

“Once we had the genetic findings in front of us we started thinking about the possibility that complement molecules are excessively tagging synapses in the developing brain,” remarked co-author Beth Stevens, Ph.D. a neuroscientist and assistant professor of neurology at Boston Children’s Hospital and institute member at the Broad. “This discovery enriches our understanding of the complement system in brain development and disease, and we could not have made that leap without the genetics. We’re far from having a treatment based on this, but it’s exciting to think that one day we might be able to turn down the pruning process in some individuals and decrease their risk.”

“This study marks a crucial turning point in the fight against mental illness,” added acting NIMH director Bruce Cuthbert, Ph.D. “Because the molecular origins of psychiatric diseases are little-understood, efforts by pharmaceutical companies to pursue new therapeutics are few and far between. This study changes the game. Thanks to this genetic breakthrough, we can finally see the potential for clinical tests, early detection, new treatments, and even prevention.”


 

https://img.washingtonpost.com/wp-apps/imrs.php?src=https://img.washingtonpost.com/rf/image_908w/2010-2019/WashingtonPost/2011/09/27/Production/Sunday/SunBiz/Images/mental2b.jpg&w=1484


For the first time, scientists have pinned down a molecular process in the brain that helps to trigger schizophrenia. The researchers involved in the landmark study, which was published Wednesday in the journal Nature, say the discovery of this new genetic pathway probably reveals what goes wrong neurologically in a young person diagnosed with the devastating disorder.

The study marks a watershed moment, with the potential for early detection and new treatments that were unthinkable just a year ago, according to Steven Hyman, director of the Stanley Center for Psychiatric Research at the Broad Institute at MIT. Hyman, a former director of the National Institute of Mental Health, calls it “the most significant mechanistic study about schizophrenia ever.”

“I’m a crusty, old, curmudgeonly skeptic,” he said. “But I’m almost giddy about these findings.”

The researchers, chiefly from the Broad Institute, Harvard Medical School and Boston Children’s Hospital, found that a person’s risk of schizophrenia is dramatically increased if they inherit variants of a gene important to “synaptic pruning” — the healthy reduction during adolescence of brain cell connections that are no longer needed.


In patients with schizophrenia, a variation at a single position in the DNA sequence marks too many synapses for removal, and the pruning goes out of control. The result is an abnormal loss of gray matter.

The genes involved coat the neurons with “eat-me signals,” said study co-author Beth Stevens, a neuroscientist at Children’s Hospital and Broad. “They are tagging too many synapses. And they’re gobbled up.”

The Institute’s founding director, Eric Lander, believes the research represents an astonishing breakthrough. “It’s taking what has been a black box…and letting us peek inside for the first time. And that is amazingly consequential,” he said.

The timeline for this discovery has been relatively fast. In July 2014, Broad researchers published the results of the largest genomic study on the disorder and found more than 100 genetic locations linked to schizophrenia. Based on that research, Harvard and Broad geneticist Steven McCarroll analyzed data from about 29,000 schizophrenia cases, 36,000 controls and 700 post mortem brains. The information was drawn from dozens of studies performed in 22 countries, all of which contribute to the worldwide database called the Psychiatric Genomics Consortium.


One region in particular showed the strongest association when the results were graphed. The graph was dubbed a “Manhattan plot” for its resemblance to New York City’s towering buildings. The highest peak was on chromosome 6, where McCarroll’s team discovered the gene variant. C4 was “a dark corner of the human genome,” he said, an area difficult to decipher because of its “astonishing level” of diversity.
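A Manhattan plot is simply association strength, conventionally the negative log10 of each variant's p-value, graphed against genomic position, so a real signal stands out like a skyscraper above the noise floor. A minimal sketch in Python with simulated, purely illustrative data (none of these numbers come from the study; a lone strong hit is planted on chromosome 6 to mimic the C4 peak):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render to a file; no display needed
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
fig, ax = plt.subplots(figsize=(10, 3))
offset = 0
for chrom in range(1, 23):
    n = int(rng.integers(300, 600))   # simulated variant count on this chromosome
    pvals = rng.uniform(size=n)       # null (no-association) p-values
    if chrom == 6:                    # plant one strong "peak" on chromosome 6
        pvals[int(rng.integers(n))] = 1e-12
    ax.scatter(offset + np.arange(n), -np.log10(pvals), s=2)
    offset += n
ax.axhline(-np.log10(5e-8), linestyle="--")  # conventional genome-wide significance line
ax.set_xlabel("genomic position (chromosomes 1-22 concatenated)")
ax.set_ylabel("-log10(p)")
fig.savefig("manhattan.png")
```

Real pipelines plot millions of variants from an association study rather than simulated uniforms, but the shape of the figure is the same.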

C4 and numerous other genes reside in a region of chromosome 6 involved in the immune system, which clears out pathogens and similar cellular debris from the brain. The study’s researchers found that one of C4’s variants, C4A, was most associated with a risk for schizophrenia.
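At its core, case-control association at a variant such as C4A comes down to comparing allele counts between cases and controls. A minimal sketch of the arithmetic with hypothetical counts (the study's actual counts and effect sizes are not reproduced here):

```python
import math

# Hypothetical 2x2 allele counts (illustrative only; not the study's data)
#                      cases    controls
a, b = 9_000, 8_000    # copies of the risk variant
c, d = 20_000, 28_000  # copies of the other variant

# Odds ratio: how much more often the risk variant appears in cases
odds_ratio = (a * d) / (b * c)

# Wald 95% confidence interval on the log odds ratio
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
low = math.exp(math.log(odds_ratio) - 1.96 * se)
high = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.3f}, 95% CI [{low:.3f}, {high:.3f}]")
```

An odds ratio above 1 with a confidence interval excluding 1 indicates an association; genome-wide studies then demand a p-value below roughly 5 × 10⁻⁸ to guard against the millions of tests performed.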

More than 25 million people around the globe are affected by schizophrenia, according to the World Health Organization, including 2 million to 3 million Americans. Highly heritable, it is one of the most severe mental illnesses, with an annual economic burden in this country of tens of billions of dollars.

“This paper is really exciting,” said Jacqueline Feldman, associate medical director of the National Alliance on Mental Illness. “We as scientists and physicians have to temper our enthusiasm because we’ve gone down this path before. But this is profoundly interesting.”

There have been hundreds of theories about schizophrenia over the years, but one of the enduring mysteries has been how three prominent findings related to each other: the apparent involvement of immune molecules, the disorder’s typical onset in late adolescence and early adulthood, and the thinning of gray matter seen in autopsies of patients.


“The thing about this result,” said McCarroll, the lead author, “it makes a lot of other things understandable. To have a result to connect to these observations and to have a molecule and strong level of genetic evidence from tens of thousands of research participants, I think that combination sets [this study] apart.”

The authors stressed that their findings, which combine basic science with large-scale analysis of genetic studies, depended on an unusual level of cooperation among experts in genetics, molecular biology, developmental neurobiology and immunology.

“This could not have been done five years ago,” said Hyman. “This required the ability to reference a very large dataset. … When I was [NIMH] director, people really resisted collaborating. They were still in the Pharaoh era. They wanted to be buried with their data.”

The study offers a new approach to schizophrenia research, which has been largely stagnant for decades.  Most psychiatric drugs seek to interrupt psychotic thinking, but experts agree that psychosis is just a single symptom — and a late-occurring one at that. One of the chief difficulties for psychiatric researchers, setting them apart from most other medical investigators, is that they can’t cut schizophrenia out of the brain and look at it under a microscope. Nor are there any good animal models.

All that now has changed, according to Stevens. “We now have a strong molecular handle, a pathway and a gene, to develop better models,” she said.

Which isn’t to say a cure is right around the corner.

“This is the first exciting clue, maybe even the most important we’ll ever have, but it will be decades before a true cure is found,” Hyman said. “Hope is a wonderful thing. False promise is not.”

Insight Pharma Report

This report focuses heavily on three neurodegenerative disorders: Alzheimer’s Disease/Mild Cognitive Impairment, Parkinson’s Disease, and Amyotrophic Lateral Sclerosis. Part II of the report covers all three of these disorders, detailing the background, history, and development of each disease. Deeper into the chapters, the report examines biomarkers under investigation, genetic targets, and an analysis of multiple studies investigating these elements.

Experts interviewed in these chapters include:

  • Dr. Jens Wendland, Head of Neuroscience Genetics, Precision Medicine, Clinical Research, Pfizer Worldwide R&D
  • Dr. Howard J. Federoff, Executive Vice President for Health Sciences, Georgetown University
  • Dr. Andrew West, Associate Professor of Neurology and Neurobiology and Co-Director, Center for Neurodegeneration and Experimental Therapeutics
  • Dr. Merit Ester Cudkowicz, Chief of Neurology at Massachusetts General Hospital

Part III of the report makes a shift from neurobiomarkers to neurodiagnostics. This section highlights several diagnostics in play and in the making from a number of companies, identifying company strategies, research underway, hypotheses, and institution goals. Elite researchers and companies highlighted in this part include:

  • Dr. Xuemei Huang, Professor and Vice Chair, Department of Neurology; Professor of Neurosurgery, Radiology, Pharmacology, and Kinesiology; Director, Hershey Brain Analysis Research Laboratory for Neurodegenerative Disorders, Penn State Milton S. Hershey Medical Center, Department of Neurology
  • Dr. Andreas Jeromin, CSO and President of Atlantic Biomarkers
  • Julien Bradley, Senior Director, Sales & Marketing, Quanterix
  • Dr. Scott Marshall, Head of Bioanalytics, and Dr. Jared Kohler, Head of Biomarker Statistics, BioStat Solutions, Inc.

Further analysis appears in Part IV. This section includes a survey exclusively conducted for this report. With over 30 figures and graphics and an in-depth analysis, this part features insight into targets under investigation, challenges, advantages, and desired features of future diagnostic applications. Furthermore, the survey covers more than just the featured neurodegenerative disorders in this report, expanding to Multiple Sclerosis and Huntington’s Disease.

Finally, Insight Pharma Reports concludes this report with clinical trial and pipeline data featuring targets and products from over 300 companies working in Alzheimer’s Disease, Parkinson’s Disease and Amyotrophic Lateral Sclerosis.

Epigenome Tapped to Understand Rise of Subtype of Brain Medulloblastoma

http://www.genengnews.com/gen-news-highlights/epigenome-tapped-to-understand-rise-of-subtype-of-brain-medulloblastoma/81252294/

Scientists have identified the cells that likely give rise to the brain tumor subtype Group 4 medulloblastoma. [V. Yakobchuk/ Fotolia]

http://www.genengnews.com/Media/images/GENHighlight/thumb_Jan_28_2016_Fotolia_6761569_ColorfulBrain_4412824411.jpg

An international team of scientists say they have identified the cells that likely give rise to the brain tumor subtype Group 4 medulloblastoma. They believe their study (“Active medulloblastoma enhancers reveal subgroup-specific cellular origins”), published in Nature, removes a barrier to developing more effective targeted therapies against the brain tumor’s most common subtype.

Medulloblastoma occurs in infants, children, and adults, but it is the most common malignant pediatric brain tumor. The disease includes four biologically and clinically distinct subtypes, of which Group 4 is the most common. In children, about half of medulloblastoma patients are of the Group 4 subtype. Efforts to improve patient outcomes, particularly for those with high-risk Group 4 medulloblastoma, have been hampered by the lack of accurate animal models.

Evidence from this study suggests Group 4 tumors begin in neural stem cells that are born in a region of the developing cerebellum called the upper rhombic lip (uRL), according to the researchers.

“Pinpointing the cell(s) of origin for Group 4 medulloblastoma will help us to better understand normal cerebellar development and dramatically improve our chances of developing genetically faithful preclinical mouse models. These models are desperately needed for learning more about Group 4 medulloblastoma biology and evaluating rational, molecularly targeted therapies to improve patient outcomes,” said Paul Northcott, Ph.D., an assistant member of the St. Jude department of developmental neurobiology. Dr. Northcott, Stefan Pfister, M.D., of the German Cancer Research Center (DKFZ), and James Bradner, M.D., of Dana-Farber Cancer Institute, are the corresponding authors.

The discovery and other findings about the missteps fueling tumor growth came from studying the epigenome. Researchers used the analytic tool ChIP-seq to identify and track medulloblastoma subtype differences based on the activity of epigenetic regulators, including proteins known as master regulator transcription factors, which bind to DNA enhancers and super-enhancers. The master regulator transcription factors and super-enhancers work together to regulate the expression of critical genes, such as those responsible for cell identity.

Those and other tools helped investigators identify more than 3,000 super-enhancers in 28 medulloblastoma tumors as well as evidence that the activity of super-enhancers varied by subtype. The super-enhancers switched on known cancer genes, including genes like ALK, MYC, SMO, and OTX2 that are associated with medulloblastoma, the researchers reported.
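The paper does not spell out its computational pipeline here, but a widely used way to call super-enhancers from ChIP-seq data (the ROSE approach, assumed here for illustration) is to rank all enhancers by signal and cut where the rescaled signal-versus-rank curve first becomes steeper than slope 1; the minority of enhancers past that inflection point carry a disproportionate share of total signal. A sketch with simulated signal values:

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated per-enhancer ChIP-seq signal: many typical enhancers plus a
# small population of much stronger ones standing in for super-enhancers
signal = np.concatenate([rng.exponential(1.0, 2900),
                         rng.exponential(25.0, 100)])

ranked = np.sort(signal)                      # rank enhancers by signal
x = np.arange(ranked.size) / (ranked.size - 1)
y = ranked / ranked.max()                     # rescale both axes to [0, 1]

# Cut where the signal-vs-rank curve first becomes steeper than slope 1;
# everything past that inflection point is called a super-enhancer
slopes = np.gradient(y, x)
cutoff = int(np.argmax(slopes > 1.0))
super_enhancers = ranked[cutoff:]
print(f"{super_enhancers.size} of {ranked.size} enhancers called super-enhancers")
```

On real data the inputs are stitched enhancer regions with background-subtracted read counts rather than draws from a distribution, but the hockey-stick-shaped curve and the slope-1 cutoff are the same idea.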

Knowledge of the subtype super-enhancers led to identification of the transcription factors that regulate their activity. Using computational methods, researchers applied that information to reconstruct the transcription factor networks responsible for medulloblastoma subtype diversity and identity, providing previously unknown insights into the regulatory landscape and transcriptional output of the different medulloblastoma subtypes.

The approach helped to discover and nominate Lmx1A as a master regulator transcription factor of Group 4 tumors, which led to the identification of the likely Group 4 tumor cells of origin. Lmx1A was known to play an important role in normal development of cells in the uRL and cerebellum. Additional studies performed in mice with and without Lmx1A in this study supported uRL cells as the likely source of Group 4 tumors.

“By studying the epigenome, we also identified new pathways and molecular dependencies not apparent in previous gene expression and mutational studies,” explained Dr. Northcott. “The findings open new therapeutic avenues, particularly for the Group 3 and 4 subtypes where patient outcomes are inferior for the majority of affected children.”

For example, researchers identified increased enhancer activity targeting the TGF-beta pathway. The finding adds to evidence that the pathway may drive Group 3 medulloblastoma, currently the subtype with the worst prognosis. The pathway regulates cell growth, cell death, and other functions that are often disrupted in cancer, but its role in medulloblastoma is poorly understood.

The analysis included samples from 28 medulloblastoma tumors representing the four subtypes. Researchers believe it is the largest epigenetic study yet for any single cancer type and, importantly, the first to use a large cohort of primary patient tumor tissues instead of cell lines grown in the laboratory. Previous studies have suggested that cell lines may be of limited use for studying the tumor epigenome. The three Group 3 medulloblastoma cell lines used in this study reinforced the observation, highlighting significant differences in epigenetic regulators at work in medulloblastoma cell lines versus tumor samples.

Read Full Post »

Einstein and General Theory of Relativity

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

 

General Relativity And The ‘Lone Genius’ Model Of Science

Chad Orzel

http://www.forbes.com/sites/chadorzel/2015/11/24/general-relativity-and-the-lone-genius-model-of-science/

 

(Credit: AP)

 

One hundred years ago this Wednesday, Albert Einstein gave the last of a series of presentations to the Prussian Academy of Sciences, which marks the official completion of his General Theory of Relativity. This anniversary is generating a good deal of press and various celebratory events, such as the premiere of a new documentary special. If you prefer your physics explanations in the plainest language possible, there’s even an “Up Goer Five” version (personally, I don’t find these all that illuminating, but lots of people seem to love it).

Einstein is, of course, the most iconic scientist in history, and much of the attention to this week’s centennial will center on the idea of his singular genius. Honestly, general relativity is esoteric enough that were it not for Einstein’s personal fame, there probably wouldn’t be all that much attention paid to this outside of the specialist science audience.

But, of course, while the notion of Einstein as a lone, unrecognized genius is a big part of his myth, he didn’t create relativity entirely on his own, as this article in Nature News makes clear. The genesis of relativity is a single simple idea, but even in the early stages, when he developed Special Relativity while working as a patent clerk, he honed his ideas through frequent discussions with friends and colleagues. Most notable among these was probably Michele Besso, who Einstein later referred to as “the best sounding board in Europe.”

And most of the work on General Relativity came not when Einstein was toiling in obscurity, but after he had begun to climb the academic ladder in Europe. In the ten years between the Special and General theories, he went through a series of faculty jobs of increasing prestige. He also laboriously learned a great deal of mathematics in order to reach the final form of the theory, largely with the assistance of his friend Marcel Grossmann. The path to General Relativity was neither simple nor solitary, and the Nature piece documents both the mis-steps along the way and the various people who helped out.

While Einstein wasn’t working alone, though, the Nature piece also makes an indirect case for his status as a genius worth celebrating. Not because of the way he solved the problem, but through the choice of problem to solve. Einstein pursued a theory that would incorporate gravitation into relativity with dogged determination through those years, but he was one of a very few people working on it. There were a couple of other theories kicking around, particularly Gunnar Nordström’s, but these didn’t generate all that much attention. The mathematician David Hilbert nearly scooped Einstein with the final form of the field equations in November of 1915 (some say he did get there first), but Hilbert was a latecomer who only got interested in the problem of gravitation after hearing about it from Einstein, and his success was a matter of greater familiarity with the necessary math. One of the books I used when I taught a relativity class last year quoted Hilbert as saying that “every child in the streets of Göttingen knows more about four-dimensional geometry than Einstein,” but that Einstein’s physical insight got him to the theory before superior mathematicians.

 

History: Einstein was no lone genius

Michel Janssen & Jürgen Renn   

16 November 2015; corrected 17 November 2015. Nature Nov 2015; 527(7578)

Lesser-known and junior colleagues helped the great physicist to piece together his general theory of relativity, explain Michel Janssen and Jürgen Renn.

http://www.nature.com/news/history-einstein-was-no-lone-genius-1.18793

 

http://www.nature.com/polopoly_fs/7.31357.1447429421!/image/Comment2.jpg_gen/derivatives/landscape_630/Comment2.jpg

Marcel Grossmann (left) and Michele Besso (right), university friends of Albert Einstein (centre), both made important contributions to general relativity.

 

A century ago, in November 1915, Albert Einstein published his general theory of relativity in four short papers in the proceedings of the Prussian Academy of Sciences in Berlin [1]. The landmark theory is often presented as the work of a lone genius. In fact, the physicist received a great deal of help from friends and colleagues, most of whom never rose to prominence and have been forgotten [2–5]. (For full reference details of all Einstein texts mentioned in this piece, see Supplementary Information.)

Here we tell the story of how their insights were woven into the final version of the theory. Two friends from Einstein’s student days — Marcel Grossmann and Michele Besso — were particularly important. Grossmann was a gifted mathematician and organized student who helped the more visionary and fanciful Einstein at crucial moments. Besso was an engineer, imaginative and somewhat disorganized, and a caring and lifelong friend to Einstein. A cast of others contributed too.

Einstein met Grossmann and Besso at the Swiss Federal Polytechnical School in Zurich [6] — later renamed the Swiss Federal Institute of Technology (Eidgenössische Technische Hochschule; ETH) — where, between 1896 and 1900, he studied to become a school teacher in physics and mathematics. Einstein also met his future wife at the ETH, classmate Mileva Marić. Legend has it that Einstein often skipped class and relied on Grossmann’s notes to pass exams.

 

http://www.nature.com/polopoly_fs/7.31485.1447758022!/image/entanglement.jpg_gen/derivatives/fullsize/entanglement.jpg

 

Grossmann’s father helped Einstein to secure a position at the patent office in Berne in 1902, where Besso joined him two years later. Discussions between Besso and Einstein earned the former the sole acknowledgment in the most famous of Einstein’s 1905 papers, the one introducing the special theory of relativity. As well as publishing the papers that made 1905 his annus mirabilis, Einstein completed his dissertation that year to earn a PhD in physics from the University of Zurich.

In 1907, while still at the patent office, he started to think about extending the principle of relativity from uniform to arbitrary motion through a new theory of gravity. Presciently, Einstein wrote to his friend Conrad Habicht — whom he knew from a reading group in Berne mockingly called the Olympia Academy by its three members — saying that he hoped that this new theory would account for a discrepancy of about 43˝ (seconds of arc) per century between Newtonian predictions and observations of the motion of Mercury’s perihelion, the point of its orbit closest to the Sun.

Einstein started to work in earnest on this new theory only after he left the patent office in 1909, to take up professorships first at the University of Zurich and two years later at the Charles University in Prague. He realized that gravity must be incorporated into the structure of space-time, such that a particle subject to no other force would follow the straightest possible trajectory through a curved space-time.

In 1912, Einstein returned to Zurich and was reunited with Grossmann at the ETH. The pair joined forces to generate a fully fledged theory. The relevant mathematics was Gauss’s theory of curved surfaces, which Einstein probably learned from Grossmann’s notes. As we know from recollected conversations, Einstein told Grossmann [7]: “You must help me, or else I’ll go crazy.”

Their collaboration, recorded in Einstein’s ‘Zurich notebook‘, resulted in a joint paper published in June 1913, known as the Entwurf (‘outline’) paper. The main advance between this 1913 Entwurf theory and the general relativity theory of November 1915 is the field equations, which determine how matter curves space-time. The final field equations are ‘generally covariant’: they retain their form no matter what system of coordinates is chosen to express them. The covariance of the Entwurf field equations, by contrast, was severely limited.
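For readers who want the endpoint in symbols: the generally covariant field equations of November 1915, written in modern notation (with the cosmological term, a later addition, omitted), are

```latex
R_{\mu\nu} - \tfrac{1}{2}\, R\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

where $g_{\mu\nu}$ is the space-time metric, $R_{\mu\nu}$ and $R$ are the Ricci tensor and scalar built from it, and $T_{\mu\nu}$ is the energy–momentum tensor. General covariance means the equation keeps exactly this form under any smooth change of coordinates, which is what the Entwurf equations lacked.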

 

http://www.nature.com/polopoly_fs/7.31488.1447759403!/image/einstein_lost.jpg_gen/derivatives/fullsize/einstein_lost.jpg

Einstein’s lost theory uncovered

 

Two Theories

In May 1913, as he and Grossmann put the finishing touches to their Entwurf paper, Einstein was asked to lecture at the annual meeting of the Society of German Natural Scientists and Physicians to be held that September in Vienna, an invitation that reflects the high esteem in which the 34-year-old was held by his peers.

In July 1913, Max Planck and Walther Nernst, two leading physicists from Berlin, came to Zurich to offer Einstein a well-paid and teaching-free position at the Prussian Academy of Sciences in Berlin, which he swiftly accepted and took up in March 1914. Gravity was not a pressing problem for Planck and Nernst; they were mainly interested in what Einstein could do for quantum physics.  (It was Walther Nernst who advised that Germany could not engage in WWI and win unless it was a short war).

Several new theories had been proposed in which gravity, like electromagnetism, was represented by a field in the flat space-time of special relativity. A particularly promising one came from the young Finnish physicist Gunnar Nordström. In his Vienna lecture, Einstein compared his own Entwurf theory to Nordström’s theory. Einstein worked on both theories between May and late August 1913, when he submitted the text of his lecture for publication in the proceedings of the 1913 Vienna meeting.

In the summer of 1913, Nordström visited Einstein in Zurich. Einstein convinced him that the source of the gravitational field in both their theories should be constructed out of the ‘energy–momentum tensor’: in pre-relativistic theories, the density and the flow of energy and momentum were represented by separate quantities; in relativity theory, they are combined into one quantity with ten different components.
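The count of ten components follows from symmetry alone: the energy–momentum tensor is symmetric, so in four space-time dimensions only the diagonal and one triangle of the 4 × 4 array are independent:

```latex
T^{\mu\nu} = T^{\nu\mu}, \qquad \mu,\nu \in \{0,1,2,3\}
\;\;\Longrightarrow\;\;
\frac{4(4+1)}{2} = 10 \ \text{independent components}
```

These split into one energy density ($T^{00}$), three components of energy flux/momentum density ($T^{0i}$), and six independent stresses ($T^{ij}$).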

 

http://www.nature.com/polopoly_fs/7.31358.1447420168!/image/Comment4.jpg_gen/derivatives/landscape_630/Comment4.jpg

ETH-Bibliothek Zürich, Bildarchiv

ETH Zurich, where Einstein met friends with whom he worked on general relativity.

 

This energy–momentum tensor made its first appearance in 1907–8 in the special-relativistic reformulation of the theory of electrodynamics of James Clerk Maxwell and Hendrik Antoon Lorentz by Hermann Minkowski. It soon became clear that an energy–momentum tensor could be defined for physical systems other than electromagnetic fields. The tensor took centre stage in the new relativistic mechanics presented in the first textbook on special relativity, Das Relativitätsprinzip, written by Max Laue in 1911. In 1912, a young Viennese physicist, Friedrich Kottler, generalized Laue’s formalism from flat to curved space-time. Einstein and Grossmann relied on this generalization in their formulation of the Entwurf theory. During his Vienna lecture, Einstein called for Kottler to stand up and be recognized for this work [8].

Einstein also worked with Besso that summer to investigate whether the Entwurf theory could account for the missing 43˝ per century for Mercury’s perihelion. Unfortunately, they found that it could only explain 18˝. Nordström’s theory, Besso checked later, gave 7˝ in the wrong direction. These calculations are preserved in the ‘Einstein–Besso manuscript‘ of 1913.

Besso contributed significantly to the calculations and raised interesting questions. He wondered, for instance, whether the Entwurf field equations have an unambiguous solution that uniquely determines the gravitational field of the Sun. Historical analysis of extant manuscripts suggests that this query gave Einstein the idea for an argument that reconciled him with the restricted covariance of the Entwurf equations. This ‘hole argument’ seemed to show that generally covariant field equations cannot uniquely determine the gravitational field and are therefore inadmissible [9].

Einstein and Besso also checked whether the Entwurf equations hold in a rotating coordinate system. In that case the inertial forces of rotation, such as the centrifugal force we experience on a merry-go-round, can be interpreted as gravitational forces. The theory seemed to pass this test. In August 1913, however, Besso warned him that it did not. Einstein did not heed the warning, which would come back to haunt him.

 

http://www.nature.com/polopoly_fs/7.31486.1447758069!/image/integrity.jpg_gen/derivatives/fullsize/integrity.jpg

Scientific method: Defend the integrity of physics

 

In his lecture in Vienna in September 1913, Einstein concluded his comparison of the two theories with a call for experiment to decide. The Entwurf theory predicts that gravity bends light, whereas Nordström’s does not. It would take another five years to find out. Erwin Finlay Freundlich, a junior astronomer in Berlin with whom Einstein had been in touch since his days in Prague, travelled to Crimea for the solar eclipse of August 1914 to determine whether gravity bends light but was interned by the Russians just as the First World War broke out. Finally, in 1919, English astronomer Arthur Eddington confirmed Einstein’s prediction of light bending by observing the deflection of distant stars seen close to the Sun’s edge during another eclipse, making Einstein a household name [10].

Back in Zurich, after the Vienna lecture, Einstein teamed up with another young physicist, Adriaan Fokker, a student of Lorentz, to reformulate the Nordström theory using the same kind of mathematics that he and Grossmann had used to formulate the Entwurf theory. Einstein and Fokker showed that in both theories the gravitational field can be incorporated into the structure of a curved space-time. This work also gave Einstein a clearer picture of the structure of the Entwurf theory, which helped him and Grossmann in a second joint paper on the theory. By the time it was published in May 1914, Einstein had left for Berlin.

 

http://www.nature.com/polopoly_fs/7.31489.1447761264!/image/Einstein_frontal_small.jpg_gen/derivatives/fullsize/Einstein_frontal_small.jpg

Snapshots explore Einstein’s unusual brain

 

The Breakup

Turmoil erupted soon after the move. Einstein’s marriage fell apart and Mileva moved back to Zurich with their two young sons. Albert renewed the affair he had started and broken off two years before with his cousin Elsa Löwenthal (née Einstein). The First World War began. Berlin’s scientific elite showed no interest in the Entwurf theory, although renowned colleagues elsewhere did, such as Lorentz and Paul Ehrenfest in Leiden, the Netherlands. Einstein soldiered on.

By the end of 1914, his confidence had grown enough to write a long exposition of the theory. But in the summer of 1915, after a series of his lectures in Göttingen had piqued the interest of the great mathematician David Hilbert, Einstein started to have serious doubts. He discovered to his dismay that the Entwurf theory does not make rotational motion relative. Besso was right. Einstein wrote to Freundlich for help: his “mind was in a deep rut”, so he hoped that the young astronomer as “a fellow human being with unspoiled brain matter” could tell him what he was doing wrong. Freundlich could not help him.

“Worried that Hilbert might beat him to the punch, Einstein rushed new equations into print.”

The problem, Einstein soon realized, lay with the Entwurf field equations. Worried that Hilbert might beat him to the punch, Einstein rushed new equations into print in early November 1915, modifying them the following week and again two weeks later in subsequent papers submitted to the Prussian Academy. The field equations were generally covariant at last.

In the first November paper, Einstein wrote that the theory was “a real triumph” of the mathematics of Carl Friedrich Gauss and Bernhard Riemann. He recalled in this paper that he and Grossmann had considered the same equations before, and suggested that if only they had allowed themselves to be guided by pure mathematics rather than physics, they would never have accepted equations of limited covariance in the first place.

Other passages in the first November paper, however, as well as his other papers and correspondence in 1913–15, tell a different story. It was thanks to the elaboration of the Entwurf theory, with the help of Grossmann, Besso, Nordström and Fokker, that Einstein saw how to solve the problems with the physical interpretation of these equations that had previously defeated him.

In setting out the generally covariant field equations in the second and fourth papers, he made no mention of the hole argument. Only when Besso and Ehrenfest pressed him a few weeks after the final paper, dated 25 November, did Einstein find a way out of this bind — by realizing that only coincident events and not coordinates have physical meaning. Besso had suggested a similar escape two years earlier, which Einstein had brusquely rejected2.

In his third November paper, Einstein returned to the perihelion motion of Mercury. Inserting the astronomical data supplied by Freundlich into the formula he derived using his new theory, Einstein arrived at the result of 43″ per century and could thus fully account for the difference between Newtonian theory and observation. “Congratulations on conquering the perihelion motion,” Hilbert wrote to him on 19 November. “If I could calculate as fast as you can,” he quipped, “the hydrogen atom would have to bring a note from home to be excused for not radiating.”

Einstein kept quiet on why he had been able to do the calculations so fast. They were minor variations on the ones he had done with Besso in 1913. He probably enjoyed giving Hilbert a taste of his own medicine: in a letter to Ehrenfest written in May 1916, Einstein characterized Hilbert’s style as “creating the impression of being superhuman by obfuscating one’s methods”.

Einstein emphasized that his general theory of relativity built on the work of Gauss and Riemann, giants of the mathematical world. But it also built on the work of towering figures in physics, such as Maxwell and Lorentz, and on the work of researchers of lesser stature, notably Grossmann, Besso, Freundlich, Kottler, Nordström and Fokker. As with many other major breakthroughs in the history of science, Einstein was standing on the shoulders of many scientists, not just the proverbial giants4.

 

http://www.nature.com/polopoly_fs/7.31375.1447420557!/image/cartoon.jpg_gen/derivatives/landscape_630/cartoon.jpg

Berlin’s physics elite (Fritz Haber, Walther Nernst, Heinrich Rubens, Max Planck) and Einstein’s old and new family (Mileva Einstein-Marić and her sons Eduard and Hans Albert; Elsa Einstein-Löwenthal and her daughters Ilse and Margot) watch as Einstein pursues his new theory of gravity and his idée fixe of generalizing the relativity principle, carried by giants of both physics and mathematics (Isaac Newton, James Clerk Maxwell, Carl Friedrich Gauss, Bernhard Riemann) and scientists of lesser stature (Marcel Grossmann, Gunnar Nordström, Erwin Finlay Freundlich, Michele Besso).

Nature 527, 298–300 (19 Nov 2015)       http://dx.doi.org/10.1038/527298a

 

 


Twitter, Google, LinkedIn Enter in the Curation Foray: What’s Up With That?

 

Reporter: Stephen J. Williams, Ph.D.

Twitter recently announced a new feature that it hopes will increase engagement on its platform. Originally dubbed Project Lightning and now called Moments, the feature relies on many human curators who aggregate and curate tweets surrounding individual live events (formerly collected under #Live).

As Twitter’s Product Manager Madhu Muthukumar (@justmadhu) said in a blog post describing Moments:

“Every day, people share hundreds of millions of tweets. Among them are things you can’t experience anywhere but on Twitter: conversations between world leaders and celebrities, citizens reporting events as they happen, cultural memes, live commentary on the night’s big game, and many more,” the blog post noted. “We know finding these only-on-Twitter moments can be a challenge, especially if you haven’t followed certain accounts. But it doesn’t have to be.”

Please see more about Moments on his blog here.

Moments is a new tab on Twitter’s mobile and desktop home screens where the company will curate trending topics as they’re unfolding in real-time — from citizen-reported news to cultural memes to sports events and more. Moments will fall into five total categories, including “Today,” “News,” “Sports,” “Entertainment” and “Fun.” (Source: Fox)

Now It’s Google’s Turn

 

As Dana Blankenhorn wrote in his SeekingAlpha article Twitter, Google Try It Buzzfeed’s Way With Curation:

What’s a challenge for Google is a direct threat to Twitter’s existence.

For all the talk about what doesn’t work in journalism, curation works. Following the news, collecting it and commenting, and encouraging discussion, is the “secret sauce” for companies like Buzzfeed, Vox, Vice and The Huffington Post, which often wind up getting more traffic from a story at, say The New York Times (NYSE:NYT), than the Times does as a result.

Curation is, in some ways, a throwback to the pre-Internet era. It’s done by people. (At least I think I’m a people.) So as odd as it is for Twitter (NYSE:TWTR) to announce it will curate live events it’s even odder to see Google (NASDAQ:GOOG) (NASDAQ:GOOGL) doing it in a project called YouTube Newswire.

Google’s content curation platform, like Buzzfeed’s, is made for desktop as well as a mobile app and allows sharing of curated news and viral videos.

The feel for both Twitter and Google’s content curation will be like a newspaper, with an army of human content curators determining what is the trendiest news to read or videos to watch.

BuzzFeed articles, or at least, the headlines can easily be mined from any social network but reading the whole article still requires that you open the link within the app or outside using a mobile web browser. Loading takes some time–a few seconds longer. Try browsing the BuzzFeed feed on the app and you’ll notice the obvious difference.

However, a Forbes article earlier this summer, Why Apple, Snapchat and Twitter are betting on human editors, but Facebook and Google aren’t, reported that Apple, Snapchat and Twitter, along with LinkedIn Pulse and Instagram, would use human editors and curators, while Facebook and Google would rely on their powerful algorithms. Google (now Alphabet) CEO Eric Schmidt has even called Apple’s human-curated playlists “elitist,” although Google Play has human-curated playlists of its own.

Maybe Google is responding to views on its Google News like this review in VentureBeat:

Google News: Less focused on social signals than textual ones, Google News uses its analytic tools to group together related stories and highlight the biggest ones. Unlike Techmeme, it’s entirely driven by algorithms, and that means it often makes weird choices. I’ve heard that Google uses social sharing signals from Google+ to help determine which stories appear on Google News, but have never heard definitive confirmation of that — and now that Google+ is all but dead, it’s mostly moot. I find Google News an unsatisfying home page, but it is a good place to search for news once you’ve found it.
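The review above contrasts human curation with purely algorithmic grouping of related stories. As a rough illustration of the algorithmic side (this is a toy word-overlap heuristic, not Google News’s actual method), related headlines can be clustered by their shared vocabulary:

```python
# Toy sketch of algorithmic story grouping via word-overlap (Jaccard)
# similarity. Purely illustrative -- NOT Google News's real algorithm.

def tokens(headline):
    """Lowercase word set for a headline."""
    return set(headline.lower().split())

def jaccard(a, b):
    """Word-overlap similarity between two headlines (0.0 to 1.0)."""
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb)

def group_stories(headlines, threshold=0.3):
    """Greedily cluster headlines whose overlap exceeds the threshold."""
    groups = []
    for h in headlines:
        for g in groups:
            if jaccard(h, g[0]) >= threshold:
                g.append(h)
                break
        else:
            groups.append([h])
    return groups

news = [
    "Twitter launches Moments curation feature",
    "Twitter Moments feature launches with human curators",
    "Google unveils YouTube Newswire",
]
grouped = group_stories(news)  # two groups: the Twitter pair, then Google
```

A heuristic like this makes the “weird choices” the reviewer complains about easy to understand: headlines that share incidental words get lumped together, which is exactly the kind of error a human editor would catch.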

Now WordPress Too!

 

WordPress sites also now have a content curation plugin, called Curation Traffic™.

According to its makers:

You Own the Platform, You Benefit from the Traffic

“The Curation Traffic™ System is a complete WordPress based content curation solution. Giving you all the tools and strategies you need to put content curation into action.

It is push-button simple and seamlessly integrates with any WordPress site or blog.

With Curation Traffic™, curating your first post is as easy as clicking “Curate,” and the same post that may originally have been sent only to Facebook or Twitter is now sent to your own site that you control and benefit from, while still going out across all of your social sites.”

The theory: the more you share on your own platform, the more engagement and the better the marketing experience. And with all the WordPress users out there, there is already an army of human curators.

So That’s Great For News But What About Science and Medicine?

 

News and trendy topics such as fashion and music are common to most people’s experience. More technical areas such as science, medicine, and engineering, however, are not in most people’s domain, so aggregated content needs a process of peer review to sort, basically, fact from fiction. On social media this is extremely important, as sensational stories of breakthroughs can spread virally without proper vetting and can even influence patients’ decisions about their own care.
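One simple, mechanical first pass at the vetting described above is to filter aggregated items by source before any human curation begins. The sketch below is illustrative only, and the whitelist of domains is a hypothetical example, not this site’s actual policy:

```python
# Hedged sketch: vet aggregated links by source domain before curation.
# The whitelist below is a hypothetical example for illustration.

from urllib.parse import urlparse

VETTED_DOMAINS = {"nature.com", "nih.gov", "nejm.org"}  # example whitelist

def is_vetted(url):
    """True if the link's host is a vetted domain or a subdomain of one."""
    host = urlparse(url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in VETTED_DOMAINS)

items = [
    "http://www.nature.com/articles/527298a",
    "http://miracle-cures.example.com/breakthrough",
]
vetted = [u for u in items if is_vetted(u)]  # keeps only the Nature link
```

A filter like this cannot replace expert peer review, but it cheaply removes the most obviously unvetted sources before human judgment is applied.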

Expertise Depends on Experience

In steps the human experience. On this site (www.pharmaceuticalintelligence.com) we attempt to do just this: a consortium of M.D.s, Ph.D.s, and other medical professionals spend their own time aggregating topics of interest and curating specific topics, adding insight from reputable sources across the web.

In Power of Analogy: Curation in Music, Music Critique as a Curation and Curation of Medical Research Findings – A Comparison, Dr. Larry Bernstein compares a museum or music curator to a curator of scientific findings and literature, and draws a similar conclusion for each: curation can be a tool for gaining new insights previously unseen by an observer, a way of stepping back to see a different picture or hear a different song.

 

For instance, using the Twitter platform, we curate #live meeting notes and tweets from meeting attendees (please see the links below and the links within) to give live conference coverage,

http://pharmaceuticalintelligence.com/press-coverage/

and this curation and analysis give rise not only to meeting engagement but also to unique insights into presentations.

 

In addition, the use of a WordPress platform allows easy sharing across many different social platforms, including Twitter, Google+, LinkedIn, and Pinterest.
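The cross-platform sharing mentioned above ultimately rests on each network’s public “share intent” URL, which takes the curated post’s link as a query parameter. The endpoint paths below are the commonly documented share URLs for each platform; they should be verified against each platform’s current developer documentation:

```python
# Sketch: build social share links for a curated WordPress post.
# Endpoint paths are the commonly documented share URLs; verify against
# each platform's current developer docs before relying on them.

from urllib.parse import urlencode

def share_links(post_url, title=""):
    """Return share-intent URLs for a post across several networks."""
    return {
        "twitter": "https://twitter.com/intent/tweet?" +
                   urlencode({"url": post_url, "text": title}),
        "linkedin": "https://www.linkedin.com/sharing/share-offsite/?" +
                    urlencode({"url": post_url}),
        "facebook": "https://www.facebook.com/sharer/sharer.php?" +
                    urlencode({"u": post_url}),
    }

links = share_links("http://pharmaceuticalintelligence.com/press-coverage/",
                    "Live conference coverage")
```

Because the post lives on a platform the curator controls, every one of these links points traffic back to the curator’s own site rather than to a third-party feed.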

Hopefully, this will catch on with the big powers of Twitter, Google and Facebook, so they realize there exist armies of niche curation communities which they can draw on for expert curation in the biosciences.

Other posts on this site on Curation include:

 

Inevitability of Curation: Scientific Publishing moves to embrace Open Data, Libraries and Researchers are trying to keep up

The Methodology of Curation for Scientific Research Findings

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson and others

The growing importance of content curation

Data Curation is for Big Data what Data Integration is for Small Data

Stem Cells and Cardiac Repair: Content Curation & Scientific Reporting

Cardiovascular Diseases and Pharmacological Therapy: Curations

Power of Analogy: Curation in Music, Music Critique as a Curation and Curation of Medical Research Findings – A Comparison

