

2021 Virtual World Medical Innovation Forum, Mass General Brigham, Gene and Cell Therapy, VIRTUAL May 19–21, 2021

The 2021 Virtual World Medical Innovation Forum will focus on the growing impact of gene and cell therapy. Senior healthcare leaders from around the world will shape and debate the field. Our shared belief: no matter the magnitude of change, responsible healthcare is centered on a shared commitment to collaborative innovation, with industry, academia, and practitioners working together to improve patients’ lives.

About the World Medical Innovation Forum

Mass General Brigham is pleased to present the World Medical Innovation Forum (WMIF) virtual event Wednesday, May 19 – Friday, May 21. This interactive web event features expert discussions of gene and cell therapy (GCT) and its potential to change the future of medicine through its disease-treating and potentially curative properties. The agenda features 150+ executive speakers from the healthcare industry, venture, startups, life sciences manufacturing, consumer health and the front lines of care, including many Harvard Medical School-affiliated researchers and clinicians. The annual in-person Forum will resume live in Boston in 2022. The World Medical Innovation Forum is presented by Mass General Brigham Innovation, the global business development unit supporting the research requirements of 7,200 Harvard Medical School faculty and research hospitals including Massachusetts General, Brigham and Women’s, Massachusetts Eye and Ear, Spaulding Rehab and McLean Hospital. Follow us on Twitter: twitter.com/@MGBInnovation

Accelerating the Future of Medicine with Gene and Cell Therapy: What Comes Next

https://worldmedicalinnovation.org/agenda/

Virtual | May 19–21, 2021

#WMIF2021

@MGBInnovation

Leaders in Pharmaceutical Business Intelligence (LPBI) Group

will cover the event in Real Time

Aviva Lev-Ari, PhD, RN

Founder LPBI 1.0 & LPBI 2.0


will be in virtual attendance producing the e-Proceedings

and the Tweet Collection of this global event, which expects 15,000+ attendees

@pharma_BI

@AVIVA1950

LPBI’s Eighteen Books in Medicine

https://lnkd.in/ekWGNqA

Among them, books on Gene and Cell Therapy include the following:

Topics for May 19 – 21 include:

Impact on Patient Care – Therapeutic and Potentially Curative GCT Developments

GCT Delivery, Manufacturing – What’s Next

GCT Platform Development

Oncolytic Viruses – Cancer applications, start-ups

Regenerative Medicine/Stem Cells

Future of CAR-T

M&A Shaping GCT’s Future

Market Priorities

Venture Investing in GCT

China’s GCT Juggernaut

Disease and Patient Focus: Benign blood disorders, diabetes, neurodegenerative diseases


Plus:

Fireside Chats: 1:1 interviews with industry CEOs/C-Suite leaders including Novartis Gene Therapies, ThermoFisher, Bayer AG, FDA

First Look: 18 briefings on emerging GCT research from Mass General Brigham scientists

Virtual Poster Session: 40 research posters and presenters on potential GCT discoveries from Mass General Brigham

Announcement of the Disruptive Dozen, 12 GCT technologies likely to break through in the next few years

AGENDA

Wednesday, May 19, 2021

8:00 AM – 8:10 AM

Opening Remarks

Welcome and the vision for gene and cell therapy, and why it is a top Mass General Brigham priority. Introducer: Scott Sperling

  • Co-President, Thomas H. Lee Partners
  • Chairman of the Board of Directors, PHS

Presenter: Anne Klibanski, MD

  • CEO, Mass General Brigham

3,000 people joined on the morning of 5/19.

30 sessions, lab to clinic: academia, industry, and the investment community.

The in-person 2022 WMIF on CGT will be held in Boston, May 22–24, 2022.

8:10 AM – 8:30 AM

The Grand Challenge of Widespread GCT Patient Benefits

Co-chairs identify the key themes of the Forum and set the stage: top GCT opportunities, challenges, and where the field might take medicine in the future. Moderator: Susan Hockfield, PhD

  • President Emerita and Professor of Neuroscience, MIT

GCT is poised to deliver therapies.

The field is at an inflection point, as the panel will present.

Doctors and patients: promise for some patients.

Barriers remain for cell and gene therapy.

Patient access to therapies like CGT is critical.

Speakers: Nino Chiocca, MD, PhD

  • Neurosurgeon-in-Chief and Chairman, Neurosurgery, BWH
  • Harvey W. Cushing Professor of Neurosurgery, HMS

Oncolytic viruses are a triple threat: directly toxic to tumor cells, immunological, and combinable with anticancer therapies.

Polygenic therapy: multiple genes involved; plug-and-play.

Susan Slaugenhaupt, PhD

  • Scientific Director and Elizabeth G. Riley and Daniel E. Smith Jr., Endowed Chair, Mass General Research Institute
  • Professor, Neurology, HMS

Ravi Thadhani, MD

  • CAO, Mass General Brigham
  • Professor, Medicine and Faculty Dean, HMS

Academia has a special role in spearheading polygenic therapy (multiple genes involved; plug-and-play). Access is critical, as are relations with industry.

Luk Vandenberghe, PhD

  • Grousbeck Family Chair, Gene Therapy, MEE
  • Associate Professor, Ophthalmology, HMS

Gene-drug pharmacology sits at the interface of academic centers and industry; many CGT drugs emerged from academic centers.

8:35 AM – 8:50 AM FIRESIDE

Gene and Cell Therapy 2.0 – What’s Next as We Realize their Potential for Patients

Dave Lennon, PhD

  • President, Novartis Gene Therapies

The hope is that CGT is emerging: therapies for neurologic, muscular, ocular, and genetic diseases of the liver and heart represent a revolution for the industry, with roughly 900 IND applications and 25 approvals. CGT is an economic driver that requires skilled workers and venture capital. The modality is a one-time intervention with a long duration of impact, which raises questions of reimbursement and the ecosystem to be built around CGT.

The FDA works by indication and the risks involved. Standards and expectations for streamlining manufacturing require understanding of both process and products.

Payments over time; relations between payers and innovators.

Moderator: Julian Harris, MD

  • Partner, Deerfield

To what extent has the promise of CGT been realized?

The FDA’s role and interaction in CGT.

Manufacturing aspects are critical.

Speaker: Dave Lennon, PhD

  • President, Novartis Gene Therapies


  • Q&A 8:55 AM – 9:10 AM  

8:55 AM – 9:20 AM

The Patient and GCT

GCT development for rare diseases is driven by patient and patient-advocate communities. Understanding their needs and perspectives enables biomarker research, the development of value-driving clinical trial endpoints and successful clinical trials. Industry works with patient communities that help identify unmet needs and collaborate with researchers to conduct disease natural history studies that inform the development of biomarkers and trial endpoints. This panel includes patients who have received cutting-edge GCT therapy as well as caregivers and patient advocates. Moderator: Patricia Musolino, MD, PhD

  • Co-Director Pediatric Stroke and Cerebrovascular Program, MGH
  • Assistant Professor of Neurology, HMS

What is the power of one: the impact a patient can have on their own destiny by participating in clinical trials. Contacting other participants in the same trial can be beneficial.

Speakers: Jack Hogan

  • Patient, MEE

Jeanette Hogan

  • Parent of Patient, MEE

Jim Holland

  • CEO, Backcountry.com

A Parkinson’s patient. Regulatory constraints on clinical trial participation: only advanced-stage patients are approved to participate. Patients should determine the level of risk they wish to take. Information dissemination is critical.

Barbara Lavery

  • Chief Program Officer, ACGT Foundation

An advocacy agency at the beginning of its work; Global Genes provides educational content and outreach for access to information.

The patient has knowledge of the symptoms, and recording all input is needed for diagnosis by multiple clinicians and for early application of CGT.

Dan Tesler

  • Clinical Trial Patient, BWH/DFCC

An experimental-drug clinical trial patient: participation in clinical trials is very important to advance the state of the science.

Sarah Beth Thomas, RN

  • Professional Development Manager, BWH

The outcome is unknown; hope for the best. Support with resources comes from all the advocacy groups.

  • Q&A 9:25 AM – 9:40 AM  

9:25 AM – 9:45 AM FIRESIDE

GCT Regulatory Framework | Why Different?

  Moderator: Vicki Sato, PhD

  • Chairman of the Board, Vir Biotechnology

Diversity of approaches

The FDA’s process generalizes from the first entrant to more broadly applicable rules.

Speaker: Peter Marks, MD, PhD

  • Director, Center for Biologics Evaluation and Research, FDA

Last spring it became clear that a vaccine would work by June 2020; the belief was that there were enough candidates, and the challenge would be manufacturing enough and scaling up. The FDA did not predict the efficacy of mRNA vaccines versus other approaches that were expected to work.

The pandemic workload will wane and clear. Gene therapy IND applications remained flat in the face of the pandemic, and the urgency of rare diseases remains. Consensus is built with industry advisory input on gene therapy guidance; for T-cell therapy, guidance reflecting best thinking, rather than regulation, lets CGT evolve speedily and flexibly.

Immune modulators, immunotherapy, and genome editing can make use of viral vectors; future technologies include nanoparticles and liposome encapsulation.

  • Q&A 9:50 AM – 10:05 AM  

9:50 AM – 10:15 AM

Building a GCT Platform for Mainstream Success

This panel of GCT executives, innovators and investors explore how to best shape a successful GCT strategy. Among the questions to be addressed:

  • How are GCT approaches set around defining and building a platform?
  • Is AAV the leading modality and what are the remaining challenges?
  • What are the alternatives?
  • Is it just a matter of matching modalities to the right indications?

Moderator: Jean-François Formela, MD

  • Partner, Atlas Venture

Established core components of the platform.

Speakers: Katherine High, MD

  • President, Therapeutics, AskBio

Three gene therapy drugs have been approved in Europe.

Regulatory infrastructure exists for CGT drug approval as a new class of therapeutics.

Participants: investigators, regulators, patients, i.e., MDM.

Hemophilia in males is the most challenging.

Humans are natural hosts for AAV; hence the attention to safety signals.

Dave Lennon, PhD

  • President, Novartis Gene Therapies

Big pharma has portfolios of therapeutics, not one drug, across treatment areas: cell, gene, and radioligand therapy.

Collective learning; infrastructure featuring manufacturing at scale early in development; acquisitions as a strategy for growth; the number of applications matters for scaling.

Rick Modi

  • CEO, Affinia Therapeutics

Copy, paste, and edit from product A to product B: novel vectors leverage knowledge of vector variants and codon optimization. The choice of indication is critical, with exploration in larger populations. Speed of R&D versus speed to a better gene construct: get to the clinic with a better design rather than simply as soon as possible.

Data sharing of clinical experience with vectors informs strategies: patient selection, vector selection, and mitigation, specific to patient type.

Louise Rodino-Klapac, PhD

  • EVP, Chief Scientific Officer, Sarepta Therapeutics

An AAV-based platform 15 years in development: the same disease indication versus more than one indication; serotype and analytics as hurdles. The first program took 10 years; the second took 3.

Safety to clinic versus speed to clinic; vectors differ in how much they can be trusted.

  • Q&A 10:20 AM – 10:35 AM  

10:20 AM – 10:45 AM

AAV Success Studies | Retinal Dystrophy | Spinal Muscular Atrophy

Recent AAV gene therapy product approvals have catalyzed the field. This new class of therapies has shown the potential to bring transformative benefit to patients. With dozens of AAV treatments in clinical studies, all eyes are on the field to gauge its disruptive impact.

The panel assesses the largest challenges of the first two products, the lessons learned for the broader CGT field, and the extent to which they serve as a precedent to broaden the AAV modality.

  • Is AAV gene therapy restricted to genetically defined disorders, or will it be able to address common diseases in the near term?
  • Lessons learned from these first-in-class approvals.
  • Challenges to broaden this modality to similar indications.
  • Reflections on safety signals in the clinical studies?

Moderator: Joan Miller, MD

  • Chief, Ophthalmology, MEE
  • Cogan Professor & Chair of Ophthalmology, HMS

A retina specialist. Luxturna’s success; FMA condition; cell therapy as a solution.

Lessons learned.

Safety.

Speakers: Ken Mills

  • CEO, RegenXBio

Additional tissue types and additional administrations; tech and science to address additional diseases; more science is needed for photoreceptors, a different tissue type with its own underlying pathology; many novelties in the last 10 years.

Cell therapy versus transplant therapy: no immunosuppression.

Eric Pierce, MD, PhD

  • Director, Ocular Genomics Institute, MEE
  • Professor of Ophthalmology, HMS

Luxturna’s success is to be replicated: a platform, paradigms, and measurement of visual improvement.

More science is needed to continue developing vectors and reduce toxicity.

AAV can deliver different cargos; reduce adverse events and improve vectors.

Ron Philip

  • Chief Operating Officer, Spark Therapeutics

The first retinal gene therapy, voretigene neparvovec-rzyl (Luxturna, Spark Therapeutics), was approved by the FDA in 2017.

Meredith Schultz, MD

  • Executive Medical Director, Lead TME, Novartis Gene Therapies

The impact of gene therapy extends beyond muscular dystrophy: translational medicine for each indication, each disease, and each group of patients; build the platform to unlock the promise.

Monitoring for safety signals with real-world evidence: remote markers and home visits make clinical trials safer, with better communication of information.

  • Q&A 10:50 AM – 11:05 AM  

10:45 AM – 10:55 AM

Break

  10:55 AM – 11:05 AM FIRST LOOK

Control of AAV pharmacology by Rational Capsid Design

Luk Vandenberghe, PhD

  • Grousbeck Family Chair, Gene Therapy, MEE
  • Associate Professor, Ophthalmology, HMS

AAV is a complex driver of pharmacology: durable, the vector of choice, administered in vivo; gene editing; tissue specificity; pharmacokinetics; side effects and adverse events; manufacturability; site variation; diversified portfolios.

A pathway for rational AAV design: curated smart variant libraries and multiparametric AAV sequence screens. Data enable liver (de-)targeting and unlock new therapeutic areas such as the cochlea.

  • Q&A 11:05 AM – 11:25 AM  

11:05 AM – 11:15 AM FIRST LOOK

Enhanced gene delivery and immunoevasion of AAV vectors without capsid modification

Casey Maguire, PhD

  • Associate Professor of Neurology, MGH & HMS

Virus biology: enveloped (e) or not.

Enveloped AAV (eAAV) for gene therapy. The eAAV platform technology: tissue targets, indications, and commercialization of eAAV.

  • Q&A 11:15 AM – 11:35 AM  

11:20 AM – 11:45 AM HOT TOPICS

AAV Delivery

This panel will address advances in AAV gene therapy delivery over the next five years. Questions that loom large: How can the biodistribution of AAV be improved? What solutions are in the wings to address the immunogenicity of AAV? Will patients be able to receive systemic redosing of AAV-based gene therapies in the future? What technical advances are there for payload size? Will the cost of manufacturing ever become affordable for ultra-rare conditions? Will non-viral delivery completely supplant viral delivery within the next five years? What are the safety concerns and how will they be addressed?

Moderators: Xandra Breakefield, PhD

  • Geneticist, MGH
  • Professor, Neurology, HMS

Florian Eichler, MD

  • Director, Center for Rare Neurological Diseases, MGH
  • Associate Professor, Neurology, HMS

Speakers: Jennifer Farmer

  • CEO, Friedreich’s Ataxia Research Alliance

Friedreich’s ataxia requires a single therapy targeting multiple organs: brain, spinal cord, and heart. Several INDs; clinical trials in 2022.

Mathew Pletcher, PhD

  • SVP, Head of Gene Therapy Research and Technical Operations, Astellas

Working with poorly understood diseases requires collaboration; DMD is a great example of an existing effort: explaining dystrophin and sharing placebo data.

Continue to explore large-animal models, the guinea pig rather than mice, and not primates (ethical issues), for understanding immunogenicity and immune response.

Manny Simons, PhD

  • CEO, Akouos

AAV therapy delivered to the fluid of the inner ear: CGT for the ear, with a vector accessible to surgeons; translational work on the inner ear for gene therapy requires the right animal model.

Biology is shared across species at the nerve endings in the cochlea.

Engineering of the capsid, using the lowest dose possible, getting the desired effect through vector choice; new milestones in 2022.

  • Q&A 11:50 AM – 12:05 PM  

11:50 AM – 12:15 PM

M&A | Shaping GCT Innovation

The GCT M&A market is booming; many large pharmas have made at least one significant acquisition. How should we view the current GCT M&A market? What is the impact of the current M&A market on technology development? Are these M&A trends new, or just another cycle? Has pharma strategy shifted and, if so, what does it mean for GCT companies? What does it mean for patients? What are the long-term prospects; can valuations hold up? Moderator: Adam Koppel, MD, PhD

  • Managing Director, Bain Capital Life Sciences

What are acquirers looking for?

What is next generation versus what is real, and where is the industry going?

Speakers:

Debby Baron

  • Worldwide Business Development, Pfizer 

CGT is an important area for Pfizer, which is actively looking for innovators and advancing programs of innovation with the experience Pfizer has internally.

Scalability and manufacturing, regulatory conversations, and clinical program safety proceed in parallel with planning to get the drug to patients.

Kenneth Custer, PhD

  • Vice President, Business Development and Lilly New Ventures, Eli Lilly and Company

Marianne De Backer, PhD

  • Head of Strategy, Business Development & Licensing, and Member of the Executive Committee, Bayer

Absolute leadership in gene editing and gene therapy, via acquisition and strategic alliance.

The operating model of the acquired company was discussed; the company continues to operate independently.

Sean Nolan

  • Board Chairman, Encoded Therapeutics & Affinia

  • Executive Chairman, Jaguar Gene Therapy & Istari Oncology

As an acquiree in multiple M&A deals: how the acquirer looks at integration and the cultures of the two companies.

Traditional integration versus a jump start via external acquisition.

AAV for epilepsy; next generation of vectors.

  • Q&A 12:20 PM – 12:35 PM  

12:15 PM – 12:25 PM FIRST LOOK

Gene Therapies for Neurological Disorders: Insights from Motor Neuron Disorders

Merit Cudkowicz, MD

  • Chief of Neurology, MGH

ALS affects roughly 1 in 300 men and 1 in 400 women, with incidence projected to increase 7% in the next decade.

10% of ALS is hereditary. 160 pharma companies are in the ALS space. Diagnosis comes late, and 1/3 of people are not diagnosed; there is an active community for clinical trials. Challenges: disease heterogeneity, with cases diagnosed as much as 10 years late. Gene therapy clinical trials for ALS include antisense approaches targeting SOD1 and protein therapies; the FUS gene strikes the young.

  • Q&A 12:25 PM – 12:45 PM  

12:25 PM – 12:35 PM FIRST LOOK

Gene Therapy for Neurologic Diseases

Patricia Musolino, MD, PhD

  • Co-Director Pediatric Stroke and Cerebrovascular Program, MGH
  • Assistant Professor of Neurology, HMS

Cerebrovascular disease: the ACTA2 R179H mutation causes a smooth muscle cell proliferation disorder.

No surgery or drug exists.

Therapy for ACTA2 vasculopathy in the brain aims to control blood pressure and stroke caused by smooth muscle intimal proliferation. Delivery is by viral vector, with the aim of moving the platform to non-viral delivery. Rare disease; gene editing; other mutations of the ACTA2 gene; targeting other pathways for atherosclerosis.

  • Q&A 12:35 PM – 12:55 PM  

12:35 PM – 1:15 PM

Lunch

  1:15 PM – 1:40 PM

Oncolytic Viruses in Cancer | Curing Melanoma and Beyond

Oncolytic viruses represent a powerful new technology, but so far only one oncolytic (Imlygic) has been FDA-approved, in a single indication, melanoma, in 2015. This panel involves some of the protagonists of this early success story. They will explore why and how Imlygic became approved and its path to commercialization. Yet, unlike the expansion of FDA-approved indications for immune checkpoint inhibitors to multiple cancers, no other cancer indications exist for Imlygic. Why? Is there a limitation to what and which cancers these agents can target? Is the mode of administration a problem?

No other oncolytic virus therapy has been approved since 2015. Where will the next success story come from and why?  Will these therapies only be beneficial for skin cancers or other easily accessible cancers based on intratumoral delivery?

The panel will examine whether the preclinical models developed for other cancer treatment modalities will be useful for oncolytic viruses. It will also assess the extent to which pre-clinical development challenges have slowed the development of OVs. Moderator: Nino Chiocca, MD, PhD

  • Neurosurgeon-in-Chief and Chairman, Neurosurgery, BWH
  • Harvey W. Cushing Professor of Neurosurgery, HMS

What are the challenges of manufacturing at Amgen?

Speakers: Robert Coffin, PhD

  • Chief Research & Development Officer, Replimune

2002 in the UK: promise in oncolytic therapy with GM-CSF.

Phase III in melanoma; 2015 M&A with Amgen.

Oncolytic therapy on its own has limited effect without engaging the immune response.

Data are key for commercialization.

He does not believe in systemic delivery: achieve the maximum possible immune response from a tumor by localized injection.

Roger Perlmutter, MD, PhD

  • Chairman, Merck & Co.

Response rates: systemic therapies such as anti-PD-1 (Keytruda, Opdivo) are well tolerated; combine oncolytics with systemic therapy.

GMP is critical for manufacturing.

David Reese, MD

  • Executive Vice President, Research and Development, Amgen

Intra-lesional injection of the agent versus systemic therapeutics.

Cold, immune-resistant tumors can be rendered immune-susceptible.

Oncolytic virus as a monotherapy; addressing the unknown.

Ann Silk, MD

  • Physician, Dana Farber-Brigham and Women’s Cancer Center
  • Assistant Professor of Medicine, HMS

Which patient receives an oncolytic virus if the patient is immunosuppressed due to other conditions?

The safety of oncolytic viruses is greater than that of systemic treatment.

Serial biopsies of injected and non-injected tissue allow comparison. Whether hot and cold tumors are likely to have the same response to the agent is unknown; all is potential.

  • Q&A 1:45 PM – 2:00 PM  

1:45 PM – 2:10 PM

Market Interest in Oncolytic Viruses | Calibrating

There are currently two oncolytic virus products on the market, one in the USA and one in China. As of late 2020, there were 86 clinical trials, 60 of which were in Phase I, with just 2 in Phase III and the rest in Phase I/II or Phase II. Although global sales of OVs are still in the ramp-up phase, some projections forecast OVs will be a $700 million market by 2026. This panel will address some of the major questions in this area:

What regulatory challenges will keep OVs from realizing their potential? Despite the promise of OVs for treating cancer, only one has been approved in the US. Why has this been the case? Reasons such as viral tropism, viral species selection, and delivery challenges have all been cited. However, these are also true of other modalities. Why then have oncolytic virus approaches not advanced faster, and what are the primary challenges to be overcome?

  • Will these need to be combined with other agents to realize their full efficacy and how will that impact the market?
  • Why are these companies pursuing OVs while several others are taking a pass?

Moderators: Martine Lamfers, PhD

  • Visiting Scientist, BWH

Challenges in the development of strategies.

Demonstrating efficacy.

Robert Martuza, MD

  • Consultant in Neurosurgery, MGH
  • William and Elizabeth Sweet Distinguished Professor of Neurosurgery, HMS

Mechanisms of modulation.

Speakers: Anlong Li, MD, PhD

  • Clinical Director, Oncology Clinical Development, Merck Research Laboratories

IV delivery is preferred; alternative delivery routes are less agreeable.

Jeffrey Infante, MD

  • Early development Oncolytic viruses, Oncology, Janssen Research & Development

If an oncolytic virus generates systemic effects, adoption will accelerate.

In which areas is it most efficacious?

Direct effect with a single intra-tumoral injection carrying the right payload.

A platform approach, priming with one agent and boosting with a second, has not yet been tried.

The data needed at trial design for stratification of patients do not yet exist.

A turn-off strategy does not yet exist.

Loic Vincent, PhD

  • Head of Oncology Drug Discovery Unit, Takeda

R&D in collaboration with academia.

A vaccine platform to explore different payloads.

IV administration may not bring sufficient concentration to the tumor when the agent is administered into the bloodstream.

Classification of patients by prospective response type is still unknown; the patient population requires stratification.

  • Q&A 2:15 PM – 2:30 PM  

2:10 PM – 2:20 PM FIRST LOOK

Oncolytic viruses: turning pathogens into anticancer agents

Nino Chiocca, MD, PhD

  • Neurosurgeon-in-Chief and Chairman, Neurosurgery, BWH
  • Harvey W. Cushing Professor of Neurosurgery, HMS

Oncolytic therapy did not work in pancreatic cancer or glioblastoma.

Intra-tumoral heterogeneity hinders success.

Proposed solution: oncolytic viruses to overcome immunological “coldness.”

GADD34; roughly 20,000 GBM and 40,000 pancreatic cancer cases.

  • Q&A 2:25 PM – 2:40 PM  

2:20 PM – 2:45 PM

Entrepreneurial Growth | Oncolytic Virus

In 2020 there were a total of 60 phase I trials for Oncolytic Viruses. There are now dozens of companies pursuing some aspect of OV technology. This panel will address:

  •  How are small companies equipped to address the challenges of developing OV therapies better than large pharma or biotech?
  • Will the success of COVID vaccines based on Adenovirus help the regulatory environment for small companies developing OV products in Europe and the USA?
  • Is there a place for non-viral delivery and other immunotherapy companies to engage in the OV space?  Would they bring any real advantages?

Moderator: Reid Huber, PhD

  • Partner, Third Rock Ventures

Critical milestones to observe.

Speakers: Caroline Breitbach, PhD

  • VP, R&D Programs and Strategy, Turnstone Biologics

Trying both intra-tumoral delivery and IV infusion of an oncolytic vaccine, pushing the dose.

A translational biomarkers program.

Transformation of the tumor microenvironment.

Brett Ewald, PhD

  • SVP, Development & Corporate Strategy, DNAtrix

Studies get larger, kicking off Phase III in multiple tumors.

Paul Hallenbeck, PhD

  • President and Chief Scientific Officer, Seneca Therapeutics

Translation: Stephen Russell, MD, PhD

  • CEO, Vyriad

Systemic delivery of oncolytic virus: IV delivery put a woman in remission.

Collaboration with Regeneron.

Data collection: imageable reporters, secretable reporters, gene expression.

The field is intense; systemic oncolytic delivery is exciting in mice and in humans, and response rates are encouraging in combination with immune stimulants and checkpoint inhibitors.

  • Q&A 2:50 PM – 3:05 PM  

2:45 PM – 3:00 PM

Break

  3:00 PM – 3:25 PM

CAR-T | Lessons Learned | What’s Next

Few areas of potential cancer therapy have had the attention and excitement of CAR-T. This panel of leading executives, developers, and clinician-scientists will explore the current state of CAR-T and its future prospects. Among the questions to be addressed are:

  • Is CAR-T still an industry priority – i.e. are new investments being made by large companies? Are new companies being financed? What are the trends?
  • What have we learned from first-generation products, what can we expect from CAR-T going forward in novel targets, combinations, armored CAR’s and allogeneic treatment adoption?
  • Early trials showed remarkable overall survival and progression-free survival. What has been observed regarding how enduring these responses are?
  • Most of the approvals to date have targeted CD19, and most recently BCMA. What are the most common forms of relapses that have been observed?
  • Is there a consensus about what comes after these CD19 and BCMA trials as to additional targets in liquid tumors? How have dual-targeted approaches fared?
Moderator: Marcela Maus, MD, PhD

  • Director, Cellular Immunotherapy Program, Cancer Center, MGH
  • Associate Professor, Medicine, HMS

Is CAR-T an industry priority?

Speakers:

Head of R&D, Atara BioTherapeutics

  • Phenotype of the cells for hematologic cancers
  • Solid tumors
  • An inventory of therapeutics for treating patients in the future
  • Progressive MS program
  • EBV T-cell platform: B-cells and T-cells

Stefan Hendriks

  • Global Head, Cell & Gene, Novartis

Yes, CGT is a strategy for the present and the future; the journey started years ago. Confirmation of the effectiveness of CAR-T therapies: 1-year responses prolonged to 5 years; 26 months. Patients who do not respond leave a lot to learn. A patient can be helped by CAR-T after 8 months of chemotherapy.

Christi Shaw

  • CEO, Kite

CAR-T is a priority; 120 companies are in the space. Manufacturing consistency. Patients respond with better quality of life. In blood cancer, more work remains to be done.

Q&A

  • 3:30 PM – 3:45 PM  

3:30 PM – 3:55 PM HOT TOPICS

CAR-T | Solid Tumors Success | When?

The potential application of CAR-T in solid tumors will be a game-changer if it occurs. The panel explores the prospects of solid tumor success and what the barriers have been. Questions include:

  •  How would industry and investor strategy for CAR-T and solid tumors be characterized? Has it changed in the last couple of years?
  •  Does the lack of tumor antigen specificity in solid tumors mean that lessons from liquid tumor CAR-T constructs will not translate well and we have to start over?
  •  Whether due to antigen heterogeneity, a hostile tumor micro-environment, or other factors are some specific solid tumors more attractive opportunities than others for CAR-T therapy development?
  •  Given the many challenges that CAR-T faces in solid tumors, does the use of combination therapies from the start, for example, to mitigate TME effects, offer a more compelling opportunity?

Moderator: Oladapo Yeku, MD, PhD

  • Clinical Assistant in Medicine, MGH

Window-of-opportunity studies.

Speakers: Jennifer Brogdon

  • Executive Director, Head of Cell Therapy Research, Exploratory Immuno-Oncology, NIBR

2017: the first CAR-T approval.

M&A and research collaborations.

TCRs against tumor-specific antigens avoid tissue toxicity.

Knut Niss, PhD

  • CTO, Mustang Bio

Start with hot tumors; a 12-month clinical trial in solid tumors; therapies are not ready yet. Combination therapy will be an experimental treatment and a long journey, with checkpoint inhibitors used in combination as maintenance. Liquid tumors.

Barbra Sasu, PhD

  • CSO, Allogene

T-cell response in prostate cancer.

Tumor-specific cytokines and tumor-specific signals; move from solid to the metastatic cell type for easier infiltration.

Where we might go: safety, autologous and allogeneic.

Jay Short, PhD

  • Chairman, CEO, Cofounder, BioAtla, Inc.

Tumor type is not enough for development of therapeutics; other organs in the periphery are involved.

Solid tumors are difficult to penetrate. Biologics can be activated in the tumor only, exploiting charge differences and the acidic environment inside the tumor tissue to target cells inside the tumor and not outside.

Combinations, possibly staggered; the key is to try combinations.

  • Q&A 4:00 PM – 4:15 PM  

4:00 PM – 4:25 PM

GCT Manufacturing | Vector Production | Autologous and Allogeneic | Stem Cells | Supply Chain | Scalability & Management

The modes of GCT manufacturing have the potential to fundamentally reorder long-established roles and pathways. While complexity goes up, the distance from discovery to deployment shrinks. With the total market for cell therapies likely to exceed $48 billion by 2027, groups of products are emerging. Stem cell therapies are projected to reach $28 billion by 2027, and non-stem cell therapies such as CAR-T are projected to reach $20 billion by 2027. The manufacturing challenges for these two large buckets are very different. Within the CAR-T realm there are diverging trends of autologous and allogeneic therapies, and the demands on manufacturing infrastructure are very different. Questions for the panelists are:

  • Help us all understand the different manufacturing challenges for cell therapies. What are the trade-offs among storage cost, batch size, line changes in terms of production cost and what is the current state of scaling naïve and stem cell therapy treatment vs engineered cell therapies?
  • For cell and gene therapy what is the cost of Quality Assurance/Quality Control vs. production and how do you think this will trend over time based on your perspective on learning curves today?
  • Will point of care production become a reality? How will that change product development strategy for pharma and venture investors? What would be the regulatory implications for such products?
  • How close are allogeneic CAR-T cell therapies? If successful what are the market implications of allogenic CAR-T? What are the cost implications and rewards for developing allogeneic cell therapy treatments?
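As a quick sanity check that the market projections quoted above are internally consistent, here is a minimal sketch; the dollar figures are the panel's stated 2027 projections, not independent data:

```python
# Check that the quoted 2027 market segments add up to the quoted total.
# Figures (in $ billions) are the panel's projections, not my own data.
segments_bn = {
    "stem cell therapies": 28,
    "non-stem cell therapies (e.g., CAR-T)": 20,
}

total_bn = sum(segments_bn.values())
print(f"Segments sum to ${total_bn}B")  # the text cites a total of over $48B
assert total_bn >= 48
```

The two cited segments alone account for the "over $48 billion" total, so the projection treats stem-cell and non-stem-cell products as essentially the whole cell-therapy market.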

Moderator: Michael Paglia

  • VP, ElevateBio

Speakers:

  • Dannielle Appelhans
    • SVP TechOps and Chief Technical Officer, Novartis Gene Therapies
  • Thomas Page, PhD
    • VP, Engineering and Asset Development, FUJIFILM Diosynth Biotechnologies
  • Rahul Singhvi, ScD
    • CEO and Co-Founder, National Resilience, Inc.
  • Thomas VanCott, PhD
    • Global Head of Product Development, Gene & Cell Therapy, Catalent
    • 2/3 autologous, 1/3 allogeneic. CAR-T: high doses and high populations; scale-up is not done today. Quality maintenance required; timing and logistics issues; centralized vs decentralized. Allogeneic uses healthy donors. Innovations in the cell types in use; improvements in manufacturing

Ropa Pike, Director, Enterprise Science & Partnerships, Thermo Fisher Scientific

The centralized biopharma industry is moving to decentralized models; site-specific licensing

  • Q&A 4:30 PM – 4:45 PM  

4:30 PM – 4:40 PM FIRST LOOK

CAR-T

Marcela Maus, MD, PhD

  • Director, Cellular Immunotherapy Program, Cancer Center, MGH
  • Assistant Professor, Medicine, HMS 

Fit-to-purpose CAR-T cells: 3 lead programs

Tr-fill 

CAR-T induces responses in myeloma/multiple myeloma and GBM

27 patents on CAR-T

400+ patients treated; 40 clinical trials

  • Q&A 4:40 PM – 5:00 PM  

4:40 PM – 4:50 PM FIRST LOOK

Repurposed Tumor Cells as Killers and Immunomodulators for Cancer Therapy

Khalid Shah, PhD

  • Vice Chair, Neurosurgery Research, BWH
  • Director, Center for Stem Cell Therapeutics and Imaging, HMS

Solid tumors are the hardest to treat because they are immunosuppressive, hypoxic, and acidic. Use of autologous tumor cells: self-homing, self-targeting therapeutic tumor cells (ThTC). Efficacy in pre-clinical GBM and metastasis models; ThTC translation to patient settings

  • Q&A 4:50 PM – 5:10 PM  

4:50 PM – 5:00 PM FIRST LOOK

Other Cell Therapies for Cancer

David Scadden, MD

  • Director, Center for Regenerative Medicine; Co-Director, Harvard Stem Cell Institute, Director, Hematologic Malignancies & Experimental Hematology, MGH
  • Jordan Professor of Medicine, HMS

T cells are made in bone marrow; a cryogel can be an off-the-shelf product; T cell receptor repertoire; CCL19+ mesenchymal cells mimic thymus cells –

intra-thymic injection. Non-human primate validation

Q&A

 

5:00 PM – 5:20 PM FIRESIDE

Fireside with Mikael Dolsten, MD, PhD

  Introducer: Jonathan Kraft

Moderator: Daniel Haber, MD, PhD

  • Chair, Cancer Center, MGH
  • Isselbacher Professor of Oncology, HMS

Vaccine Status

Mikael Dolsten, MD, PhD

  • Chief Scientific Officer and President, Worldwide Research, Development and Medical, Pfizer

Delivering vaccines around the globe: Israel, US, Europe.

3 billion vaccine doses in 2022 for global vaccination

BioNTech in Germany

Experience with biologics, immuno-oncology & allogeneic antibody cells – a new field for drug discovery

mRNA curative effort and cancer vaccine 

Access to drugs developed by Pfizer for underdeveloped countries

  • Q&A 5:25 PM – 5:40 PM

5:20 PM – 5:30 PM

Closing Remarks

Thursday, May 20, 2021

8:00 AM – 8:25 AM

GCT | The China Juggernaut

China embraced gene and cell therapies early. The first China gene therapy clinical trial was in 1991. China approved the world’s first gene therapy product in 2003—Gendicine—an oncolytic adenovirus for the treatment of advanced head and neck cancer.  Driven by broad national strategy, China has become a hotbed of GCT development, ranking second in the world with more than 1,000 clinical trials either conducted or underway and thousands of related patents.  It has a booming GCT biotech sector, led by more than 45 local companies with growing IND pipelines.

In the late 1990s, a T cell-based immunotherapy, cytokine-induced killer (CIK) therapy, became a popular modality in Chinese clinics for tumor treatment. In the early 2010s, Chinese researchers began domestic CAR-T trials, inspired by several important reports suggesting the strong antitumor function of CAR-T cells. China has since become the country with the most registered CAR-T trials, and CAR-T therapy is flourishing there.

The Chinese GCT ecosystem has increasingly rich local innovation and growing complement of development and investment partnerships – and also many subtleties.

This panel, consisting of leaders from the China GCT corporate, investor, research, and entrepreneurial communities, will consider strategic questions on the growth of the gene and cell therapy industry in China: areas of greatest strength, the evolving regulatory framework, early successes, and products expected to reach the US and world markets.

Moderator: Min Wu, PhD

  • Managing Director, Fosun Health Fund

What are the areas of CGT in China? Regulatory framework similar to the US

Speakers: Alvin Luk, PhD

  • CEO, Neuropath Therapeutics

Monogenic rare disease with clear genomic target

Increase of 30% in patient enrollment 

Regulatory reform: approval is 60 days, no delay

Pin Wang, PhD

  • CSO, Jiangsu Simcere Pharmaceutical Co., Ltd.

Similar starting point in CGT as the rest of the world, unlike the later starting point in other biologics

Richard Wang, PhD

  • CEO, Fosun Kite Biotechnology Co., Ltd

Possibilities to be creative and capitalize on new technologies for innovative drugs

Support of the ecosystem by funding new companies, allowing the industry to be developed in China

Autologous therapy: patient differences and cost challenges

Tian Xu, PhD

  • Vice President, Westlake University

ICH committee and Chinese FDA – regulation similar to the US

The difference is patient recruitment; in China patients are active participants, e.g., in skin disease

Active in development of transposons

Development of non-viral methods; CRISPR still in development, and transposons

In China, drug prices are regulatory-sensitive

Shunfei Yan, PhD

  • Investment Manager, InnoStar Capital

Indication driven: Hemophilia

Allogeneic efficiency therapies

Licensing opportunities 

  • Q&A 8:30 AM – 8:45 AM  

8:30 AM – 8:55 AM

Impact of mRNA Vaccines | Global Success Lessons

The COVID vaccine race has propelled mRNA to the forefront of biomedicine. Long considered a compelling modality for therapeutic gene transfer, the technology may have found its most impactful application as a vaccine platform. Given the transformative industrialization, the massive human experience, and the fast development that has taken place in this industry, where is the horizon? Does the success of the vaccine application benefit or limit its use as a therapeutic for CGT?

  • How will the COVID success impact the rest of the industry both in therapeutic and prophylactic vaccines and broader mRNA lessons?
  • Beyond the speed of development, what aspects make mRNA so well suited as a vaccine platform?
  • Will cost-of-goods be reduced as the industry matures?
  • How does mRNA technology seek to compete with AAV and other gene therapy approaches?

Moderator: Lindsey Baden, MD

  • Director, Clinical Research, Division of Infectious Diseases, BWH
  • Associate Professor, HMS

In vivo delivery process; regulatory cooperation; new opportunities for the same platform for new indications

Speakers:

Many years of mRNA pivoting for new diseases; DARPA; nucleic acids; global deployment of a manufacturing unit on site where the need arises; Elon Musk funds new directions at Moderna

How many mRNAs can be put in one vaccine? Dose and tolerance to achieve efficacy

45 days for a personalized cancer vaccine, one per patient

1.6 billion doses produced; rare monogenic disease, correcting mRNA, e.g., CF with multiple mutations; infectious disease and oncology applications

A platform allowing cargo to be swapped, reusing the same nanoparticles; addressing diseases beyond Big Pharma options, for biotech

What strain of flu vaccine will come back in the future when people no longer wear masks?

  • Kate Bingham, UK Vaccine Taskforce

July 2020: AAV vs mRNA delivery across the UK; local centers administered both types; supply and delivery uplift

  • Q&A 9:00 AM – 9:15 AM  

9:00 AM – 9:25 AM HOT TOPICS

Benign Blood Disorders

Hemophilia has been and remains a hallmark indication for CGT. Given its well-defined biology, larger market, and the limited gene transfer needed to provide therapeutic benefit, it has been at the forefront of clinical development for years; however, product approval remains elusive. What are the main hurdles to this success? Contrary to many indications that CGT pursues, where no therapeutic options are available to patients, hemophiliacs have an increasing number of highly efficacious treatment options. How does the competitive landscape impact this field differently than other CGT fields? With many different players pursuing a gene therapy option for hemophilia, what are the main differentiators? Gene therapy for hemophilia seems compelling for low- and middle-income countries, given the cost of currently available treatments; does your company see opportunities in this market?

Moderator: Nancy Berliner, MD

  • Chief, Division of Hematology, BWH
  • H. Franklin Bunn Professor of Medicine, HMS

Speakers: Theresa Heggie

  • CEO, Freeline Therapeutics

Safety concerns; high burden of treatment. CGT has a record of safety; risk/benefit drives adoption of the Tx. Functional cure: CGT is a potent Tx; a relatively small quantity of protein needs to be delivered

Potency and quality: less drug quantity and greater potency

Risk of delivering unwanted DNA; capsids are critical

Analytics is critical; regulator involvement in potency definition

Close collaboration is exciting

Gallia Levy, MD, PhD

  • Chief Medical Officer, Spark Therapeutics

Hemophilia CGT has the highest potential for global access; logistics in underdeveloped countries; working with NGOs; practicality of the Tx

Roche reached 120 countries; great to be part of the Roche Group

Amir Nashat, PhD

  • Managing Partner, Polaris Ventures

Suneet Varma

  • Global President of Rare Disease, Pfizer

Gene therapy at Pfizer: small molecule, large molecule, and CGT – a spectrum of choice allowing Hemophilia patients to marry

1/3 internal, 1/3 partnership, 1/3 acquisitions

Learning from COVID-19 is applied to other vaccine development

Review of protocols and CGT for Hemophilia

You can’t buy Time

With MIT, Pfizer is developing a model for Hemophilia CGT treatment

  • Q&A 9:30 AM – 9:45 AM  

9:25 AM – 9:35 AM FIRST LOOK

Treating Rett Syndrome through X-reactivation

Jeannie Lee, MD, PhD

  • Molecular Biologist, MGH
  • Professor of Genetics, HMS

200 diseases on the X chromosome could be unlocked for neurological genetic diseases: Rett Syndrome and other autism spectrum disorders; female model vs male mouse model

deliver protein to the brain 

restore own missing or dysfunctional protein

Epigenetic, not CGT – no exogenous intervention; Xist ASO drug

Female model

  • Q&A 9:35 AM – 9:55 AM  

9:35 AM – 9:45 AM FIRST LOOK

Rare but mighty: scaling up success in single gene disorders

Florian Eichler, MD

  • Director, Center for Rare Neurological Diseases, MGH
  • Associate Professor, Neurology, HMS

Single gene disorders: NGS enables diagnosis; from diagnosis to treatment. How to know what cell to target, make it available, and scale up. Addressing gaps: missing components, biomarkers to cell types, lipid chemistry, cell and animal biology

Crosswalk from bone marrow to brain matter

New gene discovered that causes a neurodevelopmental disorder; examining new biology; cell-type-specific biomarkers

  • Q&A 9:45 AM – 10:05 AM  

9:50 AM – 10:15 AM HOT TOPICS

Diabetes | Grand Challenge

The American Diabetes Association estimates 30 million Americans have diabetes and 1.5 million are diagnosed annually. GCT offers the prospect of long-sought treatment for this enormous cohort and their chronic requirements. The complexity of the disease and its management constitute a grand challenge and highlight both the potential of GCT and its current limitations.

  •  Islet transplantation for type 1 diabetes has been attempted for decades. Problems like loss of transplanted islet cells due to autoimmunity and graft site factors have been difficult to address. Is there anything different on the horizon for gene and cell therapies to help this be successful?
  • How is the durability of response for gene or cell therapies for diabetes being addressed? For example, what would the profile of an acceptable (vs. optimal) cell therapy look like?

Moderator: Marie McDonnell, MD

  • Chief, Diabetes Section and Director, Diabetes Program, BWH
  • Lecturer on Medicine, HMS

Type 1 Diabetes: cost of insulin for continuous delivery of drug

Alternative treatments:

The Future: pluripotent stem cells

What keeps you up at night?

Speakers: Tom Bollenbach, PhD

  • Chief Technology Officer, Advanced Regenerative Manufacturing Institute

Data management: sterility sensors, cell survival after implantation, stem cell manufacturing, process development in manufacturing of complex cells

Data and instrumentation; the Process is the Product

Manufacturing tight schedules

Manasi Jaiman, MD

  • Vice President, Clinical Development, ViaCyte
  • Pediatric Endocrinologist

Continuous glucose monitoring

Bastiano Sanna, PhD

  • EVP, Chief of Cell & Gene Therapies and VCGT Site Head, Vertex Pharmaceuticals

100 years from the discovery of insulin, insulin is not a cure in 2021 – asking patients to partner more

Producing large quantities of islet cells; encapsulation technology has been developed

Scaling up is a challenge

Rogerio Vivaldi, MD

  • CEO, Sigilon Therapeutics

Advances made. Patients with Type 1: outer and inner compartments of spheres (not capsules); no immune suppression; continuous insulin secretion; insulin independence without immune suppression

Volume to have off-the-shelf inventory; oxygenation in location; lymphatics and vascularization; control the whole process; a modular platform; learning from others

  • Q&A 10:20 AM – 10:35 AM  

10:20 AM – 10:40 AM FIRESIDE

Building A Unified GCT Strategy

  Introducer: John Fish

  • CEO, Suffolk
  • Chairman of Board Trustees, Brigham Health

Moderator: Meg Tirrell

  • Senior Health and Science Reporter, CNBC

Last year, what was it like at Novartis?

Speaker: Jay Bradner, MD

  • President, NIBR

Keeping eyes open, waiting for the Pandemic to end to enable working back on all the indications

Portfolio of MET, Mimi Emerging Therapies 

Learning from the Pandemic – operationalize the practice science, R&D leaders, new collaboratives at NIH, FDA, Novartis

Pursue programs that will yield growth; tropical diseases with the Gates Foundation; Rising Tide pods for access to CGT within Novartis; partnership with UPenn in Cell Therapy

Cost to access to IP from Academia to a Biotech CRISPR accessing few translations to Clinic

Protein degradation organization constraint valuation by parties in a partnership 

Novartis: nuclear protein lipid nanoparticles, a template for Biotech to collaborate

Game changing: 10% of the Portfolio, New frontiers human genetics in Ophthalmology, CAR-T, CRISPR, Gene Therapy Neurological and payloads of different matter

  • Q&A 10:45 AM – 11:00 AM  

10:40 AM – 10:50 AM

Break

  10:50 AM – 11:00 AM FIRST LOOK

Getting to the Heart of the Matter: Curing Genetic Cardiomyopathy

Christine Seidman, MD

  • Director, Cardiovascular Genetics Center, BWH
  • Smith Professor of Medicine & Genetics, HMS

The Voice of Dr. Seidman – Her abstract is cited below

The ultimate opportunity presented by discovering the genetic basis of human disease is accurate prediction and disease prevention. To enable this achievement, genetic insights must enable the identification of at-risk individuals prior to end-stage disease manifestations and strategies that delay or prevent clinical expression. Genetic cardiomyopathies provide a paradigm for fulfilling these opportunities. Hypertrophic cardiomyopathy (HCM) is characterized by left ventricular hypertrophy, diastolic dysfunction with normal or enhanced systolic performance and a unique histopathology: myocyte hypertrophy, disarray and fibrosis. Dilated cardiomyopathy (DCM) exhibits enlarged ventricular volumes with depressed systolic performance and nonspecific histopathology. Both HCM and DCM are prevalent clinical conditions that increase risk for arrhythmias, sudden death, and heart failure. Today treatments for HCM and DCM focus on symptoms, but none prevent disease progression. Human molecular genetic studies demonstrated that these pathologies often result from dominant mutations in genes that encode protein components of the sarcomere, the contractile unit in striated muscles. These data combined with the emergence of molecular strategies to specifically modulate gene expression provide unparalleled opportunities to silence or correct mutant genes and to boost healthy gene expression in patients with genetic HCM and DCM. Many challenges remain, but the active and vital efforts of physicians, researchers, and patients are poised to ensure success.

Hypertrophic and Dilated Cardiomyopathies

10% receive a heart transplant; 12-year survival

Mutations perturb function

TTN: contributes to 20% of dilated cardiomyopathy

Gene silencing

Pluripotent cells deliver therapies

  • Q&A 11:00 AM – 11:20 AM  

11:00 AM – 11:10 AM FIRST LOOK

Unlocking the secret lives of proteins in health and disease

Anna Greka, MD, PhD

  • Medicine, BWH
  • Associate Professor, Medicine, HMS

Cyprus Island: kidney disease caused by a mutation leading to MUC1 accumulation and cell death. BRD4780, a small molecule that clears the misfolded proteins from the kidney. Organoids from pluripotent stem cells. The small molecule is being developed for applications in other cell types in brain and eye with gene mutations. Building a mechanism for therapy; clinical models; transition from Academia to biotech

Q&A

  • 11:10 AM – 11:30 AM  

11:10 AM – 11:35 AM

Rare and Ultra Rare Diseases | GCT Breaks Through

One of the most innovative segments in all of healthcare is the development of GCT driven therapies for rare and ultra-rare diseases. Driven by a series of insights and tools and funded in part by disease focused foundations, philanthropists and abundant venture funding disease after disease is yielding to new GCT technology. These often become platforms to address more prevalent diseases. The goal of making these breakthroughs routine and affordable is challenged by a range of issues including clinical trial design and pricing.

  • What is driving the interest in rare diseases?
  • What are the biggest barriers to making breakthroughs ‘routine and affordable?’
  • What is the role of retrospective and prospective natural history studies in rare disease?  When does the expected value of retrospective disease history studies justify the cost?
  • Related to the first question, what is the FDA expecting as far as controls in clinical trials for rare diseases?  How does this impact the collection of natural history data?

Moderator: Susan Slaugenhaupt, PhD

  • Scientific Director and Elizabeth G. Riley and Daniel E. Smith Jr., Endowed Chair, Mass General Research Institute
  • Professor, Neurology, HMS

Speakers: Leah Bloom, PhD

  • SVP, External Innovation and Strategic Alliances, Novartis Gene Therapies

Ultra-rare (fewer than 100 patients) vs rare: difficulty recruiting patients and following up after treatment

Bobby Gaspar, MD, PhD

  • CEO, Orchard Therapeutics

Studies of rare conditions have transferred to other, larger diseases – delivery of therapeutic genes, as in immune disorders

Patient testimonials: just to hear what a treatment can make possible

Emil Kakkis, MD, PhD

  • CEO, Ultragenyx

Do a 100-patient study, then have information on natural history to develop a clinical trial

Stuart Peltz, PhD

  • CEO, PTC Therapeutics

Rare disease: challenges for FDA approval and post-market commercialization follow-up

Justification of cost for rare disease – demonstration of change is the IP value; patient advocacy is helpful

  • Q&A 11:40 AM – 11:55 AM  

11:40 AM – 12:00 PM FIRESIDE

Partnering Across the GCT Spectrum

  Moderator: Erin Harris

  • Chief Editor, Cell & Gene

Perspective & professional tenure

Partnership in manufacturing what are the recommendations?

Hospital systems: partnership challenges

Speaker: Marc Casper

  • CEO, ThermoFisher

25 years in Diagnostics; the last 20 years at ThermoFisher

Products used in the lab for CAR-T research and manufacturing

CGT innovations: the FDA will have a high level of approvals each year

How to move from research to clinical trials to manufacturing: a quicker process

Best practices in partnerships: the root cause is acceleration to market; service providers deliver the highest standards

Building capacity by acquisition to avoid the waiting time

Accelerating how new products are manufactured

Collaborations with academic medical centers, i.e., UCSF in CGT; joint funding to accelerate CGT to clinics

Customers are extremely knowledgeable; scale the capital investment made

$150M a year to improve the workflow

  • Q&A 12:05 PM – 12:20 PM  

12:05 PM – 12:30 PM

CEO Panel | Anticipating Disruption | Planning for Widespread GCT

The power of GCT to cure disease has the prospect of profoundly improving the lives of patients who respond. Planning for a disruption of this magnitude is complex and challenging as it will change care across the spectrum. Leading chief executives share perspectives on how the industry will change and how this change should be anticipated.

Moderator: Meg Tirrell

  • Senior Health and Science Reporter, CNBC

CGT becoming a staple therapy: what are the emerging disruptors?

Speakers: Lisa Dechamps

  • SVP & Chief Business Officer, Novartis Gene Therapies

Reimagine medicine with collaboration at MGH; MDM condition in children

The Science is there; sustainable processes and systems; the impact is transformational

Value-based pricing; risk sharing between Payers and Pharma for one-time therapy with life-span effect

Collaboration with FDA

Kieran Murphy

  • CEO, GE Healthcare

Diagnosis of disease to be used in CGT

2021 investment in CAR-T platform 

Investment in several CGT frontiers

Investment in AI and ML in system design; new technologies

GE: scale and global distribution; sponsoring companies in software

Waste in Industry – Healthcare’s % of GDP; working with MGH to smooth the workflow, faster entry into and out of the hospital

Telemedicine during the Pandemic: Radiologists need to read remotely

Supply chain disruptions slow down the whole ecosystem

Production of ventilators through collaboration with GM – ingenuity

Scanning patients outside of the hospital: a scanner in a box

Christian Rommel, PhD

  • Head, Pharmaceuticals Research & Development, Bayer AG

CGT – 2016 and in 2020 new leadership and capability 

Disease Biology and therapeutics

Regenerative Medicine: CGT vs repair; building a pipeline in ophthalmology and cardiovascular

During the Pandemic: delivering medicines like Moderna and Pfizer – collaborations between competitors, with Government. Bayer entered into vaccines in 5 days; all processes had to change; access to innovations developed over decades for medical solutions

  • Q&A 12:35 PM – 12:50 PM  

12:35 PM – 12:55 PM FIRESIDE

Building a GCT Portfolio

GCT represents a large and growing market for novel therapeutics that has several segments. These include Cardiovascular Disease, Cancer, Neurological Diseases, Infectious Disease, Ophthalmology, Benign Blood Disorders, and many others; Manufacturing and Supply Chain including CDMOs and CMOs; Stem Cells and Regenerative Medicine; Tools and Platforms (viral vectors, nano delivery, gene editing, etc.). Bayer’s pharma business participates in virtually all of these segments. How does a company like Bayer approach the development of a portfolio in a space as large and as diverse as this one? How does Bayer approach the support of the production infrastructure with unique demands and significant differences from its historical requirements? Moderator:

Shinichiro Fuse, PhD

  • Managing Partner, MPM Capital

Speaker: Wolfram Carius, PhD

  • EVP, Pharmaceuticals, Head of Cell & Gene Therapy, Bayer AG

CGT will bring treatment to cure; delivery of therapies

Be a Leader: repair, regenerate, cure

Technology and Science for CGT – building a portfolio vs a single asset: decision criteria, development of IP, market access, patient access, acceleration of new products

Bayer strategy: build a platform for use by four domains

Gene augmentation

Autologous and allogeneic therapy; analytics

Gene editing

Oncology cell therapy for tumor treatment: what kind of cells – the jury is out

Of 23 product launches at Bayer, no prediction is possible; some highs, some lows

  • Q&A 1:00 PM – 1:15 PM  

12:55 PM – 1:35 PM

Lunch

  1:40 PM – 2:05 PM

GCT Delivery | Perfecting the Technology

Gene delivery uses physical, chemical, or viral means to introduce genetic material into cells. As more genetically modified therapies move closer to the market, challenges involving safety, efficacy, and manufacturing have emerged. Optimizing lipidic and polymer nanoparticles and exosomal delivery is a short-term priority. This panel will examine how the short-term and long-term challenges are being tackled particularly for non-viral delivery modalities. Moderator: Natalie Artzi, PhD

  • Assistant Professor, BWH

Speakers: Geoff McDonough, MD

  • CEO, Generation Bio

Sonya Montgomery

  • CMO, Evox Therapeutics

Laura Sepp-Lorenzino, PhD

  • Chief Scientific Officer, Executive Vice President, Intellia Therapeutics

Doug Williams, PhD

  • CEO, Codiak BioSciences
  • Q&A 2:10 PM – 2:25 PM  

2:05 PM – 2:10 PM

Invention Discovery Grant Announcement

  2:10 PM – 2:20 PM FIRST LOOK

Enhancing vesicles for therapeutic delivery of bioproducts

Xandra Breakefield, PhD

  • Geneticist, MGH
  • Professor, Neurology, HMS
  • Q&A 2:20 PM – 2:35 PM  

2:20 PM – 2:30 PM FIRST LOOK

Versatile polymer-based nanocarriers for targeted therapy and immunomodulation

Natalie Artzi, PhD

  • Assistant Professor, BWH
  • Q&A 2:30 PM – 2:45 PM  

2:55 PM – 3:20 PM HOT TOPICS

Gene Editing | Achieving Therapeutic Mainstream

Gene editing was recognized by the Nobel Committee as “one of gene technology’s sharpest tools, having a revolutionary impact on life sciences.” Introduced in 2011, gene editing is used to modify DNA. It has applications across almost all categories of disease and is also being used in agriculture and public health.

Today’s panel is made up of pioneers who represent foundational aspects of gene editing.  They will discuss the movement of the technology into the therapeutic mainstream.

  • Successes in gene editing – lessons learned from late-stage assets (sickle cell, ophthalmology)
  • When to use what editing tool – pros and cons of traditional gene-editing v. base editing.  Is prime editing the future? Specific use cases for epigenetic editing.
  • When we reach widespread clinical use – role of off-target editing – is the risk real?  How will we mitigate? How practical is patient-specific off-target evaluation?

Moderator: J. Keith Joung, MD, PhD

  • Robert B. Colvin, M.D. Endowed Chair in Pathology & Pathologist, MGH
  • Professor of Pathology, HMS

Speakers: John Evans

  • CEO, Beam Therapeutics

Lisa Michaels

  • EVP & CMO, Editas Medicine
  • Q&A 3:25 PM – 3:50 PM  

3:25 PM – 3:50 PM HOT TOPICS

Common Blood Disorders | Gene Therapy

There are several dozen companies working to develop gene or cell therapies for Sickle Cell Disease, Beta Thalassemia, and  Fanconi Anemia. In some cases, there are enzyme replacement therapies that are deemed effective and safe. In other cases, the disease is only managed at best. This panel will address a number of questions that are particular to this class of genetic diseases:

  • What are the pros and cons of various strategies for treatment? There is AAV-based editing, non-viral delivery, and even oligonucleotide recruitment of endogenous editing/repair mechanisms. Which approaches are most appropriate for which disease?
  • How can companies increase the speed of recruitment for clinical trials when other treatments are available? What is the best approach to educate patients on a novel therapeutic?
  • How do we best address ethnic and socio-economic diversity to be more representative of the target patient population?
  • How long do we have to follow up with the patients from the scientific, patient’s community, and payer points of view? What are the current FDA and EMA guidelines for long-term follow-up?
  • Where are we with regards to surrogate endpoints and their application to clinically meaningful endpoints?
  • What are the emerging ethical dilemmas in pediatric gene therapy research? Are there challenges with informed consent and pediatric assent for trial participation?
  • Are there differences in reimbursement policies for these different blood disorders? Clearly durability of response is a big factor. Are there other considerations?

Moderator: David Scadden, MD

  • Director, Center for Regenerative Medicine; Co-Director, Harvard Stem Cell Institute, Director, Hematologic Malignancies & Experimental Hematology, MGH
  • Jordan Professor of Medicine, HMS

Speakers: Samarth Kulkarni, PhD

Nick Leschly

  • Chief Bluebird, Bluebird Bio

Mike McCune, MD, PhD

  • Head, HIV Frontiers, Global Health Innovative Technology Solutions, Bill & Melinda Gates Foundation
  • Q&A 3:55 PM – 4:15 PM  

3:50 PM – 4:00 PM FIRST LOOK

Gene Editing

J. Keith Joung, MD, PhD

  • Robert B. Colvin, M.D. Endowed Chair in Pathology & Pathologist, MGH
  • Professor of Pathology, HMS
  • Q&A 4:00 PM – 4:20 PM  

4:20 PM – 4:45 PM HOT TOPICS

Gene Expression | Modulating with Oligonucleotide-Based Therapies

Oligonucleotide drugs have recently come into their own with approvals from companies such as Biogen, Alnylam, Novartis and others. This panel will address several questions:

How important is the delivery challenge for oligonucleotides? Are technological advancements emerging that will improve the delivery of oligonucleotides to the CNS or skeletal muscle after systemic administration?

  • Will oligonucleotides improve as a class in ways that make them even more effective? Are further advancements in backbone chemistry anticipated, for example?
  • Will oligonucleotide based therapies blaze trails for follow-on gene therapy products?
  • Are small molecules a threat to oligonucleotide-based therapies?
  • Beyond exon skipping and knock-down mechanisms, what other roles will oligonucleotide-based therapies take mechanistically? Can oligonucleotides activate genes? Is there a place for multiple-mechanism oligonucleotide medicines?
  • Are there any advantages of RNAi-based oligonucleotides over ASOs, and if so for what use?

Moderator: Jeannie Lee, MD, PhD

  • Molecular Biologist, MGH
  • Professor of Genetics, HMS

Speakers: Bob Brown, PhD

  • CSO, EVP of R&D, Dicerna

Brett Monia, PhD

  • CEO, Ionis

Alfred Sandrock, MD, PhD

  • EVP, R&D and CMO, Biogen
  • Q&A 4:50 PM – 5:05 PM  

4:45 PM – 4:55 PM FIRST LOOK

RNA therapy for brain cancer

Pierpaolo Peruzzi, MD, PhD

  • Neurosurgery, BWH
  • Assistant Professor of Neurosurgery, HMS
  • Q&A 4:55 PM – 5:15 PM  

Friday, May 21, 2021

8:30 AM – 8:55 AM

Venture Investing | Shaping GCT Translation

What is occurring in the GCT venture capital segment? Which elements are seeing the most activity? Which areas have cooled? How is the investment market segmented between gene therapy, cell therapy, and gene editing? What makes a hot GCT company? How long will the market stay frothy? Some review of demographics — # of investments, sizes, etc. Why is the market hot, and how long do we expect it to stay that way? How do the top 5 geographic markets rank for GCT company creation and investing? Are there academic centers that have been especially adept at accelerating GCT outcomes? Do the business models for the rapid development of coronavirus vaccines have any lessons for how GCT technology can be brought to market more quickly?

Moderator: Meredith Fisher, PhD

  • Partner, Mass General Brigham Innovation Fund

Speakers: David Berry, MD, PhD

  • CEO, Valo Health
  • General Partner, Flagship Pioneering

Robert Nelsen

  • Managing Director, Co-founder, ARCH Venture Partners

Kush Parmar, MD, PhD

  • Managing Partner, 5AM Ventures
  • Q&A 9:00 AM – 9:15 AM  

9:00 AM – 9:25 AM

Regenerative Medicine | Stem Cells

The promise of stem cells has been a highlight in the realm of regenerative medicine. Unfortunately, that promise remains largely in the future. Recent breakthroughs have accelerated these potential interventions in particular for treating neurological disease. Among the topics the panel will consider are:

  • Stem cell sourcing
  • Therapeutic indication growth
  • Genetic and other modification in cell production
  • Cell production to final product optimization and challenges
  • How to optimize the final product

Moderator: Ole Isacson, MD, PhD

  • Director, Neuroregeneration Research Institute, McLean
  • Professor, Neurology and Neuroscience, HMS

Speakers: Kapil Bharti, PhD

  • Senior Investigator, Ocular and Stem Cell Translational Research Section, NIH

Joe Burns, PhD

  • VP, Head of Biology, Decibel Therapeutics

Erin Kimbrel, PhD

  • Executive Director, Regenerative Medicine, Astellas

Nabiha Saklayen, PhD

  • CEO and Co-Founder, Cellino
  • Q&A 9:30 AM – 9:45 AM  

9:25 AM – 9:35 AM FIRST LOOK

Stem Cells

Bob Carter, MD, PhD

  • Chairman, Department of Neurosurgery, MGH
  • William and Elizabeth Sweet, Professor of Neurosurgery, HMS
  • Q&A 9:35 AM – 9:55 AM  

9:35 AM – 10:00 AM

Capital Formation ’21-30 | Investing Modes Driving GCT Technology and Timing

The dynamics of venture/PE investing and IPOs are fast evolving. What are the drivers – will the number of investors grow, and will the size of early rounds continue to grow? How is this reflected in GCT target areas, company design, and biotech overall? Do patients benefit from these trends? Is crossover investing a distinct class or a little of both? Why did it emerge, and what are the characteristics of the players? Will SPACs play a role in the growth of the gene and cell therapy industry? What is the role of corporate investment arms, e.g., NVS, Bayer, GV, etc. – has a category killer emerged? Are we nearing the limit of what the GCT market can absorb, or will investment capital continue to grow unabated?

Moderator: Roger Kitterman

  • VP, Venture, Mass General Brigham

Speakers: Ellen Hukkelhoven, PhD

  • Managing Director, Perceptive Advisors

Peter Kolchinsky, PhD

  • Founder and Managing Partner, RA Capital Management

Deep Nishar

  • Senior Managing Partner, SoftBank Investment Advisors

Oleg Nodelman

  • Founder & Managing Partner, EcoR1 Capital
  • Q&A 10:05 AM – 10:20 AM  

10:00 AM – 10:10 AM FIRST LOOK

New scientific and clinical developments for autologous stem cell therapy for Parkinson’s disease patients

Penelope Hallett, PhD

  • NRL, McLean
  • Assistant Professor Psychiatry, HMS
  • Q&A 10:10 AM – 10:30 AM  

10:10 AM – 10:35 AM HOT TOPICS

Neurodegenerative Clinical Outcomes | Achieving GCT Success

Can stem cell-based platforms become successful treatments for neurodegenerative diseases?

  • What are the commonalities driving GCT success in neurodegenerative disease and non-neurologic disease, and what are the key differences?
  • Overcoming treatment administration challenges
  • GCT impact on degenerative stage of disease
  • How difficult will it be to titrate the size of the cell therapy effect in different neurological disorders and for different patients?
  • Demonstrating clinical value to patients and payers
  • Revised clinical trial models to address issues and concerns specific to GCT

Moderator: Bob Carter, MD, PhD

  • Chairman, Department of Neurosurgery, MGH
  • William and Elizabeth Sweet, Professor of Neurosurgery, HMS

Speakers: Erwan Bezard, PhD

  • INSERM Research Director, Institute of Neurodegenerative Diseases

Nikola Kojic, PhD

  • CEO and Co-Founder, Oryon Cell Therapies

Geoff MacKay

  • President & CEO, AVROBIO

Viviane Tabar, MD

  • Founding Investigator, BlueRock Therapeutics
  • Chair of Neurosurgery, Memorial Sloan Kettering
  • Q&A 10:40 AM – 10:55 AM  

10:35 AM – 11:35 AM

Disruptive Dozen: 12 Technologies that Will Reinvent GCT

Nearly one hundred senior Mass General Brigham Harvard faculty contributed to the selection of this group of twelve GCT technologies that they believe will break through in the next two years. The Disruptive Dozen identifies and ranks the GCT technologies that will be available on at least an experimental basis and have the chance of significantly improving health care.

11:35 AM – 11:45 AM

Concluding Remarks

The co-chairs convene to reflect on the insights shared over the three days. They will discuss what to expect at the in-person GCT-focused May 2-4, 2022 World Medical Innovation Forum.

Christine Seidman, MD

Hypertrophic and Dilated Cardiomyopathies

10% receive a heart transplant; 12-year survival

Mutations perturb function

TTN: contributes to 20% of dilated cardiomyopathy

Gene silencing

Pluripotent cells to deliver therapies

  • Q&A 11:00 AM – 11:20 AM  

11:00 AM – 11:10 AM FIRST LOOK

Unlocking the secret lives of proteins in health and disease

Anna Greka, MD, PhD

  • Medicine, BWH
  • Associate Professor, Medicine, HMS

In Cyprus, a kidney disease caused by a mutation leads to accumulation of misfolded MUC1 protein and cell death. The small molecule BRD4780 clears the misfolded proteins from kidney organoids derived from pluripotent stem cells. The small molecule is being developed for applications in other cell types, e.g., brain and eye. From gene mutation, build the mechanism for therapy in clinical models; transition from academia to biotech.

Q&A

  • 11:10 AM – 11:30 AM  

11:10 AM – 11:35 AM

Rare and Ultra Rare Diseases | GCT Breaks Through

One of the most innovative segments in all of healthcare is the development of GCT driven therapies for rare and ultra-rare diseases. Driven by a series of insights and tools and funded in part by disease focused foundations, philanthropists and abundant venture funding disease after disease is yielding to new GCT technology. These often become platforms to address more prevalent diseases. The goal of making these breakthroughs routine and affordable is challenged by a range of issues including clinical trial design and pricing.

  • What is driving the interest in rare diseases?
  • What are the biggest barriers to making breakthroughs ‘routine and affordable?’
  • What is the role of retrospective and prospective natural history studies in rare disease?  When does the expected value of retrospective disease history studies justify the cost?
  • Related to the first question, what is the FDA expecting as far as controls in clinical trials for rare diseases?  How does this impact the collection of natural history data?

Moderator: Susan Slaugenhaupt, PhD

  • Scientific Director and Elizabeth G. Riley and Daniel E. Smith Jr., Endowed Chair, Mass General Research Institute
  • Professor, Neurology, HMS

Speakers: Leah Bloom, PhD

  • SVP, External Innovation and Strategic Alliances, Novartis Gene Therapies

Ultra-rare (fewer than 100 patients) vs rare: difficulty recruiting patients and following them up after treatment

Bobby Gaspar, MD, PhD

  • CEO, Orchard Therapeutics

Studies of rare conditions have transferred to other, larger diseases – delivery of therapeutic genes, e.g., for immune disorders

Patient testimonials: just to hear what a difference a treatment can make

Emil Kakkis, MD, PhD

  • CEO, Ultragenyx

Do a 100-patient study, then have information on natural history to develop a clinical trial

Stuart Peltz, PhD

  • CEO, PTC Therapeutics

Rare disease: challenges for FDA approval and post-market commercialization follow-up

Justification of cost for rare disease – demonstrating change is the IP in the value; patient advocacy is helpful

  • Q&A 11:40 AM – 11:55 AM  

11:40 AM – 12:00 PM FIRESIDE

Partnering Across the GCT Spectrum

  Moderator: Erin Harris

  • Chief Editor, Cell & Gene

Perspective & professional tenure

Partnerships in manufacturing: what are the recommendations?

Hospital systems: partnership challenges

Speaker: Marc Casper

  • CEO, ThermoFisher

25 years in diagnostics, the last 20 years at ThermoFisher

Products used in the lab for CAR-T research and manufacture

CGT innovations: FDA will have a high level of approvals each year

How to move from research to clinical trials to manufacturing with a quicker process

Best practices in partnerships: the root cause of acceleration to market; service providers must deliver the highest standards

Building capacity by acquisition to avoid the waiting time

Accelerate getting new products manufactured

Collaborations with academic medical centers, e.g., UCSF, in CGT; joint funding to accelerate CGT to the clinic

Customers are extremely knowledgeable; scale the capital investment made

$150M a year to improve the workflow

  • Q&A 12:05 PM – 12:20 PM  

12:05 PM – 12:30 PM

CEO Panel | Anticipating Disruption | Planning for Widespread GCT

The power of GCT to cure disease has the prospect of profoundly improving the lives of patients who respond. Planning for a disruption of this magnitude is complex and challenging, as it will change care across the spectrum. Leading chief executives share perspectives on how the industry will change and how this change should be anticipated.

Moderator: Meg Tirrell

  • Senior Health and Science Reporter, CNBC

CGT becoming a staple therapy: what are the emerging disruptors?

Speakers: Lisa Dechamps

  • SVP & Chief Business Officer, Novartis Gene Therapies

Reimagine medicine with collaboration at MGH; MDM condition in children

The science is there; with sustainable processes and systems, the impact is transformational

Value-based pricing; risk sharing between payers and pharma for a one-time therapy with a lifespan-long effect

Collaboration with FDA

Kieran Murphy

  • CEO, GE Healthcare

Diagnosis of disease to be used in CGT

2021 investment in CAR-T platform 

Investment in several CGT frontiers

Investment in AI and ML in system design; new technologies

GE: scale and global distribution; sponsoring companies in software

Waste in the industry – healthcare as a % of GDP; work with MGH to smooth the workflow for faster entry into and out of the hospital

Telemedicine during the pandemic: radiologists need to read remotely

Supply chain disruptions slow down the whole ecosystem

Production of ventilators in collaboration with GM – ingenuity

Scan patients outside of the hospital: a scanner in a box

Christian Rommel, PhD

  • Head, Pharmaceuticals Research & Development, Bayer AG

CGT: in 2016 and in 2020, new leadership and capabilities

Disease biology and therapeutics

Regenerative medicine: CGT vs repair; building a pipeline in ophthalmology and cardiovascular

During the pandemic: delivering medicines like Moderna and Pfizer – collaborations between competitors and with government. Bayer entered into vaccines in 5 days; all processes had to change to access innovations developed over decades for medical solutions

  • Q&A 12:35 PM – 12:50 PM  

12:35 PM – 12:55 PM FIRESIDE

Building a GCT Portfolio

GCT represents a large and growing market for novel therapeutics that has several segments. These include Cardiovascular Disease, Cancer, Neurological Diseases, Infectious Disease, Ophthalmology, Benign Blood Disorders, and many others; Manufacturing and Supply Chain including CDMOs and CMOs; Stem Cells and Regenerative Medicine; and Tools and Platforms (viral vectors, nano delivery, gene editing, etc.). Bayer’s pharma business participates in virtually all of these segments. How does a company like Bayer approach the development of a portfolio in a space as large and as diverse as this one? How does Bayer approach the support of the production infrastructure with unique demands and significant differences from its historical requirements?

Moderator:

Shinichiro Fuse, PhD

  • Managing Partner, MPM Capital

Speaker: Wolfram Carius, PhD

  • EVP, Pharmaceuticals, Head of Cell & Gene Therapy, Bayer AG

CGT will bring treatment to cure; delivery of therapies

Be a leader: repair, regenerate, cure

Technology and science for CGT – building a portfolio vs a single asset: decision criteria, development of IP, market access, patient access, acceleration of new products

Bayer strategy: build a platform for use by four domains

Gene augmentation

Allogeneic therapy, analytics

Gene editing

Oncology cell therapy for tumor treatment: what kind of cells – the jury is out

Of 23 product launches at Bayer, no prediction is possible – some highs, some lows

  • Q&A 1:00 PM – 1:15 PM  

12:55 PM – 1:35 PM

Lunch

  1:40 PM – 2:05 PM

GCT Delivery | Perfecting the Technology

Gene delivery uses physical, chemical, or viral means to introduce genetic material into cells. As more genetically modified therapies move closer to the market, challenges involving safety, efficacy, and manufacturing have emerged. Optimizing lipidic and polymer nanoparticles and exosomal delivery is a short-term priority. This panel will examine how the short-term and long-term challenges are being tackled particularly for non-viral delivery modalities. Moderator: Natalie Artzi, PhD

  • Assistant Professor, BWH

Speakers: Geoff McDonough, MD

  • CEO, Generation Bio

Sonya Montgomery

  • CMO, Evox Therapeutics

Laura Sepp-Lorenzino, PhD

  • Chief Scientific Officer, Executive Vice President, Intellia Therapeutics

Doug Williams, PhD

  • CEO, Codiak BioSciences
  • Q&A 2:10 PM – 2:25 PM  

2:05 PM – 2:10 PM

Invention Discovery Grant Announcement

  2:10 PM – 2:20 PM FIRST LOOK

Enhancing vesicles for therapeutic delivery of bioproducts

Xandra Breakefield, PhD

  • Geneticist, MGH
  • Professor, Neurology, HMS
  • Q&A 2:20 PM – 2:35 PM  

2:20 PM – 2:30 PM FIRST LOOK

Versatile polymer-based nanocarriers for targeted therapy and immunomodulation

Natalie Artzi, PhD

  • Assistant Professor, BWH
  • Q&A 2:30 PM – 2:45 PM  

2:55 PM – 3:20 PM HOT TOPICS

Gene Editing | Achieving Therapeutic Mainstream

Gene editing was recognized by the Nobel Committee as “one of gene technology’s sharpest tools, having a revolutionary impact on life sciences.” Introduced in 2011, gene editing is used to modify DNA. It has applications across almost all categories of disease and is also being used in agriculture and public health.

Today’s panel is made up of pioneers who represent foundational aspects of gene editing.  They will discuss the movement of the technology into the therapeutic mainstream.

  • Successes in gene editing – lessons learned from late-stage assets (sickle cell, ophthalmology)
  • When to use what editing tool – pros and cons of traditional gene-editing v. base editing.  Is prime editing the future? Specific use cases for epigenetic editing.
  • When we reach widespread clinical use – role of off-target editing – is the risk real?  How will we mitigate? How practical is patient-specific off-target evaluation?

Moderator: J. Keith Joung, MD, PhD

  • Robert B. Colvin, M.D. Endowed Chair in Pathology & Pathologist, MGH
  • Professor of Pathology, HMS

Speakers: John Evans

  • CEO, Beam Therapeutics

Lisa Michaels

  • EVP & CMO, Editas Medicine
  • Q&A 3:25 PM – 3:50 PM  

3:25 PM – 3:50 PM HOT TOPICS

Common Blood Disorders | Gene Therapy

There are several dozen companies working to develop gene or cell therapies for Sickle Cell Disease, Beta Thalassemia, and  Fanconi Anemia. In some cases, there are enzyme replacement therapies that are deemed effective and safe. In other cases, the disease is only managed at best. This panel will address a number of questions that are particular to this class of genetic diseases:

  • What are the pros and cons of various strategies for treatment? There are AAV-based editing, non-viral delivery, and even oligonucleotide recruitment of endogenous editing/repair mechanisms. Which approaches are most appropriate for which disease?
  • How can companies increase the speed of recruitment for clinical trials when other treatments are available? What is the best approach to educate patients on a novel therapeutic?
  • How do we best address ethnic and socio-economic diversity to be more representative of the target patient population?
  • How long do we have to follow up with the patients from the scientific, patient’s community, and payer points of view? What are the current FDA and EMA guidelines for long-term follow-up?
  • Where are we with regards to surrogate endpoints and their application to clinically meaningful endpoints?
  • What are the emerging ethical dilemmas in pediatric gene therapy research? Are there challenges with informed consent and pediatric assent for trial participation?
  • Are there differences in reimbursement policies for these different blood disorders? Clearly durability of response is a big factor. Are there other considerations?

Moderator: David Scadden, MD

  • Director, Center for Regenerative Medicine; Co-Director, Harvard Stem Cell Institute, Director, Hematologic Malignancies & Experimental Hematology, MGH
  • Jordan Professor of Medicine, HMS

Speakers: Samarth Kulkarni, PhD

Nick Leschly

  • Chief Bluebird, Bluebird Bio

Mike McCune, MD, PhD

  • Head, HIV Frontiers, Global Health Innovative Technology Solutions, Bill & Melinda Gates Foundation
  • Q&A 3:55 PM – 4:15 PM  

3:50 PM – 4:00 PM FIRST LOOK

Gene Editing

J. Keith Joung, MD, PhD

  • Robert B. Colvin, M.D. Endowed Chair in Pathology & Pathologist, MGH
  • Professor of Pathology, HMS
  • Q&A 4:00 PM – 4:20 PM  

4:20 PM – 4:45 PM HOT TOPICS

Gene Expression | Modulating with Oligonucleotide-Based Therapies

Oligonucleotide drugs have recently come into their own with approvals from companies such as Biogen, Alnylam, Novartis and others. This panel will address several questions:

How important is the delivery challenge for oligonucleotides? Are technological advancements emerging that will improve the delivery of oligonucleotides to the CNS or skeletal muscle after systemic administration?

  • Will oligonucleotides improve as a class, making them even more effective? Are further advancements in backbone chemistry anticipated, for example?
  • Will oligonucleotide based therapies blaze trails for follow-on gene therapy products?
  • Are small molecules a threat to oligonucleotide-based therapies?
  • Beyond exon skipping and knock-down mechanisms, what other roles will oligonucleotide-based therapies take mechanistically? Can oligonucleotides activate genes? Is there a place for multiple-mechanism oligonucleotide medicines?
  • Are there any advantages of RNAi-based oligonucleotides over ASOs, and if so for what use?

Moderator: Jeannie Lee, MD, PhD

  • Molecular Biologist, MGH
  • Professor of Genetics, HMS

Speakers: Bob Brown, PhD

  • CSO, EVP of R&D, Dicerna

Brett Monia, PhD

  • CEO, Ionis

Alfred Sandrock, MD, PhD

  • EVP, R&D and CMO, Biogen
  • Q&A 4:50 PM – 5:05 PM  

4:45 PM – 4:55 PM FIRST LOOK

RNA therapy for brain cancer

Pierpaolo Peruzzi, MD, PhD

  • Neurosurgery, BWH
  • Assistant Professor of Neurosurgery, HMS
  • Q&A 4:55 PM – 5:15 PM  

Friday, May 21, 2021

Computer connection to the iCloud of WordPress.com froze completely at 10:30 AM EST and no file update was possible. Coverage of May 21, 2021 is recorded below, following the agenda, by copy and paste of all the tweets I produced on May 21, 2021.

8:30 AM – 8:55 AM

Venture Investing | Shaping GCT Translation

What is occurring in the GCT venture capital segment? Which elements are seeing the most activity? Which areas have cooled? How is the investment market segmented between gene therapy, cell therapy, and gene editing? What makes a hot GCT company? How long will the market stay frothy? Some review of demographics — # of investments, sizes, etc. Why is the market hot, and how long do we expect it to stay that way? How do the top 5 geographic markets rank for GCT company creation and investing? Are there academic centers that have been especially adept at accelerating GCT outcomes? Do the business models for the rapid development of coronavirus vaccines have any lessons for how GCT technology can be brought to market more quickly?

Moderator: Meredith Fisher, PhD

  • Partner, Mass General Brigham Innovation Fund

Strategies, successes, and what changes are needed in the drug discovery process

Speakers:

Bring a disruptive frontier as a platform with reliable delivery; CGT double knockout as a disease cure-all changes efficiency and scope; human-centric vs mouse-centered; the right scale of data converted into therapeutics acceleration

Innovation in drugs: 60% fail in trials because of toxicology; systems of the future will deal with big diseases

Moderna is an example of unlocking what is inside us – the microbiome and beyond; discover new drugs, epigenetics

  • Robert Nelsen
    • Managing Director, Co-founder, ARCH Venture Partners

A manufacturing change should not require a new clinical trial; the FDA needs to be presented with new thinking for big innovations. Cheaper drug pricing requires systematization: how to systematically scale up, systematizing discovery and production; regulatory innovations

Responsibility mismatch between what “should be” and what “is”

Long-term diseases: stakeholders and modalities; risk-benefit for populations

  • Q&A 9:00 AM – 9:15 AM  

9:00 AM – 9:25 AM

Regenerative Medicine | Stem Cells

The promise of stem cells has been a highlight in the realm of regenerative medicine. Unfortunately, that promise remains largely in the future. Recent breakthroughs have accelerated these potential interventions in particular for treating neurological disease. Among the topics the panel will consider are:

  • Stem cell sourcing
  • Therapeutic indication growth
  • Genetic and other modification in cell production
  • Cell production to final product optimization and challenges
  • How to optimize the final product

Moderator: Ole Isacson, MD, PhD

  • Director, Neuroregeneration Research Institute, McLean
  • Professor, Neurology and Neuroscience, MGH, HMS

Opportunities in the next generation at the tactical level; welcome the optimism and energy level of all; translational medicine funding; stem cells offer enormous opportunities

Speakers: Kapil Bharti, PhD

  • Senior Investigator, Ocular and Stem Cell Translational Research Section, NIH
  • The first drug required establishing the process for that innovation; design of animal studies not done before
  • Off-the-shelf one-time treatment becoming a cure
  • Intact tissue in a dish is fragile; hard to maintain metabolism

Joe Burns, PhD

  • VP, Head of Biology, Decibel Therapeutics
  • The ear inside the skull: compartments and receptors responsible for hearing are highly differentiated; a tall ask to identify cells for anticipated differentiation
  • Multiple cell types and tissues to follow

Erin Kimbrel, PhD

  • Executive Director, Regenerative Medicine, Astellas
  • In the ocular space, immunogenicity
  • Regulatory communication
  • Use gene editing for immunogenicity (Cas1 and Cas2); autologous cells
  • Gene editing and programming: big opportunities

Nabiha Saklayen, PhD

  • CEO and Co-Founder, Cellino
  • Scale production of autologous cells in a foundry, using a semiconductor process in building cassettes
  • A solution for autologous cells
  • Q&A 9:30 AM – 9:45 AM  

9:25 AM – 9:35 AM FIRST LOOK

Stem Cells

Bob Carter, MD, PhD

  • Chairman, Department of Neurosurgery, MGH
  • William and Elizabeth Sweet, Professor of Neurosurgery, HMS
  • Cell therapy for Parkinson's: replace the dopamine-producing cells; patients have lost the ability to produce dopamine
  • Skin cells become autologous cells, reprogrammed into dopamine-producing cells
  • Transplantation of fibroblast cells; a metabolically driven process with lower mutation burden
  • Quercetin inhibition eliminates undifferentiated cells; graft survival and oxygenation increased
  • Q&A 9:35 AM – 9:55 AM  

9:35 AM – 10:00 AM

Capital Formation ’21-30 | Investing Modes Driving GCT Technology and Timing

The dynamics of venture/PE investing and IPOs are fast evolving. What are the drivers – will the number of investors grow, and will the size of early rounds continue to grow? How is this reflected in GCT target areas, company design, and biotech overall? Do patients benefit from these trends? Is crossover investing a distinct class or a little of both? Why did it emerge, and what are the characteristics of the players? Will SPACs play a role in the growth of the gene and cell therapy industry? What is the role of corporate investment arms, e.g., NVS, Bayer, GV, etc. – has a category killer emerged? Are we nearing the limit of what the GCT market can absorb, or will investment capital continue to grow unabated?

Moderator: Roger Kitterman

  • VP, Venture, Mass General Brigham
  • Has saturation been reached, or is more investment coming into CGT?

Speakers: Ellen Hukkelhoven, PhD

  • Managing Director, Perceptive Advisors
  • Cardiac area: transducing cells
  • Matching tools
  • 10% success rate from phase 1 in drug development; the next phase matters more

Peter Kolchinsky, PhD

  • Founder and Managing Partner, RA Capital Management
  • Future-proofing for newcomer disruptors 
  • Ex vivo gene therapy to improve funding of products; which toolkit it belongs to 
  • company insulation from the next instability vs. companies stabilizing themselves over a few years
  • Companies interested in SPACs 
  • crossover investment vs. SPAC
  • Multi-omics in cancer early screening; metastatic disease will be wiped out 

Deep Nishar

  • Senior Managing Partner, SoftBank Investment Advisors
  • Young field – CGT started in the 80s 
  • high payloads are a challenge
  • cost-effective, fast delivery to large populations
  • Mission-oriented team and management  
  • Multi-omics disease modality 

Oleg Nodelman

  • Founder & Managing Partner, EcoR1 Capital
  • Invest in companies whose next round of investment will be an IPO
  • Help companies raise money: crossover investment vs. SPAC
  • Innovative ideas from academia in need of funding 
  • Q&A 10:05 AM – 10:20 AM  

10:00 AM – 10:10 AM FIRST LOOK

New scientific and clinical developments for autologous stem cell therapy for Parkinson’s disease patients

Penelope Hallett, PhD

  • NRL, McLean
  • Assistant Professor Psychiatry, HMS
  • Pharmacologic agents in existing use can cause other locomotor/movement-related disorders 
  • efficacy of the autologous cell therapy transplantation approach (programming T cells into dopamine-generating neurons) is greater than allogeneic cell transplantation 
  • Q&A 10:10 AM – 10:30 AM  

10:10 AM – 10:35 AM HOT TOPICS

Neurodegenerative Clinical Outcomes | Achieving GCT Success

Can stem cell-based platforms become successful treatments for neurodegenerative diseases?

  •  What are the commonalities driving GCT success in neurodegenerative disease and non-neurologic disease, what are the key differences?
  • Overcoming treatment administration challenges
  • GCT impact on degenerative stage of disease
  • How difficult will it be to titrate the size of the cell therapy effect in different neurological disorders and for different patients?
  • Demonstrating clinical value to patients and payers
  • Revised clinical trial models to address issues and concerns specific to GCT

Moderator: Bob Carter, MD, PhD

  • Chairman, Department of Neurosurgery, MGH
  • William and Elizabeth Sweet, Professor of Neurosurgery, HMS
  • Neurodegeneration: REVERSAL or slowing down? 

Speakers: Erwan Bezard, PhD

  • INSERM Research Director, Institute of Neurodegenerative Diseases
  • Cautious on reversal 
  • Early intervention versus late

Nikola Kojic, PhD

  • CEO and Co-Founder, Oryon Cell Therapies
  • Autologous cell therapy placed focally, replacing missing synapses; reestablishment of neural circuitry

Geoff MacKay

  • President & CEO, AVROBIO
  • Prevent the condition from being manifested in the first place 
  • durable clinical effect; a single infusion prevents symptoms from manifesting 
  • Cerebral edema – stabilization
  • Gene therapy: knowing which gene is abnormal and grafting in the corrected one 
  • More than a biomarker as an endpoint; functional benefit not yet established  

Viviane Tabar, MD

  • Founding Investigator, BlueRock Therapeutics
  • Chair of Neurosurgery, Memorial Sloan Kettering
  • Current market does not have a delivery mechanism; drug delivery is the solution. Trials would fail on DELIVERY
  • Patients are immune-suppressed for one year to avoid graft rejection. In the autologous approach, a Parkinson's patient's genetically mutated cells are reprogrammed into dopamine-generating neurons – unknowns are present
  • Circuitry restoration
  • Microenvironment of disease; ameliorating symptoms – education of patients on the treatment 
  • Q&A 10:40 AM – 10:55 AM  

10:35 AM – 11:35 AM

Disruptive Dozen: 12 Technologies that Will Reinvent GCT

Nearly one hundred senior Mass General Brigham and Harvard faculty contributed to the creation of this group of twelve GCT technologies that they believe will break through in the next two years. The Disruptive Dozen identifies and ranks the GCT technologies that will be available on at least an experimental basis and have the chance of significantly improving health care.

11:35 AM – 11:45 AM

Concluding Remarks

The co-chairs convene to reflect on the insights shared over the three days. They will discuss what to expect at the in-person, GCT-focused World Medical Innovation Forum, May 2-4, 2022.

ALL THE TWEETS PRODUCED ON MAY 21, 2021 INCLUDE THE FOLLOWING:

Aviva Lev-Ari

@AVIVA1950

@AVIVA1950_PIcs

#WMIF2021

@MGBInnovation

@pharma_BI

Erwan Bezard, PhD, INSERM Research Director, Institute of Neurodegenerative Diseases: Cautious on reversal

Nikola Kojic, PhD, CEO and Co-Founder, Oryon Cell Therapies: Autologous cell therapy placed focally, replacing missing synapses; reestablishment of neural circuitry

Bob Carter, MD, PhD, Chairman, Department of Neurosurgery, MGH; William and Elizabeth Sweet Professor of Neurosurgery, HMS: Neurodegeneration REVERSAL or slowing down?

Penelope Hallett, PhD, NRL, McLean; Assistant Professor of Psychiatry, HMS: efficacy of the autologous cell therapy transplantation approach (programming T cells into dopamine-generating cells) is greater than allogeneic cell transplantation

Penelope Hallett, PhD, NRL, McLean; Assistant Professor of Psychiatry, HMS: Pharmacologic agents in existing use can cause other locomotor/movement-related disorders

Roger Kitterman, VP, Venture, Mass General Brigham: Saturation reached, or is more investment coming in CGT? Multi-omics and academia-originated innovations are the most attractive areas

Oleg Nodelman, Founder & Managing Partner, EcoR1 Capital: Invest in companies whose next round of investment will be an IPO; 20% discount

Peter Kolchinsky, PhD, Founder and Managing Partner, RA Capital Management: Future-proofing for newcomer disruptors; ex vivo gene therapy to improve funding of products; which toolkit it belongs to

Deep Nishar, Senior Managing Partner, SoftBank Investment Advisors: Young field – CGT started in the 80s; high payloads are a challenge

Bob Carter, MD, PhD, MGH, HMS: cells producing dopamine; transplantation of fibroblast cells; a metabolically driven process with lower mutation burden; Quercetin inhibition to eliminate undifferentiated cells; graft survival increased with oxygenation

Bob Carter, MD, PhD, Chairman, Department of Neurosurgery, MGH; Professor of Neurosurgery, HMS: Cell therapy for Parkinson's to replace dopamine-producing cells that have lost the ability to produce dopamine; skin cells reprogrammed into autologous cells

Kapil Bharti, PhD, Senior Investigator, Ocular and Stem Cell Translational Research Section, NIH: Off-the-shelf one-time treatment becoming a cure; intact tissue in a dish is fragile; maintaining metabolism to become like semiconductors

Ole Isacson, MD, PhD, Director, Neuroregeneration Research Institute, McLean; Professor, Neurology and Neuroscience, MGH, HMS: Opportunities in the next generation at the tactical level; welcome the optimism and energy level of all

Erin Kimbrel, PhD, Executive Director, Regenerative Medicine, Astellas: In the ocular space, immunogenicity and regulatory communication; use gene editing for immunogenicity; Cas1 and Cas2; autologous cells

Nabiha Saklayen, PhD, CEO and Co-Founder, Cellino: scaling production of autologous cells; a foundry using semiconductor processes to build cassettes, by optical physicists

Joe Burns, PhD, VP, Head of Biology, Decibel Therapeutics: Ear inside the skull; compartments and receptors responsible for hearing are highly differentiated; a tall ask to identify cells for anticipated differentiation control by genomics

Kapil Bharti, PhD, Senior Investigator, Ocular and Stem Cell Translational Research Section, NIH: first drug required to establish the process for those innovations; design of animal studies not done before

Meredith Fisher, PhD, Partner, Mass General Brigham Innovation Fund: Strategies for success; what changes are needed in the drug discovery process

Robert Nelsen, Managing Director, Co-founder, ARCH Venture Partners: A manufacturing change is not a new clinical trial; FDA needs to be presented with new thinking for big innovations; cheaper drug pricing requires systematization

Kush Parmar, MD, PhD, Managing Partner, 5AM Ventures: Responsibility mismatch between what "should be" and what "is"

David Berry, MD, PhD, CEO, Valo Health; GP, Flagship Pioneering: Bring disruptive frontier platforms; reliable CGT delivery; double-knockout disease cures; change efficiency and scope; human-centric vs. mouse-centered; the right scale of acceleration

Kush Parmar, MD, PhD, Managing Partner, 5AM Ventures: build it yourself, benefit for patients; First Look at MGB shows MEE innovation on the inner ear is a worthy investment

Robert Nelsen, Managing Director, Co-founder, ARCH Venture Partners: Frustration with the supply chain during the pandemic; GMP anticipation in advance; CGT rapid prototyping; rethink and invest; a proactive investor across .edu and Pharma


Clinical Laboratory Challenges

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

CLINICAL LABORATORY NEWS   

The Lab and CJD: Safe Handling of Infectious Prion Proteins

Body fluids from individuals with possible Creutzfeldt-Jakob disease (CJD) present distinctive safety challenges for clinical laboratories. Sporadic, iatrogenic, and familial CJD (known collectively as classic CJD), along with variant CJD, kuru, Gerstmann-Sträussler-Scheinker, and fatal familial insomnia, are prion diseases, also known as transmissible spongiform encephalopathies. Prion diseases affect the central nervous system, and from the onset of symptoms follow a typically rapid progressive neurological decline. While prion diseases are rare, it is not uncommon for the most prevalent form—sporadic CJD—to be included in the differential diagnosis of individuals presenting with rapid cognitive decline. Thus, laboratories may deal with a significant number of possible CJD cases, and should have protocols in place to process specimens, even if a confirmatory diagnosis of CJD is made in only a fraction of these cases.

The Lab’s Role in Diagnosis

Laboratory protocols for handling specimens from individuals with possible, probable, and definitive cases of CJD are important to ensure timely and appropriate patient management. When the differential includes CJD, an attempt should be made to rule in or rule out other causes of rapid neurological decline. Laboratories should be prepared to process blood and cerebrospinal fluid (CSF) specimens in such cases for routine analyses.

Definitive diagnosis requires identification of prion aggregates in brain tissue, which can be achieved by immunohistochemistry, a Western blot for proteinase K-resistant prions, and/or by the presence of prion fibrils. Thus, confirmatory diagnosis is typically achieved at autopsy. A probable diagnosis of CJD is supported by elevated concentration of 14-3-3 protein in CSF (a non-specific marker of neurodegeneration), EEG, and MRI findings. Thus, the laboratory may be required to process and send CSF samples to a prion surveillance center for 14-3-3 testing, as well as blood samples for sequencing of the PRNP gene (in inherited cases).

Processing Biofluids

Laboratories should follow standard protective measures when working with biofluids potentially containing abnormally folded prions, such as donning standard personal protective equipment (PPE); avoiding or minimizing the use of sharps; using single-use disposable items; and processing specimens to minimize formation of aerosols and droplets. An additional safety consideration is the use of single-use disposal PPE; otherwise, re-usable items must be either cleaned using prion-specific decontamination methods, or destroyed.

Blood. In experimental models, infectivity has been detected in the blood; however, there have been no cases of secondary transmission of classical CJD via blood product transfusions in humans. As such, blood has been classified, on epidemiological evidence by the World Health Organization (WHO), as containing “no detectible infectivity,” which means it can be processed by routine methods. Similarly, except for CSF, all other body fluids contain no infectivity and can be processed following standard procedures.

In contrast to classic CJD, there have been four cases of suspected secondary transmission of variant CJD via transfused blood products in the United Kingdom. Variant CJD, the prion disease associated with mad cow disease, is unique in its distribution of prion aggregates outside of the central nervous system, including the lymph nodes, spleen, and tonsils. For regions where variant CJD is a concern, laboratories should consult their regulatory agencies for further guidance.

CSF. Relative to highly infectious tissues of the brain, spinal cord, and eye, infectivity has been identified less often in CSF and is considered to have “low infectivity,” along with kidney, liver, and lung tissue. Since CSF can contain infectious material, WHO has recommended that analyses not be performed on automated equipment due to challenges associated with decontamination. Laboratories should perform a risk assessment of their CSF processes, and, if deemed necessary, consider using manual methods as an alternative to automated systems.

Decontamination

The infectious agent in prion disease is unlike any other infectious pathogen encountered in the laboratory; it is formed of misfolded and aggregated prion proteins. This aggregated proteinaceous material forms the infectious unit, which is incredibly resilient to degradation. Moreover, in vitro studies have demonstrated that disrupting large aggregates into smaller aggregates increases cytotoxicity. Thus, if the aim is to abolish infectivity, all aggregates must be destroyed. Disinfectant procedures used for viral, bacterial, and fungal pathogens such as alcohol, boiling, formalin, dry heat (<300°C), autoclaving at 121°C for 15 minutes, and ionizing, ultraviolet, or microwave radiation, are either ineffective or variably effective against aggregated prions.

The only means to ensure no risk of residual infectious prions is to use disposable materials. This is not always practical, as, for instance, a biosafety cabinet cannot be discarded if there is a CSF spill in the hood. Fortunately, there are several protocols considered sufficient for decontamination. For surfaces and heat-sensitive instruments, such as a biosafety cabinet, WHO recommends flooding the surface with 2N NaOH or undiluted NaClO, letting stand for 1 hour, mopping up, and rinsing with water. If the surface cannot tolerate NaOH or NaClO, thorough cleaning will remove most infectivity by dilution. Laboratories may derive some additional benefit by using one of the partially effective methods discussed previously. Non-disposable heat-resistant items preferably should be immersed in 1N NaOH, heated in a gravity displacement autoclave at 121°C for 30 min, cleaned and rinsed in water, then sterilized by routine methods. WHO has outlined several alternate decontamination methods. Using disposable cover sheets is one simple solution to avoid contaminating work surfaces and associated lengthy decontamination procedures.
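
For illustration, the decontamination options above can be captured in a small triage helper. This is a sketch only, under the simplified categories named in the flags (the function and its arguments are hypothetical, not a validated SOP); WHO guidance and institutional policy govern actual practice.

```python
# Illustrative triage of the prion decontamination options summarized above.
# Hypothetical sketch for this article, not a validated laboratory SOP.

def decontamination_method(disposable: bool, heat_resistant: bool,
                           tolerates_naoh_or_naclo: bool) -> str:
    """Return the preferred decontamination route for a contaminated item."""
    if disposable:
        # Disposal is the only option with no risk of residual infectivity.
        return "discard as infectious waste (only zero-risk option)"
    if heat_resistant:
        # Non-disposable, heat-resistant items:
        return ("immerse in 1N NaOH, autoclave (gravity displacement) at "
                "121 C for 30 min, clean, rinse in water, sterilize routinely")
    if tolerates_naoh_or_naclo:
        # Surfaces and heat-sensitive instruments, e.g. a biosafety cabinet:
        return ("flood with 2N NaOH or undiluted NaClO, let stand 1 hour, "
                "mop up, rinse with water")
    # Surfaces that tolerate neither chemical:
    return "thorough cleaning to remove most infectivity by dilution"

print(decontamination_method(disposable=False, heat_resistant=False,
                             tolerates_naoh_or_naclo=True))
```

A disposable item short-circuits straight to disposal, mirroring the point above that discarding materials is the only way to guarantee no residual infectivity.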

With standard PPE—augmented by a few additional safety measures and prion-specific decontamination procedures—laboratories can safely manage biofluid testing in cases of prion disease.

 

The Microscopic World Inside Us  

Emerging Research Points to Microbiome’s Role in Health and Disease

Thousands of species of microbes—bacteria, viruses, fungi, and protozoa—inhabit every internal and external surface of the human body. Collectively, these microbes, known as the microbiome, outnumber the body’s human cells by about 10 to 1 and include more than 1,000 species of microorganisms and several million genes residing in the skin, respiratory system, urogenital, and gastrointestinal tracts. The microbiome’s complicated relationship with its human host is increasingly considered so crucial to health that researchers sometimes call it “the forgotten organ.”

Disturbances to the microbiome can arise from nutritional deficiencies, antibiotic use, and antiseptic modern life. Imbalances in the microbiome’s diverse microbial communities, which interact constantly with cells in the human body, may contribute to chronic health conditions, including diabetes, asthma and allergies, obesity and the metabolic syndrome, digestive disorders including irritable bowel syndrome (IBS), and autoimmune disorders like multiple sclerosis and rheumatoid arthritis, research shows.

While study of the microbiome is a growing research enterprise that has attracted enthusiastic media attention and venture capital, its findings are largely preliminary. But some laboratorians are already developing a greater appreciation for the microbiome’s contributions to human biochemistry and are considering a future in which they expect to measure changes in the microbiome to monitor disease and inform clinical practice.

Pivot Toward the Microbiome

Following the National Institutes of Health (NIH) Human Genome Project, many scientists noted the considerable genetic signal from microbes in the body and the existence of technology to analyze these microorganisms. That realization led NIH to establish the Human Microbiome Project in 2007, said Lita Proctor, PhD, its program director. In the project’s first phase, researchers studied healthy adults to produce a reference set of microbiomes and a resource of metagenomic sequences of bacteria in the airways, skin, oral cavities, and the gastrointestinal and vaginal tracts, plus a catalog of microbial genome sequences of reference strains. Researchers also evaluated specific diseases associated with disturbances in the microbiome, including gastrointestinal diseases such as Crohn’s disease, ulcerative colitis, IBS, and obesity, as well as urogenital conditions, those that involve the reproductive system, and skin diseases like eczema, psoriasis, and acne.

Phase 1 studies determined the composition of many parts of the microbiome, but did not define how that composition affects health or specific disease. The project’s second phase aims to “answer the question of what microbes actually do,” explained Proctor. Researchers are now examining properties of the microbiome including gene expression, protein, and human and microbial metabolite profiles in studies of pregnant women at risk for preterm birth, the gut hormones of patients at risk for IBS, and nasal microbiomes of patients at risk for type 2 diabetes.

Promising Lines of Research

Cystic fibrosis and microbiology investigator Michael Surette, PhD, sees promising microbiome research not just in terms of evidence of its effects on specific diseases, but also in what drives changes in the microbiome. Surette is Canada Research Chair in interdisciplinary microbiome research in the Farncombe Family Digestive Health Research Institute at McMaster University in Hamilton, Ontario.

One type of study on factors driving microbiome change examines how alterations in composition and imbalances in individual patients relate to improving or worsening disease. “IBS, cystic fibrosis, and chronic obstructive pulmonary disease all have periods of instability or exacerbation,” he noted. Surette hopes that one day, tests will provide clinicians the ability to monitor changes in microbial composition over time and even predict when a patient’s condition is about to deteriorate. Monitoring perturbations to the gut microbiome might also help minimize collateral damage to the microbiome during aggressive antibiotic therapy for hospitalized patients, he added.

Monitoring changes to the microbiome also might be helpful for “culture negative” patients, who now may receive multiple, unsuccessful courses of different antibiotics that drive antibiotic resistance. Frustration with standard clinical biology diagnosis of lung infections in cystic fibrosis patients first sparked Surette’s investigations into the microbiome. He hopes that future tests involving the microbiome might also help asthma patients with neutrophilia, community-acquired pneumonia patients who harbor complex microbial lung communities lacking obvious pathogens, and hospitalized patients with pneumonia or sepsis. He envisions microbiome testing that would look for short-term changes indicating whether or not a drug is effective.

Companion Diagnostics

Daniel Peterson, MD, PhD, an assistant professor of pathology at Johns Hopkins University School of Medicine in Baltimore, believes the future of clinical testing involving the microbiome lies in companion diagnostics for novel treatments, and points to companies that are already developing and marketing tests that will require such assays.

Examples of microbiome-focused enterprises abound, including Genetic Analysis, based in Oslo, Norway, with its high-throughput test that uses 54 probes targeted to specific bacteria to measure intestinal gut flora imbalances in inflammatory bowel disease and irritable bowel syndrome patients. Paris, France-based Enterome is developing both novel drugs and companion diagnostics for microbiome-related diseases such as IBS and some metabolic diseases. Second Genome, based in South San Francisco, has developed an experimental drug, SGM-1019, that the company says blocks damaging activity of the microbiome in the intestine. Cambridge, Massachusetts-based Seres Therapeutics has received Food and Drug Administration orphan drug designation for SER-109, an oral therapeutic intended to correct microbial imbalances to prevent recurrent Clostridium difficile infection in adults.

One promising clinical use of the microbiome is fecal transplantation, which both prospective and retrospective studies have shown to be effective in patients with C. difficile infections who do not respond to front-line therapies, said James Versalovic, MD, PhD, director of Texas Children’s Hospital Microbiome Center and professor of pathology at Baylor College of Medicine in Houston. “Fecal transplants and other microbiome replacement strategies can radically change the composition of the microbiome in hours to days,” he explained.

But NIH’s Proctor discourages too much enthusiasm about fecal transplant. “Natural products like stool can have [side] effects,” she pointed out. “The [microbiome research] field needs to mature and we need to verify outcomes before anything becomes routine.”

Hurdles for Lab Testing

While he is hopeful that labs someday will use the microbiome to produce clinically useful information, Surette pointed to several problems that must be solved beforehand. First, molecular methods commonly used right now should be more quantitative and accurate. Additionally, research on the microbiome encompasses a wide variety of protocols, some of which are better at extracting particular types of bacteria and therefore can give biased views of communities living in the body. Also, tests may need to distinguish between dead and live microbes. Another hurdle is that labs using varied bioinformatic methods may produce different results from the same sample, a problem that Surette sees as ripe for a solution from clinical laboratorians, who have expertise in standardizing robust protocols and in automating tests.

One way laboratorians can prepare for future, routine microbiome testing is to expand their notion of clinical chemistry to include both microbial and human biochemistry. “The line between microbiome science and clinical science is blurring,” said Versalovic. “When developing future assays to detect biochemical changes in disease states, we must consider the contributions of microbial metabolites and proteins and how to tailor tests to detect them.” In the future, clinical labs may test for uniquely microbial metabolites in various disease states, he predicted.

 

Automated Review of Mass Spectrometry Results  

Can We Achieve Autoverification?

Author: Katherine Alexander and Andrea R. Terrell, PhD  // Date: NOV.1.2015  // Source:Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/november/automated-review-of-mass-spectrometry-results-can-we-achieve-autoverification

 

Paralleling the upswing in prescription drug misuse, clinical laboratories are receiving more requests for mass spectrometry (MS) testing as physicians rely on its specificity to monitor patient compliance with prescription regimens. However, as volume has increased, reimbursement has declined, forcing toxicology laboratories both to increase capacity and lower their operational costs without sacrificing quality or turnaround time. Now, new solutions are available that enable laboratories to bring automation to MS testing, helping them meet the growing demand for toxicology and other testing.

What is the typical MS workflow?

A typical workflow includes a long list of manual steps. By the time a sample is loaded onto the mass spectrometer, it has been collected, logged into the lab information management system (LIMS), and prepared for analysis using a variety of wet chemistry techniques.

Most commercial clinical laboratories receive enough samples for MS analysis to batch analyze those samples. A batch consists of calibrators, quality control (QC) samples, and patient/donor samples. Historically, the method would be selected (e.g., “analysis of opiates”), sample identification information would be entered manually into the MS software, and the instrument would begin analyzing each sample. Upon successful completion of the batch, the MS operator would view all of the analytical data, ensure the QC results were acceptable, and review each patient/donor specimen, looking at characteristics such as peak shape, ion ratios, retention time, and calculated concentration.

The operator would then post acceptable results into the LIMS manually or through an interface, and unacceptable results would be rescheduled or dealt with according to lab-specific protocols. In our laboratory we perform a final certification step for quality assurance by reviewing all information about the batch again, prior to releasing results for final reporting through the LIMS.
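
The manual flow just described amounts to a QC gate followed by partitioning patient results into "post" and "reschedule" piles. A minimal sketch: all names here (`Sample`, `review_batch`, the acceptability flags) are hypothetical stand-ins, not any vendor, LIMS, or MS software API.

```python
# Minimal sketch of the manual batch-review flow described in the text.
# Class and field names are illustrative only.
from dataclasses import dataclass

@dataclass
class Sample:
    sample_id: str
    kind: str              # "calibrator", "qc", or "patient"
    concentration: float
    ion_ratio_ok: bool
    retention_time_ok: bool
    peak_shape_ok: bool

def review_batch(batch):
    """Partition a batch into results to post and samples to reschedule."""
    # QC must be acceptable before any patient result is released.
    if not all(s.ion_ratio_ok and s.peak_shape_ok
               for s in batch if s.kind == "qc"):
        raise RuntimeError("QC failure: rerun the batch")
    post, reschedule = [], []
    for s in batch:
        if s.kind != "patient":
            continue
        if s.ion_ratio_ok and s.retention_time_ok and s.peak_shape_ok:
            post.append(s)        # acceptable: post to LIMS
        else:
            reschedule.append(s)  # handle per lab-specific protocol
    return post, reschedule
```

The point of automation, discussed below, is that software performs the acceptable/unacceptable partition so operators only touch the `reschedule` pile.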

What problems are associated with this workflow?

The workflow described above results in too many highly trained chemists performing manual data entry and reviewing perfectly acceptable analytical results. Lab managers would prefer that MS operators and certifying scientists focus on troubleshooting problem samples rather than reviewing mounds of good data. Not only is the current process inefficient, it is mundane work prone to user errors. This risks fatigue, disengagement, and complacency by our highly skilled scientists.

Importantly, manual processes also take time. In most clinical lab environments, turnaround time is critical for patient care and industry competitiveness. Lab directors and managers are looking for solutions to automate mundane, error-prone tasks to save time and costs, reduce staff burnout, and maintain high levels of quality.

How can software automate data transfer from MS systems to LIMS?

Automation is not a new concept in the clinical lab. Labs have automated processes in shipping and receiving, sample preparation, liquid handling, and data delivery to the end user. As more labs implement MS, companies have begun to develop special software to automate data analysis and review workflows.

In July 2011, AIT Labs incorporated ASCENT into our workflow, eliminating the initial manual peak review step. ASCENT is an algorithm-based peak picking and data review system designed specifically for chromatographic data. The software employs robust statistical and modeling approaches to the raw instrument data to present the true signal, which often can be obscured by noise or matrix components.

The system also uses an exponentially modified Gaussian (EMG) equation to apply a best-fit model to integrated peaks through what is often a noisy signal. In our experience, applying the EMG yields cleaner data from what might appear to be poor chromatography and ultimately allows us to reduce the number of samples we might otherwise rerun.
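
For reference, the EMG is a Gaussian peak convolved with an exponential decay, and it has a closed form. A minimal standard-library sketch (the parameter values are arbitrary; real peak-picking software fits the parameters to the raw signal rather than fixing them):

```python
import math

def emg(x: float, mu: float, sigma: float, lam: float) -> float:
    """EMG density: Gaussian(mu, sigma) convolved with Exponential(rate lam)."""
    arg = (lam / 2.0) * (2.0 * mu + lam * sigma ** 2 - 2.0 * x)
    z = (mu + lam * sigma ** 2 - x) / (math.sqrt(2.0) * sigma)
    return (lam / 2.0) * math.exp(arg) * math.erfc(z)

# The exponential term skews the peak: its mode sits to the right of the
# Gaussian center mu, reproducing characteristic chromatographic tailing.
ys = [emg(t / 10.0, mu=2.0, sigma=0.3, lam=1.5) for t in range(100)]
```

This right-tailed shape is why an EMG often models chromatographic peaks better than a pure Gaussian.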

How do you validate the quality of results?

We’ve developed a robust validation protocol to ensure that results are, at minimum, equivalent to results from our manual review. We begin by building the assay in ASCENT, entering assay-specific information from our internal standard operating procedure (SOP). Once the assay is configured, validation proceeds with parallel batch processing to compare results between software-reviewed data and staff-reviewed data. For new implementations we run eight to nine batches of 30–40 samples each; when we are modifying or upgrading an existing implementation we run a smaller number of batches. The parallel batches should contain multiple positive and negative results for all analytes in the method, preferably spanning the analytical measurement range of the assay.

The next step is to compare the results and calculate the percent difference between the data review methods. We require that two-thirds of the automated results fall within 20% of the manually reviewed result. In addition to validating patient sample correlation, we also test numerous quality assurance rules that should initiate a flag for further review.
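
The acceptance rule in this paragraph can be stated precisely in a few lines. A minimal sketch, assuming paired manual and automated results for a single analyte (the function and variable names are illustrative, not from ASCENT or any LIMS):

```python
# Sketch of the acceptance rule described above: at least two-thirds of the
# automated results must fall within 20% of the manually reviewed result.

def passes_parallel_validation(manual, automated,
                               tolerance=0.20, required_fraction=2 / 3):
    """Compare paired manual vs. automated results for one analyte."""
    assert manual and len(manual) == len(automated)
    within = sum(
        1 for m, a in zip(manual, automated)
        if m != 0 and abs(a - m) / abs(m) <= tolerance
    )
    return within / len(manual) >= required_fraction

manual    = [100.0, 50.0, 250.0, 8.0, 400.0, 75.0]
automated = [ 95.0, 58.0, 240.0, 11.0, 410.0, 77.0]
print(passes_parallel_validation(manual, automated))  # 5 of 6 within 20%: True
```

In this (made-up) example only the 8.0 vs. 11.0 pair exceeds the 20% tolerance, so five of six pairs agree and the comparison passes.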

What are the biggest challenges during implementation and continual improvement initiatives?

On the technological side, our largest hurdle was loading the sequence files into ASCENT. We had created an in-house mechanism for our chemists to upload the 96-well plate map for their batch into the MS software. We had some difficulty transferring this information to ASCENT, but once we resolved this issue, the technical workflow proceeded fairly smoothly.

The greater challenge was changing our employees’ mindset from one of fear that automation would displace them, to a realization that learning this new technology would actually make them more valuable. Automating a non-mechanical process can be a difficult concept for hands-on scientists, so managers must be patient and help their employees understand that this kind of technology leverages the best attributes of software and people to create a powerful partnership.

We recommend that labs considering automated data analysis engage staff in the validation and implementation to spread the workload and the knowledge. As is true with most technology, it is best not to rely on just one or two super users. We also found it critical to add supervisor-level controls on data file manipulation, such as removing a sample that wasn’t run from the sequence table; this prevents inadvertent deletion of a file, which would otherwise require reinjection of the entire batch!

 

Understanding Fibroblast Growth Factor 23

Author: Damien Gruson, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/understanding-fibroblast-growth-factor-23

What is the relationship of FGF-23 to heart failure?

Heart failure (HF) is an increasingly common syndrome associated with high morbidity, elevated hospital readmission rates, and high mortality. Improving diagnosis, prognosis, and treatment of HF requires a better understanding of its different sub-phenotypes. As researchers gained a comprehensive understanding of neurohormonal activation—one of the hallmarks of HF—they discovered several biomarkers, including natriuretic peptides, which now are playing an important role in sub-phenotyping HF and in driving more personalized management of this chronic condition.

Like the natriuretic peptides, fibroblast growth factor 23 (FGF-23) could become important in risk-stratifying and managing HF patients. Produced by osteocytes, FGF-23 is a key regulator of phosphorus homeostasis. It binds to renal and parathyroid FGF receptor-Klotho complexes, resulting in phosphate excretion, decreased 1-α-hydroxylation of 25-hydroxyvitamin D, and decreased parathyroid hormone (PTH) secretion. The relationship to PTH is important because impaired homeostasis of cations and decreased glomerular filtration rate might contribute to the rise of FGF-23. The amino-terminal portion of FGF-23 (amino acids 1-24) serves as a signal peptide allowing secretion into the blood, and the carboxyl-terminal portion (aa 180-251) participates in its biological action.

How might FGF-23 improve HF risk assessment?

Studies have shown that FGF-23 is related to the risk of cardiovascular diseases and mortality. It was first demonstrated that FGF-23 levels were independently associated with left ventricular mass index and hypertrophy as well as mortality in patients with chronic kidney disease (CKD). FGF-23 also has been associated with left ventricular dysfunction and atrial fibrillation in coronary artery disease subjects, even in the absence of impaired renal function.

FGF-23 and FGF receptors are both expressed in the myocardium, so it is possible that FGF-23 acts directly on the heart and participates in the pathophysiology of cardiovascular diseases and HF. Experiments have shown that in cultured rat cardiomyocytes, FGF-23 stimulates pathological hypertrophy by activating the calcineurin-NFAT pathway, and that in wild-type mice, intra-myocardial or intravenous injection of FGF-23 results in left ventricular hypertrophy. As such, FGF-23 appears to be a potential stimulus of myocardial hypertrophy, and increased levels may contribute to the worsening of heart failure and long-term cardiovascular death.

Researchers have documented that HF patients have elevated FGF-23 circulating levels. They have also found a significant correlation between plasma levels of FGF-23 and B-type natriuretic peptide, a biomarker related to ventricular stretch and cardiac hypertrophy, in patients with left ventricular hypertrophy. As such, measuring FGF-23 levels might be a useful tool to predict long-term adverse cardiovascular events in HF patients.

Interestingly, researchers have documented a significant relationship between FGF-23 and PTH in both CKD and HF patients. As PTH stimulates FGF-23 expression, it could be that in HF patients, increased PTH levels increase the bone expression of FGF-23, which enhances its effects on the heart.

 

The Past, Present, and Future of Western Blotting in the Clinical Laboratory

Author: Curtis Balmer, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/the-past-present-and-future-of-western-blotting-in-the-clinical-laboratory

Much of the discussion about Western blotting centers on its performance as a biological research tool. This isn’t surprising. Since its introduction in the late 1970s, the Western blot has been adopted by biology labs of virtually every stripe and has become one of the most widely used techniques in the research armamentarium. However, Western blotting has also been employed in clinical laboratories to aid in the diagnosis of various diseases and disorders, an equally important and valuable application. Yet there has been relatively little discussion of its use in this context, or of how advances in Western blotting might affect its future clinical use.

Highlighting the clinical value of Western blotting, Stanley Naides, MD, medical director of Immunology at Quest Diagnostics, observed that, “Western blotting has been a very powerful tool in the laboratory and for clinical diagnosis. It’s one of many various methods that the laboratorian brings to aid the clinician in the diagnosis of disease, and the selection and monitoring of therapy.” Indeed, Western blotting has been used at one time or another to aid in the diagnosis of infectious diseases including hepatitis C (HCV), HIV, Lyme disease, and syphilis, as well as autoimmune disorders such as paraneoplastic disease and myositis conditions.

However, Naides was quick to point out that the choice of assays to use clinically is based on their demonstrated sensitivity and performance, and that the search for something better is never-ending. “We’re constantly looking for methods that improve detection of our target [protein],” Naides said. “There have been a number of instances where we’ve moved away from Western blotting because another method proves to be more sensitive.” But this search can also lead back to Western blotting. “We’ve gone away from other methods because there’s been a Western blot that’s been developed that’s more sensitive and specific. There’s that constant movement between methods as new tests are developed.”

In recent years, this quest has been leading clinical laboratories away from Western blotting toward more sensitive and specific diagnostic assays, at least for some diseases. Using confirmatory diagnosis of HCV infection as an example, Sai Patibandla, PhD, director of the immunoassay group at Siemens Healthcare Diagnostics, explained that movement away from Western blotting for confirmatory diagnosis of HCV infection began with a technical modification called Recombinant Immunoblotting Assay (RIBA). RIBA streamlines the conventional Western blot protocol by spotting recombinant antigen onto strips which are used to screen patient samples for antibodies against HCV. This approach eliminates the need to separate proteins and transfer them onto a membrane.

The RIBA HCV assay was initially manufactured by Chiron Corporation (acquired by Novartis Vaccines and Diagnostics in 2006). It received Food and Drug Administration (FDA) approval in 1999 and was marketed as the Chiron RIBA HCV 3.0 Strip Immunoblot Assay. Patibandla explained that, at the time, the Chiron assay “…was the only FDA-approved confirmatory testing for HCV.” In 2013 the assay was discontinued and withdrawn from the market due to reports that it was producing false-positive results.

Since then, clinical laboratories have continued to move away from Western blot-based assays for confirmation of HCV in favor of the more sensitive technique of nucleic acid testing (NAT). “The migration is toward NAT for confirmation of HCV [diagnosis]. We don’t use immunoblots anymore. We don’t even have a blot now to confirm HCV,” Patibandla said.

Confirming HIV infection has followed a similar path. Indeed, in 2014 the Centers for Disease Control and Prevention issued updated recommendations for HIV testing that, in part, replaced Western blotting with NAT. This change was in response to the recognition that the HIV-1 Western blot assay was producing false-negative or indeterminate results early in the course of HIV infection.

At this juncture it is difficult to predict if this trend away from Western blotting in clinical laboratories will continue. One thing that is certain, however, is that clinicians and laboratorians are infinitely pragmatic, and will eagerly replace current techniques with ones shown to be more sensitive, specific, and effective. This raises the question of whether any of the many efforts currently underway to improve Western blotting will produce an assay that exceeds the sensitivity of currently employed techniques such as NAT.

Some of the most exciting and groundbreaking work in this area is being done by Amy Herr, PhD, a professor of bioengineering at University of California, Berkeley. Herr’s group has taken on some of the most challenging limitations of Western blotting, and is developing techniques that could revolutionize the assay. For example, the Western blot is semi-quantitative at best. This weakness dramatically limits the types of answers it can provide about changes in protein concentrations under various conditions.

To make Western blotting more quantitative, Herr’s group is, among other things, identifying losses of protein sample mass during the assay protocol. About this, Herr explains that the conventional Western blot is an “open system” that involves lots of handling of assay materials, buffers, and reagents that makes it difficult to account for protein losses. Or, as Kevin Lowitz, a senior product manager at Thermo Fisher Scientific, described it, “Western blot is a [simple] technique, but a really laborious one, and there are just so many steps and so many opportunities to mess it up.”

Herr’s approach is to reduce the open aspects of Western blot. “We’ve been developing these more closed systems that allow us at each stage of the assay to account for [protein mass] losses. We can’t do this exactly for every target of interest, but it gives us a really good handle [on protein mass losses],” she said. One of the major mechanisms Herr’s lab is using to accomplish this is to secure proteins to the blot matrix with covalent bonding rather than with the much weaker hydrophobic interactions that typically keep the proteins in place on the membrane.

Herr’s group also has been developing microfluidic platforms that allow Western blotting to be done on single cells: “In our system we’re doing thousands of independent Westerns on single cells in four hours. And, hopefully, we’ll cut that down to one hour over the next couple years.”

Other exciting modifications that stand to dramatically increase the sensitivity, quantitation, and throughput of Western blotting also are being developed and explored. For example, the use of capillary electrophoresis (in which proteins are conveyed through a small electrolyte-filled tube and separated according to size and charge before being deposited onto a blotting membrane) dramatically reduces the amount of protein required for Western blot analysis, and thereby allows Westerns to be run on proteins from rare cells or from samples of extremely limited quantity.

Jillian Silva, PhD, an associate specialist at the University of California, San Francisco Helen Diller Family Comprehensive Cancer Center, explained that advances in detection are also extending the capabilities of Western blotting. “With the advent of fluorescence detection we have a way to quantitate Westerns, and it is now more quantitative than it’s ever been,” said Silva.

Whether or not these advances produce an assay that is adopted by clinical laboratories remains to be seen. The emphasis on Western blotting as a research rather than a clinical tool may bias advances in favor of the needs and priorities of researchers rather than clinicians, and as Patibandla pointed out, “In the research world Western blotting has a certain purpose. [Researchers] are always coming up with new things, and are trying to nail down new proteins, so you cannot take Western blotting away.” In contrast, she suggested that for now, clinical uses of Western blotting remain “limited.”

 

Adapting Next Generation Technologies to Clinical Molecular Oncology Service

Author: Ronald Carter, PhD, DVM  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/adapting-next-generation-technologies-to-clinical-molecular-oncology-service

Next generation technologies (NGT) deliver huge improvements in cost efficiency, accuracy, robustness, and in the amount of information they provide. Microarrays, high-throughput sequencing platforms, digital droplet PCR, and other technologies all offer unique combinations of desirable performance.

As stronger evidence of genetic testing’s clinical utility influences patterns of patient care, demand for NGT testing is increasing. This presents several challenges to clinical laboratories, including increased urgency, clinical importance, and breadth of application in molecular oncology, as well as more integration of genetic tests into synoptic reporting. Laboratories need to add NGT-based protocols while still providing old tests, and the pace of change is increasing. What follows is one viewpoint on the major challenges in adopting NGTs into diagnostic molecular oncology service.

Choosing a Platform

Instrument selection is a critical decision that has to align with intended test applications, sequencing chemistries, and analytical software. Although multiple platforms are available, a mainstream standard has not emerged. Depending on their goals, laboratories might set up NGTs for improved accuracy of mutation detection, massively higher sequencing capacity per test, massively more targets combined in one test (multiplexing), greater range in sequencing read length, much lower cost per base pair assessed, and economy of specimen volume.

When high-throughput instruments first made their appearance, laboratories paid more attention to the accuracy of base-reading: Less accurate sequencing meant more data cleaning and resequencing (1). Now, new instrument designs have narrowed the differences, and test chemistry can have a comparatively large impact on analytical accuracy (Figure 1). The robustness of technical performance can also vary significantly depending upon specimen type. For example, Life Technologies’ sequencing platforms appear to be comparatively more tolerant of low DNA quality and concentration, which is an important consideration for fixed and processed tissues.

https://www.aacc.org/~/media/images/cln/articles/2015/october/carter_fig1_cln_oct15_ed.jpg

Figure 1 Comparison of Sequencing Chemistries

Sequence pile-ups of the same target sequence (2 large genes), all performed on the same analytical instrument. Results from 4 different chemistries, as designed and supplied by reagent manufacturers prior to optimization in the laboratory. Red lines represent limits of exons. Height of blue columns proportional to depth of coverage. In this case, the intent of the test design was to provide high depth of coverage so that reflex Sanger sequencing would not be necessary. Courtesy B. Sadikovic, U. of Western Ontario.

 

In addition, batching, robotics, workload volume patterns, maintenance contracts, software licenses, and platform lifetime affect the cost per analyte and per specimen considerably. Royalties and reagent contracts also factor into the cost of operating NGT: In some applications, fees for intellectual property can represent more than 50% of the bench cost of performing a given test, and can increase substantially without warning.

Laboratories must also deal with the problem of obsolescence. Investing in a new platform brings the angst of knowing that better machines and chemistries are just around the corner. Laboratories are buying bigger pieces of equipment with shorter service lives. Before NGTs, major instruments could confidently be expected to remain current for at least 6 to 8 years. Now, a major instrument is obsolete much sooner, often within 2 to 3 years. This means that keeping it in service might cost more than investing in a new platform. Lease-purchase arrangements help mitigate year-to-year fluctuations in capital equipment costs, and maximize the value of old equipment at resale.

One Size Still Does Not Fit All

Laboratories face numerous technical considerations to optimize sequencing protocols, but the test has to be matched to the performance criteria needed for the clinical indication (2). For example, measuring response to treatment depends first upon the diagnostic recognition of mutation(s) in the tumor clone; the marker(s) then have to be quantifiable and indicative of tumor volume throughout the course of disease (Table 1).

As a result, diagnostic tests need to cover many different potential mutations, yet accurately identify any clinically relevant mutations actually present. On the other hand, tests for residual disease need to provide standardized, sensitive, and accurate quantification of a selected marker mutation against the normal background. A diagnostic panel might need 1% to 3% sensitivity across many different mutations. But quantifying early response to induction, and later assessing minimal residual disease, needs a test that is reliably accurate to the 10^-4 or 10^-5 range for a specific analyte.
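To see what sensitivity in the 10^-4 to 10^-5 range implies for sequencing depth, here is a back-of-the-envelope binomial calculation. The 3-read detection threshold and 95% confidence target are illustrative assumptions, and real assays must also contend with sequencing error, which this sketch ignores:

```python
import math

def detection_probability(depth: int, vaf: float, min_reads: int = 3) -> float:
    """Probability of observing at least `min_reads` variant-supporting
    reads at a given depth and variant allele fraction (VAF), assuming
    ideal error-free binomial sampling of reads."""
    p_fewer = sum(
        math.comb(depth, k) * vaf**k * (1.0 - vaf) ** (depth - k)
        for k in range(min_reads)
    )
    return 1.0 - p_fewer

# Smallest depth (searched in 50,000-read steps) at which a 1-in-100,000
# clone is seen in >= 3 reads with 95% probability.
depth = 100_000
while detection_probability(depth, 1e-5) < 0.95:
    depth += 50_000
# depth ends up in the mid-six-figures, far beyond typical panel coverage
```

This is why the 10^-4 to 10^-5 requirement pushes residual-disease assays toward massive resequencing or enrichment techniques rather than standard diagnostic panel depths.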

Covering all types of mutations in one diagnostic test is not yet possible. For example, subtyping of acute myeloid leukemia is both old school (karyotype, fluorescent in situ hybridization, and/or PCR-based or array-based testing for fusion rearrangements, deletions, and segmental gains) and new school (NGT-based panel testing for molecular mutations).

Chemistries that cover both structural variants and copy number variants are not yet in general use, but the advantages of NGTs compared to traditional methods are becoming clearer, such as in colorectal cancer (3). Researchers are also using cell-free DNA (cfDNA) to quantify residual disease and detect resistance mutations (4). Once a clinically significant clone is identified, enrichment techniques help enable extremely sensitive quantification of residual disease (5).

Validation and Quality Assurance

Beyond choosing a platform, two distinct challenges arise in bringing NGTs into the lab. The first is assembling the resources for validation and quality assurance. The second is keeping tests up-to-date as new analytes are needed. Even if a given test chemistry has the flexibility to add analytes without revalidating the entire panel, keeping up with clinical advances is a constant priority.

Due to their throughput and multiplexing capacities, NGT platforms typically require considerable upfront investment to adopt, and training staff to perform testing takes even more time. Proper validation is harder to document: Assembling positive controls, documenting test performance criteria, developing quality assurance protocols, and conducting proficiency testing are all demanding. Labs meet these challenges in different ways. Laboratory-developed tests (LDTs) allow self-determined choice in design, innovation, and control of the test protocol, but can be very expensive to set up.

Food and Drug Administration (FDA)-approved methods are attractive but not always an option. More FDA-approved methods will be marketed, but FDA approval itself brings other trade-offs. There is a cost premium compared to LDTs, and the test methodologies are locked down and not modifiable. This is particularly frustrating for NGTs, which have the specific attraction of extensive multiplexing capacity and accommodating new analytes.

IT and the Evolution of Molecular Oncology Reporting Standards

The options for information technology (IT) pipelines for NGTs are improving rapidly. At the same time, recent studies still show significant inconsistencies and lack of reproducibility when it comes to interpreting variants in array comparative genomic hybridization, panel testing, tumor expression profiling, and tumor genome sequencing. It can be difficult to duplicate published performances in clinical studies because of a lack of sufficient information about the protocol (chemistry) and software. Building bioinformatics capacity is a key requirement, yet skilled people are in short supply and the qualifications needed to work as a bioinformatician in a clinical service are not yet clearly defined.

Tumor biology brings another level of complexity. Bioinformatic analysis must distinguish tumor-specific variants from germline variants. Sequencing of paired normal tissue is often performed as a control, but virtual normal controls may have intriguing advantages (6). One of the biggest challenges is to reproducibly interpret the clinical significance of interactions between different mutations, even with commonly known, well-defined mutations (7). For multiple analyte panels, such as predictive testing for breast cancer, only the performance of the whole panel in a population of patients can be compared; individual patients may be scored into different risk categories by different tests, all for the same test indication.

In large scale sequencing of tumor genomes, which types of mutations are most informative in detecting, quantifying, and predicting the behavior of the tumor over time? The amount and complexity of mutation varies considerably across different tumor types, and while some mutations are more common, stable, and clinically informative than others, the utility of a given tumor marker varies in different clinical situations. And, for a given tumor, treatment effect and metastasis leads to retesting for changes in drug sensitivities.

These complexities mean that IT must be designed into the process from the beginning. Like robotics, IT represents a major ancillary decision. One approach many labs choose is licensed technologies with shared databases that are updated in real time. These are attractive, despite their cost and licensing fees. New tests that incorporate proprietary IT with NGT platforms link the genetic signatures of tumors to clinically significant considerations like tumor classification, recommended methodologies for monitoring response, predicted drug sensitivities, eligible clinical trials, and prognostic classifications. In-house development of such solutions will be difficult, so licensing platforms from commercial partners is more likely to be the norm.

The Commercial Value of Health Records and Test Data

The future of cancer management likely rests on large-scale databases that link hereditary and somatic tumor testing with clinical outcomes. Multiple centers have such large studies underway, and data extraction and analysis is providing increasingly refined interpretations of clinical significance.

Extracting health outcomes to correlate with molecular test results is commercially valuable, as the pharmaceutical, insurance, and healthcare sectors focus on companion diagnostics, precision medicine, and evidence-based health technology assessment. Laboratories that can develop tests based on large-scale integration of test results to clinical utility will have an advantage.

NGTs do offer opportunities for net reductions in the cost of healthcare. But the lag between availability of a test and peer-evaluated demonstration of clinical utility can be considerable. Technical developments arise faster than evidence of clinical utility. For example, immunohistochemistry, estrogen receptor/progesterone receptor status, HER2/neu, and histology are still the major pathological criteria for prognostic evaluation of breast cancer at diagnosis, even though multiple analyte tumor profiling has been described for more than 15 years. Healthcare systems need a more concerted assessment of clinical utility if they are to take advantage of the promises of NGTs in cancer care.

Disruptive Advances

Without a doubt, “disruptive” is an appropriate buzzword in molecular oncology, and new technical advances are about to change how, where, and for whom testing is performed.

• Predictive Testing

Besides cost per analyte, one of the drivers for taking up new technologies is that they enable multiplexing many more analytes with less biopsy material. Single-analyte sequential testing for epidermal growth factor receptor (EGFR), anaplastic lymphoma kinase, and other targets on small biopsies is not sustainable when many more analytes are needed, and even now, a significant proportion of test requests cannot be completed due to lack of suitable biopsy material. Large panels incorporating all the mutations needed to cover multiple tumor types are replacing individual tests in companion diagnostics.

• Cell-Free Tumor DNA

Challenges of cfDNA include standardizing the collection and processing methodologies, timing sampling to minimize the effect of therapeutic toxicity on analytical accuracy, and identifying the most informative sample (DNA, RNA, or protein). But for more and more tumor types, it will be possible to differentiate benign versus malignant lesions, perform molecular subtyping, predict response, monitor treatment, or screen for early detection—all without a surgical biopsy.

cfDNA technologies can also be integrated into core laboratory instrumentation. For example, blood-based EGFR analysis for lung cancer is being developed on the Roche cobas 4800 platform, which will be a significant change from the current standard of testing based upon single tests of DNA extracted from formalin-fixed, paraffin-embedded sections selected by a pathologist (8).

• Whole Genome and Whole Exome Sequencing

Whole genome and whole exome tumor sequencing approaches provide a wealth of biologically important information, and will replace individual or multiple gene test panels as the technical cost of sequencing declines and interpretive accuracy improves (9). Laboratories can apply informatics selectively or broadly to extract much more information at relatively little increase in cost, and the interpretation of individual analytes will be improved by the context of the whole sequence.

• Minimal Residual Disease Testing

Massive resequencing and enrichment techniques can be used to detect minimal residual disease, and will provide an alternative to flow cytometry as costs decline. The challenge is to develop robust analytical platforms that can reliably produce results in a high proportion of patients with a given tumor type, despite using post-treatment specimens with therapy-induced degradation, and a very low proportion of target (tumor) sequence to benign background sequence.

The tumor markers should remain informative for the burden of disease despite clonal evolution across multiple samples taken over the clinical course and treatment. Quantification needs to be accurate and sensitive down to the 10^-5 range, and cost competitive with flow cytometry.

• Point-of-Care Test Methodologies

Small, rapid, cheap, and single use point-of-care (POC) sequencing devices are coming. Some can multiplex with analytical times as short as 20 minutes. Accurate and timely testing will be possible in places like pharmacies, oncology clinics, patient service centers, and outreach programs. Whether physicians will trust and act on POC results alone, or will require confirmation by traditional laboratory-based testing, remains to be seen. However, in the simplest type of application, such as a patient known to have a particular mutation, the advantages of POC-based testing to quantify residual tumor burden are clear.

Conclusion

Molecular oncology is moving rapidly from an esoteric niche of diagnostics to a mainstream, required component of integrated clinical laboratory services. While NGTs are markedly reducing the cost per analyte and per specimen, and will certainly broaden the scope and volume of testing performed, the resources required to choose, install, and validate these new technologies are daunting for smaller labs. More rapid obsolescence and increased regulatory scrutiny for LDTs also present significant challenges. Aligning test capacity with approved clinical indications will require careful and constant attention to ensure competitiveness.

References

1. Liu L, Li Y, Li S, et al. Comparison of next-generation sequencing systems. J Biomed Biotechnol 2012; doi:10.1155/2012/251364.

2. Brownstein CA, Beggs AH, Homer N, et al. An international effort towards developing standards for best practices in analysis, interpretation and reporting of clinical genome sequencing results in the CLARITY Challenge. Genome Biol 2014;15:R53.

3. Haley L, Tseng LH, Zheng G, et al. Performance characteristics of next-generation sequencing in clinical mutation detection of colorectal cancers. Mod Pathol 2015; doi:10.1038/modpathol.2015.86 [Epub ahead of print].

4. Butler TM, Johnson-Camacho K, Peto M, et al. Exome sequencing of cell-free DNA from metastatic cancer patients identifies clinically actionable mutations distinct from primary disease. PLoS One 2015;10:e0136407.

5. Castellanos-Rizaldos E, Milbury CA, Guha M, et al. COLD-PCR enriches low-level variant DNA sequences and increases the sensitivity of genetic testing. Methods Mol Biol 2014;1102:623–39.

6. Hiltemann S, Jenster G, Trapman J, et al. Discriminating somatic and germline mutations in tumor DNA samples without matching normals. Genome Res 2015;25:1382–90.

7. Lammers PE, Lovly CM, Horn L. A patient with metastatic lung adenocarcinoma harboring concurrent EGFR L858R, EGFR germline T790M, and PIK3CA mutations: The challenge of interpreting results of comprehensive mutational testing in lung cancer. J Natl Compr Canc Netw 2015;12:6–11.

8. Weber B, Meldgaard P, Hager H, et al. Detection of EGFR mutations in plasma and biopsies from non-small cell lung cancer patients by allele-specific PCR assays. BMC Cancer 2014;14:294.

9. Vogelstein B, Papadopoulos N, Velculescu VE, et al. Cancer genome landscapes. Science 2013;339:1546–58.

10. Heitzer E, Auer M, Gasch C, et al. Complex tumor genomes inferred from single circulating tumor cells by array-CGH and next-generation sequencing. Cancer Res 2013;73:2965–75.

11. Healy B. BRCA genes — Bookmaking, fortunetelling, and medical care. N Engl J Med 1997;336:1448–9.

 

 

 

Read Full Post »

Heroes in Basic Medical Research – Leroy Hood

Larry H Bernstein, MD, FCAP, Curator

Leaders in Pharmaceutical Intelligence

Series E. 2; 4.5

Leroy Hood, MD, PhD

Dr. Hood created the technological foundation for the sciences of genomics (study of genes) and proteomics (study of proteins) through the invention of five groundbreaking instruments and by explicating the potentialities of genome and proteome research into the future through his pioneering of the fields of systems biology and systems medicine. Hood’s instruments not only pioneered the deciphering of biological information, but also introduced the concept of high throughput data accumulation through automation and parallelization of the protein and DNA chemistries.

The first four instruments were commercialized by Applied Biosystems, Inc., a company founded by Dr. Hood in 1981, and the ink-jet technology was commercialized by Agilent Technologies, thus making these instruments immediately available to the world-community of scientists.

The first two instruments transformed the field of proteomics. The protein sequencer allowed scientists to read and analyze proteins that had not previously been accessible, resulting in the characterization of a series of new proteins whose genes could then be cloned and analyzed. These discoveries led to significant ramifications for biology, medicine, and pharmacology. The second instrument, the protein synthesizer, synthesized proteins and peptides in sufficient quantities to begin characterizing their functions. The DNA synthesizer, the first of three instruments for genomic analyses, was used to synthesize DNA fragments for DNA mapping and gene cloning. The most notable of Hood’s inventions, the automated DNA sequencer developed in 1986, made possible high-speed sequencing of human genomes and was the key technology enabling the Human Genome Project.

In the early 1990s, Hood and his colleagues developed ink-jet DNA synthesis technology for creating DNA arrays with tens of thousands of gene fragments, one of the first of the so-called DNA chips, which enabled measurement of the expression levels of tens of thousands of genes. This instrument has also transformed genomics, biology, and medicine.

In 1992, Hood created the first cross-disciplinary biology department, Molecular Biotechnology, at the University of Washington. In 2000, he left the UW to co-found the Institute for Systems Biology, the first of its kind, and he has pioneered systems medicine in the years since ISB’s founding.

In 2000, Hood and two colleagues founded the Institute for Systems Biology (ISB), a nonprofit research institute integrating biology, technology, computation and medicine to take a systems (holistic) approach to studying the complexity of biology and medicine by analyzing all elements in a biological system rather than studying them one gene or protein at a time (an atomistic approach).

Hood has made many seminal discoveries in the fields of immunology, neurobiology and biotechnology and, most recently, has been a leader in the development of systems biology, its applications to cancer, neurodegenerative disease, and the linkage of systems biology to personalized medicine.

Hood’s efforts in a systems approach to disease have led him to pioneer a new approach to medicine that he coined P4 Medicine in 2003. His view is that P4 medicine will transform the practice of medicine over the next decade, moving it from a largely reactive discipline to a proactive one.

Dr. Hood’s outstanding contributions have had a resounding effect on the advancement of science since the 1960s. Throughout his career, he has adhered to the advice of his mentor, Dr. William J. Dreyer: “If you want to practice biology, do it on the leading edge, and if you want to be on the leading edge, invent new tools for deciphering biological information.”

Hood, co-founder and chairman of the P4 Medicine Institute, is now pioneering new approaches to P4 medicine: predictive, preventive, personalized, and participatory. Most recently, he has embarked on a P4 pilot project on 100,000 well individuals that is transforming healthcare.

In addition to his ground-breaking research, Hood has published 750 papers, received 36 patents and 17 honorary degrees, and earned more than 100 awards and honors. He is one of only 15 individuals elected to all three National Academies—the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. Hood has founded or co-founded 15 different biotechnology companies.

http://www.youtube.com/watch?v=5aE8tgbsl9U Feb 18, 2015 Dr. Leroy Hood, President and Co-founder, Institute for Systems Biology, gives a talk entitled “Systems Medicine and a Longitudinal, …

http://www.youtube.com/watch?v=aYGTLj02sx0 Nov 19, 2014 … of Healthcare? A Personal View of Biological Complexity, Paradigm Changes, Systems Biology and Systems Medicine. Speaker: Leroy Hood.

http://www.youtube.com/watch?v=nT1MvnH6j8Q Sep 26, 2014 Dr. Leroy Hood discusses how P4 (Predictive, Preventive, … EMBC 2014 Theme Keynote Lecture with Dr. Emery Brown – Duration: 58:49.

Read Full Post »

Proteomics

Writer and Curator: Larry H. Bernstein, MD, FCAP

The previous discussion concerned genomics, metabolomics, and cancer. The discussion that follows concerns the expanding field of proteomics, which has implications for disease discovery, pharmaceutical targeting, and diagnostics.

The human proteome – a scientific opportunity for transforming diagnostics, therapeutics, and healthcare

Marc Vidal, Daniel W Chan, Mark Gerstein, Matthias Mann, Gilbert S Omenn, et al.
Clinical Proteomics 2012, 9:6  http://www.clinicalproteomicsjournal.com/content/9/1/6

A National Institutes of Health (NIH) workshop was convened in Bethesda, MD on September 26–27, 2011, with representative scientific leaders in the field of proteomics and its applications to clinical settings. The main purpose of this workshop was to articulate ways in which the biomedical research community can capitalize on recent technology advances and synergize with ongoing efforts to advance the field of human proteomics. This executive summary and the following full report describe the main discussions and outcomes of the workshop.

Proteomics Pioneer Award 2013: Professor Amos Bairoch, University of Geneva, Switzerland

Eupa Open Proteomics 2 (2014) 34  http://dx.doi.org/10.1016/j.euprot.2013.12.002

Amos Bairoch has always been fascinated by computer science, genetics, and biochemistry. His first project, as a PhD student, was the development of PC/Gene, an MS-DOS-based software package for the analysis of protein and nucleotide sequences. While working on this project, he realized that there was no single resource for protein sequences, and started to develop the first annotated protein sequence database, which became Swiss-Prot and was first released in July 1986. In 1988, he created PROSITE, a database of protein families and domains, and a little later ENZYME, an enzyme nomenclature database.

Amos Bairoch led the Swiss-Prot group from its creation in 1988 until 2009. During this period, Swiss-Prot became the primary protein sequence resource in the world and has been a key research instrument for both bioinformaticians and laboratory-based scientists, particularly in the field of proteomics.

Since 2009, Amos Bairoch’s group has been developing neXtProt, a knowledgebase specifically dedicated to human proteins. neXtProt has been chosen as the reference protein database for the HUPO Human Proteome Project.

For his major contributions in the field of proteomic databases, Amos Bairoch received the Friedrich Miescher Award from the Swiss Society of Biochemistry in 1993, the Helmut Horten Foundation Incentive Award in 1995, the Pehr Edman award and the European Latsis Prize in 2004, the Otto Naegeli prize in 2010, and the HUPO Distinguished Achievement Award in Proteomic Sciences in 2011.

National Heart, Lung, and Blood Institute Clinical Proteomics Working Group Report

CB Granger, JE Van Eyk, SC Mockrin and N. Leigh Anderson
Circulation. 2004;109:1697-1703
http://dx.doi.org/10.1161/01.CIR.0000121563.47232.2A

The National Heart, Lung, and Blood Institute (NHLBI) Clinical Proteomics Working Group was charged with identifying opportunities and challenges in clinical proteomics and using these as a basis for recommendations aimed at directly improving patient care. The group included representatives of clinical and translational research, proteomic technologies, laboratory medicine, bioinformatics, and 2 of the NHLBI Proteomics Centers, which form part of a program focused on innovative technology development. This report represents the results from a one-and-a-half-day meeting on May 8 and 9, 2003. For the purposes of this report, clinical proteomics is defined as the systematic, comprehensive, large-scale identification of protein patterns (“fingerprints”) of disease and the application of this knowledge to improve patient care and public health through better assessment of disease susceptibility, prevention of disease, selection of therapy for the individual, and monitoring of treatment response.

The -omics era: Proteomics and lipidomics in vascular research

Athanasios Didangelos, Christin Stegemann, Manuel Mayr
Atherosclerosis 221 (2012) 12– 17
http://dx.doi.org/10.1016/j.atherosclerosis.2011.09.043

The retention of proatherogenic low-density lipoprotein (LDL) particles on the subendothelial extracellular matrix (ECM) is a hallmark of atherosclerosis. Apolipoprotein B (apoB)-containing lipoprotein particles are trapped in the arterial intima by proteoglycans in atherosclerosis-prone areas and eventually become modified, commonly by aggregation and oxidation. The initial accumulation of proatherogenic lipoproteins initiates an inflammatory response, which results in the release of proteolytic enzymes and induces the dedifferentiation of vascular smooth muscle cells (SMCs) resulting in alterations of their matrix producing properties. The precise mechanisms responsible for the accumulation of certain matrix components and subsequent lipoprotein retention on the vessel wall are not fully elucidated. Undoubtedly, ECM remodeling contributes to the formation of atherosclerotic lesions and the lipid composition of apolipoproteins influences their binding properties to the matrix. An unbiased discovery approach, which is not limited to known molecules of presumed importance, will be invaluable for the identification of novel, previously unknown mediators of disease. Although descriptive, the detailed examination of atherosclerotic plaques using advanced proteomics and lipidomics techniques can generate novel insights and form the basis for further mechanistic investigations.

The Revolution in Proteomics Ionization –
CaptiveSpray nanoBooster™
Bruker, LC-MS Source

Bruker CaptiveSpray principle:

Stable and robust nanoflow LC-MS remains a challenge in proteomics analysis. The Bruker CaptiveSpray source is a revolutionary ion source with a patented design that provides operation as easy as simple normal-flow electrospray.

CaptiveSpray delivers nanospray sensitivity, resists plugging, and provides reproducible uninterrupted flow for even the most complex proteomics samples.

CaptiveSpray nanoBooster brings your MS to the next performance level and provides even higher flexibility.

  • Boost nanoflow sensitivity
  • Push up ID rates
  • Enable glycoanalysis
  • Supercharging capability

CaptiveSpray provides a vortex gas that sweeps around the emitter spray tip to desolvate and to focus the Taylor cone into the MS inlet capillary. The vacuum seal to the MS ion guide draws all of the sample ions into the MS increasing the efficiency of sample transfer from the spray tip into the mass spectrometer. The direct connection to the inlet capillary eliminates the need for any source adjustment making the CaptiveSpray source truly Plug-and-Play.

CaptiveSpray Illustration

Structure elucidation

Tissue Proteomics for the Next Decade? Towards a Molecular Dimension in Histology

R Longuespee, M Fleron, C Pottier, F Quesada-Calvo, Marie-Alice Meuwis, et al.
OMICS A Journal of Integrative Biology 2014; 18(9)
http://dx.doi.org/10.1089/omi.2014.0033

Currently, sampling methods, biochemical procedures, and MS instrumentations allow scientists to perform ‘‘in depth’’ analysis of the protein content of any type of tissue of interest. This article reviews the salient issues in proteomics analysis of tissues. We first outline technical and analytical considerations for sampling and biochemical processing of tissues and subsequently the instrumental possibilities for proteomics analysis such as shotgun proteomics in an anatomical context. Specific attention concerns formalin fixed and paraffin embedded (FFPE) tissues that are potential ‘‘gold mines’’ for histopathological investigations. In all, the matrix assisted laser desorption/ionization (MALDI) MS imaging, which allows for differential mapping of hundreds of compounds on a tissue section, is currently the most striking evidence of linkage and transition between ‘‘classical’’ and ‘‘molecular’’ histology. Tissue proteomics represents a veritable field of research and investment activity for modern biomarker discovery and development for the next decade.

A transcriptome-proteome integrated network identifies ERp57 as a hub that mediates bone metastasis

N Santana-Codina, R Carretero, R Sanz-Pamplona1, T Cabrera, et al.
The American Society for Biochemistry and Molecular Biology
MCP  Apr 26, 2013; Manuscript M112.022772
E-mail: asierra@idibell.cat

Bone metastasis is the most common distant relapse in breast cancer. The identification of key proteins involved in the osteotropic phenotype would represent a major step toward the development of new prognostic markers and therapeutic improvements. The aim of this study was to characterize functional phenotypes that favor bone metastasis in human breast cancer.
We used the human breast cancer cell line MDA-MB-231 and its osteotropic BO2 subclone to identify crucial proteins in bone metastatic growth. We identified 31 proteins, 15 underexpressed and 16 overexpressed, in BO2 cells compared to parental cells. We employed a network-modeling approach in which these 31 candidate proteins were prioritized with respect to their potential in metastasis formation, based on the topology of the protein–protein interaction network and differential expression. The protein–protein interaction network provided a framework to study the functional relationships between biological molecules by attributing functions to genes whose functions had not been characterized.
The combination of expression profiles and protein interactions revealed an endoplasmic reticulum-thiol oxidoreductase, ERp57, functioning as a hub which retained 4 downregulated nodes involved in antigen presentation associated with the human major histocompatibility complex class I molecules, including HLA-A, HLA-B, HLA-E and HLA-F. Further analysis of the interaction network revealed an inverse correlation between ERp57 and vimentin, which influences cytoskeleton reorganization. Moreover, knockdown of ERp57 in BO2 cells confirmed its bone organ-specific prometastatic role. Altogether, ERp57 appears as a multifunctional chaperone that can regulate diverse biological processes to maintain the homeostasis of breast cancer cells and promote the development of bone metastasis.
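The topology-based prioritization described above can be sketched as a simple degree ranking over a protein–protein interaction network. The edge list below is invented for illustration (real analyses weight network topology together with differential expression):

```python
# Toy sketch of topology-based candidate prioritization: rank the
# differentially expressed candidates by degree (number of interaction
# partners) in a protein-protein interaction (PPI) network.
from collections import defaultdict

def rank_by_degree(edges, candidates):
    """Return candidates sorted by descending PPI degree."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return sorted(candidates, key=lambda protein: degree[protein], reverse=True)

# ERp57 linked to the HLA class I molecules named in the abstract and to
# vimentin (VIM); the HLA-A/B2M edge is invented to vary the degrees.
edges = [("ERp57", "HLA-A"), ("ERp57", "HLA-B"), ("ERp57", "HLA-E"),
         ("ERp57", "HLA-F"), ("ERp57", "VIM"), ("HLA-A", "B2M")]
print(rank_by_degree(edges, ["HLA-A", "VIM", "ERp57"]))
# -> ['ERp57', 'HLA-A', 'VIM']: the highest-degree node emerges as the hub
```

In this toy network the hub surfaces immediately; the study's actual prioritization combined such topological signals with the 31 candidates' expression changes.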

Tandem-repeat protein domains across the tree of life

Kristin K. Jernigan and Seth R. Bordenstein
PeerJ 2015;3:e732 http://dx.doi.org/10.7717/peerj.732

Tandem-repeat protein domains, composed of repeated units of conserved stretches of 20–40 amino acids, are required for a wide array of biological functions. Despite their diverse and fundamental functions, there has been no comprehensive assessment of their taxonomic distribution, incidence, and associations with organismal lifestyle and phylogeny.
In this study, we assess for the first time the abundance of armadillo (ARM) and tetratricopeptide (TPR) repeat domains across all three domains in the tree of life and compare the results to our previous analysis on ankyrin (ANK) repeat domains in this journal. All eukaryotes and a majority of the bacterial and archaeal genomes analyzed have a minimum of one TPR and ARM repeat. In eukaryotes, the fraction of ARM-containing proteins is approximately double that of TPR and ANK-containing proteins, whereas bacteria and archaea are enriched in TPR-containing proteins relative to ARM- and ANK-containing proteins.
We show in bacteria that phylogenetic history, rather than lifestyle or pathogenicity, is a predictor of TPR repeat domain abundance, while neither phylogenetic history nor lifestyle predicts ARM repeat domain abundance. Surprisingly, pathogenic bacteria were not enriched in TPR-containing proteins, which have been associated with virulence factors in certain species. Taken together, this comparative analysis provides a newly appreciated view of the prevalence and diversity of multiple types of tandem-repeat protein domains across the tree of life.
A central finding of this analysis is that tandem repeat domain-containing proteins are prevalent not just in eukaryotes, but also in bacterial and archaeal species.

Detection of colorectal adenoma and cancer based on transthyretin and C3a-desArg serum levels

Anne-Kristin Fentz, Monika Sporl, Jorg Spangenberg, Heinz Joachim List, et al.
Proteomics Clin. Appl. 2007, 1, 536–544
http://dx.doi.org/10.1002/prca.200600664

Colorectal cancer is the second leading cause of cancer death, and it develops from benign colorectal adenomas in over 95% of patients. Early detection of these cancer precursors by screening tests and their removal can potentially eradicate more than 95% of colorectal cancers before they develop.
To discover sensitive and specific biomarkers for improvement of pre-clinical diagnosis of colorectal adenoma and cancer, we analysed in two independent studies (n = 87 and n = 83 patients) serum samples from colorectal cancer (stage III), colorectal adenoma and control patients using SELDI-TOF-MS. Extensive statistical analysis was performed to establish homogeneous patient groups based on their clinical data.
Two biomarkers that were each able to distinguish control patients from either colorectal adenoma or colorectal cancer patients (p < 0.001) were identified as transthyretin (pre-albumin) and C3a-desArg by MS/MS and were further validated by antibody-based assays (radial immunodiffusion, ELISA). A combination of both proteins clearly indicated the presence of colorectal adenoma or carcinoma. Using cut-offs of >0.225 g/L for transthyretin and >1974 ng/mL for C3a-desArg, we found a sensitivity and specificity for colorectal adenoma of 96% and 70%, respectively.
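As an illustration of how such a two-marker screening rule might be applied, here is a hypothetical `is_positive` helper. The cutoff values are taken from the abstract, but the direction of each comparison (transthyretin, a negative acute-phase protein, falls in disease; C3a-desArg rises) and the OR combination are assumptions, since the abstract does not spell out the rule:

```python
# Hypothetical screening rule combining the two serum markers with the
# cutoffs reported in the abstract. ASSUMPTIONS: transthyretin is low in
# disease, C3a-desArg is high, and the combination is a logical OR.
TRANSTHYRETIN_CUTOFF_G_PER_L = 0.225
C3A_DESARG_CUTOFF_NG_PER_ML = 1974.0

def is_positive(transthyretin_g_per_l, c3a_desarg_ng_per_ml):
    """Flag a sample as screen-positive for colorectal adenoma/carcinoma."""
    return (transthyretin_g_per_l <= TRANSTHYRETIN_CUTOFF_G_PER_L
            or c3a_desarg_ng_per_ml > C3A_DESARG_CUTOFF_NG_PER_ML)

print(is_positive(0.20, 2500.0))  # True: both markers in the disease range
print(is_positive(0.30, 1000.0))  # False: both markers in the normal range
```

Thresholded rules like this trade sensitivity against specificity; the 96%/70% figures above correspond to one particular choice of cutoffs.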

The essential biology of the endoplasmic reticulum stress response for structural and computational biologists

Sadao Wakabayashi, Hiderou Yoshida
CSBJ Mar 2013; 6(7), e201303010   http://dx.doi.org/10.5936/csbj.201303010

The endoplasmic reticulum (ER) stress response is a cytoprotective mechanism that maintains homeostasis of the ER by upregulating the capacity of the ER in accordance with cellular demands. If the ER stress response cannot function correctly, because of reasons such as aging, genetic mutation or environmental stress, unfolded proteins accumulate in the ER and cause ER stress-induced apoptosis, resulting in the onset of folding diseases, including Alzheimer’s disease and diabetes mellitus. Although the mechanism of the ER stress response has been analyzed extensively by biochemists, cell biologists and molecular biologists, many aspects remain to be elucidated. For example, it is unclear how sensor molecules detect ER stress, or how cells choose the two opposite cell fates (survival or apoptosis) during the ER stress response. To resolve these critical issues, structural and computational approaches will be indispensable, although the mechanism of the ER stress response is complicated and difficult to understand holistically at a glance. Here, we provide a concise introduction to the mammalian ER stress response for structural and computational biologists.

Sequence co-evolution gives 3D contacts and structures of protein complexes

Thomas A Hopf, Charlotta P I Schärfe, João P G L M Rodrigues, et al.
eLife 2014;3:e03430 http://dx.doi.org/10.7554/eLife.03430

Protein–protein interactions are fundamental to many biological processes. Experimental screens have identified tens of thousands of interactions, and structural biology has provided detailed functional insight for select 3D protein complexes. An alternative rich source of information about protein interactions is the evolutionary sequence record. Building on earlier work, we show that analysis of correlated evolutionary sequence changes across proteins identifies residues that are close in space with sufficient accuracy to determine the three-dimensional structure of the protein complexes. We evaluate prediction performance in blinded tests on 76 complexes of known 3D structure, predict protein–protein contacts in 32 complexes of unknown structure, and demonstrate how evolutionary couplings can be used to distinguish between interacting and non-interacting protein pairs in a large complex. With the current growth of sequences, we expect that the method can be generalized to genome-wide elucidation of protein–protein interaction networks and used for interaction predictions at residue resolution.
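The core intuition, that alignment columns which change together flag residue pairs likely to be in 3D contact, can be illustrated with a toy mutual-information calculation. Note that real co-evolution methods such as the evolutionary-couplings approach above fit global statistical models rather than raw pairwise MI, and the alignment here is invented:

```python
# Toy mutual-information (MI) scan between alignment columns: strongly
# co-varying columns are candidate contacts. The alignment is constructed
# so that columns 0 and 1 covary perfectly.
import math
from collections import Counter

def mutual_information(msa, i, j):
    """MI (in nats) between columns i and j of a list of aligned strings."""
    n = len(msa)
    pi = Counter(seq[i] for seq in msa)
    pj = Counter(seq[j] for seq in msa)
    pij = Counter((seq[i], seq[j]) for seq in msa)
    mi = 0.0
    for (a, b), count in pij.items():
        p_ab = count / n
        mi += p_ab * math.log(p_ab * n * n / (pi[a] * pj[b]))
    return mi

msa = ["AR", "AR", "GK", "GK", "AR", "GK"]
print(mutual_information(msa, 0, 1))  # = ln 2 ~= 0.693, perfect covariation
```

Raw MI conflates direct and indirect (transitive) correlations, which is exactly why global models were needed to reach contact-level accuracy.
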

S-Glutathionylation of Cryptic Cysteines Enhances Titin Elasticity by Blocking Protein Folding

Jorge Alegre-Cebollada, P Kosuri, D Giganti, E Eckels, JA Rivas-Pardo, et al.
Cell, Mar 13, 2014; 156: 1235–1246. http://dx.doi.org/10.1016/j.cell.2014.01.056

The giant elastic protein titin is a determinant factor in how much blood fills the left ventricle during diastole and thus in the etiology of heart disease. Titin has been identified as a target of S-glutathionylation, an end product of the nitric-oxide-signaling cascade that increases cardiac muscle elasticity. However, it is unknown how S-glutathionylation may regulate the elasticity of titin and cardiac tissue.
Here, we show that mechanical unfolding of titin immunoglobulin (Ig) domains exposes buried cysteine residues, which then can be S-glutathionylated. S-glutathionylation of cryptic cysteines greatly decreases the mechanical stability of the parent Ig domain as well as its ability to fold. Both effects favor a more extensible state of titin. Furthermore, we demonstrate that S-glutathionylation of cryptic cysteines in titin mediates mechanochemical modulation of the elasticity of human cardiomyocytes.
We propose that posttranslational modification of cryptic residues is a general mechanism to regulate tissue elasticity.

Encounter complexes and dimensionality reduction in protein–protein association

Dima Kozakov, Keyong Li, David R Hall, Dmitri Beglov, Jiefu Zheng, et al.
eLife 2014;3:e01370 http://dx.doi.org/10.7554/eLife.01370.001

An outstanding challenge has been to understand the mechanism whereby proteins associate. We report here the results of exhaustively sampling the conformational space in protein–protein association using a physics-based energy function. The agreement between experimental intermolecular paramagnetic relaxation enhancement (PRE) data and the PRE profiles calculated from the docked structures shows that the method captures both specific and non-specific encounter complexes. To explore the energy landscape in the vicinity of the native structure, the nonlinear manifold describing the relative orientation of two solid bodies is projected onto a Euclidean space in which the shape of low energy regions is studied by principal component analysis. Results show that the energy surface is canyon-like, with a smooth funnel within a two dimensional subspace capturing over 75% of the total motion. Thus, proteins tend to associate along preferred pathways, similar to sliding of a protein along DNA in the process of protein-DNA recognition.
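The "fraction of total motion captured" is the share of variance explained by the top principal components. A minimal 2D version, using the closed-form eigenvalues of a 2x2 covariance matrix and invented sample points, looks like:

```python
# Share of total variance captured by the top principal component of 2D
# data (the 2x2 covariance matrix has closed-form eigenvalues).
def explained_fraction_2d(points):
    """Fraction of total variance along the first principal axis."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / n
    syy = sum((y - my) ** 2 for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points) / n
    trace, det = sxx + syy, sxx * syy - sxy * sxy
    lam_max = trace / 2 + (trace * trace / 4 - det) ** 0.5  # top eigenvalue
    return lam_max / trace

# Nearly collinear points: the first component captures almost everything.
print(explained_fraction_2d([(0.0, 0.0), (1.0, 1.1), (2.0, 1.9)]))  # ~0.998
```

In the study's setting the data are docked orientations rather than 2D points, but the statistic is the same: a two-dimensional subspace accounting for over 75% of the motion indicates a funnel-like, low-dimensional association pathway.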

Cardiovascular Proteomics: Evolution and Potential

Kent Arrell, Irina Neverova and Jennifer E. Van Eyk
Circ Res. 2001;88:763-773 http://dx.doi.org/10.1161/hh0801.090193

The development of proteomics is a timely one for cardiovascular research. Analyses at the organ, subcellular, and molecular levels have revealed dynamic, complex, and subtle intracellular processes associated with heart and vascular disease. The power and flexibility of proteomic analyses, which facilitate protein separation, identification, and characterization, should hasten our understanding of these processes at the protein level. Properly applied, proteomics provides researchers with cellular protein “inventories” at specific moments in time, making it ideal for documenting protein modification due to a particular disease, condition, or treatment. This is accomplished through the establishment of species- and tissue-specific protein databases, providing a foundation for subsequent proteomic studies. Evolution of proteomic techniques has permitted more thorough investigation into molecular mechanisms underlying cardiovascular disease, facilitating identification not only of modified proteins but also of the nature of their modification. Continued development should lead to functional proteomic studies, in which identification of protein modification, in conjunction with functional data from established biochemical and physiological methods, has the ability to further our understanding of the interplay between proteome change and cardiovascular disease.

Advances in Proteomic Technologies and Its Contribution to the Field of Cancer

Mehdi Mesri

Advances in Medicine  2014, Article ID 238045, 25 pages http://dx.doi.org/10.1155/2014/238045

Systematic studies of the cancer genome have generated a wealth of knowledge in recent years. These studies have uncovered a number of new cancer genes not previously known to be causal targets in cancer. Genetic markers can be used to determine predisposition to tumor development, but molecularly targeted treatment strategies are not widely available for most cancers. Precision care plans still must be developed by understanding and implementing basic science research into clinical treatment. Proteomics is continuing to make major strides in the discovery of fundamental biological processes as well as more recent transition into an assay platform capable of measuring hundreds of proteins in any biological system. As such, proteomics can translate basic science discoveries into the clinical practice of precision medicine. The proteomic field has progressed at a fast rate over the past five years in technology, breadth and depth of applications in all areas of the bioscience. Some of the previously experimental technical approaches are considered the gold standard today, and the community is now trying to come to terms with the volume and complexity of the data generated. Here I describe contribution of proteomics in general and biological mass spectrometry in particular to cancer research, as well as related major technical and conceptual developments in the field.

Chemoproteomics reveals Toll-like receptor fatty acylation

Nicholas M Chesarino, Jocelyn C Hach, James L Chen, Balyn W Zaro, et al.
BMC Biology 2014, 12:91 http://www.biomedcentral.com/1741-7007/12/91

Background: Palmitoylation is a 16-carbon lipid post-translational modification that increases protein hydrophobicity. This form of protein fatty acylation is emerging as a critical regulatory modification for multiple aspects of cellular interactions and signaling. Despite recent advances in the development of chemical tools for the rapid identification and visualization of palmitoylated proteins, the palmitoyl proteome has not been fully defined. Here we sought to identify and compare the palmitoylated proteins in murine fibroblasts and dendritic cells.
Results: A total of 563 putative palmitoylation substrates were identified, more than 200 of which have not been previously suggested to be palmitoylated in past proteomic studies. Here we validate the palmitoylation of several new proteins including Toll-like receptors (TLRs) 2, 5 and 10, CD80, CD86, and NEDD4. Palmitoylation of TLR2, which was uniquely identified in dendritic cells, was mapped to a transmembrane domain-proximal cysteine. Inhibition of TLR2 S-palmitoylation pharmacologically or by cysteine mutagenesis led to decreased cell surface expression and a decreased inflammatory response to microbial ligands.

Conclusions: This work identifies many fatty acylated proteins involved in fundamental cellular processes as well as cell type-specific functions, highlighting the value of examining the palmitoyl proteomes of multiple cell types. S-palmitoylation of TLR2 is a previously unknown immunoregulatory mechanism that represents an entirely novel avenue for modulation of TLR2 inflammatory activity.

Comparative Proteomics and Network Analysis Identify PKC Epsilon Underlying Long-Chain Fatty Acid Signaling

T Yonezawa, R Kurata, A Tajima, X Cui, H Maruta, H Nakaoka, K Nakajima and H Inokio
J Proteomics Bioinform 2014: 7:11 http://dx.doi.org/10.4172/jpb.1000337

Long-chain fatty acids play myriad roles in the biological function of cells, not only as energy substrates but also as substrates for cell membrane synthesis and as precursors of intracellular signaling molecules. However, little is known about the biological pathways stimulated by long-chain fatty acids. To identify these pathways, we performed 2-dimensional gel electrophoresis on cells treated with or without oleate, analyzed 648 protein spots using PDQuest software, and narrowed them down to 22 significantly changing spots by a statistical criterion. We then sought to identify these spots by MALDI-QIT-TOF-MS and SWISS-PROT database queries. We identified 11 proteins and predicted their biological network using available protein–protein interaction data sets. This prediction indicated that several protein kinase Cs (PKCs) underlie long-chain fatty acid signaling. Indeed, oleate stimulated the predicted PKC pathways. In expression arrays, oleate significantly up-regulated only PKC epsilon, and no other PKCs, at the transcriptional level. Collectively, our proteomics and network analyses indicate that the PKC epsilon pathway plays an important role in long-chain fatty acid signaling.
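The spot-narrowing step above can be sketched as a simple fold-change plus significance filter. The spot records, thresholds, and field names below are invented for illustration, and the p-values are assumed precomputed (the abstract does not state the exact statistical criterion used):

```python
# Illustrative narrowing of 2D-gel spots: keep spots whose |log2 fold
# change| exceeds a threshold and whose (precomputed) p-value is below
# alpha. All values here are invented.
import math

def significant_spots(spots, min_abs_log2fc=1.0, alpha=0.05):
    """Return ids of spots passing both fold-change and p-value cuts."""
    return [s["id"] for s in spots
            if abs(math.log2(s["oleate"] / s["control"])) >= min_abs_log2fc
            and s["p"] < alpha]

spots = [
    {"id": "spot_001", "control": 100.0, "oleate": 260.0, "p": 0.01},
    {"id": "spot_002", "control": 100.0, "oleate": 110.0, "p": 0.60},
    {"id": "spot_003", "control": 200.0, "oleate": 90.0,  "p": 0.03},
]
print(significant_spots(spots))  # -> ['spot_001', 'spot_003']
```

Requiring both a magnitude cut and a significance cut is the usual guard against calling spots that are statistically detectable but biologically trivial, or large but noisy.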

Editorial: The art of proteomics translation

Translational Proteomics 2013; 1: 1–2 http://dx.doi.org/10.1016/j.trprot.2013.03.001

Over the years, the difficulties of transferring fundamental proteomics discoveries to clinical applications have caused a lot of frustration to proteomics researchers and clinicians alike, in both academia and industry. One of the reasons for this barrier is the lack of understanding between basic scientists and physicians: they have been trained using opposing concepts. Whilst the former want to control and understand all variables, the latter need rapid actions on patients, rather than absolute certainties. Both disciplines are difficult to condense into a single scientist and therefore interdisciplinary associations need to be fostered. Translational research has often been viewed as a two-way street: bedside to bench, and back to bedside. We should perhaps look at it as a roundabout, with the patient and his disease in the center, surrounded by a constant, iterative interplay between basic, translational and clinical scientists, from both the public and private sectors. Proteomics research needs more than just a translation road bridge from discoveries to cures. Rather, it requires networks of road junctions to fill all the gaps and to allow cross-fertilization and synergies. Translational research and translational proteomics are more than just interesting concepts and hot keywords, they are supposed to improve the quality of people’s lives. With the launch of Translational Proteomics, we want to help the scientific and medical communities overcome the challenges on the long path from discovery to patient care. By focusing on connecting basic proteomics research to its ultimate clinical applications, the Journal will provide a space for publications detailing proteomics experiments, from early discovery to validation and the bedside.

Structural Basis of Diverse Membrane Target Recognitions by Ankyrins

C Wang, Z Wei, K Chen, F Ye, C Yu, V Bennett, and M Zhang
eLife 2014; http://dx.doi.org/10.7554/eLife.04353

Ankyrin adaptors together with their spectrin partners coordinate diverse ion channels and cell adhesion molecules within plasma membrane domains and thereby promote physiological activities including fast signaling in the heart and nervous system. Ankyrins specifically bind to numerous membrane targets through their 24 ankyrin repeats (ANK repeats), although the mechanism for the facile and independent evolution of these interactions has not been resolved. Here we report the structures of ANK repeats in complex with an inhibitory segment from the C-terminal regulatory domain and with a sodium channel Nav1.2 peptide, respectively, showing that the extended, extremely conserved inner groove spanning the entire ANK repeat solenoid contains multiple target binding sites capable of accommodating target proteins with very diverse sequences via combinatorial usage of these sites. These structures establish a framework for understanding the evolution of ankyrins’ membrane targets, with implications for other proteins containing extended ANK repeat domains.

Fusion of Protein Aggregates Facilitates Asymmetric Damage Segregation

Miguel Coelho, Steven J. Lade, Simon Alberti, Thilo Gross, Iva M. Tolic
PLOS Biology June 2014; 12(6):e1001886
http://dx.doi.org/10.1371/journal.pbio.1001886

Asymmetric segregation of damaged proteins at cell division generates a cell that retains damage and a clean cell that supports population survival. In cells that divide asymmetrically, such as Saccharomyces cerevisiae, segregation of damaged proteins is achieved by retention and active transport. We have previously shown that in the symmetrically dividing Schizosaccharomyces pombe there is a transition between symmetric and asymmetric segregation of damaged proteins. Yet how this transition and generation of damage-free cells are achieved remained unknown. Here, by combining in vivo imaging of Hsp104-associated aggregates, a form of damage, with mathematical modeling, we find that fusion of protein aggregates facilitates asymmetric segregation. Our model predicts that, after stress, the increased number of aggregates fuse into a single large unit, which is inherited asymmetrically by one daughter cell, whereas the other one is born clean. We experimentally confirmed that fusion increases segregation asymmetry, for a range of stresses, and identified Hsp16 as a fusion factor. Our work shows that fusion of protein aggregates promotes the formation of damage-free cells. Fusion of cellular factors may represent a general mechanism for their asymmetric segregation at division.

Symmetric exchange of multi-protein building blocks between stationary focal adhesions and the cytosol

Jan-Erik Hoffmann, Y Fermin, R LO Stricker, K Ickstadt, E Zamir
eLife 2014;3:e02257. http://dx.doi.org/10.7554/eLife.02257.001

How can the integrin adhesome get self-assembled locally, rapidly, and correctly as diverse cell-matrix adhesion sites? Here, we investigate this question by exploring the cytosolic state of integrin-adhesome components and their dynamic exchange between adhesion sites and cytosol. Using fluorescence cross-correlation spectroscopy (FCCS) and fluorescence recovery after photo-bleaching (FRAP) we found that the integrin adhesome is extensively pre-assembled already in the cytosol as multi-protein building blocks for adhesion sites. Stationary focal adhesions symmetrically release the same types of protein complexes that they recruit, thereby keeping the cytosolic pool of building blocks spatiotemporally uniform. We propose a model in which multi-protein building blocks enable rapid and modular self-assembly of adhesion sites, and in which the symmetric exchange of these building blocks preserves their specifications and thus the assembly logic of the system.

Redox signaling via the molecular chaperone BiP protects cells against endoplasmic reticulum-derived oxidative stress

Jie Wang, Kristeen A Pareja, Chris A Kaiser, Carolyn S Sevier
eLife 2014;3:e03496. http://dx.doi.org/10.7554/eLife.03496

Oxidative protein folding in the endoplasmic reticulum (ER) has emerged as a potentially significant source of cellular reactive oxygen species (ROS). Recent studies suggest that levels of ROS generated as a byproduct of oxidative folding rival those produced by mitochondrial respiration. Mechanisms that protect cells against oxidant accumulation within the ER have begun to be elucidated, yet many questions remain regarding how cells prevent oxidant-induced damage from ER folding events. Here we report a new role for a central, well-characterized player in ER homeostasis as a direct sensor of ER redox imbalance. Specifically, we show that a conserved cysteine in the lumenal chaperone BiP is susceptible to oxidation by peroxide, and we demonstrate that oxidation of this conserved cysteine disrupts BiP’s ATPase cycle. We propose that alteration of BiP activity upon oxidation helps cells cope with disruption to oxidative folding within the ER during oxidative stress.

Current perspectives on cadherin-cytoskeleton interactions and dynamics

Xuan Liang, Guillermo A Gomez, Alpha S Yap
Cell Health and Cytoskeleton 2015:7 11–24
http://dx.doi.org/10.2147/CHC.S76107

Cells are linked together dynamically by adhesion molecules, such as the classical cadherins. E-cadherin, which mediates epithelial cell–cell interactions, plays fundamental roles in tissue organization and is often perturbed in diseases such as cancer. It has long been recognized that the biology of E-cadherin arises from cooperation between adhesion and the actin cytoskeleton. A major feature is the generation of contractile forces at junctions, yielding patterns of tension that contribute to tissue integrity and patterning. Here we discuss recent developments in understanding how cadherin junctions integrate signaling and cytoskeletal dynamics to sense and generate force.

N-glycosylation status of E-cadherin controls cytoskeletal dynamics through the organization of distinct β-catenin- and γ-catenin-containing AJs

Basem T Jamal, M Nita-Lazar, Z Gao, B Amin, J Walker, MA Kukuruzinska
Cell Health and Cytoskeleton 2009:1 67–80

N-glycosylation of E-cadherin has been shown to inhibit cell–cell adhesion. Specifically, our recent studies have provided evidence that the reduction of E-cadherin N-glycosylation promoted the recruitment of stabilizing components, vinculin and serine/threonine protein phosphatase 2A (PP2A), to adherens junctions (AJs) and enhanced the association of AJs with the actin cytoskeleton. Here, we examined the details of how N-glycosylation of E-cadherin affected the molecular organization of AJs and their cytoskeletal interactions. Using the hypoglycosylated E-cadherin variant, V13, we show that V13/β-catenin complexes preferentially interacted with PP2A and with the microtubule motor protein dynein. This correlated with dephosphorylation of the microtubule-associated protein tau, suggesting that increased association of PP2A with V13-containing AJs promoted their tethering to microtubules. On the other hand, V13/γ-catenin complexes associated more with vinculin, suggesting that they mediated the interaction of AJs with the actin cytoskeleton. N-glycosylation driven changes in the molecular organization of AJs were physiologically significant because transfection of V13 into A253 cancer cells, lacking both mature AJs and tight junctions (TJs), promoted the formation of stable AJs and enhanced the function of TJs to a greater extent than wild-type E-cadherin. These studies provide the first mechanistic insights into how N-glycosylation of E-cadherin drives changes in AJ composition through the assembly of distinct β-catenin- and γ-catenin-containing scaffolds that impact the interaction with different cytoskeletal components.

Mapping the dynamics of force transduction at cell-cell junctions of epithelial clusters

Mei Rosa Ng, Achim Besser, Joan S. Brugge, Gaudenz Danuser
eLife 2014;10.7554/eLife.03282
http://dx.doi.org/10.7554/eLife.03282

Force transduction at cell-cell adhesions regulates tissue development, maintenance and adaptation. We developed computational and experimental approaches to quantify, with both subcellular and multi-cellular resolution, the dynamics of force transmission in cell clusters. Applying this technology to spontaneously-forming adherent epithelial cell clusters, we found that basal force fluctuations were coupled to E-cadherin localization at the level of individual cell-cell junctions. At the multi-cellular scale, cell-cell force exchange depended on the cell position within a cluster, and was adaptive to reconfigurations due to cell divisions or positional rearrangements. Importantly, force transmission through a cell required coordinated modulation of cell-matrix adhesion and actomyosin contractility in the cell and its neighbors. These data provide insights into mechanisms that could control mechanical stress homeostasis in dynamic epithelial tissues, and highlight our methods as a resource for the study of mechanotransduction in cell-cell adhesions.

G-protein-coupled receptor signaling and polarized actin dynamics drive cell-in-cell invasion

Vladimir Purvanov, Manuel Holst, Jameel Khan, Christian Baarlink, Robert Grosse
eLife 2014;3:e02786. http://dx.doi.org/10.7554/eLife.02786

Homotypic or entotic cell-in-cell invasion is an integrin-independent process observed in carcinoma cells exposed during conditions of low adhesion such as in exudates of malignant disease. Although active cell-in-cell invasion depends on RhoA and actin, the precise mechanism as well as the underlying actin structures and assembly factors driving the process are unknown. Furthermore, whether specific cell surface receptors trigger entotic invasion in a signal-dependent fashion has not been investigated. In this study, we identify the G-protein-coupled LPA receptor 2 (LPAR2) as a signal transducer specifically required for the actively invading cell during entosis. We find that G12/13 and PDZ-RhoGEF are required for entotic invasion, which is driven by blebbing and a uropod-like actin structure at the rear of the invading cell. Finally, we provide evidence for an involvement of the RhoA-regulated formin Dia1 for entosis downstream of LPAR2. Thus, we delineate a signaling process that regulates actin dynamics during cell-in-cell invasion.

Cytoskeletal Basis of Ion Channel Function in Cardiac Muscle

Matteo Vatta, and Georgine Faulkner
Future Cardiol. 2006 Jul 1; 2(4): 467–476. http://dx.doi.org/10.2217/14796678.2.4.467

The heart is a force-generating organ that responds to self-generated electrical stimuli from specialized cardiomyocytes. This function is modulated by sympathetic and parasympathetic activity.

In order to contract and accommodate the repetitive morphological changes induced by the cardiac cycle, cardiomyocytes depend on their highly evolved and specialized cytoskeletal apparatus. Defects in components of the cytoskeleton, in the long term, affect the ability of the cell to compensate at both functional and structural levels. In addition to the structural remodeling, the myocardium becomes increasingly susceptible to altered electrical activity leading to arrhythmogenesis. The development of arrhythmias secondary to structural remodeling defects has been noted, although the detailed molecular mechanisms are still elusive. Here I will review the current knowledge of the molecular and functional relationships between the cytoskeleton and ion channels, and discuss the future impact of new data on molecular cardiology research and clinical practice.

Structure and transport mechanism of the sodium/proton antiporter MjNhaP1

Cristina Paulino, D Wöhlert, E Kapotova, Ö Yildiz & W Kühlbrandt
eLife 2014;  http://dx.doi.org/10.7554/eLife.03583

Sodium/proton antiporters are essential for sodium and pH homeostasis and play a major role in human health and disease. We determined the structures of the archaeal sodium/proton antiporter MjNhaP1 in two complementary states. The inward-open state was obtained by X-ray crystallography in the presence of sodium at pH 8, where the transporter is highly active. The outward-open state was obtained by electron crystallography without sodium at pH 4, where MjNhaP1 is inactive. Comparison of both structures reveals a 7° tilt of the 6-helix bundle. Na+ uptake measurements indicate non-cooperative transport with an activity maximum at pH 7.5. We conclude that binding of a Na+ ion from the outside induces helix movements that close the extracellular cavity, open the cytoplasmic funnel, and result in a ~5 Å vertical relocation of the ion binding site to release the substrate ion into the cytoplasm.

Integrated control of transporter endocytosis and recycling by the arrestin-related protein Rod1 and the ubiquitin ligase Rsp5

Michel Becuwe, Sébastien Léon
eLife 2014; http://dx.doi.org/10.7554/eLife.03307

After endocytosis, membrane proteins can recycle to the cell membrane or be degraded in lysosomes. Cargo ubiquitylation favors their lysosomal targeting and can be regulated by external signals, but the mechanism is ill-defined. Here, we studied the post-endocytic trafficking of Jen1, a yeast monocarboxylate transporter, using microfluidics-assisted live cell imaging. We show that the ubiquitin ligase Rsp5 and the glucose-regulated arrestin-related (ART) protein Rod1, involved in the glucose-induced internalization of Jen1, are also required for the post-endocytic sorting of Jen1 to the yeast lysosome. This new step takes place at the trans-Golgi network (TGN), where Rod1 localizes dynamically upon triggering endocytosis. Indeed, transporter trafficking to the TGN after internalization is required for their degradation. Glucose removal promotes Rod1 relocalization to the cytosol and Jen1 deubiquitylation, allowing transporter recycling when the signal is only transient. Therefore, nutrient availability regulates transporter fate through the localization of the ART/Rsp5 ubiquitylation complex at the TGN.

Activation of Cytoplasmic Dynein Motility with Dynactin-Cargo Adapter Complexes

RJ McKenney, W Huynh, ME Tanenbaum, G Bhabha, and RD Vale
Science Express 19 June 2014; http://dx.doi.org/10.1126/science.1254198

Cytoplasmic dynein is a molecular motor that transports a large variety of cargoes (e.g., organelles, mRNAs, and viruses) along microtubules over long intracellular distances. The dynactin protein complex is important for dynein activity in vivo, but its precise role has been unclear. Here, we found that purified mammalian dynein did not move processively on microtubules in vitro. However, when dynein formed a complex with dynactin and one of four different cargo-specific adapter proteins, the motor became ultra-processive, moving for distances similar to those of native cargoes in living cells. Thus, we propose that dynein is largely inactive in the cytoplasm and that a variety of adapter proteins activate processive motility by linking dynactin to dynein only when the motor is bound to its proper cargo.

Removal of surface charge–charge interactions from ubiquitin leaves the protein folded and very stable

Vakhtang V. Loladze And George I. Makhatadze
Protein Science (2002), 11:174–177
http://www.proteinscience.org/cgi/doi/10.1101/ps.29902.

The contribution of solvent-exposed charged residues to protein stability was evaluated using ubiquitin as a model protein. We combined site-directed mutagenesis and specific chemical modification to first replace all Arg residues with Lys, followed by carbamylation of Lys amino groups. Under conditions in which all carboxylic groups are protonated (at pH 2), the chemically modified protein is folded and very stable (ΔG = 18 kJ/mol). These results indicate that surface charge–charge interactions are not essential for protein folding and stability.

Phase Transitions of Multivalent Proteins Can Promote Clustering of Membrane Receptors

Sudeep Banjade and Michael K. Rosen
eLife 2014; http://dx.doi.org/10.7554/eLife.04123

Clustering of proteins into micrometer-sized structures at membranes is observed in many signaling pathways. Most models of clustering are specific to particular systems, and relationships between physical properties of the clusters and their molecular components are not well understood. We report biochemical reconstitution on supported lipid bilayers of protein clusters containing the adhesion receptor Nephrin, and its cytoplasmic partners, Nck and N-WASP. With Nephrin attached to the bilayer, multivalent interactions enable these proteins to polymerize on the membrane surface and undergo two-dimensional phase separation, producing micrometer-sized clusters. Dynamics and thermodynamics of the clusters are modulated by the valencies and affinities of the interacting species. In the presence of the Arp2/3 complex, the clusters assemble actin filaments, suggesting that clustering of regulatory factors could promote local actin assembly at membranes. Interactions between multivalent proteins could be a general mechanism for cytoplasmic adaptor proteins to organize membrane receptors into micrometer-scale signaling zones.

The quantitative architecture of centromeric chromatin

Dani L Bodor, João F Mata, Mikhail Sergeev, Ana Filipa David, et al.
eLife 2014;3:e02137. http://dx.doi.org/10.7554/eLife.02137

The centromere, responsible for chromosome segregation during mitosis, is epigenetically defined by CENP-A containing chromatin. The amount of centromeric CENP-A has direct implications for both the architecture and epigenetic inheritance of centromeres. Using complementary strategies, we determined that typical human centromeres contain ∼400 molecules of CENP-A, which is controlled by a mass-action mechanism. This number, despite representing only ∼4% of all centromeric nucleosomes, forms a ∼50-fold enrichment to the overall genome. In addition, although pre-assembled CENP-A is randomly segregated during cell division, this amount of CENP-A is sufficient to prevent stochastic loss of centromere function and identity. Finally, we produced a statistical map of CENP-A occupancy at a human neocentromere and identified nucleosome positions that feature CENP-A in a majority of cells. In summary, we present a quantitative view of the centromere that provides a mechanistic framework for both robust epigenetic inheritance of centromeres and the paucity of neocentromere formation.
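The quantitative figures in this abstract fit together arithmetically. The back-of-envelope check below uses only the approximate numbers quoted above (~400 CENP-A molecules, ~4% of centromeric nucleosomes, ~50-fold enrichment); the variable names and the calculation itself are illustrative, not taken from the paper:

```python
# Back-of-envelope check of the reported centromere numbers (illustrative).
cenpa_per_centromere = 400   # ~400 CENP-A molecules per typical human centromere
cenpa_fraction = 0.04        # CENP-A occupies ~4% of centromeric nucleosomes

# Nucleosome positions per centromere domain implied by the ~4% figure
centromeric_nucleosomes = cenpa_per_centromere / cenpa_fraction  # ~10,000

# A ~50-fold enrichment means centromeric CENP-A density is ~50x the
# genome-average density of CENP-A-containing nucleosomes.
fold_enrichment = 50
genome_avg_fraction = cenpa_fraction / fold_enrichment  # ~0.08% genome-wide

print(f"nucleosome positions per centromere ~ {centromeric_nucleosomes:.0f}")
print(f"genome-average CENP-A fraction ~ {genome_avg_fraction:.2%}")
```

Dividing ~400 molecules by the ~4% occupancy implies on the order of 10,000 nucleosome positions per centromere domain, and the ~50-fold enrichment implies a genome-average CENP-A fraction of roughly 0.08%.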

Synaptic proteins promote calcium-triggered fast transition from point contact to full fusion

Jiajie Diao, Patricia Grob, Daniel J Cipriano, Minjoung Kyoung
eLife 2012;1:e00109. http://dx.doi.org/10.7554/eLife.00109

The molecular underpinnings of synaptic vesicle fusion for fast neurotransmitter release are still unclear. Here, we used a single vesicle–vesicle system with reconstituted SNARE and synaptotagmin-1 proteoliposomes to decipher the temporal sequence of membrane states upon Ca2+-injection at 250–500 μM on a 100-ms timescale. Furthermore, detailed membrane morphologies were imaged with cryo-electron microscopy before and after Ca2+-injection. We discovered a heterogeneous network of immediate and delayed fusion pathways. Remarkably, all instances of Ca2+-triggered immediate fusion started from a membrane–membrane point-contact and proceeded to complete fusion without discernible hemifusion intermediates. In contrast, pathways that involved a stable hemifusion diaphragm only resulted in fusion after many seconds, if at all. When complexin was included, the Ca2+-triggered fusion network shifted towards the immediate pathway, effectively synchronizing fusion, especially at lower Ca2+-concentration. Synaptic proteins may have evolved to select this immediate pathway out of a heterogeneous network of possible membrane fusion pathways.

Cytoskeleton, cytoskeletal interactions, and vascular endothelial function

Jingli Wang, Michael E Widlansky
Cell Health and Cytoskeleton 2012:4 119–127
http://dx.doi.org/10.2147/CHC.S21823

Far from being inert, the vascular endothelium is a critical regulator of vascular function. While the endothelium participates in autocrine, paracrine, and endocrine signaling, it also transduces mechanical signals from the cell surface involving key cell structural elements. In this review, we discuss the structure of the vascular endothelium and its relationship to traditional cardiovascular risk factors and clinical cardiovascular events. Further, we review the emerging evidence that cell structural elements, including the glycocalyx, intercellular junctions, and cytoskeleton elements, help the endothelium to communicate with its environment to regulate vascular function, including vessel permeability and signal transduction via nitric oxide bioavailability. Further work is necessary to better delineate the regulatory relationships between known key regulators of vascular function and endothelial cell structural elements.

Cellular prion protein is required for neuritogenesis: fine-tuning of multiple signaling pathways involved in focal adhesions and actin cytoskeleton dynamics

Aurélie Alleaume-Butaux, C Dakowski, M Pietri, S Mouillet-Richard, et al.
Cell Health and Cytoskeleton 2013:5 1–12
http://dx.doi.org/10.2147/CHC.S28081

Neuritogenesis is a dynamic phenomenon associated with neuronal differentiation that allows a rather spherical neuronal stem cell to develop dendrites and axon, a prerequisite for the integration and transmission of signals. The acquisition of neuronal polarity occurs in three steps: (1) neurite sprouting, which consists of the formation of buds emerging from the postmitotic neuronal soma; (2) neurite outgrowth, which represents the conversion of buds into neurites, their elongation and evolution into axon or dendrites; and (3) the stability and plasticity of neuronal polarity. In neuronal stem cells, remodeling and activation of focal adhesions (FAs) associated with deep modifications of the actin cytoskeleton is a prerequisite for neurite sprouting and subsequent neurite outgrowth. A multiple set of growth factors and interactors located in the extracellular matrix and the plasma membrane orchestrate neuritogenesis by acting on intracellular signaling effectors, notably small G proteins such as RhoA, Rac, and Cdc42, which are involved in actin turnover and the dynamics of FAs. The cellular prion protein (PrPC), a glycosylphosphatidylinositol (GPI)-anchored membrane protein mainly known for its role in a group of fatal neurodegenerative diseases, has emerged as a central player in neuritogenesis. Here, we review the contribution of PrPC to neuronal polarization and detail the current knowledge on the signaling pathways fine-tuned by PrPC to promote neurite sprouting, outgrowth, and maintenance. We emphasize that PrPC-dependent neurite sprouting is a process in which PrPC governs the dynamics of FAs and the actin cytoskeleton via β1 integrin signaling. The presence of PrPC is necessary to render neuronal stem cells competent to respond to neuronal inducers and to develop neurites. In differentiating neurons, PrPC exerts a facilitator role towards neurite elongation. 
This function relies on the interaction of PrPC with a set of diverse partners such as elements of the extracellular matrix, plasma membrane receptors, adhesion molecules, and soluble factors that control actin cytoskeleton turnover through Rho-GTPase signaling. Once neurons have reached their terminal stage of differentiation and acquired their polarized morphology, PrPC also takes part in the maintenance of neurites. By acting on tissue nonspecific alkaline phosphatase, or matrix metalloproteinase type 9, PrPC stabilizes interactions between neurites and the extracellular matrix.

Broader implications: biological and clinical significance of microtubule acetylation

Sharon M Rymut, Thomas J Kelley
Cell Health and Cytoskeleton 2015:7 71–82
http://dx.doi.org/10.2147/CHC.S77040

Microtubule acetylation is a key posttranslational modification that enhances organelle transport, drives cell signaling, and regulates the cell cycle. The optimal level of microtubule acetylation is regulated by the acetyltransferase alpha-tubulin N-acetyltransferase 1 and two deacetylases, histone deacetylase 6 and sirtuin-2. Alterations in microtubule acetylation levels have been associated with the pathophysiology of a number of diseases, including various forms of neurodegenerative conditions, cancer, and even cystic fibrosis. In this review, we will highlight the biological and clinical significance of microtubule acetylation and the potential of targeting this pathway for therapeutics.

Inositol-1,4,5-trisphosphate (IP3)-mediated STIM1 oligomerization requires intact mitochondrial Ca2+ uptake

AT Deak, S Blass, MJ Khan, LN Groschner, M Waldeck-Weiermair, et al.
Journal of Cell Science 2014, advance online publication

Mitochondria contribute to cell signaling by controlling store-operated Ca2+ entry (SOCE). SOCE is activated by Ca2+ release from the endoplasmic reticulum (ER), whereupon the stromal interacting molecule 1 (STIM1) forms oligomers, redistributes to ER-plasma membrane junctions, and opens plasma membrane Ca2+ channels. The mechanisms by which mitochondria interfere with the complex process of SOCE are insufficiently clarified. In this study we used an shRNA approach to investigate the direct involvement of mitochondrial Ca2+ buffering in SOCE. We demonstrate that knockdown of either of two proteins that are essential for mitochondrial Ca2+ uptake, the mitochondrial calcium uniporter (MCU) or uncoupling protein 2 (UCP2), results in decelerated STIM1 oligomerization and impaired SOCE following cell stimulation with an inositol-1,4,5-trisphosphate (IP3)-generating agonist. Upon artificially augmented cytosolic Ca2+ buffering or ER Ca2+ depletion by sarco/endoplasmic reticulum Ca2+-ATPase (SERCA) inhibitors, STIM1 oligomerization did not rely on intact mitochondrial Ca2+ uptake. However, MCU-dependent mitochondrial sequestration of Ca2+ entering through the SOCE pathway was essential to prevent slow deactivation of SOCE. Our findings show a stimulus-specific contribution of mitochondrial Ca2+ uptake to the SOCE machinery, likely by shaping cytosolic Ca2+ microdomains.

Role of forkhead box protein A3 in age-associated metabolic decline

Xinran Ma, Lingyan Xu, Oksana Gavrilov, and Elisabetta Mueller
PNAS | September 30, 2014 | vol. 111 | no. 39 | 14289–14294
www.pnas.org/cgi/doi/10.1073/pnas.1407640111

Aging is associated with increased adiposity and diminished thermogenesis, but the critical transcription factors influencing these metabolic changes late in life are poorly understood. We recently demonstrated that the winged helix factor forkhead box protein A3 (Foxa3) regulates the expansion of visceral adipose tissue in high-fat diet regimens; however, whether Foxa3 also contributes to the increase in adiposity and the decrease in brown fat activity observed during the normal aging process is currently unknown.
Here we report that during aging, levels of Foxa3 are significantly and selectively up-regulated in brown and inguinal white fat depots, and that middle-aged Foxa3-null mice have increased white fat browning and thermogenic capacity, decreased adipose tissue expansion, improved insulin sensitivity, and increased longevity. Foxa3 gain-of-function and loss-of-function studies in inguinal adipose depots demonstrated a cell-autonomous function for Foxa3 in white fat tissue browning. Furthermore, our analysis revealed that the mechanisms of Foxa3 modulation of brown fat gene programs involve the suppression of peroxisome proliferator-activated receptor γ coactivator 1 α (PGC1α) levels through interference with cAMP responsive element binding protein 1-mediated transcriptional regulation of the PGC1α promoter. Overall, our data demonstrate a role for Foxa3 in energy expenditure and in age-associated metabolic disorders.

Prediction of enzyme function by combining sequence similarity and protein interactions

Jordi Espadaler, Narayanan Eswar, Enrique Querol, Francesc X Avilés, et al.
BMC Bioinformatics 2008, 9:249; http://dx.doi.org/10.1186/1471-2105-9-249

Background: A number of studies have used protein interaction data alone for protein function prediction. Here, we introduce a computational approach for annotation of enzymes, based on the observation that similar protein sequences are more likely to perform the same function if they share similar interacting partners.
Results: The method was tested against the PSI-BLAST program using a set of 3,890 protein sequences for which interaction data were available. For protein sequences that align with at least 40% sequence identity to a known enzyme, the specificity of our method in predicting the first three EC digits increased from 80% to 90% at 80% coverage when compared to PSI-BLAST.
Conclusion: Our method can also be used for proteins for which homologous sequences with known interacting partners can be detected. Thus, our method could increase the specificity of genome-wide enzyme predictions based on sequence matching by PSI-BLAST alone by 10%.
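The core idea of the abstract, that an EC annotation is safer to transfer from a sequence hit when query and hit also share interacting partners, can be sketched as below. This is an illustrative reimplementation, not the authors' code: the thresholds, the Jaccard overlap score, and the function names are all assumptions of this write-up.

```python
# Illustrative sketch: combine sequence identity with interaction-partner
# overlap before transferring an EC annotation. Thresholds are assumed.

def partner_overlap(partners_a, partners_b):
    """Jaccard overlap between two sets of interacting partners."""
    a, b = set(partners_a), set(partners_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

def transfer_ec(identity, partners_query, partners_hit,
                min_identity=0.40, min_overlap=0.2):
    """Accept EC transfer only when sequence and interaction evidence agree."""
    return (identity >= min_identity and
            partner_overlap(partners_query, partners_hit) >= min_overlap)

# Example: a 45%-identity hit sharing 2 of 4 distinct partners is accepted
print(transfer_ec(0.45, {"P1", "P2", "P3"}, {"P2", "P3", "P4"}))
```

Requiring both kinds of evidence is what trades a little coverage for the higher specificity reported in the Results section.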

Plasma Transthyretin Indicates the Direction of both Nitrogen Balance and Retinoid Status in Health and Disease

Ingenbleek Yves and Bienvenu Jacques
The Open Clinical Chemistry Journal, 2008, 1, 1-12

Whatever the nutritional status and the disease condition, the actual transthyretin (TTR) plasma level is determined by opposing influences between anabolic and catabolic alterations. Rising TTR values indicate that synthetic processes prevail over tissue breakdown, with a nitrogen balance (NB) turning positive as a result of efficient nutritional support and/or anti-inflammatory therapy. Declining TTR values point to the failure of sustaining NB as an effect of maladjusted dietetic management and/or further worsening of the morbid condition. Serial measurement of TTR thus appears as a dynamic index defining the direction of NB in acute and chronic disorders, serving as a guide alerting the physician to the validity of the therapeutic strategy. The level of TTR production by the liver also works as a limiting factor for the cellular bioavailability of retinol and retinoid derivatives, which play major roles in the brain ageing process. Optimal protein nutritional status, as assessed by TTR values within the normal range, prevents the occurrence of vascular and cerebral damage while maintaining the retinoid-mediated memory, cognitive and behavioral activities of elderly persons.
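The abstract's reading rule for serial TTR measurements (rising values indicate NB turning positive, declining values negative) can be stated as a tiny helper. This is purely an illustration of the described logic, an assumption of this write-up and in no way a clinical tool; the function name and units are hypothetical:

```python
# Illustrative only: classify the direction of nitrogen balance (NB) from
# serial transthyretin (TTR) values, per the rule described in the abstract.
def nitrogen_balance_direction(ttr_series_mg_dl):
    """Classify the trend of serial TTR measurements (first vs latest value)."""
    if len(ttr_series_mg_dl) < 2:
        return "insufficient data"
    delta = ttr_series_mg_dl[-1] - ttr_series_mg_dl[0]
    if delta > 0:
        return "positive (anabolism prevailing)"
    if delta < 0:
        return "negative (catabolism prevailing)"
    return "stable"

print(nitrogen_balance_direction([18, 21, 25]))  # rising series
```

The point of the sketch is that TTR is read as a serial, directional index rather than as a single cut-off value.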

Prof. Dr. Volker Haucke
Institut für Chemie-Biochemie
Takustrasse 6
http://userpage.chemie.fu-berlin.de/biochemie/aghaucke/teaching.html

Eukaryotic cells contain three major types of cytoskeletal filaments

Intermediate Filaments support the nuclear membrane and connect cells at cell junctions

microtubules (MTs; green) radiate from MTOCs (yellow) towards the cell periphery

Actin polymerization in vitro reveals a critical dependence of filament assembly on G-actin concentration via a 3-step nucleation mechanism
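The critical-concentration behavior named in the caption above can be illustrated with a minimal elongation-rate model: the net rate of subunit addition at a filament end is k_on·[G-actin] − k_off, so net assembly occurs only above the critical concentration Cc = k_off/k_on. The rate constants below are arbitrary illustrative values, not measurements, and the sketch deliberately ignores the 3-step nucleation mechanism itself:

```python
# Minimal sketch of the critical-concentration idea for actin elongation.
# Rate constants are illustrative, not measured values.
k_on = 10.0   # subunit addition rate constant (per uM per s), assumed
k_off = 1.0   # subunit loss rate (per s), assumed
Cc = k_off / k_on   # critical concentration: 0.1 uM with these values

def net_elongation_rate(g_actin_uM):
    """Net subunits added per filament end per second."""
    return k_on * g_actin_uM - k_off

for c in (0.05, 0.1, 0.5):
    rate = net_elongation_rate(c)
    state = "growth" if rate > 0 else ("steady" if rate == 0 else "shrinkage")
    print(f"[G-actin] = {c} uM -> net rate {rate:+.2f} sub/s ({state})")
```

Below Cc filaments shrink, above it they grow; nucleation (the slow 3-step assembly of a seed) only adds a lag before this elongation regime takes over.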

Binding-proteins and receptors

Motor, visual and emotional deficits in mice after closed-head mild traumatic brain injury are alleviated by the novel CB2 inverse agonist SMM-189
Reiner, A., Heldt, S.A., Presley, C.S., (…), Gurley, S.N., Moore, B.M.
2015  International Journal of Molecular Sciences 16 (1), pp. 758-787

We have developed a focal blast model of closed-head mild traumatic brain injury (TBI) in mice. As is true for individuals that have experienced mild TBI, mice subjected to 50-60 psi blast show motor, visual and emotional deficits, diffuse axonal injury and microglial activation, but no overt neuron loss. Because microglial activation can worsen brain damage after a concussive event and because microglia can be modulated by their cannabinoid type 2 receptors (CB2), we evaluated the effectiveness of the novel CB2 receptor inverse agonist SMM-189 in altering microglial activation and mitigating deficits after mild TBI. In vitro analysis indicated that SMM-189 converted human microglia from the pro-inflammatory M1 phenotype to the pro-healing M2 phenotype. Studies in mice showed that daily administration of SMM-189 for two weeks beginning shortly after blast greatly reduced the motor, visual, and emotional deficits otherwise evident after 50-60 psi blasts, and prevented brain injury that may contribute to these deficits. Our results suggest that treatment with the CB2 inverse agonist SMM-189 after a mild TBI event can reduce its adverse consequences by beneficially modulating microglial activation. These findings recommend further evaluation of CB2 inverse agonists as a novel therapeutic approach for treating mild TBI.

The novel small leucine-rich protein chondroadherin-like (CHADL) is expressed in cartilage and modulates chondrocyte differentiation
Tillgren, V., Ho, J.C.S., Önnerfjord, P., Kalamajski, S.
2015  Journal of Biological Chemistry 290 (2), pp. 918-925

The constitution and biophysical properties of extracellular matrices can dramatically influence cellular phenotype during development, homeostasis, or pathogenesis. These effects can be signaled through a differentially regulated assembly of collagen fibrils, orchestrated by a family of collagen-associated small leucine-rich proteins (SLRPs). In this report, we describe the tissue-specific expression and function of a previously uncharacterized SLRP, chondroadherin-like (CHADL). We developed antibodies against CHADL and, by immunohistochemistry, detected CHADL expression mainly in skeletal tissues, particularly in fetal cartilage and in the pericellular space of adult chondrocytes. In situ hybridizations and immunoblots on tissue lysates confirmed this tissue-specific expression pattern. Recombinant CHADL bound collagen in cell culture and inhibited in vitro collagen fibrillogenesis. After Chadl shRNA knockdown, chondrogenic ATDC5 cells increased their differentiation, indicated by increased transcript levels of Sox9, Ihh, Col2a1, and Col10a1. The knockdown increased collagen II and aggrecan deposition in the cell layers.

Microarray analysis of the knockdown samples suggested collagen receptor-related changes, although other upstream effects could not be excluded. Together, our data indicate that the novel SLRP CHADL is expressed in cartilaginous tissues, influences collagen fibrillogenesis, and modulates chondrocyte differentiation. CHADL appears to have a negative regulatory role, possibly ensuring the formation of a stable extracellular matrix.

P53 protein-mediated Up-regulation of MAP kinase phosphatase 3 (MKP-3) contributes to the establishment of the cellular senescent phenotype through dephosphorylation of extracellular signal-regulated kinase 1/2 (ERK1/2)
Zhang, H., Chi, Y., Gao, K., Zhang, X., Yao, J.
2015  Journal of Biological Chemistry 290 (2), pp. 1129-1140

Growth arrest is one of the essential features of cellular senescence. At present, the precise mechanisms responsible for the establishment of the senescence-associated arrested phenotype are still incompletely understood. Given that ERK1/2 is one of the major kinases controlling cell growth and proliferation, we examined the possible implication of ERK1/2. Exposure of normal rat epithelial cells to etoposide caused cellular senescence, as manifested by enlarged cell size, a flattened cell body, reduced cell proliferation, enhanced β-galactosidase activity, and elevated p53 and p21. Senescent cells displayed a blunted response to growth factor-induced cell proliferation, which was preceded by impaired ERK1/2 activation. Further analysis revealed that senescent cells expressed a significantly higher level of mitogen-activated protein kinase phosphatase 3 (MKP-3, a cytosolic ERK1/2-targeted phosphatase), which was suppressed by blocking the transcriptional activity of the tumor suppressor p53 with pifithrin-α. Inhibition of MKP-3 activity with a specific inhibitor or siRNA enhanced basal ERK1/2 phosphorylation and promoted cell proliferation. Apart from its role in growth arrest, impairment of ERK1/2 also contributed to the resistance of senescent cells to oxidant-elicited cell injury. These results therefore indicate that p53-mediated up-regulation of MKP-3 contributes to the establishment of the senescent cellular phenotype through dephosphorylating ERK1/2. Impairment of ERK1/2 activation could be an important mechanism by which p53 controls cellular senescence.

Dynamics and interaction of Interleukin-4 receptor subunits in living cells
Gandhi, H., Worch, R., Kurgonaite, K., (…), Bökel, C., Weidemann, T.
2015  Biophysical Journal 107 (11), pp. 2515-2527

It has long been established that dimerization of Interleukin-4 receptor (IL-4R) subunits is a pivotal step for JAK/STAT signal transduction. However, ligand-induced complex formation at the surface of living cells has been challenging to observe. Here we report an experimental assay employing trisNTA dyes for orthogonal, external labeling of eGFP-tagged receptor constructs that allows the quantification of receptor heterodimerization by dual-color fluorescence cross-correlation spectroscopy. Fluorescence cross-correlation spectroscopy analysis at the plasma membrane shows that IL-4R subunit dimerization is indeed a strictly ligand-induced process.

Under conditions of saturating cytokine occupancy, we determined intramembrane dissociation constants (Kd,2D) of 180 and 480 receptors per μm² for the type-2 complexes IL-4:IL-4Rα/IL-13Rα1 and IL-13:IL-13Rα1/IL-4Rα, respectively. For the lower-affinity type-1 complex IL-4:IL-4Rα/IL-2Rγ, we estimated a Kd,2D of ≈1000 receptors per μm². The receptor densities required for effective dimerization thus exceed the typical, average expression levels by several orders of magnitude. In addition, we find that all three receptor subunits accumulate rapidly within a subpopulation of early sorting and recycling endosomes stably anchored just beneath the plasma membrane (cortical endosomes, CEs). The receptors, as well as labeled IL-4 and trisNTA ligands, are specifically trafficked into CEs by a constitutive internalization mechanism. This may compensate for the inherently weak affinities that govern ligand-induced receptor dimerization at the plasma membrane. Consistently, activated receptors are also concentrated at the CEs. Our observations thus suggest that receptor trafficking may play an important role in the regulation of IL-4R-mediated JAK/STAT signaling.
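For orientation, a two-dimensional dissociation constant can be turned into an expected dimer density by simple mass action. The sketch below is illustrative only (the function name and the assumption of equal total densities for the two subunits are mine, not from the paper): for a 1:1 complex with total subunit density R (receptors per square micron) for each partner and dissociation constant Kd,2D, the dimer density d solves (R − d)²/d = Kd,2D.

```python
import math

def dimer_density(r_total, kd_2d):
    """Density of 1:1 dimers (per square micron) assuming equal total
    densities r_total for both subunits and a 2D dissociation constant
    kd_2d.  Mass action gives (r - d)^2 / d = kd, i.e.
    d^2 - (2r + kd) d + r^2 = 0; the smaller root is the physical one."""
    b = 2 * r_total + kd_2d
    return (b - math.sqrt(b * b - 4 * r_total ** 2)) / 2

# At a surface density equal to Kd,2D (e.g. 180 per square micron, the
# type-2 complex value quoted above), only ~38% of subunits are dimerized,
# which illustrates why typical expression levels give little dimerization.
print(dimer_density(180, 180) / 180)
```

This also makes the paper's point concrete: at densities far below Kd,2D the dimerized fraction falls roughly linearly with density, so concentrating receptors in cortical endosomes is one way to compensate.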

Role of mitochondria in nonalcoholic fatty liver disease
Nassir, F., Ibdah, J.A.
2015  International Journal of Molecular Sciences 15 (5), pp. 8713-8742

Nonalcoholic fatty liver disease (NAFLD) affects about 30% of the general population in the United States and comprises a spectrum of disease that includes simple steatosis, non-alcoholic steatohepatitis (NASH), fibrosis, and cirrhosis. Significant insight has been gained into our understanding of the pathogenesis of NAFLD; however, the key metabolic aberrations underlying lipid accumulation in hepatocytes and the progression of NAFLD remain to be elucidated. Accumulating and emerging evidence indicates that hepatic mitochondria play a critical role in the development and pathogenesis of steatosis and NAFLD. Here, we review studies that document a link between the pathogenesis of NAFLD and hepatic mitochondrial dysfunction, with particular focus on new insights into the role of impaired fatty acid oxidation, the transcription factor peroxisome proliferator-activated receptor-γ coactivator-1α (PGC-1α), and sirtuins in the development and progression of NAFLD.

Read Full Post »

The Union of Biomarkers and Drug Development

Author and Curator: Larry H. Bernstein, MD, FCAP

There has been consolidation going on for over a decade in both the pharmaceutical and the diagnostics industry, and at the same time the page is being rewritten for health care delivery. I shall try to work through a clear picture of these not-coincidental events.

Key notables:

  1. A growing segment of the US population is reaching Medicare age
  2. There is also a large underserved population in both metropolitan and nonurban areas and a fragmentation of the middle class after a growth slowdown in the economy since the 2008 deep recession.
  3. The deep recession affecting worldwide economies was only buffered by availability of oil or natural gas.
  4. In addition, a self-destructive strategy to cut spending on a national scale withdrew the support that would have bolstered infrastructure renewal.
  5. The clinical diagnostics industry, despite a long history of being viewed as a loss leader, has had dramatic success; this has recently been followed by a pharmaceutical industry struggling to introduce new products and facing more competition from off-patent medications.
  6. The introduction of the Accountable Care Act has opened the opportunities for improved care, despite political opposition, and has probably sustained opportunity in the healthcare market.

Let’s take a look at this three-headed serpent – Pharma, Diagnostics, New Entity – and the parties caught in the middle:

  • The patient
  • Insurance
  • The physician

Part 1.   The Concept

When Illumina Buys Roche: The Dawning Of The Era Of Diagnostics Dominance

Robert J. Easton, Alain J. Gilbert, Olivier Lesueur, Rachel Laing, and Mark Ratner
http://PharmaMedtechBI.com | IN VIVO: The Business & Medicine Report Jul/Aug 2014; 32(7).

  • With current technology and resources, a well-funded IVD company can create and pursue a strategy of information gathering and informatics application to create medical knowledge, enabling it to assume the risk and manage certain segments of patients
  • We see the first step in the process as the emergence of new specialty therapy companies coming from an IVD legacy, most likely focused in cancer, infection, or critical care

When Illumina Inc. acquired the regulatory consulting firm Myraqa, a specialist in in vitro diagnostics (IVD), in July, the press release announcement characterized the deal as one that would bolster Illumina’s in-house capabilities for clinical readiness and help prepare for its next growth phase in regulated markets. That’s not surprising given the US Food and Drug Administration’s (FDA) approval a year and a half ago of its MiSeq next-generation sequencer for clinical use. But the deal could also suggest Illumina is beginning to move along the path toward taking on clinical risk – that is, eventually

  • advising physicians and patients, which would mean facing regulators directly

Such a move – by Illumina, another life sciences tools firm, or an information specialist from the high-tech universe – is inevitable given

  • the emerging power of diagnostics and traditional health care players’ reluctance to themselves take on such risk.

Alternatively, we believe that a well-funded diagnostics company could establish this position. Either way, such a champion would establish dominion over, and earn a higher valuation than, less-aggressive players who

  • only supply compartmentalized drug and device solutions.

Diagnostics companies have long been dogged by a fundamental issue:

  1. they are viewed and valued more along the lines of a commodity business than as firms that deliver a unique product or service
  2. diagnostics companies are in position to do just that today because they are now advantaged by having access to more data points.
  3. if they were to cobble together the right capabilities, diagnostics companies would have the ability to turn information into true medical knowledge

Example: PathGEN PathChip

nucleic-acid-based platform detects 296 viruses, bacteria, fungi & parasites

http://ow.ly/d/2GvQ
http://ow.ly/DSORV

This puts the diagnostics player in an unfamiliar realm, where it must ask what value it offers compared with a therapeutic. The key is that diagnostics can now offer unique information, and potentially unique tools to capture that information. To do so, the company has to create information from the data it generates, and then supply that knowledge to users who will value and act on it. Complex genomic tests, as much as the physical examination, may become the first meaningful touch point for physicians’ classification of disease.

Even if lab tests are more expensive, they are a cheaper way to decide what to do first for a patient than the trial and error of prescribing medication without adequate information. Information gains value as the amount of treatment data available on genomically characterizable subpopulations increases. In such a circumstance, the leverage lies in being able to apply a proprietary diagnostics platform – and, importantly, the data it generates. It is the ability to perform that advisory function that will add tremendous value above what any test provides.

Integrated Diagnostics Inc. and Biodesix Inc., with mass spectrometry, have tools for unraveling disease processes, and numerous players are quite visibly in, or getting into, the business of providing medical knowledge and clinical decision support in pursuit of a huge payout for those who actually solve important disease mysteries. Of course, one has to ask whether MS/MS is sufficient for the assigned task, and also whether the technology is ready for the workload of a clinical service as opposed to a research vehicle. My impression (as a reviewer) is that it is not yet time to take this seriously.

Roche has not realized its intent with Ventana, which failed to deliver on the promise of boosting Roche’s pipeline – a significant factor in the high price Roche paid. The combined company was to be “uniquely positioned to further expand Ventana’s business globally and together develop more cost-efficient, differentiated, and targeted medicines.” On the other hand, Biodesix decided to use Veristrat to look back and analyze important trial data to try to ascertain which subset of patients would benefit from ficlatuzumab. The predictive effect for the otherwise unimpressive trial results was observed in both progression-free survival and overall survival endpoints, and encouraged the companies to conduct a proof-of-concept study of ficlatuzumab in combination with Tarceva in advanced non-small cell lung cancer (NSCLC) patients selected using the Veristrat test.

A second phase of IVD evolution will be far more challenging to pharma, when the most accomplished companies begin to assemble and integrate much broader data
sets, thereby gaining knowledge sufficient to actually manage patients and dictate therapy, including drug selection. No individual physician has or will have access to all of this information on thousands of patients, combined with the informatics to tease out from trillions of data points the optimal personalized medical approach. When the IVD-origin knowledge integrator amasses enough data and understanding to guide therapy decisions in large categories, particularly drug choices, it will become more valuable than any of the drug suppliers.

This is an apparent reversal of fortune. The pharmaceutical industry has been considered the valued provider, while the IVD manufacturer has been the low-valued cousin. Now, by making drug administration more accurate, the IVD company can control the drug bill, to the detriment of drug developers, by finding algorithms that generate equal-to-innovative-drug outcomes using generics for most patients, thereby limiting the margins of drug suppliers and the upside for new drug discovery and development.

It is here that there appears to be a misunderstanding of the overall development of the healthcare industry. The pharmaceutical industry enjoyed high added value only insofar as it could replace market leaders for treatment before or at patent expiration, which largely depended on either introducing a new class of drug or relieving the current drug in its class of undesired toxicities or “side effects”. Otherwise, the drug armamentarium was time-limited to the expiration date; in other words, the value depended on a window without competition. In addition, as the regulation of healthcare costs tightened under managed care, new products deemed only marginally better could be substituted with “off-patent” drug products.

The other misunderstanding relates to the IVD sector. Laboratory tests in the 1950s were manual, and they could be done by “technicians” who might not have completed specialized training in clinical laboratory sciences. The first sign of progress was the introduction of continuous-flow chemistry: a sampling probe, tubing to bring the reacting reagents together, and reaction timing controlled by a coiled glass tube before the colored product was introduced into a UV-visible photometer. Within perhaps a decade, the Technicon SMA 12 and SMA 6 instruments were introduced, which together could run up to 18 tests from a single sample.

Part 2. Emergence of an IVD Clinical Automated Diagnostics Industry

Why tests are ordered

  1. Screening
  2. Diagnosis
  3. Monitoring

Historical Perspective

Case in Point 1:  Outstanding Contributions in Clinical Chemistry. 1991. Arthur Karmen.

Dr. Karmen was born in New York City in 1930. He graduated from the Bronx High School of Science in 1946 and earned an A.B. and M.D. in 1950 and 1954, respectively, from New York University. In 1952, while a medical student working on a summer project at Memorial Sloan-Kettering, he used paper chromatography of amino acids to demonstrate the presence of glutamic-oxaloacetic and glutamic-pyruvic transaminases (aspartate and alanine aminotransferases) in serum and blood. In 1954, he devised the spectrophotometric method for measuring aspartate aminotransferase in serum, which, with minor modifications, is still used for diagnostic testing today. When developing this assay, he studied the reaction of NADH with serum and demonstrated the presence of lactate and malate dehydrogenases, both of which were also later used in diagnosis. Using the spectrophotometric method, he found that aspartate aminotransferase increased in the period immediately after an acute myocardial infarction and did the pilot studies that showed its diagnostic utility in heart and liver diseases. This became as important as the EKG. In cardiology usage it was replaced by the MB isoenzyme of creatine kinase, driven by Burton Sobel’s work on infarct size, and later by the troponins.

Case in point 2: Arterial Blood Gases.  Van Slyke. National Academy of Sciences.

The test is used to determine the pH of the blood, the partial pressures of carbon dioxide and oxygen, and the bicarbonate level. Many blood gas analyzers also report concentrations of lactate, hemoglobin, several electrolytes, oxyhemoglobin, carboxyhemoglobin, and methemoglobin. ABG testing is used mainly in pulmonology and critical care medicine to assess gas exchange across the alveolar-capillary membrane.
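The three core ABG quantities are tied together by the Henderson-Hasselbalch relationship. As a quick illustrative sketch (the function name is mine; pKa = 6.1 and a CO2 solubility coefficient of 0.03 mmol/L per mmHg are the standard clinical constants):

```python
import math

def abg_ph(hco3_mmol_l, pco2_mmhg):
    """Henderson-Hasselbalch estimate of arterial pH from bicarbonate
    (mmol/L) and pCO2 (mmHg): pH = 6.1 + log10(HCO3- / (0.03 * pCO2))."""
    return 6.1 + math.log10(hco3_mmol_l / (0.03 * pco2_mmhg))

# Normal values (HCO3- = 24 mmol/L, pCO2 = 40 mmHg) give pH ~ 7.40.
print(round(abg_ph(24, 40), 2))
```

The same relation shows the two-sided control of blood pH that the Van Slyke work below made quantitative: raising pCO2 (respiratory acidosis) or consuming bicarbonate (metabolic acidosis) both lower the ratio inside the logarithm, and hence the pH.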

DONALD DEXTER VAN SLYKE died on May 4, 1971, after a long and productive career that spanned three generations of biochemists and physicians. He left behind not only a bibliography of 317 journal publications and 5 books, but also more than 100 persons who had worked with him and distinguished themselves in biochemistry and academic medicine. His doctoral thesis, with Gomberg at the University of Michigan, was published in the Journal of the American Chemical Society in 1907. Van Slyke received an invitation from Dr. Simon Flexner, Director of the Rockefeller Institute, to come to New York for an interview. In 1911 he spent a year in Berlin with Emil Fischer, then the leading chemist of the scientific world. He was particularly impressed by Fischer’s performing all laboratory operations quantitatively – a procedure Van followed throughout his life. Prior to going to Berlin, he published the classic nitrous acid method for the quantitative determination of primary aliphatic amino groups, the first of the many gasometric procedures devised by Van Slyke, which made possible the determination of amino acids. It was the primary method used to study the amino acid composition of proteins for years before chromatography. Thus, his first seven postdoctoral years centered on the development of better methodology for protein composition and amino acid metabolism.

With his colleague G. M. Meyer, he first demonstrated that amino acids, liberated during digestion in the intestine, are absorbed into the bloodstream, that they are removed by the tissues, and that the liver alone possesses the ability to convert amino acid nitrogen into urea. From the study of the kinetics of urease action, Van Slyke and Cullen developed equations that depended upon two reactions: (1) the combination of enzyme and substrate in stoichiometric proportions and (2) the breakdown of that combination into the end products. Published in 1914, this formulation, involving two velocity constants, was similar to that arrived at contemporaneously by Michaelis and Menten in Germany in 1913.
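In modern notation (the standard textbook symbols, not Van Slyke's own), the two-reaction scheme and the rate law it yields read:

```latex
E + S \underset{k_{-1}}{\overset{k_1}{\rightleftharpoons}} ES
  \xrightarrow{k_2} E + P,
\qquad
v = \frac{V_{\max}\,[S]}{K_m + [S]},
\qquad
K_m = \frac{k_{-1} + k_2}{k_1}
```

The two "velocity constants" of the Van Slyke-Cullen treatment correspond to the formation and breakdown steps of the enzyme-substrate complex ES.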

He transferred to the Rockefeller Institute’s Hospital in 1914, under Dr. Rufus Cole, where “Men who were studying disease clinically had the right to go as deeply into its fundamental nature as their training allowed, and in the Rockefeller Institute’s Hospital every man who was caring for patients should also be engaged in more fundamental study”. The study of diabetes was already under way by Dr. F. M. Allen, but patients inevitably died of acidosis. Van Slyke reasoned that if incomplete oxidation of fatty acids in the body led to the accumulation of acetoacetic and beta-hydroxybutyric acids in the blood, then a reaction would result between these acids and the bicarbonate ions that would lead to a lower-than-normal bicarbonate concentration in blood plasma. The problem thus became one of devising an analytical method that would permit the quantitative determination of bicarbonate concentration in small amounts of blood plasma. He ingeniously devised a volumetric glass apparatus that was easy to use and required less than ten minutes for the determination of the total carbon dioxide in one cubic centimeter of plasma. It also was soon found to be an excellent apparatus with which to determine blood oxygen concentrations, thus leading to measurements of the percentage saturation of blood hemoglobin with oxygen. This found extensive application in the study of respiratory diseases, such as pneumonia and tuberculosis. It also led to the quantitative study of cyanosis and a monograph on the subject by C. Lundsgaard and Van Slyke.

In all, Van Slyke and his colleagues published twenty-one papers under the general title “Studies of Acidosis,” beginning in 1917 and ending in 1934. They included not only chemical manifestations of acidosis, but Van Slyke, in No. 17 of the series (1921), elaborated and expanded the subject to describe in chemical terms the normal and abnormal variations in the acid-base balance of the blood. This was a landmark in understanding acid-base balance pathology.  Within seven years after Van moved to the Hospital, he had published a total of fifty-three papers, thirty-three of them coauthored with clinical colleagues.

In 1920, Van Slyke and his colleagues undertook a comprehensive investigation of gas and electrolyte equilibria in blood. McLean and Henderson at Harvard had made preliminary studies of blood as a physico-chemical system, but realized that Van Slyke and his colleagues at the Rockefeller Hospital had superior techniques and the facilities necessary for such an undertaking. A collaboration thereupon began between the two laboratories, which resulted in rapid progress toward an exact physico-chemical description of the role of hemoglobin in the transport of oxygen and carbon dioxide, of the distribution of diffusible ions and water between erythrocytes and plasma,
and of factors such as degree of oxygenation of hemoglobin and hydrogen ion concentration that modified these distributions. In the course of this work, Van Slyke converted his volumetric gas analysis apparatus into a manometric method, which proved to give results five to ten times more accurate.

A series of papers on the CO2 titration curves of oxy- and deoxyhemoglobin, of oxygenated and reduced whole blood, and of blood subjected to different degrees of oxygenation, and on the distribution of diffusible ions in blood resulted. These developed equations that predicted the change in distribution of water and diffusible ions between blood plasma and blood cells when there was a change in pH of the oxygenated blood. A significant contribution of Van Slyke and his colleagues was the application of the Gibbs-Donnan Law to the blood – regarded as a two-phase system, in which one phase (the erythrocytes) contained a high concentration of nondiffusible negative ions, i.e., those associated with hemoglobin, and cations, which were not freely exchangeable between cells and plasma. By changing the pH through varying the CO2 tension, the concentration of negative hemoglobin charges changed by a predictable amount. This, in turn, changed the distribution of diffusible anions such as Cl− and HCO3− so as to restore the Gibbs-Donnan equilibrium. Redistribution of water occurred to restore osmotic equilibrium. The experimental results confirmed the predictions of the equations.
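The Gibbs-Donnan condition applied in that work can be stated compactly (standard textbook form, not Van Slyke's original notation): at equilibrium, the diffusible monovalent anions distribute between the two phases with a common ratio,

```latex
\frac{[\mathrm{Cl^-}]_{\text{cells}}}{[\mathrm{Cl^-}]_{\text{plasma}}}
= \frac{[\mathrm{HCO_3^-}]_{\text{cells}}}{[\mathrm{HCO_3^-}]_{\text{plasma}}}
= r_{\mathrm{Donnan}}
```

so any change in the nondiffusible hemoglobin charge (driven by pH) forces a predictable redistribution of chloride, bicarbonate, and water, exactly as the equations described above required.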

As a spin-off from the physico-chemical study of the blood, Van undertook, in 1922, to put the concept of buffer value of weak electrolytes on a mathematically exact basis.
This proved to be useful in determining buffer values of mixed, polyvalent, and amphoteric electrolytes, and put the understanding of buffering on a quantitative basis. A
monograph in Medicine entitled “Observation on the Courses of Different Types of Bright’s Disease, and on the Resultant Changes in Renal Anatomy,” was a landmark that
related the changes occurring at different stages of renal deterioration to the quantitative changes taking place in kidney function. During this period, Van Slyke and R. M. Archibald identified glutamine as the source of urinary ammonia. During World War II, Van and his colleagues documented the effect of shock on renal function and, with R. A. Phillips, developed a simple method, based on specific gravity, suitable for use in the field.
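For reference, the buffer value that Van Slyke put on a mathematically exact basis in 1922 is, for a single weak acid of total concentration C and dissociation constant Ka (standard form):

```latex
\beta = \frac{dB}{d(\mathrm{pH})}
      = 2.303\, C\, \frac{K_a\,[\mathrm{H^+}]}{\left(K_a + [\mathrm{H^+}]\right)^2}
```

where B is the amount of strong base added; β peaks at pH = pKa, which is why buffering is quantitatively strongest near the pKa of the buffering species.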

Over 100 of Van’s 300 publications were devoted to methodology. The importance of Van Slyke’s contribution to clinical chemical methodology cannot be overestimated.
These included the blood organic constituents (carbohydrates, fats, proteins, amino acids, urea, nonprotein nitrogen, and phospholipids) and the inorganic constituents (total cations, calcium, chlorides, phosphate, and the gases carbon dioxide, carbon monoxide, and nitrogen). It was said that a Van Slyke manometric apparatus was almost all the special equipment needed to perform most of the clinical chemical analyses customarily performed prior to the introduction of photocolorimeters and spectrophotometers for such determinations.

The progress made in the medical sciences in genetics, immunology, endocrinology, and antibiotics during the second half of the twentieth century obscures at times the progress that was made in basic and necessary biochemical knowledge during the first half. Methods capable of giving accurate quantitative chemical information on biological material had to be painstakingly devised; basic questions on chemical behavior and metabolism had to be answered; and, finally, those factors that adversely modified the normal chemical reactions in the body so that abnormal conditions arise that we characterize as disease states had to be identified.

Viewed in retrospect, he combined in one scientific lifetime (1) basic contributions to the chemistry of body constituents and their chemical behavior in the body, (2) a chemical understanding of physiological functions of certain organ systems (notably the respiratory and renal), and (3) how such information could be exploited in the
understanding and treatment of disease. That outstanding additions to knowledge in all three categories were possible was in large measure due to his sound and broadly based chemical preparation, his ingenuity in devising means of accurate measurements of chemical constituents, and the opportunity given him at the Hospital of the Rockefeller Institute to study disease in company with physicians.

In addition, he found time to work collaboratively with Dr. John P. Peters of Yale on the classic, two-volume Quantitative Clinical Chemistry. In 1922, John P. Peters, who had just gone to Yale from Van Slyke’s laboratory as an Associate Professor of Medicine, was asked by a publisher to write a modest handbook for clinicians describing useful chemical methods and discussing their application to clinical problems. It was originally to be called “Quantitative Chemistry in Clinical Medicine.” He soon found that it was going to be a bigger job than he could handle alone and asked Van Slyke to join him in writing it. Van agreed, and the two men proceeded to draw up an outline and divide up the writing of the first drafts of the chapters between them. They also agreed to exchange each chapter until it met the satisfaction of both. At the time it was published in 1931, it contained practically all that could be stated with confidence about those aspects of disease that could be and had been studied by chemical means. It was widely accepted throughout the medical world as the “Bible” of quantitative clinical chemistry, and to this day some of the chapters have not become outdated.

History of Laboratory Medicine at Yale University.

The roots of the Department of Laboratory Medicine at Yale can be traced back to John Peters, the head of what he called the “Chemical Division” of the Department of Internal Medicine, subsequently known as the Section of Metabolism, who co-authored with Donald Van Slyke the landmark 1931 textbook Quantitative Clinical Chemistry (2,3); and to Pauline Hald, research collaborator of Dr. Peters, who subsequently served as Director of Clinical Chemistry at Yale-New Haven Hospital for many years. In 1947, Miss Hald reported the very first flame photometric measurements of sodium and potassium in serum (4). This study helped to lay the foundation for modern studies of metabolism and their application to clinical care.

The Laboratory Medicine program at Yale had its inception in 1958 as a section of Internal Medicine under the leadership of David Seligson. In 1965, Laboratory Medicine achieved autonomous section status and in 1971, became a full-fledged academic department. Dr. Seligson, who served as the first Chair, pioneered modern automation and computerized data processing in the clinical laboratory. In particular, he demonstrated the feasibility of discrete sample handling for automation that is now the basis of virtually all automated chemistry analyzers. In addition, Seligson and Zetner demonstrated the first clinical use of atomic absorption spectrophotometry. He was one of the founding members of the major Laboratory Medicine academic society, the Academy of Clinical Laboratory Physicians and Scientists.

Case in Point 3.  Nathan Gochman.  Developer of Automated Chemistries.

Nathan Gochman, PhD, has over 40 years of experience in the clinical diagnostics industry. This includes academic teaching and research, and 30 years in the pharmaceutical and in vitro diagnostics industry. He has managed R & D, technical marketing and technical support departments. As a leader in the industry he was President of the American Association for Clinical Chemistry (AACC) and the National Committee for Clinical Laboratory Standards (NCCLS, now CLSI). He is currently a Consultant to investment firms and IVD companies.

Nathan Gochman

The clinical laboratory has become so productive, particularly in chemistry and immunology, and its labor, instrument, and reagent costs so well determined, that today a physician’s medical decisions are said to be 80% determined by the clinical laboratory. Medical information systems have lagged far behind. Why is that? Because the decision to acquire an MIS has historically been based on billing capture. Moreover, chemical profiles were historically quite good at validating healthy status in an outpatient population, but the profiles became restricted under Diagnostic Related Groups. Thus it came to be that diagnostics were considered a “commodity”. To be competitive, a laboratory had to provide “high complexity” tests that were drawn in by a large volume of “moderate complexity” tests.

Part 3. Biomarkers in Medical Practice

Case in Point 1.

A Solid Prognostic Biomarker

HDL-C: Target of Therapy or Fuggedaboutit?

Steven E. Nissen, MD, MACC, Peter Libby, MD

Disclosures | November 06, 2014

Steven E. Nissen, MD, MACC: I am Steve Nissen, chairman of the Department of Cardiovascular Medicine at the Cleveland Clinic. I am here with Dr Peter Libby, chief of cardiology at the Brigham and Women’s Hospital and professor of medicine at Harvard Medical School. We are going to discuss high-density lipoprotein cholesterol (HDL-C), a topic that has been very controversial recently. Peter, HDL-C has been a pretty good biomarker. The question is whether it is a good target.

Peter Libby, MD: Since the early days in Berkeley, when they were doing ultracentrifugation, and when it was reinforced and put on the map by the Framingham Study,[1] we have known that HDL-C is an extremely good biomarker of prospective cardiovascular risk, with an inverse relationship with all kinds of cardiovascular events. That is as solid a finding as you can get in observational epidemiology. It is a very reliable prospective marker. It’s natural that the pharmaceutical industry and those of us who are interested in risk reduction would focus on HDL-C as a target. That is where the controversies come in.

Dr Nissen: It has been difficult. My view is that the trials that have attempted to modulate HDL-C, or the drugs they used, have been flawed. Although the results have not been promising, the jury is still out. Torcetrapib, the cholesteryl ester transfer protein (CETP) inhibitor developed by Pfizer, had an off-target toxicity.[2] Niacin is not very effective, and there are a lot of downsides to the drug. That has been an issue, but people are still working on this. We have done some studies. We did our ApoA-1 Milano infusion study[3] about a decade ago, which showed very promising results with respect to shrinking plaques in coronary arteries. I remain open to the possibility that the right drug in the right trial will work.

Dr Libby: What do you do with the genetic data that have come out in the past couple of years? Sekar Kathiresan masterminded and organized an enormous collaboration[4] in which they looked, with contemporary genetics, at whether HDL had the genetic markers of being a causal risk factor. They came up empty-handed.

Dr Nissen: I am cautious about interpreting those data, like I am cautious about interpreting animal studies of atherosclerosis. We have both lived through this problem in which something works extremely well in animals but doesn’t work in humans, or it doesn’t work in animals but it works in humans. The genetic studies don’t seal the fate of HDL. I have an open mind about this. Drugs are complex. They work by complex mechanisms. It is my belief that what we have to do is test these hypotheses in well-designed clinical trials, which are rigorously performed with drugs that are clean—unlike torcetrapib—and don’t have off-target toxicities.

An Unmet Need: High Lp(a) Levels

Dr Nissen: I’m going to push back on that and make a couple of points. The HPS2-THRIVE study was flawed. They studied the wrong people. It was not a good study, and AIM-HIGH[8] was underpowered. I am not putting people on niacin. What do you do with a patient whose Lp(a) is 200 mg/dL?

Dr Libby: I’m waiting for the results of the PCSK9 and anacetrapib studies. You can tell me about evacetrapib.[9] Reducing Lp(a) is an unmet medical need. We both care for kindreds with high Lp(a) levels and premature coronary artery disease. We have no idea what to do with them other than to treat them with statins and lower their LDL-C levels.

Dr Nissen: I have taken a more cautious approach with respect to taking people off of niacin. If I have patients who are doing well and tolerating it (depending on why it was started), I am discontinuing niacin in some people. I am starting very few people on the drug, but I worry about the quality of the trial.

Dr Libby: So you are of the “don’t start don’t stop” school?

Dr Nissen: Yes. It’s difficult when the trial is fatally flawed. There were 11,000 patients from China in this study. I have known for years that if you give niacin to people of Asiatic ethnic descent, they have terrible flushing and they won’t continue the drug. One question is, what was the adherence? The adverse events would have been tolerable had there been efficacy. The concern here is that this study was destined to fail because they studied a low LDL/high HDL population, a group of people for whom niacin just isn’t used.

Triglycerides and HDL: Do We Have It Backwards?

Dr Libby: What about the recent genetic[10] and epidemiologic data that support triglycerides, and apolipoprotein C3 in particular as a causal risk factor? Have we been misled through all of the generations in whom we have been adjusting triglycerides for HDL-C and saying that triglycerides are not a causal risk factor because once we adjust for HDL, the risk goes away? Do you think we got it backwards?

Dr Nissen: The tricky factor here is that because of this intimate inverse relationship between triglycerides and HDL, we may be talking about the same phenomenon. That is one of the reasons that I am not certain we are not going to be able to find a therapy. What if you had a therapy that lowered triglycerides and raised HDL-C? Could that work? Could that combination be favorable? I want answers from rigorous, well-designed clinical trials that ask the right questions in the right populations. I am disappointed, just as I have been disappointed by the fibrate trials.[11,12] There is a class of drugs that raises HDL-C a little and lowers triglycerides a lot.

Dr Nissen: But the gemfibrozil studies (VA-HIT[13] and Helsinki Heart[14]) showed benefit.

The Dyslipidemia Bar Has Been Raised

Dr Libby: Those studies were from the pre-statin era. We both were involved in trials in which patients were on high-dose statins at baseline. Do you think that this is too high a bar?

Dr Nissen: The bar has been raised, and for the pharmaceutical industry, the studies that we need to find out whether lowering triglycerides or raising HDL is beneficial are going to be large. We are doing a study with evacetrapib. It has 12,000 patients. It’s fully enrolled. Evacetrapib is a very clean-looking drug. It doesn’t have such a long biological half-life as anacetrapib, so I am very encouraged that it won’t have that baggage of being around for 2-4 years. We’ve got a couple of shots on goal here. Don’t forget that we have multiple ongoing studies of HDL-C infusion therapies that are still under development. Those have some promise too. The jury is still out.

Dr Libby: We agree on the need to do rigorous, large-scale endpoint trials. Do the biomarker studies, but don’t wait to start the endpoint trial because that’s the proof in the pudding.

Dr Nissen: Exactly. We have had a little controversy about HDL-C. We often agree, but not always, and we may have a different perspective. Thanks for joining me in this interesting discussion of what will continue to be a controversial topic for the next several years until we get the results of the current ongoing trials.

Case in Point 2.

NSTEMI? Honesty in Coding and Communication?

Melissa Walton-Shirley

November 07, 2014

The complaint at ER triage: Weakness, fatigue, near syncope of several days’ duration, vomiting, and decreased sensorium.

The findings: O2sat: 88% on room air. BP: 88 systolic. Telemetry: Sinus tachycardia 120 bpm. Blood sugar: 500 mg/dL. Chest X-ray: atelectasis. Urinalysis: pyuria. ECG: T-wave inversion, anterior leads. Echocardiography: normal left ventricular ejection fraction (LVEF) and wall motion. Troponin I: 0.3 ng/mL. CT angiography: negative for pulmonary embolism (PE). White blood cell count: 20K with left shift. Blood cultures: positive for Gram-negative rods.

The treatment: Intravenous fluids and IV levofloxacin—changed to ciprofloxacin.

The communication at discharge: “You had a severe urinary-tract infection and grew bacteria in your bloodstream. Also, you’ve had a slight heart attack. See your cardiologist immediately upon discharge, no more than 5 days from now.”

The diagnoses coded at discharge: Urosepsis and non-ST segment elevation MI (NSTEMI) 410.1.

One year earlier: This moderately obese patient was referred to our practice for a preoperative risk assessment. The surgery planned was a technically simple procedure, but due to the need for precise instrumentation, general endotracheal anesthesia (GETA) was being considered. The patient was diabetic, overweight, and short of air. A stress exam was equivocal for CAD due to poor exercise tolerance and suboptimal imaging. Upon further discussion, symptoms were progressive; therefore, cardiac cath was recommended, revealing angiographically normal coronaries and a predictably elevated left ventricular end-diastolic pressure (LVEDP) in the mid-20s range. The patient was given a diagnosis of diastolic dysfunction, a prescription for better hypertension control, and an in-depth discussion of exercise and the Mediterranean and DASH diets for weight loss. Symptoms improved with a low dose of diuretic. The surgery was completed without difficulty. At the follow-up visit, the patient felt well, had lost a few pounds, and blood pressure was well controlled.

Five days after ER workup: While out of town, the patient developed profound weakness and went to the ER as described above. Fast forward to our office visit in the designated time frame of “no longer than 5 days postdischarge,” where the patient and family asked me about the “slight heart attack” that literally came on the heels of a normal coronary angiogram.

But the patient really didn’t have a “heart attack,” did they? The cardiologist aptly stated that it was likely nonspecific troponin I leak in his progress notes. Yet the hospitalist framed the diagnosis of NSTEMI as item number 2 in the final diagnoses.

The motivations of personnel who code charts are largely innocent and likely a direct result of our lack of understanding, as healthcare providers, of the coding system. I have a feeling, though, that hospitals aren’t anxious to correct this misperception, due to an opportunity for increased reimbursement. I contacted the director of a coding department for a large hospital, who prefers to remain anonymous. She explained that NSTEMI ICD-9 code 410.1 falls in DRG 282 with a weight of 0.7562. The diagnosis of “demand ischemia,” code 411.89, a slightly less inappropriate code for a nonspecific troponin I leak, falls in DRG 311 with a weight of 0.5662. To determine reimbursement, one multiplies the weight by the average hospital Medicare base rate of $5,370. Keep in mind that each hospital’s base rate and corresponding payment will vary. The difference in reimbursement between these two coding choices on a large hospital bill is substantial: over $1,000 ($4,060 vs $3,040).
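The payment formula described above (DRG relative weight multiplied by the hospital's Medicare base rate) can be sketched in a few lines. The weights and the ~$5,370 average base rate are the figures quoted; actual base rates, and therefore payments, vary by hospital.

```python
# DRG payment = relative weight x hospital Medicare base rate.
# Weights and the average base rate are the figures quoted in the text;
# each hospital's actual base rate (and hence payment) varies.

def drg_payment(drg_weight: float, base_rate: float = 5370.0) -> float:
    """Medicare payment for one stay under a given DRG weight."""
    return drg_weight * base_rate

nstemi = drg_payment(0.7562)  # DRG 282 (NSTEMI, ICD-9 410.1)
demand = drg_payment(0.5662)  # DRG 311 (demand ischemia, ICD-9 411.89)

print(f"NSTEMI:          ${nstemi:,.0f}")           # $4,061
print(f"Demand ischemia: ${demand:,.0f}")           # $3,040
print(f"Difference:      ${nstemi - demand:,.0f}")  # $1,020
```

The per-stay gap of roughly $1,020 is what makes the choice between the two codes financially consequential at scale.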

Although hospitals that are already reeling from shrinking revenues will make more money on the front end by coding the troponin leak incorrectly as an NSTEMI, when multiple unnecessary tests are generated to follow up on a nondiagnostic troponin leak, the available Centers for Medicare & Medicaid Services (CMS) reimbursement pie shrinks in the long run. Furthermore, this inappropriate categorization generates extreme concern on the part of patients and family members that is often never laid to rest. The emotional toll of a “heart-attack” diagnosis has an impact on work fitness, quality of life, cost of medication, and the cost of future testing. If the patient lives for another 100 years, they will likely still list a “heart attack” in their medical history.

As a cardiologist, I resent the loose utilization of one of “my” heart-attack codes when it wasn’t that at all. At discharge, we need to develop a better way of communicating what exactly did happen. Equally important, we need to communicate what exactly didn’t happen as well.

Case in Point 3.

Blood Markers Predict CKD Heart Failure 

Published: Oct 3, 2014 | Updated: Oct 3, 2014

Elevated levels of high-sensitivity troponin T (hsTnT) and N-terminal pro-B-type natriuretic peptide (NT-proBNP) strongly predicted heart failure in patients with chronic kidney disease followed for a median of close to 6 years, researchers reported.

Compared with patients with the lowest blood levels of hsTnT, those with the highest had a nearly five-fold higher risk of developing heart failure, and the risk was 10-fold higher in patients with the highest NT-proBNP levels compared with those with the lowest levels of the protein, researcher Nisha Bansal, MD, of the University of Washington in Seattle, and colleagues wrote online in the Journal of the American Society of Nephrology.

A separate study, published online in the Journal of the American Medical Association earlier in the week, also examined the comorbid conditions of heart and kidney disease, finding no benefit to the practice of treating cardiac surgery patients who developed acute kidney injury with infusions of the antihypertensive drug fenoldopam.

The study, reported by researcher Giovanni Landoni, MD, of the IRCCS San Raffaele Scientific Institute, Milan, Italy, and colleagues, was stopped early “for futility,” according to the authors, and the incidence of hypotension during drug infusion was significantly higher in patients infused with fenoldopam than placebo (26% vs. 15%; P=0.001).

Blood Markers Predict CKD Heart Failure

The study in patients with mild to moderate chronic kidney disease (CKD) was conducted to determine if blood markers could help identify patients at high risk for developing heart failure.

Heart failure is the most common cardiovascular complication among people with renal disease, occurring in about a quarter of CKD patients.

The two markers, hsTnT and NT-proBNP, are associated with overworked cardiac myocytes and have been shown to predict heart failure in the general population.

However, Bansal and colleagues noted, the markers have not been widely used in diagnosing heart failure among patients with CKD due to concerns that reduced renal excretion may raise levels of these markers, so that elevated levels would not reflect an actual increase in heart muscle strain.

To better understand the importance of elevated concentrations of hsTnT and NT-proBNP in CKD patients, the researchers examined their association with incident heart failure events in 3,483 participants in the ongoing observational Chronic Renal Insufficiency Cohort (CRIC) study.

All participants were recruited from June 2003 to August 2008, and all were free of heart failure at baseline. The researchers used Cox regression to examine the association of baseline levels of hsTnT and NT-proBNP with incident heart failure after adjustment for demographic influences, traditional cardiovascular risk factors, markers of kidney disease, pertinent medication use, and mineral metabolism markers.

At baseline, hsTnT levels ranged from ≤5.0 to 378.7 pg/mL and NT-proBNP levels ranged from ≤5 to 35,000 pg/mL. Compared with patients who had undetectable hsTnT, those in the highest quartile (>26.5 pg/mL) had a significantly higher rate of heart failure (hazard ratio 4.77; 95% CI 2.49-9.14).

Compared with those in the lowest NT-proBNP quintile (<47.6 pg/mL), patients in the highest quintile (>433.0 pg/mL) experienced an almost 10-fold increase in heart failure risk (HR 9.57; 95% CI 4.40-20.83).

The researchers noted that these associations remained robust after adjustment for potential confounders and for the other biomarker, suggesting that while hsTnT and NT-proBNP are complementary, they may be indicative of distinct biological pathways for heart failure.
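The reported hazard ratios map directly onto the Cox model's coefficients. A minimal sketch of that relationship, using the study's point estimates; treating the two top-category effects as independent and additive on the log-hazard scale is an illustrative assumption only, not a result from the study:

```python
import math

# Cox proportional hazards: hazard(t | x) = h0(t) * exp(beta * x).
# For a 0/1 group indicator, the fitted coefficient beta is simply
# the log of the reported hazard ratio.

def beta_from_hr(hr: float) -> float:
    """Cox coefficient (log hazard ratio) for a binary group indicator."""
    return math.log(hr)

# Point estimates from the CRIC analysis quoted above:
hstnt_beta = beta_from_hr(4.77)      # highest hsTnT quartile vs undetectable
ntprobnp_beta = beta_from_hr(9.57)   # highest NT-proBNP quintile vs lowest

# Joint relative hazard for a patient in both top categories, under the
# illustrative assumption of independent, additive log-hazard effects:
combined_hr = math.exp(hstnt_beta + ntprobnp_beta)

print(f"hsTnT log-HR:     {hstnt_beta:.2f}")     # 1.56
print(f"NT-proBNP log-HR: {ntprobnp_beta:.2f}")  # 2.26
print(f"Joint HR:         {combined_hr:.1f}")    # 45.6
```

Because the two markers remained predictive after mutual adjustment, the authors' interpretation is that each contributes information the other does not, which is exactly what separate non-zero coefficients in the same model express.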

Even Modest Increases in NT-proBNP Linked to Heart Failure

The findings are consistent with an earlier analysis that included 8,000 patients with albuminuria in the Prevention of REnal and Vascular ENd-stage Disease (PREVEND) study, which showed that hsTnT was associated with incident cardiovascular events, even after adjustment for eGFR and severity of albuminuria.

“Among participants in the CRIC study, those with the highest quartile of detectable hsTnT had a twofold higher odds of left ventricular hypertrophy compared with those in the lowest quartile,” Bansal and colleagues wrote, adding that the findings were similar after excluding participants with any cardiovascular disease at baseline.

Even modest elevations in NT-proBNP were associated with significantly increased rates of heart failure, including in subgroups stratified by eGFR, proteinuria, and diabetic status.

“NT-proBNP regulates blood pressure and body fluid volume by its natriuretic and diuretic actions, arterial dilation, and inhibition of the renin-angiotensin-aldosterone system, and increased levels of this marker likely reflect myocardial stress induced by subclinical changes in volume or pressure, even in persons without clinical disease,” the researchers wrote.

The researchers concluded that further studies are needed to develop and validate risk prediction tools for clinical heart failure in patients with CKD, and to determine the potential role of these two biomarkers in a heart failure risk prediction and prevention strategy.

Fenoldopam ‘Widely Promoted’ in AKI Cardiac Surgery Setting

The JAMA study examined whether the selective dopamine D1 receptor agonist fenoldopam mesylate can reduce the need for dialysis in cardiac surgery patients who develop acute kidney injury (AKI).

Fenoldopam induces vasodilation of the renal, mesenteric, peripheral, and coronary arteries, and, unlike dopamine, it has no significant affinity for D2 receptors, meaning that it theoretically induces greater vasodilation in the renal medulla than in the cortex, the researchers wrote.

“Because of these hemodynamic effects, fenoldopam has been widely promoted for the prevention and therapy of AKI in the United States and many other countries with apparent favorable results in cardiac surgery and other settings,” Landoni and colleagues wrote.

The drug was approved in 1997 by the FDA for the indication of in-hospital, short-term management of severe hypertension. It has not been approved for renal indications, but is commonly used off-label in cardiac surgery patients who develop AKI.

Although a meta-analysis of randomized trials, conducted by the researchers, indicated a reduction in the incidence and progression of AKI associated with the treatment, Landoni and colleagues wrote that the absence of a definitive trial “leaves clinicians uncertain as to whether fenoldopam should be prescribed after cardiac surgery to prevent deterioration in renal function.”

To address this uncertainty, the researchers conducted a prospective, randomized, parallel-group trial in 667 patients treated at 19 hospitals in Italy from March 2008 to April 2013.

All patients had been admitted to ICUs after cardiac surgery with early acute kidney injury (≥50% increase of serum creatinine level from baseline or low output of urine for ≥6 hours). A total of 338 received fenoldopam by continuous intravenous infusion for a total of 96 hours or until ICU discharge, while 329 patients received saline infusions.

The primary end point was the rate of renal replacement therapy, and secondary end points included mortality (intensive care unit and 30-day mortality) and the rate of hypotension during study drug infusion.

Study Showed No Benefit, Was Stopped Early

Yale Lampoon – A.A. Liebow, 1954

Not As a Doctor
[Fourth Year]

These lyrics, sung by John Cole, Jack Gariepy and Ed Ransenhofer to music borrowed from Gilbert and Sullivan’s The Mikado, lampooned Averill Liebow, M.D., a pathologist noted for his demands on students. (CPC stands for clinical pathology conference.)

If you want to know what this is,
it’s a medical CPC
Where we give the house staff
the biz, for there’s no one so
wise as we!
We pathologists show them how,
Although it is too late now.
Our art is a sacred cow!

American physician, born 1911, Stryj in Galicia, Austria (now in Ukraine); died 1978.

Averill Abraham Liebow, born in Austria, was the “founding father” of pulmonary pathology in the United States. He started his career as a pathologist at Yale, where he remained for many years. In 1968 he moved to the University of California School of Medicine, San Diego, where he taught for 7 years as Professor and Chairman, Department of Pathology.

His studies include many classic studies of lung diseases. Best known of these is his famous classification of interstitial lung disease. He also published papers on sclerosing pneumocytoma, pulmonary alveolar proteinosis, meningothelial-like nodules, pulmonary hypertension, pulmonary veno-occlusive disease, lymphomatoid granulomatosis, pulmonary Langerhans cell histiocytosis, pulmonary epithelioid hemangioendothelioma and pulmonary hyalinizing granuloma.

As a Lieutenant Colonel in the US Army Medical Corps, he was a member of the Atomic Bomb Casualty Commission, which studied the effects of the atomic bombs in Hiroshima and Nagasaki.

We thank Sanjay Mukhopadhyay, M.D., for information submitted.

When I was a resident at UCSD, Dr. Liebow held “Organ Recitals” every morning, including Mother’s Day.  The organs had to be presented in a specified order: heart, lung, and so forth.  On one occasion, we needed a heart for purification of human lactate dehydrogenase for a medical student project, so I presented the lung out of order.  Dr. Liebow asked where the heart was, and I told the group it was normal and that I had frozen it for enzyme purification (smiles).  “In the future, show it to me first.”  He was generous to those who showed interest.  As I was also doing research in Nathan Kaplan’s laboratory, he made special arrangements for me to mentor Deborah Peters, the daughter of a pulmonary physician and granddaughter of the Peters who collaborated with Van Slyke.  I have mentored many students with great reward since then.  He could look at a slide and tell you what the x-ray looked like.  I didn’t encounter that again until he sent me to the Armed Forces Institute of Pathology, Washington, DC, during the Vietnam War and Watergate, where I worked in Orthopedic Pathology with Lent C. Johnson.  Johnson would not review a case without the x-ray, and he taught the radiologists.

Part 3

My Cancer Genome from Vanderbilt University: Matching Tumor Mutations to Therapies & Clinical Trials

Reporter: Aviva Lev-Ari, PhD, RN
http://pharmaceuticalintelligence.com/2014/11/05/my-cancer-genome-from-vanderbilt-university-matching-tumor-mutations-to-therapies-clinical-trials/

GenomOncology and Vanderbilt-Ingram Cancer Center (VICC) today announced a partnership for the exclusive commercial development of a decision support tool based on My Cancer Genome™, an online precision cancer medicine knowledge resource for physicians, patients, caregivers and researchers.

Through this collaboration, GenomOncology and VICC will enhance My Cancer Genome through the development of a new genomics content management tool. The MyCancerGenome.org website will remain free and open to the public. In addition, GenomOncology will develop a decision support tool based on My Cancer Genome™ data that will enable automated interpretation of mutations in the genome of a patient’s tumor, providing actionable results in hours versus days.

Vanderbilt-Ingram Cancer Center (VICC) launched My Cancer Genome™ in January 2011 as an integral part of its Personalized Cancer Medicine Initiative, which helps physicians and researchers track the latest developments in precision cancer medicine and connect with clinical research trials. This web-based information tool is designed to quickly educate clinicians on the rapidly expanding list of genetic mutations that impact cancers and enable the research of treatment options based on specific mutations. For more information on My Cancer Genome™, visit www.mycancergenome.org/about/what-is-my-cancer-genome.

Therapies based on the specific genetic alterations that underlie a patient’s cancer not only result in better outcomes but often have fewer adverse reactions.

Up front fee

Nominal fee covers installation support, configuring the Workbench to your specification, designing and developing custom report(s) and training your team.

Per sample fee

GenomOncology is paid on signed-out clinical reports. This philosophy aligns GenomOncology with your Laboratory as we are incentivized to offer world-class support and solutions to differentiate your clinical NGS program. There is no annual license fee.

Part 4

Clinical Trial Services: Foundation Medicine & EmergingMed to Partner

Reporter: Aviva Lev-Ari, PhD, RN
http://pharmaceuticalintelligence.com/2014/11/03/clinical-trial-services-foundation-medicine-emergingmed-to-partner/

Foundation Medicine and EmergingMed said today that they will partner to offer clinical trial navigation services for health care providers and their patients who have received one of Foundation Medicine’s tumor genomic profiling tests.

The firms will provide concierge services to help physicians

  • identify appropriate clinical trials for patients
  • based on the results of FoundationOne or FoundationOne Heme.

“By providing clinical trial navigation services, we aim to facilitate

  • timely and accurate clinical trial information and enrollment support services for physicians and patients,
  • enabling greater access to treatment options based on the unique genomic profile of a patient’s cancer.”

Currently, there are over 800 candidate therapies that target genomic alterations in clinical trials,

  • but “patients and physicians must identify and act on relevant options
  • when the patient’s clinical profile is aligned with the often short enrollment window for each trial.”

These investigational therapies offer patients whose cancer has progressed or returned following standard treatment a favorable second option after relapse.  The new service is unique in notifying physicians when new clinical trials emerge that match a patient’s genomic and clinical profile.

Google signs on to Foundation Medicine cancer Dx by offering tests to employees

By Emily Wasserman

Diagnostics luminary Foundation Medicine ($FMI) is generating some upward momentum, fueled by growing revenues and the success of its clinical tests. Tech giant Google ($GOOG) has taken note and is signing on to the company’s cancer diagnostics by offering them to employees.

Foundation Medicine CEO Michael Pellini said during the company’s Q3 earnings call that Google will start covering its DNA tests for employees and their family members suffering from cancer as part of its health benefits portfolio, Reuters reports.

Both sides stand to benefit from the deal, as Google looks to keep a leg up on Silicon Valley competitors and Foundation Medicine expands its cancer diagnostics platform. Last month, Apple ($AAPL) and Facebook ($FB) announced that they would begin covering the cost of egg freezing for female employees. A diagnostics partnership and attractive health benefits could work wonders for Google’s employee retention rates and bottom line.

In the meantime, Cambridge, MA-based Foundation Medicine is charging full speed ahead with its cancer diagnostics platform after filing for an IPO in September 2013. The company chalked up 6,428 clinical tests during Q3 2014, an eye-popping 149% increase year over year, and brought in total revenue for the quarter of $16.4 million–a 100% leap from last year. Foundation Medicine credits the promising numbers in part to new diagnostic partnerships and extended coverage for its tests.

In January, the company teamed up with Novartis ($NVS) to help the drugmaker evaluate potential candidates for its cancer therapies. In April, Foundation Medicine announced that it would develop a companion diagnostic test for a Clovis Oncology ($CLVS) drug under development to treat patients with ovarian cancer, building on an ongoing collaboration between the two companies.

Foundation Medicine also has its sights set on China’s growing diagnostics market, inking a deal in October with WuXi PharmaTech ($WX) that allows the company to perform lab testing for its FoundationOne assay at WuXi’s Shanghai-based Genome Center.

Pellini gave a nod to the deal with Google during a corporate earnings call on Wednesday, according to a person who listened in, and said Google employees were made aware of this new benefit last week.

Foundation Medicine teams with MD Anderson for new trial of cancer Dx

Second study to see if targeted therapy can change patient outcomes

August 15, 2014 | By FierceDiagnostics

Foundation Medicine ($FMI) is teaming up with the MD Anderson Cancer Center in Texas for a new trial of the Cambridge, MA-based company’s molecular diagnostic cancer test that targets therapies matched to individual patients.

The study is called IMPACT2 (Initiative for Molecular Profiling and Advanced Cancer Therapy) and is designed to build on results from the first IMPACT study, which found

  • 40% of the 1,144 patients enrolled had an identifiable genomic alteration.

The company said that

  • by matching specific gene alterations to therapies,
  • 27% of patients in the first study responded versus
  • 5% with an unmatched treatment, and
  • “progression-free survival” was longer in the matched group.
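The arithmetic behind those figures is worth making explicit. A back-of-envelope sketch using only the numbers quoted above; the derived patient count is rounded for illustration:

```python
# Back-of-envelope on the IMPACT1 figures quoted above.
# All percentages come from the article; derived counts are rounded.

enrolled = 1144
with_alteration = round(0.40 * enrolled)  # patients with an identifiable alteration

matched_rr = 0.27    # response rate with therapy matched to the alteration
unmatched_rr = 0.05  # response rate with unmatched therapy

print(f"Patients with an identifiable alteration: {with_alteration}")  # 458
print(f"Relative response benefit of matching: {matched_rr / unmatched_rr:.1f}x")  # 5.4x
```

A more than five-fold difference in response rate is the core argument for matching gene alterations to therapies in IMPACT2.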

The FoundationOne molecular diagnostic test

  • combines genetic sequencing and data gathering
  • to help oncologists choose the best treatment for individual patients.

Costing $5,800 per test, FoundationOne’s technology can uncover a large number of genetic alterations for 200 cancer-related genes,

  • blending genomic sequencing, information and clinical practice.

“Based on the IMPACT1 data, a validated, comprehensive profiling approach has already been adopted by many academic and community-based oncology practices,” Vincent Miller, chief medical officer of Foundation Medicine, said in a release. “This study has the potential to yield sufficient evidence necessary to support broader adoption across most newly diagnosed metastatic tumors.”

The company got a boost last month when the New York State Department of Health approved Foundation Medicine’s two initial cancer tests: the FoundationOne test and FoundationOne Heme, which creates a genetic profile for blood cancers. Typically,

  • diagnostics companies struggle to win insurance approval for their tests
  • even after they gain a regulatory approval, leaving revenue growth relatively flat.

However, Foundation Medicine reported earlier this week its Q2 revenue reached $14.5 million compared to $5.9 million for the same period a year ago. Still,

  • net losses continue to soar as the company ramps up its commercial and business development operation,
  • hitting $13.7 million versus a $10.1 million deficit in the second quarter of 2013.

Oncology

There has been a remarkable transformation in our understanding of

  • the molecular genetic basis of cancer and its treatment during the past decade or so.

In depth genetic and genomic analysis of cancers has revealed that

  • each cancer type can be sub-classified into many groups based on the genetic profiles and
  • this information can be used to develop new targeted therapies and treatment options for cancer patients.

This panel will explore the technologies that are facilitating our understanding of cancer, and

  • how this information is being used in novel approaches for clinical development and treatment.
Oncology. Reported by Dr. Aviva Lev-Ari, Founder, Leaders in Pharmaceutical Intelligence

Opening Speaker & Moderator:

Lynda Chin, M.D.
Department Chair, Department of Genomic Medicine
MD Anderson Cancer Center

  • Who pays for PM?
  • potential of Big Data, analytics, expert systems, so that not each MD needs to see all cases; profile the disease to get the same treatment
  • business model: IP, discovery, sharing, ownership – yet accelerate therapy
  • security of healthcare data
  • segmentation of patient population
  • management of data and tracking innovations
  • platforms to be shared for innovations
  • studies to be longitudinal
  • How do we reconcile course of disease with PM?
  • phenotyping the disease vs. a patient waiting for cure/treatment

Panelists:

Roy Herbst, M.D., Ph.D.
Ensign Professor of Medicine and Professor of Pharmacology;
Chief of Medical Oncology, Yale Cancer Center and Smilow Cancer Hospital

Developing new drugs to match patient, disease and drug – finding the right patient for the right clinical trial

  • match patient to drugs
  • partnerships: out of 100 screened patients, 10 had the gene and 5 were able to enter the trial; without the biomarker, all 100 patients would have received the WRONG drug for them (except those 5)
  • patients want to participate in trials close to home, NOT to have to travel – now it is in the protocol
  • annotated databases – clinical trial informed consent – adaptive design of clinical trials vs. protocol
  • even academic MDs can’t read the reports on genomics
  • patients are treated in the community – more training for MDs
  • five companies collaborating – comparison of 6 drugs in the same class
  • if a drug exists and you have the patient – you must apply PM

Summary and Perspective:

The current changes in biotechnology have been reviewed, with an open question about the relationship of in vitro diagnostics to biopharmaceuticals and the potential, particularly in cancer and infectious diseases, to add value in targeted therapy by matching patients to the best potential treatment for a favorable outcome.

This reviewer does not see the major diagnostics leaders moving into the domain of direct patient care, even though there are signals in that direction.  The Roche example is perhaps the most interesting: Roche became the elephant in the room after the introduction of Valium, subsequently bought out Boehringer Mannheim Diagnostics to gain entry into the IVD market, and established a huge presence in molecular diagnostics early.  If it did anything to gain a foothold in the treatment realm, it would more likely forge a relationship with Foundation Medicine.  Abbott Laboratories more than a decade ago was overextended; it had become the leader in IVD as a result of its specialty tests, but it fell into difficulties with quality control of its products in the high-volume testing market, ceding ground to Olympus and Roche, and in the mid-volume market to Beckman and Siemens.  Of course, DuPont and Kodak, pioneering companies in IVD, both left the market.

The biggest challenge in the long run is the ability to eliminate the many treatments that would fail for a large number of patients. That has already met the proof of concept.  However, when you look at the size of the subgroups, we are not anywhere near a large-scale endeavor.  In addition, much remains to be worked out that is not related to genomic expression by the “classic” model, but must take into account the emerging knowledge and greater understanding of the regulation of cell metabolism, not only in cancer, but also in chronic inflammatory diseases.


The Evolution of Clinical Chemistry in the 20th Century

Curator: Larry H. Bernstein, MD, FCAP

This is a subchapter in the series on developments in diagnostics in the period from 1880 to 1980.

Otto Folin: America’s First Clinical Biochemist

(Extracted from Samuel Meites, AACC History Division; Apr 1996)

Forward by Wendell T. Caraway, PhD.

The first introduction to Folin comes with the Folin-Wu protein-free filtrate, a technique for removing proteins from whole blood or plasma that resulted in water-clear solutions suitable for the determination of glucose, creatinine, uric acid, non-protein nitrogen, and chloride. The major active ingredient used in the precipitation of protein was sodium tungstate prepared “according to Folin”. Folin-Wu sugar tubes were used for the determination of glucose. From these and subsequent encounters, we learned that Folin was a pioneer in methods for the chemical analysis of blood.  The determination of uric acid in serum was the Benedict method, in which protein-free filtrate was mixed with solutions of sodium cyanide and arsenophosphotungstic acid and then heated in a water bath to develop a blue color.  A thorough review of the literature revealed that Folin and Denis had published, in 1912, a method for uric acid in which they used sodium carbonate rather than sodium cyanide, which was modified and largely superseded the “cyanide” method.

Notes from the author.

Modern clinical chemistry began with the application of 20th century quantitative analysis and instrumentation to measure constituents of blood and urine, and relating the values obtained to human health and disease. In the United States, the first impetus propelling this new area of biochemistry was provided by the 1912 papers of Otto Folin.  The only precedent for these stimulating findings was his own earlier and certainly classic papers on the quantitative composition of urine, the laws governing its composition, and studies on the catabolic end products of protein, which led to his ingenious concept of endogenous and exogenous metabolism.  He had already determined blood ammonia in 1902.  This work preceded the entry of Stanley Benedict and Donald Van Slyke into biochemistry.  Once all three of them were active contributors, the future of clinical biochemistry was ensured. Those who would consult the early volumes of the Journal of Biological Chemistry will discover the direction that the work of Otto Folin gave to biochemistry.  This modest, unobtrusive man of Harvard was a powerful stimulus and inspiration to others.

Quantitatively, in the years of his scientific productivity, 1897–1934, Otto Folin published 151 (+1) journal articles, including a chapter in Abderhalden’s handbook and one in Hammarsten’s Festschrift, but excluding his doctoral dissertation, his published abstracts, and several articles in the proceedings of the Association of Life Insurance Directors of America. He also wrote one monograph on food preservatives and produced five editions of his laboratory manual. He published four articles while studying in Europe (1896–98), 28 while at the McLean Hospital (1900–07), and 119 at Harvard (1908–34). In his banner year of 1912 he published 20 papers. His peak period from 1912–15 included 15 papers, the monograph, and most of the work on the first edition of his laboratory manual.

The quality of Otto Folin’s life’s work relates to its impact on biochemistry, particularly clinical biochemistry.  Otto’s two brilliant collaborators, Willey Denis and Hsien Wu, must be acknowledged.  Without Denis, Otto could not have achieved so rapidly the introduction and popularization of modern blood analysis in the U.S. It would be pointless to conjecture how far Otto would have progressed without this pair.

His work provided the basis of the modern approach to the quantitative analysis of blood and urine through improved methods that reduced the body fluid volume required for analysis. He also applied these methods to metabolic studies on tissues as well as body fluids. Because his interests lay in protein metabolism, his major contributions were directed toward measuring nitrogenous waste or end products. His most dramatic achievement is illustrated by the study of blood nitrogen retention in nephritis and gout.

Folin introduced colorimetry, turbidimetry, and the use of color filters into quantitative clinical biochemistry. He initiated and applied ingeniously conceived reagents and chemical reactions that paved the way for a host of studies by his contemporaries. He introduced the use of phosphomolybdate for detecting phenolic compounds, and phosphotungstate for uric acid.  These, in turn, led to the quantitation of epinephrine, and of tyrosine, tryptophan, and cystine in protein. The molybdate suggested to Fiske and SubbaRow the determination of phosphate as phosphomolybdate, and the tungsten led to the use of tungstic acid as a protein precipitant.  Phosphomolybdate became the key reagent in the blood sugar method.  Folin resurrected the abandoned Jaffe reaction and established creatine and creatinine analysis. He also laid the groundwork for the discovery of creatine phosphate. Clinical chemistry owes to him the introduction of Nessler’s reagent, permutit, Lloyd’s reagent, gum ghatti, and preservatives for standards, such as benzoic acid and formaldehyde. Among his distinguished graduate investigators were Bloor, Doisy, Fiske, Shaffer, SubbaRow, Sumner, and Wu.

A Golden Age of Clinical Chemistry: 1948–1960

Louis Rosenfeld
Clinical Chemistry 2000; 46(10): 1705–1714

The 12 years from 1948 to 1960 were notable for introduction of the Vacutainer tube, electrophoresis, radioimmunoassay, and the Auto-Analyzer. Also appearing during this interval were new organizations, publications, programs, and services that established a firm foundation for the professional status of clinical chemists. It was a golden age.

Except for photoelectric colorimeters, the clinical chemistry laboratories in 1948, and in many places even later, were not very different from those of 1925. The basic technology and equipment were essentially unchanged. There was lots of glassware of different kinds: pipettes, burettes, wooden racks of test tubes, funnels, filter paper, cylinders, flasks, and beakers, as well as visual colorimeters, centrifuges, water baths, an exhaust hood for evaporating organic solvents after extractions, a microscope for examining urine sediments, a double-pan analytical beam balance for weighing reagents and standard chemicals, and perhaps a pH meter. The most complicated apparatus was the Van Slyke volumetric gas device, manually operated. The emphasis was on classical chemical and biological techniques that did not require instrumentation.

The unparalleled growth and wide-ranging research that began after World War II and have continued into the new century, often aided by government funding for biomedical research and development as civilian health has become a major national goal, have impacted the operations of the clinical chemistry laboratory. The years from 1948 to 1960 were especially notable for the innovative technology that produced better methods for the investigation of many diseases, in many cases leading to better treatment.

AUTOMATION IN CLINICAL CHEMISTRY: CURRENT SUCCESSES AND TRENDS FOR THE FUTURE

Pierangelo Bonini
Pure & Appl. Chem., 1982; 54(11): 2017–2030

The history of automation in clinical chemistry is the history of how and when technological progress in the field of analytical methodology, as well as in the field of instrumentation, has helped clinical chemists to mechanize their procedures and to control them.

GENERAL STEPS OF A CLINICAL CHEMISTRY PROCEDURE
1 – PRELIMINARY TREATMENT (DEPROTEINIZATION)
2 – SAMPLE + REAGENT(S)
3 – INCUBATION
4 – READING
5 – CALCULATION
Fig. 1 General steps of a clinical chemistry procedure
Especially in the classic clinical chemistry methods, a preliminary treatment of the sample (in most cases a deproteinization) was an essential step. This was a major constraint on the first tentative steps in automation, and we will see how this problem was faced and which new problems arose from avoiding deproteinization. Mixing samples and reagents is the next step; then there is a more or less long incubation at different temperatures, and finally reading, which means detection of modifications of some physical property of the mixture; in most cases the development of a colour can reveal the reaction but, as is well known, many other possibilities exist; finally, the result is calculated.

Some 25 years ago, Skeggs (1) presented his paper on continuous flow automation, which was the basis of very successful instruments still used all over the world. In continuous flow automation, reactions take place in a hydraulic route common to all samples.

Standards and samples enter the analytical stream segmented by air bubbles and, as they circulate, specific chemical reactions and physical manipulations continuously take place in the stream. Finally, after the air bubbles are vented, the colour intensity, proportional to the solute molecules, is monitored in a detector flow cell.
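The proportionality between colour intensity and solute concentration exploited by the detector flow cell is the Beer–Lambert law. A minimal sketch of the resulting calculation; the function name and the numbers are illustrative, not from the source:

```python
def concentration_from_absorbance(a_sample, a_standard, c_standard):
    """Beer-Lambert law: A = epsilon * b * c. With the same chromophore
    and path length, c_sample / c_standard = A_sample / A_standard."""
    return c_standard * a_sample / a_standard

# Hypothetical example: a 100 mg/dL glucose standard reads A = 0.40,
# and a patient sample reads A = 0.52.
c = concentration_from_absorbance(0.52, 0.40, 100.0)  # 130 mg/dL
```

This single-point calibration against a standard run in the same stream is what makes a continuous flow channel quantitative.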

It is evident that the most important aim of automation is to process correctly as many samples in as short a time as possible. This result can be obtained thanks to many technological advances, both from the analytical point of view and in instrument technology.

ANALYTICAL METHODOLOGY
– VERY ACTIVE ENZYMATIC REAGENTS
– SHORTER REACTION TIMES
– KINETIC AND FIXED TIME REACTIONS
– NO NEED OF DEPROTEINIZATION
– SURFACTANTS
– AUTOMATIC SAMPLE BLANK CALCULATION
– POLYCHROMATIC ANALYSIS

The introduction of very active enzymatic reagents for the determination of substrates resulted in shorter reaction times and made it possible, in many cases, to avoid deproteinization. Reaction times are also reduced by using kinetic and fixed-time reactions instead of end points. In this case, the measurement of the sample blank does not need a separate tube with a separate reaction mixture. Deproteinization can also be avoided by using some surfactants in the reagent mixture. An automatic calculation of sample blanks is also possible by using polychromatic analysis. As we can see from this figure, reduction of reaction times and elimination of tedious operations like deproteinization are the main results of this analytical progress.
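The fixed-time (two-point) measurement described above can be sketched as follows: the absorbance change over a fixed interval is compared with that of a calibrator run under identical conditions, so no separate blank tube is needed. The names and values are hypothetical:

```python
def fixed_time_result(da_sample, da_calibrator, c_calibrator):
    """Two-point kinetic assay: concentration is proportional to the
    absorbance change over the same fixed interval as the calibrator."""
    return c_calibrator * da_sample / da_calibrator

# A 5.0 mmol/L calibrator changes by 0.25 A over 60 s;
# the sample changes by 0.10 A over the same window.
c = fixed_time_result(0.10, 0.25, 5.0)  # 2.0 mmol/L
```

Because both readings come from the same reaction mixture, any constant background (the sample blank) cancels in the difference.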

Many relevant improvements in mechanics and optics over the last twenty years and the tremendous advance in electronics have largely contributed to the instrumental improvement of clinical chemistry automation.

A recent interesting innovation in the field of centrifugal analyzers consists in the possibility of adding another reagent to an already mixed sample-reagent solution. This innovation allows a preincubation to be made and sample blanks to be read before adding the starter reagent.

The possibility of measuring absorbances in cuvettes positioned longitudinally to the light path, realized in a recent model of centrifugal analyzer, is claimed to be advantageous for reading absorbances in non-homogeneous solutions, for avoiding any influence of reagent-volume errors on the absorbance, and for providing more suitable calculation factors. Interest in fluorimetric assays is growing more and more, especially in connection with immunofluorimetric drug assays. This technology has recently been applied to centrifugal analyzers as well. A xenon lamp generates a high-energy light, reflected by a mirror and a holographic grating operated by a stepping motor. The selected wavelength of the exciting light passes through a slit and reaches the rotating cuvettes. Fluorescence is then filtered, read by means of a photomultiplier, and compared to the continuously monitored fluorescence of an appropriate reference compound. In this way, any instability due either to the electro-optical devices or to changes in the physicochemical properties of the solution is corrected.
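The ratio correction described here (sample fluorescence referenced against a continuously monitored reference compound) amounts to dividing out instrument drift. A sketch with hypothetical numbers:

```python
def corrected_fluorescence(f_sample, f_reference, f_reference_nominal):
    """Scale the sample reading by the same factor by which the
    reference compound deviates from its nominal value, cancelling
    lamp or detector drift common to both channels."""
    return f_sample * f_reference_nominal / f_reference

# Lamp drift makes the reference read 950 instead of its nominal 1000;
# the raw sample reading of 475 is scaled up by the same factor.
f = corrected_fluorescence(475.0, 950.0, 1000.0)  # 500.0
```

The correction assumes the drift affects sample and reference channels equally, which is why the reference is monitored continuously rather than once per run.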

…more…

Dr. Yellapragada Subbarow – ATP – Energy for Life

One of the observations Dr SubbaRow made while testing the phosphorus method seemed to provide a clue to the mystery of what happens to blood sugar when insulin is administered. Biochemists began investigating the problem after Frederick Banting showed that injections of insulin, the pancreatic hormone, keep blood sugar under control and keep diabetics alive.

SubbaRow worked for 18 months on the problem, often dieting and starving along with animals used in experiments. But the initial observations were finally shown to be neither significant nor unique and the project had to be scrapped in September 1926.

Out of the ashes of this project however arose another project that provided the key to the ancient mystery of muscular contraction. Living organisms resist degeneration and destruction with the help of muscles, and biochemists had long believed that a hypothetical inogen provided the energy required for the flexing of muscles at work.

Two researchers at Cambridge University in the United Kingdom confirmed that lactic acid is formed when muscles contract, and Otto Meyerhof of Germany showed that this lactic acid is a breakdown product of glycogen, the animal starch stored all over the body, particularly in the liver, kidneys and muscles. When Professor Archibald Hill of University College London demonstrated that conversion of glycogen to lactic acid partly accounts for the heat produced during muscle contraction, everybody assumed that glycogen was the inogen. And the 1922 Nobel Prize for medicine and physiology was divided between Hill and Meyerhof.

But how is glycogen converted to lactic acid? Embden, another German biochemist, advanced the hypothesis that blood sugar and phosphorus combine to form a hexose phosphoric ester which breaks down glycogen in the muscle to lactic acid.

In the midst of the insulin experiments, it occurred to Fiske and SubbaRow that Embden’s hypothesis would be supported if normal persons were found to have more hexose phosphate in their muscle and liver than diabetics. For diabetes is the failure of the body to use sugar. There would be little reaction between sugar and phosphorus in a diabetic body. If Embden was right, hexose (sugar) phosphate level in the muscle and liver of diabetic animals should rise when insulin is injected.

Fiske and SubbaRow rendered some animals diabetic by removing their pancreas in the spring of 1926, but they could not record any rise in the organic phosphorus content of muscles or livers after insulin was administered to the animals. Sugar phosphates were indeed produced in their animals but they were converted so quickly by enzymes to lactic acid that Fiske and SubbaRow could not detect them with methods then available. This was fortunate for science because, in their mistaken belief that Embden was wrong, they began that summer an extensive study of organic phosphorus compounds in the muscle “to repudiate Meyerhof completely”.

The departmental budget was so poor that SubbaRow often waited on the back streets of Harvard Medical School at night to capture cats he needed for the experiments. When he prepared the cat muscles for estimating their phosphorus content, SubbaRow found he could not get a constant reading in the colorimeter. The intensity of the blue colour went on rising for thirty minutes. Was there something in muscle which delayed the colour reaction? If yes, the time for full colour development should increase with the increase in the quantity of the sample. But the delay was not greater when the sample was 10 c.c. instead of 5 c.c. The only other possibility was that muscle had an organic compound which liberated phosphorus as the reaction in the colorimeter proceeded. This indeed was the case, it turned out. It took a whole year.

The mysterious colour delaying substance was a compound of phosphoric acid and creatine and was named Phosphocreatine. It accounted for two-thirds of the phosphorus in the resting muscle. When they put muscle to work by electric stimulation, the Phosphocreatine level fell and the inorganic phosphorus level rose correspondingly. It completely disappeared when they cut off the blood supply and drove the muscle to the point of “fatigue” by continued electric stimulation. And, presto! It reappeared when the fatigued muscle was allowed a period of rest.

Phosphocreatine created a stir among the scientists present when Fiske unveiled it before the American Society of Biological Chemists at Rochester in April 1927. The Journal of the American Medical Association hailed the discovery in an editorial. The Rockefeller Foundation awarded a fellowship that helped SubbaRow to live comfortably for the first time since his arrival in the United States. All of Harvard Medical School was caught up with an enthusiasm that would be a life-time memory for contemporary students. The students were in awe of the medium-sized, slightly stoop-shouldered, “coloured” man regarded as one of the School’s top research workers.

SubbaRow’s carefully conducted series of experiments disproved Meyerhof’s assumptions about the glycogen-lactic acid cycle. His calculations fully accounted for the heat output during muscle contraction. Hill had not been able to fully account for this in terms of Meyerhof’s theory. Clearly the Nobel Committee was in haste in awarding the 1922 physiology prize, but the biochemistry orthodoxy led by Meyerhof and Hill themselves was not too eager to give up their belief in glycogen as the prime source of muscular energy.

Fiske and SubbaRow were fully upheld and the Meyerhof-Hill­ theory finally rejected in 1930 when a Danish physiologist showed that muscles can work to exhaustion without the aid of glycogen or the stimulation of lactic acid.

Fiske and SubbaRow had meanwhile followed a substance that was formed by the combination of phosphorus, liberated from Phosphocreatine, with an unidentified compound in muscle. SubbaRow isolated it and identified it as a chemical in which adenylic acid was linked to two extra molecules of phosphoric acid. By the time he completed the work to the satisfaction of Fiske, it was August 1929 when Harvard Medical School played host to the 13th International Physiological Congress.

ATP was presented to the gathered scientists before the Congress ended. To the dismay of Fiske and SubbaRow, a German science journal, published 16 days before the Congress opened, arrived in Boston a few days later. It carried a letter from Karl Lohmann of Meyerhof’s laboratory, saying he had isolated from muscle a compound of adenylic acid linked to two molecules of phosphoric acid!

While Archibald Hill never adjusted himself to the idea that the basis of his Nobel Prize work had been demolished, Otto Meyerhof and his associates had seen the importance of Phosphocreatine discovery and plunged themselves into follow-up studies in competition with Fiske and SubbaRow. Two associates of Hill had in fact stumbled upon Phosphocreatine about the same time as Fiske and SubbaRow but their loyalty to Meyerhof-Hill theory acted as blinkers and their hasty and premature publications reveal their confusion about both the nature and significance of Phosphocreatine.

The discovery of ATP and its significance helped reveal the full story of muscular contraction: glycogen arriving in muscle gets converted into lactic acid, which is siphoned off to the liver for re-synthesis of glycogen. This cycle yields three molecules of ATP and is important in delivering usable food energy to the muscle. Glycolysis, or break-up of glycogen, is relatively slow in getting started, and in any case muscle can retain ATP only in small quantities. In the interval between the beginning of muscle activity and the arrival of fresh ATP from glycolysis, Phosphocreatine maintains the ATP supply by re-synthesizing it as fast as its energy terminals are used up by muscle for its activity.

Muscular contraction made possible by ATP helps us not only to move our limbs and lift weights but keeps us alive. The heart is after all a muscle pouch and millions of muscle cells embedded in the walls of arteries keep the life-sustaining blood pumped by the heart coursing through body organs. ATP even helps get new life started by powering the sperm’s motion toward the egg as well as the spectacular transformation of the fertilized egg in the womb.

Archibald Hill for long denied any role for ATP in muscle contraction, saying ATP has not been shown to break down in the intact muscle. This objection was also met in 1962 when University of Pennsylvania scientists showed that muscles can contract and relax normally even when glycogen and Phosphocreatine are kept under check with an inhibitor.

Michael Somogyi

Michael Somogyi was born in Reinsdorf, Austria-Hungary, in 1883. He received a degree in chemical engineering from the University of Budapest, and after spending some time there as a graduate assistant in biochemistry, he immigrated to the United States. From 1906 to 1908 he was an assistant in biochemistry at Cornell University.

Returning to his native land in 1908, he became head of the Municipal Laboratory in Budapest, and in 1914 he was granted his Ph.D. After World War I, the politically unstable situation in his homeland led him to return to the United States where he took a job as an instructor in biochemistry at Washington University in St. Louis, Missouri. While there he assisted Philip A. Shaffer and Edward Adelbert Doisy, Sr., a future Nobel Prize recipient, in developing a new method for the preparation of insulin in sufficiently large amounts and of sufficient purity to make it a viable treatment for diabetes. This early work with insulin helped foster Somogyi’s lifelong interest in the treatment and cure of diabetes. He was the first biochemist appointed to the staff of the newly opened Jewish Hospital, and he remained there as the director of their clinical laboratory until his retirement in 1957.

Arterial Blood Gases.  Van Slyke.

The test is used to determine the pH of the blood, the partial pressure of carbon dioxide and oxygen, and the bicarbonate level. Many blood gas analyzers will also report concentrations of lactate, hemoglobin, several electrolytes, oxyhemoglobin, carboxyhemoglobin and methemoglobin. ABG testing is mainly used in pulmonology and critical care medicine to assess gas exchange across the alveolar-capillary membrane.

DONALD DEXTER VAN SLYKE died on May 4, 1971, after a long and productive career that spanned three generations of biochemists and physicians. He left behind not only a bibliography of 317 journal publications and 5 books, but also more than 100 persons who had worked with him and distinguished themselves in biochemistry and academic medicine. His doctoral thesis, with Gomberg at the University of Michigan, was published in the Journal of the American Chemical Society in 1907.  Van Slyke received an invitation from Dr. Simon Flexner, Director of the Rockefeller Institute, to come to New York for an interview. In 1911 he spent a year in Berlin with Emil Fischer, who was then the leading chemist of the scientific world. He was particularly impressed by Fischer’s performing all laboratory operations quantitatively, a procedure Van followed throughout his life. Prior to going to Berlin, he published the classic nitrous acid method for the quantitative determination of primary aliphatic amino groups, the first of the many gasometric procedures devised by Van Slyke, which made possible the determination of amino acids. It was the primary method used to study the amino acid composition of proteins for years before chromatography. Thus, his first seven postdoctoral years were centered around the development of better methodology for protein composition and amino acid metabolism.

With his colleague G. M. Meyer, he first demonstrated that amino acids, liberated during digestion in the intestine, are absorbed into the bloodstream, that they are removed by the tissues, and that the liver alone possesses the ability to convert the amino acid nitrogen into urea.  From the study of the kinetics of urease action, Van Slyke and Cullen developed equations that depended upon two reactions: (1) the combination of enzyme and substrate in stoichiometric proportions and (2) the reaction of the combination into the end products. Published in 1914, this formulation, involving two velocity constants, was similar to that arrived at contemporaneously by Michaelis and Menten in Germany in 1913.
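The steady-state rate law that emerged from this line of work, from the Van Slyke–Cullen treatment of urease and the contemporaneous Michaelis–Menten formulation, is the familiar form v = Vmax·[S]/(Km + [S]). A minimal sketch with illustrative constants, not values from the source:

```python
def michaelis_menten(s, vmax, km):
    """Initial velocity of an enzyme-catalyzed reaction at substrate
    concentration s (same units as km): v = vmax * s / (km + s)."""
    return vmax * s / (km + s)

# By construction the rate is half-maximal when s equals km.
v = michaelis_menten(2.0, vmax=10.0, km=2.0)  # 5.0
```

The two velocity constants of the Van Slyke–Cullen formulation (enzyme-substrate combination and breakdown to products) are folded into the single composite constant Km in this form.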

He transferred to the Rockefeller Institute’s Hospital in 1914, under Dr. Rufus Cole, where “Men who were studying disease clinically had the right to go as deeply into its fundamental nature as their training allowed, and in the Rockefeller Institute’s Hospital every man who was caring for patients should also be engaged in more fundamental study”.  The study of diabetes was already under way by Dr. F. M. Allen, but patients inevitably died of acidosis.  Van Slyke reasoned that if incomplete oxidation of fatty acids in the body led to the accumulation of acetoacetic and beta-hydroxybutyric acids in the blood, then a reaction would result between these acids and the bicarbonate ions that would lead to a lower-than-normal bicarbonate concentration in blood plasma. The problem thus became one of devising an analytical method that would permit the quantitative determination of bicarbonate concentration in small amounts of blood plasma.  He ingeniously devised a volumetric glass apparatus that was easy to use and required less than ten minutes for the determination of the total carbon dioxide in one cubic centimeter of plasma.  It also was soon found to be an excellent apparatus by which to determine blood oxygen concentrations, thus leading to measurements of the percentage saturation of blood hemoglobin with oxygen. This found extensive application in the study of respiratory diseases, such as pneumonia and tuberculosis. It also led to the quantitative study of cyanosis and a monograph on the subject by C. Lundsgaard and Van Slyke.
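Van Slyke's apparatus measured total CO2 gasometrically; in a modern blood gas analyzer the same plasma bicarbonate is usually derived from measured pH and pCO2 via the Henderson-Hasselbalch equation. A sketch using standard textbook constants (pKa ≈ 6.1, CO2 solubility ≈ 0.0307 mmol/L per mmHg), none of which come from the source:

```python
def bicarbonate(ph, pco2_mmhg, pka=6.1, solubility=0.0307):
    """Plasma [HCO3-] in mmol/L from the Henderson-Hasselbalch equation:
    pH = pKa + log10([HCO3-] / (solubility * pCO2))."""
    return solubility * pco2_mmhg * 10 ** (ph - pka)

# Normal arterial blood: pH 7.40, pCO2 40 mmHg -> about 24.5 mmol/L.
hco3 = bicarbonate(7.40, 40.0)
```

In the ketoacidosis Van Slyke studied, fixed acids consume bicarbonate, and the computed value falls well below this normal range.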

In all, Van Slyke and his colleagues published twenty-one papers under the general title “Studies of Acidosis,” beginning in 1917 and ending in 1934. They included not only chemical manifestations of acidosis, but Van Slyke, in No. 17 of the series (1921), elaborated and expanded the subject to describe in chemical terms the normal and abnormal variations in the acid-base balance of the blood. This was a landmark in understanding acid-base balance pathology.  Within seven years after Van moved to the Hospital, he had published a total of fifty-three papers, thirty-three of them coauthored with clinical colleagues.

In 1920, Van Slyke and his colleagues undertook a comprehensive investigation of gas and electrolyte equilibria in blood. McLean and Henderson at Harvard had made preliminary studies of blood as a physico-chemical system, but realized that Van Slyke and his colleagues at the Rockefeller Hospital had superior techniques and the facilities necessary for such an undertaking. A collaboration thereupon began between the two laboratories, which resulted in rapid progress toward an exact physico-chemical description of the role of hemoglobin in the transport of oxygen and carbon dioxide, of the distribution of diffusible ions and water between erythrocytes and plasma, and of factors such as degree of oxygenation of hemoglobin and hydrogen ion concentration that modified these distributions. In this Van Slyke revised his volumetric gas analysis apparatus into a manometric method.  The manometric apparatus proved to give results that were from five to ten times more accurate.

A series of papers on the CO2 titration curves of oxy- and deoxyhemoglobin, of oxygenated and reduced whole blood, and of blood subjected to different degrees of oxygenation, and on the distribution of diffusible ions in blood resulted.  These developed equations that predicted the change in distribution of water and diffusible ions between blood plasma and blood cells when there was a change in pH of the oxygenated blood. A significant contribution of Van Slyke and his colleagues was the application of the Gibbs-Donnan Law to the blood, regarded as a two-phase system in which one phase (the erythrocytes) contained a high concentration of nondiffusible negative ions, i.e., those associated with hemoglobin, and cations, which were not freely exchangeable between cells and plasma. By changing the pH through varying the CO2 tension, the concentration of negative hemoglobin charges changed by a predictable amount. This, in turn, changed the distribution of diffusible anions such as Cl− and HCO3− in order to restore the Gibbs-Donnan equilibrium. Redistribution of water occurred to restore osmotic equilibrium. The experimental results confirmed the predictions of the equations.
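The Gibbs-Donnan constraint applied here can be stated simply: at equilibrium, every diffusible anion must show the same cells-to-plasma concentration ratio r. A sketch with hypothetical concentrations (the function and numbers are illustrative, not Van Slyke's data):

```python
def donnan_check(cells, plasma, tol=1e-9):
    """Return (True, r) if all diffusible anions share one cells/plasma
    ratio r, as the Gibbs-Donnan law requires at equilibrium."""
    ratios = [cells[ion] / plasma[ion] for ion in cells]
    return max(ratios) - min(ratios) < tol, ratios[0]

# Hypothetical chloride and bicarbonate concentrations (mmol/L):
ok, r = donnan_check({"Cl": 52.0, "HCO3": 13.0},
                     {"Cl": 104.0, "HCO3": 26.0})
# ok is True and r = 0.5: both anions obey the same Donnan ratio.
```

When the CO2 tension changes the hemoglobin charge, r shifts, and both Cl− and HCO3− redistribute together until a single ratio is restored (the chloride shift).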

As a spin-off from the physico-chemical study of the blood, Van undertook, in 1922, to put the concept of buffer value of weak electrolytes on a mathematically exact basis.

This proved to be useful in determining buffer values of mixed, polyvalent, and amphoteric electrolytes, and put the understanding of buffering on a quantitative basis. A monograph in Medicine entitled “Observation on the Courses of Different Types of Bright’s Disease, and on the Resultant Changes in Renal Anatomy,” was a landmark that related the changes occurring at different stages of renal deterioration to the quantitative changes taking place in kidney function. During this period, Van Slyke and R. M. Archibald identified glutamine as the source of urinary ammonia. During World War II, Van and his colleagues documented the effect of shock on renal function and, with R. A. Phillips, developed a simple method, based on specific gravity, suitable for use in the field.
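
Van Slyke's 1922 buffer value, β = dB/dpH, has a closed form for a monovalent weak acid, and a few lines of Python make the "mathematically exact basis" concrete. The concentration and pKa below are arbitrary illustrative values.

```python
# Van Slyke's buffer value for a monovalent weak acid of total
# concentration C:
#   beta = dB/dpH = 2.303 * C * Ka*[H+] / (Ka + [H+])**2
# Beta is maximal at pH = pKa, where it equals 2.303 * C / 4.

def buffer_value(c_total, pka, ph):
    ka = 10.0 ** -pka
    h = 10.0 ** -ph
    return 2.303 * c_total * ka * h / (ka + h) ** 2

beta_peak = buffer_value(0.1, 4.76, 4.76)   # at pH = pKa: 2.303*0.1/4
beta_off  = buffer_value(0.1, 4.76, 6.76)   # two pH units away: much smaller
```

The quantitative point is visible at once: buffering power peaks where pH equals pKa and falls off steeply on either side.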

Over 100 of Van’s 300 publications were devoted to methodology. The importance of Van Slyke’s contribution to clinical chemical methodology cannot be overstated. His methods covered the organic constituents of blood (carbohydrates, fats, proteins, amino acids, urea, nonprotein nitrogen, and phospholipids) and the inorganic constituents (total cations, calcium, chlorides, phosphate, and the gases carbon dioxide, carbon monoxide, and nitrogen). It was said that a Van Slyke manometric apparatus was almost all the special equipment needed to perform most of the clinical chemical analyses customarily performed before the introduction of photocolorimeters and spectrophotometers for such determinations.

The progress made in the medical sciences in genetics, immunology, endocrinology, and antibiotics during the second half of the twentieth century obscures at times the progress that was made in basic and necessary biochemical knowledge during the first half. Methods capable of giving accurate quantitative chemical information on biological material had to be painstakingly devised; basic questions on chemical behavior and metabolism had to be answered; and, finally, the factors that adversely modify the normal chemical reactions of the body, producing the abnormal conditions we characterize as disease states, had to be identified.

Viewed in retrospect, he combined in one scientific lifetime (1) basic contributions to the chemistry of body constituents and their chemical behavior in the body, (2) a chemical understanding of physiological functions of certain organ systems (notably the respiratory and renal), and (3) how such information could be exploited in the understanding and treatment of disease. That outstanding additions to knowledge in all three categories were possible was in large measure due to his sound and broadly based chemical preparation, his ingenuity in devising means of accurate measurements of chemical constituents, and the opportunity given him at the Hospital of the Rockefeller Institute to study disease in company with physicians.

In addition, he found time to work collaboratively with Dr. John P. Peters of Yale on the classic, two-volume Quantitative Clinical Chemistry. In 1922, John P. Peters, who had just gone to Yale from Van Slyke’s laboratory as an Associate Professor of Medicine, was asked by a publisher to write a modest handbook for clinicians describing useful chemical methods and discussing their application to clinical problems. It was originally to be called “Quantitative Chemistry in Clinical Medicine.” He soon found that it was going to be a bigger job than he could handle alone and asked Van Slyke to join him in writing it. Van agreed, and the two men proceeded to draw up an outline and divide the writing of the first drafts of the chapters between them. They also agreed to exchange each chapter until it met the satisfaction of both. At the time it was published in 1931, it contained practically all that could be stated with confidence about those aspects of disease that could be and had been studied by chemical means. It was widely accepted throughout the medical world as the “Bible” of quantitative clinical chemistry, and to this day some of the chapters have not become outdated.

Paul Flory

Paul J. Flory was born in Sterling, Illinois, in 1910. He attended Manchester College, an institution for which he retained an abiding affection. He did his graduate work at Ohio State University, earning his Ph.D. in 1934. He was awarded the Nobel Prize in Chemistry in 1974, largely for his work in the area of the physical chemistry of macromolecules.

Flory worked as a newly minted Ph.D. for the DuPont Company in the Central Research Department with Wallace H. Carothers. This early experience with practical research instilled in Flory a lifelong appreciation for the value of industrial application. His work with the Air Force Office of Scientific Research and his later support for the Industrial Affiliates program at Stanford University demonstrated his belief in the need for theory and practice to work hand-in-hand.

Following the death of Carothers in 1937, Flory joined the University of Cincinnati’s Basic Science Research Laboratory. After the war Flory taught at Cornell University from 1948 until 1957, when he became executive director of the Mellon Institute. In 1961 he joined the chemistry faculty at Stanford, where he would remain until his retirement.

Among the high points of Flory’s years at Stanford were his receipt of the National Medal of Science (1974), the Priestley Award (1974), the J. Willard Gibbs Medal (1973), the Peter Debye Award in Physical Chemistry (1969), and the Charles Goodyear Medal (1968). He also traveled extensively, including working tours to the U.S.S.R. and the People’s Republic of China.

Abraham Savitzky

Abraham Savitzky was born on May 29, 1919, in New York City. He received his bachelor’s degree from the New York State College for Teachers in 1941. After serving in the U.S. Air Force during World War II, he obtained a master’s degree in 1947 and a Ph.D. in 1949 in physical chemistry from Columbia University.

In 1950, after working at Columbia for a year, he began a long career with the Perkin-Elmer Corporation. Savitzky started with Perkin-Elmer as a staff scientist who was chiefly concerned with the design and development of infrared instruments. By 1956 he was named Perkin-Elmer’s new product coordinator for the Instrument Division, and as the years passed, he continued to gain more and more recognition for his work in the company. Most of his work with Perkin-Elmer focused on computer-aided analytical chemistry, data reduction, infrared spectroscopy, time-sharing systems, and computer plotting. He retired from Perkin-Elmer in 1985.

Abraham Savitzky holds seven U.S. patents pertaining to computerization and chemical apparatus. During his long career he presented numerous papers and wrote several manuscripts, including “Smoothing and Differentiation of Data by Simplified Least Squares Procedures.” This paper, a collaboration between Savitzky and Marcel J. E. Golay, was published in volume 36 of Analytical Chemistry in July 1964, and it is one of the most famous, respected, and heavily cited articles in its field. In recognition of his many significant accomplishments in analytical chemistry and computer science, Savitzky received the Society for Applied Spectroscopy Award in 1983 and the Williams-Wright Award from the Coblentz Society in 1986.
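
The idea of the Savitzky-Golay paper is easy to demonstrate. The sketch below hard-codes the classic 5-point quadratic convolution weights (-3, 12, 17, 12, -3)/35 from the 1964 paper; a full implementation would support arbitrary window sizes and polynomial orders, as scipy.signal.savgol_filter does today.

```python
# Minimal Savitzky-Golay smoother: each interior point is replaced by the
# value of a local least-squares quadratic fit, computed as a fixed
# convolution. The first and last two points are left unchanged here.

def savgol5(y):
    w = (-3.0, 12.0, 17.0, 12.0, -3.0)   # classic 5-point quadratic weights
    out = list(y)
    for i in range(2, len(y) - 2):
        out[i] = sum(wj * y[i + j - 2] for j, wj in enumerate(w)) / 35.0
    return out

# A quadratic signal passes through exactly (the local fit is exact),
# while high-frequency noise riding on it would be attenuated.
smooth = savgol5([x * x for x in range(7)])   # interior points stay x**2
```

This is why the filter is so well loved in spectroscopy: it suppresses noise while preserving peak shape far better than a simple moving average.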

Samuel Natelson

Samuel Natelson attended City College of New York and received his B.S. in chemistry in 1928. As a graduate student, Natelson attended New York University, receiving a Sc.M. in 1930 and his Ph.D. in 1931. After receiving his Ph.D., he began his career teaching at Girls Commercial High School. While maintaining his teaching position, Natelson joined the Jewish Hospital of Brooklyn in 1933. Working as a clinical chemist for Jewish Hospital, Natelson first conceived of the idea of a society by and for clinical chemists, and he worked to organize the nine charter members of the American Association of Clinical Chemists, which formally began in 1948. A pioneer in the field of clinical chemistry, Samuel Natelson became a role model for the clinical chemist. Natelson pioneered the use of microtechniques in clinical chemistry, and in the 1960s he served as a consultant to the National Aeronautics and Space Administration, helping analyze the effect of weightlessness on astronauts’ blood. Natelson spent his later career as chair of the biochemistry department at Michael Reese Hospital and as a lecturer at the Illinois Institute of Technology.

Arnold Beckman

Arnold Orville Beckman (April 10, 1900 – May 18, 2004) was an American chemist, inventor, investor, and philanthropist. While a professor at Caltech, he founded Beckman Instruments based on his 1934 invention of the pH meter, a device for measuring acidity, later considered to have “revolutionized the study of chemistry and biology”.[1] He also developed the DU spectrophotometer, “probably the most important instrument ever developed towards the advancement of bioscience”.[2] Beckman funded the first transistor company, thus giving rise to Silicon Valley.[3]

He earned his bachelor’s degree in chemical engineering in 1922 and his master’s degree in physical chemistry in 1923. For his master’s degree he studied the thermodynamics of aqueous ammonia solutions, a subject introduced to him by T. A. White. Beckman decided to go to Caltech for his doctorate. He stayed there for a year before returning to New York to be near his fiancée, Mabel. He found a job with Western Electric’s engineering department, the precursor to the Bell Telephone Laboratories. Working with Walter A. Shewhart, Beckman developed quality control programs for the manufacture of vacuum tubes and learned about circuit design. It was here that Beckman discovered his interest in electronics.

In 1926 the couple moved back to California and Beckman resumed his studies at Caltech. He became interested in ultraviolet photolysis and worked with his doctoral advisor, Roscoe G. Dickinson, on an instrument to find the energy of ultraviolet light. It worked by shining the ultraviolet light onto a thermocouple, converting the incident heat into electricity, which drove a galvanometer. After receiving a Ph.D. in photochemistry in 1928 for this application of quantum theory to chemical reactions, Beckman was asked to stay on at Caltech as an instructor and then as a professor. Linus Pauling, another of Roscoe G. Dickinson’s graduate students, was also asked to stay on at Caltech.

During his time at Caltech, Beckman was active in teaching at both the introductory and advanced graduate levels. Beckman shared his expertise in glass-blowing by teaching classes in the machine shop. He also taught classes in the design and use of research instruments. Beckman dealt first-hand with the chemists’ need for good instrumentation as manager of the chemistry department’s instrument shop. Beckman’s interest in electronics made him very popular within the chemistry department at Caltech, as he was very skilled in building measuring instruments.

Over the time that he was at Caltech, the focus of the department increasingly moved towards pure science and away from chemical engineering and applied chemistry. Arthur Amos Noyes, head of the chemistry division, encouraged both Beckman and chemical engineer William Lacey to be in contact with real-world engineers and chemists, and Robert Andrews Millikan, Caltech’s president, referred technical questions from government and business to Beckman.

Sunkist Growers was having problems with its manufacturing process. Lemons that were not saleable as produce were made into pectin or citric acid, with sulfur dioxide used as a preservative. Sunkist needed to know the acidity of the product at any given time. Chemist Glen Joseph at Sunkist was attempting to measure the hydrogen-ion concentration in lemon juice electrochemically, but sulfur dioxide damaged hydrogen electrodes, and non-reactive glass electrodes produced weak signals and were fragile.

Joseph approached Beckman, who proposed that instead of trying to increase the sensitivity of his measurements, he amplify his results. Beckman, familiar with glassblowing, electricity, and chemistry, suggested a design for a vacuum-tube amplifier and ended up building a working apparatus for Joseph. The glass electrode used to measure pH was placed in a grid circuit in the vacuum tube, producing an amplified signal which could then be read by an electronic meter. The prototype was so useful that Joseph requested a second unit.

Beckman saw an opportunity and, rethinking the project, decided to create a complete chemical instrument that could be easily transported and used by nonspecialists. By October 1934, he had filed the patent application that issued as U.S. Patent No. 2,058,761 for his “acidimeter”, later renamed the pH meter. Although its price of $195 was steep, roughly the starting monthly wage for a chemistry professor at that time, it was significantly cheaper than the estimated cost of building a comparable instrument from individual components, about $500. The original pH meter weighed nearly 7 kg, but it was a substantial improvement over a benchful of delicate equipment. The earliest meter had a design glitch, in that the pH readings changed with the depth of immersion of the electrodes, but Beckman fixed the problem by sealing the glass bulb of the electrode. By 11 May 1939, sales were successful enough that Beckman left Caltech to become the full-time president of National Technical Laboratories. By 1940, Beckman was able to take out a loan to build his own 12,000-square-foot factory in South Pasadena.
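
The measurement problem Beckman's amplifier solved is easy to state in numbers. An idealized glass electrode follows the Nernst slope of about 59.16 mV per pH unit at 25 °C, delivered from a source of enormous internal resistance, which is why a vacuum-tube amplifier drawing almost no current was needed. The sketch below is a simplified model, assuming the electrode pair reads 0 mV at pH 7.

```python
# Idealized glass-electrode arithmetic behind a pH meter (a sketch, not
# Beckman's circuit). Assumptions: 25 C, ideal Nernstian response, and an
# isopotential point of 0 mV at pH 7, with voltage rising as the solution
# becomes more acidic.

NERNST_SLOPE_MV = 59.16   # 2.303 * R * T / F at 25 C, in millivolts per pH

def ph_from_mv(e_mv, iso_ph=7.0):
    """Convert measured electrode voltage (mV) to pH."""
    return iso_ph - e_mv / NERNST_SLOPE_MV

neutral = ph_from_mv(0.0)      # 0 mV reads pH 7.0
acidic  = ph_from_mv(59.16)    # +59.16 mV reads pH 6.0
```

The whole usable signal spans only a few hundred millivolts, so amplifying it stably, as Beckman's vacuum-tube circuit did, was the key to a robust instrument.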

In 1940, the equipment needed to analyze emission spectra in the visible spectrum could cost a laboratory as much as $3,000, a huge amount at that time. There was also growing interest in examining ultraviolet spectra beyond that range. In the same way that he had created a single easy-to-use instrument for measuring pH, Beckman made it a goal to create an easy-to-use instrument for spectrophotometry. Beckman’s research team, led by Howard Cary, developed several models.

The new spectrophotometers used a prism to disperse light into its component wavelengths and a phototube to “read” the spectrum and generate electrical signals, creating a standardized “fingerprint” for the material tested. With Beckman’s model D, later known as the DU spectrophotometer, National Technical Laboratories created the first easy-to-use single instrument containing both the optical and electronic components needed for ultraviolet-absorption spectrophotometry. The user could insert a sample, dial up the desired wavelength, and read the amount of absorption at that wavelength from a simple meter. It produced accurate absorption spectra in both the ultraviolet and the visible regions of the spectrum with relative ease and repeatable accuracy. The National Bureau of Standards ran tests to certify that the DU’s results were accurate and repeatable and recommended its use.
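
What any absorption spectrophotometer like the DU reports rests on the Beer-Lambert law, A = -log10(I/I0) = εlc. A minimal sketch follows; the molar absorptivity used is a typical literature value for NADH at 340 nm, not a DU specification.

```python
import math

# Beer-Lambert arithmetic: absorbance from transmitted vs. incident
# intensity, then concentration from absorbance.

def absorbance(i_transmitted, i_incident):
    return -math.log10(i_transmitted / i_incident)

def concentration(a, epsilon, path_cm=1.0):
    """Molar concentration from absorbance, molar absorptivity
    (L mol^-1 cm^-1), and path length (cm)."""
    return a / (epsilon * path_cm)

a = absorbance(10.0, 100.0)            # 10% transmittance -> A = 1.0
c = concentration(a, epsilon=6220.0)   # e.g. NADH at 340 nm, ~6220 L/(mol*cm)
```

Dial up a wavelength, read one number, and the concentration follows in seconds, which is exactly the speed-up over bioassays described below.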

Beckman’s DU spectrophotometer has been referred to as the “Model T” of scientific instruments: “This device forever simplified and streamlined chemical analysis, by allowing researchers to perform a 99.9% accurate biological assessment of a substance within minutes, as opposed to the weeks required previously for results of only 25% accuracy.” Nobel laureate Bruce Merrifield is quoted as calling the DU spectrophotometer “probably the most important instrument ever developed towards the advancement of bioscience.”

Development of the spectrophotometer also had direct relevance to the war effort. The role of vitamins in health was being studied, and scientists wanted to identify Vitamin A-rich foods to keep soldiers healthy. Previous methods involved feeding rats for several weeks, then performing a biopsy to estimate Vitamin A levels. The DU spectrophotometer yielded better results in a matter of minutes. The DU spectrophotometer was also an important tool for scientists studying and producing the new wonder drug penicillin. By the end of the war, American pharmaceutical companies were producing 650 billion units of penicillin each month. Much of the work done in this area during World War II was kept secret until after the war.

Beckman also developed infrared spectrophotometers, beginning with the IR-1; in 1953 he redesigned the instrument as the IR-4, which could be operated using either a single or a double beam of infrared light. This allowed a user to take the reference measurement and the sample measurement at the same time.

Beckman Coulter Inc., is an American company that makes biomedical laboratory instruments. Founded by Caltech professor Arnold O. Beckman in 1935 as National Technical Laboratories to commercialize a pH meter that he had invented, the company eventually grew to employ over 10,000 people, with $2.4 billion in annual sales by 2004. Its current headquarters are in Brea, California.

In the 1940s, Beckman changed the name to Arnold O. Beckman, Inc. to sell oxygen analyzers, the Helipot precision potentiometer, and spectrophotometers. In the 1950s, the company name changed to Beckman Instruments, Inc.

Beckman was contacted by Paul Rosenberg of MIT’s Radiation Laboratory. The lab was part of a secret network of research institutions in the United States and Britain working to develop radar, “radio detecting and ranging”. The project was interested in Beckman because of the high quality of the precision potentiometers used on his pH meters. Beckman had trademarked the design under the name “helipot”, for “helical potentiometer”. Rosenberg had found that the helipot was more precise, by a factor of ten, than other potentiometers. Beckman redesigned it with a continuous groove so that the sliding contact could not be jarred loose.

Beckman instruments were also used by the Manhattan Project to measure radiation in gas-filled, electrically charged ionization chambers in nuclear reactors. The pH meter was adapted to the job with a relatively minor adjustment, substituting an input-load resistor for the glass electrode. As a result, Beckman Instruments developed a new product, the micro-ammeter.

After the war, Beckman developed oxygen analyzers that were used to monitor conditions in incubators for premature babies. Doctors at Johns Hopkins University used them to determine recommendations for healthy oxygen levels for incubators.

Beckman himself was approached by California governor Goodwin Knight to head a Special Committee on Air Pollution and propose ways to combat smog. At the end of 1953, the committee made its findings public. The “Beckman Bible” advised key steps to be taken immediately.

In 1955, Beckman established the seminal Shockley Semiconductor Laboratory as a division of Beckman Instruments to begin commercializing the semiconductor transistor technology invented by Caltech alumnus William Shockley. The Shockley Laboratory was established in nearby Mountain View, California, and thus, “Silicon Valley” was born.

Beckman also saw that computers and automation offered a myriad of opportunities for integration into instruments, and the development of new instruments.

The Arnold and Mabel Beckman Foundation was incorporated in September 1977.  At the time of Beckman’s death, the Foundation had given more than 400 million dollars to a variety of charities and organizations. In 1990, it was considered one of the top ten foundations in California, based on annual gifts. Donations chiefly went to scientists and scientific causes as well as Beckman’s alma maters. He is quoted as saying, “I accumulated my wealth by selling instruments to scientists,… so I thought it would be appropriate to make contributions to science, and that’s been my number one guideline for charity.”

Wallace H. Coulter

Engineer, Inventor, Entrepreneur, Visionary

Wallace Henry Coulter was an engineer, inventor, entrepreneur and visionary. He was co-founder and Chairman of Coulter® Corporation, a worldwide medical diagnostics company headquartered in Miami, Florida. The two great passions of his life were applying engineering principles to scientific research, and embracing the diversity of world cultures. The first passion led him to invent the Coulter Principle™, the reference method for counting and sizing microscopic particles suspended in a fluid.

This invention served as the cornerstone for automating the labor-intensive process of counting and testing blood. With his vision and tenacity, Wallace Coulter was a founding father in the field of laboratory hematology, the science and study of blood. His global viewpoint and passion for world cultures inspired him to establish over twenty international subsidiaries. He recognized that it was imperative to employ locally based staff to service his customers before this became standard business strategy.
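
The Coulter Principle itself is simple enough to sketch: each particle traversing the sensing aperture displaces electrolyte and produces a resistance pulse whose height is proportional to particle volume, so counting pulses counts cells and pulse heights size them. The trace and threshold below are synthetic illustration, not real instrument data.

```python
# Toy pulse detector in the spirit of the Coulter Principle: count
# threshold-crossing pulses in a signal trace and record each pulse's
# peak height (proportional to particle volume).

def count_and_size(trace, threshold):
    counts, heights = 0, []
    in_pulse, peak = False, 0.0
    for v in trace:
        if v > threshold:
            peak = max(peak, v)
            if not in_pulse:
                in_pulse, counts = True, counts + 1
        elif in_pulse:
            heights.append(peak)      # pulse ended; store its peak
            in_pulse, peak = False, 0.0
    return counts, heights

# Synthetic trace containing three particles of different sizes.
trace = [0.1, 0.2, 3.0, 5.1, 0.3, 0.1, 4.2, 0.2, 0.1, 6.0, 6.5, 0.2]
n, peaks = count_and_size(trace, threshold=1.0)   # n = 3
```

A real counter does this electronically at thousands of cells per second, which is what turned a day of microscope counting into a minute of instrument time.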

Wallace’s first attempts to patent his invention were turned away by more than one attorney who believed “you cannot patent a hole”. Persistent as always, Wallace finally applied for his first patent in 1949, and it was issued on October 20, 1953. That same year, two prototypes were sent to the National Institutes of Health for evaluation. Shortly after, the NIH published its findings in two key papers, citing the improved accuracy and convenience of the Coulter method of counting blood cells. Also in 1953, Wallace publicly disclosed his invention in his one and only technical paper, “High Speed Automatic Blood Cell Counter and Cell Size Analyzer,” presented at the National Electronics Conference.

Leonard Skeggs invented the first continuous flow analyser in 1957. This groundbreaking invention completely changed the way that chemistry was carried out. Many of the laborious tests that dominated lab work could be automated, increasing productivity and freeing personnel for other, more challenging tasks.

Continuous flow analysis and its offshoots and descendants are an integral part of modern chemistry. It might therefore be some consolation to Leonard Skeggs to know that not only was he the beneficiary of an appellation with a long and fascinating history, he also created a revolution in wet chemistry that is still with us today.

Technicon

The AutoAnalyzer is an automated analyzer using a flow technique called continuous flow analysis (CFA), first made by the Technicon Corporation. The instrument was invented in 1957 by Leonard Skeggs, PhD, and commercialized by Jack Whitehead’s Technicon Corporation. The first applications were for clinical analysis, but methods for industrial analysis soon followed. The design is based on segmenting a continuously flowing stream with air bubbles.

In continuous flow analysis (CFA) a continuous stream of material is divided by air bubbles into discrete segments in which chemical reactions occur. The continuous stream of liquid samples and reagents is combined and transported in tubing and mixing coils. The tubing passes the samples from one apparatus to the next, with each apparatus performing a different function, such as distillation, dialysis, extraction, ion exchange, heating, incubation, and subsequent recording of a signal. An essential principle of the system is the introduction of air bubbles. The air bubbles segment each sample into discrete packets and act as barriers between packets to prevent cross-contamination as they travel down the length of the tubing. The air bubbles also assist mixing by creating turbulent (bolus) flow, and they provide operators with a quick and easy check of the flow characteristics of the liquid. Samples and standards are treated in an exactly identical manner as they travel the length of the tubing, eliminating the necessity of a steady-state signal. However, since the presence of bubbles creates an almost square-wave profile, bringing the system to steady state does not significantly decrease throughput (third-generation CFA analyzers average 90 or more samples per hour), and it is desirable because steady-state signals (chemical equilibrium) are more accurate and reproducible.

A continuous flow analyzer (CFA) consists of different modules, including a sampler, pump, mixing coils, optional sample-treatment stages (dialysis, distillation, heating, etc.), a detector, and a data generator. Most continuous flow analyzers depend on color reactions read with a flow-through photometer, although methods have also been developed that use ion-selective electrodes, flame photometry, ICAP, fluorometry, and so forth.

Flow injection analysis (FIA) was introduced in 1975 by Ruzicka and Hansen.
Jaromir (Jarda) Ruzicka is a Professor of Chemistry (Emeritus at the University of Washington and Affiliate at the University of Hawaii) and a member of the Danish Academy of Technical Sciences. Born in Prague in 1934, he graduated from the Department of Analytical Chemistry, Faculty of Sciences, Charles University. In 1968, when the Soviets occupied Czechoslovakia, he emigrated to Denmark. There, he joined the Technical University of Denmark, where, ten years later, he received a newly created Chair in Analytical Chemistry. When Jarda met Elo Hansen, they invented Flow Injection.

The first generation of FIA technology, termed flow injection (FI), was inspired by the AutoAnalyzer technique invented by Skeggs in the early 1950s. While Skeggs’ AutoAnalyzer uses air segmentation to separate a flowing stream into numerous discrete segments, establishing a long train of individual samples moving through a flow channel, FIA systems separate each sample from the next with a carrier reagent. And while the AutoAnalyzer mixes sample homogeneously with reagents, in all FIA techniques sample and reagents merge to form a concentration gradient, from which the analytical result is derived.

Arthur Karmen

Dr. Karmen was born in New York City in 1930. He graduated from the Bronx High School of Science in 1946 and earned an A.B. and M.D. in 1950 and 1954, respectively, from New York University. In 1952, while a medical student working on a summer project at Memorial Sloan Kettering, he used paper chromatography of amino acids to demonstrate the presence of glutamic-oxaloacetic and glutamic-pyruvic transaminases (aspartate and alanine aminotransferases) in serum and blood. In 1954, he devised the spectrophotometric method for measuring aspartate aminotransferase in serum, which, with minor modifications, is still used for diagnostic testing today. When developing this assay, he studied the reaction of NADH with serum and demonstrated the presence of lactate and malate dehydrogenases, both of which were also later used in diagnosis. Using the spectrophotometric method, he found that aspartate aminotransferase increased in the period immediately after an acute myocardial infarction and did the pilot studies that showed its diagnostic utility in heart and liver diseases. The test became as important diagnostically as the EKG. In cardiology it was later replaced by the MB isoenzyme of creatine kinase, whose adoption was driven by Burton Sobel’s work on infarct size, and later still by the troponins.
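
The arithmetic behind the kinetic transaminase assay follows directly from the NADH measurement: enzyme activity is computed from the rate of absorbance decrease at 340 nm. The sketch below uses the conventional molar absorptivity of NADH (about 6220 L·mol−1·cm−1, i.e. 6.22 per mmol) and a 1 cm light path; the volumes are illustrative, not Karmen's protocol.

```python
# Kinetic assay arithmetic: activity in U/L (micromoles of NADH consumed
# per minute per liter of sample), assuming a 1 cm light path.
#   U/L = dA340/min * total_vol_ml * 1000 / (6.22 * sample_vol_ml)
# where 6.22 is the millimolar absorptivity of NADH at 340 nm.

def ast_activity_u_per_l(d_abs_per_min, total_vol_ml, sample_vol_ml):
    return d_abs_per_min * total_vol_ml * 1000.0 / (6.22 * sample_vol_ml)

# Illustrative run: 0.0622 A/min in a 3.0 mL cuvette with 0.1 mL serum.
activity = ast_activity_u_per_l(0.0622, total_vol_ml=3.0, sample_vol_ml=0.1)
```

The factor of 1000 converts millilitres of sample to litres; everything else is Beer-Lambert applied to a rate instead of a single reading.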

History of Laboratory Medicine at Yale University

The roots of the Department of Laboratory Medicine at Yale can be traced back to John Peters, the head of what he called the “Chemical Division” of the Department of Internal Medicine, subsequently known as the Section of Metabolism, who co-authored with Donald Van Slyke the landmark 1931 textbook Quantitative Clinical Chemistry (2,3); and to Pauline Hald, research collaborator of Dr. Peters who subsequently served as Director of Clinical Chemistry at Yale-New Haven Hospital for many years. In 1947, Miss Hald reported the very first flame photometric measurements of sodium and potassium in serum (4). This study helped to lay the foundation for modern studies of metabolism and their application to clinical care.

The Laboratory Medicine program at Yale had its inception in 1958 as a section of Internal Medicine under the leadership of David Seligson. In 1965, Laboratory Medicine achieved autonomous section status and in 1971, became a full-fledged academic department. Dr. Seligson, who served as the first Chair, pioneered modern automation and computerized data processing in the clinical laboratory. In particular, he demonstrated the feasibility of discrete sample handling for automation that is now the basis of virtually all automated chemistry analyzers. In addition, Seligson and Zetner demonstrated the first clinical use of atomic absorption spectrophotometry. He was one of the founding members of the major Laboratory Medicine academic society, the Academy of Clinical Laboratory Physicians and Scientists.

Nathan Gochman, Developer of Automated Chemistries

Nathan Gochman, PhD, has over 40 years of experience in the clinical diagnostics industry. This includes academic teaching and research, and 30 years in the pharmaceutical and in vitro diagnostics industry. He has managed R & D, technical marketing and technical support departments. As a leader in the industry he was President of the American Association for Clinical Chemistry (AACC) and the National Committee for Clinical Laboratory Standards (NCCLS, now CLSI). He is currently a Consultant to investment firms and IVD companies.

William Sunderman

A doctor and scientist who lived a remarkable century and beyond — making medical advances, playing his Stradivarius violin at Carnegie Hall at 99 and being honored as the nation’s oldest worker at 100.

He developed a method for measuring glucose in the blood, the Sunderman Sugar Tube, and was one of the first doctors to use insulin to bring a patient out of a diabetic coma. He established quality-control techniques for medical laboratories that ended the wide variation in the results of laboratories doing the same tests.
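
Quality control of the kind Sunderman championed reduces to simple statistics: establish a mean and standard deviation from repeated runs of a control specimen, then flag any later control result falling outside agreed limits. The sketch below uses invented numbers and a plain ±2 SD rule, a simplification of the Levey-Jennings charts that became standard practice.

```python
import statistics

# Minimal laboratory QC check: flag control results that fall outside
# mean +/- n_sd standard deviations of an established baseline.

def qc_flags(baseline, new_results, n_sd=2.0):
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return [abs(x - mean) > n_sd * sd for x in new_results]

# Eight baseline control runs (illustrative values around 100).
baseline = [100.2, 99.8, 100.5, 99.5, 100.0, 100.3, 99.7, 100.1]

# Today's control results: the second and third fall outside +/- 2 SD.
flags = qc_flags(baseline, [100.4, 98.0, 101.9])
```

It was precisely this kind of shared, numerical yardstick that ended the wide variation between laboratories running the same tests.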

He taught at several medical schools and founded and edited the journal Annals of Clinical and Laboratory Science. In World War II, he was a medical director for the Manhattan Project, which developed the atomic bomb.

Dr. Sunderman was president of the American Society of Clinical Pathologists and a founding governor of the College of American Pathologists. He also helped organize the Association of Clinical Scientists and was its first president.

Yale Department of Laboratory Medicine


The discipline of clinical chemistry and the broader field of laboratory medicine, as they are practiced today, are attributed in no small part to Seligson’s vision and creativity.

Born in Philadelphia in 1916, Seligson graduated from the University of Maryland and received a D.Sc. from Johns Hopkins University and an M.D. from the University of Utah. In 1953, he served as a captain in the U.S. Army and as chief of the Hepatic and Metabolic Disease Laboratory at Walter Reed Army Medical Center.

Recruited to Yale and Grace-New Haven Hospital in 1958 from the University of Pennsylvania as professor of internal medicine at the medical school and the first director of clinical laboratories at the hospital, Seligson subsequently established the infrastructure of the Department of Laboratory Medicine, creating divisions of clinical chemistry, microbiology, transfusion medicine (blood banking) and hematology – each with its own strong clinical, teaching and research programs.

Challenging the continuous flow approach, Seligson designed, built and validated “discrete sample handling” instruments wherein each sample was treated independently, which allowed better choice of methods and greater efficiency. Today continuous flow has essentially disappeared and virtually all modern automated clinical laboratory instruments are based upon discrete sample handling technology.
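The efficiency argument can be sketched in a few lines of Python. In a discrete design each specimen runs only its ordered tests, while a continuous-flow battery runs every channel on every specimen; the specimen IDs and test menu below are hypothetical, chosen only to illustrate the comparison.

```python
# Hypothetical test orders for three specimens.
orders = {
    "S1": ["glucose", "sodium"],
    "S2": ["glucose"],
    "S3": ["sodium", "potassium", "creatinine"],
}
ALL_CHANNELS = ["glucose", "sodium", "potassium", "creatinine"]

# Discrete sample handling: each specimen is treated independently,
# so only the ordered analyses are performed.
discrete_runs = sum(len(tests) for tests in orders.values())

# Continuous flow: every specimen passes through every channel.
continuous_runs = len(orders) * len(ALL_CHANNELS)

print(discrete_runs, continuous_runs)  # 6 analyses vs. 12
```

The gap widens as menus grow, which is one reason discrete handling also allows a freer choice of methods per test.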

Seligson was one of the early visionaries who recognized the potential for computers in the clinical laboratory. One of the first applications of a digital computer in the clinical laboratory occurred in Seligson’s department at Yale, and shortly thereafter data were being transmitted directly from the laboratory computer to data stations on the patient wards. Now, such laboratory information systems represent the standard of care.

He was also among the first to highlight the clinical importance of test specificity and accuracy, as compared to simple reproducibility. One of his favorite slides was one that showed almost perfectly reproducible results for 10 successive measurements of blood sugar obtained with what was then the most widely used and popular analytical instrument. However, he would note, the answer was wrong; the assay was not accurate.
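Seligson's point, that reproducibility is not accuracy, is easy to demonstrate numerically. The sketch below simulates ten replicate glucose measurements on a hypothetical miscalibrated analyzer; the true value, bias, and noise figures are invented for illustration.

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

TRUE_GLUCOSE = 90.0  # mg/dL: the "right answer" for this hypothetical specimen

def biased_assay(true_value, bias=12.0, sd=0.5):
    """A hypothetical analyzer: tightly reproducible (small sd) but miscalibrated (large bias)."""
    return true_value + bias + random.gauss(0, sd)

results = [biased_assay(TRUE_GLUCOSE) for _ in range(10)]
spread = max(results) - min(results)                     # small: excellent reproducibility
mean_error = sum(results) / len(results) - TRUE_GLUCOSE  # large: the answer is wrong

print([round(r, 1) for r in results])
print(f"spread = {spread:.1f} mg/dL, mean error = {mean_error:.1f} mg/dL")
```

The ten replicates agree with each other almost perfectly, yet every one of them misses the true value by roughly the bias, which is exactly the trap Seligson warned against.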

Seligson established one of the nation’s first residency programs focused on laboratory medicine or clinical pathology, and also developed a teaching curriculum in laboratory medicine for medical students. In so doing, he created a model for the modern practice of laboratory medicine in an academic environment, and his trainees spread throughout the country as leaders in the field.

Ernest Cotlove

Ernest Cotlove’s scientific and medical career started at NYU where, after finishing medicine in 1943, he pursued studies in renal physiology and chemistry. His outstanding ability to acquire knowledge and conduct innovative investigations earned him an invitation from James Shannon, then Director of the National Heart Institute at NIH. He continued studies of renal physiology and chemistry until 1953 when he became Head of Clinical Chemistry Laboratories in the new Department of Clinical Pathology being developed by George Z. Williams during the Clinical Center’s construction. Dr. Cotlove seized the opportunity to design and equip the most advanced and functional clinical chemistry facility in our country.

Dr. Cotlove’s career exemplified the progress seen in medical research and technology. He designed the electronic chloridometer that bears his name, in spite of published reports that such an approach was theoretically impossible. He used this innovative skill to develop new instruments and methods at the Clinical Center. Many recognized him as an expert in clinical chemistry, computer programming, systems design for laboratory operations, and automation of analytical instruments.

Effects of Automation on Laboratory Diagnosis

George Z. Williams

There are four primary effects of laboratory automation on the practice of medicine: The range of laboratory support is being greatly extended to both diagnosis and guidance of therapeutic management; the new feasibility of multiphasic periodic health evaluation promises effective health and manpower conservation in the future; and substantially lowered unit cost for laboratory analysis will permit more extensive use of comprehensive laboratory medicine in everyday practice. There is, however, a real and growing danger of naive acceptance of and overconfidence in the reliability and accuracy of automated analysis and computer processing without critical evaluation. Erroneous results can jeopardize the patient’s welfare. Every physician has the responsibility to obtain proof of accuracy and reliability from the laboratories which serve his patients.

Mario Werner

Dr. Werner received his medical degree from the University of Zurich, Switzerland in 1956. After specializing in internal medicine at the University Clinic in Basel, he came to the United States–as a fellow of the Swiss Academy of Medical Sciences–to work at NIH and at the Rockefeller University. From 1964 to 1966, he served as chief of the Central Laboratory at the Klinikum Essen, Ruhr-University, Germany. In 1967, he returned to the US, joining the Division of Clinical Pathology and Laboratory Medicine at the University of California, San Francisco, as an assistant professor. Three years later, he became Associate Professor of Pathology and Laboratory Medicine at Washington University in St. Louis, where he was instrumental in establishing the training program in laboratory medicine. In 1972, he was appointed Professor of Pathology at The George Washington University in Washington, DC.

Norbert Tietz

Professor Norbert W. Tietz received the degree of Doctor of Natural Sciences from the Technical University Stuttgart, Germany, in 1950. In 1954 he immigrated to the United States where he subsequently held positions or appointments at several Chicago area institutions including the Mount Sinai Hospital Medical Center, Chicago Medical School/University of Health Sciences and Rush Medical College.

Professor Tietz is best known as the editor of the Fundamentals of Clinical Chemistry. This book, now in its sixth edition, remains a primary information source for both students and educators in laboratory medicine. It was the first modern textbook that integrated clinical chemistry with the basic sciences and pathophysiology.

Throughout his career, Dr. Tietz taught a range of students from the undergraduate through the post-graduate level, including (1) medical technology students, (2) medical students, (3) clinical chemistry graduate students, (4) pathology residents, and (5) practicing chemists. For example, in the late 1960s he began the first master of science degree program in clinical chemistry in the United States at the Chicago Medical School. This program subsequently evolved into one of the first Ph.D. programs in clinical chemistry.

Automation and other recent developments in clinical chemistry.

Griffiths J.

http://www.ncbi.nlm.nih.gov/pubmed/1344702

The decade 1980 to 1990 was the most progressive period in the short, but turbulent, history of clinical chemistry. New techniques and the instrumentation needed to perform assays have opened a chemical Pandora's box. Multichannel analyzers, the base spectrophotometric key to automated laboratories, have become almost perfect. The extended use of the antigen-monoclonal antibody reaction with increasingly sensitive labels has extended analyte detection routinely into the picomole/liter range. Devices that aid the automation of serum processing and distribution of specimens are emerging. Laboratory computerization has significantly matured, permitting better integration of laboratory instruments, improving communication between laboratory personnel and the patient's physician, and facilitating the use of expert systems and robotics in the chemistry laboratory.

Automation and Expert Systems in a Core Clinical Chemistry Laboratory
Streitberg, GT, et al.  JALA 2009;14:94–105

Clinical pathology or laboratory medicine has a great influence on clinical decisions: 60–70% of the most important decisions on admission, discharge, and medication are based on laboratory results.[1] As we learn more about clinical laboratory results and incorporate them in outcome optimization schemes, the laboratory will play a more pivotal role in the management of patients and their eventual outcomes.[2] It has been stated that the development of information technology and automation in laboratory medicine has allowed laboratory professionals to keep pace with the growth in workload.

The reasons to automate and the impacts of automation are similar: reduction in errors, increases in productivity, and improvements in safety. Advances in technology in clinical chemistry, including total laboratory automation, call for changes in job responsibilities to include skills in information technology, data management, instrumentation, patient preparation for diagnostic analysis, interpretation of pathology results, and dissemination of knowledge and information to patients and other health staff, as well as skills in research.

The clinical laboratory has become so productive, particularly in chemistry and immunology, and its labor, instrument, and reagent costs so well determined, that today a physician's medical decisions are 80% determined by the clinical laboratory. Medical information systems have lagged far behind. Why is that? Because the decision for an MIS has historically been based on billing capture. Moreover, the historical use of chemical profiles was quite good at validating healthy status in an outpatient population, but the profiles became restricted under Diagnostic Related Groups. Thus it came to be that diagnostics were considered a “commodity”. In order to be competitive, a laboratory had to provide “high complexity” tests that were drawn in by a large volume of “moderate complexity” tests.

Read Full Post »

Summary to Metabolomics

Author and Curator: Larry H. Bernstein, MD, FCAP 

This concludes a long step-by-step journey into rediscovering biological processes from the genome as a framework to the remodeled and reconstituted cell through a number of posttranscription and posttranslation processes that modify the proteome and determine the metabolome.  The remodeling process continues over a lifetime. The process requires a balance between nutrient intake, energy utilization for work in the lean body mass, energy reserves, endocrine, paracrine and autocrine mechanisms, and autophagy.  It is true when we look at this in its full scope – What a creature is man?

http://masspec.scripps.edu/metabo_science/recommended_readings.php
 Recommended Readings and Historical Perspectives

Metabolomics is the scientific study of chemical processes involving metabolites. Specifically, metabolomics is the “systematic study of the unique chemical fingerprints that specific cellular processes leave behind”, the study of their small-molecule metabolite profiles.[1] The metabolome represents the collection of all metabolites in a biological cell, tissue, organ or organism, which are the end products of cellular processes.[2] mRNA gene expression data and proteomic analyses reveal the set of gene products being produced in the cell, data that represents one aspect of cellular function. Conversely, metabolic profiling can give an instantaneous snapshot of the physiology of that cell. One of the challenges of systems biology and functional genomics is to integrate proteomic, transcriptomic, and metabolomic information to provide a better understanding of cellular biology.

The term “metabolic profile” was introduced by Horning, et al. in 1971 after they demonstrated that gas chromatography-mass spectrometry (GC-MS) could be used to measure compounds present in human urine and tissue extracts. The Horning group, along with that of Linus Pauling and Arthur B. Robinson led the development of GC-MS methods to monitor the metabolites present in urine through the 1970s.

Concurrently, NMR spectroscopy, which was discovered in the 1940s, was also undergoing rapid advances. In 1974, Seeley et al. demonstrated the utility of using NMR to detect metabolites in unmodified biological samples. This first study on muscle highlighted the value of NMR in that it was determined that 90% of cellular ATP is complexed with magnesium. As sensitivity has improved with the evolution of higher magnetic field strengths and magic angle spinning, NMR continues to be a leading analytical tool to investigate metabolism. Efforts to utilize NMR for metabolomics have been influenced by the laboratory of Dr. Jeremy Nicholson at Birkbeck College, University of London and later at Imperial College London. In 1984, Nicholson showed 1H NMR spectroscopy could potentially be used to diagnose diabetes mellitus, and later pioneered the application of pattern recognition methods to NMR spectroscopic data.

In 2005, the first metabolomics web database, METLIN, for characterizing human metabolites was developed in the Siuzdak laboratory at The Scripps Research Institute and contained over 10,000 metabolites and tandem mass spectral data. As of September 2012, METLIN contains over 60,000 metabolites as well as the largest repository of tandem mass spectrometry data in metabolomics.

On 23 January 2007, the Human Metabolome Project, led by Dr. David Wishart of the University of Alberta, Canada, completed the first draft of the human metabolome, consisting of a database of approximately 2500 metabolites, 1200 drugs and 3500 food components. Similar projects have been underway in several plant species, most notably Medicago truncatula and Arabidopsis thaliana for several years.

As late as mid-2010, metabolomics was still considered an “emerging field”. Further, it was noted that progress in the field depended in large part on addressing otherwise “irresolvable technical challenges” through the technical evolution of mass spectrometry instrumentation.

Metabolome refers to the complete set of small-molecule metabolites (such as metabolic intermediates, hormones and other signaling molecules, and secondary metabolites) to be found within a biological sample, such as a single organism. The word was coined in analogy with transcriptomics and proteomics; like the transcriptome and the proteome, the metabolome is dynamic, changing from second to second. Although the metabolome can be defined readily enough, it is not currently possible to analyse the entire range of metabolites by a single analytical method. The first metabolite database (called METLIN) for searching m/z values from mass spectrometry data was developed by scientists at The Scripps Research Institute in 2005. In January 2007, scientists at the University of Alberta and the University of Calgary completed the first draft of the human metabolome. They catalogued approximately 2500 metabolites, 1200 drugs and 3500 food components that can be found in the human body, as reported in the literature. This information, available at the Human Metabolome Database (www.hmdb.ca) and based on analysis of information available in the current scientific literature, is far from complete.

Each type of cell and tissue has a unique metabolic ‘fingerprint’ that can elucidate organ or tissue-specific information, while the study of biofluids can give more generalized though less specialized information. Commonly used biofluids are urine and plasma, as they can be obtained non-invasively or relatively non-invasively, respectively. The ease of collection facilitates high temporal resolution, and because they are always at dynamic equilibrium with the body, they can describe the host as a whole.

Metabolites are the intermediates and products of metabolism. Within the context of metabolomics, a metabolite is usually defined as any molecule less than 1 kDa in size.
A primary metabolite is directly involved in normal growth, development, and reproduction; a secondary metabolite is not directly involved in those processes. By contrast, in human-based metabolomics, it is more common to describe metabolites as being either endogenous (produced by the host organism) or exogenous. Metabolites of foreign substances such as drugs are termed xenometabolites. The metabolome forms a large network of metabolic reactions, where outputs from one enzymatic chemical reaction are inputs to other chemical reactions.
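The closing sentence describes the metabolome as a network in which one reaction's outputs become another's inputs. A minimal sketch of that idea, using a toy glycolysis-flavored adjacency map rather than any curated pathway resource such as KEGG or HMDB:

```python
from collections import deque

# Toy network: substrate -> products. Illustrative only, not curated pathway data.
network = {
    "glucose": ["glucose-6-phosphate"],
    "glucose-6-phosphate": ["fructose-6-phosphate"],
    "fructose-6-phosphate": ["fructose-1,6-bisphosphate"],
    "fructose-1,6-bisphosphate": ["pyruvate"],
    "pyruvate": ["lactate", "acetyl-CoA"],
}

def downstream(start, net):
    """Every metabolite reachable from `start`: breadth-first search over
    reactions whose outputs become inputs to further reactions."""
    seen, queue = set(), deque([start])
    while queue:
        for product in net.get(queue.popleft(), []):
            if product not in seen:
                seen.add(product)
                queue.append(product)
    return seen

print(sorted(downstream("glucose", network)))
```

The same reachability question, asked of a genome-scale reconstruction, is the starting point for flux and perturbation analyses.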

Metabonomics is defined as “the quantitative measurement of the dynamic multiparametric metabolic response of living systems to pathophysiological stimuli or genetic modification”. The word origin is from the Greek μεταβολή meaning change and nomos meaning a rule set or set of laws. This approach was pioneered by Jeremy Nicholson at Imperial College London and has been used in toxicology, disease diagnosis and a number of other fields. Historically, the metabonomics approach was one of the first methods to apply the scope of systems biology to studies of metabolism.

There is a growing consensus that ‘metabolomics’ places a greater emphasis on metabolic profiling at a cellular or organ level and is primarily concerned with normal endogenous metabolism. ‘Metabonomics’ extends metabolic profiling to include information about perturbations of metabolism caused by environmental factors (including diet and toxins), disease processes, and the involvement of extragenomic influences, such as gut microflora. This is not a trivial difference; metabolomic studies should, by definition, exclude metabolic contributions from extragenomic sources, because these are external to the system being studied.

Toxicity assessment/toxicology. Metabolic profiling (especially of urine or blood plasma samples) detects the physiological changes caused by toxic insult of a chemical (or mixture of chemicals).

Functional genomics. Metabolomics can be an excellent tool for determining the phenotype caused by a genetic manipulation, such as gene deletion or insertion. Sometimes this can be a sufficient goal in itself—for instance, to detect any phenotypic changes in a genetically-modified plant intended for human or animal consumption. More exciting is the prospect of predicting the function of unknown genes by comparison with the metabolic perturbations caused by deletion/insertion of known genes.

Nutrigenomics is a generalised term which links genomics, transcriptomics, proteomics and metabolomics to human nutrition. In general, a metabolome in a given body fluid is influenced by endogenous factors such as age, sex, body composition and genetics as well as underlying pathologies. The large bowel microflora are also a very significant potential confounder of metabolic profiles and could be classified as either an endogenous or exogenous factor. The main exogenous factors are diet and drugs. Diet can then be broken down to nutrients and non-nutrients.

http://en.wikipedia.org/wiki/Metabolomics

Jose Eduardo des Salles Roselino

The problem with genomics is that it was set up as an explanation for everything. When something is genetic in nature, genomic reasoning works fine: whenever an inborn error is found, and only in that case, genomic knowledge can afterwards indicate what is wrong. That is not the same as turning biology upside down by reading everything, non-genetic as well as genetic problems, from the DNA.

Coordination of the transcriptome and metabolome by the circadian clock PNAS 2012

analysis of metabolomic data and differential metabolic regulation for fetal lungs, and maternal blood plasma

conformational changes leading to substrate efflux

The cellular response is defined by a network of chemogenomic response signatures.

Dynamic Construct of the –Omics

genome cartoon

central dogma phenotype

Read Full Post »

Metabolomics Summary and Perspective

Author and Curator: Larry H Bernstein, MD, FCAP 

 

This is the final article in a robust series on metabolism, metabolomics, and the “-OMICS-” biological synthesis that is creating a more holistic and interoperable view of natural sciences, including the biological disciplines, climate science, physics, chemistry, toxicology, pharmacology, and pathophysiology with as yet unforeseen consequences.

There have been impressive advances already in the research into developmental biology, plant sciences, microbiology, mycology, and human diseases, most notably cancer, metabolic and infectious diseases, as well as neurodegenerative diseases.

Acknowledgements:

I write this article in honor of my first mentor, Harry Maisel, Professor and Emeritus Chairman of Anatomy, Wayne State University, Detroit, MI and to my stimulating mentors, students, fellows, and associates over many years:

Masahiro Chiga, MD, PhD, Averill A Liebow, MD, Nathan O Kaplan, PhD, Johannes Everse, PhD, Norio Shioura, PhD, Abraham Braude, MD, Percy J Russell, PhD, Debby Peters, Walter D Foster, PhD, Herschel Sidransky, MD, Sherman Bloom, MD, Matthew Grisham, PhD, Christos Tsokos, PhD, IJ Good, PhD, Distinguished Professor, Raool Banagale, MD, Gustavo Reynoso, MD, Gustave Davis, MD, Marguerite M Pinto, MD, Walter Pleban, MD, Marion Feietelson-Winkler, RD, PhD, John Adan, MD, Joseph Babb, MD, Stuart Zarich, MD, Inder Mayall, MD, A Qamar, MD, Yves Ingenbleek, MD, PhD, Emeritus Professor, Bette Seamonds, PhD, Larry Kaplan, PhD, Pauline Y Lau, PhD, Gil David, PhD, Ronald Coifman, PhD, Emeritus Professor, Linda Brugler, RD, MBA, James Rucinski, MD, Gitta Pancer, Ester Engelman, Farhana Hoque, Mohammed Alam, Michael Zions, William Fleischman, MD, Salman Haq, MD, Jerard Kneifati-Hayek, Madeleine Schleffer, John F Heitner, MD, Arun Devakonda, MD, Liziamma George, MD, Suhail Raoof, MD, Charles Oribabor, MD, Anthony Tortolani, MD, Prof and Chairman, JRDS Rosalino, PhD, Aviva Lev Ari, PhD, RN, Rosser Rudolph, MD, PhD, Eugene Rypka, PhD, Jay Magidson, PhD, Izaak Mayzlin, PhD, Maurice Bernstein, PhD, Richard Bing, Eli Kaplan, PhD.

This article has EIGHT parts, as follows:

Part 1

Metabolomics Continues Auspicious Climb

Part 2

Biologists Find ‘Missing Link’ in the Production of Protein Factories in Cells

Part 3

Neuroscience

Part 4

Cancer Research

Part 5

Metabolic Syndrome

Part 6

Biomarkers

Part 7

Epigenetics and Drug Metabolism

Part 8

Pictorial

genome cartoon

iron metabolism

personalized reference range within population range

Part 1. Metabolomics Surge

metagraph _OMICS

Metabolomics Continues Auspicious Climb

Jeffery Herman, Ph.D.
GEN May 1, 2012 (Vol. 32, No. 9)

Aberrant biochemical and metabolite signaling plays an important role in

  • the development and progression of diseased tissue.

This concept has been studied by the science community for decades. However, with relatively

  1. recent advances in analytical technology and bioinformatics as well as
  2. the development of the Human Metabolome Database (HMDB),

metabolomics has become an invaluable field of research.

At the “International Conference and Exhibition on Metabolomics & Systems Biology” held recently in San Francisco, researchers and industry leaders discussed how

  • the underlying cellular biochemical/metabolite fingerprint in response to
  1. a specific disease state,
  2. toxin exposure, or
  3. pharmaceutical compound
  • is useful in clinical diagnosis and biomarker discovery and
  • in understanding disease development and progression.

Developed by BASF, MetaMap® Tox is

  • a database that helps identify in vivo systemic effects of a tested compound, including
  1. targeted organs,
  2. mechanism of action, and
  3. adverse events.

Based on 28-day systemic rat toxicity studies, MetaMap Tox is composed of

  • differential plasma metabolite profiles of rats
  • after exposure to a large variety of chemical toxins and pharmaceutical compounds.

“Using the reference data,

  • we have developed more than 110 patterns of metabolite changes, which are
  • specific and predictive for certain toxicological modes of action,”

said Hennicke Kamp, Ph.D., group leader, department of experimental toxicology and ecology at BASF.

With MetaMap Tox, a potential drug candidate

  • can be compared to a similar reference compound
  • using statistical correlation algorithms,
  • which allow for the creation of a toxicity and mechanism of action profile.
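A hedged sketch of the kind of correlation-based comparison described above (the actual MetaMap Tox algorithms are not public; the metabolite fold-changes and compound labels here are invented): a candidate whose plasma metabolite changes track a reference toxicant scores near +1, while an unrelated compound does not.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical log2 fold-changes of five plasma metabolites vs. control.
reference_toxicant = [1.8, -0.9, 0.4, 2.1, -1.5]   # known mode of action
candidate          = [1.6, -1.1, 0.3, 1.9, -1.2]   # resembles the reference
unrelated          = [-0.5, 1.2, -1.8, 0.2, 0.9]   # does not

print(round(pearson(candidate, reference_toxicant), 2))   # close to +1
print(round(pearson(unrelated, reference_toxicant), 2))   # negative
```

In practice such comparisons run over hundreds of metabolites and many reference patterns, but the core statistic is this simple.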

“MetaMap Tox, in the context of early pre-clinical safety enablement in pharmaceutical development,” continued Dr. Kamp,

  • has been independently validated “
  • by an industry consortium (Drug Safety Executive Council) of 12 leading biopharmaceutical companies.”

Dr. Kamp added that this technology may prove invaluable

  • for high-throughput drug candidate screening during early and preclinical toxicological studies,
  • allowing quick and accurate decisions on the safety and efficacy of compounds, and,
  • by comparing a lead compound to a variety of molecular derivatives,
  • streamlining the rapid identification of the optimal molecular structure
  • with the best efficacy and safety profiles.

Dynamic Construct of the –Omics

Targeted Tandem Mass Spectrometry

Biocrates Life Sciences focuses on targeted metabolomics, an important approach for

  • the accurate quantification of known metabolites within a biological sample.

Originally used for the clinical screening of inherent metabolic disorders from dried blood-spots of newborn children, Biocrates has developed

  • a tandem mass spectrometry (MS/MS) platform, which allows for
  1. the identification,
  2. quantification, and
  3. mapping of more than 800 metabolites to specific cellular pathways.

It is based on flow injection analysis and high-performance liquid chromatography MS/MS.

Clarification of Pathway-Specific Inhibition by Fourier Transform Ion Cyclotron Resonance Mass Spectrometry-Based Metabolic Phenotyping Studies

common drug targets

The MetaDisIDQ® Kit is a

  • a “multiparametric” diagnostic assay designed for the “comprehensive assessment of a person’s metabolic state” and
  • the early determination of pathophysiological events with regards to a specific disease.

MetaDisIDQ is designed to quantify

  • a diverse range of 181 metabolites involved in major metabolic pathways
  • from a small amount of human serum (10 µL) using isotopically labeled internal standards.
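The role of the isotopically labeled internal standards can be illustrated with the basic stable-isotope-dilution calculation; the peak areas and spike concentration below are hypothetical, not values from the kit itself.

```python
def quantify_um(analyte_area, standard_area, standard_conc_um):
    """Analyte concentration (µM) by stable-isotope dilution, assuming the
    analyte and its labeled internal standard give equal MS response per mole."""
    return analyte_area / standard_area * standard_conc_um

# Hypothetical peaks: analyte gives 2.4e6 counts; the labeled standard,
# spiked at 50 µM, gives 1.2e6 counts -> analyte is at twice the spike level.
print(quantify_um(2.4e6, 1.2e6, 50.0))  # 100.0
```

Because the labeled standard co-elutes and ionizes like the analyte, the ratio cancels most matrix and instrument effects, which is what makes absolute quantification from 10 µL of serum feasible.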

This kit has been demonstrated to detect changes in metabolites that are commonly associated with the development of

  • metabolic syndrome, type 2 diabetes, and diabetic nephropathy.

Dr. Dallman reports that data generated with the MetaDisIDQ kit correlates strongly with

  • routine chemical analyses of common metabolites, including glucose and creatinine.

Biocrates has also developed the MS/MS-based AbsoluteIDQ® kits, which are

  • an “easy-to-use” biomarker analysis tool for laboratory research.

The kit functions on MS machines from a variety of vendors, and allows for the quantification of 150-180 metabolites.

The SteroIDQ® kit is a high-throughput standardized MS/MS diagnostic assay,

  • validated in human serum, for the rapid and accurate clinical determination of 16 known steroids.

Initially focusing on the analysis of steroid ranges for use in hormone replacement therapy, the SteroIDQ Kit is expected to have a wide clinical application.

Hormone-Resistant Breast Cancer

Scientists at Georgetown University have shown that

  • breast cancer cells can functionally coordinate cell-survival and cell-proliferation mechanisms,
  • while maintaining a certain degree of cellular metabolism.

To grow, cells need energy, and energy is a product of cellular metabolism. For nearly a century, it was thought that

  1. the uncoupling of glycolysis from the mitochondria,
  2. leading to the inefficient but rapid metabolism of glucose and
  3. the formation of lactic acid (the Warburg effect), was

the major and only metabolism driving force for unchecked proliferation and tumorigenesis of cancer cells.

Other aspects of metabolism were often overlooked.

“.. we understand now that

  • cellular metabolism is a lot more than just metabolizing glucose,”

said Robert Clarke, Ph.D., professor of oncology and physiology and biophysics at Georgetown University. Dr. Clarke, in collaboration with the Waters Center for Innovation at Georgetown University (led by Albert J. Fornace, Jr., M.D.), obtained

  • the metabolomic profile of hormone-sensitive and -resistant breast cancer cells through the use of UPLC-MS.

They demonstrated that breast cancer cells, through a rather complex and not yet completely understood process,

  1. can functionally coordinate cell-survival and cell-proliferation mechanisms,
  2. while maintaining a certain degree of cellular metabolism.

This is at least partly accomplished through the upregulation of important pro-survival mechanisms; including

  • the unfolded protein response;
  • a regulator of endoplasmic reticulum stress and
  • initiator of autophagy.

Normally, during a stressful situation, a cell may

  • enter a state of quiescence and undergo autophagy,
  • a process by which a cell can recycle organelles
  • in order to maintain enough energy to survive during a stressful situation or,

if the stress is too great,

  • undergo apoptosis.

By integrating cell-survival mechanisms and cellular metabolism

  • advanced ER+ hormone-resistant breast cancer cells
  • can maintain a low level of autophagy
  • to adapt and resist hormone/chemotherapy treatment.

This adaptation allows cells

  • to reallocate important metabolites recovered from organelle degradation and
  • provide enough energy to also promote proliferation.

With further research, we can gain a better understanding of the underlying causes of hormone-resistant breast cancer, with

  • the overall goal of developing effective diagnostic, prognostic, and therapeutic tools.

NMR

Over the last two decades, NMR has established itself as a major tool for metabolomics analysis. It is especially adept at testing biological fluids. [Bruker BioSpin]

Historically, nuclear magnetic resonance spectroscopy (NMR) has been used for structural elucidation of pure molecular compounds. However, in the last two decades, NMR has established itself as a major tool for metabolomics analysis. Since

  • the integral of an NMR signal is directly proportional to
  • the molar concentration throughout the dynamic range of a sample,

“the simultaneous quantification of compounds is possible

  • without the need for specific reference standards or calibration curves,” according to Lea Heintz of Bruker BioSpin.
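That proportionality is what makes calibration-free quantification possible: divide each integral by the number of protons contributing to the signal, then reference it to an internal standard of known concentration. A minimal sketch with invented integrals (lactate's methyl doublet against a TSP reference):

```python
def conc_mm(analyte_integral, analyte_protons,
            std_integral, std_protons, std_conc_mm):
    """Molar concentration (mM) from NMR integrals: each integral is scaled
    by the number of protons producing the signal, then referenced to an
    internal standard of known concentration."""
    per_proton_analyte = analyte_integral / analyte_protons
    per_proton_std = std_integral / std_protons
    return per_proton_analyte / per_proton_std * std_conc_mm

# Hypothetical integrals: lactate methyl doublet (3 protons) vs. a TSP
# reference (9 equivalent methyl protons) at 1.0 mM.
print(conc_mm(6.0, 3, 9.0, 9, 1.0))  # 2.0
```

No compound-specific calibration curve is needed; the integral ratio and the proton counts carry all the information.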

NMR is adept at testing biological fluids because of

  1.  high reproducibility,
  2. standardized protocols,
  3. low sample manipulation, and
  4. the production of a large subset of data,

Bruker BioSpin is presently involved in a project for the screening of inborn errors of metabolism in newborn children from Turkey, based on their urine NMR profiles. More than 20 clinics are participating in the project, which is coordinated by INFAI, a specialist in the transfer of advanced analytical technology into medical diagnostics. Statistical models are being developed

  • for the detection of deviations from normality, as well as
  • automatic quantification methods for indicative metabolites

Bruker BioSpin recently installed high-resolution magic angle spinning NMR (HRMAS-NMR) systems that can rapidly analyze tissue biopsies. The main objective for HRMAS-NMR is to establish a rapid and effective clinical method to assess tumor grade and other important aspects of cancer during surgery.

Combined NMR and Mass Spec

There is increasing interest in combining NMR and MS, two of the main analytical assays in metabolomic research, as a means

  • to improve data sensitivity,
  • to fully elucidate the complex metabolome within a given biological sample, and
  • to realize the potential for cancer biomarker discovery in the realms of diagnosis, prognosis, and treatment.

Using combined NMR and MS to measure the levels of nearly 250 separate metabolites in the patient’s blood, Dr. Weljie and other researchers at the University of Calgary were able to rapidly determine the malignancy of a pancreatic lesion (in 10–15% of the cases, it is difficult to discern between benign and malignant), while avoiding unnecessary surgery in patients with benign lesions.

When performing NMR and MS on a single biological fluid, ultimately “we are,” noted Dr. Weljie,

  1. “splitting up information content, processing, and introducing a lot of background noise and error and
  2. then trying to reintegrate the data…
    It’s like taking a complex item, with multiple pieces, out of an IKEA box and trying to repackage it perfectly into another box.”

By improving the workflow from the initial splitting of the sample through to endpoint data integration, they showed that

  • a streamlined approach to combined NMR/MS can be achieved,
  • leading to a very strong, robust, and precise metabolomics toolset.

Metabolomics Research Picks Up Speed

Field Advances in Quest to Improve Disease Diagnosis and Predict Drug Response

John Morrow Jr., Ph.D.
GEN May 1, 2011 (Vol. 31, No. 9)

As an important discipline within systems biology, metabolomics is being explored by a number of laboratories for

  • its potential in pharmaceutical development.

Studying metabolites can offer insights into the relationships between genotype and phenotype, as well as between genotype and environment. In addition, there is plenty to work with—there are estimated to be some 2,900 detectable metabolites in the human body, of which

  1. 309 have been identified in cerebrospinal fluid,
  2. 1,122 in serum,
  3. 458 in urine, and
  4. roughly 300 in other compartments.

Guowang Xu, Ph.D., a researcher at the Dalian Institute of Chemical Physics, is investigating the causes of death in China,

  • and how they have been changing over the years as the country has become a more industrialized nation.
  • The incidence of metabolic disorders has risen; diabetes alone now affects 9.7% of the Chinese population.

Dr. Xu, collaborating with Rainer Lehman, Ph.D., of the University of Tübingen, Germany, compared urinary metabolites in samples from healthy individuals with samples taken from prediabetic, insulin-resistant subjects. Using mass spectrometry coupled with electrospray ionization in the positive mode, they observed striking dissimilarities in levels of various metabolites in the two groups.

“When we performed a comprehensive two-dimensional gas chromatography–time-of-flight mass spectrometry analysis of our samples, we observed several metabolites, including

  • 2-hydroxybutyric acid in plasma,
  • as potential diabetes biomarkers,” Dr. Xu explains.

In other, unrelated studies, Dr. Xu and the German researchers used a metabolomics approach to investigate the changes in plasma metabolite profiles immediately after exercise and following a 3-hour and 24-hour period of recovery. They found that

  • medium-chain acylcarnitines were the most distinctive exercise biomarkers, and
  • they are released as intermediates of partial beta oxidation in human myotubes and mouse muscle tissue.

Dr. Xu says, “The traditional approach of assessment based on a singular biomarker is being superseded by the introduction of multiple marker profiles.”

Typical of the studies under way by Dr. Kaddurah-Daouk and her colleagues at Duke University

  • is a recently published investigation highlighting the role of an SNP variant in
  • the glycine dehydrogenase gene on individual response to antidepressants.
  • Patients who did not respond to the selective serotonin reuptake inhibitors citalopram and escitalopram
  • carried a particular single nucleotide polymorphism in the glycine dehydrogenase gene.

“These results allow us to pinpoint a possible

  • role for glycine in selective serotonin reuptake inhibitor response and
  • illustrate the use of pharmacometabolomics to inform pharmacogenomics.

These discoveries give us the tools for prognostics and diagnostics so that

  • we can predict what conditions will respond to treatment.”

“This approach to defining health or disease in terms of metabolic states opens a whole new paradigm.

By screening hundreds of thousands of molecules, we can understand

  • the relationship between human genetic variability and the metabolome.”

Dr. Kaddurah-Daouk talks about statins as a current

  • model of metabolomics investigations.

It is now known that the statins have widespread effects, altering a range of metabolites. To sort out these changes and develop recommendations for which individuals should be receiving statins will require substantial investments of energy and resources into defining the complex web of biochemical changes that these drugs initiate.
Furthermore, Dr. Kaddurah-Daouk asserts that,

  • “genetics only encodes part of the phenotypic response.

One needs to take into account the

  • net environment contribution in order to determine
  • how both factors guide the changes in our metabolic state that determine the phenotype.”

Interactive Metabolomics

Researchers at the University of Nottingham use diffusion-edited nuclear magnetic resonance spectroscopy to assess the effects of a biological matrix on metabolites. Diffusion-edited NMR experiments provide a way to

  • separate the different compounds in a mixture
  • based on the differing translational diffusion coefficients (which reflect the size and shape of the molecule).

The measurements are carried out by observing

  • the attenuation of the NMR signals during a pulsed field gradient experiment.
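The arithmetic behind this measurement can be sketched briefly, assuming the standard exponential attenuation model S(b) = S0·exp(−D·b) (the Stejskal–Tanner relation) in which the gradient parameters are bundled into a single weighting factor b. The b values and signal intensities below are hypothetical.

```python
import math

# Sketch of how a translational diffusion coefficient is extracted from a
# pulsed field gradient experiment, assuming the standard exponential
# attenuation model S(b) = S0 * exp(-D * b) (Stejskal-Tanner relation).
# The b values and signals below are hypothetical.
def fit_diffusion_coefficient(b_values, signals):
    """Least-squares slope of ln(S) versus b; D is the negated slope."""
    n = len(b_values)
    logs = [math.log(s) for s in signals]
    b_mean = sum(b_values) / n
    l_mean = sum(logs) / n
    slope = sum((b - b_mean) * (l - l_mean) for b, l in zip(b_values, logs)) \
        / sum((b - b_mean) ** 2 for b in b_values)
    return -slope

# Small, free metabolites (large D) attenuate rapidly with increasing b;
# protein-bound species (small D) persist -- the basis of diffusion editing.
b = [0.0, 1.0, 2.0, 3.0]                                  # gradient weighting
signal = [1.0, math.exp(-0.5), math.exp(-1.0), math.exp(-1.5)]
print(fit_diffusion_coefficient(b, signal))               # ≈ 0.5
```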

Clare Daykin, Ph.D., is a lecturer at the University of Nottingham, U.K. Her field of investigation encompasses “interactive metabolomics,” which she defines as

“the study of the interactions between low molecular weight biochemicals and macromolecules in biological samples

  • without preselection of the components of interest.”

“Blood plasma is a heterogeneous mixture of molecules that

  1. undergo a variety of interactions including metal complexation,
  2. chemical exchange processes,
  3. micellar compartmentation,
  4. enzyme-mediated biotransformations, and
  5. small molecule–macromolecular binding.”

Many low molecular weight compounds can exist

  • freely in solution,
  • bound to proteins, or
  • within organized aggregates such as lipoprotein complexes.

Therefore, quantitative comparison of plasma composition from

  • diseased individuals and matched controls provides only an incomplete insight into plasma metabolism.

“It is not simply the concentrations of metabolites that must be investigated,

  • but their interactions with the proteins and lipoproteins within this complex web.

Rather than targeting specific metabolites of interest, Dr. Daykin’s metabolite–protein binding studies aim to study

  • the interactions of all detectable metabolites within the macromolecular sample.

Such activities can be studied through the use of diffusion-edited nuclear magnetic resonance (NMR) spectroscopy, in which one can assess

  • the effects of the biological matrix on the metabolites.

“This can lead to a more relevant and exact interpretation

  • for systems where metabolite–macromolecule interactions occur.”


Pushing the Limits

It is widely recognized that many drug candidates fail during development due to ancillary toxicity. Uwe Sauer, Ph.D., professor, and Nicola Zamboni, Ph.D., researcher, both at the Eidgenössische Technische Hochschule, Zürich (ETH Zürich), are applying

  • high-throughput intracellular metabolomics to understand
  • the basis of these unfortunate events and
  • head them off early in the course of drug discovery.

“Since metabolism is at the core of drug toxicity, we developed a platform for

  • measurement of 50–100 targeted metabolites by
  • a high-throughput system consisting of flow injection
  • coupled to tandem mass spectrometry.”

Using this approach, Dr. Sauer’s team focused on

  • the central metabolism of the yeast Saccharomyces cerevisiae, reasoning that
  • this core network would be most susceptible to potential drug toxicity.

Screening approximately 41 drugs, each administered at seven concentrations spanning three orders of magnitude, they observed changes in metabolome patterns at concentrations well below those causing overt physiological toxicity.

The group carried out statistical modeling of about

  • 60 metabolite profiles for each drug they evaluated.

These data allowed the construction of a “profile effect map” in which

  • the influence of each drug on metabolite levels can be followed, including off-target effects, which
  • provide an indirect measure of the possible side effects of the various drugs.
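A toy sketch of how such a profile effect map could be assembled, assuming each metabolite level is summarized as a z-score against untreated control replicates; the drug names, metabolites, and values below are hypothetical, not the ETH Zürich data.

```python
import statistics

# Sketch of a "profile effect map": for each drug, the shift of every
# measured metabolite relative to untreated controls, expressed as a
# z-score. All drug names, metabolites, and values are hypothetical.
def profile_effect_map(control, treated):
    """Return {drug: {metabolite: z-score vs. control replicates}}."""
    effect = {}
    for drug, levels in treated.items():
        effect[drug] = {
            met: (value - statistics.mean(control[met]))
                 / statistics.stdev(control[met])
            for met, value in levels.items()
        }
    return effect

control = {"ATP": [10.0, 10.2, 9.8], "glutamate": [5.0, 5.1, 4.9]}
treated = {"drug_A": {"ATP": 7.0, "glutamate": 5.0}}
effects = profile_effect_map(control, treated)
# A large |z| (here for ATP) flags a metabolic effect of the drug,
# including off-target effects, before any growth phenotype appears.
```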

“We have found that this approach is

  • at least 100 times as fast as other omics screening platforms,” Dr. Sauer says.

“Some drugs, including many anticancer agents,

  • disrupt metabolism long before affecting growth.”


Furthermore, they used the principle of 13C-based flux analysis, in which

  • metabolites labeled with 13C are used to follow the utilization of metabolic pathways in the cell.
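The labeling arithmetic underlying this approach can be sketched as follows: fractional 13C enrichment computed from measured mass-isotopologue intensities. The metabolite and intensities are hypothetical; a real flux analysis additionally fits these data to a metabolic network model.

```python
# Sketch of the labeling arithmetic behind 13C-based flux analysis: from
# measured mass-isotopologue intensities (M+0, M+1, ...) of a metabolite,
# compute the fractional 13C enrichment compared across drug treatments.
# The metabolite and intensities are hypothetical.
def fractional_enrichment(isotopologue_intensities, n_carbons):
    """Average fraction of carbon positions carrying 13C."""
    total = sum(isotopologue_intensities)
    labeled = sum(intensity * mass_shift
                  for mass_shift, intensity in enumerate(isotopologue_intensities))
    return labeled / (n_carbons * total)

# A 3-carbon metabolite (e.g. pyruvate) with intensities for M+0..M+3
enrichment = fractional_enrichment([60.0, 30.0, 10.0, 0.0], n_carbons=3)
print(round(enrichment, 3))  # prints 0.167
```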

These 13C-determined intracellular responses of metabolic fluxes to drug treatment demonstrate

  • the functional performance of the network to be rather robust,


leading Dr. Sauer to the conclusion that

  • the phenotypic robustness he observes in response to drug challenges
  • is achieved by a flexible makeup of the metabolome.

Dr. Sauer is confident that it will be possible to expand the scope of these investigations to hundreds of thousands of samples per study. This will allow answers to the questions of

  • how cells establish a stable functioning network in the face of inevitable concentration fluctuations.

Is Now the Hour?

There is great enthusiasm and agitation within the biotech community for

  • metabolomics approaches as a means of reversing the dismal record of drug discovery that has accumulated in the last decade.

While the concept clearly makes sense and is being widely applied today, there are many reasons why drugs fail in development, and metabolomics will not be a panacea for resolving all of these questions. It is too early at this point to recognize a trend or a track record, and it will take some time to see how this approach can aid in drug discovery and shorten the timeline for the introduction of new pharmaceutical agents.


Part 2.  Biologists Find ‘Missing Link’ in the Production of Protein Factories in Cells

Biologists at UC San Diego have found

  • the “missing link” in the chemical system that
  • enables animal cells to produce ribosomes

—the thousands of protein “factories” contained within each cell that

  • manufacture all of the proteins needed to build tissue and sustain life.


Their discovery, detailed in the June 23 issue of the journal Genes & Development, will not only force

  • a revision of basic textbooks on molecular biology, but also
  • provide scientists with a better understanding of
  • how uncontrolled cell growth, such as in cancer,
  • might be limited by controlling the output of ribosomes.

Ribosomes are responsible for the production of the wide variety of proteins that include

  1. enzymes;
  2. structural molecules, such as hair, skin, and bone;
  3. hormones like insulin; and
  4. components of our immune system such as antibodies.

Regarded as life’s most important molecular machine, ribosomes have been intensively studied by scientists (the 2009 Nobel Prize in Chemistry, for example, was awarded for studies of their structure and function). But until now researchers had not uncovered all of the details of how the proteins that are used to construct ribosomes are themselves produced.

In multicellular animals such as humans,

  • ribosomes are made up of about 80 different proteins
    (humans have 79 while some other animals have a slightly different number) as well as
  • four different kinds of RNA molecules.

In 1969, scientists discovered that

  • the synthesis of the ribosomal RNAs is carried out by specialized systems using two key enzymes:
  • RNA polymerase I and RNA polymerase III.

But until now, scientists were unsure if a complementary system was also responsible for

  • the production of the 80 proteins that make up the ribosome.

That’s essentially what the UC San Diego researchers headed by Jim Kadonaga, a professor of biology, set out to examine. What they found was the missing link—the specialized

  • system that allows ribosomal proteins themselves to be synthesized by the cell.

Kadonaga says that he and his coworkers found that ribosomal proteins are synthesized via

  • a novel regulatory system with the enzyme RNA polymerase II and
  • a factor termed TRF2.

“For the production of most proteins,

  1. RNA polymerase II functions with
  2. a factor termed TBP,
  3. but for the synthesis of ribosomal proteins, it uses TRF2.”

  • This specialized TRF2-based system for ribosome biogenesis
  • provides a new avenue for the study of ribosomes and
  • their control of cell growth, and

“it should lead to a better understanding and potential treatment of diseases such as cancer.”


Other authors of the paper were UC San Diego biologists Yuan-Liang Wang, Sascha Duttke and George Kassavetis, and Kai Chen, Jeff Johnston, and Julia Zeitlinger of the Stowers Institute for Medical Research in Kansas City, Missouri. Their research was supported by two grants from the National Institutes of Health (1DP2OD004561-01 and R01 GM041249).

Turning Off a Powerful Cancer Protein

Scientists have discovered how to shut down a master regulatory transcription factor that is

  • key to the survival of a majority of aggressive lymphomas,
  • which arise from the B cells of the immune system.

The protein, Bcl6, has long been considered too complex to target with a drug since it is also crucial

  • to the healthy functioning of many immune cells in the body, not just B cells gone bad.

The researchers at Weill Cornell Medical College report that it is possible

  • to shut down Bcl6 in diffuse large B-cell lymphoma (DLBCL)
  • while not affecting its vital function in T cells and macrophages
  • that are needed to support a healthy immune system.

If Bcl6 is completely inhibited, patients might suffer from systemic inflammation and atherosclerosis. The team conducted this new study to help clarify possible risks, as well as to understand

  • how Bcl6 controls the various aspects of the immune system.

The findings in this study were inspired from

  • preclinical testing of two Bcl6-targeting agents that Dr. Melnick and his Weill Cornell colleagues have developed
  • to treat DLBCLs.

These experimental drugs are

  • RI-BPI, a peptide mimic, and
  • the small molecule agent 79-6.

“This means the drugs we have developed against Bcl6 are more likely to be

  • significantly less toxic and safer for patients with this cancer than we realized,”

says Ari Melnick, M.D., professor of hematology/oncology and a hematologist-oncologist at NewYork-Presbyterian Hospital/Weill Cornell Medical Center.

Dr. Melnick says the discovery that

  • a master regulatory transcription factor can be targeted
  • offers implications beyond just treating DLBCL.

Recent studies from Dr. Melnick and others have revealed that

  • Bcl6 plays a key role in the most aggressive forms of acute leukemia, as well as certain solid tumors.

Bcl6 can control the type of immune cell that develops in the bone marrow—playing many roles

  • in the development of B cells, T cells, macrophages, and other cells—including a primary and essential role in
  • enabling B cells to generate specific antibodies against pathogens.

According to Dr. Melnick, “When cells lose control of Bcl6,

  • lymphomas develop in the immune system.

Lymphomas are ‘addicted’ to Bcl6, and therefore

  • Bcl6 inhibitors powerfully and quickly destroy lymphoma cells.”

The big surprise in the current study is that rather than functioning as a single molecular machine,

  • Bcl6 functions like a Swiss Army knife,
  • using different tools to control different cell types.

This multifunction paradigm could represent a general model for the functioning of other master regulatory transcription factors.

“In this analogy, the Swiss Army knife, or transcription factor, keeps most of its tools folded,

  • opening only the one it needs in any given cell type,”

He makes the following analogy:

  • “For B cells, it might open and use the knife tool;
  • for T cells, the cork screw;
  • for macrophages, the scissors.”

“This means that you only need to prevent the master regulator from using certain tools to treat cancer. You don’t need to eliminate the whole knife,” he says. “In fact, we show that taking out the whole knife is harmful since

  • the transcription factor has many other vital functions that other cells in the body need.”

Prior to these study results, it was not known that a master regulator could separate its functions so precisely. Researchers hope this will be a major benefit to the treatment of DLBCL and perhaps other disorders that are influenced by Bcl6 and other master regulatory transcription factors.

The study is published in the journal Nature Immunology, in a paper titled “Lineage-specific functions of Bcl-6 in immunity and inflammation are mediated by distinct biochemical mechanisms.”

Part 3. Neuroscience

Vesicles influence function of nerve cells
Oct 06, 2014        source: http://feeds.sciencedaily.com

Neurons (blue) which have absorbed exosomes (green) have increased levels of the enzyme catalase (red), which helps protect them against peroxides.


Tiny vesicles containing protective substances,

  • which glial cells transmit to nerve cells, apparently
  • play an important role in the functioning of neurons.

As cell biologists at Johannes Gutenberg University Mainz (JGU) have discovered,

  • nerve cells can enlist the aid of mini-vesicles of neighboring glial cells
  • to defend themselves against stress and other potentially detrimental factors.

These vesicles, called exosomes, appear to stimulate the neurons on various levels:

  • they influence electrical stimulus conduction,
  • biochemical signal transfer, and
  • gene regulation.

Exosomes are thus multifunctional signal emitters

  • that can have a significant effect in the brain.

Exosome

The researchers in Mainz already observed in a previous study that

  • oligodendrocytes release exosomes on exposure to neuronal stimuli.
  • These are absorbed by the neurons and improve neuronal stress tolerance.

Oligodendrocytes, a type of glial cell, form an

  • insulating myelin sheath around the axons of neurons.

The exosomes transport protective proteins such as

  • heat shock proteins,
  • glycolytic enzymes, and
  • enzymes that reduce oxidative stress from one cell type to another,
  • but also transmit genetic information in the form of ribonucleic acids.

“As we have now discovered in cell cultures, exosomes seem to have a whole range of functions,” explained Dr. Eva-Maria Krämer-Albers. By means of their transmission activity, the small bubbles that are the vesicles

  • not only promote electrical activity in the nerve cells, but also
  • influence them on the biochemical and gene regulatory level.

“The extent of activities of the exosomes is impressive,” added Krämer-Albers. The researchers hope that the understanding of these processes will contribute to the development of new strategies for the treatment of neuronal diseases. Their next aim is to uncover how vesicles actually function in the brains of living organisms.

http://labroots.com/user/news/article/id/217438/title/vesicles-influence-function-of-nerve-cells

The above story is based on materials provided by Universität Mainz.

Universität Mainz. “Vesicles influence function of nerve cells.” ScienceDaily. ScienceDaily, 6 October 2014. www.sciencedaily.com/releases/2014/10/141006174214.htm

Neuroscientists use snail research to help explain “chemo brain”

10/08/2014
It is estimated that as many as half of patients taking cancer drugs experience a decrease in mental sharpness. While there have been many theories, what causes “chemo brain” has eluded scientists.

In an effort to solve this mystery, neuroscientists at The University of Texas Health Science Center at Houston (UTHealth) conducted an experiment in an animal memory model and their results point to a possible explanation. Findings appeared in The Journal of Neuroscience.

In the study involving a sea snail that shares many of the same memory mechanisms as humans and a drug used to treat a variety of cancers, the scientists identified

  • memory mechanisms blocked by the drug.

Then, they were able to counteract or

  • unblock the mechanisms by administering another agent.

“Our research has implications for the care of people experiencing cognitive deficits following drug treatment for cancer,” said John H. “Jack” Byrne, Ph.D., senior author, holder of the June and Virgil Waggoner Chair and Chairman of the Department of Neurobiology and Anatomy at the UTHealth Medical School. “There is no satisfactory treatment at this time.”

Byrne’s laboratory is known for its use of a large snail called Aplysia californica to further the understanding of the biochemical signaling among nerve cells (neurons).  The snails have large neurons that relay information much like those in humans.

When Byrne’s team compared cell cultures taken from normal snails to

  • those administered a dose of a cancer drug called doxorubicin,

the investigators pinpointed a neuronal pathway

  • that was no longer passing along information properly.

With the aid of an experimental drug,

  • the scientists were able to reopen the pathway.

Unfortunately, this drug would not be appropriate for humans, Byrne said. “We want to identify other drugs that can rescue these memory mechanisms,” he added.

According to the American Cancer Society, some of the distressing mental changes cancer patients experience may last a short time or go on for years.

Byrne’s UT Health research team includes co-lead authors Rong-Yu Liu, Ph.D., and Yili Zhang, Ph.D., as well as Brittany Coughlin and Leonard J. Cleary, Ph.D. All are affiliated with the W.M. Keck Center for the Neurobiology of Learning and Memory.

Byrne and Cleary also are on the faculty of The University of Texas Graduate School of Biomedical Sciences at Houston. Coughlin is a student at the school, which is jointly operated by UT Health and The University of Texas MD Anderson Cancer Center.

The study titled “Doxorubicin Attenuates Serotonin-Induced Long-Term Synaptic Facilitation by Phosphorylation of p38 Mitogen-Activated Protein Kinase” received support from National Institutes of Health grant (NS019895) and the Zilkha Family Discovery Fellowship.

Doxorubicin Attenuates Serotonin-Induced Long-Term Synaptic Facilitation by Phosphorylation of p38 Mitogen-Activated Protein Kinase

Source: Univ. of Texas Health Science Center at Houston

http://www.rdmag.com/news/2014/10/neuroscientists-use-snail-research-help-explain-E2_9_Cchemo-brain

Doxorubicin Attenuates Serotonin-Induced Long-Term Synaptic Facilitation by Phosphorylation of p38 Mitogen-Activated Protein Kinase

Rong-Yu Liu*,  Yili Zhang*,  Brittany L. Coughlin,  Leonard J. Cleary, and  John H. Byrne   +Show Affiliations
The Journal of Neuroscience, 1 Oct 2014, 34(40): 13289-13300;
http://dx.doi.org/10.1523/JNEUROSCI.0538-14.2014

Doxorubicin (DOX) is an anthracycline used widely for cancer chemotherapy. Its primary mode of action appears to be

  • topoisomerase II inhibition, DNA cleavage, and free radical generation.

However, in non-neuronal cells, DOX also inhibits the expression of

  • dual-specificity phosphatases (also referred to as MAPK phosphatases) and thereby
  1. inhibits the dephosphorylation of extracellular signal-regulated kinase (ERK) and
  2. p38 mitogen-activated protein kinase (p38 MAPK),
  3. two MAPK isoforms important for long-term memory (LTM) formation.

Activation of these kinases by DOX in neurons, if present,

  • could have secondary effects on cognitive functions, such as learning and memory.

The present study used cultures of rat cortical neurons and sensory neurons (SNs) of Aplysia

  • to examine the effects of DOX on levels of phosphorylated ERK (pERK) and
  • phosphorylated p38 (p-p38) MAPK.

In addition, Aplysia neurons were used to examine the effects of DOX on

  • long-term enhanced excitability, long-term synaptic facilitation (LTF), and
  • long-term synaptic depression (LTD).

DOX treatment led to elevated levels of

  • pERK and p-p38 MAPK in SNs and cortical neurons.

In addition, it increased phosphorylation of

  • the downstream transcriptional repressor cAMP response element-binding protein 2 in SNs.

DOX treatment blocked serotonin-induced LTF and enhanced LTD induced by the neuropeptide Phe-Met-Arg-Phe-NH2. The block of LTF appeared to be attributable to

  • overriding inhibitory effects of p-p38 MAPK, because
  • LTF was rescued in the presence of an inhibitor of p38 MAPK
    (SB203580 [4-(4-fluorophenyl)-2-(4-methylsulfinylphenyl)-5-(4-pyridyl)-1H-imidazole]).

These results suggest that acute application of DOX might impair the formation of LTM via the p38 MAPK pathway.
Terms: Aplysia, chemotherapy, ERK, p38 MAPK, serotonin, synaptic plasticity

Technology that controls brain cells with radio waves earns early BRAIN grant

10/08/2014


BRAIN control: The new technology uses radio waves to activate or silence cells remotely. The bright spots above represent cells with increased calcium after treatment with radio waves, a change that would allow neurons to fire.

A proposal to develop a new way to

  • remotely control brain cells

from Sarah Stanley, a research associate in Rockefeller University’s Laboratory of Molecular Genetics, headed by Jeffrey M. Friedman, is

  • among the first to receive funding from U.S. President Barack Obama’s BRAIN initiative.

The project will make use of a technique called

  • radiogenetics that combines the use of radio waves or magnetic fields with
  • nanoparticles to turn neurons on or off.

The National Institutes of Health is one of four federal agencies involved in the BRAIN (Brain Research through Advancing Innovative Neurotechnologies) initiative. Following in the ambitious footsteps of the Human Genome Project, the BRAIN initiative seeks

  • to create a dynamic map of the brain in action,

a goal that requires the development of new technologies. The BRAIN initiative working group, which outlined the broad scope of the ambitious project, was co-chaired by Rockefeller’s Cori Bargmann, head of the Laboratory of Neural Circuits and Behavior.

Stanley’s grant, for $1.26 million over three years, is one of 58 projects to get BRAIN grants, the NIH announced. The NIH’s plan for its part of this national project, which has been pitched as “America’s next moonshot,” calls for $4.5 billion in federal funds over 12 years.

The technology Stanley is developing would

  • enable researchers to manipulate the activity of neurons, as well as other cell types,
  • in freely moving animals in order to better understand what these cells do.

Other techniques for controlling selected groups of neurons exist, but her new nanoparticle-based technique has a

  • unique combination of features that may enable new types of experimentation.
  • It would allow researchers to rapidly activate or silence neurons within a small area of the brain or
  • dispersed across a larger region, including those in difficult-to-access locations.

Stanley also plans to explore the potential this method has for use treating patients.

“Francis Collins, director of the NIH, has discussed

  • the need for studying the circuitry of the brain,
  • which is formed by interconnected neurons.

Our remote-control technology may provide a tool with which researchers can ask new questions about the roles of complex circuits in regulating behavior,” Stanley says.
Rockefeller University’s Laboratory of Molecular Genetics
Source: Rockefeller Univ.

Part 4.  Cancer

Two Proteins Found to Block Cancer Metastasis

Why do some cancers spread while others don’t? Scientists have now demonstrated that

  • metastatic incompetent cancers actually “poison the soil”
  • by generating a micro-environment that blocks cancer cells
  • from settling and growing in distant organs.

The “seed and the soil” hypothesis proposed by Stephen Paget in 1889 is now widely accepted to explain how

  • cancer cells (seeds) are able to generate fertile soil (the micro-environment)
  • in distant organs that promotes cancer’s spread.

However, this concept had not explained why some tumors do not spread or metastasize.

The researchers, from Weill Cornell Medical College, found that

  • two key proteins involved in this process work by
  • dramatically suppressing cancer’s spread.

The study offers hope that a drug based on these

  • potentially therapeutic proteins, prosaposin and Thrombospondin 1 (Tsp-1),

might help keep human cancer at bay and from metastasizing.

Scientists don’t understand why some tumors wouldn’t “want” to spread. It goes against their “job description,” says the study’s senior investigator, Vivek Mittal, Ph.D., an associate professor of cell and developmental biology in cardiothoracic surgery and director of the Neuberger Berman Foundation Lung Cancer Laboratory at Weill Cornell Medical College. He theorizes that metastasis occurs when

  • the barriers that the body throws up to protect itself against cancer fail.

But there are some tumors in which some of the barriers may still be intact. “So that suggests

  • those primary tumors will continue to grow, but that
  • an innate protective barrier still exists that prevents them from spreading and invading other organs,”

The researchers found that, like typical tumors,

  • metastasis-incompetent tumors also send out signaling molecules
  • that establish what is known as the “premetastatic niche” in distant organs.

These niches, composed of bone marrow cells and various growth factors, have been described previously by others, including Dr. Mittal, as the fertile “soil” in which the disseminated cancer cell “seeds” grow.

Weill Cornell’s Raúl Catena, Ph.D., a postdoctoral fellow in Dr. Mittal’s laboratory, found an important difference between the tumor types. Metastatic-incompetent tumors

  • systemically increased expression of Tsp-1, a molecule known to fight cancer growth.
  • The increased Tsp-1 production was found specifically in the bone marrow myeloid cells
  • that comprise the metastatic niche.

These results were striking because, Dr. Mittal says, for the first time

  • the bone marrow-derived myeloid cells were implicated as
  • the main producers of Tsp-1.

In addition, Weill Cornell and Harvard researchers found that

  • prosaposin secreted predominantly by the metastatic-incompetent tumors
  • increased expression of Tsp-1 in the premetastatic lungs.

Thus, Dr. Mittal posits that prosaposin works in combination with Tsp-1

  • to convert pro-metastatic bone marrow myeloid cells in the niche
  • into cells that are not hospitable to cancer cells that spread from a primary tumor.
  • “The very same myeloid cells in the niche that we know can promote metastasis
  • can also be induced under the command of the metastatic incompetent primary tumor to inhibit metastasis,”

The research team found that

  • the Tsp-1–inducing activity of prosaposin
  • was contained in only a 5-amino acid peptide region of the protein, and
  • this peptide alone induced Tsp-1 in the bone marrow cells and
  • effectively suppressed metastatic spread in the lungs
  • in mouse models of breast and prostate cancer.

This 5-amino acid peptide with Tsp-1–inducing activity

  • has the potential to be used as a therapeutic agent against metastatic cancer.

The scientists have begun to test prosaposin in other tumor types or metastatic sites.

Dr. Mittal says the clinical implications of the study are:

  • “Not only is it theoretically possible to design a prosaposin-based drug or drugs
  • that induce Tsp-1 to block cancer spread, but
  • you could potentially create noninvasive prognostic tests
  • to predict whether a cancer will metastasize.”

The study was reported in the April 30 issue of Cancer Discovery, in a paper titled “Bone Marrow-Derived Gr1+ Cells Can Generate a Metastasis-Resistant Microenvironment Via Induced Secretion of Thrombospondin-1”.

Disabling Enzyme Cripples Tumors, Cancer Cells

First Step of Metastasis

Published: Sep 05, 2013  http://www.technologynetworks.com/Metabolomics/news.aspx?id=157138

Knocking out a single enzyme dramatically cripples the ability of aggressive cancer cells to spread and grow tumors.

The paper, published in the journal Proceedings of the National Academy of Sciences, sheds new light on the importance of lipids, a group of molecules that includes fatty acids and cholesterol, in the development of cancer.

Researchers have long known that cancer cells metabolize lipids differently than normal cells. Levels of ether lipids – a class of lipids that are harder to break down – are particularly elevated in highly malignant tumors.

“Cancer cells make and use a lot of fat and lipids, and that makes sense because cancer cells divide and proliferate at an accelerated rate, and to do that,

  • they need lipids, which make up the membranes of the cell,”

said study principal investigator Daniel Nomura, assistant professor in UC Berkeley’s Department of Nutritional Sciences and Toxicology. “Lipids have a variety of uses for cellular structure, but what we’re showing with our study is that

  • lipids can send signals that fuel cancer growth.”

In the study, Nomura and his team tested the effects of reducing ether lipids on human skin cancer cells and primary breast tumors. They targeted an enzyme,

  • alkylglycerone phosphate synthase, or AGPS,
  • known to be critical to the formation of ether lipids.

The researchers confirmed that

  1. AGPS expression increased when normal cells turned cancerous.
  2. Inactivating AGPS substantially reduced the aggressiveness of the cancer cells.

“The cancer cells were less able to move and invade,” said Nomura.

The researchers also compared the impact of

  • disabling the AGPS enzyme in mice that had been injected with cancer cells.

Nomura observes: “Among the mice that had the AGPS enzyme inactivated,

  • the tumors were nonexistent.”

“The mice that did not have this enzyme disabled

  • rapidly developed tumors.”

The researchers determined that

  • inhibiting AGPS expression depleted the cancer cells of ether lipids.
  • Inhibiting AGPS also altered levels of other types of lipids important to the ability of the cancer cells to survive and spread, including
    • prostaglandins and acyl phospholipids.

“What makes AGPS stand out as a treatment target is that the enzyme seems to simultaneously

  • regulate multiple aspects of lipid metabolism
  • important for tumor growth and malignancy.”

Future steps include the

  • development of AGPS inhibitors for use in cancer therapy.

“This study sheds considerable light on the important role that AGPS plays in ether lipid metabolism in cancer cells, and it suggests that

  • inhibitors of this enzyme could impair tumor formation,”

said Benjamin Cravatt, Professor and Chair of Chemical Physiology at The Scripps Research Institute, who was not part of the UC Berkeley study.

Agilent Technologies Thought Leader Award Supports Translational Research Program
Published: Mon, March 04, 2013

The award will support Dr DePinho’s research into

  • metabolic reprogramming in the earliest stages of cancer.

Agilent Technologies Inc. announces that Dr. Ronald A. DePinho, a world-renowned oncologist and researcher, has received an Agilent Thought Leader Award.

DePinho is president of the University of Texas MD Anderson Cancer Center. DePinho and his team hope to discover and characterize

  • alterations in metabolic flux during tumor initiation and maintenance, and to identify biomarkers for early detection of pancreatic cancer together with
  • novel therapeutic targets.

Researchers on his team will work with scientists from the university’s newly formed Institute of Applied Cancer Sciences.

The Agilent Thought Leader Award provides funds to support personnel as well as a state-of-the-art Agilent 6550 iFunnel Q-TOF LC/MS system.

“I am extremely pleased to receive this award for metabolomics research, as the survival rates for pancreatic cancer have not significantly improved over the past 20 years,” DePinho said. “This technology will allow us to

  • rapidly identify new targets that drive the formation, progression and maintenance of pancreatic cancer.

Discoveries from this research will also lead to

  • the development of effective early detection biomarkers and novel therapeutic interventions.”

“We are proud to support Dr. DePinho’s exciting translational research program, which will make use of

  • metabolomics and integrated biology workflows and solutions in biomarker discovery,”

said Patrick Kaltenbach, Agilent vice president, general manager of the Liquid Phase Division, and the executive sponsor of this award.

The Agilent Thought Leader Program promotes fundamental scientific advances by support of influential thought leaders in the life sciences and chemical analysis fields.

The covalent modifier Nedd8 is critical for the activation of Smurf1 ubiquitin ligase in tumorigenesis

Ping Xie, Minghua Zhang, Shan He, Kefeng Lu, Yuhan Chen, Guichun Xing, et al.
Nature Communications
  2014; 5(3733).  http://dx.doi.org/10.1038/ncomms4733

Neddylation, the covalent attachment of the ubiquitin-like protein Nedd8 to members of the Cullin-RING E3 ligase family,

  • regulates their ubiquitylation activity.

However, regulation of HECT ligases by neddylation has not been reported to date. Here we show that

  • the C2-WW-HECT ligase Smurf1 is activated by neddylation.

Smurf1 physically interacts with

  1. Nedd8 and Ubc12,
  2. forms a Nedd8-thioester intermediate, and then
  3. catalyses its own neddylation on multiple lysine residues.

Intriguingly, this autoneddylation needs

  • an active site at C426 in the HECT N-lobe.

Neddylation of Smurf1 potently enhances

  • ubiquitin E2 recruitment and
  • augments the ubiquitin ligase activity of Smurf1.

The regulatory role of neddylation

  • is conserved in human Smurf1 and yeast Rsp5.

Furthermore, in human colorectal cancers,

  • the elevated expression of Smurf1, Nedd8, NAE1 and Ubc12
  • correlates with cancer progression and poor prognosis.

These findings provide evidence that

  • neddylation is important in HECT ubiquitin ligase activation and
  • shed new light on the tumour-promoting role of Smurf1.

Swinging domains in HECT E3

Subject terms: Biological sciences Cancer Cell biology

Figure 1: Smurf1 expression is elevated in colorectal cancer tissues.

(a) Smurf1 expression scores are shown as box plots, with the horizontal lines representing the median; the bottom and top of the boxes representing the 25th and 75th percentiles, respectively; and the vertical bars representing the ra…

Figure 2: Positive correlation of Smurf1 expression with Nedd8 and its interacting enzymes in colorectal cancer.

(a) Representative images from immunohistochemical staining of Smurf1, Ubc12, NAE1 and Nedd8 in the same colorectal cancer tumour. Scale bars, 100 μm. (b–d) The expression scores of Nedd8 (b, n=283), NAE1 (c, n=281) and Ubc12 (d, n=19…

Figure 3: Smurf1 interacts with Ubc12.

(a) GST pull-down assay of Smurf1 with Ubc12. Both input and pull-down samples were subjected to immunoblotting with anti-His and anti-GST antibodies. Smurf1 interacted with Ubc12 and UbcH5c, but not with Ubc9. (b) Mapping the regions…

Figure 4: Nedd8 is attached to Smurf1 through C426-catalysed autoneddylation.

(a) Covalent neddylation of Smurf1 in vitro. Purified His-Smurf1-WT or C699A proteins were incubated with Nedd8 and Nedd8-E1/E2. Reactions were performed as described in the Methods section. Samples were analysed by western blotting wi…

Figure 5: Neddylation of Smurf1 activates its ubiquitin ligase activity.

(a) In vivo Smurf1 ubiquitylation assay. Nedd8 was co-expressed with Smurf1 WT or C699A in HCT116 cells (left panels). Twenty-four hours post transfection, cells were treated with MG132 (20 μM, 8 h). HCT116 cells were transfected with…

The deubiquitylase USP33 discriminates between RALB functions in autophagy and innate immune response

M Simicek, S Lievens, M Laga, D Guzenko, VN. Aushev, et al.
Nature Cell Biology 2013; 15, 1220–1230    http://dx.doi.org/10.1038/ncb2847

The RAS-like GTPase RALB mediates cellular responses to nutrient availability or viral infection by respectively

  • engaging two components of the exocyst complex, EXO84 and SEC5.
  1. RALB employs SEC5 to trigger innate immunity signalling, whereas
  2. RALB–EXO84 interaction induces autophagocytosis.

How this differential interaction is achieved molecularly by the RAL GTPase remains unknown.

We found that whereas GTP binding

  • turns on RALB activity,

ubiquitylation of RALB at Lys 47

  • tunes its activity towards a particular effector.

Specifically, ubiquitylation at Lys 47

  • sterically inhibits RALB binding to EXO84, while
  • facilitating its interaction with SEC5.

Double-stranded RNA promotes

  • RALB ubiquitylation and
  • SEC5–TBK1 complex formation.

In contrast, nutrient starvation

  • induces RALB deubiquitylation
  • by accumulation and relocalization of the deubiquitylase USP33
  • to RALB-positive vesicles.

Deubiquitylated RALB

  • promotes the assembly of the RALB–EXO84–beclin-1 complexes
  • driving autophagosome formation. Thus,
  • ubiquitylation within the effector-binding domain
  • provides the switch for the dual functions of RALB in
    • autophagy and innate immune responses.

Part 5. Metabolic Syndrome

Single Enzyme is Necessary for Development of Diabetes

Published: Aug 20, 2014 http://www.technologynetworks.com/Metabolomics/news.aspx?ID=169416

12-LO enzyme promotes the obesity-induced oxidative stress in the pancreatic cells.

An enzyme called 12-LO promotes the obesity-induced oxidative stress in the pancreatic cells that leads

  • to pre-diabetes, and diabetes.

12-LO’s enzymatic action is the last step in

  • the production of certain small molecules that harm the cell,

according to a team from Indiana University School of Medicine, Indianapolis.

The findings will enable the development of drugs that can interfere with this enzyme, preventing or even reversing diabetes. The research is published ahead of print in the journal Molecular and Cellular Biology.

In earlier studies, these researchers and their collaborators at Eastern Virginia Medical School showed that

  • 12-LO (which stands for 12-lipoxygenase) is present in these cells
  • only in people who become overweight.

The harmful small molecules resulting from 12-LO’s enzymatic action are known as HETEs, short for hydroxyeicosatetraenoic acid.

  1. HETEs harm the mitochondria, which then
  2. fail to produce sufficient energy to enable
  3. the pancreatic cells to manufacture the necessary quantities of insulin.

For the study, the investigators genetically engineered mice that

  • lacked the gene for 12-LO exclusively in their pancreas cells.

Mice were either fed a low-fat or high-fat diet.

Both the control mice and the knockout mice on the high fat diet

  • developed obesity and insulin resistance.

The investigators also examined the pancreatic beta cells of both knockout and control mice, using both microscopic studies and molecular analysis. Those from the knockout mice were intact and healthy, while

  • those from the control mice showed oxidative damage,
  • demonstrating that 12-LO and the resulting HETEs
  • caused the beta cell failure.

Mirmira notes that the fatty diet used in the study was the Western diet, which comprises mostly saturated (“bad”) fats. Based partly on a recent study of related metabolic pathways, he says that

  • the unsaturated and mono-unsaturated fats, which comprise most fats in the healthy,
  • relatively high-fat Mediterranean diet, are unlikely to have the same effects.

“Our research is the first to show that 12-LO in the beta cell

  • is the culprit in the development of pre-diabetes, following high fat diets,” says Mirmira.

“Our work also lends important credence to the notion that

  • the beta cell is the primary defective cell in virtually all forms of diabetes and pre-diabetes.”

A New Player in Lipid Metabolism Discovered

Published: Aug 18, 2014  http://www.technologynetworks.com/Metabolomics/news.aspx?ID=169356

Specially engineered mice gained no weight, and normal counterparts became obese

  • on the same high-fat, obesity-inducing Western diet.

Specially engineered mice that lacked a particular gene did not gain weight

  • when fed a typical high-fat, obesity-inducing Western diet.

Yet, these mice ate the same amount as their normal counterparts that became obese.

The mice were engineered with fat cells that lacked a gene called SEL1L,

  • known to be involved in the clearance of mis-folded proteins
  • in the cell’s protein making machinery called the endoplasmic reticulum (ER).

When mis-folded proteins are not cleared but accumulate,

  • they destroy the cell and contribute to such diseases as
  1. mad cow disease,
  2. Type 1 diabetes and
  3. cystic fibrosis.

“The million-dollar question is why don’t these mice gain weight? Is this related to its inability to clear mis-folded proteins in the ER?” said Ling Qi, associate professor of molecular and biochemical nutrition and senior author of the study published online July 24 in Cell Metabolism. Haibo Sha, a research associate in Qi’s lab, is the paper’s lead author.

Interestingly, the experimental mice developed a host of other problems, including

  • postprandial hypertriglyceridemia,
  • and fatty livers.

“Although we are yet to find out whether these conditions contribute to the lean phenotype, we found that

  • there was a lipid partitioning defect in the mice lacking SEL1L in fat cells,
  • where fat cells cannot store fat [lipids], and consequently
  • fat goes to the liver.

During the investigation of possible underlying mechanisms, we discovered

  • a novel function for SEL1L as a regulator of lipid metabolism,” said Qi.

Sha said, “We were very excited to find that

  • SEL1L is required for the intracellular trafficking of
  • lipoprotein lipase (LPL), acting as a chaperone,”

and added that “Using several tissue-specific knockout mouse models,

  • we showed that this is a general phenomenon,”

Without LPL, lipids remain in the circulation;

  • fat and muscle cells cannot absorb fat molecules for storage and energy combustion.

People with LPL mutations develop

  • postprandial hypertriglyceridemia similar to
  • conditions found in fat cell-specific SEL1L-deficient mice, said Qi.

Future work will investigate the

  • role of SEL1L in human patients carrying LPL mutations and
  • determine why fat cell-specific SEL1L-deficient mice remain lean under Western diets, said Sha.

Co-authors include researchers from Cedars-Sinai Medical Center in Los Angeles; Wageningen University in the Netherlands; Georgia State University; University of California, Los Angeles; and the Medical College of Soochow University in China.

The study was funded by the U.S. National Institutes of Health, the Netherlands Organization for Health Research and Development, the Cedars-Sinai Medical Center, the Chinese National Science Foundation, the American Diabetes Association, Cornell’s Center for Vertebrate Genomics and the Howard Hughes Medical Institute.

Part 6. Biomarkers

Biomarkers Take Center Stage

Josh P. Roberts
GEN May 1, 2013 (Vol. 33, No. 9)  http://www.genengnews.com/

While work with biomarkers continues to grow, scientists are also grappling with research-related bottlenecks, such as

  1. affinity reagent development,
  2. platform reproducibility, and
  3. sensitivity.

Biomarkers by definition indicate some state or process that generally occurs

  • at a spatial or temporal distance from the marker itself, and

it would not be an exaggeration to say that biomedicine has become infatuated with them:

  1. where to find them,
  2. when they may appear,
  3. what form they may take, and
  4. how they can be used to diagnose a condition or
  5. predict whether a therapy may be successful.

Biomarkers are on the agenda of many if not most industry gatherings, and in cases such as Oxford Global’s recent “Biomarker Congress” and the GTC “Biomarker Summit”, they hold the naming rights. There, some basic principles were built upon, amended, and sometimes challenged.

In oncology, for example, biomarker discovery is often predicated on the premise that

  • proteins shed from a tumor will traverse to and persist in, and be detectable in, the circulation.

By quantifying these proteins—singularly or as part of a larger “signature”—the hope is

  1. to garner information about the molecular characteristics of the cancer
  2. that will help with cancer detection and
  3. personalization of the treatment strategy.

Yet this approach has not yet turned into the panacea that was hoped for. Bottlenecks exist in

  • affinity reagent development,
  • platform reproducibility, and
  • sensitivity.

There is also a dearth of understanding of some of the

  • fundamental principles of biomarker biology that we need to know the answers to,

said Parag Mallick, Ph.D., whose lab at Stanford University is “working on trying to understand where biomarkers come from.”

There are dogmas saying that

  • circulating biomarkers come solely from secreted proteins.

But Dr. Mallick’s studies indicate that fully

  • 50% of circulating proteins may come from intracellular sources or
  • proteins that are annotated as such.

“We don’t understand the processes governing

  • which tumor-derived proteins end up in the blood.”

Other questions include “how does the size of a tumor affect how much of a given protein will be in the blood?”—perhaps

  • the tumor is necrotic at the center, or
  • it’s hypervascular or hypovascular.

He points out, “The problem is that these are highly nonlinear processes at work, and

  • there is a large number of factors that might affect the answer to that question.”

Their research focuses on using

  1. mass spectrometry and
  2. computational analysis
  • to characterize the biophysical properties of the circulating proteome, and
  • relate these to measurements made of the tumor itself.

Furthermore, he said, “We’ve observed that the proteins that are likely to

  • first show up and persist in the circulation
  • are more stable than proteins that don’t,” and
  • “we can quantify how significant the effect is.”

The goal is ultimately to be able to

  1. build rigorous, formal mathematical models that will allow something measured in the blood
  2. to be tied back to the molecular biology taking place in the tumor.

And conversely, to use those models

  • to predict from a tumor what will be found in the circulation.

“Ultimately, the models will allow you to connect the dots between

  • what you measure in the blood and the biology of the tumor.”
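The kind of model Dr. Mallick describes can be illustrated with a deliberately simple one-compartment sketch (all parameter values and function names here are hypothetical, chosen only to show the shape of the calculation, not his group's actual model): a protein is shed into the blood in proportion to tumor volume and cleared at a first-order rate, so its steady-state plasma concentration ties a blood measurement back to tumor burden, and the model can be inverted in either direction.

```python
# Toy one-compartment model linking tumor burden to a circulating
# protein level: dC/dt = k_shed * V - k_clear * C.
# At steady state, C = k_shed * V / k_clear.
# All parameter values are hypothetical illustrations.

def steady_state_conc(tumor_volume_ml, k_shed=0.5, k_clear=2.0):
    """Steady-state plasma concentration (arbitrary units):
    shedding (k_shed per mL tumor per hour) balanced by
    first-order clearance (k_clear per hour)."""
    return k_shed * tumor_volume_ml / k_clear

def tumor_volume_from_conc(conc, k_shed=0.5, k_clear=2.0):
    """Invert the model: infer tumor volume from a blood measurement."""
    return conc * k_clear / k_shed

c = steady_state_conc(10.0)    # 10 mL tumor -> C = 2.5
v = tumor_volume_from_conc(c)  # recovers 10.0 mL
print(c, v)
```

As the article stresses, the real processes are highly nonlinear (necrosis, vascularity, protein stability), which is exactly why rigorous models must be fit to paired tumor and circulating-proteome measurements rather than assumed.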

Bound for Affinity Arrays

Affinity reagents are the main tools for large-scale protein biomarker discovery. And while this has tended to mean antibodies (or their derivatives), other affinity reagents are demanding a place in the toolbox.

Affimers, a type of affinity reagent being developed by Avacta, consist of

  1. a biologically inert, biophysically stable protein scaffold
  2. containing three variable regions into which
  3. distinct peptides are inserted.

The resulting three-dimensional surface formed by these peptides

  • interacts and binds to proteins and other molecules in solution,
  • much like the antigen-binding site of antibodies.

Unlike antibodies, Affimers are relatively small (13 kDa),

  • non-post-translationally modified proteins
  • that can readily be expressed in bacterial culture.

They may be made to bind surfaces through unique residues

  • engineered onto the opposite face of the Affimer,
  • allowing the binding site to be exposed to the target in solution.

“We don’t seem to see in what we’ve done so far

  • any real loss of activity or functionality of Affimers when bound to surfaces—

they’re very robust,” said CEO Alastair Smith, Ph.D.

Avacta is taking advantage of this stability and its large libraries of Affimers to develop

  • very large affinity microarrays for
  • drug and biomarker discovery.

To date they have printed arrays with around 20–25,000 features, and Dr. Smith is “sure that we can get toward about 50,000 on a slide,” he said. “There’s no real impediment to us doing that other than us expressing the proteins and getting on with it.”

Customers will be provided with these large, complex “naïve” discovery arrays, readable with standard equipment. The plan is for the company to then “support our customers by providing smaller arrays with

  • the Affimers that are binding targets of interest to them,” Dr. Smith foretold.

And since the intellectual property rights are unencumbered,

  • Affimers in those arrays can be licensed to the end users
  • to develop diagnostics that can be validated as time goes on.

Around 20,000-Affimer discovery arrays were recently tested by collaborator Professor Ann Morgan of the University of Leeds with pools of unfractionated serum from patients with symptoms of inflammatory disease. The arrays

  • “rediscovered” elevated C-reactive protein (CRP, the clinical gold-standard marker)
  • as well as uncovered an additional 22 candidate biomarkers.
  • Other candidates, combined with CRP, appear able to distinguish between different diseases such as
  1. rheumatoid arthritis,
  2. psoriatic arthritis,
  3. SLE, or
  4. giant cell arteritis.

Epigenetic Biomarkers


Sometimes biomarkers are used not to find disease but

  • to distinguish healthy human cell types, with
  •  examples being found in flow cytometry and immunohistochemistry.

These widespread applications, however, are difficult to standardize, being

  • subject to arbitrary or subjective gating protocols and other imprecise criteria.

Epiontis instead uses an epigenetic approach. “What we need is a unique marker that is

  • demethylated only in one cell type and
  • methylated in all the other cell types,”

Each cell of the right cell type will have

  • two demethylated copies of a certain gene locus,
  • allowing them to be enumerated by quantitative PCR.
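The counting logic behind this qPCR enumeration is simple, and can be sketched as follows (the function name and copy numbers are invented for illustration): each diploid cell carries two copies of the marker locus, and only cells of the target type carry both copies demethylated, so the ratio of demethylated copies to total genome copies gives the cell-type fraction directly.

```python
def cell_type_fraction(demethylated_copies, total_genome_copies):
    """Fraction of cells of the target type in a sample.

    Each diploid cell carries two copies of the marker locus; only
    cells of the target type have those copies demethylated, so
    demethylated copies / total genome copies = target-cell fraction.
    """
    if total_genome_copies <= 0:
        raise ValueError("total_genome_copies must be positive")
    return demethylated_copies / total_genome_copies

# Example: 1,200 demethylated marker copies measured against 24,000
# total genome copies implies 5% of cells are the target type.
print(cell_type_fraction(1200, 24000))  # 0.05
```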

The biggest challenge is finding that unique epigenetic marker. To do so they look through the literature for proteins and genes described as playing a role in the cell type’s biology, and then

  • look at the methylation patterns to see if one can be used as a marker.

They also “use customized Affymetrix chips to look at the

  • differential epigenetic status of different cell types on a genomewide scale,”

explained CBO and founder Ulrich Hoffmueller, Ph.D.

The company currently has a panel of 12 assays for 12 immune cell types. Among these is an assay for

  • regulatory T (Treg) cells that queries the Foxp3 gene—which is uniquely demethylated in Treg
  • even though it is transiently expressed in activated T cells of other subtypes.

Also assayed are Th17 cells, difficult to detect by flow cytometry because

  • “the cells have to be stimulated in vitro,” he pointed out.

Developing New Assays for Cancer Biomarkers

Researchers at Myriad RBM and the Cancer Prevention Research Institute of Texas are collaborating to develop

  • new assays for cancer biomarkers on the Myriad RBM Multi-Analyte Profile (MAP) platform.

The release of OncologyMAP 2.0 expanded Myriad RBM’s biomarker menu to over 250 analytes, which can be measured from a single small sample, according to the company. Using this menu, L. Stephen et al. published a poster, “Analysis of Protein Biomarkers in Prostate and Colorectal Tumor Lysates,” which showed the results of

  • a survey of proteins relevant to colorectal (CRC) and prostate (PC) tumors
  • to identify potential proteins of interest for cancer research.

The study looked at CRC and PC tumor lysates and found that 102 of the 115 proteins showed levels above the lower limit of quantification.

  • Four markers were significantly higher in PC and 10 were greater in CRC.

For most of the analytes, duplicate sections of the tumor were similar, although some analytes did show differences; among the CRC samples, tumor number 4 showed differences for CEA and tumor number 2 for uPA.

Thirty analytes were shown to be

  • different in CRC tumor compared to its adjacent tissue.
  • Ten of the analytes were higher in adjacent tissue compared to CRC.
  • Eighteen of the markers examined demonstrated significant correlations of CRC tumor concentration to serum levels.

This suggests that the OncologyMAP 2.0 platform “provides a good method for studying changes in tumor levels because many proteins can be assessed with a very small sample.”
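A tumor-versus-serum correlation screen of the kind reported here can be sketched in a few lines (the paired analyte values below are synthetic, and a real analysis would also correct for multiple testing across the marker panel):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)

# Synthetic paired measurements for one analyte across five patients:
tumor_levels = [1.2, 3.4, 2.1, 5.6, 4.3]
serum_levels = [0.4, 1.1, 0.7, 1.9, 1.5]
r = pearson_r(tumor_levels, serum_levels)
print(round(r, 3))  # strongly positive for these synthetic data
```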

Clinical Test Development with MALDI-ToF

While there have been many attempts to translate results from early discovery work on the serum proteome into clinical practice, few of these efforts have progressed past the discovery phase.

Matrix-assisted laser desorption/ionization-time of flight (MALDI-ToF) mass spectrometry on unfractionated serum/plasma samples offers many practical advantages over alternative techniques, and does not require

  • a shift from discovery to development and commercialization platforms.

Biodesix claims it has been able to develop the technology into

  • a reproducible, high-throughput tool to
  • routinely measure protein abundance from serum/plasma samples.

“We improved data-analysis algorithms to

  • reproducibly obtain quantitative measurements of relative protein abundance from MALDI-ToF mass spectra.”

Heinrich Röder, CTO, points out that the MALDI-ToF measurements

  • are combined with clinical outcome data using
  • modern learning theory techniques
  • to define specific disease states
  • based on a patient’s serum protein content,”

The clinical utility of the identification of these disease states can be investigated through a retrospective analysis of differing sample sets. For example, Biodesix clinically validated its first commercialized serum proteomic test, VeriStrat®, in 85 different retrospective sample sets.

Röder adds that “It is becoming increasingly clear that

  • the patients whose serum is characterized as VeriStrat Poor show
  • consistently poor outcomes irrespective of
  1. tumor type,
  2. histology, or
  3. molecular tumor characteristics,”

MALDI-ToF mass spectrometry, in its standard implementation,

  • allows for the observation of around 100 mostly high-abundant serum proteins.

Further, “while this does not limit the usefulness of tests developed from differential expression of these proteins,

  • the discovery potential would be greatly enhanced
  • if we could probe deeper into the proteome
  • while not giving up the advantages of the MALDI-ToF approach,”

Biodesix reports that its new MALDI approach, Deep MALDI™, can perform

  • simultaneous quantitative measurement of more than 1,000 serum protein features (or peaks) from 10 µL of serum in a high-throughput manner.
  • It increases the observable signal-to-noise ratio from a few hundred to over 50,000,
  • resulting in the observation of many lower-abundance serum proteins.
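One generic route to signal-to-noise gains of this kind is spectrum averaging: if noise is independent from laser shot to laser shot, averaging N spectra leaves the peak height unchanged while shrinking the noise by a factor of sqrt(N). The simulation below is a minimal sketch of that principle on synthetic spectra only, not a description of Biodesix’s actual Deep MALDI processing:

```python
import random
from math import sqrt

random.seed(0)

def snr(spectrum, peak_idx, noise_idx):
    """Crude SNR: peak height over RMS of a noise-only region."""
    noise = [spectrum[i] for i in noise_idx]
    rms = sqrt(sum(v * v for v in noise) / len(noise))
    return spectrum[peak_idx] / rms

def make_shot():
    """One synthetic spectrum: a peak of height 5 at index 50, unit noise."""
    s = [random.gauss(0.0, 1.0) for _ in range(100)]
    s[50] += 5.0
    return s

def average_shots(n):
    """Average n synthetic single-shot spectra."""
    acc = [0.0] * 100
    for _ in range(n):
        for i, v in enumerate(make_shot()):
            acc[i] += v
    return [v / n for v in acc]

noise_region = range(0, 40)
snr_1 = snr(average_shots(1), 50, noise_region)
snr_400 = snr(average_shots(400), 50, noise_region)
print(snr_1, snr_400)  # SNR grows roughly with sqrt(number of shots)
```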

Breast cancer is now considered to be a collection of many complexes of symptoms and signatures—the dominant subtypes are labeled Luminal A, Luminal B, Her2, and Basal—each suggesting a different prognosis, yet

  • these labels are considered too simplistic for understanding and managing a woman’s cancer.

Studies published in the past year have looked at

  1. somatic mutations,
  2. gene copy number aberrations,
  3. gene expression abnormalities,
  4. protein and miRNA expression, and
  5. DNA methylation,

coming up with a list of significantly mutated genes—hot spots—in different categories of breast cancers. Targeting these will inevitably be the focus of much coming research.

“We’ve been taking these large trials and profiling these on a variety of array or sequence platforms. We think we’ll get

  1. prognostic drivers
  2. predictive markers for taxanes and
  3. monoclonal antibodies and
  4. tamoxifen and aromatase inhibitors,”
    explained Brian Leyland-Jones, Ph.D., director of Edith Sanford Breast Cancer Research. “We will end up with 20–40 different diseases, maybe more.”

Edith Sanford Breast Cancer Research is undertaking a pilot study in collaboration with The Scripps Research Institute, using a variety of tests on 25 patients to see how the information they provide complements each other, the overall flow, and the time required to get and compile results.

Laser-captured tumor samples will be subjected to low passage whole-genome, exome, and RNA sequencing (with targeted resequencing done in parallel), and reverse-phase protein and phosphorylation arrays, with circulating nucleic acids and circulating tumor cells being queried as well. “After that we hope to do a 100- or 150-patient trial when we have some idea of the best techniques,” he said.

Dr. Leyland-Jones predicted that ultimately most tumors will be found

  • to have multiple drivers,
  • with most patients receiving a combination of two, three, or perhaps four different targeted therapies.

Reduce to Practice

According to Randox, the Evidence Investigator is a sophisticated semi-automated biochip system designed for research, clinical, forensic, and veterinary applications.

Once biomarkers that may have an impact on therapy are discovered, moving them into clinical practice is not always routine. Leaving aside regulatory, financial, intellectual property, and cultural issues, developing a diagnostic based on a biomarker often requires expertise or patience that its discoverer may not possess.

Andrew Gribben is a clinical assay and development scientist at Randox Laboratories, based in Northern Ireland, U.K. The company utilizes academic and industrial collaborators together with in-house discovery platforms to identify biomarkers that are

  • augmented or diminished in a particular pathology
  • relative to appropriate control populations.

Biomarkers can be developed to be run individually or

  • combined into panels of immunoassays on its multiplex biochip array technology.

Specificity can also be gained—or lost—through the affinity of the reagents in an assay. The diagnostic potential of heart-type fatty acid binding protein (H-FABP), abundantly expressed in human myocardial cells, was recognized by Jan Glatz of Maastricht University, The Netherlands, back in 1988. Levels rise quickly within 30 minutes after a myocardial infarction, peak at 6–8 hours, and return to normal within 24–30 hours. Yet at the time it was not known that H-FABP was a member of a multiprotein family with which the polyclonal antibodies being used in development of an assay were cross-reacting, Gribben related.

Randox developed monoclonal antibodies specific to H-FABP and funded trials investigating its use alone and multiplexed with cardiac biomarker assays. In 2011, more than 20 years after the biomarker was identified, the company released a validated assay for H-FABP as a biomarker for early detection of acute myocardial infarction.

Ultrasensitive Immunoassays for Biomarker Development

Research has shown that detection and monitoring of biomarker concentrations can provide

  • insights into disease risk and progression.

Cytokines have become attractive biomarkers and candidates

  • for targeted therapies for a number of autoimmune diseases, including rheumatoid arthritis (RA), Crohn’s disease, and psoriasis, among others.

However, due to the low abundance of circulating cytokines, such as IL-17A, obtaining robust measurements in clinical samples has been difficult.

Singulex reports that its digital single-molecule counting technology provides

  • increased precision and detection sensitivity over traditional ELISA techniques,
  • helping to shed light on biomarker verification and validation programs.

The company’s Erenna® immunoassay system, which includes optimized immunoassays, offers LLoQ resolution down to femtogram-per-mL levels—even in healthy populations—an improvement of 1–3 logs over standard ELISAs or any conventional technology, with a dynamic range of up to 4 logs, according to a Singulex official, who adds that

  • this sensitivity improvement helps minimize undetectable samples that
  • could otherwise delay or derail clinical studies.

The official also explains that the Singulex solution includes an array of products and services that are being applied to a number of programs and have enabled the development of clinically relevant biomarkers, allowing translation from discovery to the clinic.

In a poster entitled “Advanced Single Molecule Detection: Accelerating Biomarker Development Utilizing Cytokines through Ultrasensitive Immunoassays,” a case study was presented of work performed by Jeff Greenberg of NYU to show how the use of the Erenna system can provide insights toward

  • improving the clinical utility of biomarkers and
  • accelerating the development of novel therapies for treating inflammatory diseases.

A panel of inflammatory biomarkers was examined in DMARD (disease modifying antirheumatic drugs)-naïve RA (rheumatoid arthritis) vs. knee OA (osteoarthritis) patient cohorts. Markers that exhibited significant differences in plasma concentrations between the two cohorts included

  • CRP, IL-6R alpha, IL-6, IL-1 RA, VEGF, TNF-RII, IL-17A, IL-17F, and IL-17A/F.

Among the three tested isoforms of IL-17,

  • the magnitude of elevation for IL-17F in RA patients was the highest.

“Singulex provides high-resolution monitoring of baseline IL-17A concentrations that are present at low levels,” concluded the researchers. “The technology also enabled quantification of other IL-17 isoforms in RA patients, which have not been well characterized before.”

The Singulex Erenna System has also been applied to cardiovascular disease research, for which its

  • cardiac troponin I (cTnI) digital assay can be used to measure circulating
  • levels of cTnI undetectable by other commercial assays.

Recently presented data from Brigham and Women’s Hospital and the TIMI-22 study showed that

  • using the Singulex test to serially monitor cTnI helps
  • stratify risk in post-acute coronary syndrome patients and
  • can identify patients with elevated cTnI
  • who have the most to gain from intensive vs. moderate-dose statin therapy,

according to the scientists involved in the research.

The study poster, “Prognostic Performance of Serial High Sensitivity Cardiac Troponin Determination in Stable Ischemic Heart Disease: Analysis From PROVE IT-TIMI 22,” was presented at the 2013 American College of Cardiology (ACC) Annual Scientific Session & Expo by R. O’Malley et al.

Biomarkers Changing Clinical Medicine

Better Diagnosis, Prognosis, and Drug Targeting Are among Potential Benefits

John Morrow Jr., Ph.D.

Researchers at EMD Chemicals are developing biomarker immunoassays

  • to monitor drug-induced toxicity including kidney damage.

The pace of biomarker development is accelerating as investigators report new studies on cancer, diabetes, Alzheimer disease, and other conditions in which the evaluation and isolation of workable markers is prominently featured.

Wei Zheng, Ph.D., leader of the R&D immunoassay group at EMD Chemicals, is overseeing a program to develop biomarker immunoassays to

  • monitor drug-induced toxicity, including kidney damage.

“One of the principal reasons for drugs failing during development is organ toxicity,” says Dr. Zheng. “Proteins liberated into the serum and urine can serve as biomarkers of adverse response to drugs, as well as of disease states.”

Through collaborative programs with Rules-Based Medicine (RBM), the EMD group has released panels for the profiling of human renal impairment and renal toxicity. These urinary biomarker-based products fit the FDA and EMEA guidelines for assessment of drug-induced kidney damage in rats.

The group recently performed a screen for potential protein biomarkers in relation to

  • kidney toxicity/damage on a set of urine and plasma samples
  • from patients with documented renal damage.

Additionally, Dr. Zheng is directing efforts to move forward with the multiplexed analysis of

  • organ and cellular toxicity.

Diseases thought to involve compromised oxidative phosphorylation include

  • diabetes, Parkinson and Alzheimer diseases, cancer, and the aging process itself.

Good biomarkers allow Dr. Zheng to follow the mantra, “fail early, fail fast.” With robust, multiplexible biomarkers, EMD can detect bad drugs early and kill them before they move into costly large animal studies and clinical trials. “Recognizing the severe liability that toxicity presents, we can modify the structure of the candidate molecule and then rapidly reassess its performance.”

Scientists at Oncogene Science, a division of Siemens Healthcare Diagnostics, are also focused on biomarkers. “We are working on a number of antibody-based tests for various cancers, including a test for the Ca-9 CAIX protein, also referred to as carbonic anhydrase IX,” Walter Carney, Ph.D., head of the division, states.

CAIX is a transmembrane protein that is

  • overexpressed in a number of cancers, and, like Herceptin and the Her-2 gene,
  • can serve as an effective and specific marker for both diagnostic and therapeutic purposes.
  • It is liberated into the circulation in proportion to the tumor burden.

Dr. Carney and his colleagues are evaluating patients after tumor removal for the presence of the Ca-9 CAIX protein. If

  • the levels of the protein in serum increase over time,
  • this suggests that not all the tumor cells were removed and the tumor has metastasized.

Dr. Carney and his team have developed both an immuno-histochemistry and an ELISA test that could be used as companion diagnostics in clinical trials of CAIX-targeted drugs.

The ELISA for the Ca-9 CAIX protein will be used in conjunction with Wilex’ Rencarex®, which is currently in a

  • Phase III trial as an adjuvant therapy for non-metastatic clear cell renal cancer.

Additionally, Oncogene Science has in its portfolio an FDA-approved test for the Her-2 marker. Originally approved for Her-2/Neu-positive breast cancer, it has seen its indications expand over time, and it was approved

  • for use in the treatment of gastric cancer last year.

Her-2 is normally present on breast epithelial cells but

  • overexpressed in some breast tumors.

“Our products are designed to be used in conjunction with targeted therapies,” says Dr. Carney. “We are working with companies that are developing technology around proteins that are

  • overexpressed in cancerous tissues and can be both diagnostic and therapeutic targets.”

The long-term goal of these studies is to develop individualized therapies, tailored for the patient. Since the therapies are expensive, accurate diagnostics are critical to avoid wasting resources on patients who clearly will not respond (or could be harmed) by the particular drug.

“At this time the rate of response to antibody-based therapies may be very poor, as

  • they are often employed late in the course of the disease, and patients are in such a debilitated state
  • that they lack the capacity to react positively to the treatment,” Dr. Carney explains.

Nanoscale Real-Time Proteomics

Stanford University School of Medicine researchers, working with Cell BioSciences, have developed a

  • nanofluidic proteomic immunoassay that measures protein charge,
  • similar to immunoblots, mass spectrometry, or flow cytometry.
  • Unlike these platforms, this approach can measure the amount of individual isoforms,
  • specifically, phosphorylated molecules.

“We have developed a nanoscale device for protein measurement, which I believe could be useful for clinical analysis,” says Dean W. Felsher, M.D., Ph.D., associate professor at Stanford University School of Medicine.

Critical oncogenic transformations involving

  • the activation of the signal-related kinases ERK-1 and ERK-2 can now be followed with ease.

“The fact that we measure nanoquantities with accuracy means that

  • we can interrogate proteomic profiles in clinical patients,

by drawing tiny needle aspirates from tumors over the course of time,” he explains.

“This allows us to observe the evolution of tumor cells and

  • their response to therapy
  • from a baseline of the normal tissue as a standard of comparison.”

According to Dr. Felsher, 20 cells is a large enough sample to obtain a detailed description. The technology is easy to automate, which allows

  • the inclusion of hundreds of assays.

Contrasting this technology platform with proteomic analysis using microarrays, Dr. Felsher notes that the latter is not yet workable for revealing reliable markers.

Dr. Felsher and his group published a description of this technology in Nature Medicine. “We demonstrated that we could take a set of human lymphomas and distinguish them from both normal tissue and other tumor types. We can

  • quantify changes in total protein, protein activation, and relative abundance of specific phospho-isoforms
  • from leukemia and lymphoma patients receiving targeted therapy.

Even with very small numbers of cells, we are able to show that the results are consistent, and

  • our sample is a random profile of the tumor.”

Splice Variant Peptides

“Aberrations in alternative splicing may generate

  • much of the variation we see in cancer cells,”

says Gilbert Omenn, Ph.D., director of the center for computational medicine and bioinformatics at the University of Michigan School of Medicine. Dr. Omenn and his colleague, Rajasree Menon, are

  • using this variability as a key to new biomarker identification.

It is becoming evident that splice variants play a significant role in the properties of cancer cells, including

  • initiation, progression, cell motility, invasiveness, and metastasis.

Alternative splicing occurs through multiple mechanisms

  • when the exons or coding regions of the DNA are transcribed into mRNA,
  • generating alternative initiation sites and exon connections in the protein products.

Their translation into protein can result in numerous protein isoforms, and

  • these isoforms may reflect a diseased or cancerous state.

Regulatory elements within the DNA are responsible for selecting different alternatives; thus

  • the splice variants are tempting targets for exploitation as biomarkers.

Analyses of the splice-site mutation


Despite the many questions raised by these observations, splice variation in tumor material has not been widely studied. Cancer cells are known for their tremendous variability, which allows them to

  • grow rapidly, metastasize, and develop resistance to anticancer drugs.

Dr. Omenn and his collaborators used

  • mass spec data to interrogate a custom-built database of all potential mRNA sequences
  • to find alternative splice variants.

When they compared normal and malignant mammary gland tissue from a mouse model of Her2/Neu human breast cancers, they identified a vast number (608) of splice variant proteins, of which

  • peptides from 216 were found only in the tumor sample.
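The tumor-only comparison described above can be sketched as a toy in-silico digest: build the tryptic peptide set for each isoform collection and keep the peptides unique to the tumor set. The sequences and the simplified cleavage rule below are illustrative assumptions, not data or code from the Omenn study.

```python
# Illustrative sketch (not the study's pipeline): nominate splice-variant
# biomarker candidates as tryptic peptides seen only in tumor isoforms.

def tryptic_peptides(protein, min_len=6):
    """Simplified in-silico trypsin digest: cleave after K or R (the real
    rule also skips cleavage before proline) and keep peptides of
    informative length."""
    peptides, current = set(), []
    for aa in protein:
        current.append(aa)
        if aa in "KR":
            if len(current) >= min_len:
                peptides.add("".join(current))
            current = []
    if len(current) >= min_len:       # C-terminal peptide
        peptides.add("".join(current))
    return peptides

def tumor_only_peptides(normal_isoforms, tumor_isoforms):
    """Peptides detectable only in the tumor isoform set."""
    normal = set().union(*(tryptic_peptides(s) for s in normal_isoforms))
    tumor = set().union(*(tryptic_peptides(s) for s in tumor_isoforms))
    return tumor - normal

# Hypothetical isoforms: the tumor variant retains an extra exon segment.
normal = ["MKTAYIAKQRQISFVK"]
tumor = ["MKTAYIAKGGNEWVLSRQRQISFVK"]
print(sorted(tumor_only_peptides(normal, tumor)))  # → ['GGNEWVLSR']
```

In the real workflow the direction is reversed: observed MS/MS peptides are matched against a database of all potential splice-variant sequences, but the set-difference logic that flags tumor-only peptides is the same.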

“These novel and known alternative splice isoforms

  • are detectable both in tumor specimens and in plasma and
  • represent potential biomarker candidates,” Dr. Omenn adds.

Dr. Omenn’s observations and those of his colleague Lewis Cantley, Ph.D., have also

  • shed light on the origins of the classic Warburg effect,
  • the shift to anaerobic glycolysis in tumor cells.

The novel splice variant M2 of muscle pyruvate kinase

  • is observed in embryonic and tumor tissue.

It is associated with this shift, the result of

  • the expression of a peptide splice variant sequence.

It is remarkable how many different areas of the life sciences are tied into the phenomenon of splice variation. The changes in the genetic material can be much greater than point mutations, which have been traditionally considered to be the prime source of genetic variability.

“We now have powerful methods available to uncover a whole new category of variation,” Dr. Omenn says. “High-throughput RNA sequencing and proteomics will be complementary in discovery studies of splice variants.”

Splice variation may play an important role in rapid evolutionary changes, of the sort discussed by Susumu Ohno and Stephen J. Gould decades ago. They, and other evolutionary biologists, argued that

  • gene duplication, combined with rapid variability, could fuel major evolutionary jumps.

At the time, the molecular mechanisms of variation were poorly understood, but today

  • the tools are available to rigorously evaluate the role of
  • splice variation and other contributors to evolutionary change.

“Biomarkers derived from studies of splice variants, could, in the future, be exploited

  • both for diagnosis and prognosis and
  • for drug targeting of biological networks,
  • in situations such as the Her-2/Neu breast cancers,” Dr. Omenn says.

Aminopeptidase Activities

“By correlating the proteolytic patterns with disease groups and controls, we have shown that

  • exopeptidase activities contribute to the generation of not only cancer-specific
  • but also cancer type–specific serum peptides,”

according to Paul Tempst, Ph.D., professor and director of the Protein Center at the Memorial Sloan-Kettering Cancer Center.

“So there is a direct link between peptide marker profiles of disease and differential protease activity.” For this reason, Dr. Tempst argues that “the patterns we describe may have value as surrogate markers for detection and classification of cancer.”

To investigate this avenue, Dr. Tempst and his colleagues have followed

  • the relationship between exopeptidase activities and metastatic disease.

“We monitored controlled, de novo peptide breakdown in large numbers of biological samples using mass spectrometry, with relative quantitation of the metabolites,” Dr. Tempst explains. This entailed the use of magnetic, reverse-phase beads for analyte capture and a MALDI-TOF MS read-out.

“In biomarker discovery programs, functional proteomics is usually not pursued,” says Dr. Tempst. “For putative biomarkers, one may observe no difference in quantitative levels of proteins, while at the same time, there may be substantial differences in enzymatic activity.”

In a preliminary prostate cancer study, the team found a significant difference

  • in activity levels of exopeptidases in serum from patients with metastatic prostate cancer
  • as compared to primary tumor-bearing individuals and normal healthy controls.

However, there were no differences in amounts of the target protein, and this potential biomarker would have been missed if quantitative levels of protein had been the only criterion of selection.

It is frequently stated that “practical fusion energy is 30 years in the future and always will be.” The same might be said of functional, practical biomarkers that can pass muster with the FDA. But splice variation represents a new handle on this vexing problem. It appears that we are seeing the emergence of a new approach that may finally yield definitive diagnostic tests, detectable in serum and urine samples.

Part 7. Epigenetics and Drug Metabolism

DNA Methylation Rules: Studying Epigenetics with New Tools

The tools to unravel the epigenetic control mechanisms that influence how cells control access of transcriptional proteins to DNA are just beginning to emerge.

Patricia Fitzpatrick Dimond, Ph.D.

http://www.genengnews.com/media/images/AnalysisAndInsight/Feb7_2013_24454248_GreenPurpleDNA_EpigeneticsToolsII3576166141.jpg

New tools may help move the field of epigenetic analysis forward and potentially unveil novel biomarkers for cellular development, differentiation, and disease.

DNA sequencing has had the power of technology behind it as novel platforms to produce more sequencing faster and at lower cost have been introduced. But the tools to unravel the epigenetic control mechanisms that influence how cells control access of transcriptional proteins to DNA are just beginning to emerge.

Among these mechanisms, DNA methylation, or the enzymatically mediated addition of a methyl group to cytosine or adenine nucleotides,

  • serves as an inherited epigenetic modification that
  • stably modifies gene expression in dividing cells.

The unique methylomes are largely maintained in differentiated cell types, making them critical to understanding the differentiation potential of the cell.

In the DNA methylation process, cytosine residues in the genome are enzymatically modified to 5-methylcytosine,

  • which participates in transcriptional repression of genes during development and disease progression.

5-methylcytosine can be further enzymatically modified to 5-hydroxymethylcytosine by the TET family of methylcytosine dioxygenases. DNA methylation affects gene transcription by physically

  • interfering with the binding of proteins involved in gene transcription.

Methylated DNA may be bound by methyl-CpG-binding domain proteins (MBDs) that can

  • then recruit additional proteins. Some of these include histone deacetylases and other chromatin remodeling proteins that modify histones, thereby
  • forming compact, inactive chromatin, or heterochromatin.

While DNA methylation doesn’t change the genetic code,

  • it influences chromosomal stability and gene expression.

Epigenetics and Cancer Biomarkers

multistage chemical carcinogenesis


And because of the increasing recognition that DNA methylation changes are involved in human cancers, scientists have suggested that these epigenetic markers may provide biological markers for cancer cells, and eventually point toward new diagnostic and therapeutic targets. Cancer cell genomes display genome-wide abnormalities in DNA methylation patterns,

  • some of which are oncogenic and contribute to genome instability.

In particular, de novo methylation of tumor suppressor gene promoters

  • occurs frequently in cancers, thereby silencing them and promoting transformation.

Cytosine hydroxymethylation (5-hydroxymethylcytosine, or 5hmC), the aforementioned DNA modification resulting from the enzymatic conversion of 5mC into 5-hydroxymethylcytosine by the TET family of oxygenases, has been identified

  • as another key epigenetic modification marking genes important for
  • pluripotency in embryonic stem cells (ES), as well as in cancer cells.

The base 5-hydroxymethylcytosine was recently identified as an oxidation product of 5-methylcytosine in mammalian DNA. In 2011, using sensitive and quantitative methods to assess levels of 5-hydroxymethyl-2′-deoxycytidine (5hmdC) and 5-methyl-2′-deoxycytidine (5mdC) in genomic DNA, scientists at the Department of Cancer Biology, Beckman Research Institute of the City of Hope, Duarte, California, investigated

  • whether levels of 5hmC can distinguish normal tissue from tumor tissue.

They showed that in squamous cell lung cancers, levels of 5hmdC showed

  • up to five-fold reduction compared with normal lung tissue.

In brain tumors, 5hmdC showed an even more drastic reduction,

  • with levels up to more than 30-fold lower than in normal brain,
  • but 5hmdC levels were independent of mutations in isocitrate dehydrogenase-1 (IDH1), whose tumor-associated mutant forms produce an inhibitor of the TET enzymes.

Immunohistochemical analysis indicated that 5hmC is “remarkably depleted” in many types of human cancer.

  • There was an inverse relationship between 5hmC levels and cell proliferation, with lack of 5hmC in proliferating cells.

Their data suggest that 5hmdC is strongly depleted in human malignant tumors,

  • a finding that adds another layer of complexity to the aberrant epigenome found in cancer tissue.

In addition, a lack of 5hmC may become a useful biomarker for cancer diagnosis.

Enzymatic Mapping

But according to New England Biolabs’ Sriharsa Pradhan, Ph.D., methods for distinguishing 5mC from 5hmC and analyzing and quantitating the cell’s entire “methylome” and “hydroxymethylome” remain less than optimal.

The protocol for bisulphite conversion to detect methylation remains the “gold standard” for DNA methylation analysis. This method is generally followed by PCR analysis for single-nucleotide resolution to determine methylation across the DNA molecule. According to Dr. Pradhan, however, “bisulphite conversion does not distinguish 5mC and 5hmC.”
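The limitation shows up directly in the readout logic of bisulphite sequencing: unmethylated cytosines are converted to uracil and read as T after PCR, while both 5mC and 5hmC resist conversion and still read as C. A minimal sketch of that call, using toy sequences rather than real data:

```python
# Hedged sketch of the bisulphite-sequencing readout: a reference C that
# still reads as C after conversion was "protected" (5mC or 5hmC), but the
# assay cannot say which modification it carried.

def call_protected_cytosines(reference, bisulfite_read):
    """Return 0-based positions where a reference C still reads as C
    after bisulphite conversion (i.e., 5mC or 5hmC)."""
    assert len(reference) == len(bisulfite_read)
    return [i for i, (ref, obs) in enumerate(zip(reference, bisulfite_read))
            if ref == "C" and obs == "C"]

reference = "ACGTCCGA"   # toy reference strand
read      = "ATGTCTGA"   # Cs at 1 and 5 converted (unmethylated); C at 4 protected
print(call_protected_cytosines(reference, read))  # → [4]
```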

“Recently we found an enzyme, a unique DNA modification-dependent restriction endonuclease, AbaSI, which can

  • decode the hydroxymethylome of the mammalian genome.

You can easily find out where the hydroxymethyl regions are.”

AbaSI recognizes 5-glucosylated methylcytosine (5gmC) with high specificity when compared to 5mC and 5hmC, and

  • cleaves at a narrow range of distances away from the recognized modified cytosine.

By mapping the cleaved ends, the exact 5hmC location can, the investigators reported, be determined.
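A minimal sketch of that mapping idea: treat each sequenced cleavage end as evidence for reference cytosines lying within the enzyme’s expected offset window, and call positions supported by multiple independent ends. The offset window, support threshold, and sequences below are illustrative assumptions, not AbaSI’s actual cut-site geometry.

```python
# Hedged sketch: infer modified-cytosine positions from cleavage ends,
# assuming the enzyme cuts a short, roughly fixed distance downstream
# of the modified base (offsets here are invented for illustration).

from collections import Counter

def call_5hmC(reference, cleavage_ends, offsets=range(11, 14), min_support=2):
    """Count how often each reference C falls in the expected offset
    window upstream of a cleavage end; return well-supported positions."""
    support = Counter()
    for end in cleavage_ends:
        for off in offsets:
            pos = end - off
            if 0 <= pos < len(reference) and reference[pos] == "C":
                support[pos] += 1
    return sorted(p for p, n in support.items() if n >= min_support)

# Toy genome with one candidate C at position 5; two independent ends
# whose offset windows both cover it.
print(call_5hmC("AAAAACGAAAAAAAAAAAAA", [17, 18]))  # → [5]
```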

Dr. Pradhan and his colleagues at NEB; the Department of Biochemistry, Emory University School of Medicine, Atlanta; and the New England Biolabs Shanghai R&D Center described use of this technique in a paper published in Cell Reports this month, in which they described high-resolution enzymatic mapping of genomic hydroxymethylcytosine in mouse ES cells.

In the current report, the authors used the enzyme technology for the genome-wide high-resolution hydroxymethylome, describing simple library construction even with a low amount of input DNA (50 ng) and the ability to readily detect 5hmC sites with low occupancy.

As a result of their studies, they propose that

factors affecting local 5mC accessibility to TET enzymes play important roles in 5hmC deposition:

  • these include chromatin compaction, nucleosome positioning, and TF binding;
  • the regularly oscillating 5hmC profile around CTCF-binding sites suggests that 5hmC ‘‘writers’’ may be sensitive to the nucleosomal environment;
  • some transiently stable 5hmCs may indicate a poised epigenetic state or demethylation intermediate, whereas others may suggest a locally accessible chromosomal environment for the TET enzymatic apparatus.

“We were able to do complete mapping in mouse embryonic cells and are pleased about what this enzyme can do and how it works,” Dr. Pradhan said.

And the availability of novel tools that make analysis of the methylome and hydroxymethylome more accessible will move the field of epigenetic analysis forward and potentially unveil novel biomarkers for cellular development, differentiation, and disease.

Patricia Fitzpatrick Dimond, Ph.D. (pdimond@genengnews.com), is technical editor at Genetic Engineering & Biotechnology News.

Epigenetic Regulation of ADME-Related Genes: Focus on Drug Metabolism and Transport

Published: Sep 23, 2013

Epigenetic regulation of gene expression refers to heritable factors that are functionally relevant genomic modifications but that do not involve changes in DNA sequence.

Examples of such modifications include

  • DNA methylation, histone modifications, noncoding RNAs, and chromatin architecture.

Epigenetic modifications are crucial for

packaging and interpreting the genome, and they have fundamental functions in regulating gene expression and activity under the influence of physiologic and environmental factors.

In this issue of Drug Metabolism and Disposition, a series of articles is presented to demonstrate the role of epigenetic factors in regulating

  • the expression of genes involved in drug absorption, distribution, metabolism, and excretion in organ development, tissue-specific gene expression, sexual dimorphism, and in the adaptive response to xenobiotic exposure, both therapeutic and toxic.

The articles also demonstrate that, in addition to genetic polymorphisms, epigenetics may contribute to wide inter-individual variations in drug metabolism and transport. Identification of functionally relevant epigenetic biomarkers in human specimens has the potential to improve prediction of drug responses based on patients’ epigenetic profiles.

http://www.technologynetworks.com/Metabolomics/news.aspx?ID=157804

This study is published online in Drug Metabolism and Disposition

Part 8.  Pictorial Maps

 Prediction of intracellular metabolic states from extracellular metabolomic data

MK Aurich, G Paglia, Ottar Rolfsson, S Hrafnsdottir, M Magnusdottir, MM Stefaniak, BØ Palsson, RMT Fleming & Ines Thiele

Metabolomics Aug 14, 2014;

http://dx.doi.org/10.1007/s11306-014-0721-3

http://link.springer.com/article/10.1007/s11306-014-0721-3/fulltext.html#Sec1

http://link.springer.com/static-content/images/404/art%253A10.1007%252Fs11306-014-0721-3/MediaObjects/11306_2014_721_Fig1_HTML.gif

Metabolic models can provide a mechanistic framework

  • to analyze information-rich omics data sets, and are
  • increasingly being used to investigate metabolic alterations in human diseases.

An expression of the altered metabolic pathway utilization is the selection of metabolites consumed and released by cells. However, methods for the

  • inference of intracellular metabolic states from extracellular measurements in the context of metabolic models remain underdeveloped compared to methods for other omics data.

Herein, we describe a workflow for such an integrative analysis

  • with an emphasis on extracellular metabolomics data.

We demonstrate,

  • using the lymphoblastic leukemia cell lines Molt-4 and CCRF-CEM,

how our methods can reveal differences in cell metabolism. Our models explain metabolite uptake and secretion by predicting

  • a more glycolytic phenotype for the CCRF-CEM model and
  • a more oxidative phenotype for the Molt-4 model,
  • which was supported by our experimental data.

Gene expression analysis revealed altered expression of gene products at

  • key regulatory steps in those central metabolic pathways, and

literature query emphasized the role of these genes in cancer metabolism.

Moreover, in silico gene knock-outs identified unique

  • control points for each cell line model, e.g., phosphoglycerate dehydrogenase for the Molt-4 model.

Thus, our workflow is well suited to the characterization of cellular metabolic traits based on

  • extracellular metabolomic data, and it allows the integration of multiple omics data sets
  • into a cohesive picture based on a defined model context.

Keywords: Constraint-based modeling · Metabolomics · Multi-omics · Metabolic network · Transcriptomics

1 Introduction

Modern high-throughput techniques have increased the pace of biological data generation. Also referred to as the ‘‘omics avalanche’’, this wealth of data provides great opportunities for metabolic discovery. Omics data sets

  • contain a snapshot of almost the entire repertoire of mRNA, protein, or metabolites at a given time point or

under a particular set of experimental conditions. Because of the high complexity of the data sets,

  • computational modeling is essential for their integrative analysis.

Currently, such data analysis is a bottleneck in the research process and methods are needed to facilitate the use of these data sets, e.g., through meta-analysis of data available in public databases [e.g., the human protein atlas (Uhlen et al. 2010) or the gene expression omnibus (Barrett et al.  2011)], and to increase the accessibility of valuable information for the biomedical research community.

Constraint-based modeling and analysis (COBRA) is

  • a computational approach that has been successfully used to
  • investigate and engineer microbial metabolism through the prediction of steady states (Durot et al. 2009).

The basis of COBRA is network reconstruction: networks are assembled in a bottom-up fashion based on

  • genomic data and extensive
  • organism-specific information from the literature.

Metabolic reconstructions capture information on the

  • known biochemical transformations taking place in a target organism
  • to generate a biochemical, genetic and genomic knowledge base (Reed et al. 2006).

Once assembled, a

  • metabolic reconstruction can be converted into a mathematical model (Thiele and Palsson 2010), and
  • model properties can be interrogated using a great variety of methods (Schellenberger et al. 2011).

The ability of COBRA models

  • to represent genotype–phenotype and environment–phenotype relationships arises
  • through the imposition of constraints, which
  • limit the system to a subset of possible network states (Lewis et al. 2012).

Currently, COBRA models exist for more than 100 organisms, including humans (Duarte et al. 2007; Thiele et al. 2013).
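The steady-state prediction at the heart of COBRA reduces to a linear program: maximize an objective flux subject to the mass-balance constraint S·v = 0 and lower/upper bounds on each reaction. A minimal sketch on a hypothetical three-reaction network, using scipy's `linprog` in place of a dedicated COBRA solver (all reaction names, stoichiometries, and bounds here are invented for illustration):

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; cols: reactions).
# EX_A: -> A (uptake), R1: A -> B, BIO: B -> (biomass drain, the objective)
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A
    [0.0,  1.0, -1.0],   # metabolite B
])

# Flux bounds are the imposed constraints that limit the network
# to a subset of possible states; uptake is capped at 10 units here.
bounds = [(0, 10), (0, 1000), (0, 1000)]

# Maximize the biomass flux (linprog minimizes, so negate the objective).
c = [0.0, 0.0, -1.0]
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")

print(res.x)     # optimal flux distribution, limited by the uptake bound
print(-res.fun)  # maximal biomass flux
```

Tightening or relaxing the uptake bound directly changes the attainable objective, which is how environment–phenotype relationships are encoded in such models.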

Since the first human metabolic reconstruction was described [Recon 1 (Duarte et al. 2007)],

  • biomedical applications of COBRA have increased (Bordbar and Palsson 2012).

One way to contextualize networks is to

  • define their system boundaries according to the metabolic states of the system, e.g., disease or dietary regimes.

The consequences of the applied constraints can

  • then be assessed for the entire network (Sahoo and Thiele 2013).

Additionally, omics data sets have frequently been used

  • to generate cell-type or condition-specific metabolic models.

Models exist for specific cell types, such as

  1. enterocytes (Sahoo and Thiele 2013),
  2. macrophages (Bordbar et al. 2010),
  3. adipocytes (Mardinoglu et al. 2013),
  4. even multi-cell assemblies that represent the interactions of brain cells (Lewis et al. 2010).

All of these cell-type-specific models, except the enterocyte reconstruction,

  • were generated based on omics data sets.

Cell-type-specific models have been used to study

  • diverse human disease conditions.

For example, an adipocyte model was generated using

  • transcriptomic, proteomic, and metabolomic data.

This model was subsequently used to investigate metabolic alterations in adipocytes

  • that would allow for the stratification of obese patients (Mardinoglu et al. 2013).

The biomedical applications of COBRA have included

  1. the study of cancer metabolism (Jerby and Ruppin 2012) and
  2. the prediction of drug targets (Folger et al. 2011; Jerby et al. 2012).

A cancer model was generated using

  • multiple gene expression data sets and subsequently used
  • to predict synthetic lethal gene pairs as potential drug targets
  • selective for the cancer model, but non-toxic to the global model (Recon 1),

a consequence of the reduced redundancy in the cancer specific model (Folger et al. 2011).
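The synthetic-lethality idea can be illustrated on a toy network with two redundant routes to the objective: a reaction pair is synthetic lethal when each single knockout still supports growth but the double knockout abolishes it. A hedged sketch (hypothetical reactions and bounds; scipy stands in for a COBRA toolbox):

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy network with two redundant routes from A to B:
# EX_A: -> A (uptake), R1: A -> B, R2: A -> B, BIO: B -> (objective)
S = np.array([
    [1.0, -1.0, -1.0,  0.0],   # metabolite A
    [0.0,  1.0,  1.0, -1.0],   # metabolite B
])
rxns = ["EX_A", "R1", "R2", "BIO"]
base_bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]

def max_growth(knocked_out=()):
    """Maximal objective flux after forcing the given reactions to zero."""
    bounds = [(0, 0) if i in knocked_out else b
              for i, b in enumerate(base_bounds)]
    res = linprog([0, 0, 0, -1.0], A_eq=S, b_eq=np.zeros(2),
                  bounds=bounds, method="highs")
    return -res.fun

wild_type = max_growth()

# A pair is synthetic lethal if each single knockout still grows
# but the double knockout abolishes growth entirely.
synthetic_lethal = [
    (rxns[i], rxns[j])
    for i, j in itertools.combinations([1, 2], 2)   # internal reactions only
    if max_growth((i,)) > 1e-6
    and max_growth((j,)) > 1e-6
    and max_growth((i, j)) < 1e-6
]
print(synthetic_lethal)   # → [('R1', 'R2')]
```

In a redundancy-reduced cancer-specific model, such pairs would surface as candidate drug-target combinations that a more redundant global model tolerates.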

In a follow-up study, lethal synergy between FH and enzymes of the heme metabolic pathway

  • was experimentally validated, resolving the mechanism by which FH-deficient cells,
    e.g., renal-cell cancer cells, survive a non-functional TCA cycle (Frezza et al. 2011).

Contextualized models, which contain only the subset of reactions active in a particular tissue (or cell) type,

  • can be generated in different ways (Becker and Palsson, 2008; Jerby et al. 2010).

However, the existing algorithms mainly consider

  • gene expression and proteomic data
  • to define the reaction sets that comprise the contextualized metabolic models.

These subsets of reactions are usually defined

  • based on the expression or absence of expression of the genes or proteins (present and absent calls),
  • or inferred from expression values or differential gene expression.
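A minimal illustration of present/absent-call contextualization, assuming a hypothetical set of gene–protein–reaction (GPR) associations in which isozymes have an OR relationship (gene and reaction names are invented for illustration):

```python
# Hypothetical gene-protein-reaction (GPR) associations: a reaction is
# retained if at least one associated gene has a "present" expression call.
gpr = {
    "HEX1": ["HK1", "HK2"],   # isozymes: OR relationship
    "PGI":  ["GPI"],
    "PFK":  ["PFKL"],
    "G6PD": ["G6PD"],
}

present_calls = {"HK2", "GPI", "PFKL"}   # present calls from an expression array

def contextualize(gpr, present):
    """Return the subset of reactions supported by expressed genes."""
    return {rxn for rxn, genes in gpr.items()
            if any(g in present for g in genes)}

active = contextualize(gpr, present_calls)
print(sorted(active))   # → ['HEX1', 'PFK', 'PGI']
```

Real algorithms additionally evaluate AND relationships for enzyme complexes and may weight reactions by expression values rather than binary calls, but the filtering principle is the same.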

Comprehensive reviews of the methods are available (Blazier and Papin, 2012; Hyduke et al. 2013). Only the compilation of a large set of omics data sets

  • can result in a tissue (or cell-type) specific metabolic model, whereas

the representation of one particular experimental condition is achieved

  • through the integration of omics data sets generated from a single experiment only (condition-specific cell line model).

Recently, metabolomic data sets have become more comprehensive and

  • using these data sets allows direct determination of the metabolic network components (the metabolites).

Additionally, metabolomics has proven to be stable, relatively inexpensive, and highly reproducible (Antonucci et al. 2012). These factors make metabolomic data sets particularly valuable for

  • interrogation of metabolic phenotypes.

Thus, the integration of these data sets is now an active field of research (Li et al. 2013; Mo et al. 2009; Paglia et al. 2012b; Schmidt et al. 2013).

Generally, metabolomic data can be incorporated into metabolic networks as

  • qualitative, quantitative, and thermodynamic constraints (Fleming et al. 2009; Mo et al. 2009).
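As one sketch of a quantitative constraint, measured uptake and secretion rates from spent medium can be translated into bounds on the corresponding exchange reactions. The rates, reaction names, and the ±10% measurement tolerance below are all illustrative assumptions, not values from the study:

```python
# Hypothetical exchange rates from spent-medium metabolomics
# (mmol/gDW/h; negative = uptake, positive = secretion).
measured = {"EX_glc": -2.5, "EX_lac": 4.8, "EX_gln": -0.6}

tolerance = 0.1  # allow 10% measurement error around each rate

def exchange_bounds(measured, tol):
    """Turn measured rates into (lower, upper) flux bounds per exchange reaction."""
    bounds = {}
    for rxn, rate in measured.items():
        slack = abs(rate) * tol
        bounds[rxn] = (rate - slack, rate + slack)
    return bounds

print(exchange_bounds(measured, tolerance))
```

Applying these bounds to a model's exchange reactions confines the feasible flux space to states consistent with the observed extracellular phenotype; qualitative constraints would instead only open or close each exchange.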

Mo et al. used metabolites detected in the

  • spent medium of yeast cells to determine intracellular flux states through a sampling analysis (Mo et al. 2009),
  • which allowed unbiased interrogation of the possible network states (Schellenberger and Palsson 2009) and
  • prediction of internal pathway use.
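Sampling-based interrogation can be sketched on a toy network: every steady-state flux vector lies in the null space of S, so one can draw random null-space points and keep those that respect the bounds. This simple rejection scheme stands in for the hit-and-run sampler used in the COBRA toolbox, and the network and bounds are invented for illustration:

```python
import numpy as np
from scipy.linalg import null_space

# Toy network: EX_A: -> A (0..10), R1: A -> B, BIO: B ->
# Every steady-state flux vector v satisfies S v = 0.
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

N = null_space(S)   # basis of the steady-state (null) space; here 3x1

rng = np.random.default_rng(0)
# Draw random points along the null-space direction and keep only
# those respecting the bounds 0 <= uptake <= 10, all fluxes >= 0.
samples = []
while len(samples) < 1000:
    v = (N @ rng.uniform(-20, 20, size=1)).ravel()
    if v[0] <= 10 and (v >= 0).all():
        samples.append(v)

samples = np.array(samples)
print(samples.mean(axis=0))   # average flux through each reaction across states
```

Unlike optimization, which returns a single optimal state, the sample characterizes the whole range of allowable network states, from which internal pathway use can be inferred without assuming an objective.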

Modes of transcriptional regulation during the YMC

Such analyses have also been used

  1. to reveal the effects of enzymopathies on red blood cells (Price et al. 2004),
  2. to study effects of diet on diabetes (Thiele et al. 2005) and
  3. to define macrophage metabolic states (Bordbar et al. 2010).

This type of analysis is available as a function in the COBRA toolbox (Schellenberger et al. 2011).

In this study, we established a workflow

  • for the generation and analysis of condition-specific metabolic cell line models
  • that can facilitate the interpretation of metabolomic data.

Our modeling yields meaningful predictions regarding

  • metabolic differences between two lymphoblastic leukemia cell lines (Fig. 1A).

Fig. 1