
Archive for the ‘Commercialization’ Category

Protecting Your Biotech IP and Market Strategy: Notes from Life Sciences Collaborative 2015 Meeting

Achievement Beyond Regulatory Approval – Design for Commercial Success

Stephen J. Williams, Ph.D., Reporter

The Mid-Atlantic group Life Sciences Collaborative, a select group of industry veterans and executives from the pharmaceutical, biotechnology, and medical device sectors whose mission is to increase the success of emerging life sciences businesses in the Mid-Atlantic region through networking, education, training, and mentorship, met Tuesday, March 3, 2015 at the University of the Sciences in Philadelphia (USP). The group discussed post-approval regulatory issues and concerns, such as designing strong patent protection, developing strategies for insurance reimbursement, and securing financing for any stage of a business.

The meeting was divided into three panel discussions and a keynote speech:

  1. Panel 1: Design for Market Protection– Intellectual Property Strategy Planning
  2. Panel 2: Design for Market Success– Commercial Strategy Planning
  3. Panel 3: Design for Investment– Financing Each Stage
  4. Keynote Speaker: Robert Radie, President & CEO Egalet Corporation

Below are Notes from each PANEL Discussion:

For more information about the Life Sciences Collaborative, see:

Website: http://www.lifesciencescollaborative.org/

Or On Facebook

Or On Twitter @LSCollaborative

Panel 1: Design for Market Protection; Intellectual Property Strategy Planning

Take-home Message: Developing a very strong Intellectual Property (IP) portfolio and strategy for a startup is CRITICALLY IMPORTANT for its long-term success. Potential investors, partners, and acquirers will focus on the strength of a startup’s IP, so it is important to take advantage of the legal services available. Do your DUE DILIGENCE.

Panelists:

John F. Ritter, J.D., MBA; Director, Office of Technology Licensing, Princeton University

Cozette McAvoy; Senior Attorney, Novartis Oncology Pharma Patents

Ryan O’Donnell; Partner, Volpe & Koenig

Panel Moderator: Dipanjan “DJ” Nag, PhD, MBA, CLP, RTTP; President & CEO, IP Shaktl, LLC

Notes:

Dr. Nag:

  • Sometimes IP can be a double-edged sword; e.g., Herbert Boyer, together with Paul Berg and Stanley Cohen, is credited with developing recombinant DNA technology, but they did not keep the IP tightly held, which opened the door for a biotech revolution (see a nice review from the Chemical Heritage Foundation).
  • Naked patent licenses are most profitable when trying to sell IP

John Ritter: Mr. Ritter gave Princeton University’s perspective on developing and promoting a university-based IP portfolio.

  • 30-40% of Princeton’s IP portfolio is related to life sciences
  • Universities will prefer to seek provisional patent status, as it is a quicker process and allows for publication
  • Princeton will work closely with investigators to walk them through the process – Very Important to have a support system in place INCLUDING helping investigators and early startups establish a STRONG startup MANAGEMENT TEAM, and making important introductions to and DEVELOPING RELATIONSHIPS with investors, angels
  • Good to cast a wide net when looking at early development partners like pharma
  • A good example of a university that takes an active role in developing startups is the University of Pennsylvania’s Penn UPstart program.
  • In the last 2 years, many universities have been filing patents for startups as a micro-entity

Comment from attendee: Universities are not using enough of their endowments for the purpose of supporting startups. Princeton is only using $500,000 for its accelerator program.

Cozette McAvoy: Mrs. McAvoy talked about monetizing your IP from an industry perspective

  • Industry now is looking at “indirect monetization” of their own and others’ IP portfolios. Indirect monetization refers to unlocking the “indirect value” of intellectual property, for example research tools and processes, which may or may not be related to a tangible product.
  • Good to make a contractual bundle of IP – “the days of the million-dollar check are gone”
  • Big companies like big pharma look to PR (press relations) buzz surrounding new technology and products, SO IT IS IMPORTANT FOR A STARTUP TO FOCUS ON ITS PR

Ryan O’Donnell: talked about how life science IP has changed, especially due to the America Invents Act

  • Need to develop a GLOBAL IP strategy so that, whether drug or device, you can market in multiple countries
  • Diagnostics and genes are not patentable now – a major shift in patent strategy
  • Companies like Unified Patents can protect you against the patent trolls – if a patent is threatened by a patent troll (patent assertion entity), they will file a petition with the USPTO (US Patent and Trademark Office) requesting institution of inter partes review (IPR); this may cost $40,000 BUT it is WELL WORTH the money – BE PROACTIVE about your patents and IP

Panel 2: Design for Market Success; Commercial Strategy Planning

Take-home Message: Commercial strategy development is defined by market-facing data, reimbursement strategies, and commercial planning that inform labeling requirements, clinical study designs, healthcare economic outcomes, and pricing targets. Clarity from payers is extremely important for developing any market strategy. Develop this strategy early and seek advice from payers.

Panelists:

David Blaszczak; Founder, Precipio Health Strategies

Terri Bernacchi, PharmD, MBA; Founder & President Cambria Health Advisory Professionals

Paul Firuta; President US Commercial Operations, NPS Pharma

 

Panel Moderator: Matt Cabrey; Executive Director, Select Greater Philadelphia

 

Notes:

David Blaszczak:

  • Commercial payers are bundling payments: it is most important to get clarity from these payers
  • Payers are using clinical trials to alter marketing (labeling), so it is IMPORTANT to BUILD the LABEL in early clinical trial phases (Phase I or II)
  • When a small company is in its early phases, it is best now to team or partner with Medicare or a PBM (pharmacy benefit manager) and payers to help develop and spot tier 1 and tier 2 companies in their area

Terri Bernacchi:

  • Building a relationship with the payer is very important, but firms like hers will also look to patients and advocacy groups to see how they respond to a given therapy and decrease the price risk by bundling
  • Value-based contracting with manufacturers can save the patient and the payer $$
  • As most PBMs’ formularies are 80% generics, the goal is how to make money off of generics
  • Patent extension would have the greatest impact on price and value

Paul Firuta:

  • NPS Pharma is developing a pharmacy benefit program for orphan diseases
  • How you pay depends on the mix of Medicare and private payers now
  • The most important change that could affect price is a change in compliance regulations

Panel 3: Design for Investment; Financing Each Stage

Take-home Message: VC is a personal relationship so spend time making those relationships. Do your preparation on your value and your market. Look to non-VC avenues: they are out there.

Panelists:

Ting Pau Oei; Managing Director, Easton Capital (NYC)

Manya Deehr; CEO & Founder, Pediva Therapeutics

Sanjoy Dutta, PhD; Assistant VP, Translational Devel. & Intl. Res., Juvenile Diabetes Research Foundation

 

Panel Moderator: Shahram Hejazi, PhD; Venture Partner, BioAdvance

  • In 2000, his experience was that finding 1st capital was about what your assets are; now that has changed to value

Notes:

Ting Pau Oei:

  • Your very 1st capital is all about VALUE – so plan where you add value
  • Venture Capital is a PERSONAL RELATIONSHIP
  • 1) you need the management team, and 2) you must be able to communicate effectively (PowerPoint, elevator pitch, business plan); #1 and #2 will get you the important 2nd Venture Capital meeting – VCs don’t decide anything in the 1st meeting
  • VCs don’t normally do a good job of premarket valuation or premarket due diligence but know post-market valuation well
  • Best advice: show some Phase 2 milestones and VCs will knock on your door

Manya Deehr:

  • Investment is more niche-oriented now, so find your niche investors
  • Define your product first and then match it to the investors
  • Biggest failure she has experienced: companies that go out too early looking for capital

Dr. Dutta: funding from a non-profit patient advocacy group perspective

  • Your First Capital: find alliances which can help you get out of the “valley of death”
  • Develop a targeted product and patient treatment profile
  • Non-profit groups ask three questions:

1) what is the value to patients (non-profits want to partner)

2) what is your timeline (we can wait longer than a VC; for example, the Cystic Fibrosis Foundation waited a long time but got great returns for its patients with Kalydeco™)

3) when can we see return

  • Long-term market projections (the landscape) are a knowledge gap for startups, and startups don’t have all the competitive intelligence
  • Have a plan B every step of the way

Other posts on this site related to Philadelphia Biotech, Startup Funding, Payer Issues, and Intellectual Property Issues include:

PCCI’s 7th Annual Roundtable “Crowdfunding for Life Sciences: A Bridge Over Troubled Waters?” May 12 2014 Embassy Suites Hotel, Chesterbrook PA 6:00-9:30 PM
The Vibrant Philly Biotech Scene: Focus on KannaLife Sciences and the Discipline and Potential of Pharmacognosy
The Vibrant Philly Biotech Scene: Focus on Computer-Aided Drug Design and Gfree Bio, LLC
The Vibrant Philly Biotech Scene: Focus on Vaccines and Philimmune, LLC
The Bioscience Crowdfunding Environment: The Bigger Better VC?
Foundations as a Funding Source
Venture Capital Funding in the Life Sciences: Phase4 Ventures – A Case Study
10 heart-focused apps & devices are crowdfunding for American Heart Association’s open innovation challenge
Funding, Deals & Partnerships
Medicare Panel Punts on Best Tx for Carotid Plaque
9:15AM–2:00PM, January 27, 2015 – Regulatory & Reimbursement Frameworks for Molecular Testing, LIVE @Silicon Valley 2015 Personalized Medicine World Conference, Mountain View, CA
FDA Commissioner, Dr. Margaret A. Hamburg on HealthCare for 310Million Americans and the Role of Personalized Medicine
Biosimilars: Intellectual Property Creation and Protection by Pioneer and by Biosimilar Manufacturers
Litigation on the Way: Broad Institute Gets Patent on Revolutionary Gene-Editing Method
The Patents for CRISPR, the DNA editing technology as the Biggest Biotech Discovery of the Century

 

 



Over 500,000 Unique Site Visits reached by PharmaceuticalIntelligence.com Reported today by Leaders in Pharmaceutical Business Intelligence (LPBI)

PRESS RELEASE


 


November 18, 2014

Leaders in Pharmaceutical Business Intelligence (LPBI), launched in April 2012 as an Open Access Online Scientific Journal (http://pharmaceuticalintelligence.com), has reached over 500,000 unique site visits, with approximately 30,000 monthly views, 1,440 followers and 133 sites linking into the journal, including ncbi.nlm.nih.gov, Nature.com, cancer.gov, commons.wikipedia.org, dx.doi.org, NEJM, medscape.com, personalizedmedicine.partners.org, masspec.scripps.edu, and sciencedirect.com.

Leaders in Pharmaceutical Business Intelligence is a scientific, medical and business multi-expert authoring platform in several domains: life sciences, pharmaceutical, healthcare & medical industries. The venture operates as an online scientific intellectual exchange at its website http://pharmaceuticalintelligence.com for curation and reporting on innovations in biomedical and biological sciences, healthcare economics, pharmacology, pharmaceutical research & medicine by PhD level Scientists, MDs, MD/PhDs and PharmDs. The Open Access Online Scientific Journal currently contains 2,521 annotated & curated articles in over 300 categories of research in Biotech, Life Sciences and Medicine.

The LPBI venture publishes a BioMedical Series of e-Books available on mobile platforms (current & future titles at https://pharmaceuticalintelligence.com/biomed-e-books/.)

In the area of Funding, Deals and Partnerships, the venture represents more than a dozen early-stage Biotechnology and Medical Device startups in Private Equity Funding.

The venture’s expertise in Real-Time Scientific Conferences Coverage, using social media and creating electronic scientific conference proceedings posted on the Internet, has included the following events: MassBio Annual Meeting 2014, BioIT World, April 30, 2014, Koch Institute for Integrative Cancer Research @MIT – Summer Symposium 2014: RNA Biology, Cancer and Therapeutic Implications, June 13, 2014, Gabbay Award Lectures in Biotechnology and Medicine – Hosted by Rosenstiel Basic Medical Sciences Research Center, Brandeis University, October 27, 2014, 10th Annual Personalized Medicine Conference at the Harvard Medical School, November 12-13, 2014, and two Biotech Investment Conferences in 2014, one in NYC and one in Basel by Sachs Associates of London.

Contact

Aviva Lev-Ari, PhD, RN

1-617-244-4024

avivalev-ari@alum.berkeley.edu

Scientific Journal and BioMed e-Books Series – Editor-in-Chief

https://pharmaceuticalintelligence.com/biomed-e-books/

Founder & Director

Leaders in Pharmaceutical Business Intelligence

http://pharmaceuticalintelligence.com


Summary and Perspectives: Impairments in Pathological States: Endocrine Disorders, Stress Hypermetabolism and Cancer

Author and Curator: Larry H. Bernstein, MD, FCAP

This summary is the last of a series on the impact of transcriptomics, proteomics, and metabolomics on disease investigation, and on the sorting and integration of genomic signatures and metabolic signatures to explain phenotypic relationships in the variability and individuality of response to disease expression, and how this leads to pharmaceutical discovery and personalized medicine. We have unquestionably better tools at our disposal than have ever existed in the history of mankind, and an enormous knowledge base that has to be accessed. I shall conclude these discussions here with the powerful contributions to, and current knowledge of, biochemistry, metabolism, protein interactions, signaling, and the application of the -OMICS to diseases and drug discovery at this time.

The Ever-Transcendent Cell

Deriving physiologic first principles By John S. Torday | The Scientist Nov 1, 2014
http://www.the-scientist.com/?articles.view/articleNo/41282/title/The-Ever-Transcendent-Cell/

Both the developmental and phylogenetic histories of an organism describe the evolution of physiology—the complex of metabolic pathways that govern the function of an organism as a whole. The necessity of establishing and maintaining homeostatic mechanisms began at the cellular level, with the very first cells, and homeostasis provides the underlying selection pressure fueling evolution.

While the events leading to the formation of the first functioning cell are debatable, a critical one was certainly the formation of simple lipid-enclosed vesicles, which provided a protected space for the evolution of metabolic pathways. Protocells evolved from a common ancestor that experienced environmental stresses early in the history of cellular development, such as acidic ocean conditions and low atmospheric oxygen levels, which shaped the evolution of metabolism.

The reduction of evolution to cell biology may answer the perennially unresolved question of why organisms return to their unicellular origins during the life cycle.

As primitive protocells evolved to form prokaryotes and, much later, eukaryotes, changes to the cell membrane occurred that were critical to the maintenance of chemiosmosis, the generation of bioenergy through the partitioning of ions. The incorporation of cholesterol into the plasma membrane surrounding primitive eukaryotic cells marked the beginning of their differentiation from prokaryotes. Cholesterol imparted more fluidity to eukaryotic cell membranes, enhancing functionality by increasing motility and endocytosis. Membrane deformability also allowed for increased gas exchange.

Acidification of the oceans by atmospheric carbon dioxide generated high intracellular calcium ion concentrations in primitive aquatic eukaryotes, which had to be lowered to prevent toxic effects, namely the aggregation of nucleotides, proteins, and lipids. The early cells achieved this by the evolution of calcium channels composed of cholesterol embedded within the cell’s plasma membrane, and of internal membranes, such as that of the endoplasmic reticulum, peroxisomes, and other cytoplasmic organelles, which hosted intracellular chemiosmosis and helped regulate calcium.

As eukaryotes thrived, they experienced increasingly competitive pressure for metabolic efficiency. Engulfed bacteria, assimilated as mitochondria, provided more bioenergy. As the evolution of eukaryotic organisms progressed, metabolic cooperation evolved, perhaps to enable competition with biofilm-forming, quorum-sensing prokaryotes. The subsequent appearance of multicellular eukaryotes expressing cellular growth factors and their respective receptors facilitated cell-cell signaling, forming the basis for an explosion of multicellular eukaryote evolution, culminating in the metazoans.

Casting a cellular perspective on evolution highlights the integration of genotype and phenotype. Starting from the protocell membrane, the functional homolog for all complex metazoan organs, it offers a way of experimentally determining the role of genes that fostered evolution based on the ontogeny and phylogeny of cellular processes that can be traced back, in some cases, to our last universal common ancestor.  ….


Given that the unicellular toolkit is complete with all the traits necessary for forming multicellular organisms (Science, 301:361-63, 2003), it is distinctly possible that metazoans are merely permutations of the unicellular body plan. That scenario would clarify a lot of puzzling biology: molecular commonalities between the skin, lung, gut, and brain that affect physiology and pathophysiology exist because the cell membranes of unicellular organisms perform the equivalents of these tissue functions, and the existence of pleiotropy—one gene affecting many phenotypes—may be a consequence of the common unicellular source for all complex biologic traits.  …

The cell-molecular homeostatic model for evolution and stability addresses how the external environment generates homeostasis developmentally at the cellular level. It also determines homeostatic set points in adaptation to the environment through specific effectors, such as growth factors and their receptors, second messengers, inflammatory mediators, crossover mutations, and gene duplications. This is a highly mechanistic, heritable, plastic process that lends itself to understanding evolution at the cellular, tissue, organ, system, and population levels, mediated by physiologically linked mechanisms throughout, without having to invoke random, chance mechanisms to bridge different scales of evolutionary change. In other words, it is an integrated mechanism that can often be traced all the way back to its unicellular origins.

The switch from swim bladder to lung as vertebrates moved from water to land is proof of principle that stress-induced evolution in metazoans can be understood from changes at the cellular level.

http://www.the-scientist.com/Nov2014/TE_21.jpg

A MECHANISTIC BASIS FOR LUNG DEVELOPMENT: Stress from periodic atmospheric hypoxia (1) during vertebrate adaptation to land enhances positive selection of the stretch-regulated parathyroid hormone-related protein (PTHrP) in the pituitary and adrenal glands. In the pituitary (2), PTHrP signaling upregulates the release of adrenocorticotropic hormone (ACTH) (3), which stimulates the release of glucocorticoids (GC) by the adrenal gland (4). In the adrenal gland, PTHrP signaling also stimulates glucocorticoid production of adrenaline (5), which in turn affects the secretion of lung surfactant, the distension of alveoli, and the perfusion of alveolar capillaries (6). PTHrP signaling integrates the inflation and deflation of the alveoli with surfactant production and capillary perfusion.  THE SCIENTIST STAFF

From a cell-cell signaling perspective, two critical duplications in genes coding for cell-surface receptors occurred during this period of water-to-land transition—in the stretch-regulated parathyroid hormone-related protein (PTHrP) receptor gene and the β adrenergic (βA) receptor gene. These gene duplications can be disassembled by following their effects on vertebrate physiology backwards over phylogeny. PTHrP signaling is necessary for traits specifically relevant to land adaptation: calcification of bone, skin barrier formation, and the inflation and distention of lung alveoli. Microvascular shear stress in PTHrP-expressing organs such as bone, skin, kidney, and lung would have favored duplication of the PTHrP receptor, since shear stress generates reactive oxygen species (ROS) known to have this effect, and PTHrP is a potent vasodilator, acting as an epistatic balancing selection for this constraint.

Positive selection for PTHrP signaling also evolved in the pituitary and adrenal cortex (see figure on this page), stimulating the secretion of ACTH and corticoids, respectively, in response to the stress of land adaptation. This cascade amplified adrenaline production by the adrenal medulla, since corticoids passing through it enzymatically stimulate adrenaline synthesis. Positive selection for this functional trait may have resulted from hypoxic stress that arose during global episodes of atmospheric hypoxia over geologic time. Since hypoxia is the most potent physiologic stressor, such transient oxygen deficiencies would have been acutely alleviated by increasing adrenaline levels, which would have stimulated alveolar surfactant production, increasing gas exchange by facilitating the distension of the alveoli. Over time, increased alveolar distension would have generated more alveoli by stimulating PTHrP secretion, impelling evolution of the alveolar bed of the lung.

This scenario similarly explains βA receptor gene duplication, since increased density of the βA receptor within the alveolar walls was necessary for relieving another constraint during the evolution of the lung in adaptation to land: the bottleneck created by the existence of a common mechanism for blood pressure control in both the lung alveoli and the systemic blood pressure. The pulmonary vasculature was constrained by its ability to withstand the swings in pressure caused by the systemic perfusion necessary to sustain all the other vital organs. PTHrP is a potent vasodilator, subserving the blood pressure constraint, but eventually the βA receptors evolved to coordinate blood pressure in both the lung and the periphery.

Gut Microbiome Heritability

Analyzing data from a large twin study, researchers have homed in on how host genetics can shape the gut microbiome.
By Tracy Vence | The Scientist Nov 6, 2014

Previous research suggested host genetic variation can influence microbial phenotype, but an analysis of data from a large twin study published in Cell today (November 6) solidifies the connection between human genotype and the composition of the gut microbiome. Studying more than 1,000 fecal samples from 416 monozygotic and dizygotic twin pairs, Cornell University’s Ruth Ley and her colleagues have homed in on one bacterial taxon, the family Christensenellaceae, as the most highly heritable group of microbes in the human gut. The researchers also found that Christensenellaceae—which was first described just two years ago—is central to a network of co-occurring heritable microbes that is associated with lean body mass index (BMI).  …

Of particular interest was the family Christensenellaceae, which was the most heritable taxon among those identified in the team’s analysis of fecal samples obtained from the TwinsUK study population.

While microbiologists had previously detected 16S rRNA sequences belonging to Christensenellaceae in the human microbiome, the family wasn’t named until 2012. “People hadn’t looked into it, partly because it didn’t have a name . . . it sort of flew under the radar,” said Ley.

Ley and her colleagues discovered that Christensenellaceae appears to be the hub in a network of co-occurring heritable taxa, which—among TwinsUK participants—was associated with low BMI. The researchers also found that Christensenellaceae had been found at greater abundance in low-BMI twins in older studies.
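To make the twin-based heritability idea above concrete, here is a minimal Python sketch using Falconer's formula, h² ≈ 2(r_MZ − r_DZ), applied to simulated taxon abundances in monozygotic and dizygotic twin pairs. This is only an illustration of the principle; the data below are invented, and the published analysis used more elaborate variance-component modelling rather than this shortcut.

```python
# Hypothetical illustration of Falconer's heritability estimate from twin data.
# Not the authors' actual modelling; all values are simulated.
import numpy as np

def falconer_h2(mz_pairs, dz_pairs):
    """Estimate heritability as h2 = 2 * (r_MZ - r_DZ), where the r's are
    within-pair correlations of a trait, e.g. relative abundance of a taxon."""
    r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
    r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]
    return 2.0 * (r_mz - r_dz)

# Toy data: per-pair abundances (twin 1, twin 2) in arbitrary units.
rng = np.random.default_rng(0)
shared = rng.normal(size=50)
mz = np.column_stack([shared + rng.normal(scale=0.3, size=50),
                      shared + rng.normal(scale=0.3, size=50)])   # MZ pairs: tightly correlated
dz = np.column_stack([shared + rng.normal(scale=0.8, size=50),
                      shared + rng.normal(scale=0.8, size=50)])   # DZ pairs: noisier
print(f"h2 estimate: {falconer_h2(mz, dz):.2f}")
```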

To interrogate the effects of Christensenellaceae on host metabolic phenotype, Ley’s team introduced lean and obese human fecal samples into germ-free mice. They found that animals that received lean fecal samples containing more Christensenellaceae showed reduced weight gain compared with their counterparts. And treatment of mice that had obesity-associated microbiomes with one member of the Christensenellaceae family, Christensenella minuta, led to reduced weight gain.   …

Ley and her colleagues are now focusing on the host alleles underlying the heritability of the gut microbiome. “We’re running a genome-wide association analysis to try to find genes—particular variants of genes—that might associate with higher levels of these highly heritable microbiota.  . . . Hopefully that will point us to possible reasons they’re heritable,” she said. “The genes will guide us toward understanding how these relationships are maintained between host genotype and microbiome composition.”

J.K. Goodrich et al., “Human genetics shape the gut microbiome,” Cell,  http://dx.doi.org:/10.1016/j.cell.2014.09.053, 2014.

Light-Operated Drugs

Scientists create a photosensitive pharmaceutical to target a glutamate receptor.
By Ruth Williams | The Scientist Nov 1, 2014
http://www.the-scientist.com/?articles.view/articleNo/41279/title/Light-Operated-Drugs/

light operated drugs MO1


http://www.the-scientist.com/Nov2014/MO1.jpg

The desire for temporal and spatial control of medications to minimize side effects and maximize benefits has inspired the development of light-controllable drugs, or optopharmacology. Early versions of such drugs have manipulated ion channels or protein-protein interactions, “but never, to my knowledge, G protein–coupled receptors [GPCRs], which are one of the most important pharmacological targets,” says Pau Gorostiza of the Institute for Bioengineering of Catalonia, in Barcelona.

Gorostiza has taken the first step toward filling that gap, creating a photosensitive inhibitor of the metabotropic glutamate 5 (mGlu5) receptor—a GPCR expressed in neurons and implicated in a number of neurological and psychiatric disorders. The new mGlu5 inhibitor—called alloswitch-1—is based on a known mGlu receptor inhibitor, but the simple addition of a light-responsive appendage, as had been done for other photosensitive drugs, wasn’t an option. The binding site on mGlu5 is “extremely tight,” explains Gorostiza, and would not accommodate a differently shaped molecule. Instead, alloswitch-1 has an intrinsic light-responsive element.

In a human cell line, the drug was active under dim light conditions, switched off by exposure to violet light, and switched back on by green light. When Gorostiza’s team administered alloswitch-1 to tadpoles, switching between violet and green light made the animals stop and start swimming, respectively.

The fact that alloswitch-1 is constitutively active and switched off by light is not ideal, says Gorostiza. “If you are thinking of therapy, then in principle you would prefer the opposite,” an “on” switch. Indeed, tweaks are required before alloswitch-1 could be a useful drug or research tool, says Stefan Herlitze, who studies ion channels at Ruhr-Universität Bochum in Germany. But, he adds, “as a proof of principle it is great.” (Nat Chem Biol, http://dx.doi.org:/10.1038/nchembio.1612, 2014)

Enhanced Enhancers

The recent discovery of super-enhancers may offer new drug targets for a range of diseases.
By Eric Olson | The Scientist Nov 1, 2014
http://www.the-scientist.com/?articles.view/articleNo/41281/title/Enhanced-Enhancers/

To understand disease processes, scientists often focus on unraveling how gene expression in disease-associated cells is altered. Increases or decreases in transcription—as dictated by a regulatory stretch of DNA called an enhancer, which serves as a binding site for transcription factors and associated proteins—can produce an aberrant composition of proteins, metabolites, and signaling molecules that drives pathologic states. Identifying the root causes of these changes may lead to new therapeutic approaches for many different diseases.

Although few therapies for human diseases aim to alter gene expression, the outstanding examples—including antiestrogens for hormone-positive breast cancer, antiandrogens for prostate cancer, and PPAR-γ agonists for type 2 diabetes—demonstrate the benefits that can be achieved through targeting gene-control mechanisms.  Now, thanks to recent papers from laboratories at MIT, Harvard, and the National Institutes of Health, researchers have a new, much bigger transcriptional target: large DNA regions known as super-enhancers or stretch-enhancers. Already, work on super-enhancers is providing insights into how gene-expression programs are established and maintained, and how they may go awry in disease.  Such research promises to open new avenues for discovering medicines for diseases where novel approaches are sorely needed.

Super-enhancers cover stretches of DNA that are 10- to 100-fold longer and about 10-fold less abundant in the genome than typical enhancer regions (Cell, 153:307-19, 2013). They also appear to bind a large percentage of the transcriptional machinery compared to typical enhancers, allowing them to better establish and enforce cell-type specific transcriptional programs (Cell, 153:320-34, 2013).
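The cited papers are not quoted here on how the call is actually made, but a common approach (popularized alongside the 2013 Cell studies and often referred to as ROSE) is to rank enhancers by ChIP-seq signal and split the ranked curve at its inflection point. The sketch below illustrates that idea on simulated signal values; all numbers and the cutoff rule are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch of one common super-enhancer call: rank enhancers by signal and
# cut the ranked curve where, after scaling both axes to [0, 1], its slope exceeds 1.
# Signal values here are simulated.
import numpy as np

def call_super_enhancers(signal):
    """Return (sorted signal, boolean mask of putative super-enhancers)."""
    signal = np.sort(np.asarray(signal, dtype=float))               # ascending rank order
    x = np.linspace(0, 1, len(signal))                              # scaled rank
    y = (signal - signal.min()) / (signal.max() - signal.min())     # scaled signal
    slope = np.gradient(y, x)                                       # numerical slope of the curve
    threshold = signal[np.argmax(slope > 1)]                        # first point where slope > 1
    return signal, signal >= threshold

rng = np.random.default_rng(1)
signal = rng.lognormal(mean=2, sigma=1, size=5000)                  # long-tailed enhancer signal
ranked, is_super = call_super_enhancers(signal)
print(f"{is_super.sum()} of {len(ranked)} enhancers called as super-enhancers")
```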

Super-enhancers are closely associated with genes that dictate cell identity, including those for cell-type–specific master regulatory transcription factors. This observation led to the intriguing hypothesis that cells with a pathologic identity, such as cancer cells, have an altered gene expression program driven by the loss, gain, or altered function of super-enhancers.

Sure enough, by mapping the genome-wide location of super-enhancers in several cancer cell lines and from patients’ tumor cells, we and others have demonstrated that genes located near super-enhancers are involved in processes that underlie tumorigenesis, such as cell proliferation, signaling, and apoptosis.

Super-enhancers cover stretches of DNA that are 10- to 100-fold longer and about 10-fold less abundant in the genome than typical enhancer regions.

Genome-wide association studies (GWAS) have found that disease- and trait-associated genetic variants often occur in greater numbers in super-enhancers (compared to typical enhancers) in cell types involved in the disease or trait of interest (Cell, 155:934-47, 2013). For example, an enrichment of fasting glucose–associated single nucleotide polymorphisms (SNPs) was found in the stretch-enhancers of pancreatic islet cells (PNAS, 110:17921-26, 2013). Given that some 90 percent of reported disease-associated SNPs are located in noncoding regions, super-enhancer maps may be extremely valuable in assigning functional significance to GWAS variants and identifying target pathways.
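A hedged sketch of the kind of enrichment test behind the statement above: count how many disease-associated SNPs fall in super-enhancers versus typical enhancers, compare against background SNPs, and apply Fisher's exact test. The counts below are hypothetical, chosen only to show the mechanics.

```python
# Illustrative enrichment test of GWAS SNPs in super-enhancers vs. typical enhancers.
# The 2x2 counts are invented for this example.
from scipy.stats import fisher_exact

# Rows: (trait-associated SNPs, background SNPs); columns: (in super-enhancers, in typical enhancers)
table = [[120, 380],
         [900, 8600]]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, one-sided P = {p_value:.2e}")
```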

Because only 1 to 2 percent of active genes are physically linked to a super-enhancer, mapping the locations of super-enhancers can be used to pinpoint the small number of genes that may drive the biology of that cell. Differential super-enhancer maps that compare normal cells to diseased cells can be used to unravel the gene-control circuitry and identify new molecular targets, in much the same way that somatic mutations in tumor cells can point to oncogenic drivers in cancer. This approach is especially attractive in diseases for which an incomplete understanding of the pathogenic mechanisms has been a barrier to discovering effective new therapies.

Another therapeutic approach could be to disrupt the formation or function of super-enhancers by interfering with their associated protein components. This strategy could make it possible to downregulate multiple disease-associated genes through a single molecular intervention. A group of Boston-area researchers recently published support for this concept when they described inhibited expression of cancer-specific genes, leading to a decrease in cancer cell growth, by using a small molecule inhibitor to knock down a super-enhancer component called BRD4 (Cancer Cell, 24:777-90, 2013).  More recently, another group showed that expression of the RUNX1 transcription factor, involved in a form of T-cell leukemia, can be diminished by treating cells with an inhibitor of a transcriptional kinase that is present at the RUNX1 super-enhancer (Nature, 511:616-20, 2014).

Fungal effector Ecp6 outcompetes host immune receptor for chitin binding through intrachain LysM dimerization 
Andrea Sánchez-Vallet, et al.   eLife 2013;2:e00790 http://elifesciences.org/content/2/e00790#sthash.LnqVMJ9p.dpuf

LysM effector


http://img.scoop.it/ZniCRKQSvJOG18fHbb4p0Tl72eJkfbmt4t8yenImKBVvK0kTmF0xjctABnaLJIm9

While host immune receptors

  • detect pathogen-associated molecular patterns to activate immunity,
  • pathogens attempt to deregulate host immunity through secreted effectors.

Fungi employ LysM effectors to prevent

  • recognition of cell wall-derived chitin by host immune receptors

Structural analysis of the LysM effector Ecp6 of

  • the fungal tomato pathogen Cladosporium fulvum reveals
  • a novel mechanism for chitin binding,
  • mediated by intrachain LysM dimerization,

leading to a chitin-binding groove that is deeply buried in the effector protein.

This composite binding site involves

  • two of the three LysMs of Ecp6 and
  • mediates chitin binding with ultra-high (pM) affinity.

The remaining singular LysM domain of Ecp6 binds chitin with

  • low micromolar affinity but can nevertheless still perturb chitin-triggered immunity.

Conceivably, the perturbation by this LysM domain is not established through chitin sequestration but possibly through interference with the host immune receptor complex.

Mutated Genes in Schizophrenia Map to Brain Networks
From www.nih.gov –  Sep 3, 2013

Previous studies have shown that many people with schizophrenia have de novo, or new, genetic mutations. These misspellings in a gene’s DNA sequence

  • occur spontaneously and so aren’t shared by their close relatives.

Dr. Mary-Claire King of the University of Washington in Seattle and colleagues set out to

  • identify spontaneous genetic mutations in people with schizophrenia and
  • to assess where and when in the brain these misspelled genes are turned on, or expressed.

The study was funded in part by NIH’s National Institute of Mental Health (NIMH). The results were published in the August 1, 2013, issue of Cell.

The researchers sequenced the exomes (protein-coding DNA regions) of 399 people—105 with schizophrenia plus their unaffected parents and siblings. Gene variations
that were found in a person with schizophrenia but not in either parent were considered spontaneous.
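The criterion described above (a variant present in the affected child but absent from both parents) can be expressed as a simple trio filter. The sketch below is a bare-bones illustration of that logic only, not the study's actual variant-calling pipeline, which also applies read-depth, genotype-quality, and population-frequency filters.

```python
# Minimal sketch of trio-based de novo (spontaneous) variant calling:
# a variant is a candidate de novo mutation if the child carries an allele
# seen in neither parent. Genotypes are simplified to tuples of alleles.
def is_de_novo(child_alleles, mother_alleles, father_alleles):
    """Return True if the child has an allele absent from both parents."""
    inherited = set(mother_alleles) | set(father_alleles)
    return any(allele not in inherited for allele in child_alleles)

# Example: child is A/G at a site where both parents are A/A.
print(is_de_novo(("A", "G"), ("A", "A"), ("A", "A")))  # True  -> candidate de novo mutation
print(is_de_novo(("A", "G"), ("A", "G"), ("A", "A")))  # False -> allele inherited from mother
```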

The likelihood of having a spontaneous mutation was associated with

  • the age of the father in both affected and unaffected siblings.

Significantly more mutations were found in people

  • whose fathers were 33-45 years at the time of conception compared to 19-28 years.

Among people with schizophrenia, the scientists identified

  • 54 genes with spontaneous mutations
  • predicted to cause damage to the function of the protein they encode.

The researchers used newly available database resources that show

  • where in the brain and when during development genes are expressed.

The genes form an interconnected expression network with many more connections than

  • that of the genes with spontaneous damaging mutations in unaffected siblings.

The spontaneously mutated genes in people with schizophrenia

  • were expressed in the prefrontal cortex, a region in the front of the brain.

The genes are known to be involved in important pathways in brain development. Fifty of these genes were active

  • mainly during the period of fetal development.

“Processes critical for the brain’s development can be revealed by the mutations that disrupt them,” King says. “Mutations can lead to loss of integrity of a whole pathway,
not just of a single gene.”

These findings support the concept that schizophrenia may result, in part, from

  • disruptions in development in the prefrontal cortex during fetal development.

James E. Darnell’s “Reflections”

A brief history of the discovery of RNA and its role in transcription — peppered with career advice
By Joseph P. Tiano

James Darnell begins his Journal of Biological Chemistry “Reflections” article by saying, “graduate students these days

  • have to swim in a sea virtually turgid with the daily avalanche of new information and
  • may be momentarily too overwhelmed to listen to the aging.

I firmly believe how we learned what we know can provide useful guidance for how and what a newcomer will learn.” Considering his remarkable discoveries in

  • RNA processing and eukaryotic transcriptional regulation

spanning 60 years of research, Darnell’s advice should be cherished. In his second year at medical school at Washington University School of Medicine in St. Louis, while
studying streptococcal disease in Robert J. Glaser’s laboratory, Darnell realized he “loved doing the experiments” and had his first “career advancement event.”
He and technician Barbara Pesch discovered that in vivo penicillin treatment killed streptococci only in the exponential growth phase and not in the stationary phase. These
results were published in the Journal of Clinical Investigation and earned Darnell an interview with Harry Eagle at the National Institutes of Health.

Darnell arrived at the NIH in 1956, shortly after Eagle shifted his research interest to developing his minimal essential cell culture medium, which is still used today. Eagle, then studying cell metabolism, suggested that Darnell take up a side project on poliovirus replication in mammalian cells in collaboration with Robert I. DeMars. DeMars’ Ph.D. adviser was also James Watson’s mentor, so Darnell met Watson, who invited him to give a talk at Harvard University, which led to an assistant professor position at MIT under Salvador Luria. A take-home message is to embrace side projects, because you never know where they may lead: this project helped to shape his career.

Darnell arrived in Boston in 1961. Following the discovery of DNA’s structure in 1953, the world of molecular biology was turning to RNA in an effort to understand how
proteins are made. Darnell’s background in virology (it was discovered in 1960 that viruses used RNA to replicate) was ideal for the aim of his first independent lab:
exploring mRNA in animal cells grown in culture. While at MIT, he developed a new technique for purifying RNA along with making other observations

  • suggesting that nonribosomal cytoplasmic RNA may be involved in protein synthesis.

When Darnell moved to Albert Einstein College of Medicine for a full professorship in 1964, it was hypothesized that heterogeneous nuclear RNA was a precursor to mRNA.
At Einstein, Darnell discovered RNA processing of pre-tRNAs and demonstrated for the first time

  • that a specific nuclear RNA could represent a possible specific mRNA precursor.

In 1967 Darnell took a position at Columbia University, and it was there that he discovered (simultaneously with two other labs) that

  • mRNA contained a polyadenosine tail.

The three groups all published their results together in the Proceedings of the National Academy of Sciences in 1971. Shortly afterward, Darnell made his final career move
four short miles down the street to Rockefeller University in 1974.

Over the next 35-plus years at Rockefeller, Darnell never strayed from his original research question: How do mammalian cells make and control the making of different
mRNAs? His work was instrumental in the collaborative discovery of

  • splicing in the late 1970s and
  • in identifying and cloning many transcriptional activators.

Perhaps his greatest contribution during this time, with the help of Ernest Knight, was

  • the discovery and cloning of the signal transducers and activators of transcription (STAT) proteins.

And with George Stark, Andy Wilks and John Krowlewski, he described

  • cytokine signaling via the JAK-STAT pathway.

Darnell closes his “Reflections” with perhaps his best advice: Do not get too wrapped up in your own work, because “we are all needed and we are all in this together.”

Darnell Reflections - James_Darnell


http://www.asbmb.org/assets/0/366/418/428/85528/85529/85530/8758cb87-84ff-42d6-8aea-96fda4031a1b.jpg

Recent findings on presenilins and signal peptide peptidase

By Dinu-Valantin Bălănescu

γ-secretase and SPP


Fig. 1 from the minireview shows a schematic depiction of γ-secretase and SPP

http://www.asbmb.org/assets/0/366/418/428/85528/85529/85530/c2de032a-daad-41e5-ba19-87a17bd26362.png

GxGD proteases are a family of intramembranous enzymes capable of hydrolyzing

  • the transmembrane domain of some integral membrane proteins.

The GxGD family is one of the three families of

  • intramembrane-cleaving proteases discovered so far (along with the rhomboid and site-2 protease) and
  • includes the γ-secretase and the signal peptide peptidase.

Although only recently discovered, a number of functions in human pathology and in numerous other biological processes

  • have been attributed to γ-secretase and SPP.

Taisuke Tomita and Takeshi Iwatsubo of the University of Tokyo highlighted the latest findings on the structure and function of γ-secretase and SPP
in a recent minireview in The Journal of Biological Chemistry.

  • γ-secretase is involved in cleaving the amyloid-β precursor protein, thus producing amyloid-β peptide,

the main component of senile plaques in Alzheimer’s disease patients’ brains. The complete structure of mammalian γ-secretase is not yet known; however,
Tomita and Iwatsubo note that biochemical analyses have revealed it to be a multisubunit protein complex.

  • Its catalytic subunit is presenilin, an aspartyl protease.

In vitro and in vivo functional and chemical biology analyses have revealed that

  • presenilin is a modulator and mandatory component of the γ-secretase–mediated cleavage of APP.

Genetic studies have identified three other components required for γ-secretase activity:

  1. nicastrin,
  2. anterior pharynx defective 1 and
  3. presenilin enhancer 2.

By coexpression of presenilin with the other three components, the authors managed to

  • reconstitute γ-secretase activity.

Using the substituted cysteine accessibility method and topological analyses, Tomita and Iwatsubo determined that

  • the catalytic aspartates are located at the center of the nine transmembrane domains of presenilin,
  • thereby revealing the exact location of the enzyme’s catalytic site.

The minireview also describes in detail the formerly enigmatic mechanism of γ-secretase mediated cleavage.

SPP, an enzyme that cleaves remnant signal peptides in the membrane

  • during the biogenesis of membrane proteins and
  • signal peptides from major histocompatibility complex type I,
  • also is involved in the maturation of proteins of the hepatitis C virus and GB virus B.

Bioinformatics methods have revealed in fruit flies and mammals four SPP-like proteins,

  • two of which are involved in immunological processes.

By using γ-secretase inhibitors and modulators, it has been confirmed

  • that SPP shares a similar GxGD active site and proteolytic activity with γ-secretase.

Upon purification of the human SPP protein with the baculovirus/Sf9 cell system,

  • single-particle analysis revealed further structural and functional details.

HLA targeting efficiency correlates with human T-cell response magnitude and with mortality from influenza A infection

From www.pnas.org –  Sep 3, 2013 4:24 PM

Experimental and computational evidence suggests that

  • HLAs preferentially bind conserved regions of viral proteins, a concept we term “targeting efficiency,” and that
  • this preference may provide improved clearance of infection in several viral systems.

To test this hypothesis, T-cell responses to A/H1N1 (2009) were measured from peripheral blood mononuclear cells obtained from a household cohort study
performed during the 2009–2010 influenza season. We found that HLA targeting efficiency scores significantly correlated with

  • IFN-γ enzyme-linked immunosorbent spot responses (P = 0.042, multiple regression).

A further population-based analysis found that the carriage frequencies of the alleles with the lowest targeting efficiencies, A*24,

  • were associated with pH1N1 mortality (r = 0.37, P = 0.031) and
  • are common in certain indigenous populations in which increased pH1N1 morbidity has been reported.

HLA efficiency scores and HLA use are associated with CD8 T-cell magnitude in humans after influenza infection.
The computational tools used in this study may be useful predictors of potential morbidity and

  • identify immunologic differences of new variant influenza strains
  • more accurately than evolutionary sequence comparisons.

Population-based studies of the relative frequency of these alleles in severe vs. mild influenza cases

  • might advance clinical practices for severe H1N1 infections among genetically susceptible populations.
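For readers who want to see the shape of the population-level analysis reported above (carriage frequency of the low-efficiency allele versus pH1N1 mortality, r = 0.37), here is a toy correlation in Python. The numbers are invented purely for illustration, and the study's ELISpot analysis used multiple regression rather than a simple Pearson correlation.

```python
# Toy illustration of a carriage-frequency vs. mortality correlation; values are hypothetical.
from scipy.stats import pearsonr

allele_carriage_freq = [0.05, 0.12, 0.22, 0.30, 0.41, 0.18, 0.27]  # hypothetical A*24 carriage by population
mortality_rate       = [0.8,  1.1,  1.9,  2.4,  3.0,  1.5,  2.1]   # hypothetical pH1N1 deaths per 100,000
r, p = pearsonr(allele_carriage_freq, mortality_rate)
print(f"r = {r:.2f}, P = {p:.3f}")
```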

Metabolomics in drug target discovery

J D Rabinowitz et al.

Lewis-Sigler Institute for Integrative Genomics, Princeton University, Princeton, NJ.
Cold Spring Harbor Symposia on Quantitative Biology 11/2011; 76:235-46.
http://dx.doi.org:/10.1101/sqb.2011.76.010694 

Most diseases result in metabolic changes. In many cases, these changes play a causative role in disease progression. By identifying pathological metabolic changes,

  • metabolomics can point to potential new sites for therapeutic intervention.

Particularly promising enzymatic targets are those that

  • carry increased flux in the disease state.

Definitive assessment of flux requires the use of isotope tracers. Here we present techniques for

  • finding new drug targets using metabolomics and isotope tracers.

The utility of these methods is exemplified in the study of three different viral pathogens. For influenza A and herpes simplex virus,

  • metabolomic analysis of infected versus mock-infected cells revealed
  • dramatic concentration changes around the current antiviral target enzymes.

Similar analysis of human-cytomegalovirus-infected cells, however, found the greatest changes

  • in a region of metabolism unrelated to the current antiviral target.

Instead, it pointed to the tricarboxylic acid (TCA) cycle and

  • its efflux to feed fatty acid biosynthesis as a potential preferred target.

Isotope tracer studies revealed that cytomegalovirus greatly increases flux through

  • the key fatty acid metabolic enzyme acetyl-coenzyme A carboxylase.
  • Inhibition of this enzyme blocks human cytomegalovirus replication.

Examples where metabolomics has contributed to identification of anticancer drug targets are also discussed. Eventual proof of the value of

  • metabolomics as a drug target discovery strategy will be
  • successful clinical development of therapeutics hitting these new targets.
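A minimal sketch of the infected-versus-mock metabolomic comparison described above: compute per-metabolite log2 fold changes and a Welch t-test to flag the "dramatic concentration changes" around candidate target enzymes. The metabolite names and intensities below are simulated, not data from the paper.

```python
# Hedged sketch of an infected vs. mock-infected metabolomic comparison; data simulated.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)
metabolites = ["citrate", "malate", "acetyl-CoA", "UDP-glucose"]    # hypothetical panel
mock = rng.lognormal(mean=1.0, sigma=0.2, size=(4, len(metabolites)))        # 4 mock replicates
infected = mock * np.array([1.1, 1.0, 3.5, 0.9]) * rng.lognormal(
    sigma=0.2, size=(4, len(metabolites)))                                    # 4 infected replicates

for j, name in enumerate(metabolites):
    log2_fc = np.log2(infected[:, j].mean() / mock[:, j].mean())
    t, p = ttest_ind(infected[:, j], mock[:, j], equal_var=False)             # Welch's t-test
    print(f"{name:12s} log2FC = {log2_fc:+.2f}  P = {p:.3f}")
```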

 Related References

Use of metabolic pathway flux information in targeted cancer drug design. Drug Discovery Today: Therapeutic Strategies 1:435-443, 2004.

Detection of resistance to imatinib by metabolic profiling: clinical and drug development implications. Am J Pharmacogenomics. 2005;5(5):293-302. Review. PMID: 16196499

Medicinal chemistry, metabolic profiling and drug target discovery: a role for metabolic profiling in reverse pharmacology and chemical genetics.
Mini Rev Med Chem.  2005 Jan;5(1):13-20. Review. PMID: 15638788 [PubMed – indexed for MEDLINE] Related citations

Development of Tracer-Based Metabolomics and its Implications for the Pharmaceutical Industry. Int J Pharm Med 2007; 21 (3): 217-224.

Use of metabolic pathway flux information in anticancer drug design. Ernst Schering Found Symp Proc. 2007;(4):189-203. Review. PMID: 18811058

Pharmacological targeting of glucagon and glucagon-like peptide 1 receptors has different effects on energy state and glucose homeostasis in diet-induced obese mice. J Pharmacol Exp Ther. 2011 Jul;338(1):70-81. http://dx.doi.org:/10.1124/jpet.111.179986. PMID: 21471191

Single valproic acid treatment inhibits glycogen and RNA ribose turnover while disrupting glucose-derived cholesterol synthesis in liver as revealed by the
[U-C(6)]-d-glucose tracer in mice. Metabolomics. 2009 Sep;5(3):336-345. PMID: 19718458

Metabolic Pathways as Targets for Drug Screening, Metabolomics, Dr Ute Roessner (Ed.), ISBN: 978-953-51-0046-1, InTech, Available from: http://www.intechopen.com/books/metabolomics/metabolic-pathways-as-targets-for-drug-screening

Iron regulates glucose homeostasis in liver and muscle via AMP-activated protein kinase in mice. FASEB J. 2013 Jul;27(7):2845-54.
http://dx.doi.org:/10.1096/fj.12-216929. PMID: 23515442

Metabolomics and systems pharmacology: why and how to model the human metabolic network for drug discovery

Drug Discov. Today 19 (2014), 171–182     http://dx.doi.org:/10.1016/j.drudis.2013.07.014

Highlights

  • We now have metabolic network models; the metabolome is represented by their nodes.
  • Metabolite levels are sensitive to changes in enzyme activities.
  • Drugs hitchhike on metabolite transporters to get into and out of cells.
  • The consensus network Recon2 represents the present state of the art, and has predictive power.
  • Constraint-based modelling relates network structure to metabolic fluxes.

Metabolism represents the ‘sharp end’ of systems biology, because changes in metabolite concentrations are

  • necessarily amplified relative to changes in the transcriptome, proteome and enzyme activities, which can be modulated by drugs.

To understand such behaviour, we therefore need (and increasingly have) reliable consensus (community) models of

  • the human metabolic network that include the important transporters.

Small molecule ‘drug’ transporters are in fact metabolite transporters, because

  • drugs bear structural similarities to metabolites known from the network reconstructions and
  • from measurements of the metabolome.
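As a hedged illustration of the point just made, that drugs bear structural similarities to metabolites, one can compare a drug and an endogenous metabolite using the Tanimoto similarity of Morgan fingerprints. This uses the RDKit library and is not part of the article; the molecule pair below (metformin versus arginine) is chosen arbitrarily.

```python
# Illustrative drug-metabolite similarity via Morgan fingerprints (requires RDKit).
from rdkit import Chem
from rdkit.Chem import AllChem, DataStructs

drug = Chem.MolFromSmiles("CN(C)C(=N)NC(=N)N")                 # metformin
metabolite = Chem.MolFromSmiles("NC(=N)NCCC[C@H](N)C(=O)O")    # arginine
fp1 = AllChem.GetMorganFingerprintAsBitVect(drug, 2, nBits=2048)
fp2 = AllChem.GetMorganFingerprintAsBitVect(metabolite, 2, nBits=2048)
print("Tanimoto similarity:", DataStructs.TanimotoSimilarity(fp1, fp2))
```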

Recon2 represents the present state-of-the-art human metabolic network reconstruction; it can predict inter alia:

(i) the effects of inborn errors of metabolism;

(ii) which metabolites are exometabolites, and

(iii) how metabolism varies between tissues and cellular compartments.

However, even these qualitative network models are not yet complete. As our understanding improves

  • so do we recognise more clearly the need for a systems (poly)pharmacology.
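The Highlights above note that constraint-based modelling relates network structure to metabolic fluxes. The toy flux-balance sketch below shows the principle on a three-metabolite network, not on Recon2: maximise an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. The reactions, bounds, and objective are invented for illustration.

```python
# Minimal flux balance analysis (constraint-based modelling) on a toy network.
import numpy as np
from scipy.optimize import linprog

# Reactions: R1: uptake -> A, R2: A -> B, R3: B -> biomass (objective), R4: A -> C, R5: C -> export
S = np.array([
    [ 1, -1,  0, -1,  0],   # metabolite A
    [ 0,  1, -1,  0,  0],   # metabolite B
    [ 0,  0,  0,  1, -1],   # metabolite C
])
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000), (0, 1000)]   # uptake capped at 10
c = np.zeros(5)
c[2] = -1.0                  # linprog minimises, so maximise v3 by minimising -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")
print("optimal biomass flux:", -res.fun)   # expected 10: all uptake routed A -> B -> biomass
```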

Introduction – a systems biology approach to drug discovery

It is clearly not news that the productivity of the pharmaceutical industry has declined significantly during recent years,

  • following an ‘inverse Moore’s Law’ (Eroom’s Law), or
  • that many commentators consider that the main cause of this is
  • an excessive focus on individual molecular target discovery rather than a more sensible strategy
  • based on a systems-level approach (Fig. 1).

Figure 1.

The change in drug discovery strategy from ‘classical’ function-first approaches (in which the assay of drug function was at the tissue or organism level),
with mechanistic studies potentially coming later, to more-recent target-based approaches where initial assays usually involve assessing the interactions
of drugs with specified (and often cloned, recombinant) proteins in vitro. In the latter cases, effects in vivo are assessed later, with concomitantly high levels of attrition.

Arguably the two chief hallmarks of the systems biology approach are:

(i) that we seek to make mathematical models of our systems iteratively or in parallel with well-designed ‘wet’ experiments, and
(ii) that we do not necessarily start with a hypothesis but measure as many things as possible (the ’omes) and

  • let the data tell us the hypothesis that best fits and describes them.

Although metabolism was once seen as something of a Cinderella subject,

  • there are fundamental reasons to do with the organisation of biochemical networks as
  • to why the metabol(om)ic level – now in fact seen as the ‘apogee’ of the ’omics trilogy –
  •  is indeed likely to be far more discriminating than are
  • changes in the transcriptome or proteome.

The next two subsections deal with these points and Fig. 2 summarises the paper in the form of a Mind Map.

metabolomics and systems pharmacology


http://ars.els-cdn.com/content/image/1-s2.0-S1359644613002481-gr2.jpg

Metabolic Disease Drug Discovery— “Hitting the Target” Is Easier Said Than Done

David E. Moller, et al.   http://dx.doi.org:/10.1016/j.cmet.2011.10.012

Despite the advent of new drug classes, the global epidemic of cardiometabolic disease has not abated. Continuing

  • unmet medical needs remain a major driver for new research.

Drug discovery approaches in this field have mirrored industry trends, leading to a recent

  • increase in the number of molecules entering development.

However, worrisome trends and newer hurdles are also apparent. The history of two newer drug classes—

  1. glucagon-like peptide-1 receptor agonists and
  2. dipeptidyl peptidase-4 inhibitors—

illustrates both progress and challenges. Future success requires that researchers learn from these experiences and

  • continue to explore and apply new technology platforms and research paradigms.

The global epidemic of obesity and diabetes continues to progress relentlessly. The International Diabetes Federation predicts an even greater diabetes burden (>430 million people afflicted) by 2030, which will disproportionately affect developing nations (International Diabetes Federation, 2011). Yet

  • existing drug classes for diabetes, obesity, and comorbid cardiovascular (CV) conditions have substantial limitations.

Currently available prescription drugs for treatment of hyperglycemia in patients with type 2 diabetes (Table 1) have notable shortcomings. In general,

Therefore, clinicians must often use combination therapy, adding additional agents over time. Ultimately many patients will need to use insulin—a therapeutic class first introduced in 1922. Most existing agents also have

  • issues around safety and tolerability as well as dosing convenience (which can impact patient compliance).

Pharmacometabolomics, also known as pharmacometabonomics, is a field which stems from metabolomics,

  • the quantification and analysis of metabolites produced by the body.

It refers to the direct measurement of metabolites in an individual’s bodily fluids, in order to

  • predict or evaluate the metabolism of pharmaceutical compounds, and
  • to better understand the pharmacokinetic profile of a drug.

Alternatively, pharmacometabolomics can be applied to measure metabolite levels

  • following the administration of a pharmaceutical compound, in order to
  • monitor the effects of the compound on certain metabolic pathways (pharmacodynamics).

This provides detailed mapping of drug effects on metabolism and

  • the pathways that are implicated in the mechanisms underlying variation in response to treatment.

In addition, the metabolic profile of an individual at baseline (metabotype) provides information about

  • how individuals respond to treatment and highlights heterogeneity within a disease state.

All three approaches require the quantification of metabolites found in bodily fluids.

relationship between -OMICS

http://upload.wikimedia.org/wikipedia/commons/thumb/e/eb/OMICS.png/350px-OMICS.png

Pharmacometabolomics is thought to provide information that complements the other omics levels (genomics, transcriptomics, and proteomics).

Looking at the characteristics of an individual down through these different levels of detail, there is an

  • increasingly more accurate prediction of a person’s ability to respond to a pharmaceutical compound.
  1. the genome, made up of approximately 25,000 genes, can indicate possible errors in drug metabolism;
  2. the transcriptome, made up of 85,000 transcripts, can provide information about which genes important in metabolism are being actively transcribed;
  3. and the proteome, >10,000,000 members, depicts which proteins are active in the body to carry out these functions.

Pharmacometabolomics complements the omics with

  • direct measurement of the products of all of these reactions, but with perhaps a relatively
  • smaller number of members: initially projected to be approximately 2,200 metabolites,

but possibly a larger number when gut-derived metabolites and xenobiotics are added to the list. Overall, the goal of pharmacometabolomics is

  • to more closely predict or assess the response of an individual to a pharmaceutical compound,
  • permitting continued treatment with the right drug or dosage
  • depending on the variations in their metabolism and ability to respond to treatment (a minimal prediction sketch follows below).
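As a purely illustrative sketch of how a baseline metabotype might be used to predict treatment response (the metabolite names, file name, and outcome column below are hypothetical, not taken from any study cited here), a simple cross-validated classifier could be set up as follows:

```python
# Hedged, minimal sketch: classify treatment responders from baseline
# metabolite concentrations. All column names and the CSV file are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

data = pd.read_csv("baseline_metabotypes.csv")            # one row per patient
X = data[["glycine", "serotonin", "2_hydroxybutyrate"]]   # example metabolites
y = data["responder"]                                     # 1 = responded to the drug

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
print(f"Cross-validated AUC: {auc:.2f}")
```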

Pharmacometabolomic analyses, through the use of a metabolomics approach,

  • can provide a comprehensive and detailed metabolic profile or “metabolic fingerprint” for an individual patient.

Such metabolic profiles can provide a complete overview of individual metabolite or pathway alterations.

This approach can then be applied to the prediction of response to a pharmaceutical compound

  • by patients with a particular metabolic profile.

Pharmacometabolomic analyses of drug response are closely linked to pharmacogenetic analyses.

Pharmacogenetics focuses on the identification of genetic variations (e.g. single-nucleotide polymorphisms)

  • within patients that may contribute to altered drug responses and overall outcome of a certain treatment.

The results of pharmacometabolomics analyses can act to “inform” or “direct”

  • pharmacogenetic analyses by correlating aberrant metabolite concentrations or metabolic pathways to potential alterations at the genetic level.

This concept has been established with two seminal publications from studies of antidepressants (selective serotonin reuptake inhibitors),

  • where metabolic signatures were able to define a pathway implicated in response to the antidepressant and
  • that led to the identification of genetic variants within a key gene
  • within the highlighted pathway as being implicated in variation in response.

These genetic variants were not identified through genetic analysis alone and hence

  • illustrated how metabolomics can guide and inform genetic data.

en.wikipedia.org/wiki/Pharmacometabolomics
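To illustrate the "metabolomics informing pharmacogenetics" idea described above, here is a minimal, hypothetical sketch (the column names and file are invented): a response-associated metabolite is tested for association with genotype at a candidate SNP.

```python
# Minimal sketch, not the published analysis: does a metabolite implicated in
# drug response differ by genotype at a candidate SNP? Columns are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("patients.csv")            # columns: glycine_level, snp_genotype
groups = [g["glycine_level"].values for _, g in df.groupby("snp_genotype")]
f_stat, p_value = stats.f_oneway(*groups)   # one-way ANOVA across genotype groups
print(f"ANOVA across genotypes: F = {f_stat:.2f}, p = {p_value:.3g}")
```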

Benznidazole Biotransformation and Multiple Targets in Trypanosoma cruzi Revealed by Metabolomics

Andrea Trochine, Darren J. Creek, Paula Faral-Tello, Michael P. Barrett, Carlos Robello
Published: May 22, 2014   http://dx.doi.org/10.1371/journal.pntd.0002844

The first line treatment for Chagas disease, a neglected tropical disease caused by the protozoan parasite Trypanosoma cruzi,

  • involves administration of benznidazole (Bzn).

Bzn is a 2-nitroimidazole pro-drug which requires nitroreduction to become active. We used a

  • non-targeted MS-based metabolomics approach to study the metabolic response of T. cruzi to Bzn.

Parasites treated with Bzn were minimally altered compared to untreated trypanosomes, although the redox active thiols

  1. trypanothione,
  2. homotrypanothione and
  3. cysteine

were significantly diminished in abundance post-treatment. In addition, multiple Bzn-derived metabolites were detected after treatment.

These metabolites included reduction products, fragments and covalent adducts of reduced Bzn

  • linked to each of the major low molecular weight thiols:
  1. trypanothione,
  2. glutathione,
  3. g-glutamylcysteine,
  4. glutathionylspermidine,
  5. cysteine and
  6. ovothiol A.

Bzn products known to be generated in vitro by the unusual trypanosomal nitroreductase, TcNTR-1,

  • were found within the parasites,
  • but low molecular weight adducts of glyoxal, a proposed toxic end-product of NTR-1 Bzn metabolism, were not detected.

Our data is indicative of a major role of the

  • thiol binding capacity of Bzn reduction products
  • in the mechanism of Bzn toxicity against T. cruzi.
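For readers unfamiliar with non-targeted MS metabolomics, the adduct search reported above boils down to matching observed m/z values against candidate masses within a tight tolerance. The sketch below uses placeholder masses and peaks (not values from the study) purely to show the mechanics.

```python
# Minimal sketch: flag observed m/z peaks that match candidate drug-thiol adduct
# masses within a ppm tolerance. The masses and peaks below are placeholders.
CANDIDATES = {
    "reduced_Bzn_fragment": 231.09,    # hypothetical monoisotopic mass
    "Bzn_glutathione_adduct": 538.17,  # hypothetical
}
observed_mz = [231.0912, 305.2210, 538.1685]
PPM_TOL = 10.0

for name, mass in CANDIDATES.items():
    for mz in observed_mz:
        ppm = abs(mz - mass) / mass * 1e6
        if ppm <= PPM_TOL:
            print(f"{name}: matched peak {mz} ({ppm:.1f} ppm)")
```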

 

 

Read Full Post »



Reporter: Aviva Lev-Ari, PhD, RN

New Frontiers in Gene Editing

Transitioning From the Lab to the Clinic

February 19-20, 2015 | The InterContinental San Francisco | San Francisco, CA
Part of the 22nd International Molecular Medicine Tri-Conference

 

Gene editing is rapidly progressing from being a research/screening tool to one that promises important applications downstream in drug development and cell therapy. Cambridge Healthtech Institute’s inaugural symposium on New Frontiers in Gene Editing will bring together experts from all aspects of basic science and clinical research to talk about how and where gene editing can be best applied. What are the different tools that can be used for gene editing, and what are their strengths and limitations? How does the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)/Cas system compare to Transcription Activator-like Effector Nucleases (TALENs), zinc finger nucleases (ZFNs) and other systems, and where are they being used? Scientists and clinicians from pharma/biotech as well as from academic and government labs will share their experiences leveraging the utility of gene editing for functional screening, creating cell lines and knock-outs for disease modeling, and for cell therapy.



Thursday, February 19

7:30 am Registration and Morning Coffee

 

USING GENE EDITING FOR FUNCTIONAL SCREENS

9:00 Chairperson’s Opening Remarks

Joseph C. Wu, M.D., Ph.D., Director, Stanford Cardiovascular Institute and Professor, Department of Medicine/Cardiology & Radiology, Stanford University School of Medicine

 

9:10 KEYNOTE PRESENTATION:

Genome Edited Induced Pluripotent Stem Cells (iPSCs) for Drug Screening

Joseph C. Wu, M.D., Ph.D., Director, Stanford Cardiovascular Institute and Professor, Department of Medicine/Cardiology & Radiology, Stanford University School of Medicine

Dr. Wu’s lab is focusing on human iPSCs for cardiac disease modeling, drug discovery, and regenerative medicine. We have been using ZFN, TALEN, and CRISPR to create isogenic iPSC lines that carry various cardiovascular diseases (e.g., LQT, HCM, DCM) as well as reporter genes for in vitro and in vivo tracking. We are also using this approach for improving the efficiency of high throughput drug screening.

 

9:40 Exploration of Cellular Stress and Trafficking Pathways Using shRNA and CRISPR/Cas9-Based Systems

Michael Bassik, Ph.D., Assistant Professor, Department of Genetics, Stanford University

We have developed high-complexity shRNA libraries (25 shRNAs/gene) that greatly reduce false negatives/false positives for RNAi screens, and have adapted these libraries to knock down gene pairs to perform systematic genetic interaction maps in mammalian cells. We have used these maps to study ER-trafficking toxins, identify novel protein complexes, and gain insights into retrograde trafficking. We are using this strategy together with the CRISPR/Cas9 system for functional genomics efforts and identification of novel drug targets.

10:10 Gene Editing in Patient-Derived Stem Cells for in vitro Modeling of Parkinson’s Disease

Birgitt Schuele, M.D., Associate Professor and Director of Gene Discovery and Stem Cell Modeling, The Parkinson’s Institute

Recent development of “genome editing” technologies to introduce site-specific genome modifications in disease relevant genes lay the foundation for new approaches to understand direct genotype-phenotype correlations at the molecular level in human disease. With the introduction of next-generation sequencing, many new genetic variants have been identified in Parkinson’s related genes; however, it is currently challenging to interrogate their functional relevance. Human-derived genome edited cell lines will be a way to analyze variants in a high-throughput format.

10:40 Coffee Break with Exhibit and Poster Viewing

 

INNOVATIVE TOOLS FOR SCREENING & DELIVERY

11:15 Massively Parallel Combinatorial Genetics to Overcome Drug Resistance in Bacterial Infections and Cancer

Timothy K. Lu, M.D., Ph.D., Associate Professor, Synthetic Biology Group, Department of Electrical Engineering and Computer Science and Department of Biological Engineering, Synthetic Biology Center, Massachusetts Institute of Technology

Complex biological phenotypes can result from the interplay of multiple genetic factors but deciphering the multifactorial genotypes that underlie these phenotypes is challenging. We have developed technologies for the scalable and barcoded assembly of high-order combinatorial genetic libraries. These strategies enable multiplexed tracking of individual genetic combinations with next-generation sequencing in pooled screens. We have used these technologies to perform massively parallel combinatorial genetics in bacteria and human cells and to modulate relevant phenotype.

11:45 Nucleic Acid Delivery Systems for RNA Therapy and Gene Editing

Daniel G. Anderson, Ph.D., Professor, Department of Chemical Engineering, Institute for Medical Engineering & Science, Harvard-MIT Division of Health Sciences & Technology and David H. Koch Institute for Integrative Cancer Research, Massachusetts Institute of Technology

High throughput, combinatorial approaches have revolutionized small molecule drug discovery. Here we describe our work on high throughput methods for developing and characterizing RNA delivery and gene editing systems. Libraries of degradable polymers and lipid-like materials have been synthesized, formulated and screened for their ability to delivery RNA, both in vitro and in vivo. A number of delivery formulations have been developed with in vivo efficacy, and show potential therapeutic application for the treatment of genetic disease, viral infection, and cancer.

12:15 pm Sponsored Presentation (Opportunity Available)

12:30 Session Break

12:40 Luncheon Presentation (Sponsored by Cellecta)

Speaker to be Announced

1:15 Session Break

 

TRANSLATING GENE EDITING IN VIVO

1:50 Chairperson’s Remarks

Eric N. Olson, Ph.D., Professor and Chairman, Department of Molecular Biology, The University of Texas Southwestern Medical Center.

 

2:00 KEYNOTE PRESENTATION:

Preventing Muscle Disease by Genomic Editing

Eric N. Olson, Ph.D., Professor and Chairman, Department of Molecular Biology, The University of Texas Southwestern Medical Center

Duchenne muscular dystrophy (DMD) is a fatal muscle disease caused by mutations in the gene encoding dystrophin, a protein required for muscle fiber integrity. We used CRISPR/Cas9-mediated genome editing to correct the dystrophin gene (Dmd) mutation in the germline of mdx mice, a model for DMD. The degree of muscle phenotypic rescue in mosaic mice exceeded the efficiency of gene correction, likely reflecting the progressive contribution of corrected cells to regenerating muscle. Progress toward the correction of DMD in adult myofibers will be discussed.

 

2:30 CRISPR-Cas: Tools and Applications for Genome Editing

Fei Ann Ran, Ph.D., Post-Doctoral Fellow, Laboratory of Dr. Feng Zhang, Broad Institute and Junior Fellow, Harvard Society of Fellows

Recently, the Cas9 nuclease from the bacterial CRISPR (clustered regularly interspaced short palindromic repeats) adaptive immune system has been adapted for targeted genome editing in a number of plant and animal species. Cas9 can be programmed by short guide RNAs to induce multiplexed gene knockout or homology-directed repair with robust efficiency. We have further identified additional small Cas9 orthologs that can be delivered by adeno-associated virus for effective gene modification of somatic tissues in vivo.
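As background to how Cas9 is "programmed" by a short guide RNA: SpCas9 targeting requires a roughly 20-nt protospacer immediately followed by an NGG PAM in the target DNA. A minimal sketch of enumerating such candidate sites (the sequence below is hypothetical) might look like this:

```python
# Minimal sketch: list candidate SpCas9 target sites (20-nt protospacer + NGG PAM)
# on the given strand of a hypothetical DNA sequence.
import re

seq = "ATGCGTACCGGTTGACCTGAGGAGCTGTTCACCGGGGTGGTGCCCATCCTGG"

def find_sgrna_sites(dna):
    """Return (start, protospacer, PAM) for every 20-mer followed by NGG."""
    return [(m.start(), m.group(1), m.group(2))
            for m in re.finditer(r"(?=([ACGT]{20})([ACGT]GG))", dna)]

for start, protospacer, pam in find_sgrna_sites(seq):
    print(start, protospacer, pam)
```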

3:00 Refreshment Break with Exhibit and Poster Viewing

3:30 Anti-HIV Therapies: Genome Engineering the Virus and the Host

Paula M. Cannon, Ph.D., Associate Professor, Molecular Microbiology & Immunology, Biochemistry, and Pediatrics, Keck School of Medicine, University of Southern California

By taking advantage of cellular repair pathways, targeted nucleases such as zinc finger nucleases (ZFNs) can be used to achieve precise gene knockout, gene editing, or gene addition. For anti-HIV applications, nucleases can disrupt the CCR5 co-receptor gene, be used to insert anti-HIV genes at a designated site, or inactivate the viral genome that persists in infected cells. We use humanized mouse models to help us evaluate the translational potential of these different applications of targeted nuclease technologies.

4:00 Nuclease-Based Gene Correction for Treating Single Gene Disorders

Gang Bao, Ph.D., Robert A. Milton Chair Professor in Biomedical Engineering, Department of Biomedical Engineering, Georgia Institute of Technology and Emory University

We have developed a clinically applicable gene correction technology to treat sickle cell disease (SCD), which is caused by a single (A-T) mutation in the beta-globin gene. To treat SCD, we constructed TALENs and CRISPR/Cas9 systems that specifically target beta-globin gene and systematically evaluated their on- and off-target cleavage in different cells. We also quantified the nuclease-induced gene modification rates due to homologous recombination and non-homologous end joining. These studies significantly facilitated our pre-clinical investigation using mouse models.

4:30 Genome Editing for Genetic Diseases of the Blood

Matthew Porteus, M.D., Ph.D., Associate Professor, Pediatrics, Stanford University School of Medicine

A potentially ideal approach to the curative treatment of genetic blood diseases is to directly modify the hematopoietic stem cell in a precise fashion using genome editing. With the development of multiple different nuclease platforms, including zinc finger nucleases, TAL effector nucleases, and RNA-guided endonucleases of the CRISPR/Cas9 family, this can now be approached in a variety of different ways. We have focused on using this strategy for a number of different diseases and in this presentation will focus on our progress for severe combined immunodeficiency.

5:00 Close of Day



Friday, February 20

8:00 am Morning Coffee

 

EXPLORING GENE EDITING FOR THERAPEUTIC USES

8:25 Chairperson’s Remarks

Charles A. Gersbach, Ph.D., Assistant Professor, Department of Biomedical Engineering, Center for Genomic and Computational Biology, Duke University

8:30 Genome Engineering Tools for Gene Therapy and Regenerative Medicine

Charles A. Gersbach, Ph.D., Assistant Professor, Department of Biomedical Engineering, Center for Genomic and Computational Biology, Duke University

Genome engineering tools, including zinc finger proteins, TALEs, and the CRISPR/Cas9 system, can be used to both edit gene sequences and control the expression of endogenous genes for applications in medicine and basic science. For example, we have used each of these tools for gene editing to restore the expression of the dystrophin protein that is mutated in cells from Duchenne muscular dystrophy patients. In other studies, we have developed gene regulation tools that can be applied to cell reprogramming for disease modeling and regenerative medicine.

9:00 Preventing Transmission of Mitochondrial Diseases by Germline Heteroplasmic Shift Using Transcription Activator-like Effector Nucleases (TALENs)

Juan Carlos Izpisua Belmonte, Ph.D., Roger Guillemin Chair Professor, Gene Expression Laboratory, The Salk Institute for Biological Studies

Mitochondrial diseases include a group of maternally inherited genetic disorders caused by mutations in the mitochondrial DNA (mtDNA). In most of these patients, mutant mtDNA coexists with wild type mtDNA, a situation known as mtDNA heteroplasmy. Pre-implantation genetic diagnosis can only help to reduce, but not fully prevent, the transmission of mitochondrial diseases. We will report a novel strategy towards preventing germline transmission of mitochondrial diseases by induction of mtDNA heteroplasmy shift.

 

9:30 KEYNOTE PRESENTATION:

Precise Single-base Genome Engineering for Human Diagnostics and Therapy

Bruce R. Conklin, M.D., Investigator, Roddenberry Center for Stem Cell Biology and Medicine, Gladstone Institutes and Professor, Division of Genomic Medicine, University of California, San Francisco

Dr. Conklin’s research focuses on using genome engineering to find therapies for life-threatening human genetic diseases. Current projects use human induced pluripotent stem cells (iPSCs) to model cardiac, hepatic and neurological diseases. The Conklin lab has recently developed a method that significantly increases the ability to perform scarless single-base genome editing to induce or revert disease mutations. These studies allow the construction of robust in vitro human disease models, and also provide a path to cell therapy with gene-corrected cells.

 

10:00 Sponsored Presentation (Sigma)

Speaker to be Announced

10:30 Coffee Break with Exhibit and Poster Viewing

 

THERAPEUTIC LANDSCAPE:
OPPORTUNITIES & CONCERNS

11:00 Gene Editing on the Cusp of Exciting Opportunities for Human Therapeutics

Rodger Novak, M.D., CEO, CRISPR Therapeutics

Within less than two years after its inception, the CRISPR-Cas system has truly democratized genome editing, with many areas of research being transformed due to the ease of use and broad applicability of the technology. With such an enormous impact on many areas of life science, the translation of the CRISPR-Cas technology into human therapeutics seems to be a logical consequence. However, besides many exciting opportunities, a number of challenges will have to be addressed; some of them more obvious than others.

11:30 Advancing the CRISPR/Cas9 Technology Platform for Therapeutic Applications

Alexandra Glucksmann, Ph.D., COO, Editas Medicine

Genome editing technologies, including the CRISPR/Cas9 system, allow for precise and corrective molecular modifications to treat the underlying cause of genetic diseases. Key to the successful translation of CRISPR/Cas9 systems to the clinic is the optimization of the technology within the context of specific therapeutic applications. This presentation will focus on Editas Medicine’s approach to improving both activity and specificity of CRISPR/Cas9-mediated gene editing in parallel with the development of delivery solutions for therapeutic applications.

12:00 pm CRISPR/Cas-9: Navigating Intellectual Property (IP) Challenges in Gene Editing

Chelsea Loughran, Associate, Litigation Group, Wolf, Greenfield and Sacks, P.C.

Given the relative simplicity and widespread applicability of the CRISPR/Cas-9 genomic editing platform, scientists and university technology licensing officers need to be aware of the evolving IP landscape for this technology—and how it may impact research today and looking forward. We have been following the latest IP developments closely, and will discuss the IP landscape for this exciting new technology, as well as possible impacts on scientists’ ability to carry out research, and technology licensing officers’ ability to license the CRISPR/Cas-9 technology.

12:30 Close of Symposium

 

Read Full Post »


Antimalarial flow synthesis closer to commercialisation.

Read Full Post »

Metabolomics Summary and Perspective


Metabolomics Summary and Perspective

Author and Curator: Larry H Bernstein, MD, FCAP 

 

This is the final article in a robust series on metabolism, metabolomics, and  the “-OMICS-“ biological synthesis that is creating a more holistic and interoperable view of natural sciences, including the biological disciplines, climate science, physics, chemistry, toxicology, pharmacology, and pathophysiology with as yet unforeseen consequences.

There have been impressive advances already in the research into developmental biology, plant sciences, microbiology, mycology, and human diseases, most notably cancer, metabolic, infectious, and neurodegenerative diseases.

Acknowledgements:

I write this article in honor of my first mentor, Harry Maisel, Professor and Emeritus Chairman of Anatomy, Wayne State University, Detroit, MI and to my stimulating mentors, students, fellows, and associates over many years:

Masahiro Chiga, MD, PhD, Averill A Liebow, MD, Nathan O Kaplan, PhD, Johannes Everse, PhD, Norio Shioura, PhD, Abraham Braude, MD, Percy J Russell, PhD, Debby Peters, Walter D Foster, PhD, Herschel Sidransky, MD, Sherman Bloom, MD, Matthew Grisham, PhD, Christos Tsokos, PhD, IJ Good, PhD, Distinguished Professor, Raool Banagale, MD, Gustavo Reynoso, MD, Gustave Davis, MD, Marguerite M Pinto, MD, Walter Pleban, MD, Marion Feietelson-Winkler, RD, PhD, John Adan, MD, Joseph Babb, MD, Stuart Zarich, MD, Inder Mayall, MD, A Qamar, MD, Yves Ingenbleek, MD, PhD, Emeritus Professor, Bette Seamonds, PhD, Larry Kaplan, PhD, Pauline Y Lau, PhD, Gil David, PhD, Ronald Coifman, PhD, Emeritus Professor, Linda Brugler, RD, MBA, James Rucinski, MD, Gitta Pancer, Ester Engelman, Farhana Hoque, Mohammed Alam, Michael Zions, William Fleischman, MD, Salman Haq, MD, Jerard Kneifati-Hayek, Madeleine Schleffer, John F Heitner, MD, Arun Devakonda, MD, Liziamma George, MD, Suhail Raoof, MD, Charles Oribabor, MD, Anthony Tortolani, MD, Prof and Chairman, JRDS Rosalino, PhD, Aviva Lev Ari, PhD, RN, Rosser Rudolph, MD, PhD, Eugene Rypka, PhD, Jay Magidson, PhD, Izaak Mayzlin, PhD, Richard Bing, Eli Kaplan, PhD, Maurice Bernstein, PhD.

This article has EIGHT parts, as follows:

Part 1

Metabolomics Continues Auspicious Climb

Part 2

Biologists Find ‘Missing Link’ in the Production of Protein Factories in Cells

Part 3

Neuroscience

Part 4

Cancer Research

Part 5

Metabolic Syndrome

Part 6

Biomarkers

Part 7

Epigenetics and Drug Metabolism

Part 8

Pictorial

genome cartoon

iron metabolism

personalized reference range within population range

Part 1. Metabolomics Surge

metagraph _OMICS

Metabolomics Continues Auspicious Climb

Jeffery Herman, Ph.D.
GEN May 1, 2012 (Vol. 32, No. 9)

Aberrant biochemical and metabolite signaling plays an important role in

  • the development and progression of diseased tissue.

This concept has been studied by the science community for decades. However, with relatively

  1. recent advances in analytical technology and bioinformatics as well as
  2. the development of the Human Metabolome Database (HMDB),

metabolomics has become an invaluable field of research.

At the “International Conference and Exhibition on Metabolomics & Systems Biology” held recently in San Francisco, researchers and industry leaders discussed how

  • the underlying cellular biochemical/metabolite fingerprint in response to
  1. a specific disease state,
  2. toxin exposure, or
  3. pharmaceutical compound
  • is useful in clinical diagnosis and biomarker discovery and
  • in understanding disease development and progression.

Developed by BASF, MetaMap® Tox is

  • a database that helps identify in vivo systemic effects of a tested compound, including
  1. targeted organs,
  2. mechanism of action, and
  3. adverse events.

Based on 28-day systemic rat toxicity studies, MetaMap Tox is composed of

  • differential plasma metabolite profiles of rats
  • after exposure to a large variety of chemical toxins and pharmaceutical compounds.

“Using the reference data,

  • we have developed more than 110 patterns of metabolite changes, which are
  • specific and predictive for certain toxicological modes of action,”

said Hennicke Kamp, Ph.D., group leader, department of experimental toxicology and ecology at BASF.

With MetaMap Tox, a potential drug candidate

  • can be compared to a similar reference compound
  • using statistical correlation algorithms,
  • which allow for the creation of a toxicity and mechanism of action profile.

“MetaMap Tox, in the context of early pre-clinical safety enablement in pharmaceutical development,” continued Dr. Kamp,

  • “has been independently validated
  • by an industry consortium (Drug Safety Executive Council) of 12 leading biopharmaceutical companies.”

Dr. Kamp added that this technology may prove invaluable for high-throughput drug candidate screening,

  • allowing for quick and accurate decisions on the safety and efficacy of compounds
  • during early and preclinical toxicological studies;
  • by comparing a lead compound to a variety of molecular derivatives,
  • the rapid identification of the most optimal molecular structure
  • with the best efficacy and safety profiles might be streamlined (a minimal sketch of this kind of profile comparison follows below).
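The profile comparison just described can be pictured with a small, hedged sketch: a candidate compound's metabolite fold-change vector is correlated against stored reference patterns, and the best-matching mode of action is reported. The metabolite names, reference values, and pattern labels below are invented; this is not BASF's actual algorithm or data.

```python
# Minimal sketch (assumptions throughout): Pearson correlation of a candidate's
# metabolite-change profile against hypothetical reference toxicity patterns.
import numpy as np
from scipy.stats import pearsonr

metabolites = ["glucose", "lactate", "alanine", "cholesterol", "urea"]
candidate = np.array([0.8, 1.5, -0.3, -1.2, 0.1])       # log2 fold-changes vs controls
references = {
    "liver_toxicity_pattern":  np.array([0.9, 1.2, -0.5, -1.0, 0.2]),
    "kidney_toxicity_pattern": np.array([-0.4, 0.1, 1.3, 0.2, 1.8]),
}

for name, ref in references.items():
    r, _ = pearsonr(candidate, ref)
    print(f"{name}: r = {r:.2f}")

best = max(references, key=lambda k: pearsonr(candidate, references[k])[0])
print("Closest reference pattern:", best)
```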
Dynamic Construct of the –Omics

Targeted Tandem Mass Spectrometry

Biocrates Life Sciences focuses on targeted metabolomics, an important approach for

  • the accurate quantification of known metabolites within a biological sample.

Originally used for the clinical screening of inherent metabolic disorders from dried blood-spots of newborn children, Biocrates has developed

  • a tandem mass spectrometry (MS/MS) platform, which allows for
  1. the identification,
  2. quantification, and
  3. mapping of more than 800 metabolites to specific cellular pathways.

It is based on flow injection analysis and high-performance liquid chromatography MS/MS.

Clarification of Pathway-Specific Inhibition by Fourier Transform Ion Cyclotron Resonance Mass Spectrometry-Based Metabolic Phenotyping Studies

common drug targets

The MetaDisIDQ® Kit is a

  • “multiparametric” diagnostic assay designed for the “comprehensive assessment of a person’s metabolic state” and
  • the early determination of pathophysiological events with regards to a specific disease.

MetaDisIDQ is designed to quantify

  • a diverse range of 181 metabolites involved in major metabolic pathways
  • from a small amount of human serum (10 µL) using isotopically labeled internal standards.

This kit has been demonstrated to detect changes in metabolites that are commonly associated with the development of

  • metabolic syndrome, type 2 diabetes, and diabetic nephropathy.

Dr. Dallman reports that data generated with the MetaDisIDQ kit correlates strongly with

  • routine chemical analyses of common metabolites, including glucose and creatinine.
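The general principle behind quantification with isotopically labeled internal standards is a simple ratio, sketched below with illustrative numbers (not kit-specific values): the analyte concentration is the analyte-to-internal-standard peak-area ratio multiplied by the known internal-standard concentration, assuming comparable response factors.

```python
# Minimal sketch of stable-isotope-dilution quantification; numbers are illustrative.
def quantify(analyte_area, istd_area, istd_conc_um, response_factor=1.0):
    """Analyte concentration in the same units as the internal standard."""
    return (analyte_area / istd_area) * istd_conc_um / response_factor

# e.g. an analyte signal measured against its 13C-labeled standard spiked at 50 µM
print(quantify(analyte_area=1.8e6, istd_area=4.5e5, istd_conc_um=50.0))  # -> 200.0 µM
```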

Biocrates has also developed the MS/MS-based AbsoluteIDQ® kits, which are

  • an “easy-to-use” biomarker analysis tool for laboratory research.

The kit functions on MS machines from a variety of vendors, and allows for the quantification of 150-180 metabolites.

The SteroIDQ® kit is a high-throughput standardized MS/MS diagnostic assay,

  • validated in human serum, for the rapid and accurate clinical determination of 16 known steroids.

Initially focusing on the analysis of steroid ranges for use in hormone replacement therapy, the SteroIDQ Kit is expected to have a wide clinical application.

Hormone-Resistant Breast Cancer

Scientists at Georgetown University have shown that

  • breast cancer cells can functionally coordinate cell-survival and cell-proliferation mechanisms,
  • while maintaining a certain degree of cellular metabolism.

To grow, cells need energy, and energy is a product of cellular metabolism. For nearly a century, it was thought that

  1. the uncoupling of glycolysis from the mitochondria,
  2. leading to the inefficient but rapid metabolism of glucose and
  3. the formation of lactic acid (the Warburg effect), was

the major and only metabolism driving force for unchecked proliferation and tumorigenesis of cancer cells.

Other aspects of metabolism were often overlooked.

“…we understand now that

  • cellular metabolism is a lot more than just metabolizing glucose,”

said Robert Clarke, Ph.D., professor of oncology and physiology and biophysics at Georgetown University. Dr. Clarke, in collaboration with the Waters Center for Innovation at Georgetown University (led by Albert J. Fornace, Jr., M.D.), obtained

  • the metabolomic profile of hormone-sensitive and -resistant breast cancer cells through the use of UPLC-MS.

They demonstrated that breast cancer cells, through a rather complex and not yet completely understood process,

  1. can functionally coordinate cell-survival and cell-proliferation mechanisms,
  2. while maintaining a certain degree of cellular metabolism.

This is at least partly accomplished through the upregulation of important pro-survival mechanisms; including

  • the unfolded protein response;
  • a regulator of endoplasmic reticulum stress and
  • initiator of autophagy.

Normally, during a stressful situation, a cell may

  • enter a state of quiescence and undergo autophagy,
  • a process by which a cell can recycle organelles
  • in order to maintain enough energy to survive during a stressful situation or,

if the stress is too great,

  • undergo apoptosis.

By integrating cell-survival mechanisms and cellular metabolism

  • advanced ER+ hormone-resistant breast cancer cells
  • can maintain a low level of autophagy
  • to adapt and resist hormone/chemotherapy treatment.

This adaptation allows cells

  • to reallocate important metabolites recovered from organelle degradation and
  • provide enough energy to also promote proliferation.

With further research, we can gain a better understanding of the underlying causes of hormone-resistant breast cancer, with

  • the overall goal of developing effective diagnostic, prognostic, and therapeutic tools.

NMR

Over the last two decades, NMR has established itself as a major tool for metabolomics analysis. It is especially adept at testing biological fluids. [Bruker BioSpin]

Historically, nuclear magnetic resonance spectroscopy (NMR) has been used for structural elucidation of pure molecular compounds. However, in the last two decades, NMR has established itself as a major tool for metabolomics analysis. Since

  • the integral of an NMR signal is directly proportional to
  • the molar concentration throughout the dynamic range of a sample,

“the simultaneous quantification of compounds is possible

  • without the need for specific reference standards or calibration curves,” according to Lea Heintz of Bruker BioSpin.
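Because the NMR signal integral scales with molar concentration (per contributing proton), a single internal reference of known concentration suffices for quantification. A minimal sketch with illustrative values:

```python
# Minimal sketch: concentration from relative NMR integrals, normalised per proton.
def nmr_concentration(analyte_integral, analyte_protons,
                      ref_integral, ref_protons, ref_conc_mm):
    """Analyte concentration (mM) relative to an internal reference."""
    return (analyte_integral / analyte_protons) / (ref_integral / ref_protons) * ref_conc_mm

# e.g. a lactate CH3 doublet (3 protons) against a TSP reference (9 protons) at 1.0 mM
print(nmr_concentration(6.0, 3, 9.0, 9, 1.0))  # -> 2.0 mM
```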

NMR is adept at testing biological fluids because of

  1.  high reproducibility,
  2. standardized protocols,
  3. low sample manipulation, and
  4. the production of a large subset of data.

Bruker BioSpin is presently involved in a project for the screening of inborn errors of metabolism in newborn children from Turkey, based on their urine NMR profiles. More than 20 clinics are participating in the project, which is coordinated by INFAI, a specialist in the transfer of advanced analytical technology into medical diagnostics. Statistical models are being developed

  • for the detection of deviations from normality, as well as
  • automatic quantification methods for indicative metabolites

Bruker BioSpin recently installed high-resolution magic angle spinning NMR (HRMAS-NMR) systems that can rapidly analyze tissue biopsies. The main objective for HRMAS-NMR is to establish a rapid and effective clinical method to assess tumor grade and other important aspects of cancer during surgery.

Combined NMR and Mass Spec

There is increasing interest in combining NMR and MS, two of the main analytical assays in metabolomic research, as a means

  • to improve data sensitivity and to
  • fully elucidate the complex metabolome within a given biological sample.
  • to realize the potential for cancer biomarker discovery in the realms of diagnosis, prognosis, and treatment.

Using combined NMR and MS to measure the levels of nearly 250 separate metabolites in the patient’s blood, Dr. Weljie and other researchers at the University of Calgary were able to rapidly determine the malignancy of a  pancreatic lesion (in 10–15% of the cases, it is difficult to discern between benign and malignant), while avoiding unnecessary surgery in patients with benign lesions.

When performing NMR and MS on a single biological fluid, ultimately “we are,” noted Dr. Weljie,

  1. “splitting up information content, processing, and introducing a lot of background noise and error and
  2. then trying to reintegrate the data…
    It’s like taking a complex item, with multiple pieces, out of an IKEA box and trying to repackage it perfectly into another box.”

By improving the workflow between the initial splitting of the sample, they improved endpoint data integration, proving that

  • a streamlined approach to combined NMR/MS can be achieved,
  • leading to a very strong, robust and precise metabolomics toolset.

Metabolomics Research Picks Up Speed

Field Advances in Quest to Improve Disease Diagnosis and Predict Drug Response

John Morrow Jr., Ph.D.
GEN May 1, 2011 (Vol. 31, No. 9)

As an important discipline within systems biology, metabolomics is being explored by a number of laboratories for

  • its potential in pharmaceutical development.

Studying metabolites can offer insights into the relationships between genotype and phenotype, as well as between genotype and environment. In addition, there is plenty to work with—there are estimated to be some 2,900 detectable metabolites in the human body, of which

  1. 309 have been identified in cerebrospinal fluid,
  2. 1,122 in serum,
  3. 458 in urine, and
  4. roughly 300 in other compartments.

Guowang Xu, Ph.D., a researcher at the Dalian Institute of Chemical Physics, is investigating the causes of death in China,

  • and how they have been changing over the years as the country has become a more industrialized nation.
  • The incidence of metabolic disorders such as diabetes, for example, has grown to affect 9.7% of the Chinese population.

Dr. Xu,  collaborating with Rainer Lehman, Ph.D., of the University of Tübingen, Germany, compared urinary metabolites in samples from healthy individuals with samples taken from prediabetic, insulin-resistant subjects. Using mass spectrometry coupled with electrospray ionization in the positive mode, they observed striking dissimilarities in levels of various metabolites in the two groups.

“When we performed a comprehensive two-dimensional gas chromatography, time-of-flight mass spectrometry analysis of our samples, we observed several metabolites, including

  • 2-hydroxybutyric acid in plasma,
  •  as potential diabetes biomarkers,” Dr. Xu explains.

In other, unrelated studies, Dr. Xu and the German researchers used a metabolomics approach to investigate the changes in plasma metabolite profiles immediately after exercise and following a 3-hour and 24-hour period of recovery. They found that

  • medium-chain acylcarnitines were the most distinctive exercise biomarkers, and
  • they are released as intermediates of partial beta oxidation in human myotubes and mouse muscle tissue.

Dr. Xu says. “The traditional approach of assessment based on a singular biomarker is being superseded by the introduction of multiple marker profiles.”

Typical of the studies under way by Dr. Kaddurah-Daouk and her colleagues at Duke University

  • is a recently published investigation highlighting the role of an SNP variant in
  • the glycine dehydrogenase gene on individual response to antidepressants.
  • patients who do not respond to the selective serotonin reuptake inhibitors citalopram and escitalopram
  • carried a particular single nucleotide polymorphism in the GD gene.

“These results allow us to pinpoint a possible

  • role for glycine in selective serotonin reuptake inhibitor response and
  • illustrate the use of pharmacometabolomics to inform pharmacogenomics.”

These discoveries give us the tools for prognostics and diagnostics so that

  • we can predict what conditions will respond to treatment.

“This approach to defining health or disease in terms of metabolic states opens a whole new paradigm.

By screening hundreds of thousands of molecules, we can understand

  • the relationship between human genetic variability and the metabolome.”

Dr. Kaddurah-Daouk talks about statins as a current

  • model of metabolomics investigations.

It is now known that the statins  have widespread effects, altering a range of metabolites. To sort out these changes and develop recommendations for which individuals should be receiving statins will require substantial investments of energy and resources into defining the complex web of biochemical changes that these drugs initiate.
Furthermore, Dr. Kaddurah-Daouk asserts that,

  • “genetics only encodes part of the phenotypic response.

One needs to take into account the

  • net environment contribution in order to determine
  • how both factors guide the changes in our metabolic state that determine the phenotype.”

Interactive Metabolomics

Researchers at the University of Nottingham use diffusion-edited nuclear magnetic resonance spectroscopy to assess the effects of a biological matrix on metabolites. Diffusion-edited NMR experiments provide a way to

  • separate the different compounds in a mixture
  • based on the differing translational diffusion coefficients (which reflect the size and shape of the molecule).

The measurements are carried out by observing

  • the attenuation of the NMR signals during a pulsed field gradient experiment.
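The attenuation measured in such pulsed field gradient experiments is commonly described by the Stejskal–Tanner relation S(b) = S0·exp(−b·D), so the translational diffusion coefficient D can be estimated by a log-linear fit. The sketch below uses illustrative data, not measurements from this work.

```python
# Minimal sketch: estimate a diffusion coefficient from PFG signal attenuation
# via a log-linear fit of S(b) = S0 * exp(-b * D). Data are illustrative.
import numpy as np

b = np.array([0.0, 0.2e9, 0.4e9, 0.8e9, 1.2e9])   # s/m^2, set by the gradient scheme
S = np.array([1.00, 0.84, 0.70, 0.49, 0.35])      # normalised signal intensities

slope, intercept = np.polyfit(b, np.log(S), 1)
D = -slope                                        # m^2/s
print(f"Estimated diffusion coefficient: {D:.2e} m^2/s")
```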

Clare Daykin, Ph.D., is a lecturer at the University of Nottingham, U.K. Her field of investigation encompasses “interactive metabolomics,” which she defines as

“the study of the interactions between low molecular weight biochemicals and macromolecules in biological samples ..

  • without preselection of the components of interest.

“Blood plasma is a heterogeneous mixture of molecules that

  1. undergo a variety of interactions including metal complexation,
  2. chemical exchange processes,
  3. micellar compartmentation,
  4. enzyme-mediated biotransformations, and
  5. small molecule–macromolecular binding.”

Many low molecular weight compounds can exist

  • freely in solution,
  • bound to proteins, or
  • within organized aggregates such as lipoprotein complexes.

Therefore, quantitative comparison of plasma composition from

  • diseased individuals compared to matched controls provides an incomplete insight to plasma metabolism.

“It is not simply the concentrations of metabolites that must be investigated,

  • but their interactions with the proteins and lipoproteins within this complex web.

Rather than targeting specific metabolites of interest, Dr. Daykin’s metabolite–protein binding studies aim to study

  • the interactions of all detectable metabolites within the macromolecular sample.

Such activities can be studied through the use of diffusion-edited nuclear magnetic resonance (NMR) spectroscopy, in which one can assess

  • the effects of the biological matrix on the metabolites.

“This can lead to a more relevant and exact interpretation

  • for systems where metabolite–macromolecule interactions occur.”


Pushing the Limits

It is widely recognized that many drug candidates fail during development due to ancillary toxicity. Uwe Sauer, Ph.D., professor, and Nicola Zamboni, Ph.D., researcher, both at the Eidgenössische Technische Hochschule, Zürich (ETH Zürich), are applying

  • high-throughput intracellular metabolomics to understand
  • the basis of these unfortunate events and
  • head them off early in the course of drug discovery.

“Since metabolism is at the core of drug toxicity, we developed a platform for

  • measurement of 50–100 targeted metabolites by
  • a high-throughput system consisting of flow injection
  • coupled to tandem mass spectrometry.”

Using this approach, Dr. Sauer’s team focused on

  • the central metabolism of the yeast Saccharomyces cerevisiae, reasoning that
  • this core network would be most susceptible to potential drug toxicity.

Screening approximately 41 drugs that were administered at seven concentrations over three orders of magnitude, they observed changes in metabolome patterns at much lower drug concentrations without attendant physiological toxicity.

The group carried out statistical modeling of about

  • 60 metabolite profiles for each drug they evaluated.

This data allowed the construction of a “profile effect map” in which

  • the influence of each drug on metabolite levels can be followed, including off-target effects, which
  • provide an indirect measure of the possible side effects of the various drugs.
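Conceptually, a "profile effect map" of this kind can be pictured as a drugs-by-metabolites matrix of z-scores relative to vehicle controls, with off-target effects showing up as large entries outside the expected pathway. The sketch below uses invented numbers purely to show the bookkeeping.

```python
# Minimal sketch (hypothetical data): drugs x metabolites matrix of z-scores
# relative to vehicle-treated controls.
import numpy as np
import pandas as pd

metabolites = ["ATP", "glutamate", "citrate", "NADH"]
control_mean = np.array([10.0, 5.0, 2.0, 1.0])     # mean levels in controls
control_sd   = np.array([1.0, 0.5, 0.2, 0.1])

treated = pd.DataFrame(
    {"drug_A": [7.5, 5.1, 1.4, 0.6],
     "drug_B": [9.8, 6.9, 2.1, 1.1]},
    index=metabolites).T                           # rows = drugs, columns = metabolites

effect_map = (treated - control_mean) / control_sd # z-score per metabolite
print(effect_map.round(1))
```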

Dr. Sauer says, “We have found that this approach is

  • at least 100 times as fast as other omics screening platforms.”

“Some drugs, including many anticancer agents,

  • disrupt metabolism long before affecting growth.”
killing cancer cells

Furthermore, they used the principle of 13C-based flux analysis, in which

  • metabolites labeled with 13C are used to follow the utilization of metabolic pathways in the cell.
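As a brief aside on the 13C approach: the basic quantity is the fractional 13C enrichment of a metabolite, computed from its mass isotopomer distribution. A minimal sketch with illustrative numbers:

```python
# Minimal sketch: fractional 13C enrichment from a mass isotopomer distribution
# (fractions of M+0 ... M+n). Numbers are illustrative.
def fractional_enrichment(mid):
    """mid: fractions for M+0, M+1, ..., M+n (should sum to ~1)."""
    n = len(mid) - 1
    return sum(i * f for i, f in enumerate(mid)) / n

# e.g. a three-carbon metabolite with MID [M+0, M+1, M+2, M+3]
print(fractional_enrichment([0.40, 0.30, 0.20, 0.10]))  # ~0.33
```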

These 13C-determined intracellular responses of metabolic fluxes to drug treatment demonstrate

  • the functional performance of the network to be rather robust,

conformational changes leading to substrate efflux

leading Dr. Sauer to the conclusion that

  • the phenotypic vigor he observes to drug challenges
  • is achieved by a flexible make up of the metabolome.

Dr. Sauer is confident that it will be possible to expand the scope of these investigations to hundreds of thousands of samples per study. This will allow answers to the questions of

  • how cells establish a stable functioning network in the face of inevitable concentration fluctuations.

Is Now the Hour?

There is great enthusiasm and agitation within the biotech community for

  • metabolomics approaches as a means of reversing the dismal record of drug discovery

that has accumulated in the last decade.

While the concept clearly makes sense and is being widely applied today, there are many reasons why drugs fail in development, and metabolomics will not be a panacea for resolving all of these questions. It is too early at this point to recognize a trend or a track record, and it will take some time to see how this approach can aid in drug discovery and shorten the timeline for the introduction of new pharmaceutical agents.

Degree of binding correlated with function

Diagram of a two-photon excitation microscope

Part 2.  Biologists Find ‘Missing Link’ in the Production of Protein Factories in Cells

Biologists at UC San Diego have found

  • the “missing link” in the chemical system that
  • enables animal cells to produce ribosomes

—the thousands of protein “factories” contained within each cell that

  • manufacture all of the proteins needed to build tissue and sustain life.
‘Missing Link’

Their discovery, detailed in the June 23 issue of the journal Genes & Development, will not only force

  • a revision of basic textbooks on molecular biology, but also
  • provide scientists with a better understanding of
  • how to limit uncontrolled cell growth, such as cancer,
  • that might be regulated by controlling the output of ribosomes.

Ribosomes are responsible for the production of the wide variety of proteins that include

  1. enzymes;
  2. structural molecules, such as hair,
  3. skin and bones;
  4. hormones like insulin; and
  5. components of our immune system such as antibodies.

Regarded as life’s most important molecular machine, ribosomes have been intensively studied by scientists (the 2009 Nobel Prize in Chemistry, for example, was awarded for studies of its structure and function). But until now researchers had not uncovered all of the details of how the proteins that are used to construct ribosomes are themselves produced.

In multicellular animals such as humans,

  • ribosomes are made up of about 80 different proteins
    (humans have 79 while some other animals have a slightly different number) as well as
  • four different kinds of RNA molecules.

In 1969, scientists discovered that

  • the synthesis of the ribosomal RNAs is carried out by specialized systems using two key enzymes:
  • RNA polymerase I and RNA polymerase III.

But until now, scientists were unsure if a complementary system was also responsible for

  • the production of the 80 proteins that make up the ribosome.

That’s essentially what the UC San Diego researchers headed by Jim Kadonaga, a professor of biology, set out to examine. What they found was the missing link—the specialized

  • system that allows ribosomal proteins themselves to be synthesized by the cell.

Kadonaga says that he and coworkers found that ribosomal proteins are synthesized via

  • a novel regulatory system with the enzyme RNA polymerase II and
  • a factor termed TRF2.

“For the production of most proteins,

  1. RNA polymerase II functions with
  2. a factor termed TBP,
  3. but for the synthesis of ribosomal proteins, it uses TRF2.”
  •  this specialized TRF2-based system for ribosome biogenesis
  • provides a new avenue for the study of ribosomes and
  • its control of cell growth, and

“it should lead to a better understanding and potential treatment of diseases such as cancer.”

Coordination of the transcriptome and metabolome

the potential advantages conferred by distal-site protein synthesis

Other authors of the paper were UC San Diego biologists Yuan-Liang Wang, Sascha Duttke and George Kassavetis, and Kai Chen, Jeff Johnston, and Julia Zeitlinger of the Stowers Institute for Medical Research in Kansas City, Missouri. Their research was supported by two grants from the National Institutes of Health (1DP2OD004561-01 and R01 GM041249).

Turning Off a Powerful Cancer Protein

Scientists have discovered how to shut down a master regulatory transcription factor that is

  • key to the survival of a majority of aggressive lymphomas,
  • which arise from the B cells of the immune system.

The protein, Bcl6, has long been considered too complex to target with a drug since it is also crucial

  • to the healthy functioning of many immune cells in the body, not just B cells gone bad.

The researchers at Weill Cornell Medical College report that it is possible

  • to shut down Bcl6 in diffuse large B-cell lymphoma (DLBCL)
  • while not affecting its vital function in T cells and macrophages
  • that are needed to support a healthy immune system.

If Bcl6 is completely inhibited, patients might suffer from systemic inflammation and atherosclerosis. The team conducted this new study to help clarify possible risks, as well as to understand

  • how Bcl6 controls the various aspects of the immune system.

The findings in this study were inspired from

  • preclinical testing of two Bcl6-targeting agents that Dr. Melnick and his Weill Cornell colleagues have developed
  • to treat DLBCLs.

These experimental drugs are

  • RI-BPI, a peptide mimic, and
  • the small molecule agent 79-6.

“This means the drugs we have developed against Bcl6 are more likely to be

  • significantly less toxic and safer for patients with this cancer than we realized,”

says Ari Melnick, M.D., professor of hematology/oncology and a hematologist-oncologist at NewYork-Presbyterian Hospital/Weill Cornell Medical Center.

Dr. Melnick says the discovery that

  • a master regulatory transcription factor can be targeted
  • offers implications beyond just treating DLBCL.

Recent studies from Dr. Melnick and others have revealed that

  • Bcl6 plays a key role in the most aggressive forms of acute leukemia, as well as certain solid tumors.

Bcl6 can control the type of immune cell that develops in the bone marrow—playing many roles

  • in the development of B cells, T cells, macrophages, and other cells—including a primary and essential role in
  • enabling B-cells to generate specific antibodies against pathogens.

According to Dr. Melnick, “When cells lose control of Bcl6,

  • lymphomas develop in the immune system.

Lymphomas are ‘addicted’ to Bcl6, and therefore

  • Bcl6 inhibitors powerfully and quickly destroy lymphoma cells.”

The big surprise in the current study is that rather than functioning as a single molecular machine,

  • Bcl6 functions like a Swiss Army knife,
  • using different tools to control different cell types.

This multifunction paradigm could represent a general model for the functioning of other master regulatory transcription factors.

“In this analogy, the Swiss Army knife, or transcription factor, keeps most of its tools folded,

  • opening only the one it needs in any given cell type,”

He makes the following analogy:

  • “For B cells, it might open and use the knife tool;
  • for T cells, the cork screw;
  • for macrophages, the scissors.”

“This means that you only need to prevent the master regulator from using certain tools to treat cancer. You don’t need to eliminate the whole knife,” says Dr. Melnick. “In fact, we show that taking out the whole knife is harmful since

  • the transcription factor has many other vital functions that other cells in the body need.”

Prior to these study results, it was not known that a master regulator could separate its functions so precisely. Researchers hope this will be a major benefit to the treatment of DLBCL and perhaps other disorders that are influenced by Bcl6 and other master regulatory transcription factors.

The study is published in the journal Nature Immunology, in a paper titled “Lineage-specific functions of Bcl-6 in immunity and inflammation are mediated by distinct biochemical mechanisms”.

Part 3. Neuroscience

Vesicles influence function of nerve cells 
Oct, 06 2014        source: http://feeds.sciencedaily.com

Neurons (blue) which have absorbed exosomes (green) have increased levels of the enzyme catalase (red), which helps protect them against peroxides.

Tiny vesicles containing protective substances

  • which they transmit to nerve cells apparently
  • play an important role in the functioning of neurons.

As cell biologists at Johannes Gutenberg University Mainz (JGU) have discovered,

  • nerve cells can enlist the aid of mini-vesicles of neighboring glial cells
  • to defend themselves against stress and other potentially detrimental factors.

These vesicles, called exosomes, appear to stimulate the neurons on various levels:

  • they influence electrical stimulus conduction,
  • biochemical signal transfer, and
  • gene regulation.

Exosomes are thus multifunctional signal emitters

  • that can have a significant effect in the brain.
Exosome

The researchers in Mainz already observed in a previous study that

  • oligodendrocytes release exosomes on exposure to neuronal stimuli.
  • these are absorbed by the neurons and improve neuronal stress tolerance.

Oligodendrocytes, a type of glial cell, form an

  • insulating myelin sheath around the axons of neurons.

The exosomes transport protective proteins such as

  • heat shock proteins,
  • glycolytic enzymes, and
  • enzymes that reduce oxidative stress from one cell type to another,
  • but also transmit genetic information in the form of ribonucleic acids.

“As we have now discovered in cell cultures, exosomes seem to have a whole range of functions,” explained Dr. Eva-Maria Krämer-Albers. By means of their transmission activity, the small bubbles that are the vesicles

  • not only promote electrical activity in the nerve cells, but also
  • influence them on the biochemical and gene regulatory level.

“The extent of activities of the exosomes is impressive,” added Krämer-Albers. The researchers hope that the understanding of these processes will contribute to the development of new strategies for the treatment of neuronal diseases. Their next aim is to uncover how vesicles actually function in the brains of living organisms.

http://labroots.com/user/news/article/id/217438/title/vesicles-influence-function-of-nerve-cells

The above story is based on materials provided by Universität Mainz.

Universität Mainz. “Vesicles influence function of nerve cells.” ScienceDaily. ScienceDaily, 6 October 2014. www.sciencedaily.com/releases/2014/10/141006174214.htm

Neuroscientists use snail research to help explain “chemo brain”

10/08/2014
It is estimated that as many as half of patients taking cancer drugs experience a decrease in mental sharpness. While there have been many theories, what causes “chemo brain” has eluded scientists.

In an effort to solve this mystery, neuroscientists at The University of Texas Health Science Center at Houston (UTHealth) conducted an experiment in an animal memory model and their results point to a possible explanation. Findings appeared in The Journal of Neuroscience.

In the study involving a sea snail that shares many of the same memory mechanisms as humans and a drug used to treat a variety of cancers, the scientists identified

  • memory mechanisms blocked by the drug.

Then, they were able to counteract or

  • unblock the mechanisms by administering another agent.

“Our research has implications in the care of people given to cognitive deficits following drug treatment for cancer,” said John H. “Jack” Byrne, Ph.D., senior author, holder of the June and Virgil Waggoner Chair and Chairman of the Department of Neurobiology and Anatomy at the UTHealth Medical School. “There is no satisfactory treatment at this time.”

Byrne’s laboratory is known for its use of a large snail called Aplysia californica to further the understanding of the biochemical signaling among nerve cells (neurons).  The snails have large neurons that relay information much like those in humans.

When Byrne’s team compared cell cultures taken from normal snails to

  • those administered a dose of a cancer drug called doxorubicin,

the investigators pinpointed a neuronal pathway

  • that was no longer passing along information properly.

With the aid of an experimental drug,

  • the scientists were able to reopen the pathway.

Unfortunately, this drug would not be appropriate for humans, Byrne said. “We want to identify other drugs that can rescue these memory mechanisms,” he added.

According the American Cancer Society, some of the distressing mental changes cancer patients experience may last a short time or go on for years.

Byrne’s UT Health research team includes co-lead authors Rong-Yu Liu, Ph.D., and Yili Zhang, Ph.D., as well as Brittany Coughlin and Leonard J. Cleary, Ph.D. All are affiliated with the W.M. Keck Center for the Neurobiology of Learning and Memory.

Byrne and Cleary also are on the faculty of The University of Texas Graduate School of Biomedical Sciences at Houston. Coughlin is a student at the school, which is jointly operated by UT Health and The University of Texas MD Anderson Cancer Center.

The study titled “Doxorubicin Attenuates Serotonin-Induced Long-Term Synaptic Facilitation by Phosphorylation of p38 Mitogen-Activated Protein Kinase” received support from National Institutes of Health grant (NS019895) and the Zilkha Family Discovery Fellowship.

Doxorubicin Attenuates Serotonin-Induced Long-Term Synaptic Facilitation by Phosphorylation of p38 Mitogen-Activated Protein Kinase

Source: Univ. of Texas Health Science Center at Houston

http://www.rdmag.com/news/2014/10/neuroscientists-use-snail-research-help-explain-E2_9_Cchemo-brain

Doxorubicin Attenuates Serotonin-Induced Long-Term Synaptic Facilitation by Phosphorylation of p38 Mitogen-Activated Protein Kinase

Rong-Yu Liu*, Yili Zhang*, Brittany L. Coughlin, Leonard J. Cleary, and John H. Byrne
The Journal of Neuroscience, 1 Oct 2014, 34(40): 13289-13300;
http://dx.doi.org:/10.1523/JNEUROSCI.0538-14.2014

Doxorubicin (DOX) is an anthracycline used widely for cancer chemotherapy. Its primary mode of action appears to be

  • topoisomerase II inhibition, DNA cleavage, and free radical generation.

However, in non-neuronal cells, DOX also inhibits the expression of

  • dual-specificity phosphatases (also referred to as MAPK phosphatases) and thereby
  1. inhibits the dephosphorylation of extracellular signal-regulated kinase (ERK) and
  2. p38 mitogen-activated protein kinase (p38 MAPK),
  3. two MAPK isoforms important for long-term memory (LTM) formation.

Activation of these kinases by DOX in neurons, if present,

  • could have secondary effects on cognitive functions, such as learning and memory.

The present study used cultures of rat cortical neurons and sensory neurons (SNs) of Aplysia

  • to examine the effects of DOX on levels of phosphorylated ERK (pERK) and
  • phosphorylated p38 (p-p38) MAPK.

In addition, Aplysia neurons were used to examine the effects of DOX on

  • long-term enhanced excitability, long-term synaptic facilitation (LTF), and
  • long-term synaptic depression (LTD).

DOX treatment led to elevated levels of

  • pERK and p-p38 MAPK in SNs and cortical neurons.

In addition, it increased phosphorylation of

  • the downstream transcriptional repressor cAMP response element-binding protein 2 in SNs.

DOX treatment blocked serotonin-induced LTF and enhanced LTD induced by the neuropeptide Phe-Met-Arg-Phe-NH2. The block of LTF appeared to be attributable to

  • overriding inhibitory effects of p-p38 MAPK, because
  • LTF was rescued in the presence of an inhibitor of p38 MAPK
    (SB203580 [4-(4-fluorophenyl)-2-(4-methylsulfinylphenyl)-5-(4-pyridyl)-1H-imidazole]).

These results suggest that acute application of DOX might impair the formation of LTM via the p38 MAPK pathway.
Terms: Aplysia, chemotherapy, ERK, p38 MAPK, serotonin, synaptic plasticity

Technology that controls brain cells with radio waves earns early BRAIN grant

10/08/2014

BRAIN control: The new technology uses radio waves to activate or silence cells remotely. The bright spots above represent cells with increased calcium after treatment with radio waves, a change that would allow neurons to fire.

A proposal to develop a new way to

  • remotely control brain cells

from Sarah Stanley, a research associate in Rockefeller University’s Laboratory of Molecular Genetics, headed by Jeffrey M. Friedman, is

  • among the first to receive funding from U.S. President Barack Obama’s BRAIN initiative.

The project will make use of a technique called

  • radiogenetics that combines the use of radio waves or magnetic fields with
  • nanoparticles to turn neurons on or off.

The National Institutes of Health is one of four federal agencies involved in the BRAIN (Brain Research through Advancing Innovative Neurotechnologies) initiative. Following in the ambitious footsteps of the Human Genome Project, the BRAIN initiative seeks

  • to create a dynamic map of the brain in action,

a goal that requires the development of new technologies. The BRAIN initiative working group, which outlined the broad scope of the ambitious project, was co-chaired by Rockefeller’s Cori Bargmann, head of the Laboratory of Neural Circuits and Behavior.

Stanley’s grant, for $1.26 million over three years, is one of 58 projects to get BRAIN grants, the NIH announced. The NIH’s plan for its part of this national project, which has been pitched as “America’s next moonshot,” calls for $4.5 billion in federal funds over 12 years.

The technology Stanley is developing would

  • enable researchers to manipulate the activity of neurons, as well as other cell types,
  • in freely moving animals in order to better understand what these cells do.

Other techniques for controlling selected groups of neurons exist, but her new nanoparticle-based technique has a

  • unique combination of features that may enable new types of experimentation.
  • It would allow researchers to rapidly activate or silence neurons within a small area of the brain or
  • dispersed across a larger region, including those in difficult-to-access locations.

Stanley also plans to explore the potential this method has for use treating patients.

“Francis Collins, director of the NIH, has discussed

  • the need for studying the circuitry of the brain,
  • which is formed by interconnected neurons.

Our remote-control technology may provide a tool with which researchers can ask new questions about the roles of complex circuits in regulating behavior,” Stanley says.
Rockefeller University’s Laboratory of Molecular Genetics
Source: Rockefeller Univ.

Part 4.  Cancer

Two Proteins Found to Block Cancer Metastasis

Why do some cancers spread while others don’t? Scientists have now demonstrated that

  • metastatic incompetent cancers actually “poison the soil”
  • by generating a micro-environment that blocks cancer cells
  • from settling and growing in distant organs.

The “seed and the soil” hypothesis proposed by Stephen Paget in 1889 is now widely accepted to explain how

  • cancer cells (seeds) are able to generate fertile soil (the micro-environment)
  • in distant organs that promotes cancer’s spread.

However, this concept had not explained why some tumors do not spread or metastasize.

The researchers, from Weill Cornell Medical College, found that

  • two key proteins involved in this process work by
  • dramatically suppressing cancer’s spread.

The study offers hope that a drug based on these

  • potentially therapeutic proteins, prosaposin and Thrombospondin 1 (Tsp-1),

might help keep human cancer at bay and prevent it from metastasizing.

Scientists don’t understand why some tumors wouldn’t “want” to spread. It goes against their “job description,” says the study’s senior investigator, Vivek Mittal, Ph.D., an associate professor of cell and developmental biology in cardiothoracic surgery and director of the Neuberger Berman Foundation Lung Cancer Laboratory at Weill Cornell Medical College. He theorizes that metastasis occurs when

  • the barriers that the body throws up to protect itself against cancer fail.

But there are some tumors in which some of the barriers may still be intact. “So that suggests

  • those primary tumors will continue to grow, but that
  • an innate protective barrier still exists that prevents them from spreading and invading other organs,”

The researchers found that, like typical tumors,

  • metastasis-incompetent tumors also send out signaling molecules
  • that establish what is known as the “premetastatic niche” in distant organs.

These niches, composed of bone marrow cells and various growth factors, have been described previously by others, including Dr. Mittal, as the fertile “soil” that the disseminated cancer cell “seeds” grow in.

Weill Cornell’s Raúl Catena, Ph.D., a postdoctoral fellow in Dr. Mittal’s laboratory, found an important difference between the tumor types. Metastatic-incompetent tumors

  • systemically increased expression of Tsp-1, a molecule known to fight cancer growth.
  • Increased Tsp-1 production was found specifically in the bone marrow myeloid cells
  • that comprise the metastatic niche.

These results were striking because, Dr. Mittal says, for the first time

  • the bone marrow-derived myeloid cells were implicated as
  • the main producers of Tsp-1.

In addition, Weill Cornell and Harvard researchers found that

  • prosaposin secreted predominantly by the metastatic-incompetent tumors
  • increased expression of Tsp-1 in the premetastatic lungs.

Thus, Dr. Mittal posits that prosaposin works in combination with Tsp-1

  • to convert pro-metastatic bone marrow myeloid cells in the niche
  • into cells that are not hospitable to cancer cells that spread from a primary tumor.
  • “The very same myeloid cells in the niche that we know can promote metastasis
  • can also be induced under the command of the metastatic incompetent primary tumor to inhibit metastasis,”

The research team found that

  • the Tsp-1–inducing activity of prosaposin
  • was contained in only a 5-amino acid peptide region of the protein, and
  • this peptide alone induced Tsp-1 in the bone marrow cells and
  • effectively suppressed metastatic spread in the lungs
  • in mouse models of breast and prostate cancer.

This 5-amino acid peptide with Tsp-1–inducing activity

  • has the potential to be used as a therapeutic agent against metastatic cancer,

The scientists have begun to test prosaposin in other tumor types or metastatic sites.

Dr. Mittal says that “The clinical implications of the study are:

  • Not only is it theoretically possible to design a prosaposin-based drug or drugs
  • that induce Tsp-1 to block cancer spread, but
  • you could potentially create noninvasive prognostic tests
  • to predict whether a cancer will metastasize.”

The study was reported in the April 30 issue of Cancer Discovery, in a paper titled “Bone Marrow-Derived Gr1+ Cells Can Generate a Metastasis-Resistant Microenvironment Via Induced Secretion of Thrombospondin-1”.

Disabling Enzyme Cripples Tumors, Cancer Cells

First Step of Metastasis

Published: Sep 05, 2013  http://www.technologynetworks.com/Metabolomics/news.aspx?id=157138

Knocking out a single enzyme dramatically cripples the ability of aggressive cancer cells to spread and grow tumors.

The paper, published in the journal Proceedings of the National Academy of Sciences, sheds new light on the importance of lipids, a group of molecules that includes fatty acids and cholesterol, in the development of cancer.

Researchers have long known that cancer cells metabolize lipids differently than normal cells. Levels of ether lipids – a class of lipids that are harder to break down – are particularly elevated in highly malignant tumors.

“Cancer cells make and use a lot of fat and lipids, and that makes sense because cancer cells divide and proliferate at an accelerated rate, and to do that,

  • they need lipids, which make up the membranes of the cell,”

said study principal investigator Daniel Nomura, assistant professor in UC Berkeley’s Department of Nutritional Sciences and Toxicology. “Lipids have a variety of uses for cellular structure, but what we’re showing with our study is that

  • lipids can send signals that fuel cancer growth.”

In the study, Nomura and his team tested the effects of reducing ether lipids on human skin cancer cells and primary breast tumors. They targeted an enzyme,

  • alkylglycerone phosphate synthase, or AGPS,
  • known to be critical to the formation of ether lipids.

The researchers confirmed that

  1. AGPS expression increased when normal cells turned cancerous.
  2. Inactivating AGPS substantially reduced the aggressiveness of the cancer cells.

“The cancer cells were less able to move and invade,” said Nomura.

The researchers also compared the impact of

  • disabling the AGPS enzyme in mice that had been injected with cancer cells.

Nomura observes, “Among the mice that had the AGPS enzyme inactivated,

  • the tumors were nonexistent,”

“The mice that did not have this enzyme

  • disabled rapidly developed tumors.”

The researchers determined that

  • inhibiting AGPS expression depleted the cancer cells of ether lipids.
  • inhibiting AGPS also altered levels of other types of lipids important to the ability of the cancer cells to survive and spread, including
    • prostaglandins and acyl phospholipids.

“What makes AGPS stand out as a treatment target is that the enzyme seems to simultaneously

  • regulate multiple aspects of lipid metabolism
  • important for tumor growth and malignancy.”

Future steps include the

  • development of AGPS inhibitors for use in cancer therapy.

“This study sheds considerable light on the important role that AGPS plays in ether lipid metabolism in cancer cells, and it suggests that

  • inhibitors of this enzyme could impair tumor formation,”

said Benjamin Cravatt, Professor and Chair of Chemical Physiology at The Scripps Research Institute, who was not involved in the UC Berkeley study.

Agilent Technologies Thought Leader Award Supports Translational Research Program
Published: Mon, March 04, 2013

The award will support Dr DePinho’s research into

  • metabolic reprogramming in the earliest stages of cancer.

Agilent Technologies Inc. announces that Dr. Ronald A. DePinho, a world-renowned oncologist and researcher, has received an Agilent Thought Leader Award.

DePinho is president of the University of Texas MD Anderson Cancer Center. DePinho and his team hope to discover and characterize

  • alterations in metabolic flux during tumor initiation and maintenance, and to identify biomarkers for early detection of pancreatic cancer together with
  • novel therapeutic targets.

Researchers on his team will work with scientists from the university’s newly formed Institute of Applied Cancer Sciences.

The Agilent Thought Leader Award provides funds to support personnel as well as a state-of-the-art Agilent 6550 iFunnel Q-TOF LC/MS system.

“I am extremely pleased to receive this award for metabolomics research, as the survival rates for pancreatic cancer have not significantly improved over the past 20 years,” DePinho said. “This technology will allow us to

  • rapidly identify new targets that drive the formation, progression and maintenance of pancreatic cancer.

Discoveries from this research will also lead to

  • the development of effective early detection biomarkers and novel therapeutic interventions.”

“We are proud to support Dr. DePinho’s exciting translational research program, which will make use of

  • metabolomics and integrated biology workflows and solutions in biomarker discovery,”

said Patrick Kaltenbach, Agilent vice president, general manager of the Liquid Phase Division, and the executive sponsor of this award.

The Agilent Thought Leader Program promotes fundamental scientific advances by support of influential thought leaders in the life sciences and chemical analysis fields.

The covalent modifier Nedd8 is critical for the activation of Smurf1 ubiquitin ligase in tumorigenesis

Ping Xie, Minghua Zhang, Shan He, Kefeng Lu, Yuhan Chen, Guichun Xing, et al.
Nature Communications
  2014; 5(3733).  http://dx.doi.org:/10.1038/ncomms4733

Neddylation, the covalent attachment of the ubiquitin-like protein Nedd8 to members of the Cullin-RING E3 ligase family,

  • regulates their ubiquitylation activity.

However, regulation of HECT ligases by neddylation has not been reported to date. Here we show that

  • the C2-WW-HECT ligase Smurf1 is activated by neddylation.

Smurf1 physically interacts with

  1. Nedd8 and Ubc12,
  2. forms a Nedd8-thioester intermediate, and then
  3. catalyses its own neddylation on multiple lysine residues.

Intriguingly, this autoneddylation needs

  • an active site at C426 in the HECT N-lobe.

Neddylation of Smurf1 potently enhances

  • ubiquitin E2 recruitment and
  • augments the ubiquitin ligase activity of Smurf1.

The regulatory role of neddylation

  • is conserved in human Smurf1 and yeast Rsp5.

Furthermore, in human colorectal cancers,

  • the elevated expression of Smurf1, Nedd8, NAE1 and Ubc12
  • correlates with cancer progression and poor prognosis.

These findings provide evidence that

  • neddylation is important in HECT ubiquitin ligase activation and
  • shed new light on the tumour-promoting role of Smurf1.
Swinging domains in HECT E3

Subject terms: Biological sciences, Cancer, Cell biology

Figure 1: Smurf1 expression is elevated in colorectal cancer tissues.

(a) Smurf1 expression scores are shown as box plots, with the horizontal lines representing the median; the bottom and top of the boxes representing the 25th and 75th percentiles, respectively; and the vertical bars representing the ra…

Figure 2: Positive correlation of Smurf1 expression with Nedd8 and its interacting enzymes in colorectal cancer.

(a) Representative images from immunohistochemical staining of Smurf1, Ubc12, NAE1 and Nedd8 in the same colorectal cancer tumour. Scale bars, 100 μm. (b–d) The expression scores of Nedd8 (b, n=283), NAE1 (c, n=281) and Ubc12 (d, n=19…

Figure 3: Smurf1 interacts with Ubc12.

(a) GST pull-down assay of Smurf1 with Ubc12. Both input and pull-down samples were subjected to immunoblotting with anti-His and anti-GST antibodies. Smurf1 interacted with Ubc12 and UbcH5c, but not with Ubc9. (b) Mapping the regions…

Figure 4: Nedd8 is attached to Smurf1 through C426-catalysed autoneddylation.

(a) Covalent neddylation of Smurf1 in vitro. Purified His-Smurf1-WT or C699A proteins were incubated with Nedd8 and Nedd8-E1/E2. Reactions were performed as described in the Methods section. Samples were analysed by western blotting wi…

Figure 5: Neddylation of Smurf1 activates its ubiquitin ligase activity.

(a) In vivo Smurf1 ubiquitylation assay. Nedd8 was co-expressed with Smurf1 WT or C699A in HCT116 cells (left panels). Twenty-four hours post transfection, cells were treated with MG132 (20 μM, 8 h). HCT116 cells were transfected with…


The deubiquitylase USP33 discriminates between RALB functions in autophagy and innate immune response

M Simicek, S Lievens, M Laga, D Guzenko, VN. Aushev, et al.
Nature Cell Biology 2013; 15, 1220–1230    http://dx.doi.org:/10.1038/ncb2847

The RAS-like GTPase RALB mediates cellular responses to nutrient availability or viral infection by respectively

  • engaging two components of the exocyst complex, EXO84 and SEC5.
  1. RALB employs SEC5 to trigger innate immunity signalling, whereas
  2. RALB–EXO84 interaction induces autophagocytosis.

How this differential interaction is achieved molecularly by the RAL GTPase remains unknown.

We found that whereas GTP binding

  • turns on RALB activity,

ubiquitylation of RALB at Lys 47

  • tunes its activity towards a particular effector.

Specifically, ubiquitylation at Lys 47

  • sterically inhibits RALB binding to EXO84, while
  • facilitating its interaction with SEC5.

Double-stranded RNA promotes

  • RALB ubiquitylation and
  • SEC5–TBK1 complex formation.

In contrast, nutrient starvation

  • induces RALB deubiquitylation
  • by accumulation and relocalization of the deubiquitylase USP33
  • to RALB-positive vesicles.

Deubiquitylated RALB

  • promotes the assembly of the RALB–EXO84–beclin-1 complexes
  • driving autophagosome formation. Thus,
  • ubiquitylation within the effector-binding domain
  • provides the switch for the dual functions of RALB in
    • autophagy and innate immune responses.

Part 5. Metabolic Syndrome

Single Enzyme is Necessary for Development of Diabetes

Published: Aug 20, 2014 http://www.technologynetworks.com/Metabolomics/news.aspx?ID=169416

12-LO enzyme promotes the obesity-induced oxidative stress in the pancreatic cells.

An enzyme called 12-LO promotes the obesity-induced oxidative stress in the pancreatic cells that leads

  • to pre-diabetes and diabetes.

12-LO’s enzymatic action is the last step in

  • the production of certain small molecules that harm the cell,

according to a team from Indiana University School of Medicine, Indianapolis.

The findings could enable the development of drugs that interfere with this enzyme, potentially preventing or even reversing diabetes. The research is published ahead of print in the journal Molecular and Cellular Biology.

In earlier studies, these researchers and their collaborators at Eastern Virginia Medical School showed that

  • 12-LO (which stands for 12-lipoxygenase) is present in these cells
  • only in people who become overweight.

The harmful small molecules resulting from 12-LO’s enzymatic action are known as HETEs, short for hydroxyeicosatetraenoic acid.

  1. HETEs harm the mitochondria, which then
  2. fail to produce sufficient energy to enable
  3. the pancreatic cells to manufacture the necessary quantities of insulin.

For the study, the investigators genetically engineered mice that

  • lacked the gene for 12-LO exclusively in their pancreas cells.

Mice were either fed a low-fat or high-fat diet.

Both the control mice and the knockout mice on the high fat diet

  • developed obesity and insulin resistance.

The investigators also examined the pancreatic beta cells of both knockout and control mice, using both microscopic studies and molecular analysis. Those from the knockout mice were intact and healthy, while

  • those from the control mice showed oxidative damage,
  • demonstrating that 12-LO and the resulting HETEs
  • caused the beta cell failure.

Mirmira notes that the fatty diet used in the study was the Western diet, which comprises mostly saturated (“bad”) fats. Based partly on a recent study of related metabolic pathways, he says that

  • the unsaturated and mono-unsaturated fats-which comprise most fats in the healthy,
  • relatively high fat Mediterranean diet-are unlikely to have the same effects.

“Our research is the first to show that 12-LO in the beta cell

  • is the culprit in the development of pre-diabetes, following high fat diets,” says Mirmira.

“Our work also lends important credence to the notion that

  • the beta cell is the primary defective cell in virtually all forms of diabetes and pre-diabetes.”

A New Player in Lipid Metabolism Discovered

Published: Aug18, 2014  http://www.technologynetworks.com/Metabolomics/news.aspx?ID=169356

Specially engineered mice gained no weight, and normal counterparts became obese

  • on the same high-fat, obesity-inducing Western diet.

Specially engineered mice that lacked a particular gene did not gain weight

  • when fed a typical high-fat, obesity-inducing Western diet.

Yet, these mice ate the same amount as their normal counterparts that became obese.

The mice were engineered with fat cells that lacked a gene called SEL1L,

  • known to be involved in the clearance of mis-folded proteins
  • in the cell’s protein making machinery called the endoplasmic reticulum (ER).

When mis-folded proteins are not cleared but accumulate,

  • they destroy the cell and contribute to such diseases as
  1. mad cow disease,
  2. Type 1 diabetes and
  3. cystic fibrosis.

“The million-dollar question is why don’t these mice gain weight? Is this related to its inability to clear mis-folded proteins in the ER?” said Ling Qi, associate professor of molecular and biochemical nutrition and senior author of the study published online July 24 in Cell Metabolism. Haibo Sha, a research associate in Qi’s lab, is the paper’s lead author.

Interestingly, the experimental mice developed a host of other problems, including

  • postprandial hypertriglyceridemia,
  • and fatty livers.

“Although we are yet to find out whether these conditions contribute to the lean phenotype, we found that

  • there was a lipid partitioning defect in the mice lacking SEL1L in fat cells,
  • where fat cells cannot store fat [lipids], and consequently
  • fat goes to the liver.

During the investigation of possible underlying mechanisms, we discovered

  • a novel function for SEL1L as a regulator of lipid metabolism,” said Qi.

Sha said “We were very excited to find that

  • SEL1L is required for the intracellular trafficking of
  • lipoprotein lipase (LPL), acting as a chaperone,”

and added that “Using several tissue-specific knockout mouse models,

  • we showed that this is a general phenomenon,”

Without LPL, lipids remain in the circulation;

  • fat and muscle cells cannot absorb fat molecules for storage and energy combustion.

People with LPL mutations develop

  • postprandial hypertriglyceridemia similar to
  • conditions found in fat cell-specific SEL1L-deficient mice, said Qi.

Future work will investigate the

  • role of SEL1L in human patients carrying LPL mutations and
  • determine why fat cell-specific SEL1L-deficient mice remain lean under Western diets, said Sha.

Co-authors include researchers from Cedars-Sinai Medical Center in Los Angeles; Wageningen University in the Netherlands; Georgia State University; University of California, Los Angeles; and the Medical College of Soochow University in China.

The study was funded by the U.S. National Institutes of Health, the Netherlands Organization for Health Research and Development, the Cedars-Sinai Medical Center, the Chinese National Science Foundation, the American Diabetes Association, Cornell’s Center for Vertebrate Genomics and the Howard Hughes Medical Institute.

Part 6. Biomarkers

Biomarkers Take Center Stage

Josh P. Roberts
GEN May 1, 2013 (Vol. 33, No. 9)  http://www.genengnews.com/

While work with biomarkers continues to grow, scientists are also grappling with research-related bottlenecks, such as

  1. affinity reagent development,
  2. platform reproducibility, and
  3. sensitivity.

Biomarkers by definition indicate some state or process that generally occurs

  • at a spatial or temporal distance from the marker itself, and

it would not be an exaggeration to say that biomedicine has become infatuated with them:

  1. where to find them,
  2. when they may appear,
  3. what form they may take, and
  4. how they can be used to diagnose a condition or
  5. predict whether a therapy may be successful.

Biomarkers are on the agenda of many if not most industry gatherings, and in cases such as Oxford Global’s recent “Biomarker Congress” and the GTC “Biomarker Summit”, they hold the naming rights. There, some basic principles were built upon, amended, and sometimes challenged.

In oncology, for example, biomarker discovery is often predicated on the premise that

  • proteins shed from a tumor will traverse to and persist in, and be detectable in, the circulation.

By quantifying these proteins—singularly or as part of a larger “signature”—the hope is

  1. to garner information about the molecular characteristics of the cancer
  2. that will help with cancer detection and
  3. personalization of the treatment strategy.

Yet this approach has not yet turned into the panacea that was hoped for. Bottlenecks exist in

  • affinity reagent development,
  • platform reproducibility, and
  • sensitivity.

There is also a dearth of understanding of some of the

  • fundamental principles of biomarker biology that still need answers,

said Parag Mallick, Ph.D., whose lab at Stanford University is “working on trying to understand where biomarkers come from.”

There are dogmas saying that

  • circulating biomarkers come solely from secreted proteins.

But Dr. Mallick’s studies indicate that fully

  • 50% of circulating proteins may come from intracellular sources or
  • proteins that are annotated as such.

“We don’t understand the processes governing

  • which tumor-derived proteins end up in the blood.”

Other questions include “how does the size of a tumor affect how much of a given protein will be in the blood?”—perhaps

  • the tumor is necrotic at the center, or
  • it’s hypervascular or hypovascular.

He points out, “The problem is that these are highly nonlinear processes at work, and

  • there is a large number of factors that might affect the answer to that question.”

Their research focuses on using

  1. mass spectrometry and
  2. computational analysis
  • to characterize the biophysical properties of the circulating proteome, and
  • relate these to measurements made of the tumor itself.

Furthermore, he said, “We’ve observed that the proteins that are likely to

  • first show up and persist in the circulation
  • are more stable than proteins that don’t,” and
  • “we can quantify how significant the effect is.”

The goal is ultimately to be able to

  1. build rigorous, formal mathematical models that will allow something measured in the blood
  2. to be tied back to the molecular biology taking place in the tumor.

And conversely, to use those models

  • to predict from a tumor what will be found in the circulation.

“Ultimately, the models will allow you to connect the dots between

  • what you measure in the blood and the biology of the tumor.”
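
A deliberately simplified illustration of this kind of forward model follows. It is a sketch only, not Dr. Mallick’s actual model: it assumes shedding proportional to tumor volume and first-order clearance set by a protein half-life, and every parameter name and value is hypothetical.

```python
# Toy one-compartment model of a tumor-shed protein in the circulation.
# Assumptions (illustrative only): shedding is proportional to tumor volume,
# clearance is first order, and more stable proteins have longer half-lives.

import math

def circulating_conc(tumor_volume_cm3, shed_rate_ng_per_cm3_day,
                     half_life_days, plasma_volume_L=3.0,
                     t_days=30.0, dt=0.01):
    """Return the simulated plasma concentration (ng/mL) after t_days."""
    k_clear = math.log(2) / half_life_days         # first-order clearance rate (1/day)
    amount_ng = 0.0                                # total protein in plasma (ng)
    for _ in range(int(t_days / dt)):
        shed = shed_rate_ng_per_cm3_day * tumor_volume_cm3 * dt
        cleared = k_clear * amount_ng * dt
        amount_ng += shed - cleared
    return amount_ng / (plasma_volume_L * 1000.0)  # ng per mL of plasma

# A stable marker (longer half-life) reaches a much higher steady state than
# an unstable one shed at the same rate from the same 1 cm3 tumor.
print(circulating_conc(1.0, 1e4, half_life_days=2.0))   # ~9.6 ng/mL
print(circulating_conc(1.0, 1e4, half_life_days=0.1))   # ~0.5 ng/mL
```

In a model like this the steady-state level is simply the shedding rate divided by the clearance rate, which is one way to rationalize the observation above that the more stable proteins are the ones that show up and persist in the circulation.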

Bound for Affinity Arrays

Affinity reagents are the main tools for large-scale protein biomarker discovery. And while this has tended to mean antibodies (or their derivatives), other affinity reagents are demanding a place in the toolbox.

Affimers, a type of affinity reagent being developed by Avacta, consist of

  1. a biologically inert, biophysically stable protein scaffold
  2. containing three variable regions into which
  3. distinct peptides are inserted.

The resulting three-dimensional surface formed by these peptides

  • interacts and binds to proteins and other molecules in solution,
  • much like the antigen-binding site of antibodies.

Unlike antibodies, Affimers are relatively small (13 kDa),

  • non-post-translationally modified proteins
  • that can readily be expressed in bacterial culture.

They may be made to bind surfaces through unique residues

  • engineered onto the opposite face of the Affimer,
  • allowing the binding site to be exposed to the target in solution.

“We don’t seem to see in what we’ve done so far

  • any real loss of activity or functionality of Affimers when bound to surfaces—

they’re very robust,” said CEO Alastair Smith, Ph.D.

Avacta is taking advantage of this stability and its large libraries of Affimers to develop

  • very large affinity microarrays for
  • drug and biomarker discovery.

To date they have printed arrays with around 20–25,000 features, and Dr. Smith is “sure that we can get toward about 50,000 on a slide,” he said. “There’s no real impediment to us doing that other than us expressing the proteins and getting on with it.”

Customers will be provided with these large, complex “naïve” discovery arrays, readable with standard equipment. The plan is for the company to then “support our customers by providing smaller arrays with

  • the Affimers that are binding targets of interest to them,” Dr. Smith foretold.

And since the intellectual property rights are unencumbered,

  • Affimers in those arrays can be licensed to the end users
  • to develop diagnostics that can be validated as time goes on.

Around 20,000-Affimer discovery arrays were recently tested by collaborator Professor Ann Morgan of the University of Leeds with pools of unfractionated serum from patients with symptoms of inflammatory disease. The arrays

  • “rediscovered” elevated C-reactive protein (CRP, the clinical gold standard marker)
  • as well as uncovered an additional 22 candidate biomarkers.
  • Other candidates, combined with CRP, appear able to distinguish between different diseases such as
  1. rheumatoid arthritis,
  2. psoriatic arthritis,
  3. SLE, or
  4. giant cell arteritis.

Epigenetic Biomarkers

Methylation of adenine

Sometimes biomarkers are used not to find disease but

  • to distinguish healthy human cell types, with
  •  examples being found in flow cytometry and immunohistochemistry.

These widespread applications, however, are difficult to standardize, being

  • subject to arbitrary or subjective gating protocols and other imprecise criteria.

Epiontis instead uses an epigenetic approach. “What we need is a unique marker that is

  • demethylated only in one cell type and
  • methylated in all the other cell types,”

Each cell of the right cell type will have

  • two demethylated copies of a certain gene locus,
  • allowing them to be enumerated by quantitative PCR (as sketched below).
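
As a back-of-the-envelope illustration of the counting logic just described (a hedged sketch, not Epiontis’ actual assay mathematics), the calculation below assumes two demethylated marker copies per target cell and two copies of a methylation-independent reference locus per cell of any type; the function name and example numbers are hypothetical.

```python
# Minimal sketch: estimate the fraction of a target immune cell type from
# epigenetic qPCR copy numbers. Assumes each target cell contributes two
# demethylated marker copies and every cell contributes two reference copies.

def cell_type_fraction(demethylated_marker_copies, reference_locus_copies):
    """Return the estimated fraction of cells belonging to the target type."""
    if reference_locus_copies <= 0:
        raise ValueError("reference copy number must be positive")
    target_cells = demethylated_marker_copies / 2.0   # two demethylated copies per target cell
    total_cells = reference_locus_copies / 2.0        # two reference copies per cell
    return target_cells / total_cells

# Example: 1,200 demethylated marker-locus copies measured against 24,000
# reference-locus copies corresponds to roughly 5% of cells being the target type.
print(cell_type_fraction(1200, 24000))   # 0.05
```

Real assays would also need to handle loci whose per-cell copy number differs between samples (for example, X-linked genes in male donors) and differences in qPCR efficiency; those corrections are omitted from this sketch.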

The biggest challenge is finding that unique epigenetic marker. To do so they look through the literature for proteins and genes described as playing a role in the cell type’s biology, and then

  • look at the methylation patterns to see if one can be used as a marker,

They also “use customized Affymetrix chips to look at the

  • differential epigenetic status of different cell types on a genomewide scale.”

explained CBO and founder Ulrich Hoffmueller, Ph.D.

The company currently has a panel of 12 assays for 12 immune cell types. Among these is an assay for

  • regulatory T (Treg) cells that queries the Foxp3 gene—which is uniquely demethylated in Treg
  • even though it is transiently expressed in activated T cells of other subtypes.

Also assayed are Th17 cells, difficult to detect by flow cytometry because

  • “the cells have to be stimulated in vitro,” he pointed out.

Developing New Assays for Cancer Biomarkers

Researchers at Myriad RBM and the Cancer Prevention Research Institute of Texas are collaborating to develop

  • new assays for cancer biomarkers on the Myriad RBM Multi-Analyte Profile (MAP) platform.

The release of OncologyMAP 2.0 expanded Myriad RBM’s biomarker menu to over 250 analytes, which can be measured from a small single sample, according to the company. Using this menu, L. Stephen et al. published a poster, “Analysis of Protein Biomarkers in Prostate and Colorectal Tumor Lysates,” which showed the results of

  • a survey of proteins relevant to colorectal (CRC) and prostate (PC) tumors
  • to identify potential proteins of interest for cancer research.

The study looked at CRC and PC tumor lysates and found that 102 of the 115 proteins showed levels above the lower limit of quantification.

  • Four markers were significantly higher in PC and 10 were greater in CRC.

For most of the analytes, duplicate sections of the tumor were similar, although some analytes did show differences. In four of the CRC analytes, tumor number four showed differences for CEA and tumor number 2 for uPA.

Thirty analytes were shown to be

  • different in CRC tumor compared to its adjacent tissue.
  • Ten of the analytes were higher in adjacent tissue compared to CRC.
  • Eighteen of the markers examined demonstrated

significant correlations of CRC tumor concentration to serum levels.

This suggests that the OncologyMAP 2.0 platform “provides a good method for studying changes in tumor levels because many proteins can be assessed with a very small sample.”

Clinical Test Development with MALDI-ToF

While there have been many attempts to translate results from early discovery work on the serum proteome into clinical practice, few of these efforts have progressed past the discovery phase.

Matrix-assisted laser desorption/ionization-time of flight (MALDI-ToF) mass spectrometry on unfractionated serum/plasma samples offers many practical advantages over alternative techniques, and does not require

  • a shift from discovery to development and commercialization platforms.

Biodesix claims it has been able to develop the technology into

  • a reproducible, high-throughput tool to
  • routinely measure protein abundance from serum/plasma samples.

“.. we improved data-analysis algorithms to

  • reproducibly obtain quantitative measurements of relative protein abundance from MALDI-ToF mass spectra.

Heinrich Röder, CTO, points out that the MALDI-ToF measurements

  • are combined with clinical outcome data using
  • modern learning theory techniques
  • to define specific disease states
  • based on a patient’s serum protein content,”
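
As a generic illustration of the kind of supervised analysis described above (a sketch under assumed data, not Biodesix’s proprietary VeriStrat algorithm), one could train a simple classifier on normalized peak intensities labeled by clinical outcome. The feature values, labels, and the choice of a k-nearest-neighbors model are all assumptions made for the example.

```python
# Sketch: label serum MALDI-ToF spectra as "good" or "poor" outcome groups
# from a handful of peak intensities. Data and model choice are illustrative.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Rows = patients, columns = normalized intensities of selected m/z peaks
# (synthetic stand-in numbers, not real spectra).
X_train = np.array([
    [1.2, 0.8, 3.1], [1.1, 0.9, 3.0], [0.4, 2.2, 1.0],
    [0.5, 2.0, 1.1], [1.3, 0.7, 2.9], [0.6, 2.1, 0.9],
])
y_train = ["good", "good", "poor", "poor", "good", "poor"]

model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
model.fit(X_train, y_train)

# Classify a new patient's spectrum from the same three peaks.
new_spectrum = np.array([[0.5, 1.9, 1.2]])
print(model.predict(new_spectrum))   # expected to fall in the "poor" group
```

The point of the sketch is only the workflow: extract reproducible spectral features, attach outcome labels, and let a trained model assign new patients to outcome-associated serum protein states.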

The clinical utility of the identification of these disease states can be investigated through a retrospective analysis of differing sample sets. For example, Biodesix clinically validated its first commercialized serum proteomic test, VeriStrat®, in 85 different retrospective sample sets.

Röder adds that “It is becoming increasingly clear that

  • the patients whose serum is characterized as VeriStrat Poor show
  • consistently poor outcomes irrespective of
  1. tumor type,
  2. histology, or
  3. molecular tumor characteristics,”

MALDI-ToF mass spectrometry, in its standard implementation,

  • allows for the observation of around 100 mostly high-abundant serum proteins.

Further, “while this does not limit the usefulness of tests developed from differential expression of these proteins,

  • the discovery potential would be greatly enhanced
  • if we could probe deeper into the proteome
  • while not giving up the advantages of the MALDI-ToF approach,”

Biodesix reports that its new MALDI approach, Deep MALDI™, can perform

  • simultaneous quantitative measurement of more than 1,000 serum protein features (or peaks) from 10 µL of serum in a high-throughput manner.
  • It increases the observable signal-to-noise ratio from a few hundred to over 50,000,
  • resulting in the observation of many lower-abundance serum proteins.

Breast cancer is now considered to be a collection of many complexes of symptoms and signatures; the dominant subtypes, labeled Luminal A, Luminal B, Her2, and Basal, suggest different prognoses, and

  • these labels are considered too simplistic for understanding and managing a woman’s cancer.

Studies published in the past year have looked at

  1. somatic mutations,
  2. gene copy number aberrations,
  3. gene expression abnormalities,
  4. protein and miRNA expression, and
  5. DNA methylation,

coming up with a list of significantly mutated genes—hot spots—in different categories of breast cancers. Targeting these will inevitably be the focus of much coming research.

“We’ve been taking these large trials and profiling these on a variety of array or sequence platforms. We think we’ll get

  1. prognostic drivers
  2. predictive markers for taxanes and
  3. monoclonal antibodies and
  4. tamoxifen and aromatase inhibitors,”
    explained Brian Leyland-Jones, Ph.D., director of Edith Sanford Breast Cancer Research. “We will end up with 20–40 different diseases, maybe more.”

Edith Sanford Breast Cancer Research is undertaking a pilot study in collaboration with The Scripps Research Institute, using a variety of tests on 25 patients to see how the information they provide complements each other, the overall flow, and the time required to get and compile results.

Laser-captured tumor samples will be subjected to low passage whole-genome, exome, and RNA sequencing (with targeted resequencing done in parallel), and reverse-phase protein and phosphorylation arrays, with circulating nucleic acids and circulating tumor cells being queried as well. “After that we hope to do a 100- or 150-patient trial when we have some idea of the best techniques,” he said.

Dr. Leyland-Jones predicted that ultimately most tumors will be found

  • to have multiple drivers,
  • with most patients receiving a combination of two, three, or perhaps four different targeted therapies.

Reduce to Practice

According to Randox, the Evidence Investigator is a sophisticated semi-automated biochip system designed for research, clinical, forensic, and veterinary applications.

Once biomarkers that may have an impact on therapy are discovered, it is not always routine to get them into clinical practice. Leaving aside regulatory, financial, intellectual property, and cultural issues, developing a diagnostic based on a biomarker often requires expertise or patience that its discoverer may not possess.

Andrew Gribben is a clinical assay and development scientist at Randox Laboratories, based in Northern Ireland, U.K. The company utilizes academic and industrial collaborators together with in-house discovery platforms to identify biomarkers that are

  • augmented or diminished in a particular pathology
  • relative to appropriate control populations.

Biomarkers can be developed to be run individually or

  • combined into panels of immunoassays on its multiplex biochip array technology.

Specificity can also be gained—or lost—by the affinity of reagents in an assay. The diagnostic potential of Heart-type fatty acid binding protein (H-FABP) abundantly expressed in human myocardial cells was recognized by Jan Glatz of Maastricht University, The Netherlands, back in 1988. Levels rise quickly within 30 minutes after a myocardial infarction, peaking at 6–8 hours and return to normal within 24–30 hours. Yet at the time it was not known that H-FABP was a member of a multiprotein family, with which the polyclonal antibodies being used in development of an assay were cross-reacting, Gribben related.

Randox developed monoclonal antibodies specific to H-FABP and funded trials investigating its use alone and multiplexed with cardiac biomarker assays. In 2011, more than 30 years after the biomarker was identified, the company released a validated assay for H-FABP as a biomarker for early detection of acute myocardial infarction.

Ultrasensitive Immunoassays for Biomarker Development

Research has shown that detection and monitoring of biomarker concentrations can provide

  • insights into disease risk and progression.

Cytokines have become attractive biomarkers and candidates

  • for targeted therapies for a number of autoimmune diseases, including rheumatoid arthritis (RA), Crohn’s disease, and psoriasis, among others.

However, due to the low-abundance of circulating cytokines, such as IL-17A, obtaining robust measurements in clinical samples has been difficult.

Singulex reports that its digital single-molecule counting technology provides

  • increased precision and detection sensitivity over traditional ELISA techniques,
  • helping to shed light on biomarker verification and validation programs.

The company’s Erenna® immunoassay system, which includes optimized immunoassays, offers lower limits of quantification (LLoQ) at femtogram-per-mL levels, even in healthy populations, an improvement of 1-3 fold over standard ELISAs or any conventional technology, with a dynamic range of up to 4 logs, according to a Singulex official, who adds that

  • this sensitivity improvement helps minimize undetectable samples that
  • could otherwise delay or derail clinical studies.

The official also explains that the Singulex solution includes an array of products and services that are being applied to a number of programs and have enabled the development of clinically relevant biomarkers, allowing translation from discovery to the clinic.

In a poster entitled “Advanced Single Molecule Detection: Accelerating Biomarker Development Utilizing Cytokines through Ultrasensitive Immunoassays,” a case study was presented of work performed by Jeff Greenberg of NYU to show how the use of the Erenna system can provide insights toward

  • improving the clinical utility of biomarkers and
  • accelerating the development of novel therapies for treating inflammatory diseases.

A panel of inflammatory biomarkers was examined in DMARD (disease modifying antirheumatic drugs)-naïve RA (rheumatoid arthritis) vs. knee OA (osteoarthritis) patient cohorts. Markers that exhibited significant differences in plasma concentrations between the two cohorts included

  • CRP, IL-6R alpha, IL-6, IL-1 RA, VEGF, TNF-RII, and IL-17A, IL-17F, and IL-17A/F.

Among the three tested isoforms of IL-17,

  • the magnitude of elevation for IL-17F in RA patients was the highest.

“Singulex provides high-resolution monitoring of baseline IL-17A concentrations that are present at low levels,” concluded the researchers. “The technology also enabled quantification of other IL-17 isoforms in RA patients, which have not been well characterized before.”

The Singulex Erenna System has also been applied to cardiovascular disease research, for which its

  • cardiac troponin I (cTnI) digital assay can be used to measure circulating
  • levels of cTnI undetectable by other commercial assays.

Recently presented data from Brigham and Women’s Hospital and the TIMI-22 study showed that

  • using the Singulex test to serially monitor cTnI helps
  • stratify risk in post-acute coronary syndrome patients and
  • can identify patients with elevated cTnI
  • who have the most to gain from intensive vs. moderate-dose statin therapy,

according to the scientists involved in the research.

The study poster, “Prognostic Performance of Serial High Sensitivity Cardiac Troponin Determination in Stable Ischemic Heart Disease: Analysis From PROVE IT-TIMI 22,” was presented at the 2013 American College of Cardiology (ACC) Annual Scientific Session & Expo by R. O’Malley et al.

Biomarkers Changing Clinical Medicine

Better Diagnosis, Prognosis, and Drug Targeting Are among Potential Benefits

John Morrow Jr., Ph.D.

Researchers at EMD Chemicals are developing biomarker immunoassays

  • to monitor drug-induced toxicity including kidney damage.

The pace of biomarker development is accelerating as investigators report new studies on cancer, diabetes, Alzheimer disease, and other conditions in which the evaluation and isolation of workable markers is prominently featured.

Wei Zheng, Ph.D., leader of the R&D immunoassay group at EMD Chemicals, is overseeing a program to develop biomarker immunoassays to

  • monitor drug-induced toxicity, including kidney damage.

“One of the principal reasons for drugs failing during development is because of organ toxicity,” says Dr. Zheng.
“Proteins liberated into the serum and urine can serve as biomarkers of adverse response to drugs, as well as disease states.”

Through collaborative programs with Rules-Based Medicine (RBM), the EMD group has released panels for the profiling of human renal impairment and renal toxicity. These urinary biomarker based products fit the FDA and EMEA guidelines for assessment of drug-induced kidney damage in rats.

The group recently performed a screen for potential protein biomarkers in relation to

  • kidney toxicity/damage on a set of urine and plasma samples
  • from patients with documented renal damage.

Additionally, Dr. Zheng is directing efforts to move forward with the multiplexed analysis of

  • organ and cellular toxicity.

Diseases thought to involve compromised oxidative phosphorylation include

  • diabetes, Parkinson and Alzheimer diseases, cancer, and the aging process itself.

Good biomarkers allow Dr. Zheng to follow the mantra, “fail early, fail fast.” With robust, multiplexible biomarkers, EMD can detect bad drugs early and kill them before they move into costly large animal studies and clinical trials. “Recognizing the severe liability that toxicity presents, we can modify the structure of the candidate molecule and then rapidly reassess its performance.”

Scientists at Oncogene Science, a division of Siemens Healthcare Diagnostics, are also focused on biomarkers. “We are working on a number of antibody-based tests for various cancers, including a test for the Ca-9 CAIX protein, also referred to as carbonic anhydrase,” Walter Carney, Ph.D., head of the division, states.

CAIX is a transmembrane protein that is

  • overexpressed in a number of cancers, and, like Herceptin and the Her-2 gene,
  • can serve as an effective and specific marker for both diagnostic and therapeutic purposes.
  • It is liberated into the circulation in proportion to the tumor burden.

Dr. Carney and his colleagues are evaluating patients after tumor removal for the presence of the Ca-9 CAIX protein. If

  • the levels of the protein in serum increase over time,
  • this suggests that not all the tumor cells were removed and the tumor has metastasized.

Dr. Carney and his team have developed both an immuno-histochemistry and an ELISA test that could be used as companion diagnostics in clinical trials of CAIX-targeted drugs.

The ELISA for the Ca-9 CAIX protein will be used in conjunction with Wilex’ Rencarex®, which is currently in a

  • Phase III trial as an adjuvant therapy for non-metastatic clear cell renal cancer.

Additionally, Oncogene Science has in its portfolio an FDA-approved test for the Her-2 marker. Originally approved for Her-2/Neu-positive breast cancer, its indications have been expanded over time, and it was approved

  • for the treatment of gastric cancer last year.

It is normally present on breast epithelia but

  • overexpressed in some breast cancer tumors.

“Our products are designed to be used in conjunction with targeted therapies,” says Dr. Carney. “We are working with companies that are developing technology around proteins that are

  • overexpressed in cancerous tissues and can be both diagnostic and therapeutic targets.”

The long-term goal of these studies is to develop individualized therapies, tailored for the patient. Since the therapies are expensive, accurate diagnostics are critical to avoid wasting resources on patients who clearly will not respond (or could be harmed) by the particular drug.

“At this time the rate of response to antibody-based therapies may be very poor, as

  • they are often employed late in the course of the disease, and patients are in such a debilitated state
  • that they lack the capacity to react positively to the treatment,” Dr. Carney explains.

Nanoscale Real-Time Proteomics

Stanford University School of Medicine researchers, working with Cell BioSciences, have developed a

  • nanofluidic proteomic immunoassay that measures protein charge,
  • similar to immunoblots, mass spectrometry, or flow cytometry.
  • Unlike these platforms, this approach can measure the amount of individual isoforms,
  • specifically, phosphorylated molecules.

“We have developed a nanoscale device for protein measurement, which I believe could be useful for clinical analysis,” says Dean W. Felsher, M.D., Ph.D., associate professor at Stanford University School of Medicine.

Critical oncogenic transformations involving

  • the activation of the signal-related kinases ERK-1 and ERK-2 can now be followed with ease.

“The fact that we measure nanoquantities with accuracy means that

  • we can interrogate proteomic profiles in clinical patients,

by drawing tiny needle aspirates from tumors over the course of time,” he explains.

“This allows us to observe the evolution of tumor cells and

  • their response to therapy
  • from a baseline of the normal tissue as a standard of comparison.”

According to Dr. Felsher, 20 cells is a large enough sample to obtain a detailed description. The technology is easy to automate, which allows

  • the inclusion of hundreds of assays.

Contrasting this technology platform with proteomic analysis using microarrays, Dr. Felsher notes that the latter is not yet workable for revealing reliable markers.

Dr. Felsher and his group published a description of this technology in Nature Medicine. “We demonstrated that we could take a set of human lymphomas and distinguish them from both normal tissue and other tumor types. We can

  • quantify changes in total protein, protein activation, and relative abundance of specific phospho-isoforms
  • from leukemia and lymphoma patients receiving targeted therapy.

Even with very small numbers of cells, we are able to show that the results are consistent, and

  • our sample is a random profile of the tumor.”

Splice Variant Peptides

“Aberrations in alternative splicing may generate

  • much of the variation we see in cancer cells,”

says Gilbert Omenn, Ph.D., director of the center for computational medicine and bioinformatics at the University of Michigan School of Medicine. Dr. Omenn and his colleague, Rajasree Menon, are

  • using this variability as a key to new biomarker identification.

It is becoming evident that splice variants play a significant role in the properties of cancer cells, including

  • initiation, progression, cell motility, invasiveness, and metastasis.

Alternative splicing occurs through multiple mechanisms

  • when the exons or coding regions of the DNA are transcribed into mRNA,
  • generating alternative initiation sites and different exon combinations in the protein products.

Their translation into protein can result in numerous protein isoforms, and

  • these isoforms may reflect a diseased or cancerous state.

Regulatory elements within the DNA are responsible for selecting different alternatives; thus

  • the splice variants are tempting targets for exploitation as biomarkers.
Analyses of the splice-site mutation

Despite the many questions raised by these observations, splice variation in tumor material has not been widely studied. Cancer cells are known for their tremendous variability, which allows them to

  • grow rapidly, metastasize, and develop resistance to anticancer drugs.

Dr. Omenn and his collaborators used

  • mass spec data to interrogate a custom-built database of all potential mRNA sequences
  • to find alternative splice variants.

When they compared normal and malignant mammary gland tissue from a mouse model of Her2/Neu human breast cancers, they identified a vast number (608) of splice variant proteins, of which

  • peptides from 216 were found only in the tumor sample.
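
The search strategy just described, matching observed mass spec peptides against a database built from candidate transcript isoforms, can be caricatured in a few lines. The sketch below is illustrative only and is not the Michigan group’s pipeline: the two protein sequences and the “observed” peptides are invented, and the digestion rule is the simple cleave-after-K/R-except-before-P approximation of trypsin.

```python
# Sketch: flag observed peptides that are unique to a splice-variant isoform.
# Sequences and the observed-peptide list are made up for illustration.

import re

def tryptic_digest(protein_seq):
    """Cleave after K or R, except before P (simplified trypsin rule)."""
    return set(p for p in re.split(r'(?<=[KR])(?!P)', protein_seq) if p)

canonical = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSR"
variant   = "MKTAYIAKQRQISFVKGGWLNDKSHFSRQLEERLGLIEVQAPILSR"  # extra exon-coded stretch

canonical_peptides = tryptic_digest(canonical)
variant_peptides = tryptic_digest(variant)

observed = {"QISFVK", "GGWLNDK", "SHFSR"}   # peptides "seen" by the mass spec

# Peptides present in the observed set and found only in the variant sequence
# constitute evidence that the splice variant protein is actually expressed.
variant_only = observed & (variant_peptides - canonical_peptides)
print(variant_only)   # {'GGWLNDK'}
```

Real pipelines score full fragmentation spectra against every candidate isoform peptide and control the false discovery rate, but the underlying idea of flagging variant-only peptides is the same.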

“These novel and known alternative splice isoforms

  • are detectable both in tumor specimens and in plasma and
  • represent potential biomarker candidates,” Dr. Omenn adds.

Dr. Omenn’s observations and those of his colleague Lewis Cantley, Ph.D., have also

  • shed light on the origins of the classic Warburg effect,
  • the shift to anaerobic glycolysis in tumor cells.

The novel splice variant M2, of muscle pyruvate kinase,

  • is observed in embryonic and tumor tissue.

It is associated with this shift, the result of

  • the expression of a peptide splice variant sequence.

It is remarkable how many different areas of the life sciences are tied into the phenomenon of splice variation. The changes in the genetic material can be much greater than point mutations, which have been traditionally considered to be the prime source of genetic variability.

“We now have powerful methods available to uncover a whole new category of variation,” Dr. Omenn says. “High-throughput RNA sequencing and proteomics will be complementary in discovery studies of splice variants.”

Splice variation may play an important role in rapid evolutionary changes, of the sort discussed by Susumu Ohno and Stephen J. Gould decades ago. They, and other evolutionary biologists, argued that

  • gene duplication, combined with rapid variability, could fuel major evolutionary jumps.

At the time, the molecular mechanisms of variation were poorly understood, but today

  • the tools are available to rigorously evaluate the role of
  • splice variation and other contributors to evolutionary change.

“Biomarkers derived from studies of splice variants, could, in the future, be exploited

  • both for diagnosis and prognosis and
  • for drug targeting of biological networks,
  • in situations such as the Her-2/Neu breast cancers,” Dr. Omenn says.

Aminopeptidase Activities

“By correlating the proteolytic patterns with disease groups and controls, we have shown that

  • exopeptidase activities contribute to the generation of not only cancer-specific
  • but also cancer type-specific serum peptides.

So there is a direct link between peptide marker profiles of disease and differential protease activity,” according to Paul Tempst, Ph.D., professor and director of the Protein Center at the Memorial Sloan-Kettering Cancer Center. For this reason, Dr. Tempst argues that “the patterns we describe may have value as surrogate markers for detection and classification of cancer.”

To investigate this avenue, Dr. Tempst and his colleagues have followed

  • the relationship between exopeptidase activities and metastatic disease.

“We monitored controlled, de novo peptide breakdown in large numbers of biological samples using mass spectrometry, with relative quantitation of the metabolites,” Dr. Tempst explains. This entailed the use of magnetic, reverse-phase beads for analyte capture and a MALDI-TOF MS read-out.

“In biomarker discovery programs, functional proteomics is usually not pursued,” says Dr. Tempst. “For putative biomarkers, one may observe no difference in quantitative levels of proteins, while at the same time, there may be substantial differences in enzymatic activity.”

In a preliminary prostate cancer study, the team found a significant difference

  • in activity levels of exopeptidases in serum from patients with metastatic prostate cancer
  • as compared to primary tumor-bearing individuals and normal healthy controls.

However, there were no differences in amounts of the target protein, and this potential biomarker would have been missed if quantitative levels of protein had been the only criterion of selection.

It is frequently stated that “practical fusion energy is 30 years in the future and always will be.” The same might be said of functional, practical biomarkers that can pass muster with the FDA. But splice variation represents a new handle on this vexing problem. It appears that we are seeing the emergence of a new approach that may finally yield definitive diagnostic tests, detectable in serum and urine samples.

Part 7. Epigenetics and Drug Metabolism

DNA Methylation Rules: Studying Epigenetics with New Tools

The tools to unravel the epigenetic control mechanisms that influence how cells control access of transcriptional proteins to DNA are just beginning to emerge.

Patricia Fitzpatrick Dimond, Ph.D.

http://www.genengnews.com/media/images/AnalysisAndInsight/Feb7_2013_24454248_GreenPurpleDNA_EpigeneticsToolsII3576166141.jpg

New tools may help move the field of epigenetic analysis forward and potentially unveil novel biomarkers for cellular development, differentiation, and disease.

DNA sequencing has had the power of technology behind it as novel platforms to produce more sequencing faster and at lower cost have been introduced. But the tools to unravel the epigenetic control mechanisms that influence how cells control access of transcriptional proteins to DNA are just beginning to emerge.

Among these mechanisms, DNA methylation, or the enzymatically mediated addition of a methyl group to cytosine or adenine dinucleotides,

  • serves as an inherited epigenetic modification that
  • stably modifies gene expression in dividing cells.

The unique methylomes are largely maintained in differentiated cell types, making them critical to understanding the differentiation potential of the cell.

In the DNA methylation process, cytosine residues in the genome are enzymatically modified to 5-methylcytosine,

  • which participates in transcriptional repression of genes during development and disease progression.

5-methylcytosine can be further enzymatically modified to 5-hydroxymethylcytosine by the TET family of methylcytosine dioxygenases. DNA methylation affects gene transcription by physically

  • interfering with the binding of proteins involved in gene transcription.

Methylated DNA may be bound by methyl-CpG-binding domain proteins (MBDs) that can

  • then recruit additional proteins. Some of these include histone deacetylases and other chromatin remodeling proteins that modify histones, thereby
  • forming compact, inactive chromatin, or heterochromatin.

While DNA methylation doesn’t change the genetic code,

  • it influences chromosomal stability and gene expression.

Epigenetics and Cancer Biomarkers

multistage chemical carcinogenesis

And because of the increasing recognition that DNA methylation changes are involved in human cancers, scientists have suggested that these epigenetic markers may provide biological markers for cancer cells, and eventually point toward new diagnostic and therapeutic targets. Cancer cell genomes display genome-wide abnormalities in DNA methylation patterns,

  • some of which are oncogenic and contribute to genome instability.

In particular, de novo methylation of tumor suppressor gene promoters

  • occurs frequently in cancers, thereby silencing them and promoting transformation.

Cytosine hydroxymethylation (5-hydroxymethylcytosine, or 5hmC), the aforementioned DNA modification resulting from the enzymatic conversion of 5mC into 5-hydroxymethylcytosine by the TET family of oxygenases, has been identified

  • as another key epigenetic modification marking genes important for
  • pluripotency in embryonic stem cells (ES), as well as in cancer cells.

The base 5-hydroxymethylcytosine was recently identified as an oxidation product of 5-methylcytosine in mammalian DNA. In 2011, using sensitive and quantitative methods to assess levels of 5-hydroxymethyl-2′-deoxycytidine (5hmdC) and 5-methyl-2′-deoxycytidine (5mdC) in genomic DNA, scientists at the Department of Cancer Biology, Beckman Research Institute of the City of Hope, Duarte, California investigated

  • whether levels of 5hmC can distinguish normal tissue from tumor tissue.

They showed that in squamous cell lung cancers, levels of 5hmdC showed

  • up to five-fold reduction compared with normal lung tissue.

In brain tumors, 5hmdC showed an even more drastic reduction

  • with levels up to more than 30-fold lower than in normal brain,
  • but 5hmdC levels were independent of mutations in isocitrate dehydrogenase-1 (IDH1), whose mutant forms indirectly impair the TET-mediated production of 5hmC.

Immunohistochemical analysis indicated that 5hmC is “remarkably depleted” in many types of human cancer, and

  • there was an inverse relationship between 5hmC levels and cell proliferation, with a lack of 5hmC in proliferating cells.

Their data suggest that 5hmdC is strongly depleted in human malignant tumors,

  • a finding that adds another layer of complexity to the aberrant epigenome found in cancer tissue.

In addition, a lack of 5hmC may become a useful biomarker for cancer diagnosis.

Enzymatic Mapping

But according to New England Biolabs’ Sriharsa Pradhan, Ph.D., methods for distinguishing 5mC from 5hmC and analyzing and quantitating the cell’s entire “methylome” and “hydroxymethylome” remain less than optimal.

The protocol for bisulphite conversion to detect methylation remains the “gold standard” for DNA methylation analysis. This method is generally followed by PCR analysis for single-nucleotide resolution to determine methylation across the DNA molecule. According to Dr. Pradhan, “bisulphite conversion does not distinguish 5mC and 5hmC.

Recently we found an enzyme, a unique DNA modification-dependent restriction endonuclease, AbaSI, which can

  • decode the hydroxymethylome of the mammalian genome.

You can easily find out where the hydroxymethyl regions are.”
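
As a rough illustration of why bisulphite data are interpreted this way (a generic sketch, not NEB's workflow): after conversion, unmethylated C is read as T while 5mC and 5hmC are both protected and still read as C, so only the combined modification level at a reference cytosine can be estimated from the C fraction of the reads covering it.

```python
# Toy methylation call at a single reference cytosine after bisulphite conversion.
# Unmethylated C is deaminated and sequenced as T; 5mC and 5hmC are protected
# and read as C (so bisulphite alone cannot tell the two apart).
reads_at_position = ["C", "T", "C", "C", "T", "C", "C", "T"]  # hypothetical reads

c_count = reads_at_position.count("C")
coverage = len(reads_at_position)
methylation_level = c_count / coverage   # combined 5mC + 5hmC signal
print(f"{methylation_level:.2f}")        # 0.62
```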

AbaSI recognizes 5-glucosylated methylcytosine (5gmC) with high specificity compared to 5mC and 5hmC, and

  • cleaves at a narrow range of distances away from the recognized modified cytosine.

By mapping the cleaved ends, the exact 5hmC location can, the investigators reported, be determined.
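
A schematic sketch of the mapping logic (the offsets and coordinates are made up for illustration; the published Aba-seq analysis is more involved): each sequenced cleavage end is projected back by the enzyme's characteristic cut-site offset, and positions supported by many ends become 5hmC calls.

```python
# Toy back-projection of AbaSI cleavage ends onto candidate 5hmC positions.
# Hypothetical assumption: the enzyme cuts 11-13 bp downstream of the
# glucosylated 5hmC (real offsets should be taken from the AbaSI literature).
from collections import Counter

cleavage_ends = [1011, 1012, 1013, 2543, 2544]   # observed cut coordinates (made up)
offsets = range(11, 14)

support = Counter(end - off for end in cleavage_ends for off in offsets)

# Positions supported by several independent cleavage ends are the strongest calls.
print(support.most_common(3))
```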

Dr. Pradhan and his colleagues at NEB; the Department of Biochemistry, Emory University School of Medicine, Atlanta; and the New England Biolabs Shanghai R&D Center described use of this technique in a paper published in Cell Reports this month, in which they described high-resolution enzymatic mapping of genomic hydroxymethylcytosine in mouse ES cells.

In the current report, the authors used the enzyme technology for the genome-wide high-resolution hydroxymethylome, describing simple library construction even with a low amount of input DNA (50 ng) and the ability to readily detect 5hmC sites with low occupancy.

As a result of their studies, they propose that

factors affecting the local 5mC accessibility to TET enzymes play important roles in 5hmC deposition,

  • including chromatin compaction, nucleosome positioning, and TF binding.
  • The regularly oscillating 5hmC profile around CTCF-binding sites suggests that 5hmC ‘‘writers’’ may be sensitive to the nucleosomal environment.
  • Some transiently stable 5hmCs may indicate a poised epigenetic state or demethylation intermediate, whereas others may suggest a locally accessible chromosomal environment for the TET enzymatic apparatus.

“We were able to do complete mapping in mouse embryonic cells and are pleased about what this enzyme can do and how it works,” Dr. Pradhan said.

And the availability of novel tools that make analysis of the methylome and hydroxymethylome more accessible will move the field of epigenetic analysis forward and potentially unveil novel biomarkers for cellular development, differentiation, and disease.

Patricia Fitzpatrick Dimond, Ph.D. (pdimond@genengnews.com), is technical editor at Genetic Engineering & Biotechnology News.

Epigenetic Regulation of ADME-Related Genes: Focus on Drug Metabolism and Transport

Published: Sep 23, 2013

Epigenetic regulation of gene expression refers to heritable factors that are functionally relevant genomic modifications but that do not involve changes in DNA sequence.

Examples of such modifications include

  • DNA methylation, histone modifications, noncoding RNAs, and chromatin architecture.

Epigenetic modifications are crucial for

packaging and interpreting the genome, and they have fundamental functions in regulating gene expression and activity under the influence of physiologic and environmental factors.

In this issue of Drug Metabolism and Disposition, a series of articles is presented to demonstrate the role of epigenetic factors in regulating

  • the expression of genes involved in drug absorption, distribution, metabolism, and excretion in organ development, tissue-specific gene expression, sexual dimorphism, and in the adaptive response to xenobiotic exposure, both therapeutic and toxic.

The articles also demonstrate that, in addition to genetic polymorphisms, epigenetics may also contribute to wide inter-individual variations in drug metabolism and transport. Identification of functionally relevant epigenetic biomarkers in human specimens has the potential to improve prediction of drug responses based on a patient’s epigenetic profiles.

http://www.technologynetworks.com/Metabolomics/news.aspx?ID=157804

This study is published online in Drug Metabolism and Disposition

Part 8.  Pictorial Maps

 Prediction of intracellular metabolic states from extracellular metabolomic data

MK Aurich, G Paglia, Ottar Rolfsson, S Hrafnsdottir, M Magnusdottir, MM Stefaniak, BØ Palsson, RMT Fleming & Ines Thiele

Metabolomics Aug 14, 2014;

http://dx.doi.org:/10.1007/s11306-014-0721-3

http://link.springer.com/article/10.1007/s11306-014-0721-3/fulltext.html#Sec1

http://link.springer.com/static-content/images/404/art%253A10.1007%252Fs11306-014-0721-3/MediaObjects/11306_2014_721_Fig1_HTML.gif

Metabolic models can provide a mechanistic framework

  • to analyze information-rich omics data sets, and are
  • increasingly being used to investigate metabolic alterations in human diseases.

An expression of the altered metabolic pathway utilization is the selection of metabolites consumed and released by cells. However, methods for the

  • inference of intracellular metabolic states from extracellular measurements in the context of metabolic models remain underdeveloped compared to methods for other omics data.

Herein, we describe a workflow for such an integrative analysis

  • emphasizing extracellular metabolomics data.

We demonstrate,

  • using the lymphoblastic leukemia cell lines Molt-4 and CCRF-CEM,

how our methods can reveal differences in cell metabolism. Our models explain metabolite uptake and secretion by predicting

  • a more glycolytic phenotype for the CCRF-CEM model and
  • a more oxidative phenotype for the Molt-4 model,
  • which was supported by our experimental data.

Gene expression analysis revealed altered expression of gene products at

  • key regulatory steps in those central metabolic pathways, and

literature query emphasized the role of these genes in cancer metabolism.

Moreover, in silico gene knock-outs identified unique

  •  control points for each cell line model, e.g., phosphoglycerate dehydrogenase for the Molt-4 model.

Thus, our workflow is well suited to the characterization of cellular metabolic traits based on

  • extracellular metabolomic data, and it allows the integration of multiple omics data sets
  • into a cohesive picture based on a defined model context.

Keywords: Constraint-based modeling · Metabolomics · Multi-omics · Metabolic network · Transcriptomics

1 Introduction

Modern high-throughput techniques have increased the pace of biological data generation. Also referred to as the ‘‘omics avalanche’’, this wealth of data provides great opportunities for metabolic discovery. Omics data sets

  • contain a snapshot of almost the entire repertoire of mRNA, protein, or metabolites at a given time point or

under a particular set of experimental conditions. Because of the high complexity of the data sets,

  • computational modeling is essential for their integrative analysis.

Currently, such data analysis is a bottleneck in the research process and methods are needed to facilitate the use of these data sets, e.g., through meta-analysis of data available in public databases [e.g., the human protein atlas (Uhlen et al. 2010) or the gene expression omnibus (Barrett et al.  2011)], and to increase the accessibility of valuable information for the biomedical research community.

Constraint-based modeling and analysis (COBRA) is

  • a computational approach that has been successfully used to
  • investigate and engineer microbial metabolism through the prediction of steady-states (Durot et al.2009).

The basis of COBRA is network reconstruction: networks are assembled in a bottom-up fashion based on

  • genomic data and extensive
  • organism-specific information from the literature.

Metabolic reconstructions capture information on the

  • known biochemical transformations taking place in a target organism
  • to generate a biochemical, genetic and genomic knowledge base (Reed et al. 2006).

Once assembled, a

  • metabolic reconstruction can be converted into a mathematical model (Thiele and Palsson 2010), and
  • model properties can be interrogated using a great variety of methods (Schellenberger et al. 2011).

The ability of COBRA models

  • to represent genotype–phenotype and environment–phenotype relationships arises
  • through the imposition of constraints, which
  • limit the system to a subset of possible network states (Lewis et al. 2012).

Currently, COBRA models exist for more than 100 organisms, including humans (Duarte et al. 2007; Thiele et al. 2013).
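
For readers who have not used COBRA tools in practice, a minimal flux balance analysis sketch with the Python package cobrapy and its bundled E. coli core model; the bounds are illustrative assumptions, and the BiGG-style reaction identifiers may differ between model versions.

```python
# Minimal constraint-based analysis with cobrapy (pip install cobra).
from cobra.io import load_model

model = load_model("textbook")   # small bundled E. coli core model

# Constraints restrict the network to a subset of feasible steady states:
# here we cap glucose and oxygen uptake (illustrative bounds, mmol/gDW/h;
# uptake is encoded as a negative exchange flux).
model.reactions.get_by_id("EX_glc__D_e").lower_bound = -10.0
model.reactions.get_by_id("EX_o2_e").lower_bound = -20.0

solution = model.optimize()      # maximize the default biomass objective
print(solution.objective_value)  # predicted growth rate under these constraints
```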

Since the first human metabolic reconstruction was described [Recon 1 (Duarte et al. 2007)],

  • biomedical applications of COBRA have increased (Bordbar and Palsson 2012).

One way to contextualize networks is to

  • define their system boundaries according to the metabolic states of the system, e.g., disease or dietary regimes.

The consequences of the applied constraints can

  • then be assessed for the entire network (Sahoo and Thiele 2013).

Additionally, omics data sets have frequently been used

  • to generate cell-type or condition-specific metabolic models.

Models exist for specific cell types, such as

  1. enterocytes (Sahoo and Thiele2013),
  2. macrophages (Bordbar et al. 2010),
  3. adipocytes (Mardinoglu et al. 2013),
  4. even multi-cell assemblies that represent the interactions of brain cells (Lewis et al. 2010).

All of these cell-type-specific models, except the enterocyte reconstruction,

  • were generated based on omics data sets.

Cell-type-specific models have been used to study

  • diverse human disease conditions.

For example, an adipocyte model was generated using

  • transcriptomic, proteomic, and metabolomics data.

This model was subsequently used to investigate metabolic alterations in adipocytes

  • that would allow for the stratification of obese patients (Mardinoglu et al. 2013).

The biomedical applications of COBRA have included

  1. cancer metabolism (Jerby and Ruppin, 2012).
  2. predicting drug targets (Folger et al. 2011; Jerby et al. 2012).

A cancer model was generated using

  • multiple gene expression data sets and subsequently used
  • to predict synthetic lethal gene pairs as potential drug targets
  • selective for the cancer model, but non-toxic to the global model (Recon 1),

a consequence of the reduced redundancy in the cancer specific model (Folger et al. 2011).

In a follow-up study, lethal synergy between FH and enzymes of the heme metabolic pathway

  • was experimentally validated, resolving the mechanism by which FH-deficient cells,
    e.g., renal-cell cancer cells, survive a non-functional TCA cycle (Frezza et al. 2011).

Contextualized models, which contain only the subset of reactions active in a particular tissue (or cell-) type,

  • can be generated in different ways (Becker and Palsson, 2008; Jerby et al. 2010).

However, the existing algorithms mainly consider

  • gene expression and proteomic data
  • to define the reaction sets that comprise the contextualized metabolic models.

These subsets of reactions are usually defined (a naive sketch of this idea appears after the list below)

  • based on the expression or absence of expression of the genes or proteins (present and absent calls),
  • or inferred from expression values or differential gene expression.
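
A deliberately naive sketch of the present/absent-call idea (not GIMME, iMAT, or any specific published algorithm, and ignoring AND/OR gene-reaction logic): reactions whose gene annotations are all called absent are dropped from the global model. The gene identifiers below are hypothetical absent calls, and the bundled E. coli core model stands in for a global reconstruction.

```python
# Naive expression-based pruning of a global model with cobrapy.
from cobra.io import load_model

model = load_model("textbook")            # stand-in for a global reconstruction
absent_genes = {"b0008", "b0114"}         # hypothetical "absent" calls

to_remove = [
    rxn for rxn in model.reactions
    # drop a reaction only if it has gene annotations and every gene is absent
    if rxn.genes and {g.id for g in rxn.genes} <= absent_genes
]
model.remove_reactions(to_remove, remove_orphans=True)
print(f"removed {len(to_remove)} reactions, {len(model.reactions)} remain")
```
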

Comprehensive reviews of the methods are available (Blazier and Papin, 2012; Hyduke et al. 2013). Only the compilation of a large set of omics data sets

  • can result in a tissue (or cell-type) specific metabolic model, whereas

the representation of one particular experimental condition is achieved

  • through the integration of omics data set generated from one experiment only (condition-specific cell line model).

Recently, metabolomic data sets have become more comprehensive and

  • using these data sets allows direct determination of the metabolic network components (the metabolites).

Additionally, metabolomics has proven to be stable, relatively inexpensive, and highly reproducible (Antonucci et al. 2012). These factors make metabolomic data sets particularly valuable for

  • interrogation of metabolic phenotypes.

Thus, the integration of these data sets is now an active field of research (Li et al. 2013; Mo et al. 2009; Paglia et al. 2012b; Schmidt et al. 2013).

Generally, metabolomic data can be incorporated into metabolic networks as

  • qualitative, quantitative, and thermodynamic constraints (Fleming et al. 2009; Mo et al. 2009).

Mo et al. used metabolites detected in the

  • spent medium of yeast cells to determine intracellular flux states through a sampling analysis (Mo et al. 2009),
  • which allowed unbiased interrogation of the possible network states (Schellenberger and Palsson 2009) and
  • prediction of internal pathway use.

Modes of transcriptional regulation during the YMC
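
In the same spirit, a minimal sketch of turning spent-medium measurements into exchange-reaction bounds with cobrapy. The metabolite rates and BiGG-style exchange identifiers are made up, and the bundled E. coli core model is only a stand-in for the Recon-derived models discussed here.

```python
# Map exo-metabolomic measurements onto exchange-reaction bounds (illustrative).
from cobra.io import load_model

model = load_model("textbook")   # stand-in model

# Net rates in mmol/gDW/h: negative = consumed from the medium, positive = secreted.
measured_exchange = {"EX_glc__D_e": -8.5, "EX_lac__D_e": 14.0}   # hypothetical values

for rxn_id, rate in measured_exchange.items():
    rxn = model.reactions.get_by_id(rxn_id)
    if rate < 0:      # consumed: allow uptake up to the measured rate
        rxn.lower_bound, rxn.upper_bound = rate, 0.0
    else:             # secreted: cap net secretion at the measured rate
        rxn.lower_bound, rxn.upper_bound = 0.0, rate
print("constrained:", list(measured_exchange))
```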

Such analyses have also been used to reveal the effects of

  1. enzymopathies on red blood cells (Price et al. 2004),
  2. to study effects of diet on diabetes (Thiele et al. 2005) and
  3. to define macrophage metabolic states (Bordbar et al. 2010).

This type of analysis is available as a function in the COBRA toolbox (Schellenberger et al. 2011).

In this study, we established a workflow

  • for the generation and analysis of condition-specific metabolic cell line models
  • that can facilitate the interpretation of metabolomic data.

Our modeling yields meaningful predictions regarding

  • metabolic differences between two lymphoblastic leukemia cell lines (Fig. 1A).

Fig. 1


A Combined experimental and computational pipeline to study human metabolism.

  1. Experimental work and omics data analysis steps precede computational modeling.
  2. Model predictions are validated based on targeted experimental data.
  3. Metabolomic and transcriptomic data are used for model refinement and submodel extraction.
  4. Functional analysis methods are used to characterize the metabolism of the cell-line models and compare it to additional experimental data.
  5. The validated models are subsequently used for the prediction of drug targets.

B Uptake and secretion pattern of model metabolites. All metabolite uptakes and secretions that were mapped during model generation are shown.

  • Metabolite uptakes are depicted on the left, and
  • secreted metabolites are shown on the right.
  1. A number of metabolite exchanges mapped to the model were unique to one cell line.
  2. Differences between cell lines were used to set quantitative constraints for the sampling analysis.

C Statistics about the cell line-specific network generation.

D Quantitative constraints.

For the sampling analysis, an additional set of constraints was imposed on the cell line specific models,

  • emphasizing the differences in metabolite uptake and secretion between cell lines.

Higher uptake of a metabolite was allowed

  • in the model of the cell line that consumed more of the metabolite in vitro, whereas
  • the supply was restricted for the model with lower in vitro uptake.

This was done by establishing the same ratio between the models’ bounds as detected in vitro.

X denotes the factor (slope ratio) that distinguishes the bounds, and

  • which was individual for each metabolite.

(a) The uptake of a metabolite could be x times higher in CCRF-CEM cells,

(b) the metabolite uptake could be x times higher in Molt-4,

(c) metabolite secretion could be x times higher in CCRF-CEM, or

(d) metabolite secretion could be x times higher in Molt-4 cells. LOD, limit of detection.

The consequence of the adjustment was, in case of uptake, that one model was constrained to a lower metabolite uptake (A, B), and the difference depended on the ratio detected in vitro. In case of secretion, one model

  • had to secrete more of the metabolite, and again
  • the difference depended on the experimental difference detected between the cell lines.
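
A small numerical sketch of the slope-ratio adjustment described above (the numbers are invented; only the arithmetic reflects the text): if CCRF-CEM consumed a metabolite x times faster than Molt-4 in vitro, the Molt-4 model's uptake bound is tightened by the same factor.

```python
# Illustrative case (a) from Fig. 1D: uptake x times higher in CCRF-CEM.
max_uptake = -10.0   # mmol/gDW/h; uptake is a negative exchange flux
x = 1.34             # hypothetical in vitro slope ratio (CCRF-CEM / Molt-4)

ccrf_cem_lower_bound = max_uptake     # the faster-consuming model keeps the full bound
molt4_lower_bound = max_uptake / x    # the slower-consuming model is restricted
print(ccrf_cem_lower_bound, round(molt4_lower_bound, 2))   # -10.0 -7.46
```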

2 Results

We set up a pipeline that could be used to infer intracellular metabolic states

  • from semi-quantitative data regarding metabolites exchanged between cells and their environment.

Our pipeline combined the following four steps:

  1. data acquisition,
  2. data analysis,
  3. metabolic modeling and
  4. experimental validation of the model predictions (Fig. 1A).

We demonstrated the pipeline and its potential to predict metabolic alterations in diseases such as cancer based on

  • two lymphoblastic leukemia cell lines.

The resulting Molt-4 and CCRF-CEM condition-specific cell line models could explain

  • metabolite uptake and secretion
  • by predicting the distinct utilization of central metabolic pathways by the two cell lines.
  • The CCRF-CEM model resembled a more glycolytic (commonly referred to as ‘Warburg’) phenotype, whereas
  • our model predicted a more respiratory phenotype for the Molt-4 model.

We found these predictions to be in agreement with measured gene expression differences

  • at key regulatory steps in the central metabolic pathways, and they were also
  • consistent with additional experimental data regarding the energy and redox states of the cells.

After a brief discussion of the data generation and analysis steps, the results derived from model generation and analysis will be described in detail.

2.1 Pipeline for generation of condition-specific metabolic cell line models

integration of exometabolomic (EM) data

2.1.1 Generation of experimental data

We monitored the growth and viability of lymphoblastic leukemia cell lines in serum-free medium (File S2, Fig. S1). Multiple omics data sets were derived from these cells. Extracellular metabolomics (exo-metabolomic) data,

  • comprising measurements of the metabolites in the spent medium of the cell cultures (Paglia et al. 2012a),
  • were collected along with transcriptomic data, and these data sets were used to construct the models.

2.1.4 Condition-specific models for CCRF-CEM and Molt-4 cells

To determine whether we had obtained two distinct models, we evaluated the reactions, metabolites, and genes of the two models. Both the Molt-4 and CCRF-CEM models contained approximately half of the reactions and metabolites present in the global model (Fig. 1C). They were very similar to each other in terms of their reactions, metabolites, and genes (File S1, Table S5A–C).

(1) The Molt-4 model contained seven reactions that were not present in the CCRF-CEM model (Co-A biosynthesis pathway and exchange reactions).
(2) The CCRF-CEM contained 31 unique reactions (arginine and proline metabolism, vitamin B6 metabolism, fatty acid activation, transport, and exchange reactions).
(3) There were 2 and 15 unique metabolites in the Molt-4 and CCRF-CEM models, respectively (File S1, Table S5B).
(4) Approximately three quarters of the global model genes remained in the condition-specific cell line models (Fig. 1C).
(5) The Molt-4 model contained 15 unique genes, and the CCRF-CEM model had 4 unique genes (File S1, Table S5C).
(6) Both models lacked NADH dehydrogenase (complex I of the electron transport chain—ETC), which was determined by the absence of expression of a mandatory subunit (NDUFB3, Entrez gene ID 4709).

Rather, the ETC was fueled by FADH2 originating from succinate dehydrogenase and from fatty acid oxidation, which, through flavoprotein electron transfer,

  • could contribute to the same ubiquinone pool as complex I and complex II (succinate dehydrogenase).

Despite their different in vitro growth rates (which differed by 11 %, see File S2, Fig. S1) and
  • differences in exo-metabolomic data (Fig. 1B) and transcriptomic data,
  • the internal networks were largely conserved in the two condition-specific cell line models.

2.1.5 Condition-specific cell line models predict distinct metabolic strategies

Despite the overall similarity of the metabolic models, differences in their cellular uptake and secretion patterns suggested distinct metabolic states in the two cell lines (Fig. 1B and see “Materials and methods” section for more detail). To interrogate the metabolic differences, we sampled the solution space of each model using an Artificial Centering Hit-and-Run (ACHR) sampler (Thiele et al. 2005). For this analysis, additional constraints were applied, emphasizing the quantitative differences in commonly uptaken and secreted metabolites. The maximum possible uptake and maximum possible secretion flux rates were reduced
according to the measured relative differences between the cell lines (Fig. 1D, see “Materials and methods” section).

We plotted the number of sample points containing a particular flux rate for each reaction. The resulting binned histograms can be understood as representing the probability that a particular reaction can have a certain flux value.
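
A compact sketch of this kind of sampling comparison with cobrapy's ACHR sampler and matplotlib; two copies of the bundled E. coli core model stand in for the two cell line models, and the tightened glucose bound is an arbitrary illustration.

```python
# Flux sampling and histogram comparison (illustrative stand-in models).
import matplotlib.pyplot as plt
from cobra.io import load_model
from cobra.sampling import sample

model_hi = load_model("textbook")                                # stand-in for CCRF-CEM
model_lo = load_model("textbook")                                # stand-in for Molt-4
model_lo.reactions.get_by_id("EX_glc__D_e").lower_bound = -7.5   # lower glucose uptake

flux_hi = sample(model_hi, n=2000, method="achr")   # DataFrame: samples x reactions
flux_lo = sample(model_lo, n=2000, method="achr")

# Binned histograms approximate the probability of a reaction carrying a given flux.
plt.hist(flux_hi["PFK"], bins=50, alpha=0.5, label="higher glucose uptake")
plt.hist(flux_lo["PFK"], bins=50, alpha=0.5, label="lower glucose uptake")
plt.xlabel("flux through PFK (mmol/gDW/h)")
plt.legend()
plt.savefig("pfk_sampling.png")
print(flux_hi["PFK"].median(), flux_lo["PFK"].median())
```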

A comparison of the sample points obtained for the Molt-4 and CCRF-CEM models revealed

  • a considerable shift in the distributions, suggesting a higher utilization of glycolysis by the CCRF-CEM model
    (File S2, Fig. S2).

This result was further supported by differences in medians calculated from sampling points (File S1, Table S6).
The shift persisted throughout all reactions of the pathway and was induced by the higher glucose uptake (34 %) from the extracellular medium in CCRF-CEM cells.

The sampling median for glucose uptake was 34 % higher in the CCRF-CEM model than in Molt-4 model (File S2, Fig. S2).

The usage of the TCA cycle was also distinct in the two condition-specific cell-line models (Fig. 2). Interestingly,
the models used succinate dehydrogenase differently (Figs. 2, 3).

TCA_reactions

The Molt-4 model utilized an associated reaction to generate FADH2, whereas

  • in the CCRF-CEM model, the histogram was shifted in the opposite direction,
  • toward the generation of succinate.

Additionally, there was a higher efflux of citrate toward amino acid and lipid metabolism in the CCRF-CEM model (Fig. 2). There was higher flux through anaplerotic and cataplerotic reactions in the CCRF-CEM model than in the Molt-4 model (Fig. 2); these reactions include

(1) the efflux of citrate through ATP-citrate lyase,
(2) uptake of glutamine,
(3) generation of glutamate from glutamine,
(4) transamination of pyruvate and glutamate to alanine and to 2-oxoglutarate,
(5) secretion of nitrogen, and
(6) secretion of alanine.

energetics-of-cellular-respiration

The Molt-4 model showed higher utilization of oxidative phosphorylation (Fig. 3), again supported by
elevated median flux through ATP synthase (36 %) and other enzymes, which contributed to higher oxidative metabolism. The sampling analysis therefore revealed different usage of central metabolic pathways by the condition-specific models.

Fig. 2

Differences in the use of the TCA cycle by the CCRF-CEM model (red) and the Molt-4 model (blue).

The table provides the median values of the sampling results. Negative values in histograms and in the table describe reversible reactions with flux in the reverse direction. There are multiple reversible reactions for the transformation of isocitrate and α-ketoglutarate, malate and fumarate, and succinyl-CoA and succinate. These reactions are unbounded, and therefore histograms are not shown. The details of participating cofactors have been removed.

Figure 3.

Molt-4 has higher median flux through ETC reactions II–IV

Atp ATP, cit citrate, adp ADP, pi phosphate, oaa oxaloacetate, accoa acetyl-CoA, coa coenzyme-A, icit isocitrate, αkg α-ketoglutarate, succ-coa succinyl-CoA, succ succinate, fum fumarate, mal malate, oxa oxaloacetate, pyr pyruvate, lac lactate, ala alanine, gln glutamine, ETC electron transport chain

Ingenuity network analysis showing up- (red) and downregulation (green) of miRNAs involved in PC and their target genes

metabolic pathways

Metabolic Systems Research Team

Metabolic control analysis of respiration in human cancer tissue

Metabolome Informatics Research

Modelling of Central Metabolism

N. gaditana metabolic pathway map

protein changes in biological mechanisms



Development Of Super-Resolved Fluorescence Microscopy

 

Author and Curator: Larry H. Bernstein, MD, FCAP

CSO, Leaders in Pharmaceutical Business Intelligence

Development Of Super-Resolved Fluorescence Microscopy

 

Part I. Nobel Prize For Chemistry 2014: Eric Betzig, Stefan W. Hell and William E. Moerner Honored For Development Of Super-Resolved Fluorescence Microscopy

The 2014 Nobel Prize in Chemistry was awarded on 10/08/2014 to
Eric Betzig, Stefan W. Hell and William E. Moerner for
“the development of super-resolved fluorescence microscopy.”

The invention of the electron microscope by Max Knoll and Ernst Ruska at the
Berlin Technische Hochschule in 1931 finally overcame the barrier to higher
resolution that had been imposed by the limitations of visible light. Since then
resolution has defined the progress of the technology.

The ultimate goal was atomic resolution – the ability to see atoms – but this would
have to be approached incrementally over the course of decades. The earliest microscopes merely proved the concept: electron beams could, indeed, be tamed
to provide visible images of matter. By the late 1930s electron microscopes with theoretical resolutions of 10 nm were being designed and produced, and by 1944
this was further reduced to 2 nm. (The theoretical resolution of an optical light microscope is 200 nm.)

Increases in the accelerating voltage of the electron beam accounted for much of
the improvement in resolution. But voltage was not everything. Improvements in electron lens technology minimized aberrations and provided a clearer picture,
which also contributed to improved resolution, as did better vacuum systems and brighter electron guns. So increasing the resolution of electron microscopes was a main driving force throughout the instrument’s development.

With nanoscopy, scientists could observe viruses, proteins and molecules that
are smaller than 0.0000002 metres.

Three researchers won the 2014 Nobel Prize in Chemistry on Wednesday,
October 8, for giving microscopes much sharper vision than was thought possible, letting scientists peer into living cells with unprecedented detail to seek the roots
of disease.  It was awarded to U.S. researchers Eric Betzig and William Moerner
and German scientist Stefan Hell. They found ways to use molecules that glow on demand to overcome what was considered a fundamental limitation for optical microscopes.

Hell, 52, of Germany, is the director at the Max Planck Institute for Biophysical Chemistry and the division head at the German Cancer Research Center in
Heidelberg. He was honored for his work on fluorescence microscopy, a kind
of nano-flashlight where scientists use fluorescent molecules to see parts of a
cell. Later in his career, he developed the STED microscope, which collects light
from “a multitude of small volumes to create a whole.”

Moerner, a 61-year-old professor in chemistry and applied physics at Stanford University in California, is the recipient of the 2008 Wolf Prize in Chemistry, the
2009 Irving Langmuir Award and the 2013 Peter Debye Award. In 1989, he
was the first scientist to be able to measure the light absorption of a single molecule.
This inspired many chemists to begin focusing on single molecules, including Betzig.

Betzig, 54, the group leader at Janelia Farm Research campus at the Howard
Hughes Medical Institute in Virginia, developed new optical imaging tools for
biology. His work involved taking images of the same area multiple times, and illuminating just a few molecules each time. These images were then
superimposed to create a dense super image at the nano level.

The limitation of optical microscopy was thought to have been determined in a calculation published in 1873 that defined the limit of how tiny a detail could be revealed by optical microscopes. Based on experimental evidence and basic principles of physics, Ernst Abbe and Lord Rayleigh defined and formulated
this diffraction-limited resolution in the late 19th century (Abbe, 1873; Rayleigh, 1896). However, only cellular structures and objects that were at least 200 to
350 nm apart could be resolved by light microscopy because the optical resolution
of light microscopy was limited to approximately half of the wavelength of the light used.  Later key innovations—including fluorescence and confocal laser scanning microscopy (CLSM)—made optical microscopy one of the most powerful and
versatile diagnostic tools in modern cell biology. Using highly specific fluorescent labeling techniques such as immunocytochemistry, in situ hybridization, or
fluorescent protein tags, the spatial distribution and dynamics of virtually every subcellular structure, protein, or genomic sequence of interest can be analyzed in chemically fixed or living samples (Conchello and Lichtman, 2005; Giepmans et al., 2006).

The result of their advance is “really a window into the cell which we didn’t have before,” said Catherine Lewis, director of the cell biology and biophysics division
of the National Institute of General Medical Sciences in Bethesda, Maryland.

“You can observe the behavior of individual molecules in living cells in real time.
You can see … molecules moving around inside the cell. You can see them interacting with each other.”

The research of the three men has let scientists study diseases such as
Parkinson’s, Alzheimer’s and Huntington’s at a molecular level, the Royal
Swedish Academy of Sciences said.

Part II. Electron microscopy limitations

Manfred Von Ardenne in Berlin produced the earliest scanning-transmission
electron microscope in 1937. At the University of Toronto in Canada, Cecil Hall, James Hillier, and Albert Prebus, working under the direction of Eli Burton,
produced an advanced 1938 Toronto Model electron microscope that would
later become the basis for Radio Corporation of America’s Model B, the first commercial electron microscope in North America. Ruska at Siemens in
Germany produced the first commercial electron microscope in the world in 1938.

Starting in 1939, scientists in Japan gathered to decide on the best way to build
an electron microscope. This group evolved into the Japan Electron Optics Laboratory (JEOL) that would eventually produce more models and varieties
of electron microscopes than any other company. Hitachi and Toshiba in Japan
also played a major role in the early development process.

The 1960s through the 1990s produced many innovative instruments and trends.
The introduction of the first commercial scanning electron microscopes (SEMs)
in 1965 opened up a new world of analysis for materials scientists. Ultrahigh
voltage TEM instruments (up to 3 MeV at CEMES-LOE/CNRS in Toulouse,
France, and at Hitachi in Tokyo, Japan), in the 1960s and 1970s gave electrons higher energy to penetrate more deeply into thick samples. The evolution and incorporation of other detectors (electron microprobes, electron energy loss spectroscopy (EELS), etc.) made the SEM into a true analytical electron
microscope (AEM) beginning in the 1970s. The development of brighter
electron sources, such as the lanthanum hexaboride filament (LaB6) and the
field emission gun in the 1960s, and their commercialization in the 1970s
brought researchers a brighter source of electrons and with it better imaging
and resolution. Tilting specimen stages permitting examination of the specimen
from different angles aided significantly in the determination of crystal structure.
In the late 1980s and throughout the 1990s, the environmental electron
microscopes that allow scientists to examine samples under more natural
conditions of temperature and pressure have dramatically expanded the
types of samples that can be examined.

In medicine, the EM made a unique contribution to diagnostic anatomic
pathology in renal biopsy analysis. However, the small sample had to be
embedded, and in the early days sections were cut with knives made from broken glass. But even though EM ushered in a new era of molecular pathology, the contribution was limited, despite incremental
improvements.

In the past, the use of microscopes was limited by a physical restriction;
scientists could only see items that were larger than roughly half the
wavelength of light (0.2 micrometers). However, the groundbreaking work
of the Nobel laureates bypassed the maximum resolution of traditional
microscopes and launched optical microscopy into the nanodimension.

Part III. Super resolution fluorescence microscopy

Bo Huang,1,2 Mark Bates,3 and Xiaowei Zhuang1,2,4
Annu Rev Biochem. 2009; 78: 993–1016.
http://dx.doi.org:/10.1146/annurev.biochem.77.061906.092014
PMCID: PMC2835776  NIHMSID: NIHMS179491

Achieving a spatial resolution that is not limited by the diffraction of
light, recent developments of super-resolution fluorescence microscopy
techniques allow the observation of many biological structures not
resolvable in conventional fluorescence microscopy. New advances
in these techniques now give them the ability to image three-dimensional
(3D) structures, measure interactions by multicolor colocalization, and
record dynamic processes in living cells at the nanometer scale. It is
anticipated that super-resolution fluorescence microscopy will become
a widely used tool for cell and tissue imaging to provide previously
unobserved details of biological structures and processes.

Keywords: Sub-diffraction limit, single-molecule, multicolor imaging,
three-dimensional imaging, live cell imaging, single-particle tracking,
photoswitchable probe

Among the various microscopy techniques, fluorescence microscopy is
one of the most widely used because of its two principal advantages:
Specific cellular components may be observed through molecule-specific
labeling, and light microscopy allows the observation of structures inside
a live sample in real time. Compared to other imaging techniques such
as electron microscopy (EM), however, conventional fluorescence
microscopy is limited by relatively low spatial resolution because of the
diffraction of light. This diffraction limit, about 200–300 nm in the lateral
direction and 500–700 nm in the axial direction, is comparable to or larger
than many subcellular structures, leaving them too small to be observed in
detail. In recent years, a number of “super-resolution” fluorescence microscopy techniques have been invented to overcome the diffraction barrier, including techniques that employ nonlinear effects to sharpen the point-spread function
of the microscope, such as stimulated emission depletion (STED) microscopy
(1, 2), related methods using other reversible saturable optically linear
fluorescence transitions (RESOLFTs) (3), and saturated structured-illumination microscopy (SSIM) (4), as well as techniques that are based on the localization
of individual fluorescent molecules, such as stochastic optical reconstruction microscopy (STORM) (5), photoactivated localization microscopy (PALM) (6),
and fluorescence photoactivation localization microscopy (FPALM) (7). These methods have yielded an order of magnitude improvement in spatial resolution
in all three dimensions over conventional light microscopy.

THE RESOLUTION LIMIT IN OPTICAL MICROSCOPY

Microscopes can be used to visualize fine structures in a sample by providing
a magnified image. However, even an arbitrarily high magnification does not
translate into the ability to see infinitely small details. Instead, the resolution
of light microscopy is limited because light is a wave and is subject to diffraction.

The diffraction limit

An optical microscope can be thought of as a lens system that produces a
magnified image of a small object. In this imaging process, light rays from
each point on the object converge to a single point at the image plane. However,
the diffraction of light prevents exact convergence of the rays, causing a sharp
point on the object to blur into a finite-sized spot in the image. The three-
dimensional (3D) intensity distribution of the image of a point object is called
the point spread function (PSF). The size of the PSF determines the resolution
of the microscope: Two points closer than the full width at half-maximum
(FWHM) of the PSF will be difficult to resolve because their images overlap substantially.

The FWHM of the PSF in the lateral directions (the x–y directions perpendicular
to the optical axis) can be approximated as Δxy ≈ 0.61λ / NA, where λ is the wavelength of the light, and NA is the numerical aperture of the objective
defined as NA = n sinα, with n being the refractive index of the medium and
α being the half-cone angle of the focused light produced by the objective.
The axial width of the PSF is about 2–3 times as large as the lateral width
for ordinary high NA objectives. When imaging with visible light (λ ≈ 550 nm),
the commonly used oil immersion objective with NA = 1.40 yields a PSF with
a lateral size of ~200 nm and an axial size of ~500 nm in a refractive index-
matched medium (Figure 1) (8).

Figure 1

The PSF of a common oil immersion objective with NA = 1.40, showing the
focal spot of 550 nm light in a medium with refractive index n = 1.515. The
intensity distribution in the x-z plane of the focus spot is computed numerically.

PSF of oil immersion microscope
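
Plugging the quoted numbers into these expressions (a simple numerical check, nothing beyond the formulas above):

```python
import math

wavelength_nm = 550.0                 # visible light, as in the example
n = 1.515                             # refractive index of immersion oil
NA = 1.40                             # numerical aperture of the objective
alpha = math.degrees(math.asin(NA / n))

lateral_fwhm = 0.61 * wavelength_nm / NA   # ~240 nm with this prefactor; the ~200 nm
                                           # quoted above reflects a slightly different
                                           # resolution criterion
print(f"half-cone angle ~{alpha:.0f} deg, lateral PSF ~{lateral_fwhm:.0f} nm")
```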

Because the loss of high-frequency spatial information in optical microscopy
results from the diffraction of light when it propagates through a distance larger
than the wavelength of the light (far field), near-field microscopy is one of the
earliest approaches sought to achieve high spatial resolution. By exciting the fluorophores or detecting the signal through the nonpropagating light near the fluorophore, high-resolution information can be retained. Near-field scanning optical microscopy (NSOM) acquires an image by scanning a sharp probe tip across
the sample, typically providing a resolution of 20–50 nm (9–11). Wide-field
imaging has also been recently demonstrated in the near-field regime using
a super lens with negative refractive index (12, 13). However, the short range
of the near-field region (tens of nanometers) compromises the ability of light microscopy to look into a sample, limiting the application of near-field microscopy
to near-surface features only. This limit highlights the need to develop far-field
high-resolution imaging methods.

Among far-field fluorescence microscopy techniques, confocal and multiphoton microscopy are among the most widely used to moderately enhance the spatial resolution (14, 15). By combining a focused laser for excitation and a pinhole for detection, confocal microscopy can, in principle, have a factor of √2 improvement
in the spatial resolution. In multiphoton microscopy, nonlinear absorption processes reduce the effective size of the excitation PSF. However, this gain in the PSF size
is counteracted by the increased wavelength of the excitation light. Thus, instead
of improving the resolution, the main advantage of confocal and multi-photon microscopy over wide-field microscopy is the reduction of out-of-focus fluorescence background, allowing optical sectioning in 3D imaging.

Two techniques, 4Pi and I5M microscopy, approach this ideal situation by using
two opposing objectives for excitation and/or detection (16, 17). By acquiring
multiple images with illumination patterns of different phases and orientations,
a high-resolution image can be reconstructed. Because the illumination pattern
itself is also limited by the diffraction of light, structured illumination microscopy
(SIM) is only capable of doubling the spatial resolution by combining two diffraction-limited sources of information.  The best achievable result using these methods
would be an isotropic PSF with an additional factor of 2 in resolution improvement. This would correspond to ~100-nm image resolution in all three dimensions, as
has been demonstrated by the I5S technique, which combines I5M and SIM (22). Albeit a significant improvement, this resolution is still fundamentally limited by
the diffraction of light.

SUPER RESOLUTION FLUORESCENCE MICROSCOPY BY SPATIALLY PATTERNED EXCITATION

One approach to attain a resolution far beyond the limit of diffraction, i.e., to
realize super-resolution microscopy, is to introduce sub-diffraction-limit features
in the excitation pattern so that small-length-scale information can be read out.
We refer to this approach, including STED, RESOLFT, and SSIM, as super-
resolution microscopy by spatially patterned excitation or the “patterned excitation” approach.

The concept of STED microscopy was first proposed in 1994 (1) and subsequently demonstrated experimentally (2). Simply speaking, it uses a second laser (STED laser) to suppress the fluorescence emission from the fluorophores located off the center of the excitation. This suppression is achieved through stimulated emission: When an excited-state fluorophore encounters a photon that matches the energy difference between the excited and the ground state, it can be brought back to
the ground state through stimulated emission before spontaneous fluorescence emission occurs. This process effectively depletes excited-state fluorophores
capable of fluorescence emission (Figure 2a,b).

Figure 2

The principle of STED microscopy. (a) The process of stimulated emission. A
ground state (S0) fluorophore can absorb a photon from the excitation light and
jump to the excited state (S1).

STED microscopy

The pattern of the STED laser is typically generated by inserting a phase mask
into the light path to modulate its phase-spatial distribution (Figure 2b). One such phase mask generates a donut-shaped STED pattern in the xy plane (Figure 2c)
and has provided an xy resolution of ~30 nm (24). STED can also be employed
in 4Pi microscopy (STED-4Pi), resulting in an axial resolution of 30–40 nm (25). STED has been applied to biological samples either immuno-stained with
fluorophore labeled antibodies (26) or genetically tagged with fluorescent
proteins (FPs) (27). Dyes with high photostability under STED conditions and
large stimulated emission cross sections in the visible to near infrared (IR) range
are preferred. Atto 532 and Atto 647N are among the most often used dyes for
STED microscopy.

Stimulated emission is not the only mechanism capable of suppressing
undesired fluorescence emission. A more general scheme using saturable
depletion to achieve super resolution has been formalized with the name
RESOLFT microscopy (3). This scheme employs fluorescent probes that
can be reversibly photoswitched between a fluorescent on state and a dark
off state. The off state can be the ground state of a fluorophore as in the
case of STED, the triplet state as in ground-state-depletion microscopy
(28, 29), or the dark state of a reversibly photoswitchable fluorophore (30). RESOLFT has been demonstrated using the reversibly photoswitchable
fluorescent protein asFP595, which leads to a resolution better than 100 nm
at a depletion laser intensity of 600 W/cm² (30).

The same concept of employing saturable processes can also be applied
to SIM by introducing sub-diffraction-limit spatial features into the excitation
pattern. SSIM has been demonstrated using the saturation of fluorescence
emission, which occurs when a fluorophore is illuminated by a very high
intensity of excitation light (4). Under this strong excitation, it is immediately
pumped to the excited state each time it returns to the ground state. In SSIM,
where the sample is illuminated with a sinusoidal pattern of strong excitation
light, the peaks of the excitation pattern can be clipped by fluorescence
saturation and become flat, whereas fluorescence emission is still absent
from the zero points in the valleys (Figure 3a). These effects add higher order
spatial frequencies to the excitation pattern. Mixing this excitation pattern with
the high-frequency spatial features in the sample can effectively bring the sub-diffraction-limit spatial features into the detection range of the microscopy
(Figure 3b).

Figure 3

The principle of SSIM. (a) The generation of the illumination pattern. A
diffractive grating in the excitation path splits the light into two beams. Their interference after emerging from the objective and reaching the sample creates
a sinusoidal illumination

SSIM

Although the image of a single fluorophore, which resembles the PSF, is a
finite-sized spot, the precision of determining the fluorophores position from
its image can be much higher than the diffraction limit, as long as the image
results from multiple photons emitted from the fluorophore. Fitting an image
consisting of N photons can be viewed as N measurements of the fluorophore position, each with an uncertainty determined by the PSF (8), thus leading to
a localization precision approximated by:

Δloc ≈ Δ / √N

where Δloc is the localization precision and Δ is the size of the PSF. This
scaling of the localization precision with the photon number allows super-
resolution microscopy with a resolution not limited by the diffraction of light.
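
A one-line numerical check of the √N scaling (idealized: it ignores background, pixelation, and drift, which is why the practical resolutions quoted below are ~20 nm rather than a few nanometers):

```python
import math

psf_width_nm = 250.0        # diffraction-limited PSF size (Delta)
photons = 6000              # e.g. one switching event of a bright cyanine dye

precision_nm = psf_width_nm / math.sqrt(photons)   # Delta_loc ~ Delta / sqrt(N)
print(f"~{precision_nm:.1f} nm")                   # ~3.2 nm in this idealized limit
```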

High-precision localization of bright light has reached a precision as high
as ~1 Å (33). Taking advantage of single-molecule detection and imaging
(34, 35), nanometer localization precision has been achieved for single
fluorescent molecules (36).

Using fluorescent probes that can switch between a fluorescent and a dark
state, a recent invention overcomes this barrier by separating in the time
domain the otherwise spatially overlapping fluorescent images. In this approach, molecules within a diffraction limited region can be activated at different time
points so that they can be individually imaged, localized, and subsequently deactivated (Figure 4). Massively parallel localization is achieved through
wide-field imaging, so that the coordinates of many fluorophores can be
mapped and a super-resolution images subsequently reconstructed. This
concept has been independently conceived and implemented by three labs,
and it was given the names STORM (5), PALM (6), and FPALM (7), respectively.

Iterating the activation and imaging process allows the locations of many
fluorophores to be mapped and a super-resolution image to be constructed
from these fluorophore locations. In the following, we refer to this approach
as super-resolution microscopy by single-molecule localization.

Figure 4

The principle of stochastic optical reconstruction microscopy (STORM), photoactivated localization microscopy (PALM), and fluorescence photo-
activation localization microscopy (FPALM). Different fluorescent probes
marking the sample structure are activated.

STORM

After capturing the images with a digital camera, the point-spread functions
of the individual molecules are localized with high precision based on the
photon output before the probes spontaneously photo-bleach or switch to
a dark state. The positions of localized molecular centers are indicated with
black crosses. The process is repeated in Figures (c) through (e) until all of
the fluorescent probes are exhausted due to photo-bleaching or because the background fluorescence becomes too high. The final super-resolution image
(Figure (f)) is constructed by plotting the measured positions of the fluorescent probes.
http://microscopyu.com/tutorials/flash/superresolution/storm/index.html

The resolution of this technique is limited by the number of photons detected
per photoactivation event, which varies from several hundred for FPs (6) to
several thousand for cyanine dyes such as Cy5 (5, 46). These numbers
theoretically allow more than an order of magnitude improvement in spatial
resolution according to the √N scaling rule. In practice, a lateral resolution
of ~20 nm has been established experimentally using the photoswitchable
cyanine dyes (5, 46). Super-resolution images of biological samples have
been reported with directly labeled DNA structures and immunostained DNA-
protein complexes in vitro (5) as well as with FP-tagged or immunostained
cellular structures (6, 44, 46).

Table 1   Photoswitchable fluorophores used in super resolution
fluorescence microscopy

Photoswitchable fluorophores

Recent advances in super-resolution fluorescence microscopy
(including the capability for 3D, multicolor, live-cell imaging) enable
new applications in biological samples. These technical advances
were made possible through the development of both imaging optics
and fluorescent probes.

  • 3D imaging using the single-molecule localization approach
  • 3D imaging using the patterned excitation approach
  • Multicolor imaging
  • Multicolor imaging using the patterned excitation approach
  • Multicolor imaging using the single-molecule localization approach
  • Live cell imaging

Fluorescence imaging of a live cell has two requirements: specific labeling
of the cell and a time resolution high enough to record the relevant
dynamics in the cell. Many fluorescent proteins and organic dyes, including
cyanine dyes (46) and caged dyes, have been shown to be switchable in live cells.

Because STED has a much smaller PSF than scanning confocal microscopy,
STED inherently takes more time to scan through an image field of the same
size. By increasing the scanning speed and limiting the field of view to a few µm,
Westphal and coworkers observed the Brownian motion of a dense suspension
of nanoparticles at an impressive rate of 80 frames per second (fps) using
STED microscopy (63). More recently, they demonstrated video-rate
(28 fps) imaging of live hippocampal neurons and observed the movement of
individual synaptic vesicles with 60–80-nm resolution (64).

Sub-diffraction-limit imaging of focal adhesion proteins in live cells has recently
been demonstrated (65). A photoswitchable fluorescent protein, EosFP, was used
to label the focal adhesion protein paxillin. A time resolution of ~25–60 seconds
per frame was obtained, and during this time interval approximately 10³
fluorophores were activated and localized per square micrometer, providing
an effective resolution of 60–70 nm by the Nyquist criterion (65). More recently,
super-resolution imaging has also been demonstrated in live bacteria with a
photoswitchable enhanced yellow fluorescent protein (EYFP), allowing the
MreB structure in the cell to be traced (66).
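
The 60–70 nm figure follows directly from the Nyquist criterion: the mean spacing between localized molecules must be no more than half of the smallest resolvable feature. The sketch below reproduces that arithmetic using the localization density quoted above; it is an illustrative calculation, not code from the original study.

```python
import math

def nyquist_resolution_nm(density_per_um2):
    """Nyquist-limited effective resolution (nm) for a 2D localization density.

    Mean molecular spacing ~ 1/sqrt(density); the Nyquist criterion requires
    at least two localizations per resolved feature size.
    """
    spacing_um = 1.0 / math.sqrt(density_per_um2)
    return 2.0 * spacing_um * 1000.0  # convert um to nm

print(f"~{nyquist_resolution_nm(1000):.0f} nm")  # ~63 nm for ~10^3 localizations per um^2
```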

The optical resolution

Optical resolution is the intrinsic ability of a given method to resolve a structure
and can be defined as the ability to distinguish two point sources in proximity.
For the patterned excitation approaches, such as STED, SSIM, and RESOLFT,
the optical resolution is represented by the size of the effective PSF. For the
single-molecule localization approach, such as STORM/PALM/FPALM, the
precision of determining the positions of individual fluorescent probes is the
principal measure of optical resolution.

By using a spatially patterned excitation profile, this approach achieves super
resolution by generating an effective excitation volume with dimensions far
below the diffraction limit. Taking STED as an example, the sharpness of the
PSF results from the saturated depletion of excited-state fluorophores in
the region neighboring the zero point of the STED laser (which coincides with
the focal point of the excitation laser). With increasing STED laser power,
the saturated region expands toward the zero point, but fluorophores at the
zero point are not affected by the STED laser as long as the zero point is strictly
kept at zero intensity. Therefore, a theoretically unlimited gain in spatial resolution
may be achieved if the zero point in the depletion pattern is ideal.
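
This behavior is commonly summarized in the STED literature by the resolution expression Δr ≈ λ / (2 NA √(1 + I/Isat)), which reduces to the Abbe limit when the depletion beam is off and shrinks without bound as the depletion intensity I grows relative to the saturation intensity Isat. The expression and the numbers below are quoted for illustration rather than taken from the text above.

```python
import math

def sted_resolution_nm(wavelength_nm, na, i_over_isat):
    """Approximate STED resolution: the Abbe limit sharpened by sqrt(1 + I/I_sat)."""
    return wavelength_nm / (2.0 * na * math.sqrt(1.0 + i_over_isat))

# Illustrative values (not from the text): 650-nm light, NA 1.4 objective.
for ratio in (0, 10, 100):
    print(f"I/I_sat = {ratio:>3d} -> ~{sted_resolution_nm(650, 1.4, ratio):.0f} nm")
```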

The single-molecule localization approach achieves super resolution through
high precision localization of individual fluorophores. The number of photons
collected from a fluorophore is a principal factor limiting the localization
precision and hence the resolution of the final image.

Several photoswitchable fluorophores have been reported to give thousands
of photons detected per activation event [e.g., 6,000 from Cy5 (46)]. With the
PSF-fitting procedure and the mechanical stability of the system optimized,
the background signal suppressed, and the nonuniformity of camera pixels
corrected, an optical resolution of just a few nanometers could potentially be
achieved, reaching the molecular scale. As in the case of the patterned
excitation approach, the optical resolution here is, in principle, unlimited,
given a sufficient number of photons detected from the fluorescent probes.

Part III. A guide to super-resolution fluorescence microscopy

L. Schermelleh, R. Heintzmann, and H. Leonhardt
JCB, 19 Jul 2010; 190(2): 165-175
The Rockefeller University Press
http://dx.doi.org/10.1083/jcb.201002018

Based on experimental evidence and basic principles of physics, Ernst Abbe
and Lord Rayleigh defined and formulated this diffraction-limited resolution in
the late 19th century (Abbe, 1873; Rayleigh, 1896). Later key innovations,
including fluorescence and confocal laser scanning microscopy (CLSM), made
optical microscopy one of the most powerful and versatile diagnostic
tools in modern cell biology.

The optical resolution defines the physical limit of the smallest structure a
given method can resolve. When imaging a biological sample, the effective
resolution is also affected by several sample-specific factors, including the
labeling density, the probe size, and how well the ultrastructure is preserved
during sample preparation.

The diffraction (Abbe) limit of detection

Resolution is often defined as the largest distance at which the images of
two point-like objects appear to merge into one. Thus, most resolution criteria
(Rayleigh limit, Sparrow limit, full width at half maximum of the PSF) relate
directly to properties of the PSF. These are useful criteria for the visual
observation of specimens, but such a definition of resolution has several
shortcomings: (1) Knowing that the image is an image of two particles, these
can in fact be discriminated with the help of a computer down to arbitrarily
smaller distances; determining the positions of two adjacent particles thus
becomes a question of experimental precision, most notably photon statistics,
rather than being described by the Rayleigh limit. (2) These limits do not
necessarily correspond well to what level of detail can be seen in images or
real world objects; e.g., the Rayleigh limit is defined as the distance from the
center to the first minimum of the point spread function, which can be made
arbitrarily small with the help of ordinary linear optics (e.g., Toraldo-filters),
albeit at the expense of the side lobes becoming much higher than the central
maximum. (3)

Abbe’s formulation of a resolution limit avoids all of the above shortcomings
at the expense of a less direct interpretation. The process of imaging can be
described by a convolution operation. With the help of a Fourier transformation,
every object (whether periodic or not) can uniquely be described as a sum of
sinusoidal curves with different spatial frequencies (where higher frequencies
represent fine object details and lower frequencies represent coarse details).
The rather complex process of convolution can be greatly simplified by looking
at the equivalent operation in Fourier space: The Fourier-transformed object
just needs to be multiplied with the
Fourier-transformed PSF to yield the Fourier-transformed ideal image (without
the noise). Because the Fourier-transformed PSF now describes how well each
spatial frequency of the Fourier-transformed object gets transferred to appear in the
image, this Fourier-transformed PSF is called the optical transfer function, OTF
(right panel). Its strength at each spatial frequency (e.g., measured in oscillations
per meter) conveniently describes the contrast that a sinusoidal object would
achieve in an image.
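
The equivalence described above (convolution with the PSF in real space versus multiplication by the OTF in Fourier space) can be checked numerically. The following is a minimal one-dimensional sketch with a Gaussian stand-in for the PSF; none of it comes from the cited paper.

```python
import numpy as np

n = 256
k = np.arange(n)

# Toy 1D "object": two point emitters 10 pixels apart.
obj = np.zeros(n)
obj[100] = obj[110] = 1.0

# Toy PSF: a periodic Gaussian standing in for the diffraction-limited spot.
sigma = 4.0
psf = np.exp(-0.5 * (np.minimum(k, n - k) / sigma) ** 2)
psf /= psf.sum()

# Imaging as a (circular) convolution of the object with the PSF in real space.
image_real = np.array([np.dot(obj, psf[(i - k) % n]) for i in range(n)])

# Equivalent description in Fourier space: multiply the object spectrum by the OTF.
otf = np.fft.fft(psf)                      # the OTF is the Fourier transform of the PSF
image_fourier = np.real(np.fft.ifft(np.fft.fft(obj) * otf))

print(np.allclose(image_real, image_fourier))  # True: convolution <-> multiplication by the OTF
```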

Abbe limit

Interestingly, the detection OTF of a microscope has a fixed frequency
border (Abbe limit frequency, right panel). The maximum-to-maximum
distance Λmin of the corresponding sine curve is commonly referred to
as Abbe’s limit (left panel). In other words: The Abbe limit is the smallest
periodicity in a structure, which can be discriminated in its image. As a
point object contains all spatial frequencies, this Abbe limit sine curve
needs to also be present in the PSF. A standard wide-field microscope
creates an image of a point object (e.g., an emitting molecule) by capturing
the light from that molecule at various places of the objective lens, and
processing it with further lenses to then interfere at the image plane.
Conveniently, due to the reciprocity principle in optics, the Abbe limit Λmin
along an in-plane direction in fluorescence imaging corresponds to the
maximum-to-maximum distance of the intensity structure one would get by
interfering two waves at the extreme angles captured by the objective lens:
Λmin = (λ/n) / (2 sin α) = λ / (2 NA),
where λ/n is the wavelength of light in the medium of refractive index n.
The term NA = n sin(α) conveniently combines the half opening angle α
of the objective and the refractive index n of the embedding medium.
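
As a worked example (with illustrative numbers not taken from the text), green emission around 500 nm collected by a high-NA oil-immersion objective (NA ≈ 1.4) gives an Abbe limit of roughly 180 nm:

```python
def abbe_limit_nm(wavelength_nm, numerical_aperture):
    """Smallest resolvable periodicity: lambda / (2 * NA)."""
    return wavelength_nm / (2.0 * numerical_aperture)

print(f"{abbe_limit_nm(500, 1.4):.0f} nm")  # ~179 nm for green emission
print(f"{abbe_limit_nm(650, 1.4):.0f} nm")  # ~232 nm for red emission
```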

Abbe’s famous resolution limit is so attractive because it simply depends
on the maximal relative angle between different waves leaving the
object and being captured by the objective lens to be sent to the image.
It describes the smallest level of detail that can possibly be imaged with
this PSF “brush”. No periodic object detail smaller than this shortest
wavelength can possibly be transferred to the image.

Confocal laser scanning microscopy employs a redesigned optical
path and specialized hardware. A tightly focused spot of laser light is
used to scan the sample, and a small aperture (or pinhole) in the
confocal image plane of the light path allows only light originating
from the nominal focus to pass (Cremer and Cremer, 1978; Sheppard
and Wilson, 1981; Brakenhoff et al., 1985). The emitted light is
detected by a photomultiplier tube (PMT) or an avalanche photodiode
(APD) and the image is then constructed by mapping the detected
light in dependence of the position of the scanning spot. CLSM can
achieve a better resolution than wide-field fluorescence microscopy
but, to obtain a significant practical advantage, the pinhole needs to
be closed to an extent where most of the light is discarded
(Heintzmann et al., 2003).

Wide-field deconvolution and CLSM have long been the gold standards
in optical bioimaging, but we are now witnessing a revolution in light
microscopy that will fundamentally expand our perception of the cell.
Recently, several new technologies, collectively termed super-resolution
microscopy or nanoscopy, have been developed that break or bypass
the classical diffraction limit and shift the optical resolution down to
macromolecular or even molecular levels (Table I).

Super-resolution light microscopy methods


http://zeiss-campus.magnet.fsu.edu/articles/superresolution/introduction.html

Conceptually, one can discern near-field from far-field methods, and
whether the subdiffraction resolution is based on a linear or a nonlinear
response of the sample to its locally illuminating (exciting or depleting)
irradiance. The required nonlinearity is currently achieved by using
reversible saturable optical fluorescence transitions (RESOLFT) between
molecular states (Hofmann et al., 2005; Hell, 2007).

Besides these saturable optical fluorescence transitions, other
approaches, e.g., Rabi oscillations, could also be used to generate the
required nonlinear response.

Note that each of the novel imaging modes has its individual signal-
to-noise consideration depending on various factors.  A full
discussion of this issue is beyond the scope of this review, but as a
general rule, single-point scanning systems, albeit fundamentally limited
in speed by fluorescence saturation effects, can have better signal-
to-noise performance for thicker samples.

With three-dimensional SIM (3D-SIM), an additional twofold increase
in the axial resolution can be achieved by generating an excitation
light modulation along the z-axis using three-beam interference
(Gustafsson et al., 2008; Schermelleh et al., 2008) and processing a
z-stack of images accordingly. Thus, with 3D-SIM an approximately
eightfold smaller volume can be resolved in comparison to conventional
microscopy (Fig. 2). To computationally reconstruct a three-dimensional
dataset of a typical mammalian cell of 8-µm height with a
z-spacing of 125 nm, roughly 1,000 raw images (512 × 512 pixels) are
recorded. Because no special photophysics is needed, virtually all modern
fluorescent labels can be used, provided they are sufficiently photostable
to accommodate the additional exposure cycles.
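
The "roughly 1,000 raw images" figure is consistent with the usual 3D-SIM acquisition scheme of five pattern phases and three pattern orientations per focal plane; that 5 × 3 scheme is an assumption about the protocol, stated here for illustration rather than taken from the text above.

```python
cell_height_nm = 8000   # ~8-um-tall mammalian cell (value from the text)
z_spacing_nm = 125      # z-step quoted in the text
phases, angles = 5, 3   # assumed standard 3D-SIM pattern phases and orientations

z_planes = cell_height_nm // z_spacing_nm   # 64 focal planes
raw_images = z_planes * phases * angles     # 64 * 15 = 960 raw frames
print(raw_images)                           # ~1,000 raw images, as stated above
```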

Resolvable volumes obtained with current commercial super-resolution microscopes.

A schematic 3D representation of focal volumes is shown for the indicated
emission maxima. The approximate lateral (x,y) and axial (z) resolution
and resolvable volumes are listed. Note that STED/CW-STED and 3D-SIM
can reach up to 20 µm into the sample, whereas PALM/STORM is usually
confined to the evanescent wave field near the sample bottom. It should be
noted that deconvolution approaches can further improve STED resolution.
For comparison the “focal volume” for PALM/STORM was estimated based
on the localization precision in combination with the z-range of TIRF.


Super-resolution microscopy of biological samples.

(A) Conventional wide-field image (left) and 3D-SIM image of a mouse
C2C12 prometaphase cell stained with primary antibodies against
lamin B and tubulin, and secondary antibodies conjugated to Alexa 488
(green) and Alexa 594 (red), respectively. Nuclear chromatin was stained
with DAPI (blue). 3D image stacks were acquired with a DeltaVision OMX
prototype system (Applied Precision). The bottom panel shows the
respective orthogonal cross sections. (B) HeLa cell stained with primary
antibodies against the nuclear pore complex protein Nup153 and
secondary antibodies conjugated with ATTO647N. The image was
acquired with a TCS STED confocal microscope (Leica). (C) TdEosFP-
paxillin expressed in a Hep G2 cell to label adhesion complexes at
the lower surface. The image was acquired on an ELYRA P.1
prototype system (Carl Zeiss, Inc.) using TIRF illumination. Single
molecule positional information was projected from 10,000 frames
recorded at 30 frames per second. On the left, signals were summed
up to generate a TIRF image with conventional wide-field lateral
resolution. Bars: 5 µm (insets, 0.5 µm).


APPLICATIONS IN BIOLOGICAL SYSTEMS

The cytoskeleton of mammalian cells, especially microtubules
(Figure 5a) (29, 44, 46, 52), is the most commonly used benchmark
structure for super-resolution imaging. Other cytoskeletal structures
imaged so far include actin filaments in the lamellipodium (6),
keratin intermediate filaments (59), neurofilaments (26, 83), and
MreB in Caulobacter (66).

Figure 5


Examples of super-resolution images of biological samples.
(a) Two-color STORM imaging of immunostained microtubule (green)
and clathrin-coated pits (red) (From Reference 46. Reprinted with
permission from AAAS).

Organelles, such as the endoplasmic reticulum (27), lysosomes (6),
endocytic and exocytic vesicles (46, 52, 64), and mitochondria
(6, 53, 56), have also been imaged. For example, using the single-molecule
localization approach, 3D STORM imaging has clearly
resolved the ~150-nm-diameter, hemispherical cage shape of clathrin-coated
pits (46, 52), which appear only as featureless, diffraction-limited spots
in conventional fluorescence microscopy (Figure 5a,b).
Two-color 3D STED has resolved the hollow shape of the mitochondrial
outer membrane (marked by the translocase protein Tom20), enclosing
the matrix protein Hsp60 (56), even though the diameter of mitochondria is
only about 300–500 nm (Figure 5c). The outer membrane structure of
mitochondria and their interactions with microtubules have been resolved
by two-color 3D STORM (53). The transport of synaptic vesicles
has been recorded at video rate using 2D STED (Figure 5d) (64).

Many plasma membrane proteins and membrane-associated protein
complexes have also been studied by super-resolution fluorescence
microscopy. For example, synaptotagmin clusters after exocytosis in
primary cultured hippocampal neurons (84), the donut-shaped
clusters of the Drosophila protein Bruchpilot at the neuromuscular
synaptic active zone (85), and the size distribution of syntaxin clusters
(86, 87) have all been imaged. Photoactivation has enabled the tracking
of the influenza protein hemagglutinin and the retroviral protein Gag in
live cells, revealing membrane microdomains (67) and the spatial
heterogeneity of membrane diffusion (68). The morphology and transport
of focal adhesion complexes have also been observed using live-cell
PALM (Figure 5e) (65).

Summary points

  1. Super resolution fluorescence microscopy with a spatial resolution not limited by the diffraction of
    light has been implemented using saturated depletion/excitation or single-molecule localization
    of switchable fluorophores.
  2. Three-dimensional imaging with an optical resolution as high as ~20 nm in the lateral direction
    and 40–50 nm in the axial dimension has been achieved.
  3. The resolution of these super-resolution fluorescence microscopy techniques can in principle
    reach the molecular scale.
  4. In practice, the resolution of the images is limited not only by the intrinsic optical resolution,
    but also by sample-specific factors including the labeling density, probe size, and sample preservation.
  5. Multicolor super resolution imaging has been implemented, allowing colocalization measurements
    to be performed at nanometer-scale resolution and molecular interactions to be more precisely
    identified in cells.
  6. Super-resolution fluorescence imaging allows dynamic processes to be investigated at the tens of
    nanometer resolution in living cells.
  7. Many cellular structures have been imaged at sub-diffraction-limit resolution.

Future issues

  1. Achieving molecular scale resolution (a few nanometers or less).
  2. Fast super resolution imaging of a large field of view by multi-point scanning or high-speed single-molecule switching/localization.
  3. Developing new fluorescent probes that are brighter and more photostable, and switchable fluorophores
    that have high on-off contrast and fast switching rates.
  4. Developing fluorescent labeling methods that can stain the target with small molecules at high specificity,
    high density, and good ultrastructure preservation.
  5. Applying super resolution microscopy to provide novel biological insights.

Acronyms

FP

Fluorescent Protein

FPALM

Fluorescence PhotoActivation Localization Microscopy

I5M

Combination of I2M (Image Interference Microscopy) and I3M
(Incoherent Interference Illumination Microscopy)

PALM

PhotoActivated Localization Microscopy

PSF

Point Spread Function

RESOLFT

REversible Saturable OpticaL Fluorescence Transitions

SIM

Structured Illumination Microscopy

SSIM

Saturated Structured Illumination Microscopy

STED

STimulated Emission Depletion

STORM

STochastic Optical Reconstruction Microscopy

glossary

Numerical aperture (NA)

The numerical aperture of an objective characterizes the solid angle
of light collected from a point light source at the focus of the objective.

Stimulated emission

The process by which an excited-state molecule or atom drops to the
ground state by emitting another photon identical to the incoming
photon. It is the basis of the laser.

Fluorescence saturation

At high excitation intensity, the fluorescence lifetime instead of the excitation
rate becomes the rate limiting step of fluorescence emission, causing the
fluorescence signal not to increase proportionally with the excitation intensity.

Nyquist criterion

To determine a structure, the sampling interval needs to be no larger than
half of the feature size.

Mitochondria

Organelles in eukaryotic cells for ATP generation, consisting of two
membranes (inner and outer) enclosing the intermembrane space and
the matrix inside the inner membrane.

Clathrin-coated pit

Vesicle forming machinery involved in endocytosis and intracellular
vesicle transport, consisting of clathrin coats, adapter proteins, and
other regulatory proteins.

Focal adhesion

The macromolecular complex serving as the mechanical connection
and signaling hub between a cell and the extracellular matrix or other cells.

Selected references with abstract

Near-Field Optics: Microscopy, Spectroscopy, and Surface
Modification Beyond the Diffraction Limit
Eric Betzig, Jay K. Trautman
AT&T Bell Laboratories, Murray Hill, NJ 07974
Science 10 Jul 1992; 257(5067): 189-195
http://dx.doi.org/10.1126/science.257.5067.189

The near-field optical interaction between a sharp probe and a sample
of interest can be exploited to image, spectroscopically probe, or modify
surfaces at a resolution (down to ∼12 nm) inaccessible by traditional far-field
techniques. Many of the attractive features of conventional optics are
retained, including noninvasiveness, reliability, and low cost. In addition, most
optical contrast mechanisms can be extended to the near-field regime,
resulting in a technique of considerable versatility. This versatility
is demonstrated by several examples, such as the imaging of nanometric-scale features in mammalian tissue sections and the creation of ultrasmall,
magneto-optic domains having implications for high density data storage.
Although the technique may find uses in many diverse fields, two of the
most exciting possibilities are localized optical spectroscopy of semiconductors
and the fluorescence imaging of living cells.

Imaging Intracellular Fluorescent Proteins at Nanometer Resolution

E. Betzig, G.H. Patterson, R. Sougrat, O.W. Lindwasser, S. Olenych, J.S. Bonifacino,
M.W. Davidson, J. Lippincott-Schwartz, H.F. Hess
Howard Hughes Medical Institute, Janelia Farm Research Campus, Ashburn, VA;
New Millennium Research, LLC, Okemos, MI; Cell Biology and Metabolism Branch,
National Institute of Child Health and Human Development (NICHD), Bethesda, MD;
National High Magnetic Field Laboratory, Florida State University, Tallahassee, FL;
NuQuest Research, LLC, La Jolla, CA
Science 15 Sep 2006; 313(5793): 1642-1645
http://dx.doi.org/10.1126/science.1127344

We introduce a method for optically imaging intracellular proteins at
nanometer spatial resolution. Numerous sparse subsets of photo-activatable fluorescent protein molecules were activated, localized
(to ∼2 to 25 nanometers), and then bleached. The
aggregate position information from all subsets was then assembled
into a super-resolution image. We used this method, termed photo-
activated localization microscopy, to image specific target proteins
in thin sections of lysosomes and mitochondria; in fixed whole cells,
we imaged vinculin at focal adhesions, actin within a lamellipodium,
and the distribution of the retroviral protein Gag at the plasma
membrane.

Toward fluorescence nanoscopy.

Hell SW.
Nat Biotechnol. 2003 Nov; 21(11):1347-55.
http://www.ncbi.nlm.nih.gov/pubmed/14595362

For more than a century, the resolution of focusing light microscopy
has been limited by diffraction to 180 nm in the focal plane and to
500 nm along the optic axis. Recently, microscopes have been
reported that provide three- to seven-fold improved axial
resolution in live cells. Moreover, a family of concepts has emerged
that overcomes the diffraction barrier altogether. Its first exponent,
stimulated emission depletion microscopy, has so far displayed a
resolution down to 28 nm. Relying on saturated optical transitions,
these concepts are limited only by the attainable saturation level.
As strong saturation should be feasible at low light intensities,
nanoscale imaging with focused light may be closer than ever.
PMID: 14595362

Far-field optical nanoscopy.

Hell SW.
Science. 2007 May 25;316(5828):1153-8.
http://www.ncbi.nlm.nih.gov/pubmed/17525330

In 1873, Ernst Abbe discovered what was to become a well-known
paradigm: the inability of a lens-based optical microscope to
discern details that are closer together than half of the wavelength
of light. However, for its most popular imaging mode, fluorescence
microscopy, the diffraction barrier is crumbling. Here, I discuss the physical concepts
that have pushed fluorescence microscopy to the nanoscale, once
the prerogative of electron and scanning probe microscopes. Initial
applications indicate that emergent far-field optical nanoscopy will
have a strong impact in the life sciences and in other areas benefiting
from nanoscale visualization.
PMID:  17525330


Illuminating single molecules in condensed matter.

Moerner WE, Orrit M.
Science. 1999 Mar 12;283(5408):1670-6.
http://www.ncbi.nlm.nih.gov/pubmed/10073924

Efficient collection and detection of fluorescence coupled with careful
minimization of background from impurities and Raman scattering
now enable routine optical microscopy and study of single molecules
in complex condensed matter environments. This ultimate method
for unraveling ensemble averages leads to the observation of
new effects and to direct measurements of stochastic fluctuations.
Experiments at cryogenic temperatures open new directions in
molecular spectroscopy, quantum optics, and solid-state dynamics.
Room-temperature investigations apply several techniques
(polarization microscopy, single-molecule imaging, emission time
dependence, energy transfer, lifetime studies, and the like) to a
growing array of biophysical problems where new insight may be
gained from direct observations of hidden static and dynamic
inhomogeneity.  PMID: 10073924

Fluorescence microscopy with super-resolved optical sections.

Egner A, Hell SW.
Trends Cell Biol. 2005 Apr;15(4):207-15.
http://www.ncbi.nlm.nih.gov/pubmed/15817377

The fluorescence microscope, especially its confocal variant, has
become a standard tool in cell biology research for delivering
3D-images of intact cells. However, the resolution of any standard
optical microscope is at least 3 times poorer along the axis of the
lens than in its focal plane. Here, we review principles and applications
of an emerging family of fluorescence microscopes, such as 4Pi
microscopes, which improve axial resolution by a factor of seven by
employing two opposing lenses. Noninvasive axial sections of 80-160 nm
thickness deliver more faithful 3D-images of subcellular features,
providing a new opportunity to significantly enhance our understanding
of cellular structure and function. PMID: 15817377

4Pi-confocal microscopy provides three-dimensional images of the
microtubule network with 100- to 150-nm resolution.

Nagorni M, Hell SW.
J Struct Biol. 1998 Nov;123(3):236-47.

We show the applicability of 4Pi-confocal microscopy to three-dimensional
imaging of the microtubule network in a fixed mouse
fibroblast cell. Comparison with two-photon confocal resolution
reveals a fourfold better axial resolution in the 4Pi-confocal case.
By combining 4Pi-confocal microscopy with Richardson-Lucy
image restoration a further resolution increase is achieved.
Featuring a three-dimensional resolution in the range 100-150 nm,
the 4Pi-confocal (restored) images are intrinsically more detailed
than their confocal counterparts. Our images constitute what
to our knowledge are the best-resolved three-dimensional
images of entangled cellular microtubules obtained with light
to date.  PMID: 9878578

Part IV. Super-resolution microscopy

Super-resolution microscopy is a form of light microscopy. Due
to the diffraction of light, the resolution of conventional light
microscopy is limited, as stated by Ernst Abbe in 1873.[1]
A good approximation of the attainable resolution is the full
width at half maximum (FWHM) of the point spread function,
and a precise wide-field microscope with a high numerical
aperture and visible light usually reaches a resolution of ~250 nm.

Super-resolution techniques allow the capture of images with
a higher resolution than the diffraction limit. They fall into
two broad categories: "true" super-resolution techniques, which
capture information contained in evanescent waves, and
"functional" super-resolution techniques, which use clever
experimental techniques and known limitations on the matter
being imaged to reconstruct a super-resolution image.[2]

True subwavelength imaging techniques include those that
utilize the Pendry superlens and near-field scanning optical
microscopy, the 4Pi microscope, and structured illumination
microscopy technologies such as SIM and SMI. However, the
majority of techniques of importance in biological imaging
fall into the functional category.

Groups of methods for functional super-resolution microscopy:

  1. Deterministic super-resolution: The most commonly used emitters in biological
    microscopy, fluorophores, show a nonlinear response to excitation, and this
    nonlinear response can be exploited to enhance resolution. These
    methods include STED, GSD, RESOLFT, and SSIM.
  2. Stochastic super-resolution: The chemical complexity of many molecular
    light sources gives them a complex temporal behaviour, which can be used
    to make several close-by fluorophores emit light at separate times and
    thereby become resolvable in time. These methods include SOFI and all
    single-molecule localization methods (SMLM), such as SPDM,
    SPDMphymod, PALM, FPALM, STORM, and dSTORM.

Part V. HIV-1

Conformational dynamics of single HIV-1 envelope
trimers on the surface of native virions

James B. Munro, Jason Gorman, Xiaochu Ma, Zhou Zhou, James Arthos,
Dennis R. Burton, et al.
Department of Microbial Pathogenesis, Yale University School of Medicine,
New Haven, CT; Vaccine Research Center, National Institute of Allergy and
Infectious Diseases, National Institutes of Health, Bethesda, MD; Department
of Physiology and Biophysics, Weill Cornell Medical College of Cornell
University, New York, NY; Laboratory of Immunoregulation, National Institute
of Allergy and Infectious Diseases, National Institutes of Health, Bethesda, MD;
Department of Immunology and Microbial Science, and IAVI Neutralizing
Antibody Center, The Scripps Research Institute, La Jolla, CA; Ragon Institute
of MGH, MIT, and Harvard, Cambridge, MA; International AIDS Vaccine
Initiative (IAVI), New York, NY; Department of Chemistry, University of
Pennsylvania, Philadelphia, PA.

The HIV-1 envelope (Env) mediates viral entry into host cells.
To enable the direct imaging of conformational dynamics
within Env we introduced fluorophores into variable
regions of the gp120 subunit and measured single-molecule
fluorescence resonance energy transfer (smFRET) within
the context of native trimers on the surface of HIV-1 virions.
Our observations revealed unliganded HIV-1 Env to be
intrinsically dynamic, transitioning between three distinct
pre-fusion conformations, whose relative occupancies
were remodeled by receptor CD4 and antibody binding.
The distinct properties of neutralization-sensitive and
neutralization-resistant HIV-1 isolates support a dynamics-based mechanism of immune evasion and ligand recognition.
