
Metabolic Reactions Need Just Enough

Author and Curator: Larry H Bernstein, MD, FCAP 


This is another installment of the metabolomics series that has delved into the relationship between the building blocks of life.
There would be no life without the genetic code, which has changed over the span of life in our universe, but with retention of the instructions that have selective advantage under the existing conditions, which include environmental temperature, water, metallic elements, and the most abundant elements essential for organic reactions – carbon, hydrogen, oxygen, nitrogen, sulfur, to which we would add iron, calcium, sodium, chloride, potassium, magnesium, cadmium, manganese, nickel and selenium.   Many consider it a miracle that life would evolve out of this primordial mix.  Those who are of a different mind have spent generations in human history piecing together the evidence that our existence and our improvement has elements to understand, and is subject to improvement.  This is encountered in the sciences and, to a serious extent in the humanities as well.  This is why we have gone from the most basic to the more comprehensive, if also seemingly incomprehensible because of complexities, uncertainties, and insufficient information to complete the puzzle, which may never be completed.  The pursuit has led our society from – village, to town, to city, to metropolis, with intermingling of societies, as if societies become like living organisms of another order.  If this is the case, then war and peace, and competition for resources, and barriers, and issues of control are another dimension of an intricate network.  This is what propagates the imaginings of Science Fiction noire.


Part I.  Everything works in concert

Getting metabolism right

10/08/2014 – Larry Hardesty, MIT News Office


Metabolic networks are mathematical models of chemical reactions



Image: Jose-Luis Olivares/MIT

Metabolic networks are mathematical models of every possible sequence of chemical reactions available to an organ or organism, and they’re used to design microbes for manufacturing processes or to study disease. Based on both genetic analysis and empirical study, they can take years to assemble.

An analytic tool developed at Massachusetts Institute of Technology (MIT) suggests that many of those models may be wrong, but the same tool may make it fairly straightforward to repair them.

“They have all these models in this database at [the Univ. of California at] San Diego,” says Bonnie Berger, a professor of applied mathematics and computer science at MIT and one of the tool’s developers. Many of the models contain computational errors because they were calculated with floating-point arithmetic, which is used to increase efficiency; the MIT team proved that they need to be computed in exact arithmetic. Models that were believed to be realistic turned out not to produce the growth that is expected.

The new tool, and the analyses performed with it, have been published in Nature Communications. First author Leonid Chindelevitch, a graduate student in Berger’s group at the time and now a postdoctoral researcher at the Harvard School of Public Health, is joined by Berger, Aviv Regev, an associate professor of biology at MIT, and Jason Trigg, another of Berger’s former students.

Pruning the network
Metabolic networks, Chindelevitch says, “describe the set of all reactions that are available to a particular organism that we might be interested in. So if we’re interested in yeast or E. coli or the tuberculosis bacterium, this is a way to put together everything we know about what this organism can do to transform some substances into some other substances.”

  1. the organism takes up nutrients from the environment, and
  2. transforms them by its own internal mechanisms.

The network thus represents every sequence of chemical reactions catalyzed by enzymes encoded in an organism’s DNA that could

  • lead from particular nutrients
  • to particular chemical products.

Every node of the network represents an intermediary stage in some chain of reactions.

To simplify such networks enough to enable exact arithmetical analysis, Chindelevitch and Berger developed an algorithm that

  1. first identifies all the sequences of reactions that, for one reason or another, can’t occur within the context of the model;
  2. then deletes them;
  3. next identifies clusters of reactions that always work in concert: whatever their intermediate products may be, they effectively perform a single reaction; and
  4. finally collapses each such cluster into a single reaction.

Chindelevitch and Berger were able to mathematically prove that these modifications wouldn’t affect the outcome of the analysis.
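The prune-and-collapse steps above can be sketched in miniature. The following is a toy illustration, not the authors’ actual algorithm: a hypothetical four-reaction chain plus one dead-end reaction, with exact rational coefficients (Python’s `fractions.Fraction`) standing in for the exact arithmetic the paper argues for.

```python
from fractions import Fraction

F = Fraction

# Hypothetical toy network (all names invented): reaction -> {metabolite: coefficient},
# negative = consumed, positive = produced.  "biomass" is the growth output.
network = {
    "uptake": {"A": F(1)},
    "r1":     {"A": F(-1), "B": F(1)},
    "r2":     {"B": F(-1), "C": F(1)},
    "r_dead": {"C": F(-1), "D": F(1)},   # D is never consumed: a dead end
    "export": {"C": F(-1), "biomass": F(1)},
}

def prune_blocked(net, exempt=frozenset()):
    """Iteratively delete reactions touching a dead-end metabolite: one that
    is only ever produced, or only ever consumed (boundary outputs such as
    biomass are exempt)."""
    net = {r: dict(s) for r, s in net.items()}
    while True:
        produced, consumed = set(), set()
        for stoich in net.values():
            for m, c in stoich.items():
                (produced if c > 0 else consumed).add(m)
        dead = (produced ^ consumed) - set(exempt)   # produced xor consumed
        blocked = [r for r, s in net.items() if dead & set(s)]
        if not blocked:
            return net
        for r in blocked:
            del net[r]

def collapse_pairs(net):
    """Merge reaction pairs coupled through a metabolite with exactly one
    producer and one consumer: at steady state they must run in a fixed
    ratio, so they act as a single composite reaction."""
    net = {r: dict(s) for r, s in net.items()}
    merged_something = True
    while merged_something:
        merged_something = False
        users = {}
        for r, s in net.items():
            for m in s:
                users.setdefault(m, []).append(r)
        for m, rs in users.items():
            if len(rs) != 2:
                continue
            p, q = rs
            if net[p][m] < 0 < net[q][m]:
                p, q = q, p                      # make p the producer of m
            cp, cq = net[p][m], net[q][m]
            if not (cp > 0 > cq):
                continue                          # both produce or both consume m
            ratio = -cp / cq                      # exact Fraction arithmetic
            combined = dict(net[p])
            for mm, cc in net[q].items():
                combined[mm] = combined.get(mm, Fraction(0)) + ratio * cc
            combined = {mm: cc for mm, cc in combined.items() if cc != 0}
            del net[p], net[q]
            net[p + " + " + q] = combined
            merged_something = True
            break
    return net

pruned = prune_blocked(network, exempt={"biomass"})
print(collapse_pairs(pruned))   # the whole linear chain collapses to one composite reaction
```

Here "r_dead" is pruned because metabolite D can never be consumed, and the surviving chain uptake → r1 → r2 → export collapses into a single net reaction producing biomass, mirroring the "work in concert" observation.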

“What the exact-arithmetic approach allows you to do is respect the key assumption of the model, which is that

  • at steady state, every metabolite is neither produced in excess nor depleted in excess,” Chindelevitch says. “The production balances the consumption for every substance.”

When Chindelevitch and Berger applied their analysis to 89 metabolic-network models in the San Diego database, they found that 44 of them contained errors or omissions:

  • If the products of all the reactions in the networks were in equilibrium, the organisms modeled would be unable to grow.
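The balance condition can be written as the linear system S v = 0, where S is the stoichiometric matrix and v the flux vector, and it is exactly here that floating-point arithmetic can mislead. A minimal sketch with made-up numbers:

```python
from fractions import Fraction

# Mass balance for one metabolite: net production = sum(coefficient * flux).
# In floating point, 1/10 + 1/5 - 3/10 is not exactly zero:
residual = 0.1 + 0.2 - 0.3
print(residual)                # a tiny nonzero value, an artifact of rounding

# The same balance in exact rational arithmetic is exactly zero:
exact = Fraction(1, 10) + Fraction(1, 5) - Fraction(3, 10)
assert exact == 0

# Checking a whole stoichiometric matrix S (rows = metabolites,
# columns = reactions) against a candidate flux vector v:
def balanced(S, v):
    return all(sum(c * f for c, f in zip(row, v)) == 0 for row in S)

S = [[Fraction(1), Fraction(-1), Fraction(0)],   # A: made by r1, used by r2
     [Fraction(0), Fraction(1), Fraction(-1)]]   # B: made by r2, used by r3
v = [Fraction(2), Fraction(2), Fraction(2)]      # equal flux through the chain
assert balanced(S, v)
```

With rounding in play, a tolerance must be chosen to decide what counts as "balanced", and that tolerance can both admit spurious fluxes and hide genuine imbalances; exact rational arithmetic removes the ambiguity.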

Patching it up
By adapting algorithms used in the field of compressed sensing, however, Chindelevitch and Berger are also able to identify

  • likely locations of network errors.

Compressed sensing exploits the observation that some complex signals—such as audio recordings or digital images—that are computationally intensive to acquire can, upon acquisition, be compressed. It performs the initial sampling in a clever way that allows it to build up the simpler representation without having to pass through a more complex representation. Chindelevitch and Berger’s algorithm can isolate just those links in a metabolic network that contribute most to its chemical imbalance.
Source: Massachusetts Institute of Technology
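The excerpt does not give the authors’ algorithm, but the sparse-recovery idea can be sketched with generic orthogonal matching pursuit: if only a few entries of an unknown vector are nonzero (standing in here for a few erroneous network links), far fewer measurements than unknowns suffice to locate them. All dimensions and values below are invented for illustration.

```python
import numpy as np

def omp(A, b, k):
    """Orthogonal matching pursuit: greedily find a k-sparse x with A @ x ~= b."""
    residual, support = b.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # column most correlated with residual
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        residual = b - A[:, support] @ coef          # re-fit on support, update residual
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 80))                    # 40 measurements, 80 unknowns
A /= np.linalg.norm(A, axis=0)                       # unit-norm columns
x_true = np.zeros(80)
x_true[[5, 17]] = [3.0, -2.0]                        # only two "erroneous links"
b = A @ x_true                                        # the observed imbalance
x_hat = omp(A, b, k=2)
print(np.nonzero(x_hat)[0])                           # indices of the recovered nonzeros
```

The greedy search isolates the two nonzero entries even though the system is underdetermined, which is the sense in which Chindelevitch and Berger’s algorithm can point to the links contributing most to a network’s imbalance.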

Pictures of the BRCA2 protein

Researchers purified the protein and used electron microscopy to reveal its structure.

Scientists have taken pictures of the BRCA2 protein, showing how it works to repair damaged DNA, providing insight into how mutations in the gene that encodes BRCA2 would raise the risk of breast and ovarian cancers. Though the protein is known to be involved in DNA repair, its shape and mechanism have been unclear.

Researchers at Imperial College London and the Cancer Research UK London Research Institute purified the protein and used electron microscopy to reveal its structure and how it interacts with other proteins and DNA. The results are published in Nature Structural and Molecular Biology.

The lifetime risk of breast cancer for women with BRCA2 mutations is 40 to 85 per cent, depending on the mutation, compared with around 12 per cent for the general population. Many women who test positive for BRCA1 and BRCA2 mutations choose to undergo surgery to reduce their risk of breast cancer. The BRCA1 and BRCA2 genes encode proteins involved in DNA repair.

The study was led by Professor Xiaodong Zhang from the Department of Medicine at Imperial College London and Dr Stephen West at the London Research Institute. “This is our first view of how the protein looks and how it works,” Professor Zhang said. “Once we have added more detail to the picture, we can design ways to correct defects in BRCA2 and help cells repair DNA more effectively to prevent cancer. We can also think about how to make this repair process less effective in cancer cells, so that they die.”

The study found that BRCA2 proteins work in pairs – which the researchers found surprising since BRCA2 is one of the largest proteins in the cell.

BRCA2 works in partnership with another protein called RAD51.

BRCA2 helps RAD51 molecules to

  • assemble on strands of broken DNA and form filaments.

The RAD51 filaments then search for

  • matching strands of DNA in order to repair the break.

The findings showed that

  • each pair of BRCA2 proteins binds two sets of RAD51 that run in opposite directions.

This allows it to work on strands of broken DNA that point in either direction. They also show that BRCA2’s job is to help RAD51 form short filaments at multiple sites along the DNA, presumably to increase the efficiency of establishing longer filaments required to search for matching strands.



Unlocking The Non-Coding Half of Human Genome

Texas A&M biologists unlock non-coding half of human genome with novel DNA sequencing technique.    Oct 07, 2014  http://www.technologynetworks.com/Genomics

An obscure swath of human DNA once thought to be nothing more than biological trash may actually offer a treasure trove of insight into complex genetic-related diseases, thanks to a novel technique for measuring variation in heterochromatin developed by biologists at Texas A&M University: doctoral candidate John C. Aldrich and Dr. Keith A. Maggert, an associate professor in the Department of Biology. This tightly packed section of the non-coding human genome was until recently thought to have no discernible function.

Aldrich monitored the dynamics of the heterochromatic sequence in Drosophila by modifying quantitative polymerase chain reaction (qPCR), a technique used to amplify specific DNA sequences, adding a fluorescent dye that allowed him to monitor changes in the fruit-fly DNA and observe any variations.

Aldrich’s findings, published in the online edition of the journal PLOS ONE, showed that differences in the heterochromatin exist, confirming that the junk DNA is not stagnant as researchers originally had believed and that mutations which could affect other parts of the genome occur in non-coding DNA.
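For readers unfamiliar with qPCR quantification: differences in starting template show up as shifts in the cycle threshold (Ct) at which fluorescence crosses a set level, and relative abundance is commonly computed by the delta-delta-Ct method. The sketch below is a generic version of that calculation with hypothetical values, not necessarily the analysis used in the paper.

```python
def fold_change(ct_target_s, ct_ref_s, ct_target_c, ct_ref_c, efficiency=2.0):
    """Relative quantification by delta-delta-Ct: at 100% efficiency each extra
    amplification cycle corresponds to a 2x difference in starting template,
    so fold difference = efficiency ** -(ddCt)."""
    ddct = (ct_target_s - ct_ref_s) - (ct_target_c - ct_ref_c)
    return efficiency ** -ddct

# Hypothetical Ct values: a heterochromatic repeat vs. a single-copy reference
# sequence, in a test fly line versus a control line.
print(fold_change(22.0, 20.0, 24.0, 20.0))   # 4.0: test line has ~4x the repeat content
```

Normalizing against the single-copy reference cancels differences in input DNA between samples, which is what makes the repeat-content comparison meaningful.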

“This work opens up the non-coding half of the genome,” Maggert said. The coding regions contain the information necessary for a cell to make proteins, but far less is known about the non-coding regions, beyond the fact that

  • they are not directly related to making proteins.

“In my opinion, there are about 30,000 protein-coding genes,” Maggert said. “The rest of the DNA –

  • greater than 90 percent –
  • either controls those genes and therefore is technically part of them, or
  • is within this mush that we study and, thanks to John, can now measure.

The heterochromatin that we study definitely has effects, but it’s not possible to think of it as discrete genes. So, we prefer to think of it as

  • 30,000 protein-coding genes plus this one big, complex one that can orchestrate the other 30,000.”

When human DNA was finally sequenced with the completion of the Human Genome Project in 2003, researchers determined that only two percent of the genome (about 21,000 genes) represented coding DNA. Since then, numerous other studies have emerged debating the functionality, or lack thereof, of non-coding, so-called “junk DNA.”

“There is so much talk about understanding the connection between genetics and disease and finding personalized therapies,” Maggert said. “However, this topic is incomplete unless biologists can look at the entire genome.”

Breakthrough allows researchers to watch molecules “wiggle”



time-resolved crystallography

A new crystallographic technique developed at the University of Leeds, published in the journal Nature Methods, describes a new way of doing time-resolved crystallography, a method that researchers use to observe changes within the structure of molecules. Fast time-resolved (Laue) crystallography has been available at only three sites worldwide, so only a handful of proteins have been studied using the technique. The new method will allow researchers across the world to carry out dynamic crystallography.

Further, it is likely to provide a major boost to research on understanding how molecules work. Understanding how structure and dynamics are linked to function is key to designing better medicines targeted at specific states of molecules, helping to avoid unwanted side effects.

“A time-resolved structure is a bit like having a movie for crystallographers,” said Professor Arwen Pearson, who led a team of researchers in the University’s Faculty of Biological Sciences and School of Chemistry. “Life wiggles. It moves about and, to understand it,

  • you need to be able to see how biological structures move at the atomic scale. This breakthrough allows us to do that.”

Traditional x-ray crystallography fires x-rays into crystallized molecules and creates an image that allows researchers to work out the atomic structure of the molecules. A major limitation is that the picture created is the average of all the molecules in a crystal and their motions over the time of an experiment.

“A static picture is not very helpful if you want to observe how molecular structures work,” said Dr. Briony Yorke, the lead researcher on the project. “It is hard to really understand something without seeing it in action.”

The existing method of getting around the problem could be compared to the laborious process of making an animated film. Scientists “synchronise” a set of molecules in an identical state and then activate, or “pump”, changes in the molecules. They take a crystallographic snapshot of the structure after a set time, and then have to repeat the process. This approach was first proposed by the British Nobel Prize-winning chemist George Porter in the 1940s. However, there are only three x-ray generators in the world capable of delivering a powerful enough beam to create a crystallographic image.

The new method uses clever mathematics (a Hadamard transform) to open the field to the much less powerful “beamlines” that scientists use to harness synchrotron light for crystallography and other techniques. This will enable many more facilities to do time-resolved crystallography.

As in Porter’s method, in the new approach researchers synchronise their molecules and activate them. However, they then make a series of crystallographic “probes” of the moving structures using a pattern of light pulses. These pulses build up a single crystallographic image, a bit like a long-exposure photograph. The researchers then repeat the experiment with different patterns of light pulses, creating different “long exposure” images, until all of the pulse patterns (generated using a mathematical formula) have been completed. Even though the “long exposure” images created from the pulse patterns are blurred, the differences between the pulse patterns that created them allow researchers to extract a moving picture of the molecules’ changing structures.
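The encode-and-decode arithmetic behind these patterned exposures can be sketched with a Hadamard matrix. This toy uses the ±1 matrix directly; a real beamline experiment would use 0/1 pulse masks derived from it, since a probe pulse cannot be "negative".

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of two)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# A hypothetical time-varying signal sampled in 8 time bins.
signal = np.array([1.0, 3.0, 2.0, 5.0, 4.0, 2.0, 1.0, 0.5])

H = hadamard(8)
blurred = H @ signal          # each row: one patterned "long exposure" sum
# Rows of H are mutually orthogonal (H @ H.T == 8 * I), so the individual
# time bins can be recovered exactly from the blurred exposures:
recovered = (H.T @ blurred) / 8
assert np.allclose(recovered, signal)
```

Because every exposure mixes contributions from many time bins, each measurement collects far more photons than a single short probe would, which is why the scheme works on weaker beamlines.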

Professor Pearson said that this method does not need the very strong light required by the Porter method, thereby overcoming many of the current limitations. Co-author Professor Godfrey Beddard, Emeritus Professor of Chemical Physics at the University of Leeds, said: “We demonstrate this method for crystallography, but it will work for any time-resolved experiment where the probe can be encoded. This new method means that, instead of having to go to one of the three instruments in the world that can currently do time-resolved crystallography, you can go to any beamline at any synchrotron—basically it massively opens the field for these kinds of experiments.”

Co-author Dr Robin Owen, Principal Beamline Scientist at Diamond Light Source, said: “The beauty of the approach is that it uses existing equipment in a new way to facilitate new science. The novel use of the Hadamard transform, or multiple-exposure, approach helps open the door for time-resolved science at a much wider range of beamlines and synchrotron sources than is currently possible. By exploiting the approach we will be able to obtain multiple sequential images of a protein while it carries out its function, providing a much clearer understanding of the relationship between structure and function.”

Professor Paul Raithby, Chair of Inorganic Chemistry at the University of Bath and a leading expert on time-resolved crystallography who was not one of the authors of the paper, said: “This is a very exciting development in the area of macromolecular and molecular crystallography. The new method will allow us to ‘watch’ chemical and biological processes as they happen in a way that has not been possible previously.”

The research was funded by the Wellcome Trust and was conducted at the University of Leeds and the Diamond Light Source. Professor Pearson is now Professor of Experimental Biophysics at The Hamburg Centre for Ultrafast Imaging (CUI) of Universität Hamburg. Dr Yorke is now a postdoctoral research fellow, also at Universität Hamburg.

Time-resolved crystallography using the Hadamard Transform

Time-resolved crystallography and protein design: signalling photoreceptors and optogenetics

Keith Moffat
University of Chicago
Phil. Trans. R. Soc. B 17 July 2014; 369(1647): 20130568
http://dx.doi.org/10.1098/rstb.2013.0568

Time-resolved X-ray crystallography and solution scattering have been successfully conducted on proteins on time-scales down to around 100 ps, set by the duration of the hard X-ray pulses emitted by synchrotron sources. The advent of hard X-ray free-electron lasers (FELs), which emit extremely intense, very brief, coherent X-ray pulses, opens the exciting possibility of time-resolved experiments with femtosecond time resolution on macromolecular structure, in both single crystals and solution. The X-ray pulses emitted by an FEL differ greatly in many properties from those emitted by a synchrotron, in ways that at first glance make time-resolved measurements of X-ray scattering with the required accuracy extremely challenging. This opens up several questions which I consider in this brief overview. Are there likely to be chemically and biologically interesting structural changes to be revealed on the femtosecond time-scale? How shall time-resolved experiments best be designed and conducted to exploit the properties of FELs and overcome challenges that they pose? To date, fast time-resolved reactions have been initiated by a brief laser pulse, which obviously requires that the system under study be light-sensitive. Although this is true for proteins of the visual system and for signalling photoreceptors, it is not naturally the case for most interesting biological systems. To generate more biological targets for time-resolved study, can this limitation be overcome by optogenetic, chemical or other means?


Part 2. Metabolomics and Systems Biology

Metabolomics in systems biology.

Weckwerth W.
Annu Rev Plant Biol. 2003;54:669-89.   http://www.ncbi.nlm.nih.gov/pubmed/14503007
The primary aim of “omic” technologies is the non-targeted

  • identification of all gene products (transcripts, proteins, and metabolites)
  • present in a specific biological sample.

These technologies reveal unexpected properties of biological systems.

A second and more challenging aspect of omic technologies is the

  • refined analysis of quantitative dynamics in biological systems.

Gas and liquid chromatography coupled to mass spectrometry are well suited for coping with

  1. high sample numbers in reliable measurement times, with respect to both
  2. technical accuracy and
  3. the identification and quantitation of small-molecular-weight metabolites.

This potential is a prerequisite for the analysis of dynamic systems. Thus, metabolomics is a key technology for systems biology. The aim of this review is to

(a) provide an in-depth overview about metabolomic technology,
(b) explore how metabolomic networks can be connected to the underlying reaction pathway structure, and
(c) discuss the need to investigate integrative biochemical networks.     PMID:14503007

Systems Biology, Metabolomics, and Cancer Metabolism

Masaru Tomita, Kenjiro Kami
Institute for Advanced Biosciences, Keio University, Tsuruoka,  Japan; Systems Biology Program, Graduate School of Media and Governance, Keio University, Fujisawa, Japan; and Human Metabolome Technologies Inc., Tsuruoka, Japan.
Science 25 May 2012; 336(6084): 990-991   http://dx.doi.org/10.1126/science.1223066

Recent breakthroughs in cancer metabolism include

  • the identification of an alternative glycolytic pathway in proliferative cells (1) and
  • an essential role for the serine synthesis pathway in breast cancer (2).

With a data-driven approach, as opposed to the conventional hypothesis-driven approach, Jain et al. (3), in this issue on page 1040, determined that rapidly proliferating cancer cells require large amounts of the nonessential amino acid glycine, which has clear and direct implications for cancer therapy.

Metabolite Profiling Identifies a Key Role for Glycine in Rapid Cancer Cell Proliferation
Mohit Jain et al.
Science 336, 1040 (2012)

New Signaling Pathways for Hormones and Cyclic Adenosine 3′,5′-Monophosphate Action in Endocrine Cells

JoAnne S. Richards
Molec Endocrinol 1 Feb, 2001; 15(2)

The pituitary hormones ACTH, TSH, FSH, and LH

  • regulate diverse functions in endocrine cells.

Although cAMP and PKA have long been shown to mediate specific intracellular signaling events including

  • the transcription of specific genes via the CREB-CBP complex,

recent observations have indicated that

  • PKA does not account for all of the intracellular targets of cAMP.
  1. TSH stimulation of thyroid cell proliferation is not completely blocked by PKA inhibitors.
  2. TSH and FSH can stimulate PKB phosphorylation by a PKA independent but PI3-K/PDK1-dependent pathway.

An FSH inducible kinase, Sgk,

  1. has recently been shown to be a close relative of PKB.
  2. Sgk is a target of PI3-K-PDK1 pathway,

indicating that some effects previously ascribed to PKB

  • may be mediated by this inducible kinase.

The identification of novel cAMP-binding proteins

  1. exhibiting guanine nucleotide exchange (GEF) activity
    (cAMP-GEFs; Epacs)
  2. opens new doors for cAMP action that include activation of small GTPases
    1. such as Rap1a, Rap2, and possibly Ras.

These GTPases are known activators of downstream kinase cascades,

  • including p38MAPK and Erk1/2 as well as PI3-K.

Thus, FSH and TSH activation of PKB and Sgk may occur via

  • this alternative cAMP pathway that involves
  • cAMP-GEFs and
  • the activation of the PI3-K/PDK1 pathway.

Molecular Control of Immune/Inflammatory Responses: Interactions Between Nuclear Factor-κB and Steroid Receptor-Signaling Pathways

Lorraine I. McKay and John A. Cidlowski
Endocr Rev 1 Aug, 1999; 20(4)

Nuclear Factor-κB (NF-κB)

  1. NF-κB is a dimeric transcription factor
  2. The regulatory subunit IκB is an inhibitor of NF-κB
  3. Activation and function of NF-κB
  4. The transcription factor NF-κB interacts with multiple transcription factors and transcriptional co-factors
  5. Transgenic animals suggest a complex role for NF-κB family members in immunity and development

Steroid Hormones/Receptors: Glucocorticoids and the Glucocorticoid Receptor (GR)

  1. Glucocorticoid mechanism of action: the GR
  2. Glucocorticoid physiology
  3. GR/NF-κB interactions
  4. GR interacts with other transcription factors and transcriptional cofactors

NF-κB and GR Antagonism: Physiological Significance?

Interactions Between NF-κB and Other Steroid Hormone Receptors

  1. Androgen receptor (AR)
  2. Estrogen receptor (ER)
  3. Progesterone receptor (PR)

Structural Biochemistry/Cell Signaling Pathways/Endocrine System

There are many types of signaling in the endocrine system, including autocrine, paracrine, and juxtacrine signaling. Autocrine hormones act on the secreting cell itself, paracrine hormones act only on neighboring cells, and juxtacrine hormones act either on the emitting cell or on adjacent cells.

Relationship of Metabolomics to Traditional Metabolism

The traditional methodology of analytical biochemistry as it relates to metabolism is slowly and carefully being replaced by the newer and far more effective methods of the field of Metabolomics. The classic methods of metabolism cannot yield the type of data needed for the aims of systems biology and metabolic engineering, because they concentrate on

  • single pathways and only
  • minor interactions between them.

In comparison, Metabolomics is far more effective for a wide variety of systems biology concerns, such as

  • nutrigenomics and toxicology.

Previously, most effort had been concentrated on

  • proteomics and genomics

because keeping track of the entire metabolome was an extraordinarily difficult task. But as cheaper and more effective methods were developed, Metabolomics steadily became more effective than even proteomics and genomics.

The differences are strong enough to necessitate a rethinking of experimental processes and procedures and the integration of data sharing and acquisition. Even the nomenclature and terminology are undergoing an overhaul, showing just how radical a change in focus and method Metabolomics represents. This does not mean that the reductionist method is useless. Parts of the biochemical processes and the metabolic systems of organisms can be better understood through reductionism. Classical analytical biochemistry for metabolism is not being replaced; it simply has a brand new systems-orientated partner in the new and exciting biological and biochemical fields of study and application that are opening up even now.

The focus of this resource is specifically

  1. the description of Metabolism as a concept and
  2. partially the description of the classical methodology of investigating its function and predicting its actions
    1. normally and
    2. when perturbed.

It describes the classic methods of investigating and quantifying metabolism

  • as following a reductionist approach by focusing on single metabolic pathways or
  • on minor interactions between several pathways (see picture).

The methods used here often were

  • the tracking of radioactive tracers through a pathway or
  • the tracking of metabolic levels of certain key metabolites and biomarkers.

Slightly newer, pre-Metabolomics methods included using

  • genomic and proteomic data to apply holistic mathematical and statistic analysis to the metabolic systems overall. (see picture)

These methods were still less effective than Metabolomics would prove to be.




Reductionism

An approach to understanding the function and nature of a complex entity or process by reducing it to the interactions of its parts and subprocesses. wiki/Reductionism

Metabolic Network

The complete set of metabolic and physical processes that determine the physiological and biochemical properties of a cell. wiki/Metabolic_network

Radioactive Tracer

A radioactive molecule used to track the flow of molecules and atoms within a set of reactions.

Metabolic Pathway

A naming convention in biochemistry, the word pathway describes a collection of related chemical reactions that all happen in sequence. Metabolic pathways are specifically biochemical pathways of the metabolome.

Molecular Dynamics

a form of computer simulation that attempts to model the motions and interactions of atoms and molecules under the known laws of physics. In the context of this resource it was one of the methods of classical biochemistry, using the reduced aspects of chemistry to try to model the whole. wiki/Molecular_dynamics

Ontology (information science)

The representation of a set of concepts within a domain and the relationships between those concepts. wiki/Ontology_(information_science). In the context of this resource the domain is metabolic networks and the metabolome as well as the science of Metabolomics and the concepts contained within.

Controlled Vocabularies (CV’s)

Collection of terms and descriptions of concepts that are forced to follow specific rules or conventions to allow for maximum usefulness in the discourse about a field of study.

Disparate resources 

Diverse or markedly different resources. This state in resources can often be a cause of problems for data communication.

Systems Biology

The new realm of biological study that concentrates on the systematic analysis of complex interactions in biological systems. This represents a move away from reductionism in biology towards the perspective of integration.


Metabolites

The products and intermediate materials of metabolic processes.

Hypercycles (chemical) 

A self-reproducing macromolecular system in which the RNAs and enzymes cooperate (see picture). The macromolecules also cooperate to provide primitive translation abilities, which allows information to be translated into enzymes.


Metabonomics

“The quantitative measurement of the dynamic multiparametric metabolic response of living systems to pathophysiological stimuli or genetic modification.” wiki/Metabolite#Metabonomics


Nutrigenomics

The study of the relationship between nutrition and genomics, with the aim of boosting and monitoring human health. wiki/Nutrigenomics

Metabolic engineering

The optimization of the regulatory and genetic processes in a cell in order to produce certain substances more efficiently and faster. The entire context of this article revolves around making this sort of thing easier and more effective.

Holistic Approach

An approach that avoids the idea that the parts could yield an idea of what the whole would do and instead attempts to understand the function of the whole system. (gleaned from context in the article)

Hierarchical Metabolic Regulation

A set of theories that state that metabolic regulation operates in a hierarchy, that the genetic level is the first level, the protein translation level is the next level and the enzymatic regulation level is after that. It also states that complex interactions between level 2 and 3 often occur and blend the two together. (gleaned from context in the article)


Diauxie

Double growth: a description of the growth phases of a bacterial colony that is metabolizing a mixture of metabolites, usually sugars. wiki/Diauxie

Metabolomics Society Workgroups

Biological metadata workgroups are responsible for detailing the metadata of Metabolomics experiments and setting up the standards for running a Metabolomics experiment, as detailed on the Metabolomics Society webpage.

The chemical analysis workgroup’s job is to “identify, develop and disseminate best chemical analysis practices in all aspects of Metabolomics” CAWG. It’s not their job to determine how experiments should be run but to establish a set of minimal standards to follow.

The Data Processing workgroup concentrates on establishing standards for algorithms and data reporting DPWG.

The Ontology workgroup will concentrate on making the language of Metabolomics coherent and understandable as well as relevant to the sciences OWG.

The Exchange Format workgroup concentrates on the exchange of information and the format of analysis (EFGW).

The focus of this article is to describe the impact of the expansion of traditional sciences into “-omics”, a shorthand reference for a systems biology approach that expands

  • from a single function or pathway (something like genetics or metabolism) into
  • an integrated system model (like genomics and metabolomics).

It goes over specifically the advances made in each field and how those advances serve to benefit metabolic engineering overall. The article first describes

  1. the nature of the situation giving background on what we know about regulation and the hierarchy of the regulation of metabolic processes (see picture) and then
  2. goes deeper into the contributions of proteomics, systems biology, genomics and finally metabolomics (see picture).
  3. They wrap up the article discussing how this will benefit metabolic engineering more than previous techniques.

This article connects to biochemistry.


The article itself, however, suggests a move to a more systems-oriented approach in metabolomics (among other -omics), because the older methods of concentrating on single pathways and small-scale integration simply do not give the knowledge necessary to achieve the aims of metabolic engineers. This relates to our metabolomics projects and their contrast to the techniques and information we’ve learned, which follow the more traditional approach of

  • reduction of the systems to stand-alone pathways with
  • small levels of integration.

This article focuses entirely on metabolomics and whether it will be a scientific contender in the near future. It initially describes the history of metabolomics and how it fits into the entire scheme of biological investigation and prediction for systems biology (see picture), as well as the past difficulties of working in this relatively new field. Because the number of metabolites that must be tracked at once is so high, the sciences previously put more energy into proteomics and genomics. However, the new techniques being used are high-throughput and cheap. As a result, metabolomics has easily surpassed past methods of investigating metabolism and is beginning to surpass proteomics and genomics as well.

The article describes several major success stories for Metabolomics including comparisons of silent phenotypes in yeast, a high throughput diagnosis of

  • coronary artery disease, and
  • monitoring gene therapy in Duchenne Muscular Dystrophy

among several others. These successes contrast with previous investigations of simple metabolism mostly because of their higher level of application. Metabolomics is simply capable of a far greater effect on the application of biochemistry than the original reductionist approaches to metabolism.

The article also discusses the sheer volume of data that needs to be cataloged and measured before full effectiveness is reached, and how

  • cross correlations between Metabolomics and other “-omics” technologies can have major mutual benefits.

Metabolomics is an effective

  • rapid phenotyping tool for mutant tracking in genomics and can
  • speed up the data acquisition in many genomics investigations
  • as well as giving a more accurate view (see picture).

The article also discusses in slightly less detail the need for powerful databases, and notes that the technology and methods already exist to create and populate these data storage and manipulation tools. The article proceeds to point out the need for new and more powerful analysis technology, given the sheer amount of data that must be acquired. New software is especially needed to manipulate and analyze the data as it comes in. The article concludes by stating the great potential metabolomics has both

  • in working with other “-omics” and
  • in revolutionizing metabolic profiling

but states that metabolomics needs to carefully consider many different factors to get its foot in the door, especially in terms of metadata.

The focus of this article is to describe the issues surrounding previous metabolic profiling approaches that centered on reductionist pathway analysis. It points out the shortcomings of attempts to draw genome-scale metabolic networks using the typical pathway methods.

The article is a useful view into the methodology of traditional metabolism. For instance, it describes in the background how many biochemists would study one particular pathway, like glycolysis, without taking into account other seemingly unrelated pathways that could interact with it. The article cites the usefulness of having large-scale representations of the metabolic profile, which allow a scientist to track perturbations of the metabolic system in multiple locations, thereby boosting the efficiency and accuracy of metabolic investigation.

The article also discusses the issues with overlapping nodes and proposes a system in which the concentration and focus of the metabolic profile and drawing may be chosen by the individual using it, eliminating overlapping nodes while avoiding the loss of necessary data and context. The authors propose a software system using several algorithms to draw the metabolic maps more effectively. Several of these test maps are shown (see picture).

The article suggests modeling the data with mixed bipartite graphs (see picture) and using multiscale clustering in the drawing algorithm to help group the drawing in a way that can be tracked visually and easily without data loss (see picture). The drawing method also draws metanodes to further enhance visualization, with a recursive algorithm that draws the subgraphs from the most nested to the least nested (see picture).
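The bipartite-plus-metanode idea can be sketched in a few lines of Python. This is a toy data structure, not the authors’ software: the pathway and metabolite names are hypothetical placeholders, and a real tool would add geometric layout and rendering on top of the recursive traversal shown here.

```python
# Minimal sketch of a mixed bipartite graph with metanodes.
# Metabolite and reaction nodes form the two vertex classes;
# metanodes are nested groups drawn recursively, innermost
# first, so clustering never discards the underlying nodes.

from dataclasses import dataclass, field

@dataclass
class Metanode:
    name: str
    metabolites: list = field(default_factory=list)  # vertex class 1
    reactions: list = field(default_factory=list)    # vertex class 2
    children: list = field(default_factory=list)     # nested metanodes

def render(node, depth=0, out=None):
    """Recursively emit a textual drawing, most-nested first."""
    if out is None:
        out = []
    for child in node.children:          # draw children before self
        render(child, depth + 1, out)
    pad = "  " * depth
    out.append(f"{pad}[{node.name}] "
               f"{len(node.metabolites)} metabolites, "
               f"{len(node.reactions)} reactions")
    return out

glycolysis = Metanode("glycolysis",
                      ["glucose", "G6P", "pyruvate"],
                      ["hexokinase", "pyruvate kinase"])
tca = Metanode("TCA cycle",
               ["citrate", "fumarate"], ["citrate synthase"])
central = Metanode("central metabolism", children=[glycolysis, tca])

for line in render(central):
    print(line)
```

Keeping metabolites and reactions as separate vertex classes is what makes the graph bipartite; collapsing a metanode hides its members visually without deleting them from the model.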

The article tested the software and methods and compared the drawing to other methodology tracking whether the drawing method was more or less accurate and whether it was easier or more difficult to read.




The Cinderella story of metabolic profiling: does metabolomics get to go to the functional genomics ball?

Metabolic network visualization eliminating node redundance and preserving metabolic pathways


2 Metabolites

2.1 Metabolites and their pathways

2.1.1 KEGG Pathways

2.1.2 MetaCyc

2.1.3 The Human Metabolome Database

2.1.4 Institute for Analytical Sciences


Guanosine Monophosphate (GMP)


Guanosine monophosphate structure



Researchers have utilized chemical proteomics in order to identify the novel target molecules of cyclic guanosine monophosphate (cGMP), with the intention of obtaining a better understanding of the cGMP pathway. Experiments were conducted on cGMP that had been immobilized onto agarose beads with linkers directed at three different cGMP positions. The employment of agarose beads allowed for maximum accessibility of cGMP to its binding partners.

Using a pull-down assay with the beads as bait on tissue lysates, nine proteins were identified via Matrix-Assisted Laser Desorption/Ionization Time-of-Flight (MALDI-TOF) mass spectrometry. A portion of these proteins consisted of previously identified cGMP targets, which included

  • cGMP-dependent protein kinase and
  • cGMP-stimulated phosphodiesterase.

Competition binding assays determined that protein interactions occurred by

  • specific binding of cGMP
  • into the binding pockets of its target proteins,
  • and were also highly stereospecific for cGMP

over other nucleotides. MAPK1 was confirmed

  • as one of the identified target proteins

via immunoblotting with an anti-MAPK1 antibody. Further evidence was provided by observing the

  • stimulation of mitogen-activated protein kinase 1 signaling
  • by membrane-permeable cGMP,

in the treated cells. Further research in the field of proteomics is expected to yield more efficient tools and techniques applicable to the identification and analysis of bioactive molecules and their target proteins.

cGMP binding protein isolation revealed that

  • the brain tissue samples had a higher concentration of cGMP binding proteins
  • than did the heart or liver tissue samples.

This observation implied that there is a

  • more diverse cGMP signal transduction role in the brain than in the heart or liver.

In addition, an increase of MAPK phosphorylation was discovered via immunoblotting with an anti-phospho MAPK antibody. Researchers have determined that

  • direct interactions occur between cGMP binding proteins and cGMP.

The binding proteins are also strongly believed to be regulated by the concentration of cellular cGMP.




Nucleotide Metabolism


This resource provides a very comprehensive overview of multiple aspects of nucleotide metabolism. These include

  • biosynthesis,
  • catabolism,
  • salvage pathways, and
  • regulation as well as
  • clinical significance of both purine and pyrimidine nucleotides.

Regulation of deoxyribonucleotides (dNTPs) and interconversion of nucleotides are also discussed.

An advantage to this website is that mechanisms are displayed pictorially to make it easier to follow and understand the movement of electrons, bonds, charge, molecules and substituents in these complicated pathways.

When analyzing the mechanism for purine nucleotide biosynthesis, there are many common metabolic features present, which we’ve discussed throughout the quarter.
Purine nucleotides are built directly upon a sugar, ribose 5-phosphate (activated as PRPP).

In the first step, catalyzed by glutamine-PRPP amidotransferase, glutamine acts as a source of ammonia and PPi (inorganic pyrophosphate) is released. The release of this PPi can lead to its cleavage to form two inorganic phosphates. The cleavage of this phosphoanhydride bond provides energy to drive reactions forward.
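The pull provided by PPi cleavage can be made concrete with a back-of-the-envelope free-energy calculation. The +5 kJ/mol assigned to the transfer step below is a hypothetical illustrative value; roughly −19 kJ/mol for PPi hydrolysis is a common textbook figure.

```python
# Rough sketch of why PPi cleavage pulls a reaction forward:
# coupling adds standard free energies, and the equilibrium
# constant depends exponentially on the total.
import math

R = 8.314e-3   # gas constant, kJ/(mol*K)
T = 310.0      # K, roughly body temperature

def keq(dg):
    """Equilibrium constant from a standard free energy change."""
    return math.exp(-dg / (R * T))

dg_transfer = +5.0    # hypothetical near-equilibrium step
dg_ppi      = -19.0   # PPi -> 2 Pi (textbook ballpark)
dg_coupled  = dg_transfer + dg_ppi

print(f"K_eq alone:   {keq(dg_transfer):.2f}")   # below 1, unfavorable
print(f"K_eq coupled: {keq(dg_coupled):.0f}")    # well above 1, favorable
```

The point is qualitative: a mildly unfavorable step becomes strongly favorable once the released PPi is committed to hydrolysis.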

In steps two, four and five, ATP, an activated molecule, is used for energy. In the third and ninth steps, tetrahydrofolate, a cofactor, acts to perform one-carbon transfers at intermediate oxidation levels.

Glutamine is used again in the fourth step as a source of ammonia. Step six is a carboxylation reaction, and it is very unusual in that the cofactor biotin is not utilized; most other carboxylation reactions are biotin dependent.

The fumarate produced in step eight can be used to replenish citric acid cycle intermediates, meaning that purine nucleotide synthesis acts as an anaplerotic reaction.

Targets of Natural Compounds Vs. Targets of Chemotherapy Drugs


Cancer cells that receive a constant stream of proliferation signals keep dividing uncontrollably, but if not bombarded with these signals they will enter apoptosis.

This resource discusses the differences between the targets of natural compounds and those of chemotherapy drugs, each of which reduces the flow of proliferation signals to a cell in order to prevent cancer. These drugs specifically target the structure of nucleotides and their integrity within DNA, as well as enzymes that participate in the synthesis phase, such as DNA polymerase and topoisomerase, in order to prevent completion of the cell cycle.  Chemotherapeutic agents act by inhibiting enzymes in the nucleotide biosynthesis pathway, because cancer cells have a greater requirement for nucleotides as DNA precursors. Glutamine analogs such as azaserine and acivicin inhibit glutamine amidotransferase, making it impossible for glutamine to act as a nitrogen donor.

Purine and Pyrimidine Metabolism Disorders


Under normal conditions, nucleotides act as components of cellular energy systems, signaling, and DNA and RNA production. However, when a defective enzyme malfunctions and compounds accumulate in blood, urine, or tissues, the result can be disease states that severely affect people and their everyday lives. This resource discusses several disorders of nucleotide metabolism, including disorders of purine salvage, purine nucleotide synthesis, purine catabolism, and pyrimidine metabolism. Not only is the nature of several deficiencies discussed, but diagnosis as well as possible treatment and diet adjustments are mentioned.

  1. Lesch-Nyhan syndrome is a disorder of purine salvage and results from a deficiency of the hypoxanthine-guanine phosphoribosyltransferase (HPRT) enzyme, which normally drives the salvage pathway for hypoxanthine and guanine; the deficiency leads to uric acid overproduction.
  2. Adenosine deaminase deficiency is a disorder of purine catabolism that results in accumulation of adenosine, due to the inability of the enzyme to convert adenosine and deoxyadenosine to inosine and deoxyinosine.
  3. High levels of adenosine cause an increase in the levels of ATP and dATP, and the latter inhibits ribonucleotide reductase, causing underproduction of the other deoxyribonucleotides and compromising DNA replication. Immune cells are especially sensitive to this, and the deficiency causes Severe Combined Immunodeficiency.
  4. Xanthine oxidase deficiency is a disorder of purine catabolism in which xanthine builds up because the enzyme cannot produce uric acid from xanthine and hypoxanthine.


Article #1: Enhanced Activity of the Purine Nucleotide Cycle of the Exercising Muscle in Patients with Hyperthyroidism



Article #2: Hypoxanthine-guanine phosphoribosyltransferase (HPRT) deficiency: Lesch-Nyhan syndrome



Article #3: Anaplerotic processes in human skeletal muscle during brief dynamic exercise



Salvage pathways of purine and pyrimidine nucleotides 



Salvage pathways of pyrimidine ribonucleotides 



Salvage pathways of pyrimidine deoxyribonucleotides 



Read Full Post »