
Will Web 3.0 Do Away With Science 2.0? Is Science Falling Behind?

Curator: Stephen J. Williams, Ph.D.

UPDATED 4/06/2022

A while back (actually many moons ago) I put up two posts on this site:

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson and others

Twitter is Becoming a Powerful Tool in Science and Medicine

Each of these posts was about the importance of scientific curation of findings within the realm of social media and Web 2.0, a sub-environment known throughout the scientific community as Science 2.0, in which expert networks collaborate to produce a massive new corpus of knowledge by sharing their views and insights on peer-reviewed scientific findings. Through this new medium, the process of curation would itself generate new ideas and new directions for research and discovery.

The platform sort of looked like the image below:

 

This system lay above the platform of the original Science 1.0, made up of all the scientific journals, books, and traditional literature:

In the old Science 1.0 format, scientific dissemination took the form of hard-print journals, and library subscriptions were mandatory (and eventually expensive). Open Access has tried to ameliorate the expense problem.

Previous image source: PeerJ.com

To index the massive volume of research and papers beyond the old Dewey Decimal system, a process of curation was mandatory. Disseminating this curated content was a natural fit for the new social media, but the cost had to be spread among numerous players. Journals, faced with the high cost of subscriptions, found that their only way into this new medium was to become Open Access, a movement first sparked by journals like PLOS and PeerJ and then begrudgingly adopted throughout the landscape. But with any movement or new adoption one gets the Good, the Bad, and the Ugly (as described in the Clive Thompson article cited above). The downsides of Open Access journals were:

  1. costs are still assumed by the individual researcher, not by the journals
  2. the rise of numerous predatory journals

 

Even PeerJ, in a column celebrating a year’s worth of Open Access success stories, lamented the key issues still facing Open Access in practice, which included the cost and the rise of predatory journals.

In essence, Open Access and Science 2.0 sprang into full force BEFORE anyone had thought of a way to defray the costs.

 

Can Web 3.0 Finally Offer a Way to Address the Issues Facing the High Costs of Scientific Publishing?

What is Web 3.0?

From Wikipedia: https://en.wikipedia.org/wiki/Web3

Web 1.0 and Web 2.0 refer to eras in the history of the Internet as it evolved through various technologies and formats. Web 1.0 refers roughly to the period from 1991 to 2004, where most websites were static webpages, and the vast majority of users were consumers, not producers, of content.[6][7] Web 2.0 is based around the idea of “the web as platform”,[8] and centers on user-created content uploaded to social-networking services, blogs, and wikis, among other services.[9] Web 2.0 is generally considered to have begun around 2004, and continues to the current day.[8][10][4]

Terminology

The term “Web3”, specifically “Web 3.0”, was coined by Ethereum co-founder Gavin Wood in 2014.[1] In 2020 and 2021, the idea of Web3 gained popularity[citation needed]. Particular interest spiked towards the end of 2021, largely due to interest from cryptocurrency enthusiasts and investments from high-profile technologists and companies.[4][5] Executives from venture capital firm Andreessen Horowitz travelled to Washington, D.C. in October 2021 to lobby for the idea as a potential solution to questions about Internet regulation with which policymakers have been grappling.[11]

Web3 is distinct from Tim Berners-Lee‘s 1999 concept for a semantic web, which has also been called “Web 3.0”.[12] Some writers referring to the decentralized concept usually known as “Web3” have used the terminology “Web 3.0”, leading to some confusion between the two concepts.[2][3] Furthermore, some visions of Web3 also incorporate ideas relating to the semantic web.[13][14]

Concept

Web3 revolves around the idea of decentralization, which proponents often contrast with Web 2.0, wherein large amounts of the web’s data and content are centralized in the fairly small group of companies often referred to as Big Tech.[4]

Specific visions for Web3 differ, but all are heavily based in blockchain technologies, such as various cryptocurrencies and non-fungible tokens (NFTs).[4] Bloomberg described Web3 as an idea that “would build financial assets, in the form of tokens, into the inner workings of almost anything you do online”.[15] Some visions are based around the concepts of decentralized autonomous organizations (DAOs).[16] Decentralized finance (DeFi) is another key concept; in it, users exchange currency without bank or government involvement.[4] Self-sovereign identity allows users to identify themselves without relying on an authentication system such as OAuth, in which a trusted party has to be reached in order to assess identity.[17]

Reception

Technologists and journalists have described Web3 as a possible solution to concerns about the over-centralization of the web in a few “Big Tech” companies.[4][11] Some have expressed the notion that Web3 could improve data security, scalability, and privacy beyond what is currently possible with Web 2.0 platforms.[14] Bloomberg states that sceptics say the idea “is a long way from proving its use beyond niche applications, many of them tools aimed at crypto traders”.[15] The New York Times reported that several investors are betting $27 billion that Web3 “is the future of the internet”.[18][19]

Some companies, including Reddit and Discord, have explored incorporating Web3 technologies into their platforms in late 2021.[4][20] After heavy user backlash, Discord later announced they had no plans to integrate such technologies.[21] The company’s CEO, Jason Citron, tweeted a screenshot suggesting it might be exploring integrating Web3 into their platform. This led some to cancel their paid subscriptions over their distaste for NFTs, and others expressed concerns that such a change might increase the amount of scams and spam they had already experienced on crypto-related Discord servers.[20] Two days later, Citron tweeted that the company had no plans to integrate Web3 technologies into their platform, and said that it was an internal-only concept that had been developed in a company-wide hackathon.[21]

Some legal scholars quoted by The Conversation have expressed concerns over the difficulty of regulating a decentralized web, which they reported might make it more difficult to prevent cybercrime, online harassment, hate speech, and the dissemination of child abuse images.[13] But, the news website also states that, “[decentralized web] represents the cyber-libertarian views and hopes of the past that the internet can empower ordinary people by breaking down existing power structures.” Some other critics of Web3 see the concept as a part of a cryptocurrency bubble, or as an extension of blockchain-based trends that they see as overhyped or harmful, particularly NFTs.[20] Some critics have raised concerns about the environmental impact of cryptocurrencies and NFTs. Others have expressed beliefs that Web3 and the associated technologies are a pyramid scheme.[5]

Kevin Werbach, author of The Blockchain and the New Architecture of Trust,[22] said that “many so-called ‘web3’ solutions are not as decentralized as they seem, while others have yet to show they are scalable, secure and accessible enough for the mass market”, adding that this “may change, but it’s not a given that all these limitations will be overcome”.[23]

David Gerard, author of Attack of the 50 Foot Blockchain,[24] told The Register that “web3 is a marketing buzzword with no technical meaning. It’s a melange of cryptocurrencies, smart contracts with nigh-magical abilities, and NFTs just because they think they can sell some monkeys to morons”.[25]
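Before moving from the encyclopedia entry to the market coverage, one concept above, self-sovereign identity, is worth making concrete. Below is a minimal, hypothetical Python sketch (using the third-party cryptography package, a choice made purely for illustration) of a challenge-response check in which a user proves control of a key pair and no trusted OAuth-style intermediary is consulted. It illustrates the general principle only, not any particular Web3 implementation.

```python
# Hypothetical sketch of self-sovereign identity: prove control of a key pair
# by signing a random challenge; no central authentication service is involved.
# Requires the third-party "cryptography" package (pip install cryptography).
import os

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The user holds a private key; the matching public key acts as their identifier.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# A service wishing to authenticate the user issues a random challenge ...
challenge = os.urandom(32)

# ... the user signs the challenge with the private key ...
signature = private_key.sign(challenge)

# ... and the service verifies the signature against the public key alone.
try:
    public_key.verify(signature, challenge)
    print("Identity proven: the signer controls the claimed public key.")
except InvalidSignature:
    print("Verification failed: signature does not match.")
```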

Below is an article from MarketWatch.com’s Distributed Ledger series about the different blockchain platforms and cryptocurrencies involved.

From Marketwatch: https://www.marketwatch.com/story/bitcoin-is-so-2021-heres-why-some-institutions-are-set-to-bypass-the-no-1-crypto-and-invest-in-ethereum-other-blockchains-next-year-11639690654?mod=home-page

by Frances Yue, Editor of Distributed Ledger, Marketwatch.com

Clayton Gardner, co-CEO of crypto investment management firm Titan, told Distributed Ledger that as crypto embraces broader adoption, he expects more institutions to bypass bitcoin in 2022 and invest in other blockchains, such as Ethereum, Avalanche, and Terra, which all boast smart-contract features.

Bitcoin traditionally did not support complex smart contracts, which are computer programs stored on blockchains, though a major upgrade in November might have unlocked more potential.

“Bitcoin was originally seen as a macro speculative asset by many funds and for many it still is,” Gardner said. “If anything solidifies its use case, it’s a store of value. It’s not really used as originally intended, perhaps from a medium of exchange perspective.”

For institutions that are looking for blockchains that can “produce utility and some intrinsic value over time,” they might consider some other smart contract blockchains that have been driving the growth of decentralized finance and web 3.0, the third generation of the Internet, according to Gardner. 

“Bitcoin is still one of the most secure blockchains, but I think layer-one, layer-two blockchains beyond Bitcoin will handle the majority of transactions and activities from NFT (nonfungible tokens) to DeFi,“ Gardner said. “So I think institutions see that and insofar as they want to put capital to work in the coming months, I think that could be where they just pump the capital.”

Decentralized social media? 

The price of Decentralized Social, or DeSo, a cryptocurrency powering a blockchain that supports decentralized social media applications, surged roughly 74% to about $164 from $94, after Deso was listed at Coinbase Pro on Monday, before it fell to about $95, according to CoinGecko.

In the eyes of Nader Al-Naji, head of the DeSo foundation, decentralized social media has the potential to be “a lot bigger” than decentralized finance.

“Today there are only a few companies that control most of what we see online,” Al-Naji told Distributed Ledger in an interview. But DeSo is “creating a lot of new ways for creators to make money,” Al-Naji said.

“If you find a creator when they’re small, or an influencer, you can invest in that, and then if they become bigger and more popular, you make money and they get capital early on to produce their creative work,” according to Al-Naji.

BitClout, the first application created by Al-Naji and his team on the DeSo blockchain, had initially drawn controversy, as some found that they had profiles on the platform without their consent, while the application’s users were buying and selling tokens representing their identities. Such tokens are called “creator coins.”

Al-Naji responded to the controversy, saying that DeSo now supports more than 200 social-media applications, including BitClout. “I think that if you don’t like those features, you now have the freedom to use any app you want. Some apps don’t have that functionality at all.”

 

But before I get to the “selling monkeys to morons” quote,

I want to talk about

THE GOOD, THE BAD, AND THE UGLY

THE GOOD

My foray into Science 2.0, and my pondering of what a movement into a Science 3.0 might mean, led me to an article by Dr. Vladimir Teif, who studies gene regulation and the nucleosome, and who has also created a worldwide group of scientists who discuss chromatin and gene regulation in a journal-club format.

For more information on this Fragile Nucleosome journal club see https://generegulation.org/fragile-nucleosome/.

Fragile Nucleosome is an international community of scientists interested in chromatin and gene regulation. Fragile Nucleosome is active in several spaces: one is the Discord server where several hundred scientists chat informally on scientific matters. You can join the Fragile Nucleosome Discord server. Another activity of the group is the organization of weekly virtual seminars on Zoom. Our webinars are usually conducted on Wednesdays 9am Pacific time (5pm UK, 6pm Central Europe). Most previous seminars have been recorded and can be viewed at our YouTube channel. The schedule of upcoming webinars is shown below. Our third activity is the organization of weekly journal clubs detailed at a separate page (Fragile Nucleosome Journal Club).

 

His lab site is at https://generegulation.org/, and he has published a paper describing what he felt the #science2_0 to #science3_0 transition would look like (see his blog page on this at https://generegulation.org/open-science/).

He had coined this concept of Science 3.0 back in 2009. As Dr. Teif mentioned:

So essentially I first introduced this word Science 3.0 in 2009, and since then we did a lot to implement this in practice. The Twitter account @generegulation is also one of examples

 

This is curious, as we still have an ill-defined concept of what #science3_0 would look like, but it is a good read nonetheless.

His paper, entitled “Science 3.0: Corrections to the Science 2.0 paradigm”, is on the Cornell preprint server (arXiv) at https://arxiv.org/abs/1301.2522

 

Abstract

Science 3.0: Corrections to the Science 2.0 paradigm

The concept of Science 2.0 was introduced almost a decade ago to describe the new generation of online-based tools for researchers allowing easier data sharing, collaboration and publishing. Although technically sound, the concept still does not work as expected. Here we provide a systematic line of arguments to modify the concept of Science 2.0, making it more consistent with the spirit and traditions of science and Internet. Our first correction to the Science 2.0 paradigm concerns the open-access publication models charging fees to the authors. As discussed elsewhere, we show that the monopoly of such publishing models increases biases and inequalities in the representation of scientific ideas based on the author’s income. Our second correction concerns post-publication comments online, which are all essentially non-anonymous in the current Science 2.0 paradigm. We conclude that scientific post-publication discussions require special anonymization systems. We further analyze the reasons of the failure of the current post-publication peer-review models and suggest what needs to be changed in Science 3.0 to convert Internet into a large journal club. [bold face added]
In this paper it is important to note the transition from Science 1.0, which involved hard-copy journal publications usually accessible only in libraries, to a more digital 2.0 format in which data, papers, and ideas could be easily shared among networks of scientists.
As Dr. Teif states, the term “Science 2.0” had been coined back in 2009, and several influential journals, including Science, Nature, and Scientific American, endorsed the term and encouraged scientists to move their discussions online. However, even though there are now thousands of scientists on Science 2.0 platforms, Dr. Teif notes that membership in many Science 2.0 networking groups, such as those on LinkedIn and ResearchGate, has seemingly saturated over the years, with few new members in recent times.
The consensus is that Science 2.0 networking is:
  1. good because it multiplies the efforts of many scientists, including experts, and adds scientific discourse that was unavailable in the 1.0 format
  2. that online data sharing is good because it assists in the process of discovery (as is evident with preprint servers, bio-curated databases, and GitHub projects)
  3. that open-access publishing is beneficial because it provides free access to professional articles, and open access may become the only publishing format in the future (although this is highly debatable, as many journals are holding on to a type of “hybrid open access format” which is not truly open access)
  4. that sharing of unfinished works, critiques, or opinions is good because it creates visibility for scientists, who can receive credit for their expert commentary

There are a few concerns about Science 3.0 that Dr. Teif articulates:

A.  Science 3.0 Still Needs Peer Review

Peer review of scientific findings will always be imperative in the dissemination of well-done, properly controlled scientific discovery. Just as Science 2.0 relies on an army of scientific volunteers, the peer review process relies on an army of scientific experts who give their time to safeguard the credibility of science by ensuring that findings are reliable and data are presented fairly and properly. It has been very evident, in this time of pandemic and the rapid increase in the volume of preprint-server papers on SARS-CoV-2, that peer review is critical. Many of the papers on such preprint servers were later either retracted or failed a stringent peer review process.

Many journals of the 1.0 format do not generally reward their peer reviewers other than the credit that researchers can claim on their curricula vitae. Some journals, like the MDPI journal family, do issue peer-reviewer credits, which can be used to defray the high publication costs of open access (one area that many scientists lament about the open access movement, where the burden of publication cost lies on the individual researcher).

Another issue highlighted is the potential for INFORMATION NOISE arising from the ability to self-publish on Science 2.0 platforms.

 

The NEW BREED was born in 4/2012

An ongoing effort on this platform, https://pharmaceuticalintelligence.com/, is to establish a scientific methodology for curating scientific findings, where one of the goals is to help quell the information noise that can result from the massive amount of new informatics and data appearing in the biomedical literature.
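As a rough illustration of what one small piece of such a curation methodology could look like in software, here is a minimal, hypothetical Python sketch (using scikit-learn, an assumption made for illustration rather than anything this platform actually runs) that flags near-duplicate abstracts so a curator can merge redundant reports and reduce information noise. The abstracts are invented for the example.

```python
# Hypothetical sketch: flag near-duplicate abstracts so a human curator can
# collapse redundant reports into a single curated entry.
# Requires scikit-learn (pip install scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented abstracts standing in for a real literature feed.
abstracts = [
    "AMPK is a negative regulator of the Warburg effect and suppresses tumor growth in vivo.",
    "AMPK acts as a negative regulator of the Warburg effect in tumor cells.",
    "Ultrasound-based screening improves early detection of ovarian cancer.",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
similarity = cosine_similarity(vectors)

THRESHOLD = 0.3  # tune per corpus; higher values demand closer matches
for i in range(len(abstracts)):
    for j in range(i + 1, len(abstracts)):
        if similarity[i, j] >= THRESHOLD:
            print(f"Possible duplicate topic: abstracts {i} and {j} "
                  f"(similarity {similarity[i, j]:.2f})")
```

In practice the similarity score only narrows the search; a curator, not the script, decides whether two flagged reports really describe the same finding.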

B.  Open Access Publishing Model leads to biases and inequalities in the idea selection

The open access publishing model has been compared to the model applied by the advertising industry years ago, in which publishers considered the journal articles to be “advertisements.” However, NOTHING could be further from the truth. In advertising, the companies, not the consumer, pay for the ads. In scientific open access publishing, although the consumer (libraries) does not pay for access, the burden of BOTH the cost of doing the research and the cost of publishing the findings is now placed on the individual researcher. Some of these publishing costs can be as high as $4,000 USD per article, which is very high for most researchers. Many universities try to reimburse these publisher fees when their researchers publish open access, so the cost still falls on the consumer (the institution) and the individual researcher, limiting the savings to either.

This sets up a situation in which young researchers, who in general are not well funded, struggle with publication costs, creating a biased, inequitable system that rewards well-funded senior researchers and bigger academic labs.

C. Post publication comments and discussion require online hubs and anonymization systems

Many recent publications stress the importance of a post-publication review process or system, yet although many big journals like Nature and Science have their own blogs and commentary systems, these are rarely used. In fact, the data show roughly one comment per 100 views of a journal article on these systems. In traditional journals, editors are the referees of comments and have the ability to censor comments or discourse. The article laments that commenting on journal articles should be as easy as commenting on other social sites, yet scientists are still not offering their comments or opinions.

In a personal experience,

a well-written commentary goes through editors, who often reject a comment as if they were rejecting an original research article. Thus many scientists, I believe, after fashioning a well-researched and well-referenced reply, find it never sees the light of day if it is not in the editor’s interest.

Therefore anonymity is greatly needed, and its lack may be the hindrance that keeps scientific discourse so limited on these types of Science 2.0 platforms. Platforms that have had success in this arena include anonymous ones like Wikipedia and certain closed LinkedIn professional groups, whereas more open platforms like Google Knowledge have been failures.

A great example on this platform was a very spirited LinkedIn conversation on genomics, tumor heterogeneity, and personalized medicine, which we curated from the LinkedIn discussion (unfortunately LinkedIn has since closed many groups), seen here:

Issues in Personalized Medicine: Discussions of Intratumor Heterogeneity from the Oncology Pharma forum on LinkedIn

In this discussion, it was surprising that, over a single weekend, so many scientists from all over the world contributed to a great exchange on the topic of tumor heterogeneity.

But many feel such discussions would be safer if they were anonymized. However, researchers would then not get any credit for their opinions or commentaries.

A major problem is how to take these intangible contributions and turn them into tangible assets that would both promote the discourse and reward those who take the time to improve scientific discussion.

This is where something like NFTs or a decentralized network may become important!

See

https://pharmaceuticalintelligence.com/portfolio-of-ip-assets/
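To make the idea of turning an intangible contribution into a tangible, creditable asset slightly more concrete, here is a minimal, hypothetical Python sketch of a hash-chained record for a scientific comment. Real NFT and blockchain systems add consensus, wallets, and smart contracts far beyond this toy, so treat it only as an illustration of a tamper-evident credit record; every name in it is invented for the example.

```python
# Hypothetical sketch: record a scientific contribution as a hash-chained entry
# so the author has a stable, tamper-evident identifier to be credited against.
import hashlib
import json
import time

ledger = []  # in-memory stand-in for a shared, decentralized ledger


def record_contribution(author: str, text: str) -> dict:
    """Append a hash-chained record crediting `author` for `text`."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {
        "author": author,
        "content_hash": hashlib.sha256(text.encode()).hexdigest(),
        "timestamp": time.time(),
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(entry)
    return entry


record = record_contribution("A. Scientist", "Commentary on intratumor heterogeneity ...")
print(record["hash"])  # identifier the commenter could be credited (or rewarded) against
```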

 

UPDATED 5/09/2022

Below is an online @TwitterSpace discussion we had with some young scientists who are just starting out and who gave their thoughts on what SCIENCE 3.0 and the future of the dissemination of science might look like in light of this new metaverse. However, we have to define each of these terms in light of science, and not treat the Internet as merely a decentralized marketplace for commonly held goods.

This online discussion was tweeted out and got a fair number of impressions (60) as well as interactions (50).

For the recording, on Twitter as well as in audio format, please see below.

Tweet by Stephen J Williams (@StephenJWillia2), April 28, 2022: “Set a reminder for my upcoming Space! https://t.co/7mOpScZfGN @Pharma_BI @PSMTempleU #science3_0 @science2_0” (https://twitter.com/StephenJWillia2/status/1519776668176502792)

 

 

To introduce this discussion, here first is some starting material to frame the discourse.

The Internet and the Web are rapidly adopting a new “Web 3.0” format, with decentralized networks, enhanced virtual experiences, and greater interconnection between people. Here we start the discussion of what the move from Science 2.0, where the dissemination of scientific findings was revolutionized by piggybacking on Web 2.0 and social media, to a Science 3.0 format will look like. What will it involve, and which paradigms will be turned upside down?

Old Science 1.0 is still the backbone of all scientific discourse, built on the massive body of experimental and review literature. However, this literature was in analog format, and we have since moved to a more accessible digital, open access format for both publications and raw data. Whereas 1.0 had a structure, like the Dewey Decimal System and indexing, 2.0 made science more accessible and easier to search thanks to the newer digital formats. Yet both needed an organizing structure: for 1.0 that was the scientific method of data and literature organization, with libraries as the indexers; in 2.0 it relied mostly on an army of volunteers who did not have much in the way of incentivization to co-curate and organize the findings and the massive literature.

Each version of Science has its caveats: its benefits as well as its deficiencies. This curation and the ongoing discussion are meant to solidify the basis for the new format, along with definitions and a determination of structure.

We had high hopes for Science 2.0, in particular the smashing of data and knowledge silos. However, the digital age, along with 2.0 platforms, seemed to exacerbate the silo problem somehow. We are still critically short on analysis!

 

We really need people and organizations to get on top of this new Web 3.0, or metaverse, so that similar issues do not get in the way: namely, we need to create an organizing structure (maybe as knowledgebases), we need INCENTIVIZED co-curators, and we need ANALYSIS… lots of it!
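As one sketch of what such an organizing structure with incentivized co-curators might minimally look like, here is a hypothetical Python example of a small knowledgebase that both files curated findings under tags and tallies credit for the curator who contributed each one. The class and field names are illustrative assumptions, not a description of any existing system.

```python
# Hypothetical sketch: a tiny knowledgebase that organizes curated findings by
# tag and keeps a running credit tally for co-curators (the incentive hook).
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class CuratedFinding:
    title: str
    source_url: str
    summary: str
    curator: str
    tags: list[str] = field(default_factory=list)


@dataclass
class KnowledgeBase:
    findings: list[CuratedFinding] = field(default_factory=list)
    credits: Counter = field(default_factory=Counter)

    def add(self, finding: CuratedFinding) -> None:
        """Store a finding and credit its curator."""
        self.findings.append(finding)
        self.credits[finding.curator] += 1

    def by_tag(self, tag: str) -> list[CuratedFinding]:
        """Organizing structure: retrieve findings filed under a given tag."""
        return [f for f in self.findings if tag in f.tags]


kb = KnowledgeBase()
kb.add(CuratedFinding("Warburg effect review", "https://example.org/1",
                      "Summary of metabolic reprogramming in tumors.",
                      curator="LHB", tags=["cancer", "metabolism"]))
print(kb.by_tag("cancer")[0].title, "| curator credits:", dict(kb.credits))
```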

Are these new technologies the cure, or are they just another headache?

 

There were a few overarching themes, whether one was talking about AI, NLP, virtual reality, or other new technologies with respect to this new metaverse, and a consensus around Decentralized, Incentivized, and Integrated was commonly expressed among the attendees.

The following are some slides from representative presentations:

[Representative presentation slides]
Other articles of note on this topic in this Open Access Scientific Journal include:

Electronic Scientific AGORA: Comment Exchanges by Global Scientists on Articles published in the Open Access Journal @pharmaceuticalintelligence.com – Four Case Studies

eScientific Publishing a Case in Point: Evolution of Platform Architecture Methodologies and of Intellectual Property Development (Content Creation by Curation) Business Model 

e-Scientific Publishing: The Competitive Advantage of a Powerhouse for Curation of Scientific Findings and Methodology Development for e-Scientific Publishing – LPBI Group, A Case in Point

@PharmaceuticalIntelligence.com –  A Case Study on the LEADER in Curation of Scientific Findings

Real Time Coverage @BIOConvention #BIO2019: Falling in Love with Science: Championing Science for Everyone, Everywhere

Old Industrial Revolution Paradigm of Education Needs to End: How Scientific Curation Can Transform Education

 

Read Full Post »

Article Title, Author/Curator’s Name and Article Views >1,000, 4/2012 – 1/2019 @pharmaceuticalintelligence.com

 

Reporter: Aviva Lev-Ari, PhD, RN

 

Expert, Author, Writer’s Initials | Name & Bio | Roles @LPBI Group

LHB Larry Bernstein, MD, FACP,

 

Member of the Board

Expert, Author, Writer – All Specialties of Medicine & Pathology

Content Consultant to Series B,C,D,E

Editor, Series D, Vol. 1, Series E, Vols 2,3,

Co-Editor – BioMed E-Series 13 of the 16 Vols

JDP Justin D. Pearlman, AB, MD, ME, PhD, MA, FACC,

 

Expert, Author, Writer, All Specialties of Medicine, Cardiology and Cardiac Imaging

Content Consultant for SERIES A, Cardiovascular Diseases Co-Editor: Vols 2,3,4,5,6

ALA Aviva Lev-Ari, PhD, RN,

-Ex – SRI, Int’l

-Ex – MITRE

-Ex – McGraw-Hill

Director and Founder

Editor-in-Chief, @pharmaceuticalintelligence.com

Methodologies Developer:

  • Journal Platform Architect,
  • CURATION of Scientific Findings Modules,
  • REALTIME eProceedings Digital 1-Click Publishing

Expert, Author, Writer:

  • Analytics
  • Molecular Cardiology
  • Vascular Biology
TB Tilda Barliya, PhD,

@BIU

Expert, Author, Writer: Nanotechnology for Drug Delivery

Co-Editor, Series C, Vols. 1,2

DN Dror Nir, PhD,

 

Expert, Author, Writer: Cancer & Medical Imaging Algorithms
ZR Ziv Raviv, PhD,

@Technion

Expert, Author, Writer: Biological Sciences, Cancer

ZS Zohi Sternberg, PhD,

Expert, GUEST Author, Writer: Neurological Sciences

SJW Stephen J. Williams, PhD Pharmacology, BSc Toxicology

Ex-Fox Chase

EAW – Cancer Biology

Co-Editor, Series A, Vol.1

Co-Editor, Series B, Genomics: Vols. 1,2

Co-Editor, Series C, Cancer, Vols. 1,2

DS Demet Sag, PhD, CRA, GCP,

 

Expert, Author, Writer: Genome Biology, Immunology, Biological Sciences: Cancer
SS Sudipta Saha, PhD,

 

Expert, Author, Writer: Reproductive Biology, Endocrinology, Bio-Instrumentation

Co-Editor, Series D, Volume 2, Infectious Diseases

AV Aviral Vatsa, PhD, MBBS

 

Expert, Author, Writer: Medical Sciences, Bone Disease, Human Sensation and Cellular Transduction: Physiology and Therapeutics

 

RS Ritu Saxena, PhD,

 

Expert, Author, Writer: Biological Sciences, Bone Disease, Cancer (Lung, Liver)
GST Gail S. Thornton, PhD(c),

Ex-MERCK

Contributing Editor, Author and Medical Writer

Co-Editor, Series E, Vol.1 Voices of Patients

RN Raphael Nir, PhD, MSM, MSc

Ex-ScheringPlough

– Expert, Author, Writer – Member of the Cancer Research Team: Brain Cancer, Liver Cancer, Cytokines

– CSO, SBH Sciences, Inc.

MB Michael R. Briggs, Ph.D.

Ex-Pfizer

– Expert, Author, Writer – Member of the Cancer Research Team: NASH

– CSO, Woodland Biosciences

AK Alan F. Kaul, R.Ph., Pharm.D, M.Sc., M.B.A., FCCP, Expert, Author, Writer

Ex-Director BWH Pharmacy

Expert, Author, Writer: Pharmacology – all aspects of Drug development and dispensation, Policy analyst
AS Anamika Sarkar, PhD,

 

Expert, Author, Writer: Computation Biology & Bioinformatics
MWF Marcus Feldman, PhD,

Stanford University, Biological Sciences, Center for Genomics

(751 research items, 51,402 reads, 39,126 citations)
Member of the Board,

Scientific Counsel: Life Sciences,

Content Consultant Series B, Genomics, Vols. 1,2

Co-Editor, Vol. 2, NGS

 

Article Title and Views >1,000, 4/2012 – 1/2018

(Columns: Article Title | Author, Curator, Reporter by Name | Views by eReaders)

Home page / Archives 600,145
Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View? LHB 16,720
Do Novel Anticoagulants Affect the PT/INR? The Cases of XARELTO (rivaroxaban) and PRADAXA (dabigatran) JDP, ALA 13,225
Paclitaxel vs Abraxane (albumin-bound paclitaxel) TB 11,872
Recent comprehensive review on the role of ultrasound in breast cancer management DN 11,715
Clinical Indications for Use of Inhaled Nitric Oxide (iNO) in the Adult Patient Market: Clinical Outcomes after Use, Therapy Demand and Cost of Care ALA 7,045
Apixaban (Eliquis): Mechanism of Action, Drug Comparison and Additional Indications ALA 6,435
Mesothelin: An early detection biomarker for cancer (By Jack Andraka) TB 6,309
Our TEAM ALA 6,213
Akt inhibition for cancer treatment, where do we stand today? ZR 4,744
Biochemistry of the Coagulation Cascade and Platelet Aggregation: Nitric Oxide: Platelets, Circulatory Disorders, and Coagulation Effects LHB 4,508
Newer Treatments for Depression: Monoamine, Neurotrophic Factor & Pharmacokinetic Hypotheses ZS 4,188
AstraZeneca’s WEE1 protein inhibitor AZD1775 Shows Success Against Tumors with a SETD2 mutation SJW 4,128
Confined Indolamine 2, 3 dioxygenase (IDO) Controls the Hemeostasis of Immune Responses for Good and Bad DS 3,678
The Centrality of Ca(2+) Signaling and Cytoskeleton Involving Calmodulin Kinases and Ryanodine Receptors in Cardiac Failure, Arterial Smooth Muscle, Post-ischemic Arrhythmia, Similarities and Differences, and Pharmaceutical Targets LHB 3,652
FDA Guidelines For Developmental and Reproductive Toxicology (DART) Studies for Small Molecules SJW 3,625
Cardiovascular Diseases, Volume One: Perspectives on Nitric Oxide in Disease Mechanisms Multiple Authors 3,575
Interaction of enzymes and hormones SS 3,546
AMPK Is a Negative Regulator of the Warburg Effect and Suppresses Tumor Growth In Vivo SJW 3,403
Causes and imaging features of false positives and false negatives on 18F-PET/CT in oncologic imaging DN 3,399
Introduction to Transdermal Drug Delivery (TDD) system and nanotechnology TB 3,371
Founder ALA 3,363
BioMed e-Series ALA 3,246
Signaling and Signaling Pathways LHB 3,178
Sexed Semen and Embryo Selection in Human Reproduction and Fertility Treatment SS 3,044
Alternative Designs for the Human Artificial Heart: Patients in Heart Failure – Outcomes of Transplant (donor)/Implantation (artificial) and Monitoring Technologies for the Transplant/Implant Patient in the Community JDP, LHB, ALA 3,034
The mechanism of action of the drug ‘Acthar’ for Systemic Lupus Erythematosus (SLE) Dr. Karra 3,016
VISION ALA 2,988
Targeting the Wnt Pathway [7.11] LHB 2,961
Bone regeneration and nanotechnology AV 2,922
Pacemakers, Implantable Cardioverter Defibrillators (ICD) and Cardiac Resynchronization Therapy (CRT) ALA 2,892
The History and Creators of Total Parenteral Nutrition LHB 2,846
Funding, Deals & Partnerships ALA 2,708
Paclitaxel: Pharmacokinetic (PK), Pharmacodynamic (PD) and Pharmacogenpmics (PG) TB 2,700
LIK 066, Novartis, for the treatment of type 2 diabetes LHB 2,693
FDA Adds Cardiac Drugs to Watch List – TOPROL-XL® ALA 2,606
Mitochondria: Origin from oxygen free environment, role in aerobic glycolysis, metabolic adaptation LHB 2,579
Nitric Oxide and Platelet Aggregation Dr. Karra 2,550
Treatment Options for Left Ventricular Failure – Temporary Circulatory Support: Intra-aortic balloon pump (IABP) – Impella Recover LD/LP 5.0 and 2.5, Pump Catheters (Non-surgical) vs Bridge Therapy: Percutaneous Left Ventricular Assist Devices (pLVADs) and LVADs (Surgical) LHB 2,549
Isoenzymes in cell metabolic pathways LHB 2,535
“The Molecular pathology of Breast Cancer Progression” TB 2,491
In focus: Circulating Tumor Cells RS 2,465
Nitric Oxide Function in Coagulation – Part II LHB 2,444
Monoclonal Antibody Therapy and Market DS 2,443
Update on FDA Policy Regarding 3D Bioprinted Material SJW 2,410
Journal PharmaceuticalIntelligence.com ALA 2,340
A Primer on DNA and DNA Replication LHB 2,323
Pyrroloquinoline quinone (PQQ) – an unproved supplement LHB 2,294
Integrins, Cadherins, Signaling and the Cytoskeleton LHB 2,265
Evolution of Myoglobin and Hemoglobin LHB 2,251
DNA Structure and Oligonucleotides LHB 2,187
Lipid Metabolism LHB 2,176
Non-small Cell Lung Cancer drugs – where does the Future lie? RS 2,143
Biosimilars: CMC Issues and Regulatory Requirements ALA 2,101
The SCID Pig: How Pigs are becoming a Great Alternate Model for Cancer Research SJW 2,092
About ALA 2,076
Sex Hormones LHB 2,066
CD47: Target Therapy for Cancer TB 2,041
Peroxisome proliferator-activated receptor (PPAR-gamma) Receptors Activation: PPARγ transrepression for Angiogenesis in Cardiovascular Disease and PPARγ transactivation for Treatment of Diabetes ALA 2,017
Swiss Paraplegic Centre, Nottwil, Switzerland – A World-Class Clinic for Spinal Cord Injuries GST 1,989
Introduction to Tissue Engineering; Nanotechnology applications TB 1,964
Problems of vegetarianism SS 1,940
The History of Infectious Diseases and Epidemiology in the late 19th and 20th Century LHB 1,817
The top 15 best-selling cancer drugs in 2022 & Projected Sales in 2020 of World’s Top Ten Oncology Drugs ALA 1,816
Nanotechnology: Detecting and Treating metastatic cancer in the lymph node TB 1,812
Unique Selling Proposition (USP) — Building Pharmaceuticals Brands ALA 1,809
Wnt/β-catenin Signaling [7.10] LHB 1,777
The role of biomarkers in the diagnosis of sepsis and patient management LHB 1,766
Neonatal Pathophysiology LHB 1,718
Nanotechnology and MRI imaging TB 1,672
Cardiovascular Complications: Death from Reoperative Sternotomy after prior CABG, MVR, AVR, or Radiation; Complications of PCI; Sepsis from Cardiovascular Interventions JDP, ALA 1,659
Ultrasound-based Screening for Ovarian Cancer DN 1,655
Justin D. Pearlman, AB, MD, ME, PhD, MA, FACC, Expert, Author, Writer, Editor & Content Consultant for e-SERIES A: Cardiovascular Diseases JDP 1,653
Scientific and Medical Affairs Chronological CV ALA 1,619
Competition in the Ecosystem of Medical Devices in Cardiac and Vascular Repair: Heart Valves, Stents, Catheterization Tools and Kits for Open Heart and Minimally Invasive Surgery (MIS) ALA 1,609
Stenting for Proximal LAD Lesions ALA 1,603
Mitral Valve Repair: Who is a Patient Candidate for a Non-Ablative Fully Non-Invasive Procedure? JDP, ALA 1,602
Nitric Oxide, Platelets, Endothelium and Hemostasis (Coagulation Part II) LHB 1,597
Outcomes in High Cardiovascular Risk Patients: Prasugrel (Effient) vs. Clopidogrel (Plavix); Aliskiren (Tekturna) added to ACE or added to ARB LHB 1,588
Diet and Diabetes LHB 1,572
Clinical Trials Results for Endothelin System: Pathophysiological role in Chronic Heart Failure, Acute Coronary Syndromes and MI – Marker of Disease Severity or Genetic Determination? ALA 1,546
Dealing with the Use of the High Sensitivity Troponin (hs cTn) Assays LHB 1,540
Biosimilars: Intellectual Property Creation and Protection by Pioneer and by Biosimilar Manufacturers ALA 1,534
Altitude Adaptation LHB 1,527
Baby’s microbiome changing due to caesarean birth and formula feeding SS 1,498
Interview with the co-discoverer of the structure of DNA: Watson on The Double Helix and his changing view of Rosalind Franklin ALA 1,488
Triple Antihypertensive Combination Therapy Significantly Lowers Blood Pressure in Hard-to-Treat Patients with Hypertension and Diabetes ALA 1,476
IDO for Commitment of a Life Time: The Origins and Mechanisms of IDO, indolamine 2, 3-dioxygenase DS 1,469
CRISPR/Cas9: Contributions on Endoribonuclease Structure and Function, Role in Immunity and Applications in Genome Engineering LHB 1,468
Cancer Signaling Pathways and Tumor Progression: Images of Biological Processes in the Voice of a Pathologist Cancer Expert LHB 1,452
Signaling transduction tutorial LHB 1,443
Diagnostic Evaluation of SIRS by Immature Granulocytes LHB 1,440
UPDATED: PLATO Trial on ACS: BRILINTA (ticagrelor) better than Plavix® (clopidogrel bisulfate): Lowering chances of having another heart attack ALA 1,426
Cardio-oncology and Onco-Cardiology Programs: Treatments for Cancer Patients with a History of Cardiovascular Disease ALA 1,424
Nanotechnology and Heart Disease TB 1,419
Aviva Lev-Ari, PhD, RN, Director and Founder ALA 1,416
Cardiotoxicity and Cardiomyopathy Related to Drugs Adverse Effects LHB 1,415
Nitric Oxide and it’s impact on Cardiothoracic Surgery TB 1,405
A New Standard in Health Care – Farrer Park Hospital, Singapore’s First Fully Integrated Healthcare/Hospitality Complex GST 1,402
Mitochondrial Damage and Repair under Oxidative Stress LHB 1,398
Ovarian Cancer and fluorescence-guided surgery: A report TB 1,395
Sex determination vs. Sex differentiation SS 1,393
LPBI Group ALA 1,372
Closing the Mammography gap DN 1,368
Cytoskeleton and Cell Membrane Physiology LHB 1,367
Crucial role of Nitric Oxide in Cancer RS 1,364
Medical 3D Printing ALA 1,332
Survivals Comparison of Coronary Artery Bypass Graft (CABG) and Percutaneous Coronary Intervention (PCI) / Coronary Angioplasty LHB 1,325
The Final Considerations of the Role of Platelets and Platelet Endothelial Reactions in Atherosclerosis and Novel Treatments LHB 1,310
Disruption of Calcium Homeostasis: Cardiomyocytes and Vascular Smooth Muscle Cells: The Cardiac and Cardiovascular Calcium Signaling Mechanism LHB, JDP, ALA 1,301
Mitochondrial Dynamics and Cardiovascular Diseases RS 1,284
Nitric Oxide and Immune Responses: Part 2 AV 1,282
Liver Toxicity halts Clinical Trial of IAP Antagonist for Advanced Solid Tumors SJW 1,269
Inactivation of the human papillomavirus E6 or E7 gene in cervical carcinoma cells using a bacterial CRISPR/Cas ALA 1,261
Autophagy LHB 1,255
Mitochondrial fission and fusion: potential therapeutic targets? RS 1,246
Summary of Lipid Metabolism LHB 1,239
Nitric Oxide has a Ubiquitous Role in the Regulation of Glycolysis – with a Concomitant Influence on Mitochondrial Function LHB 1,233
Future of Calcitonin…? Dr. Karra 1,211
Transcatheter Aortic Valve Implantation (TAVI): FDA approves expanded indication for two transcatheter heart valves for patients at intermediate risk for death or complications associated with open-heart surgery ALA 1,197
Gamma Linolenic Acid (GLA) as a Therapeutic tool in the Management of Glioblastoma RN, MB 1,193
Nanotechnology and HIV/AIDS Treatment TB 1,181
Patiromer – New drug for Hyperkalemia ALA 1,179
‘Gamifying’ Drug R&D: Boehringer Ingelheim, Sanofi, Eli Lilly ALA 1,177
A Patient’s Perspective: On Open Heart Surgery from Diagnosis and Intervention to Recovery Guest Author: Ferez S. Nallaseth, Ph.D. 1,173
Assessing Cardiovascular Disease with Biomarkers LHB 1,167
Development Of Super-Resolved Fluorescence Microscopy LHB 1,166
Ubiquitin-Proteosome pathway, Autophagy, the Mitochondrion, Proteolysis and Cell Apoptosis: Part III LHB 1,162
Atrial Fibrillation contributing factor to Death, Autopsy suggests CEO Dave Goldberg had heart arrhythmia before death ALA 1,159
Linus Pauling: On Lipoprotein(a) Patents and On Vitamin C ALA 1,156
Bystolic’s generic Nebivolol – Positive Effect on circulating Endothelial Progenitor Cells Endogenous Augmentation ALA 1,154
The History of Hematology and Related Sciences LHB 1,151
Heroes in Medical Research: Barnett Rosenberg and the Discovery of Cisplatin SJW 1,146
Overview of New Strategy for Treatment of T2DM: SGLT2 Inhibiting Oral Antidiabetic Agents AV 1,143
Imatinib (Gleevec) May Help Treat Aggressive Lymphoma: Chronic Lymphocytic Leukemia (CLL) ALA 1,140
Issues in Personalized Medicine in Cancer: Intratumor Heterogeneity and Branched Evolution Revealed by Multiregion Sequencing SJW 1,137
New England Compounding Center: A Family Business AK 1,120
EpCAM [7.4] LHB 1,113
Amyloidosis with Cardiomyopathy LHB 1,110
Can Mobile Health Apps Improve Oral-Chemotherapy Adherence? The Benefit of Gamification. SJW 1,095
Acoustic Neuroma, Neurinoma or Vestibular Schwannoma: Treatment Options ALA 1,089
Treatment of Refractory Hypertension via Percutaneous Renal Denervation ALA 1,088
Proteomics – The Pathway to Understanding and Decision-making in Medicine LHB 1,085
Low Bioavailability of Nitric Oxide due to Misbalance in Cell Free Hemoglobin in Sickle Cell Disease – A Computational Model AS 1,085
Pancreatic Cancer: Genetics, Genomics and Immunotherapy TB 1,083
A NEW ERA OF GENETIC MANIPULATION   DS 1,075
Targeting Mitochondrial-bound Hexokinase for Cancer Therapy ZR 1,074
Normal and Anomalous Coronary Arteries: Dual Source CT in Cardiothoracic Imaging JDP, ALA 1,062
Transdermal drug delivery (TDD) system and nanotechnology: Part II TB 1,057
Lung Cancer (NSCLC), drug administration and nanotechnology TB 1,046
Pharma World: The Pharmaceutical Industry in Southeast Asia – Pharma CPhI 20-22 March, 2013, Jakarta International Expo, Jakarta, Indonesia ALA 1,045
Nitric Oxide and Sepsis, Hemodynamic Collapse, and the Search for Therapeutic Options LHB 1,044
Targeted delivery of therapeutics to bone and connective tissues: current status and challenges- Part I AV 1,044
Press Coverage ALA 1,036
Carbohydrate Metabolism LHB 1,036
Open Abdominal Aortic Aneurysm (AAA) repair (OAR) vs. Endovascular AAA Repair (EVAR) in Chronic Kidney Disease Patients – Comparison of Surgery Outcomes LHB, ALA 1,032
In focus: Melanoma Genetics RS 1,018
Cholesteryl Ester Transfer Protein (CETP) Inhibitor: Potential of Anacetrapib to treat Atherosclerosis and CAD ALA 1,015
Medical Devices Start Ups in Israel: Venture Capital Sourced Locally – Rainbow Medical (GlenRock) & AccelMed (Arkin Holdings) ALA 1,007
The Development of siRNA-Based Therapies for Cancer ZR 1,003

Other related articles published in this Open Access Online Scientific Journal include the following:

FIVE years of e-Scientific Publishing @pharmaceuticalintellicence.com, Top Articles by Author and by e-Views >1,000, 4/27/2012 to 1/29/2018

https://pharmaceuticalintelligence.com/2017/04/28/five-years-of-e-scientific-publishing-pharmaceuticalintellicence-com-top-articles-by-author-and-by-e-views-1000-4272012-to-4272017/

Read Full Post »

Electronic Scientific AGORA: Comment Exchanges by Global Scientists on Articles published in the Open Access Journal @pharmaceuticalintelligence.com – Four Case Studies

Curator and Editor-in-Chief: Journal and BioMed e-Series, Aviva Lev-Ari, PhD, RN

 

Introduction

Case Study #1: 40 Responses

  • Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View?

Author: Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2012/10/17/is-the-warburg-effect-the-cause-or-the-effect-of-cancer-a-21st-century-view/

Case Study #2: 26 Responses

·      Knowing the tumor’s size and location, could we target treatment to THE ROI by applying…..

Author: Dror Nir, PhD

https://pharmaceuticalintelligence.com/2012/10/16/knowing-the-tumors-size-and-location-could-we-target-treatment-to-the-roi-by-applying-imaging-guided-intervention/

Case Study #3: 24 Responses

  • Personalized Medicine: Cancer Cell Biology and Minimally Invasive Surgery (MIS)

Curator: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2012/12/01/personalized-medicine-cancer-cell-biology-and-minimally-invasive-surgery-mis/

Case Study #4: 13 Responses

  • Judging the ‘Tumor response’-there is more food for thought

https://pharmaceuticalintelligence.com/2012/12/04/judging-the-tumor-response-there-is-more-food-for-thought/

Conclusions

 

Introduction

Members of our Team published 5,295 articles in the period between 4/2012 and 4/10/2018 and engaged in Comment Exchanges with Global Scientists online. 1,412,106 eReaders have viewed our articles, and 7,283 scientific comments are included in the Journal Archive.

Team Members’ Profile

Team Profile: DrugDiscovery @LPBI Group – A BioTech Start Up submitted for Funding Competition to MassChallenge Boston 2016 Accelerator

In our Scientific Agora, scientific comment exchanges take place between global eReader scientists and LPBI’s Scientists/Experts/Authors/Writers. In this curation I am presenting four articles that generated dozens of scientific comments and multifaceted exchanges.

The Voice of Aviva Lev-Ari, PhD, RN:

It is my strongest conviction that the following features of Global SHARING of the scientific product, aka “An Article written by a Scientist,” give it great merit in the Digital Scientific Publishing Age:

  • Every new article published in Open Access Journals contributes to mitigating the most acute challenge of the e-Scientific Publishing industry today: Information Obsolescence – the newness of findings
  • Every new article published in Open Access Journals AND in Subscription-based Journals contributes to the second most acute challenge of the e-Scientific Publishing industry today: Information Explosion – the volume of findings
  • The Scientific Agora, as presented below in four Case Studies, is an optimal means for Global SHARING in Real Time of scientific knowledge deriving from the clinical expertise and lab experience of all the participants in the Agora. REAL TIME means minimizing the negative impact of the most acute challenge of the e-Scientific Publishing industry today: Information Obsolescence
  • Knowledge SHARING of our Scientists’ articles occurs among two FORUMS:

Forum One is the group of scientists who joined the comment exchanges between the article’s author and other members of our Team on a given scientific product, aka “An Article written by a Scientist”

Forum Two is the global universe of scientists who (a) are e-mail followers who opted in to our Open Access Journal’s free subscription, and (b) are eReaders of our Journal who have not yet opted to follow the Journal by e-mail, a robust crowd of over 1.4 million scientists

  • We mitigate the negative impact of the second most acute challenge of the e-Scientific Publishing industry today, Information Explosion, through our own advanced achievements in the practice of:
  1. Development of the Methodology for Curation of Scientific Findings, Curation of Scientific Content @Leaders in Pharmaceutical Business Intelligence (LPBI) Group, Boston
  2. Application of the Methodology for Curation of Scientific Findings in a BioMed e-Series of 16-Volumes in Medicine and Life Sciences on Amazon.com

electronic Table of Contents (eTOCs) of each Volume in the SIXTEEN Volume BioMed e-Series

WE ARE ON AMAZON.COM

https://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Ddigital-text&field-keywords=Aviva+Lev-Ari&rh=n%3A133140011%2Ck%3AAviva+Lev-Ari

Commentaries on each Volume’s Contribution to Medical Education by L.H. Bernstein, MD, FCAP and by Aviva Lev-Ari, PhD, RN – BioMedical e-Books e-Series: Multiple Volumes in Five e-Series

https://pharmaceuticalintelligence.com/biomed-e-books/commentaries-on-each-volumes-contribution-to-medical-education-by-l-h-bernstein-md-fcap-and-aviva-lev-ari-phd-rn-biomedical-e-books-e-series-multiple-volumes-in-five-e-series/

In 2016, LPBI’s BioMed e-Series was Submitted for Nomination for 2016 COMMUNICATION AWARD FOR EXCELLENCE IN REPORTING SCIENCE, MEDICINE AND ENGINEERING – Reference #: 9076095, on 1/27/2016

https://pharmaceuticalintelligence.com/biomed-e-books/

  • Lastly, it is my strong belief that the Methodology of Curation will become a major tool used in content creation for curriculum development in Medical Schools, in the Life Sciences, and in the Healthcare allied professions.
  • We have pioneered and shown the way BY EXAMPLE: over 5,200 scientific products, aka “Articles written by Scientists,” constitute our Journal Archive created by content curation
  • More new e-Book titles are coming in 2018-2019 in LPBI’s BioMed e-Series.
  • More e-Scientific Publishers will use the Methodology of Creation of electronic Tables of Contents of e-Books by combing Archives, performed by very experienced subject-matter Editors.
  • Global SHARING of information has become best practice for academic course content in the last ten years
  • On-line degrees are spreading in many disciplines and are offered by very many colleges, including the Ivy League
  • Open Access Scientific Journals are the FUTURE of the e-Scientific Publishing Industry.

 

Case Study #1:

  • Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View?

Author: Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2012/10/17/is-the-warburg-effect-the-cause-or-the-effect-of-cancer-a-21st-century-view/

40 Responses

  1. This is OUTSTANDING.

    Now we need a “shortcliff” post to follow one chart that traces the dynamic process, no reader shall get lost inside any of the process boxes.

  2. Really nice overview and very interesting metabolic changes.
    However, related to the title, the cancerous changes- event always comes first before lactate preferred metabolism comes into place. Right?

  3. This is what has been inferred. So if that is the premise, then the mutation would be the first event. That position has been successfully challenged and also poses a challenge to the proper view of genomic discovery. The real event may very well be the ongoing oxidative stress with aging, and decreased physiochemical reserve.

    I haven’t developed the whole picture. Nitric oxide and nitrosylation contribute to both vascular relaxation and vasoconstriction, which is also different in major organs. The major carriers of H+ are NADH and FADH2. Electron transport is in the ETC in mitochondria. I called attention to the “escape” of energy in aerobic glycolysis. As disease ensues, it appears that lactate generation is preferential as the mitochondrion takes up substrate from gluconeogenesis. Whether it is an endotoxic shock or a highly malignant fast growing tumor, the body becomes trapped in “autocatabolism”. So the tumor progresses, apoptosis is suppressed, and there is a loss of lean body mass.
    All of this is tied to genetic instability.

    We see the genetic instability as first because of the model DNA–RNA–protein. We don’t have a map.

  4. It is a very nice report. I did work for a short time to develop compounds to block the glucose uptake especially using glucose-mimics. I wonder is there any research on this area going on now?

  5. Thanks. I have been researching this exhaustively. There are even many patents trying to damp this down. You were on the right track. The biggest problem has been multidrug resistance and tumor progression.

  6. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? (pharmaceuticalintelligence.com) […]

  7. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? (pharmaceuticalintelligence.com) […]

  8. Martin Canizales • Warburg effect (http://www.cellsignal.com/reference/pathway/warburg_effect.html), is responsible of overactivation of the PI3K… the produced peroxide via free radicals over activate the cyclooxigenase and consequently the PI3K pathway activating there, the most important protein-kinase ever described in the last mmmh, 60-70 years? maybe… to broke the Warburg effect, will stop the PI3K activation (http://www.cellsignal.com/reference/pathway/Akt_PKB.html) then all the cancer protein related with the generation of tumor (pAKT,pP70S6K, Cyclin D1, HIF1, VEGF, EGFrc, GSK, Myc, etc, etc, etc), will get down regulation. That is what happen, when I knock down the new protein-kinase in pancreatic cancer cell lines… stable KD of pancreatic cancer cell lines divide very-very-veeeery slow (by Western blotting, cyclin D1 disapear, VEGF, HIF1a, MyC, pAKT, pP70S6K, GSK, and more and more also has, very-very few consume of glucose [diabetes and cancer]. Stable cells can be without change the media for 3 weeks and the color doesn’t change, cells divide but VERY slow and are alive [longevity]) are not able to generate xenograft tumors related, to scramble shRNA stable cell lines. When, we broke the warburg effect, the protein kinase get’s down as well all the others. Is the same, with bacteria infections…. bacteria infections, has many things to teach us about cancer and cell proliferation (http://www.ncbi.nlm.nih.gov/pubmed/22750098)

  9. hijoprodigoendistancia, November 12, 2012 at 5:41 PM:

    research paper, should be ready (writing) very soon and must be submmited before end this year. Hee hee! you know… end of the world is in December 21 2012

    • The emphasis on p13 and the work on pancreatic cancer is very interesting. I’ll check the references you give. The Warburg effect is still metabolic, and it looks like you are able to suppress the growth of either cancer cells or bacteria. The outstanding question is whether you can get a head start on the SIR transition to sepsis to severe sepsis to MODS, to shock.

      It looks like an article will be necessary after your work is accepted for publication. Thanks a lot for the response.

  10. hijoprodigoendistancia, November 12, 2012 at 8:52 PM:

    Also, when this protein-kinase is over expressed… UCP1 get down..then, less mitochondria, consequently less aerobic cell functions…in adipose tissue, less mitochondria promote the differentiation of BAT (Brown Adipose Tissue) to, WAT (White Agipose Tissue). Has relation with AS160 phosphorylation, Glut4 membrane translocation, promote the GABA phosphorylation (schizophrenia-autism), neuronal differentiation (NPCs:Neural Progenitor Cells), dopaminergic cell differentiation….

  11. hijoprodigoendistancia, November 12, 2012 at 8:55 PM:

    Larry, all comments are part of the second paper.

  12. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? […]

  13. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? […]

  14. Larry please take a look at Gonzalez et al. The Bioenergetic theory of Carcinogenesis. Med Hypotheses 2012; 79: 433-439 and let me know your thoughts.

  15. […] The Initiation and Growth of Molecular Biology and Genomics, Part I […]

  16. […] Is the Warburg Effect the cause or the effect of cancer: A 21st Century View? […]

  17. Aashir Awan, PhD • May 22, 2013 at 11:36 PM

    Informative article, especially concerning activation of HIF under normoxic conditions. Recently, a paper came out showing that patients with symptoms of mood disorder have increased expression of HIF1a. There are also reports that HIF1a is important in the development of certain tissue types.

  18. COLOURS AND LIFE. The basic idea of this theory is that the oxidation of the hydrogen and carbon atoms arising from the degradation of carbohydrates proceeds by two distinct processes: oxidation-reduction electron transfer and a photochemical release of energy based on complementary colors, with the predominance of one or the other depending on the intracellular acid-base balance. I cannot understand why nobody wants to do this experiment; I am sure this assumption hides a truth, and it should be checked experimentally before being dismissed as fiction. I would like to present a research project that has concerned me for a long time and that I cannot carry out myself.
    After many years of searching, I have come to the conclusion that in the final biological oxidation, in addition to the oxidation-reduction electron transfer, a photochemical process occurs, with energy transfer following the principle of complementary colors. I imagine an experiment that might be relevant (surely it can be improved). In my opinion, if this hypothesis proves true, one can control the energy metabolism of the cell by chromotherapy, since the structures involved are photosensitive and colored. I would be very happy if this experiment were done under your leadership. Sincerely yours, Dr. Viorel Bungau

    INNER LIGHT – LIGHT OF LIFE.
    CHROMOTHERAPY AND ITS IMPLICATIONS FOR THE METABOLISM OF THE NORMAL AND THE NEOPLASTIC CELL. "Chlorophyll and hemoglobin, the pigments of life, differ in their porphyrin structure only in that chlorophyll is green because of the magnesium atom in its structure, while hemoglobin is red because of the iron atom in its structure. This is evidence of the common origin of life." (Heilmeyer) We propose an experiment to prove that in the final biological oxidation, in addition to the oxidation-reduction that forms H2O and CO2, there is a photochemical effect by which energy is transferred from the H or the C atom selectively, by complementary colors, because the structures involved are colored (hemoglobin red with Fe, chlorophyll green with Mg, ceruloplasmin blue with Cu, cytochrome oxidase red with Fe or green with Cu, and so on). The basic idea is that if the pigments of life (chlorophyll, hemoglobin, the cytochromes), which provide the energy metabolism of the cell, are colored, then we can control their activity through chromotherapy on the basis of complementary colors and rebalance the body's energy, visualized with an imagined X-body-colored-ray.
    In my opinion, at the basis of malignant transformation lies a disturbance of energy metabolism that has reached a level the cell can no longer correct (after having succeeded many times before), a disturbance that affects the whole body to different degrees and requires correction from outside, starting from the idea that the final biological oxidation takes place through a photochemical process of releasing and receiving energy. "Duality of cytochrome oxidase: cell proliferation (growth) and cell differentiation (maturation)." Cytochrome oxidase is present in two forms, depending on the acid-base context of the internal environment: 1. the acidic form (acidosis), containing two iron atoms, is red and absorbs the complementary green energy of the hydrogen atom derived from carbohydrates, with formation of H2O, a metabolic context that promotes cell proliferation; 2. the alkaline form (alkalosis), containing two copper atoms, is green and absorbs the complementary red energy of the carbon atom derived from carbohydrates, with formation of CO2, a metabolic context that promotes cell differentiation. The cytochrome oxidase structure normally contains two copper atoms. Under conditions of acidosis (oxidative potential), by the electronegativity principle of metals, copper is displaced from its combinations by iron, so cytochrome oxidase comes to contain two iron atoms instead of copper, which changes its oxidation-reduction potential and, most importantly, its color: where the copper form was green, the iron form is red, which radically changes its absorption spectrum on the principle of complementary colors.
    "Inner Light – Light of Life. Endogenous monochromatic irradiation. Red ferment of Warburg – green ferment of Warburg."
    If the structures involved in the final biological oxidation are colored, then their energy absorption follows the principle of complementary colors. If we can determine the absorption spectrum at the different levels, we can control energy metabolism by chromotherapy – EXOGENOUS MONOCHROMATIC IRRADIATION. The energy absorption within the biological oxidation process itself, based on complementary colors, by the structures involved (the cytochromes), which are porphyrins that become colored in combination with a metal and absorb the complementary color of a specific absorption spectrum, constitutes ENDOGENOUS MONOCHROMATIC IRRADIATION.
    This entitles us to believe that in photosynthesis, light absorption and its storage in the form of carbohydrates are color-selective, and that in cellular energy metabolism the absorption of energy from the degradation of carbohydrates is likewise selective, based on complementary colors. In the final biological oxidation, in addition to an oxidation-reduction process there is also a photochemical process based on complementary colors, the first consisting in electron transfer and the second in energy transfer. Thus, in the mitochondria the C and H atoms derived from carbohydrates are oxidized, with release of energy and color-selective absorption of that energy by the structures involved, which are porphyrin in nature, photosensitive and colored, if we accept that the coenzymes involved contain a metal atom that gives them a certain color depending on their state of oxidation or reduction (Warburg's red ferment with iron, ceruloplasmin blue with copper, chlorophyll green with magnesium, hemoglobin red with iron, cytochrome oxidase green with copper, and so on).
    By the electronegativity principle of metals, under certain conditions of acid-base imbalance (acidosis), iron replaces copper in these combinations; cytochrome oxidase becomes inactive, its oxidation-reduction potential changes and, above all, its COLOR CHANGES FROM GREEN TO RED, blocking the final biological oxidation and giving rise to aerobic glycolysis. This is the background of my research proposal: to prove that in the final biological oxidation, in addition to an oxidation-reduction process, a photochemical process takes place, the first consisting in the electron transfer and the second in the energy transfer.
    I SUGGEST AN EXPERIMENT:

    TWO PLANTS: ONE UNDER RED (CORAL) LIGHT ONLY, IN A BASIC MEDIUM WITH ADDED COPPER, WILL GROW, FLOWER AND FRUIT IN A SHORT TIME; THE OTHER, UNDER GREEN (TURQUOISE) LIGHT ONLY, IN AN ACID MEDIUM WITH AN ADDED COPPER CHELATOR, WILL KEEP GROWING BUT WILL NOT FLOWER OR FRUIT.

    A CULTURE OF NEOPLASTIC TISSUE IRRADIATED WITH MONOCHROMATIC GREEN (TURQUOISE) LIGHT, IN AN ALKALINE MEDIUM WITH ADDED COPPER, WILL SHOW REGRESSION OF THE TISSUE CULTURE.

    A CULTURE OF NEOPLASTIC TISSUE IRRADIATED WITH RED (CORAL) LIGHT, IN AN ACID MEDIUM WITH AN ADDED COPPER CHELATOR, WILL SHOW EXAGGERATED AND ANARCHIC MULTIPLICATION.
    If in photosynthesis the effect of monochromatic irradiation is direct, in the final biological oxidation the effect is reversed: exogenous irradiation with green induces endogenous irradiation with red, and vice versa. A body with cancer will be chemically "red" (acidic, by pH, Rh, pCO2, alkaline reserve) and, in terms of energy, green (on the X-body-colored-ray), while a healthy body will be chemically "green" (alkaline, as shown by laboratory tests) and, in terms of energy, red (visible on the X-body-colored-ray). Sincerely yours, Dr. Viorel Bungau

    -In addition-
    "Life balance: Darkness and Light – Water and Fire – Yin and Yang."

    The cytochrome oxidase structure normally contains two copper atoms. Under conditions of acidosis (oxidative potential), by the electronegativity principle of metals, copper is displaced from its combinations by iron, so cytochrome oxidase comes to contain two iron atoms instead of copper, which changes its oxidation-reduction potential and, most importantly, its color and hence its absorption spectrum, on the principle of complementary colors. In neoplastic cells, acidosis overactivates the acid form of cytochrome oxidase (red, with iron atoms), which absorbs exclusively the complementary green energy of the hydrogen atom and produces H2O, so water prevails; in schizophrenia, the alkaline intracellular environment of the neuron promotes the basic form of cytochrome oxidase (green, with copper atoms), which oxidizes only carbon atoms, absorbs the complementary red energy and produces CO2, so fire prevails. From this theory follows an interdependent relationship between water and fire, between hydrogen (H2O) and carbon (CO2), in a controlled relationship with oxygen (O2). If photosynthesis is a process of reducing carbon oxide (CO2) and hydrogen oxide (H2O) by increasing the electronegativity of the C and H atoms and returning electrons to oxygen, which is released, then in the mitochondria there is an oxidation of the C and H atoms derived from carbohydrates, with energy release and color-selective absorption of that energy by the porphyrin-type structures involved, which are photosensitive and colored. This means that matter and energy in the universe stand in a relationship based on complementary colors, each color of energy corresponding to a certain chemical structure. If the final biological oxidation is achieved by a photochemical mechanism (besides the oxidation-reduction), with energy released on the basis of complementary colors, then we can control the final biological oxidation mechanism, irreversibly disrupted in cancer, by chromotherapy and by correcting the acid-base imbalance that underlies this disorder. We reached these conclusions while studying the final biological oxidation in order to understand the biochemical mechanism of aerobic glycolysis in cancer. We found that in the cancer cell, energy metabolism relies almost exclusively on hydrogen, by oxidative dehydrogenation, because excessive acidosis leaves the coenzymes that carry out carbon oxidation dormant (inactive).
    If we accept the colored nature of these coenzymes (see Warburg's red ferment), they could be reactivated by correcting the acidosis (because they have become colorless leucoderivatives) and by chromotherapy, on the basis of complementary colors. By the electronegativity principle of metals, under acid-base imbalance (acidosis) iron replaces copper in these combinations; cytochrome oxidase (which normally contains two copper atoms) becomes inactive, its oxidation-reduction potential changes and its COLOR CHANGES FROM GREEN TO RED, blocking the final biological oxidation and giving rise to aerobic glycolysis.

    Malignant transformation occurs through an imbalance of energy metabolism in which energy is generated predominantly (even exclusively) from the hydrogen atom, oxidation of carbon being impossible. At the cellular level this produces exaggerated multiplication (growth), because energy from hydrogen favors growth and multiplication at the expense of differentiation (maturation); differentiation, which requires energy obtained by oxidation of the carbon atom, cannot take place, and this leads to carcinogenesis. In the energy metabolism of the cell, one energy source is carbohydrate degradation, which proceeds by OXIDATIVE DEHYDROGENATION AND OXIDATIVE DECARBOXYLATION, yielding energy plus CO2 and H2O. In normal cells there is a balance between the two energy sources; in cancer cells, oxidation of the carbon atom is not possible, and the cell is forced to rely on the only energy source available, hydrogen. This disorder underlies the malignant transformation of cells and affects the whole body to various degrees; the process often manages to rebalance itself until, at some point, it becomes irreversible. Exclusive production of hydrogen energy causes excessive multiplication of immature cells without functional differentiation; exclusive production of carbon energy would lead to hyperdifferentiation and hyperfunction, with multiplication impossible. The normal cell lies between these two extremes, within limits set by the adjustment factors of homeostasis. The energy from energy metabolism is vital for the cell (and the body). If the energy comes predominantly (or exclusively) from oxidation of the hydrogen atom (green energy), the cellular structures acidify at the structural (biochemical) level and turn red, so WE HAVE "RED" MORPHOLOGICAL AND CHEMICAL STRUCTURES WITH "GREEN" ENERGY; this background predisposes to accelerated growth without differentiation, eventually uncontrolled and anarchic, and THE ENERGY STRUCTURE OF THE CELL (BODY) WOULD BE YIN. If the cell's energy derives mainly from oxidation of the carbon atom (red energy), the cell structures turn green and alkaline (basic), so WE HAVE "GREEN" MORPHOLOGICAL AND CHEMICAL STRUCTURES WITH "RED" ENERGY, on the same principle of complementarity; this context leads to hyperdifferentiation, hyperfunction and maturation, growth stops, and THE ENERGY STRUCTURE OF THE CELL (BODY) WOULD BE YANG.
    In photosynthesis, the porphyrins, a chemical group whose first feature is photosensitivity and which shows a great affinity for metals, form chelates and become colored (the pigments of life), able to absorb complementary monochromatic light. These pigments, which constitute the chromoprotein group, achieve in photosynthesis the reduction of CO2 and H2O, recovering C and H respectively and releasing O; the H and C atoms thus reduced carry the energy load, in the form of carbohydrates, as stored solar energy. In cellular energy metabolism, the energy needed for the processes of life comes from the degradation of the substances produced in photosynthesis, the carbohydrates, by oxidative dehydrogenation and oxidative decarboxylation, through similar substances that form chelates with metals and are colored, the metals being present as oxides of various colors (Mg green, Fe red, Cu blue, etc.). These undergo a complementary-color absorption process during reduction: with H, in oxidative dehydrogenation, the chelated metal pigment is red and becomes a leucoderivative (colorless) by absorbing the complementary (green) energy of hydrogen, with formation of H2O; with C, in oxidative decarboxylation, the chelated metal pigment is green and absorbs the complementary red energy of the C atom, with production of CO2; the process is identical. The process at the base of cellular energy metabolism takes place in the final biological oxidation: the O atom, in the form of a metal oxide in combination with the photosensitive, colored porphyrin, absorbs the complementary color and is reduced by H and C, with production of H2O and CO2. The release of the green energy of the H atom in oxidative dehydrogenation is a process of "ENDOGENOUS MONOCHROMATIC IRRADIATION WITH GREEN", and the release of the red energy of the C atom in oxidative decarboxylation is an "ENDOGENOUS MONOCHROMATIC IRRADIATION WITH RED". In photosynthesis, the chelated porphyrin-metal combination, by absorbing light in the visible spectrum, is able to reduce C and H from their oxide state (CO2 and H2O) and release O. In the final biological oxidation, the metal-porphyrin combination, under aerobic conditions and in the absence of light, is found in the oxidized state, as porphyrin plus metal oxide, and oxidizes the C and H atoms of carbohydrates to CO2 and H2O; or rather, it is reduced by the C and H atoms of carbohydrates, forming CO2 and H2O, and thereby absorbs the energy produced by photosynthesis. If we can control the final biological oxidation, we can control cellular growth, hence multiplication, and on the other hand maturation, hence differentiation. Green energy prevails in the cell (body) that is multiplying (during growth); in the adult (functional) cell, red energy prevails. Of the two types of energy, that obtained by oxidative dehydrogenation causes cell multiplication without differentiation, while that obtained by oxidative decarboxylation stops proliferation and determines differentiation (maturity, functionality). This process is carried out on the basis of complementary colors, because the coenzymes of oxidative dehydrogenation and oxidative decarboxylation are colored.
    This reveals the importance of the acid-base balance and of the predominance of the acidic or the basic, since an acid (red) structure not only cannot gain energy from the red carbon atom (by the principle of complementarity) but cannot assimilate it either (by the same principle). The acid-base balance of the internal environment must therefore be restored, with alkalinization by the intake of electron-donating organic substances. Through alkalinization (the addition of electrons), the acid, red structures are neutralized and become colorless, inactive leucoderivatives, while the basic structures, which acidosis had rendered neutral, colorless and inactive, become alkaline again with the contribution of electrons, turn green, and absorb the red energy of the carbon atom. Thus, regarding the two kinds of vital energy, there is a clear correlation between the chemical structure of the cell (body) and the type of energy it can produce and use. A cell with an acidic chemical structure can produce energy only by oxidative dehydrogenation (green energy), because in an acid environment only coenzymes with an acidic, red chemical structure can be active, and by complementarity they absorb only the green energy of hydrogen; the basic structures that should absorb the red energy of carbon are inactive because of the acid environment, which turns them chemically into leucoderivatives, colorless and inactive. Converting these structures back to normal operation by alkalinization could be a long process, so chromotherapy is used in parallel, based on the fact that the COENZYMES INVOLVED IN THE FINAL BIOLOGICAL OXIDATION ARE COLORED AND PHOTOSENSITIVE: exogenous irradiation with monochromatic green will neutralize, by complementarity, the red, acidic coenzymes and will reactivate the alkaline coenzymes, which had become colorless, inactive leucoderivatives because of acidosis. Without the production of CO2, carbonic anhydrase cannot form H2CO3, which would dissociate and be transferred across the mitochondrial membrane. OH groups accumulate on the respiratory flavin, leading to excessive hydroxylation, followed by the consecutive inclusion of amino groups (NH2). There is thus an imbalance between hydrogenation-carboxylation and hydroxylation-amination in favor of the latter: AMINATION and HYDROXYLATION predominate at the expense of CARBOXYLATION and HYDROGENATION, leading to the CONVERSION OF STRUCTURAL PROTEINS INTO NUCLEIC ACIDS. Meanwhile, following chemical rather than genetic criteria, the remaining unoxidized carbon atoms are used to synthesize nucleic bases "de novo" by the same hydroxylation-amination process, leading to THE "DE NOVO" SYNTHESIS OF NUCLEIC ACIDS. Sincerely yours, Dr. Viorel Bungau viorelbungau20@yahoo.com

    • Dr. Viorel Bungau,

      Your comment is beautiful, colorful, insightful, majestic.

      This article has drawn 3,007 views.

      Views by month and year, as recorded:
      2012: 242, 362, 247 (total 851)
      2013: 283, 330, 465, 390, 288, 208, 187, 164, 255, 274, 163 (total 3,007)

  19. Dear Professor, please join me in this research proposal, as its leader, because I cannot go it alone.

  20. […] Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View? Author: Larry H. Bernstein, MD, FCAP https://pharmaceuticalintelligence.com/2012/10/17/is-the-warburg-effect-the-cause-or-the-effect-of-ca… […]

Case Study #2:

·      Knowing the tumor’s size and location, could we target treatment to THE ROI by applying…..

Author: Dror Nir, PhD

https://pharmaceuticalintelligence.com/2012/10/16/knowing-the-tumors-size-and-location-could-we-target-treatment-to-the-roi-by-applying-imaging-guided-intervention/

26 Responses

  1. GREAT work.

    I’ll read and comment later on

  2. Highlights of The 2012 Johns Hopkins Prostate Disorders White Paper include:

    A promising new treatment for men with frequent nighttime urination.
    Answers to 8 common questions about sacral nerve stimulation for lower urinary tract symptoms.
    Surprising research on the link between smoking and prostate cancer recurrence.
    How men who drink 6 cups of coffee a day or more may reduce their risk of aggressive prostate cancer.
    Should you have a PSA screening test? Answers to important questions on the controversial USPSTF recommendation.
    Watchful waiting or radical prostatectomy for men with early-stage prostate cancer? What the research suggests.
    A look at state-of-the-art surveillance strategies for men on active surveillance for prostate cancer.
    Locally advanced prostate cancer: Will you benefit from radiation and hormones?
    New drug offers hope for men with metastatic castrate-resistant prostate cancer.
    Behavioral therapy for incontinence: Why it might be worth a try.

    You’ll also get the latest news on benign prostatic enlargement (BPE), also known as benign prostatic hyperplasia (BPH) and prostatitis:
    What’s your Prostate Symptom Score? Here’s a quick quiz you can take right now to determine if you should seek treatment for your enlarged prostate.
    Your surgical choices: a close look at simple prostatectomy, transurethral prostatectomy and open prostatectomy.
    New warnings about 5-alpha-reductase inhibitors and aggressive prostate cancer.

  3. Promising technique.

    INCORE has pointed out in detail the general problem of judging response and the still-missing quality in standardization:

    http://www.futuremedicine.com/doi/abs/10.2217/fon.12.78?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub%3dwww.ncbi.nlm.nih.gov

    I have done research in response evaluation and prediction for about 15 years now and, being honest, neither the clinical nor the molecular biological data have proved of significant benefit in changing a strategy of patient diagnosis and/or treatment. I would say this brings us back down to the ground rather than up to the sky. It also means we have to work harder on this, and the WHO has to take responsibility: clinicians use a response classification without knowing that it is based on just ONE experiment from the 1970s, and that this experiment has never been re-scrutinized (please read the editorial I provided). We have used this clinical response classification worldwide for more than 30 years (Miller et al., Cancer 1981), but it is useless!

  4. Dr. BB

    Thank you for your comment.
    Dr. Nir will reply to your comment.
    Regarding the response classification in use, it seems that the College of Oncology should champion a task force to revisit the best practice in use in this domain and issue either a revised version or a new classification system for clinical response to treatment in cancer.

  5. I'm sorry; I looked for this paper again earlier and didn't find it. I gave my view on your article earlier.

    This is a method demonstration, but by no means a proof of concept. It adds to the cacophony of approaches and, in a much larger study, might prove beneficial in treatment, but it is not a cure for serious prostate cancer, because it is unlikely that it can get beyond the margin, and also because there is overtreatment at the PSA cutoff of 4.0. There is now a proven prediction model that went to press some four months ago. I think the pathologist has to see the tissue, and the standard in pathology now is that any result that is cancer should be seen by two pathologists, or by a group sitting together. It is not an easy diagnosis.

    Björn LDM Brücher, Anton Bilchik, Aviram Nissan, Itzhak Avital, & Alexander Stojadinovic. Tumor response criteria: are they appropriate? Future Oncol. (2012) 8(8), 903–906. 10.2217/FON.12.78. ISSN 1479-6694.

    ..Tumor heterogeneity is a ubiquitous phenomenon. In particular, there are important differences among the various types of gastrointestinal (GI) cancers in terms of tumor biology, treatment response and prognosis.

    ..This forms the principal basis for targeted therapy directed by tumor-specific testing at either the gene or protein level. Despite rapid advances in our understanding of targeted therapy for GI cancers, the impact on cancer survival has been marginal.

    ..Can tumor response to therapy be predicted, thereby improving the selection of patients for cancer treatment?

    ..In 2000 the NCI, with the European Organization for Research and Treatment of Cancer, proposed replacing 2D measurement with a 30% decrease in the largest tumor diameter in one dimension. A tumor response so defined would translate into a 50% decrease for a spherical lesion (see the geometry sketch after these excerpts).

    ..We must rethink how we may better determine treatment response in a reliable, reproducible way that is aimed at individualizing the therapy of cancer patients.

    ..we must change the tools we use to assess tumor response. The new modality should be based on empirical evidence that translates into relevant and meaningful clinical outcome data.

    ..This becomes a conundrum of sorts in an era of ‘minimally invasive treatment’.

    ..integrated multidisciplinary panel of international experts – not sure that will do it
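
    The 30%-diameter / 50%-response equivalence quoted in the excerpts above follows from simple geometry. The short Python sketch below is an editorial illustration of that arithmetic, not part of the Brücher editorial: for a spherical lesion, a 30% reduction of the largest diameter corresponds to roughly a 51% reduction of the bidimensional (d^2) product and about a 66% reduction in volume.

```python
# Illustrative arithmetic only: how a 30% decrease in the largest diameter of a
# spherical lesion maps onto 2D (bidimensional product) and volumetric change.

def shrinkage(diameter_reduction):
    """Area and volume reductions for a sphere whose diameter shrinks by the given fraction."""
    remaining = 1.0 - diameter_reduction
    return {
        "diameter": diameter_reduction,
        "cross_sectional_area": 1.0 - remaining ** 2,  # scales with d^2
        "volume": 1.0 - remaining ** 3,                # scales with d^3
    }

for name, value in shrinkage(0.30).items():
    print(f"{name}: {value:.0%} reduction")
# diameter: 30% reduction
# cross_sectional_area: 51% reduction
# volume: 66% reduction
```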

    Several years ago I heard Stamey present the totality of his work at Stanford, with great disappointment over the hsPSA testing they had pioneered. The outcomes were disappointing.

    I had published a review, with Marguerite Pinto, of all of our cases over one year.
    There's a reason that the physicians line up outside of her office for her opinion.
    The review showed that a PSA over 24 ng/ml is predictive of bone metastasis. Any result over 10 was as likely to be prostatitis or BPH as cancer.

    In the next study, with Gustave Davis, I used a bivariate ordinal regression to predict lymph node metastasis from the PSA and the Gleason score. It was better than any univariate model, but there was no follow-up.
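
    The two-predictor ("bivariate") ordinal model described here can be sketched with the proportional-odds OrderedModel in statsmodels. Everything below (the synthetic PSA and Gleason values, the cut-points, the three-level nodal outcome) is a hypothetical placeholder, not the Pinto/Davis data; it only illustrates the form of such a model.

```python
# Hedged sketch of a proportional-odds (ordinal) regression of an ordered
# nodal-involvement grade on PSA and Gleason score, using synthetic data.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "psa": rng.gamma(shape=2.0, scale=6.0, size=n),          # ng/mL, synthetic
    "gleason": rng.integers(6, 10, size=n).astype(float),    # 6-9, synthetic
})
# Synthetic ordered outcome: N0 < N1micro < N1 (placeholder labels)
latent = 0.08 * df["psa"] + 0.9 * (df["gleason"] - 6) + rng.normal(0.0, 1.0, size=n)
df["node_stage"] = pd.cut(latent, bins=[-np.inf, 1.5, 3.0, np.inf],
                          labels=["N0", "N1micro", "N1"])    # ordered categorical

model = OrderedModel(df["node_stage"], df[["psa", "gleason"]], distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())        # joint (bivariate) effects of PSA and Gleason
print(result.predict()[:5])    # per-class probabilities for the first 5 cases
```

    The point of the design, as in the comment above, is that the two predictors enter jointly, so the model can outperform either univariate threshold on its own.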

    I reviewed a paper for Clinical Biochemistry (Elsevier) on a new method for PSA, very different from what we are familiar with. It was the most elegant paper I have seen in its treatment of the data. The model could predict post-procedural time to recurrence out to 8 years.

    • I hope we are in agreement on the fact that imaging guided interventions are needed for better treatment outcome. The point I’m trying to make in this post is that people are investing in developing imaging guided intervention and it is making progress.

      Overdiagnosis and overtreatment are another issue altogether. I think that many of my other posts deal with that.

  6. Tumor response criteria: are they appropriate?
    Future Oncology 2012; 8(8): 903–906, DOI 10.2217/fon.12.78
    Björn LDM Brücher, Anton Bilchik, Aviram Nissan, Itzhak Avital & Alexander Stojadinovic
    Tumor heterogeneity is problematic because of the metabolic variety among types of gastrointestinal (GI) cancers, confounding treatment response and prognosis.
    This is in response to … a group of investigators from Sunnybrook Health Sciences Centre, University of Toronto, Ontario, Canada, who evaluated the feasibility and safety of magnetic resonance (MR) imaging–controlled transurethral ultrasound therapy for prostate cancer in humans. Their study's objective was to show that real-time MRI guidance of HIFU treatment is possible and that it ensures the location of the ablated tissue indeed corresponds to the locations planned for treatment.
    1. There is a difference between esophageal and gastric neoplasms, both biologically and in expected response, even given variability within each class. The expected time to recurrence is usually longer in the latter case, but the confounders are age at the time of discovery, biological time of detection, presence of lymph node and/or distant metastasis, and microscopic vascular invasion.
    2. There is a long latent period in abdominal cancers before discovery, unless a lesion is found incidentally in surgery for another reason.
    3. The undeniable reality is that it is not difficult to identify the main lesion, but it is difficult to identify adjacent epithelium that is at risk (transitional or pretransitional). Pathologists have a very good idea about precancerous cervical neoplasia.

    The heterogeneity rests within each tumor and between the primary and metastatic sites, which targeted therapy directed by tumor-specific testing is expected to address. Despite rapid advances in our understanding of targeted therapy for GI cancers, the impact on cancer survival has been marginal.

    The heterogeneity is a problem that will take at least another decade to unravel because of the number of signaling pathways and the crosstalk that is specifically at issue.

    I must refer back to the work of Frank Dixon, Herschel Sidransky, and others, who did much to develop a concept of neoplasia occurring in several stages – minimal deviation and fast growing. These have differences in growth rates, anaplasia, and biochemistry. This resembles the multiple-"hit" theory described in "systemic inflammatory" disease leading to a final stage, as in sepsis and septic shock.
    In 1931, Otto Warburg received the Nobel Prize for his work on respiration. He postulated that cancer cells become anaerobic compared with their normal counterparts, which use aerobic respiration to meet most energy needs. He attributed this to "mitochondrial dysfunction." In fact, we now think that in response to oxidative stress, the mitochondrion relies on the Lynen Cycle to make more cells and the major source of energy becomes glycolytic, which is at the expense of the lean body mass (muscle), which produces gluconeogenic precursors from muscle proteolysis (cancer cachexia). There is a loss of about 26 ATP ~Ps in the transition.
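
    As a rough check on the energetic cost of that transition, the snippet below compares commonly quoted ATP yields per glucose. The figures are textbook estimates that vary by source (older texts cite 36–38 ATP for complete oxidation, newer estimates about 30–32) and are used here only as assumptions, to show that the forfeited yield is of the same order as the ~26 ~P cited above.

```python
# Order-of-magnitude arithmetic for the glycolytic switch described above.
# Yields per glucose are assumed textbook values, not figures from this article.
GLYCOLYSIS_ONLY_ATP = 2  # substrate-level phosphorylation, no oxygen required

for full_oxidation_atp in (30, 32, 36):
    lost = full_oxidation_atp - GLYCOLYSIS_ONLY_ATP
    print(f"~{full_oxidation_atp} ATP with full oxidation -> ~{lost} ATP "
          f"({lost / full_oxidation_atp:.0%}) forfeited on glycolysis alone")
```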
    The mitochondrial gene expression system includes the mitochondrial genome, mitochondrial ribosomes, and the transcription and translation machinery needed to regulate and conduct gene expression as well as mtDNA replication and repair. Machinery involved in energetics includes the enzymes of the Krebs citric acid or TCA (tricarboxylic acid) cycle, some of the enzymes involved in fatty acid catabolism (β-oxidation), and the proteins needed to help regulate these systems. The inner membrane is central to mitochondrial physiology and, as such, contains multiple protein systems of interest. These include the protein complexes involved in the electron transport component of oxidative phosphorylation and proteins involved in substrate and ion transport.
    Mitochondrial roles in, and effects on, cellular homeostasis extend far beyond the production of ATP, but the transformation of energy is central to most mitochondrial functions. Reducing equivalents are also used for anabolic reactions. The energy produced by mitochondria is most commonly thought of to come from the pyruvate that results from glycolysis, but it is important to keep in mind that the chemical energy contained in both fats and amino acids can also be converted into NADH and FADH2 through mitochondrial pathways. The major mechanism for harvesting energy from fats is β-oxidation; the major mechanism for harvesting energy from amino acids and pyruvate is the TCA cycle. Once the chemical energy has been transformed into NADH and FADH2 (also discovered by Warburg and the basis for a second Nobel nomination in 1934), these compounds are fed into the mitochondrial respiratory chain.
    The hydroxyl free radical is extremely reactive. It will react with most, if not all, compounds found in the living cell (including DNA, proteins, lipids and a host of small molecules). The hydroxyl free radical is so aggressive that it will react within 5 (or so) molecular diameters from its site of production. The damage caused by it, therefore, is very site specific. The reactions of the hydroxyl free radical can be classified as hydrogen abstraction, electron transfer, and addition.
    The formation of the hydroxyl free radical can be disastrous for living organisms. Unlike superoxide and hydrogen peroxide, which are mainly controlled enzymatically, the hydroxyl free radical is far too reactive to be restricted in such a way – it will even attack antioxidant enzymes. Instead, biological defenses have evolved that reduce the chance that the hydroxyl free radical will be produced and, as nothing is perfect, to repair damage.
    Currently, some endogenous markers are being proposed as useful measures of total “oxidative stress” e.g., 8-hydroxy-2’deoxyguanosine in urine. The ideal scavenger must be non-toxic, have limited or no biological activity, readily reach the site of hydroxyl free radical production (i.e., pass through barriers such as the blood-brain barrier), react rapidly with the free radical, be specific for this radical, and neither the scavenger nor its product(s) should undergo further metabolism.
    Nitric oxide has a single unpaired electron in its π*2p antibonding orbital and is therefore paramagnetic. This unpaired electron also weakens the overall bonding seen in diatomic nitrogen molecules so that the nitrogen and oxygen atoms are joined by only 2.5 bonds. The structure of nitric oxide is a resonance hybrid of two forms.
    In living organisms nitric oxide is produced enzymatically. Microbes can generate nitric oxide by the reduction of nitrite or oxidation of ammonia. In mammals nitric oxide is produced by stepwise oxidation of L-arginine catalyzed by nitric oxide synthase (NOS). Nitric oxide is formed from the guanidino nitrogen of the L-arginine in a reaction that consumes five electrons and requires flavin adenine dinucleotide (FAD), flavin mononucleotide (FMN) tetrahydrobiopterin (BH4), and iron protoporphyrin IX as cofactors. The primary product of NOS activity may be the nitroxyl anion that is then converted to nitric oxide by electron acceptors.
    The thiol-disulfide redox couple is very important to oxidative metabolism. GSH is a reducing cofactor for glutathione peroxidase, an antioxidant enzyme responsible for the destruction of hydrogen peroxide. Thiols and disulfides can readily undergo exchange reactions, forming mixed disulfides. Thiol-disulfide exchange is biologically very important. For example, GSH can react with protein cystine groups and influence the correct folding of proteins, and it GSH may play a direct role in cellular signaling through thiol-disulfide exchange reactions with membrane bound receptor proteins (e.g., the insulin receptor complex), transcription factors (e.g., nuclear factor κB), and regulatory proteins in cells. Conditions that alter the redox status of the cell can have important consequences on cellular function.
    So the complexity of life is not yet unraveled.

    Can tumor response to therapy be predicted, thereby improving the selection of patients for cancer treatment?
    The goal is not just complete response. Post-treatment histopathological assessment of response seems the most direct measure, but it is not free from the challenge of accurately determining treatment response, as even this method cannot always delineate whether or not residual cancer cells remain. Functional imaging to assess metabolic response by 18-fluorodeoxyglucose PET also has its limits, as the results are impacted significantly by several variables:

    • tumor type
    • sizing
    • doubling time
    • anaplasia?
    • extent of tumor necrosis
    • type of antitumor therapy and the time when response was determined.
    The new modality should be based on individualized histopathology as well as tumor molecular, genetic and functional characteristics, and individual patients’ characteristics, a greater challenge in an era of ‘minimally invasive treatment’.
    This listing suggests that for every cancer the foregoing data (except doubling time) have to be collected. If there are five variables, a classification based on these alone would already be very sizable, in the sense of Eugene Rypka’s feature extraction and classification. But looking forward, time to remission and disease-free survival are additionally important. Treatment for cure is not the endpoint; the best that can be done is to extend survival toward a realistic long-term goal while retaining quality of life.
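    To make “very sizable” concrete with an illustrative, purely hypothetical count: if each of the five variables above is merely discretized into k levels, the number of distinct feature combinations is already

    \[ k^{5}, \qquad \text{e.g. } 3^{5} = 243 \ \text{or} \ 5^{5} = 3125 \]

    possible classes, before any outcome data (time to remission, disease-free survival) are even considered.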

    Brücher BLDM, Piso P, Verwaal V et al. Peritoneal carcinomatosis: overview and basics. Cancer Invest. 30(3), 209–224 (2012).
    Brücher BLDM, Swisher S, Königsrainer A et al. Response to preoperative therapy in upper gastrointestinal cancers. Ann. Surg. Oncol. 16(4), 878–886 (2009).
    Miller AB, Hoogstraten B, Staquet M, Winkler A. Reporting results of cancer treatment. Cancer 47(1), 207–214 (1981).
    Therasse P, Arbuck SG, Eisenhauer EA et al. New guidelines to evaluate the response to treatment in solid tumors. European Organization for Research and Treatment of Cancer, National Cancer Institute of the United States, National Cancer Institute of Canada. J. Natl Cancer Inst. 92(3), 205–216 (2000).
    Brücher BLDM, Becker K, Lordick F et al. The clinical impact of histopathological response assessment by residual tumor cell quantification in esophageal squamous cell carcinomas. Cancer 106(10), 2119–2127 (2006).

    • Dr. Larry,

      Thank you for this comment.

      Please carry it as a stand alone post; Dr. Ritu will refer to it and reference it in her FORTHCOMING post on Tumor Response, which will integrate multiple sources.

      Please execute my instruction

      Thank you

    • Thank you Larry for this educating comment. It explains very well why the Canadian investigators did not try to measure therapy response!

      What they have demonstrated is the technological feasibility of coupling a treatment device to an imaging device and use that in order to guide the treatment to the right place.

      The issue of “choice of treatment” to which you are referring is not in the scope of this publication.
      The point is: if one treatment modality can be guided, others can as well! This should encourage others to try to develop imaging-based treatment guidance systems.

  7. The crux of the matter in terms of capability is that the cancer tissue, adjacent tissue, and the fibrous matrix are all in transition to the cancerous state. It is taught to resect leaving “free margin”, which is better aesthetically, and has had success in breast surgery. The dilemma is that the patient may return, but how soon?

    • Correct. The philosophy behind lumpectomy is preserving quality of life. It was Prof. Veronesi (IEO) who introduced this method 30 years ago, noticing that in the majority of cases the patient will die from something else before presenting recurrence of breast cancer.

      It is well established that when the resection margins are declared by a pathologist (as good as he/she could be) as “free of cancer”, the probability of recurrence is much lower than otherwise.

  8. Dr. Larry,

    To assist Dr. Ritu, PLEASE carry ALL your comments above into a stand alone post and ADD to it your comment on my post on MIS

    Thank you

  9. Great post! Dr. Nir, can ultrasound be used in conjunction with PET scanning as well, to determine a spatial and functional map of the tumor? With a disease like serous ovarian cancer we typically see intraperitoneal carcinomatosis, and it appears that clinicians want to use fluorogenic probes and fiberoptics to visualize the numerous nodules located within the cavity. Also, is the technique being used mainly for surgery or image-guided radiotherapy, or can you use this for detecting response to various chemotherapeutics, including immunotherapy?

    • Ultrasound can be, and actually is, used in conjunction with PET scanning in many cases. The choice of using ultrasound is always left to the practitioner! Being a non-invasive, low-cost procedure makes the use of ultrasound a non-issue. The downside is that, because it is so easy to access and operate, nobody bothers to develop rigorous guidelines for its use, and the benefits remain the property of individuals.

      In regards to the possibility of screening for ovarian cancer and characterising pelvic masses using ultrasound I can refer you to scientific work in which I was involved:

      1. VAES (E.), MANCHANDA (R), AUTIER, NIR (R), NIR (D.), BLEIBERG (H.), ROBERT (A.), MENON (U.). Differential diagnosis of adnexal masses: Sequential use of the Risk of Malignancy Index and a novel computer aided diagnostic tool. Published in Ultrasound in Obstetrics & Gynecology. Issue 1 (January). Vol. 39. Page(s): 91-98.

      2. VAES (E.), MANCHANDA (R), NIR (R), NIR (D.), BLEIBERG (H.), AUTIER (P.), MENON (U.), ROBERT (A.). Mathematical models to discriminate between benign and malignant adnexal masses: potential diagnostic improvement using Ovarian HistoScanning. Published in International Journal of Gynecologic Cancer (IJGC). Issue 1. Vol. 21. Page(s): 35-43.

      3. LUCIDARME (O.), AKAKPO (J.-P.), GRANBERG (S.), SIDERI (M.), LEVAVI (H.), SCHNEIDER (A.), AUTIER (P.), NIR (D.), BLEIBERG (H.). A new computer aided diagnostic tool for non-invasive characterisation of malignant ovarian masses: Results of a multicentre validation study. Published in European Radiology. Issue 8. Vol. 20. Page(s): 1822-1830.

      Dror Nir, PhD
      Managing partner

      BE: +32 (0) 473 981896
      UK: +44 (0) 2032392424

      web: http://www.radbee.com/
      blogs: http://radbee.wordpress.com/ ; http://www.MedDevOnIce.com

       

  10. Totally true, and I am very thankful for these brilliant comments.

    Remember: 10 years ago every cancer researcher stated: “look at the tumor cells only – forget the stroma”. The era of laser-captured tumor-cell dissection started. Now everyone knows it is a system we are looking at, and viewing and analyzing tumor cells only is really not enough.

    So if we are honest, we would have to declare that all the data produced 8–13 years ago with laser capture microdissection need re-scrutiny, because the influence of the stroma was “forgotten”. I’d better not think about the wasted millions of dollars.

    If we keep on being honest: the surgeon looks at the “free margin” in a kind of reductionist model, while the pathologist is more the control instance. I personally see the pathologist as “the control instance” of surgical quality. Therefore it is not the wish of the surgeon that matters, but the objective way of looking into problems or challenges. Can a pathologist always state whether an R0 resection has been performed?

    The use of the Resectability Classification:
    There have been many, many surrogate marker analyses – nothing new. But a really substantial, well-thought-through, structured analysis has never been done: measuring millimeter by millimeter and afterwards analyzing that by an ROC analysis. But against which gold standard? If you perform an ROC analysis statistically, you need a gold standard to compare to. So what is the real R0 resection? It has not been proven. It has simply been stated, in this or that tumor entity, that this or that margin-free distance in millimeters is enough, and that has been declared “the real R0 classification”. In some organs it is very, very difficult, and we all (surgeons, pathologists, clinicians) reach the limit when we try to interpret the R classification within the 3rd dimension. Often it is just declared and stated.
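    To illustrate that dependence on a gold standard in code (a minimal sketch with synthetic data, not a reconstruction of any published analysis; the margin distances, recurrence labels and cut-off rule below are all hypothetical):

    import numpy as np
    from sklearn.metrics import roc_curve, roc_auc_score

    rng = np.random.default_rng(0)

    # Hypothetical data: resection-margin distance in mm for 200 cases, plus a
    # reference-standard label (1 = recurrence). The whole ROC analysis silently
    # depends on how this reference label is defined.
    margin_mm = rng.gamma(shape=2.0, scale=3.0, size=200)
    recurrence = (rng.random(200) < 1.0 / (1.0 + np.exp(margin_mm - 4.0))).astype(int)

    # Smaller margins should mean higher risk, so the score is the negated distance.
    risk_score = -margin_mm

    fpr, tpr, thresholds = roc_curve(recurrence, risk_score)
    print("AUC against this particular reference standard:",
          round(roc_auc_score(recurrence, risk_score), 3))

    # Picking an operating cut-off (Youden's J here, purely for illustration) is a
    # clinical trade-off between sensitivity and specificity, not a statistical fact.
    j = tpr - fpr
    print("Margin cut-off implied by Youden's J: %.1f mm" % (-thresholds[np.argmax(j)]))

    Change the definition of “recurrence” (the gold standard) and both the AUC and the apparent “best” margin change with it – which is exactly the point being made above.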

    Otherwise: if lymph nodes are negative it does not mean the lymph nodes are really negative, because up to 38%, for example in upper GI cancers, have histologically negative but immunohistochemically positive lymph nodes. This has also been shown by Stojadinovic et al. analyzing ultrastaging in colorectal cancer. So the 4th dimension of cancer – the lymph nodes / the lymphatic vessel invasion are much more important than just a TNM classification, which unfortunately does often not reflect real tumor biology.

    As we see, cancer has multifactorial causes, and it is necessary to take on the challenge of performing highly sophisticated research in a multifactorial and multidisciplinary manner.

    Again, my deep and hearty thanks for this productive and excellent discussion!

    • Dr. BB,

      Thank you for your comment.

      Multidisciplinary perspectives have illuminated the discussion on the pages of this Journal.

      Eager to review Dr. Ritu’s forthcoming paper – the topic has a life of its own and is embodied in your statement:

      “the 4th dimension of cancer – the lymph nodes / the lymphatic vessel invasion are much more important than just a TNM classification, which unfortunately does often not reflect real tumor biology.”

    • Thank you BB for your comment. You have touched the core limitation of healthcare professionals: how do we know that we know!

      Do we have a reference for each of the tests we perform?

      Do we have objective and standardised quality measures?

      Do we see what is out-there or are we imagining?

      The good news: every day we can “think” that we learned something new. We should be happy with that, even if it means that we learned that yesterday’s truth is not true any more, and even if we are likely to be wrong again…:)

      But still, in the last decades, lots of progress was made….

  11. Dr. Nir,
    I thoroughly enjoyed reading your post as well as the comments that your post has attracted. There were different points of view and each one has been supported with relevant examples in the literature. Here are my two cents on the discussion:
    The paper that you have discussed had the objective of finding out whether real-time MRI guidance of treatment was even possible and, if so, whether the treatment could be performed at the accurate location of the ROI. The data reveal that they were quite successful in accomplishing their objective, and of course that gives hope to imaging-based targeted therapies.
    Whether the ROI is defined properly, and whether it accounts for the real tumor cure, is a different question. The role of pathologists and the histological analysis they bring to the table cannot be ruled out, and the absence of a defined line between the tumor and the stromal region in its vicinity is well documented. However, that cannot rule out the value and scope of imaging-based detection and targeted therapy. After all, it is seminal in guiding minimally invasive surgery. As another arm of personalized medicine-based cure for cancer, molecular biologists at MD Anderson have suggested molecular and genetic profiling of the tumor to determine genetic aberrations on the basis of which matched therapy could be recommended to patients. When a phase I trial was conducted, the results obtained were encouraging, and the survival rate was better in matched-therapy patients compared with unmatched patients. Therefore, every time there is more to consider when treating a cancer patient, and who knows – a combination of the views of oncologists, pathologists, molecular biologists, geneticists and surgeons may devise improved protocols for diagnosis and treatment. It is always going to be complicated, and generalizations will never give an answer. Smart interpretation of therapies – imaging-based or others – will always be required!

    Ritu

    • Dr. Nir,
      One of your earlier comments mentioned the non-invasiveness of ultrasound and thus its prevalence in use for diagnosis.

      This may be true for other or all areas, with the exception of mammography screening. In this field, an ultrasound is performed only if a suspected area of calcification or a lump has been detected in routine mammography, or in a patient-initiated request for ad hoc mammography secondary to a complaint of pain or a report of a suspected lump.

      Ultrasound in this field represents escalation and a review by two radiologists.

      It is in routine use for breast biopsy.

    • Thanks Ritu for this supporting comment. The worst enemy of finding solutions is doing nothing while using the excuse of looking for the “ultimate solution” . Personally, I believe in combining methods and improving clinical assessment based on information fusion. Being able to predict, and then timely track the response to treatment is a major issue that affects survival and costs!

  12. […] Dror Nir authored a post on October 16th titled “Knowing the tumor’s size and location, could we target treatment to THE ROI by applying imaging-gu…” The article attracted a lot of comments from readers including researchers and oncologists and […]

  13. […] ted in this area; New clinical results supports Imaging-guidance for targeted prostate biopsy and Knowing the tumor’s size and location, could we target treatment to THE ROI by applying imagin… Today I report on recent publication presenting the advantage of using targeted trans-perineal […]

  14. […] ted in this area; New clinical results supports Imaging-guidance for targeted prostate biopsy and Knowing the tumor’s size and location, could we target treatment to THE ROI by applying imaging-gu… Today I report on recent publication presenting the advantage of using targeted trans-perineal […]

  15. […] Knowing the tumor’s size and location, could we target treatment to THE ROI by applying imaging-gu… […]

Case Study #3:

  • Personalized Medicine: Cancer Cell Biology and Minimally Invasive Surgery (MIS)

Curator: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2012/12/01/personalized-medicine-cancer-cell-biology-and-minimally-invasive-surgery-mis

 

This article generated a Scientific Exchange of 24 Comments, some scholarly comments are quite lengthy

24 Responses

  1. GREAT work.

    I’ll read and comment later on

  2. Highlights of The 2012 Johns Hopkins Prostate Disorders White Paper include:

    A promising new treatment for men with frequent nighttime urination.
    Answers to 8 common questions about sacral nerve stimulation for lower urinary tract symptoms.
    Surprising research on the link between smoking and prostate cancer recurrence.
    How men who drink 6 cups of coffee a day or more may reduce their risk of aggressive prostate cancer.
    Should you have a PSA screening test? Answers to important questions on the controversial USPSTF recommendation.
    Watchful waiting or radical prostatectomy for men with early-stage prostate cancer? What the research suggests.
    A look at state-of-the-art surveillance strategies for men on active surveillance for prostate cancer.
    Locally advanced prostate cancer: Will you benefit from radiation and hormones?
    New drug offers hope for men with metastatic castrate-resistant prostate cancer.
    Behavioral therapy for incontinence: Why it might be worth a try.

    You’ll also get the latest news on benign prostatic enlargement (BPE), also known as benign prostatic hyperplasia (BPH) and prostatitis:
    What’s your Prostate Symptom Score? Here’s a quick quiz you can take right now to determine if you should seek treatment for your enlarged prostate.
    Your surgical choices: a close look at simple prostatectomy, transurethral prostatectomy and open prostatectomy.
    New warnings about 5-alpha-reductase inhibitors and aggressive prostate cancer.

  3. Promising technique.

    INCORE pointed out in detail the general problem of judging response and the still-missing quality in standardization:

    http://www.futuremedicine.com/doi/abs/10.2217/fon.12.78?url_ver=Z39.88-2003&rfr_id=ori:rid:crossref.org&rfr_dat=cr_pub%3dwww.ncbi.nlm.nih.gov

    I have done research in response evaluation and prediction for about 15 years now and, being honest, neither the clinical nor the molecular biological data have proved significant benefit in changing a strategy in patient diagnosis and/or treatment. I would state: this brings us back down to the ground, not up to the sky. Additionally it means we have to work harder on this, and the WHO has to take responsibility: clinicians use a response classification without knowing that it rests on just “ONE” experiment from the 1970s and that this experiment has never been re-scrutinized (please read the Editorial I provided – we have used a clinical response classification worldwide for more than 30 years (Miller et al. Cancer 1981), but it is useless!).

  4. Dr. BB

    Thank you for your comment.
    Dr. Nir will reply to your comment.
    Regarding the Response Classification in use, it seems that the College of Oncology should champion a task force to revisit the Best Practice in use in this domain and issue a revised version, or make a new effort toward a new classification system for Clinical Response to treatment in Cancer.

  5. I’m sorry that I was looking for this paper again earlier and didn’t find it. I answered my view on your article earlier.

    This is a method demonstration, but not a proof of concept by any means. It adds to the cacophony of approaches, and in a much larger study it would prove to be beneficial in treatment, but not a cure for serious prostate cancer, because it is unlikely that it can get beyond the margin, and also because there is overtreatment at the PSA cutoff of 4.0. There is now a proven prediction model that went to press some 4 months ago. I think that the pathologist has to see the tissue, and the standard in pathology now is that for any result that is cancer, two pathologists, or a group sitting together, should see it. It’s not an easy diagnosis.

    Björn LDM Brücher, Anton Bilchik, Aviram Nissan, Itzhak Avital, & Alexander Stojadinovic. Tumor response criteria: are they appropriate? Future Oncol. (2012) 8(8), 903–906. 10.2217/FON.12.78. ISSN 1479-6694.

    ..Tumor heterogeneity is a ubiquitous phenomenon. In particular, there are important differences among the various types of gastrointestinal (GI) cancers in terms of tumor biology, treatment response and prognosis.

    ..This forms the principal basis for targeted therapy directed by tumor-specific testing at either the gene or protein level. Despite rapid advances in our understanding of targeted therapy for GI cancers, the impact on cancer survival has been marginal.

    ..Can tumor response to therapy be predicted, thereby improving the selection of patients for cancer treatment?

    ..In 2000 the NCI, with the European Organization for Research and Treatment of Cancer, proposed replacing 2D measurement with a 30% decrease of the largest tumor diameter in one dimension. Tumor response so defined would translate into a roughly 50% decrease for a spherical lesion (see the short calculation after this list)

    ..We must rethink how we may better determine treatment response in a reliable, reproducible way that is aimed at individualizing the therapy of cancer patients.

    ..we must change the tools we use to assess tumor response. The new modality should be based on empirical evidence that translates into relevant and meaningful clinical outcome data.

    ..This becomes a conundrum of sorts in an era of ‘minimally invasive treatment’.

    ..integrated multidisciplinary panel of international experts – not sure that that will do it
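    For readers wondering where that 30%/50% equivalence comes from, the arithmetic for a lesion with a roughly circular cross-section is simply

    \[ \frac{(0.7\,d)^{2}}{d^{2}} = 0.49, \]

    i.e. a 30% fall in diameter corresponds to about a 50% fall in the bidimensional (WHO-style) product; for volume, \(0.7^{3} \approx 0.34\), roughly a 65% reduction.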

    Several years ago I heard Stamey present the totality of his work at Stanford, with great disappointment over the hsPSA assay they had pioneered. The outcomes were disappointing.

    I had published a review of all of our cases reviewed for 1 year with Marguerite Pinto.
    There’s a reason that the physicians line up outside of her office for her opinion.
    The review showed that a PSA over 24 ng/ml is predictive of bone metastasis. Any result over 10 was as likely to be prostatitis, BPH or cancer.

    I did an ordinal regression in the next study with Gustave Davis using a bivariate ordinal regression to predict lymph node metastasis using the PSA and the Gleason score. It was better than any univariate model, but there was no followup.
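    Purely as an illustration of that kind of model (synthetic data, hypothetical variable names and cut-points – not the published analysis), a proportional-odds ordinal regression on PSA and Gleason score can be sketched as follows:

    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    rng = np.random.default_rng(1)
    n = 300

    # Hypothetical predictors
    psa = rng.lognormal(mean=1.8, sigma=0.7, size=n)   # ng/ml
    gleason = rng.integers(6, 11, size=n)              # Gleason 6..10

    # Hypothetical ordinal outcome: N0 < micrometastasis < overt nodal disease
    latent = 0.05 * psa + 0.8 * (gleason - 6) + rng.normal(0.0, 1.0, n)
    node_status = pd.Series(pd.cut(latent, bins=[-np.inf, 1.5, 3.0, np.inf],
                                   labels=["N0", "micromet", "overt"]))

    X = pd.DataFrame({"psa": psa, "gleason": gleason})
    model = OrderedModel(node_status, X, distr="logit")  # cumulative-logit model
    result = model.fit(method="bfgs", disp=False)
    print(result.summary())

    The appeal of the ordinal formulation, as the comment suggests, is that it uses both predictors jointly and respects the ordering of the outcome, rather than collapsing it to a single yes/no endpoint.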

    I reviewed a paper for Clin Biochemistry (Elsevier) on a new method for PSA, very different from what we are familiar with. It was the most elegant paper I have seen in its treatment of the data. The model could predict post-procedural time to recurrence out to 8 years.

    • I hope we are in agreement on the fact that imaging guided interventions are needed for better treatment outcome. The point I’m trying to make in this post is that people are investing in developing imaging guided intervention and it is making progress.

      Overdiagnosis and overtreatment are another issue altogether. I think that many of my other posts deal with that.

  6. Tumor response criteria: are they appropriate?
    Future Oncology 2012; 8(8): 903-906 , DOI 10.2217/fon.12.78 (doi:10.2217/fon.12.78)
    Björn LDM Brücher, Anton Bilchik, Aviram Nissan, Itzhak Avital & Alexander Stojadinovic
    Tumor heterogeneity is problematic because of the metabolic variety among types of gastrointestinal (GI) cancers, which confounds treatment response and prognosis.
    This is in response to … a group of investigators from Sunnybrook Health Sciences Centre, University of Toronto, Ontario, Canada who evaluate the feasibility and safety of magnetic resonance (MR) imaging–controlled transurethral ultrasound therapy for prostate cancer in humans. Their study’s objective was to prove that using real-time MRI guidance of HIFU treatment is possible and it guarantees that the location of ablated tissue indeed corresponds to the locations planned for treatment.
    1. There is a difference between esophageal and gastric neoplasms, both biologically and in expected response, even given variability within a class. The expected time to recurrence is usually longer in the latter case, but the confounders are – age at time of discovery, biological time of detection, presence of lymph node and/or distant metastasis, microscopic vascular invasion.
    2. There is a long latent period in abdominal cancers before discovery, unless a lesion is found incidentally in surgery for another reason.
    3. The undeniable reality is that it is not difficult to identify the main lesion, but it is difficult to identify adjacent epithelium that is at risk (transitional or pretransitional). Pathologists have a very good idea about precancerous cervical neoplasia.

    The heterogeneity rests within each tumor and between the primary and metastatic sites, which is expected to be improved by targeted therapy directed by tumor-specific testing. Despite rapid advances in our understanding of targeted therapy for GI cancers, the impact on cancer survival has been marginal.

    The heterogeneity is a problem that will take at least another decade to unravel because of the number of signaling pathways and the crosstalk that is specifically at issue.

    I must refer back to the work of Frank Dixon, Herschel Sidransky, and others, who did much to develop a concept of neoplasia occurring in several stages – minimal deviation and fast growing. These differ in growth rate, anaplasia, and biochemistry. This resembles the multiple “hit” theory that is described in “systemic inflammatory” disease leading to a final stage, as in sepsis and septic shock.
    In 1931, Otto Warburg received the Nobel Prize for his work on respiration. He postulated that cancer cells become anaerobic compared with their normal counterparts, which use aerobic respiration to meet most energy needs. He attributed this to “mitochondrial dysfunction.” In fact, we now think that in response to oxidative stress the mitochondrion relies on the Lynen cycle to make more cells, and the major source of energy becomes glycolytic, at the expense of the lean body mass (muscle), which produces gluconeogenic precursors from muscle proteolysis (cancer cachexia). There is a loss of about 26 ATP equivalents (~P) in the transition.
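    For orientation only – the exact numbers depend on the P/O ratios one assumes – the back-of-the-envelope accounting behind that statement is

    \[ \underbrace{\sim 30\text{–}36\ \text{ATP}}_{\text{glucose fully oxidized}} \;-\; \underbrace{2\ \text{ATP}}_{\text{glycolysis to lactate only}} \;\approx\; 28\text{–}34\ \text{ATP equivalents per glucose}, \]

    i.e. a loss of the order quoted above when a cell shifts from oxidative phosphorylation to a mainly glycolytic energy supply.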
    The mitochondrial gene expression system includes the mitochondrial genome, mitochondrial ribosomes, and the transcription and translation machinery needed to regulate and conduct gene expression as well as mtDNA replication and repair. Machinery involved in energetics includes the enzymes of the Krebs citric acid, or TCA (tricarboxylic acid), cycle, some of the enzymes involved in fatty acid catabolism (β-oxidation), and the proteins needed to help regulate these systems. The inner membrane is central to mitochondrial physiology and, as such, contains multiple protein systems of interest. These include the protein complexes involved in the electron transport component of oxidative phosphorylation and proteins involved in substrate and ion transport.

Case Study #4:

  • Judging the ‘Tumor response’-there is more food for thought

https://pharmaceuticalintelligence.com/2012/12/04/judging-the-tumor-response-there-is-more-food-for-thought/

13 Responses

  1. Dr. Sanexa
    you have brought up an interesting and very clinically relevant point: 1) what is the best measurement of response, and 2) how perspectives on this issue differ among oncologists and other professionals given their expertise in their respective subspecialties (immunologist versus oncologist). The advent of functional measurements of tumors (PET, etc.) seems extremely important in therapeutic use AND in the development of these types of compounds, since a response usually presents (in cases of solid tumors) as either a lack of growth of the tumor or tumor shrinkage. Did the authors include an in-depth discussion of the rapidity of onset of resistance with these types of compounds?
    Thanks for the posting.

  2. Dr. Williams,
    Thanks for your comment on the post. The editorial brings to attention the view that although PET and other imaging methods provide vital information on tumor growth and shrinkage in response to a therapy, there are more aspects to consider, including the genetic and molecular characteristics of the tumor.
    It was an editorial review and the authors did not include any in-depth discussion on the rapidity of onset of resistance with these types of compounds as the focus was primarily on interpreting tumor response.
    I am glad you found the contents of the write-up informative.
    Thanks again!
    Ritu

  3. Thank you for your wonderful comment and interpretation. Dr. Sanexa made a brilliant comment.

    May I allow myself to put my finger deeper into this wound? Cancer patients deserve it.

    It has already been pointed out by international experts from Munich, Tokyo, Hong Kong and Houston dealing with upper GI cancer that the current response criteria are not appropriate; moreover, the clinical response criteria in use seem to function more as an alibi than as a help in differentiating and/or discriminating tumor biology (Ann Surg Oncol 2009):

    http://www.ncbi.nlm.nih.gov/pubmed/19194759

    The response data in a phase II trial (one tumor entity, one histology, one treatment, one group) revealed that clinical response evaluation according to the WHO criteria is not appropriate for determining response:

    http://www.ncbi.nlm.nih.gov/pubmed/15498642

    Of course, there was a time, when it seemed to be useful and this also has to be respected.

    There is another challenge: using an ROC statistically and deriving thresholds from it. This was, is and always will be “a clinical decision only” and not the decision of the statistician. The clinician tells the statistician what decision he wants to make – the responsibility is enormous. Getting back to the roots:
    After the main results of the Munich group were published in 2001 (Ann Surg) and 2004 (J Clin Oncol):

    http://www.ncbi.nlm.nih.gov/pubmed/11224616

    http://www.ncbi.nlm.nih.gov/pubmed/14990646

    the first reaction in the community was: too difficult, can’t be, not re-evaluated, etc. However, all the evaluated cut-offs / thresholds were later proven to be the real and best ones by the MD Anderson Cancer Center in Houston, Texas. Jaffer Ajani – a great and critical oncologist – pushed that together with Steve Swisher, and they found the same results. Then the upper GI stakeholders went an uncommon way in science: they re-scrutinized their findings. Meanwhile the gold standard, using histopathology as the basis criterion, had been published in Cancer 2006.

    http://www.ncbi.nlm.nih.gov/pubmed/16607651

    Not every author who was on the author list in 2001 and 2004 wanted to be part of this analysis and publication! Why? Everyone should judge that for himself.

    The data of this analysis were submitted to the New England Journal of Medicine. In the second review stage the manuscript was rejected. Ann Surg Oncol accepted the publication: the re-scrutinized data resulted in another interesting finding – in the future maybe “one PET scan” might be appropriate for predicting the patient’s response.

    Where are we now ?

    The level of evidence behind the response criteria is very low: Miller’s publication (Cancer 1981) rests on “one single” experiment from Moertel (Cancer 1976). During that time there was no definition of “experience” or of “oncologists”; these terms were not yet in use.

    Additionally, they resulted in a (scientifically weak) change of the classification, published by Therasse (J Natl Cancer Inst 2000). Targeted therapy has not resulted in a change so far. In 2009, the international upper GI experts sent their Ann Surg Oncol 2009 publication to the WHO, but without any kind of reaction.

    The molecular biological predictive markers used within the last 10 years all seem to have potential.

    http://www.ncbi.nlm.nih.gov/pubmed/20012971

    http://www.ncbi.nlm.nih.gov/pubmed/18704459

    http://www.ncbi.nlm.nih.gov/pubmed/17940507

    http://www.ncbi.nlm.nih.gov/pubmed/17354029

    But experts are aware that the real barrier-breaking step has not been taken so far. Additionally, in trying to evaluate and/or predict response, it is very important that different tumor entities with different survival and tumor biology are not mixed together. Such data are, from my perspective, not helpful – but maybe that is my own bias (!) of view.

    INCORE, the International Consortium of Research Excellence of the Theodor-Billroth-Academy, was invited to publish the Editorial in Future Oncology 2012. The consortium pointed out that, living in an era of ‘proof of principle’ and trying to work out levels of evidence in medicine, this is “the duty and responsibility” of every clinician, but also of the societies and institutions, and of the WHO.

    Complete remission is not the only goal, as experts dealing with ‘response research’ are aware. It is so frustrating for patients and clinicians: there is a proportion of patients with complete remission who develop early recurrence! This reflects that complete remission cannot function as the only criterion describing response!

    Again, my hearty thanks that Dr. Sanexa discussed this issue in detail.
    I hope I have explained the development and evaluation of the response criteria properly and in a differentiated way. From the perspective of INCORE:

    “an interdisciplinary initiative with all key stakeholders and disciplines represented is imperative to make predictive and prognostic individualized tumor response assessment a modern-day reality. The integrated multidisciplinary panel of international experts need to define how to leverage existing data, tissue and testing platforms in order to predict individual patient treatment response and prognosis.”

  4. Dr. Brucher,

    First of all, thanks for expressing your views on ‘tumor response’ in a comprehensive way. You are the first author of the editorial review and one of the prominent people who have taken part in the process of defining tumor response, and I am glad that you decided to write a comment on the write-up.
    The topic has been explained in an immaculate manner, and it further clarifies the need for the perfect markers that would be able to evaluate and predict tumor response. There are, as you mentioned, some molecular markers available, including VEGF and cyclins, that have been brought into focus in the context of squamous cell carcinoma.

    It would be great if you could be the guest author for our blog and we could publish your opinion (comment on this blog post) as a separate post. Please let us know if it is OK with you.

    Thanks again for your comment
    Ritu

  5. Thank you all for the compelling discussions above.

    Please review the two sources on the topic that I placed at the bottom of the post above, both published as posts on this Scientific Journal.

    All comments made on both entries are part of this discussion. I am referring to Dr. Nir’s post on tumor size, to BB’s comment on Nir’s post, to Larry’s pathologist view on tumors, and to my post on remission and minimally invasive surgery (MIS).

    Great comments by Dr. Williams, BB and wonderful topic exposition by Dr. Ritu.

  6. Aviva,
    That’s a great idea. I will combine all the sources referred to by you, the post on tumor imaging by Dr. Nir, and the comments made on these posts, including Dr. Brucher’s comments, in a new post.
    Thanks
    Ritu

    • Great idea. Ask Larry – he has written two very long, important comments on this topic, one on Nir’s post and another one (ask him where, if it is not on the MIS post). GREAT work, Ritu, integration is very important. Dr. Williams is one of our Gems.

    • Assessing tumour response is not an easy task! Tumours don’t change, but happily our knowledge about them really does change – it is ever-changing (thank god!). In the past we had the RECIST criteria, then the modified RECIST criteria, because of GIST and other tumors. At this very moment these are clearly insufficient. We need new, validated criteria facing the reality of nowadays. A great, enormous post, Dr. Ritu! Congratulations!

 

Conclusions

The Voice of Aviva Lev-Ari, PhD, RN:

The relevance of the Scientific Agora to Medical Education is vast. The Open Access Journal gives EVERY scientist on the internet GLOBAL reach and access to Open Access scientific content, NOT only the subscription-paying base of traditional journals. If you don’t have a HIGH-FEE subscription, you get NO access to the content of such a journal and you can’t participate in Multiple Comment Exchanges. In the Medical Education context, COMMENTS are the medium for debate with peers.

Multiple Comment Exchanges on Four articles in the Journal, above, demonstrate the vibrancy of the scientific discussion, the multiplicity of perspectives, the subjectivity of the contribution to the debate and the unique expertise and clinical experience expressed by each Scientist.

 .

Read Full Post »

George A. Miller, a Pioneer in Cognitive Psychology, Is Dead at 92

Larry H. Bernstein, MD, FCAP, Curator

Leaders in Pharmaceutical Intelligence

Series E. 2; 5.10

5.10 George A. Miller, a Pioneer in Cognitive Psychology, Is Dead at 92

By PAUL VITELLO, AUG. 1, 2012

http://www.nytimes.com/2012/08/02/us/george-a-miller-cognitive-psychology-pioneer-dies-at-92.html?_r=0

Miller started his education focusing on speech and language and published papers on these topics, focusing on mathematical, computational and psychological aspects of the field. He started his career at a time when the reigning theory in psychology was behaviorism, which eschewed any attempt to study mental processes and focused only on observable behavior. Working mostly at Harvard University, MIT and Princeton University, Miller introduced experimental techniques to study the psychology of mental processes, by linking the new field of cognitive psychology to the broader area of cognitive science, including computation theory and linguistics. He collaborated and co-authored work with other figures in cognitive science and psycholinguistics, such as Noam Chomsky. For moving psychology into the realm of mental processes and for aligning that move with information theory, computation theory, and linguistics, Miller is considered one of the great twentieth-century psychologists. A Review of General Psychology survey, published in 2002, ranked Miller as the 20th most cited psychologist of that era.[2]

Remembering George A. Miller

The human mind works a lot like a computer: It collects, saves, modifies, and retrieves information. George A. Miller, one of the founders of cognitive psychology, was a pioneer who recognized that the human mind can be understood using an information-processing model. His insights helped move psychological research beyond behaviorist methods that dominated the field through the 1950s. In 1991, he was awarded the National Medal of Science for his significant contributions to our understanding of the human mind.

http://www.psychologicalscience.org/index.php/publications/observer/2012/october-12/remembering-george-a-miller.html

Working memory

From the days of William James, psychologists had the idea memory consisted of short-term and long-term memory. While short-term memory was expected to be limited, its exact limits were not known. In 1956, Miller would quantify its capacity limit in the paper “The magical number seven, plus or minus two”. He tested immediate memory via tasks such as asking a person to repeat a set of digits presented; absolute judgment by presenting a stimulus and a label, and asking them to recall the label later; and span of attention by asking them to count things in a group of more than a few items quickly. For all three cases, Miller found the average limit to be seven items. He had mixed feelings about the focus on his work on the exact number seven for quantifying short-term memory, and felt it had been misquoted often. He stated, introducing the paper on the research for the first time, that he was being persecuted by an integer.[1] Miller also found humans remembered chunks of information, interrelating bits using some scheme, and the limit applied to chunks. Miller himself saw no relationship among the disparate tasks of immediate memory and absolute judgment, but lumped them to fill a one-hour presentation. The results influenced the budding field of cognitive psychology.[15]
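A toy illustration of the chunking idea (the specific digits below are arbitrary): twelve digits presented one at a time exceed the 7±2 span, but recoded as three familiar year-like chunks they fit comfortably.

digits = list("149220011776")                 # 12 separate items: beyond the span
chunks = ["1492", "2001", "1776"]             # 3 meaningful chunks: well within it
print(len(digits), "items as digits vs", len(chunks), "items as chunks")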

WordNet

For many years starting from 1986, Miller directed the development of WordNet, a large computer-readable electronic reference usable in applications such as search engines.[12] WordNet is a dictionary of words showing their linkages by meaning. Its fundamental building block is a synset, which is a collection of synonyms representing a concept or idea. Words can be in multiple synsets. The entire class of synsets is grouped into nouns, verbs, adjectives and adverbs separately, with links existing only within these four major groups but not between them. Going beyond a thesaurus, WordNet also included inter-word relationships such as part/whole relationships and hierarchies of inclusion.[16] Miller and colleagues had planned the tool to test psycholinguistic theories on how humans use and understand words.[17] Miller also later worked closely with the developers at Simpli.com Inc., on a meaning-based keyword search engine based on WordNet.[18]
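For readers who want to see synsets, hypernym (is-a) links, and part/whole relations concretely, WordNet is accessible programmatically, for example through NLTK's WordNet interface. A minimal sketch, assuming the NLTK wordnet corpus has been downloaded and using an arbitrarily chosen word:

```python
# Browse a couple of noun synsets for "tree" and show is-a and has-part relations.
from nltk.corpus import wordnet as wn   # requires: nltk.download('wordnet')

for syn in wn.synsets('tree', pos=wn.NOUN)[:2]:
    print(syn.name(), '-', syn.definition())
    print('  hypernyms (is-a):', [h.name() for h in syn.hypernyms()])
    print('  part meronyms (has-part):', [m.name() for m in syn.part_meronyms()])
```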

Language psychology and computation

Miller is considered one of the founders of psycholinguistics, which links language and cognition in psychology, to analyze how people use and create language.[1] His 1951 book Language and Communication is considered seminal in the field.[5] His later book, The Science of Words (1991), also focused on language psychology.[19] He published papers along with Noam Chomsky on the mathematics and computational aspects of language and its syntax, two new areas of study.[20][21][22] Miller also researched how people understood words and sentences, the same problem faced by artificial speech-recognition technology. The book Plans and the Structure of Behavior (1960), written with Eugene Galanter and Karl H. Pribram, explored how humans plan and act, trying to extrapolate this to how a robot could be programmed to plan and do things.[1] Miller is also known for coining Miller’s Law: “In order to understand what another person is saying, you must assume it is true and try to imagine what it could be true of”.[23]

Language and Communication, 1951[edit]

Miller’s Language and Communication was one of the first significant texts in the study of language behavior. The book was a scientific study of language, emphasizing quantitative data, and was based on the mathematical model of Claude Shannon's information theory.[24] It used a probabilistic model imposed on a learning-by-association scheme borrowed from behaviorism, with Miller not yet attached to a pure cognitive perspective.[25] The first part of the book reviewed information theory, the physiology and acoustics of phonetics, speech recognition and comprehension, and statistical techniques to analyze language.[24] The focus was more on speech generation than recognition.[25] The second part had the psychology: idiosyncratic differences across people in language use; developmental linguistics; the structure of word associations in people; use of symbolism in language; and social aspects of language use.[24]

Reviewing the book, Charles E. Osgood classified it as a graduate-level text based more on objective facts than on theoretical constructs. He thought the book was verbose on some topics and too brief on others not directly related to the author’s area of expertise. He was also critical of Miller’s use of simple, Skinnerian single-stage stimulus-response learning to explain human language acquisition and use. This approach, per Osgood, made it impossible to analyze the concept of meaning, and the idea of language consisting of representational signs. He did find the book objective in its emphasis on facts over theory, and in clearly depicting the application of information theory to psychology.[24]

Plans and the Structure of Behavior, 1960[edit]

In Plans and the Structure of Behavior, Miller and his co-authors tried to explain through an artificial-intelligence computational perspective how animals plan and act.[26] This was a radical break from behaviorism, which explained behavior as a set or sequence of stimulus-response actions. The authors introduced a planning element controlling such actions.[27] They saw all plans as being executed based on input, using stored or inherited information about the environment (called the image), and using a strategy called test-operate-test-exit (TOTE). The image was essentially a stored memory of all past context, akin to Tolman's cognitive map. The TOTE strategy, in its initial test phase, compared the input against the image; if there was incongruity, the operate function attempted to reduce it. This cycle would be repeated until the incongruity vanished, and then the exit function would be invoked, passing control to another TOTE unit in a hierarchically arranged scheme.[26]
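The TOTE cycle is easy to caricature in code. The sketch below is a toy rendering of the test-operate-test-exit loop with a numeric "image" (goal state) and an operate step that closes part of the incongruity each cycle; it is purely illustrative and not drawn from the book.

```python
# Toy TOTE unit: Test the input against the image, Operate to reduce incongruity,
# Test again, and Exit once the incongruity is (effectively) gone.
def tote(current, image, operate, tolerance=1e-3, max_cycles=100):
    for _ in range(max_cycles):
        incongruity = image - current              # Test phase
        if abs(incongruity) <= tolerance:          # congruent: Exit, pass control onward
            return current
        current = operate(current, incongruity)    # Operate phase
    return current

# Example: each operation closes 30% of the remaining gap toward the goal state of 10.0
result = tote(current=0.0, image=10.0, operate=lambda state, gap: state + 0.3 * gap)
print(round(result, 3))
```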

Peter Milner, in a review in the Canadian Journal of Psychology, noted the book was short on concrete details on implementing the TOTE strategy. He also critically viewed the book as not being able to tie its model to details from neurophysiology at a molecular level. Per him, the book covered only the brain at the gross level of lesion studies, showing that some of its regions could possibly implement some TOTE strategies, without giving a reader an indication as to how the region could implement the strategy.[26]

The Psychology of Communication, 1967[edit]

Miller’s 1967 work, The Psychology of Communication, was a collection of seven previously published articles. The first “Information and Memory” dealt with chunking, presenting the idea of separating physical length (the number of items presented to be learned) and psychological length (the number of ideas the recipient manages to categorize and summarize the items with). Capacity of short-term memory was measured in units of psychological length, arguing against a pure behaviorist interpretation since meaning of items, beyond reinforcement and punishment, was central to psychological length.[28]

The second essay was the paper on the magical number seven. The third, ‘The human link in communication systems,’ used information theory and its idea of channel capacity to analyze human perceptual bandwidth. The essay concluded that how much of what impinges on us we can absorb as knowledge is limited, for each property of the stimulus, to a handful of items.[28] The paper on “Psycholinguists” described how the effort involved in both speaking and understanding a sentence was related to the degree of self-embedding (similar structures nested within one another) present when the sentence was broken down into clauses and phrases.[29] The book, in general, used the Chomskian view of language rules of grammar as having a biological basis (disproving the simple behaviorist idea that language performance improves with reinforcement) and used the tools of information theory and computation to place hypotheses on a sound theoretical framework and to analyze data practically and efficiently. Miller specifically addressed experimental data refuting the behaviorist framework at the concept level in the field of language and cognition. He noted that this only qualified behaviorism at the level of cognition, and did not overthrow it in other spheres of psychology.[28]

https://en.wikipedia.org/wiki/George_Armitage_Miller

Read Full Post »

Cancer Biology and Genomics for Disease Diagnosis (Vol. I) Now Available for Amazon Kindle

Cancer Biology and Genomics for Disease Diagnosis (Vol. I) Now Available for Amazon Kindle

Reporter: Stephen J Williams, PhD

Leaders in Pharmaceutical Business Intelligence would like to announce the First volume of their BioMedical E-Book Series C: e-Books on Cancer & Oncology

Volume One: Cancer Biology and Genomics for Disease Diagnosis

The volume is now available on Amazon Kindle at http://www.amazon.com/dp/B013RVYR2K.

This e-Book is a comprehensive review of recent Original Research on Cancer & Genomics, including related opportunities for Targeted Therapy, written by Experts, Authors, and Writers. The ebook highlights some of the recent trends and discoveries in cancer research and cancer treatment, with particular attention to how new technological and informatics advancements have ushered in paradigm shifts in how we think about, diagnose, and treat cancer. The results of Original Research gain added value for the e-Reader through the Methodology of Curation. The e-Book’s articles have been published in the Open Access Online Scientific Journal since April 2012. All new articles on this subject will continue to be incorporated as they are published, with periodic updates.

We invite e-Readers to write an Article Review on Amazon for this e-Book. All forthcoming BioMed e-Book Titles can be viewed at:

http://pharmaceuticalintelligence.com/biomed-e-books/

Leaders in Pharmaceutical Business Intelligence launched its Open Access Online Scientific Journal in April 2012 as a scientific, medical, and business multi-expert authoring environment spanning several domains of the life sciences, pharmaceutical, healthcare & medicine industries. The venture operates as an online scientific intellectual exchange at its website http://pharmaceuticalintelligence.com, for curation and reporting on frontiers in the biomedical and biological sciences, healthcare economics, pharmacology, pharmaceuticals & medicine. In addition, the venture publishes a Medical E-book Series available on Amazon’s Kindle platform.

Analyzing and sharing the vast and rapidly expanding volume of scientific knowledge has never been so crucial to innovation in the medical field. We are addressing the need to overcome this scientific information overload by:

  • delivering curation and summary interpretations of latest findings and innovations
  • on an open-access, Web 2.0 platform, with the goal of providing primarily concept-driven search in the near future
  • providing a social platform for scientists and clinicians to enter into discussion using social media
  • compiling recent discoveries and issues in yearly-updated Medical E-book Series on Amazon’s mobile Kindle platform

This curation offers better organization and visibility to the critical information useful for the next innovations in academic, clinical, and industrial research by providing these hybrid networks.

Table of Contents for Cancer Biology and Genomics for Disease Diagnosis

Preface

Introduction  The evolution of cancer therapy and cancer research: How we got here?

Part I. Historical Perspective of Cancer Demographics, Etiology, and Progress in Research

Chapter 1:  The Occurrence of Cancer in World Populations

Chapter 2.  Rapid Scientific Advances Changes Our View on How Cancer Forms

Chapter 3:  A Genetic Basis and Genetic Complexity of Cancer Emerge

Chapter 4: How Epigenetic and Metabolic Factors Affect Tumor Growth

Chapter 5: Advances in Breast and Gastrointestinal Cancer Research Supports Hope for Cure

Part II. Advent of Translational Medicine, “omics”, and Personalized Medicine Ushers in New Paradigms in Cancer Treatment and Advances in Drug Development

Chapter 6:  Treatment Strategies

Chapter 7:  Personalized Medicine and Targeted Therapy

Part III. Translational Medicine, Genomics, and New Technologies Converge to Improve Early Detection

Chapter 8:  Diagnosis                                     

Chapter 9:  Detection

Chapter 10:  Biomarkers

Chapter 11:  Imaging In Cancer

Chapter 12: Nanotechnology Imparts New Advances in Cancer Treatment, Detection, &  Imaging                                 

Epilogue by Larry H. Bernstein, MD, FACP: Envisioning New Insights in Cancer Translational Biology

 

Read Full Post »

Artificial Intelligence Versus the Scientist: Who Will Win?

Will DARPA Replace the Human Scientist: Not So Fast, My Friend!

Writer, Curator: Stephen J. Williams, Ph.D.

scientistboxingwithcomputer

Last month’s issue of Science carried an article by Jia You, “DARPA Sets Out to Automate Research”[1], which gave a glimpse of how science could be conducted in the future: without scientists. The article focused on the U.S. Defense Advanced Research Projects Agency (DARPA) program called “Big Mechanism”, a $45 million effort to develop computer algorithms which read scientific journal papers with the ultimate goal of extracting enough information to design hypotheses and the next set of experiments,

all without human input.

The head of the project, artificial intelligence expert Paul Cohen, says the overall goal is to help scientists cope with the complexity of massive amounts of information. As Paul Cohen stated for the article:

“Just when we need to understand highly connected systems as systems, our research methods force us to focus on little parts.”

The Big Mechanisms project aims to design computer algorithms to critically read journal articles, much as scientists do, to determine what information contributes to the knowledge base and how.

As a proof of concept DARPA is attempting to model Ras-mutation driven cancers using previously published literature in three main steps:

  1. Natural Language Processing: Machines read literature on cancer pathways and convert information to computational semantics and meaning

One team is focused on extracting details on experimental procedures, mining certain phraseology to judge a paper’s evidentiary worth (for example, phrases like ‘we suggest’ or ‘suggests a role in’ might be treated as weak, whereas ‘we prove’ or ‘provide evidence’ might flag an article as worthwhile to curate; a minimal sketch of this kind of phrase-based scoring follows this list). Another team, led by a computational linguistics expert, will design systems to map the meanings of sentences.

  2. Integrate each piece of knowledge into a computational model to represent the Ras pathway in oncogenesis.
  3. Produce hypotheses and propose experiments, based on the knowledge base, which can be experimentally verified in the laboratory.
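The hedging-phrase heuristic mentioned above can be sketched in a few lines. The phrase lists and scoring rule below are illustrative assumptions, not DARPA's actual algorithm:

```python
# Score a passage by the ratio of "strong" to total claim phrases it contains.
import re

WEAK_PHRASES = ["we suggest", "suggests a role in", "may indicate"]
STRONG_PHRASES = ["we prove", "provide evidence", "we demonstrate"]

def claim_strength(text: str) -> float:
    t = text.lower()
    weak = sum(len(re.findall(re.escape(p), t)) for p in WEAK_PHRASES)
    strong = sum(len(re.findall(re.escape(p), t)) for p in STRONG_PHRASES)
    total = weak + strong
    return 0.0 if total == 0 else strong / total   # 1.0 means only strong claims

print(claim_strength("We provide evidence that KRAS G12D activates MEK; we suggest a role in resistance."))  # 0.5
```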

The Human no Longer Needed?: Not So Fast, my Friend!

The DARPA research teams are encountering several problems, namely:

  • Need for data verification
  • Text mining and curation strategies
  • Incomplete knowledge base (past, current and future)
  • Molecular biology does not necessarily require “causal inference” as other fields do

Verification

Notice that this verification step (step 3) requires physical lab work, as do all other ‘omics strategies and other computational biology projects. As with high-throughput microarray screens, verification is usually needed in the form of qPCR, or interesting genes are validated in a phenotypic (expression) system. In addition, there has been an ongoing issue surrounding the validity and reproducibility of some research studies and data.

See Importance of Funding Replication Studies: NIH on Credibility of Basic Biomedical Studies

Therefore as DARPA attempts to recreate the Ras pathway from published literature and suggest new pathways/interactions, it will be necessary to experimentally validate certain points (protein interactions or modification events, signaling events) in order to validate their computer model.

Text-Mining and Curation Strategies

The Big Mechanism Project is starting very small; this reflects some of the challenges in the scale of this project. Researchers were given only six-paragraph-long passages and a rudimentary model of the Ras pathway in cancer, and then asked to automate a text-mining strategy to extract as much useful information as possible. Unfortunately this strategy could be fraught with issues frequently encountered in the biocuration community, namely:

Manual or automated curation of scientific literature?

Biocurators, the scientists who painstakingly sort through the voluminous scientific literature to extract and then organize relevant data into accessible databases, have debated whether manual, automated, or a combination of both curation methods [2] achieves the highest accuracy for extracting the information needed to enter into a database. Abigail Cabunoc, a lead developer for the Ontario Institute for Cancer Research’s WormBase (a database of nematode genetics and biology) and Lead Developer at Mozilla Science Lab, noted on her blog, covering the lively debate on biocuration methodology at the Seventh International Biocuration Conference (#ISB2014), that the massive amounts of information will require a Herculean effort regardless of the methodology.

Although I will have a future post on the advantages/disadvantages and tools/methodologies of manual vs. automated curation, there is a great article on researchinformation.info, “Extracting More Information from Scientific Literature”; also see “The Methodology of Curation for Scientific Research Findings” and “Power of Analogy: Curation in Music, Music Critique as a Curation and Curation of Medical Research Findings – A Comparison” for manual curation methodologies, and “A MOD(ern) perspective on literature curation” for a nice workflow paper on the International Society for Biocuration site.

The Big Mechanism team decided on a fully automated approach to text-mine their limited literature set for relevant information; however, they were able to extract only 40% of the information relevant to the given model from these six paragraphs. Although the investigators were happy with this percentage, most biocurators, whether using a manual or automated method to extract information, would consider 40% a low success rate. Biocurators, regardless of method, have reported the ability to extract 70-90% of relevant information from the whole literature (for example, for the Comparative Toxicogenomics Database)[3-5].

Incomplete Knowledge Base

In an earlier posting (actually a press release for our first e-book) I discussed the problem of the “data deluge” we are experiencing in the scientific literature, as well as the plethora of ‘omics experimental data which needs to be curated.

Tackling the problem of scientific and medical information overload

pubmedpapersoveryears

Figure. The number of papers listed in PubMed (disregarding reviews) during ten-year periods has steadily increased since 1970.

Analyzing and sharing the vast amounts of scientific knowledge has never been so crucial to innovation in the medical field. The publication rate has steadily increased since the 1970s, with a 50% increase in the number of original research articles published from the 1990s to the previous decade. This massive amount of biomedical and scientific information has presented the unique problem of information overload, and the critical need for methodology and expertise to organize, curate, and disseminate this diverse information for scientists and clinicians. Dr. Larry Bernstein, President of Triplex Consulting and previously chief of pathology at New York’s Methodist Hospital, concurs that “the academic pressures to publish, and the breakdown of knowledge into ‘silos’, has contributed to this knowledge explosion and although the literature is now online and edited, much of this information is out of reach to the very brightest clinicians.”

Traditionally, organization of biomedical information has been the realm of the literature review, but most reviews are performed years after discoveries are made and, given the rapid pace of new discoveries, this appears to be an outdated model. In addition, most medical searches are dependent on keywords, adding more complexity for the investigator in finding the material they require. Third, medical researchers and professionals are recognizing the need to converse with each other, in real time, on the impact new discoveries may have on their research and clinical practice.

These issues require a people-based strategy: expertise across a diverse, cross-integrated set of medical topics to provide an in-depth understanding of the current research and challenges in each field, as well as a more conceptual search platform. To address this need, human intermediaries, known as scientific curators, are needed to narrow down the information and provide critical context and analysis of medical and scientific information in an interactive manner powered by Web 2.0, with curators referred to as the “researcher 2.0”. This curation offers better organization and visibility to the critical information useful for the next innovations in academic, clinical, and industrial research by providing these hybrid networks.

Yaneer Bar-Yam of the New England Complex Systems Institute was not confident that using details from past knowledge could produce adequate roadmaps for future experimentation, noting for the article: “The expectation that the accumulation of details will tell us what we want to know is not well justified.”

In a recent post I curated findings from four lung cancer ‘omics studies and presented some graphics on the bioinformatic analysis of the novel genetic mutations resulting from these studies (see link below):

Multiple Lung Cancer Genomic Projects Suggest New Targets, Research Directions for

Non-Small Cell Lung Cancer

which showed that, while multiple genetic mutations and related pathway ontologies were well documented in the lung cancer literature, many significant genetic mutations and pathways identified in the genomic studies had little literature attributed to these lung cancer-relevant mutations.

KEGGinliteroanalysislungcancer

  This ‘literomics’ analysis reveals a large gap between our knowledge base and the data resulting from large translational ‘omic’ studies.

Different Literature Analysis Approaches Yield Different Perspectives

A ‘literomics’ approach focuses on what we do NOT know about genes, proteins, and their associated pathways, while a text-mining machine-learning algorithm focuses on building a knowledge base to determine the next line of research or what needs to be measured. Using each approach can give us different perspectives on ‘omics data.

Deriving Causal Inference

Ras is one of the best studied and characterized oncogenes, and the mechanisms behind Ras-driven oncogenesis are well understood. This, according to computational biologist Larry Hunt of Smart Information Flow Technologies, makes Ras a great starting point for the Big Mechanism project. As he states, “Molecular biology is a good place to try (developing a machine learning algorithm) because it’s an area in which common sense plays a minor role”.

Even though some may think the project wouldn’t be able to tackle other mechanisms which involve epigenetic factors, UCLA’s expert in causality Judea Pearl, Ph.D. (head of the UCLA Cognitive Systems Lab) feels it is possible for machine learning to bridge this gap. As summarized from his lecture at Microsoft:

“The development of graphical models and the logic of counterfactuals have had a marked effect on the way scientists treat problems involving cause-effect relationships. Practical problems requiring causal information, which long were regarded as either metaphysical or unmanageable can now be solved using elementary mathematics. Moreover, problems that were thought to be purely statistical, are beginning to benefit from analyzing their causal roots.”

According to him, one must first:

1) articulate assumptions

2) define the research question in counterfactual terms

Then it is possible to design an inference system, using this calculus, that tells the investigator what they need to measure.
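As a concrete toy instance of the “elementary mathematics” Pearl refers to: once the assumptions are encoded as a causal graph with a single measured confounder Z, the interventional quantity P(Y = 1 | do(X = x)) follows from the backdoor adjustment formula, i.e. the sum over z of P(Y = 1 | X = x, Z = z) P(Z = z). The probabilities below are invented purely for illustration.

```python
# Backdoor adjustment on a toy X -> Y graph with measured confounder Z.
P_Z = {0: 0.6, 1: 0.4}                          # P(Z = z), the confounder's distribution
P_Y1_given_XZ = {                               # P(Y = 1 | X = x, Z = z)
    (1, 0): 0.30, (1, 1): 0.70,
    (0, 0): 0.10, (0, 1): 0.50,
}

def p_y1_do_x(x):
    """P(Y = 1 | do(X = x)) via the backdoor adjustment over Z."""
    return sum(P_Y1_given_XZ[(x, z)] * P_Z[z] for z in P_Z)

average_causal_effect = p_y1_do_x(1) - p_y1_do_x(0)
print(round(p_y1_do_x(1), 2), round(p_y1_do_x(0), 2), round(average_causal_effect, 2))  # 0.46 0.26 0.2
```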

To watch a video of Dr. Judea Pearl’s April 2013 lecture at Microsoft Research Machine Learning Summit 2013 (“The Mathematics of Causal Inference: with Reflections on Machine Learning”), click here.

The key for the Big Mechanism Project may be in correcting for the variables among studies, in essence building a model system which may not rely on fully controlled conditions. Dr. Peter Spirtes from Carnegie Mellon University in Pittsburgh, PA is developing a project called the TETRAD project with two goals: 1) to specify and prove under what conditions it is possible to reliably infer causal relationships from background knowledge and statistical data not obtained under fully controlled conditions, and 2) to develop, analyze, implement, test and apply practical, provably correct computer programs for inferring causal structure under conditions where this is possible.

In summary, such projects and algorithms will provide investigators with the what, and possibly the how, of what should be measured.

So for now it seems we are still needed.

References

  1. You J: Artificial intelligence. DARPA sets out to automate research. Science 2015, 347(6221):465.
  2. Biocuration 2014: Battle of the New Curation Methods [http://blog.abigailcabunoc.com/biocuration-2014-battle-of-the-new-curation-methods]
  3. Davis AP, Johnson RJ, Lennon-Hopkins K, Sciaky D, Rosenstein MC, Wiegers TC, Mattingly CJ: Targeted journal curation as a method to improve data currency at the Comparative Toxicogenomics Database. Database : the journal of biological databases and curation 2012, 2012:bas051.
  4. Wu CH, Arighi CN, Cohen KB, Hirschman L, Krallinger M, Lu Z, Mattingly C, Valencia A, Wiegers TC, John Wilbur W: BioCreative-2012 virtual issue. Database : the journal of biological databases and curation 2012, 2012:bas049.
  5. Wiegers TC, Davis AP, Mattingly CJ: Collaborative biocuration–text-mining development task for document prioritization for curation. Database : the journal of biological databases and curation 2012, 2012:bas037.

Other posts on this site include: Artificial Intelligence, Curation Methodology, Philosophy of Science

Inevitability of Curation: Scientific Publishing moves to embrace Open Data, Libraries and Researchers are trying to keep up

A Brief Curation of Proteomics, Metabolomics, and Metabolism

The Methodology of Curation for Scientific Research Findings

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson and others

The growing importance of content curation

Data Curation is for Big Data what Data Integration is for Small Data

Cardiovascular Original Research: Cases in Methodology Design for Content Co-Curation The Art of Scientific & Medical Curation

Exploring the Impact of Content Curation on Business Goals in 2013

Power of Analogy: Curation in Music, Music Critique as a Curation and Curation of Medical Research Findings – A Comparison

conceived: NEW Definition for Co-Curation in Medical Research

Reconstructed Science Communication for Open Access Online Scientific Curation

Search Results for ‘artificial intelligence’

 The Simple Pictures Artificial Intelligence Still Can’t Recognize

Data Scientist on a Quest to Turn Computers Into Doctors

Vinod Khosla: “20% doctor included”: speculations & musings of a technology optimist or “Technology will replace 80% of what doctors do”

Where has reason gone?

Read Full Post »

The Reconstruction of Life Processes requires both Genomics and Metabolomics to explain Phenotypes and Phylogenetics

Writer and Curator: Larry H. Bernstein, MD, FCAP 

 

phylogenetics

http://upload.wikimedia.org/wikipedia/commons/thumb/1/12/CollapsedtreeLabels-simplified.svg/200px-CollapsedtreeLabels-simplified.svg.png

 

This discussion completes, and is an epicrisis (summary and critical evaluation) of, the series of discussions that preceded it.

  1. Innervation of Heart and Heart Rate
  2. Action of hormones on the circulation
  3. Allogeneic Transfusion Reactions
  4. Graft-versus Host reaction
  5. Unique problems of perinatal period
  6. High altitude sickness
  7. Deep water adaptation
  8. Heart-Lung-and Kidney
  9. Acute Lung Injury

The concept inherent in this series is that the genetic code is an imprint that is translated into a message.  It is much the same as a blueprint, or a darkroom photographic image that has to be converted to a print. It is biologically an innovation of an evolutionary nature because it establishes a simple and reproducible standard for the transmission of the message: strings of nucleotides (oligonucleotides) systematically transfer the message through ribonucleotides that communicate in the cytoplasm with the cytoskeleton-based endoplasmic reticulum (ER), composing a primary amino acid sequence.  This process is a quite simple and convenient method of biological activity.  However, the simplicity ends at this step.  The metabolic components of the cell are organelles consisting of lipoprotein membranes and a cytosol, which have particularly aligned active proteins, as in the inner membrane of the mitochondrion, or as in the lysosome or phagosome, or the structure of the ER, each of which is critical for energy transduction and respiration (in particular, for the mitochondria), cellular remodeling or cell death (with respect to the phagosome), construction of proteins (with respect to the ER), and anaerobic glycolysis and the hexose monophosphate shunt in the cytoplasmic domain.  All of this refers to structure and function, not to leave out the membrane-assigned transport of inorganic and organic ions (electrolytes and metabolites).

I have identified a specific role of the ER, the organelles, and the cellular transactions within and between cells that are orchestrated.  But what I have outlined is a somewhat limited and rigid model that does not reach into the dynamics of cellular transactions.  The DNA contains expressed sequences as well as old, no-longer-used messages, and this is perhaps only part of a significant portion of “dark matter”.  There is also nuclear DNA that is enmeshed with protein, mRNA that is a copy of DNA, and mDNA that is copied to ribosomal RNA (rRNA).  There is also rDNA. The classic model is DNA to RNA to protein.  However, there is also noncoding RNA, which plays an important role in the regulation of transcription.

This has been discussed in other articles.  But the important point is that proteins have secondary structure through disulfide bonds, which is determined by the position of sulfur-containing amino acids, and by van der Waals forces of attraction and repulsion. They have tertiary structure, which is critical for their 3-D shape.  When like or dissimilar subunits associate, you get homodimers, heterodimers, and larger oligomers.  These constructs that have emerged over time interact with metabolites within the cell, and also have an important interaction with the extracellular environment.

When you take this into consideration, a more complete picture emerges. The primitive cell or the multicellular organism lives in an environment that has the following characteristics: air composition, water and salinity, natural habitat, temperature, exposure to radiation, availability of nutrients, and exposure to chemical toxins or to predators.  In addition, there is a time dimension that proceeds from the embryonic stage to birth in mammals, a rapid growth phase, a tapering, and a decline.  The time span is determined by body size, fluidity of adaptation, and environmental factors.  This is covered in great detail in this work.  The last two pieces, which complete the series, are in the writing stage. Much content has already been presented in previous articles.

The function of the heart and kidneys and the metabolism of stressful conditions have already been extensively covered at http://pharmaceuticalintelligence.com in the following posts and more:

The Amazing Structure and Adaptive Functioning of the Kidneys: Nitric Oxide – Part I

http://pharmaceuticalintelligence.com/2012/11/26/the-amazing-structure-and-adaptive-functioning-of-the-kidneys/

Nitric Oxide and iNOS have Key Roles in Kidney Diseases – Part II

http://pharmaceuticalintelligence.com/2012/11/26/nitric-oxide-and-inos-have-key-roles-in-kidney-diseases/

The pathological role of IL-18Rα in renal ischemia/reperfusion injury – Nature.com

http://pharmaceuticalintelligence.com/2014/10/24/the-pathological-role-of-il-18r%CE%B1-in-renal-ischemiareperfusion-injury-nature-com/

Summary, Metabolic Pathways

http://pharmaceuticalintelligence.com/2014/10/23/summary-metabolic-pathways/

 

Read Full Post »

The Life and Work of Allan Wilson

Curator: Larry H. Bernstein, MD, FCAP

 

Allan Charles Wilson (18 October 1934 – 21 July 1991) was a Professor of Biochemistry at the University of California, Berkeley, a pioneer in the use of molecular approaches to understand evolutionary change and reconstruct phylogenies, and a revolutionary contributor to the study of human evolution. He was one of the most controversial figures in post-war biology; his work attracted a great deal of attention both from within and outside the academic world. He is the only New Zealander to have won the MacArthur Fellowship.

He is best known for the experimental demonstration of the concept of the molecular clock (with his doctoral student Vincent Sarich), which had been theoretically postulated by Linus Pauling and Emile Zuckerkandl, and for revolutionary insights into the molecular anthropology of higher primates and human evolution, known as the Mitochondrial Eve hypothesis (with his doctoral students Rebecca L. Cann and Mark Stoneking).

Allan Wilson was born in Ngaruawahia, New Zealand, and raised on his family’s rural dairy farm at Helvetia, Pukekohe, about twenty miles south of Auckland. At his local Sunday School, the vicar’s wife was impressed by young Allan’s interest in evolution and encouraged Allan’s mother to enroll him at the elite King’s College secondary school in Auckland. There he excelled in mathematics, chemistry, and sports.

Wilson already had an interest in evolution and biochemistry, but intended to be the first in his family to attend university by pursuing studies in agriculture and animal science. Wilson met Professor Campbell Percy McMeekan, a New Zealand pioneer in animal science, who suggested that Wilson attend the University of Otago in southern New Zealand to further his study in biochemistry rather than veterinary science. Wilson gained a BSc from the University of Otago in 1955, majoring in both zoology and biochemistry.

The bird physiologist Donald S. Farner met Wilson as an undergraduate at Otago and invited him to Washington State University at Pullman as his graduate student. Wilson obliged and completed a master’s degree in zoology at WSU under Farner in 1957, where he worked on the effects of photoperiod on the physiology of birds.

Wilson then moved to the University of California, Berkeley, to pursue his doctoral research. At the time the family thought Allan would only be gone two years. Instead, Wilson remained in the United States, gaining his PhD at Berkeley in 1961 under the direction of biochemist Arthur Pardee for work on the regulation of flavin biosynthesis in bacteria. From 1961 to 1964, Wilson studied as a post-doc under biochemist Nathan O. Kaplan at Brandeis University in Waltham, Massachusetts. In Kaplan’s lab, working with lactate and malate dehydrogenases, Wilson was first introduced to the nascent field of molecular evolution. Nate Kaplan was one of the very earliest pioneers to address phylogenetic problems with evidence from protein molecules, an approach that Wilson later famously applied to human evolution and primate relationships. After Brandeis, Wilson returned to Berkeley where he set up his own lab in the Biochemistry department, remaining there for the rest of his life.

Wilson joined the UC Berkeley faculty of biochemistry in 1964, and was promoted to full professor in 1972. His first major scientific contribution was published as Immunological Time-Scale For Hominid Evolution in the journal Science in December 1967. With his student Vincent Sarich, he showed that evolutionary relationships of the human species with other primates, in particular the Great Apes (chimpanzees, gorillas, and orangutans), could be inferred from molecular evidence obtained from living species, rather than solely from fossils of extinct creatures.

Their microcomplement fixation method (see complement system) measured the strength of the immune reaction between an antigen (serum albumin) from one species and an antibody raised against the same antigen in another species. The strength of the antibody-antigen reaction was known to be stronger between more closely related species: their innovation was to measure it quantitatively among many species pairs as an “immunological distance”. When these distances were plotted against the divergence times of species pairs with well-established evolutionary histories, the data showed that the molecular difference increased linearly with time, in what was termed a “molecular clock”. Given this calibration curve, the time of divergence between species pairs with unknown or uncertain fossil histories could be inferred. Most controversially, their data suggested that divergence times between humans, chimpanzees, and gorillas were on the order of 3~5 million years, far less than the estimates of 9~30 million years accepted by conventional paleoanthropologists from fossil hominids such as Ramapithecus. This ‘recent origin’ theory of human/ape divergence remained controversial until the discovery of the “Lucy” fossils in 1974.
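The calibration logic described above can be shown with a small numeric sketch: fit immunological distance against divergence time for pairs with well-dated histories, then invert the fit to date a pair without one. All numbers below are invented for illustration and are not Sarich and Wilson's data.

```python
# Linear "molecular clock" calibration and inversion on made-up data.
import numpy as np

divergence_mya = np.array([10, 20, 30, 40, 60], dtype=float)        # known divergence times (Myr ago)
immuno_distance = np.array([6.1, 12.3, 17.8, 24.5, 35.9])           # measured immunological distances

slope, intercept = np.polyfit(divergence_mya, immuno_distance, 1)   # distance ~ slope * time + intercept

def estimate_divergence(distance):
    """Invert the linear calibration to estimate a divergence time for a new pair."""
    return (distance - intercept) / slope

print(round(estimate_divergence(3.0), 1), "million years ago")      # roughly 5 Myr for this toy distance
```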

Wilson and another PhD student Mary-Claire King subsequently compared several lines of genetic evidence (immunology, amino acid differences, and protein electrophoresis) on the divergence of humans and chimpanzees, and showed that all methods agreed that the two species were >99% similar.[4][19] Given the large organismal differences between the two species in the absence of large genetic differences, King and Wilson argued that it was not structural gene differences that were responsible for species differences, but gene regulation of those differences, that is, the timing and manner in which near-identical gene products are assembled during embryology and development. In combination with the “molecular clock” hypothesis, this contrasted sharply with the accepted view that larger or smaller organismal differences were due to large or smaller rates of genetic divergence.

In the early 1980s, Wilson further refined traditional anthropological thinking with his work with PhD students Rebecca Cann and Mark Stoneking on the so-called “Mitochondrial Eve” hypothesis.[20] In his efforts to identify informative genetic markers for tracking human evolutionary history, he focused on mitochondrial DNA (mtDNA) — genes that are found in mitochondria in the cytoplasm of the cell outside the nucleus. Because of its location in the cytoplasm, mtDNA is passed exclusively from mother to child, the father making no contribution, and in the absence of genetic recombination defines female lineages over evolutionary timescales. Because it also mutates rapidly, it is possible to measure the small genetic differences between individuals within species by restriction endonuclease gene mapping. Wilson, Cann, and Stoneking measured differences among many individuals from different human continental groups, and found that humans from Africa showed the greatest inter-individual differences, consistent with an African origin of the human species (the so-called “Out of Africa” hypothesis). The data further indicated that all living humans shared a common maternal ancestor, who lived in Africa only a few hundreds of thousands of years ago.

This common ancestor became widely known in the media and popular culture as the Mitochondrial Eve. This had the unfortunate and erroneous implication that only a single female lived at that time, when in fact the occurrence of a coalescent ancestor is a necessary consequence of population genetic theory, and the Mitochondrial Eve would have been only one of many humans (male and female) alive at that time.[2][3] This finding was, like his earlier results, not readily accepted by anthropologists. The conventional hypothesis was that various human continental groups had evolved from diverse ancestors, over several million years since divergence from chimpanzees. The mtDNA data, however, strongly suggested that all humans descended from a common, quite recent, African mother.

Wilson became ill with leukemia, and after a bone marrow transplant, died on Sunday, 21 July 1991, at the Fred Hutchinson Memorial Cancer Research Center in Seattle. He had been scheduled to give the keynote address at an international conference the same day. He was 56, at the height of his scientific recognition and powers.

Wilson’s success can be attributed to his strong interest and depth of knowledge in biochemistry and evolutionary biology, his insistence on the quantification of evolutionary phenomena, and his early recognition of new molecular techniques that could shed light on questions of evolutionary biology. After the development of quantitative immunological methods, his lab was the first to recognize restriction endonuclease mapping analysis as a quantitative evolutionary genetic method, which led to his early use of DNA sequencing, and the then-nascent technique of PCR, to obtain large DNA sets for genetic analysis of populations. He trained scores of undergraduate, graduate (34 people, 17 each of men and women, received their doctoral degrees in his lab), and post-doctoral students in molecular evolutionary biology, including sabbatical visitors from six continents. His lab published more than 300 technical papers, and was recognized as a mecca for those wishing to enter the field of molecular evolution in the 1970s and 1980s.

The Allan Wilson Centre for Molecular Ecology and Evolution was established in 2002 in his honour to advance knowledge of the evolution and ecology of New Zealand and Pacific plant and animal life, and human history in the Pacific. The Centre is under the Massey University, at Palmerston North, New Zealand, and is a national collaboration involving the University of Auckland, Victoria University of Wellington, the University of Otago, University of Canterbury and the New Zealand Institute for Plant and Food Research.

A 41-minute documentary film of his life, entitled Allan Wilson, Evolutionary: Biochemist, Biologist, Giant of Molecular Biology, was released by Films Media Group in 2008.

 

Allan Charles Wilson. 18 October 1934 — 21 July 1991

Rebecca L. Cann

Department of Cell and Molecular Biology, University of Hawaii at Manoa, Biomedical Sciences Building T514, 1960 East–West Rd, Honolulu, HI 96822, USA

Abstract

Allan Charles Wilson was born on 18 October 1934 at Ngaruawahia, New Zealand. He died in Seattle, Washington, on 21 July 1991 while undergoing treatment for leukemia.  Allan was known as a pioneering and highly innovative biochemist, helping to define the field of molecular evolution and establish the use of a molecular clock to measure evolutionary change between living species. The molecular clock, a method of measuring the timescale of evolutionary change between two organisms on the basis of the number of mutations that they have accumulated since last sharing a common genetic ancestor, was an idea initially championed by Émile Zuckerkandl and Linus Pauling (Zuckerkandl & Pauling 1962), on the basis of their observations that the number of changes in an amino acid sequence was roughly linear with time in the aligned hemoglobin proteins of animals. Although it is now not unusual to see the words ‘molecular evolution’ and ‘molecular phylogeny’ together, when Allan formed his own biochemistry laboratory in 1964 at the University of California, Berkeley, many scientists in the field of evolutionary biology considered these ideas complete heresy. Allan’s death at the relatively young age of 56 years left behind his wife, Leona (deceased in 2009), a daughter, Ruth (b. 1961), and a son, David (b. 1964), as well as his mother, Eunice (deceased in 2002), a younger brother, Gary Wilson, and a sister, Colleen Macmillan, along with numerous nieces, nephews and cousins in New Zealand, Australia and the USA. In this short span of time, he trained more than 55 doctoral students and helped launch the careers of numerous postdoctoral fellows.

Allan Charles Wilson, Biochemistry; Molecular Biology: Berkeley

1934-1991

Professor

The sudden death of Allan Wilson, of leukemia, on 21 July 1991, at the age of 56, and at the height of his powers, robbed the Berkeley campus and the international scientific community of one of its most active and respected leaders.

Read Full Post »

Leaders in Pharmaceutical Business Intelligence Announced New Cardiovascular Series of e-Books at SACHS Associates 14th Annual Biotech In Europe Forum

Reporter: Aviva Lev-Ari, PhD, RN

 

 

Please see Further Titles at

http://pharmaceuticalintelligence.com/biomed-e-books/

Please see Further Information on the Sachs Associates 14th Annual Biotech in Europe Forum for Global Investing & Partnering at:

http://pharmaceuticalintelligence.com/2014/03/25/14th-annual-biotech-in-europe-forum-for-global-partnering-investment-930-1012014-%E2%80%A2-congress-center-basel-sachs-associates-london/

AND

http://www.sachsforum.com/basel14/index.html

ON TWITTER Follow at

@SachsAssociates

#Sachs14thBEF

@pharma_BI

@AVIVA1950 

Read Full Post »

Extracellular evaluation of intracellular flux in yeast cells

Larry H. Bernstein, MD, FCAP, Reviewer and Curator

Leaders in Pharmaceutical Intelligence

This is the fourth article in a series on metabolomics, which is a major development in -omics, integrating transcriptomics, proteomics, genomics, metabolic pathways analysis, and metabolic and genomic regulatory control using computational mapping.  In the previous two-part presentation, flux analysis was not a topic for evaluation, but here it is the major focus.  It is a study of yeast cells, and bears some relationship to the comparison of glycemia, oxidative phosphorylation, the TCA cycle, and the ETC in leukemia cell lines.  In the previous study, system flux was beyond the scope of analysis, as explicitly stated.  The inferences made in comparing the two lymphocytic leukemia cell lines were about intracellular metabolism, drawn from extracellular measurements.  The study of yeast cells is aimed at looking at cellular effluxes, which is also an important method for studying pharmacological effects and drug resistance.

Metabolomic series

1.  Metabolomics, Metabonomics and Functional Nutrition: the next step in nutritional metabolism and biotherapeutics

http://pharmaceuticalintelligence.com/2014/08/22/metabolomics-metabonomics-and-functional-nutrition-the-next-step-in-nutritional-metabolism-and-biotherapeutics/

2.  Metabolomic analysis of two leukemia cell lines. I

http://pharmaceuticalintelligence.com/2014/08/23/metabolomic-analysis-of-two-leukemia-cell-lines-_i/

3.  Metabolomic analysis of two leukemia cell lines. II.

 http://pharmaceuticalintelligence.com/2014/08/24/metabolomic-analysis-of-two-leukemia-cell-lines-ii/

4.  Extracellular evaluation of intracellular flux in yeast cells

Q1. What is efflux?

Q2. What measurements were excluded from the previous study that would not allow inference about fluxes?

Q3. Would this study bear any relationship to the Pasteur effect?

Q4 What is a genome scale network reconstruction?

Q5 What type of information is required for a network prediction model?

Q6. Is there a difference between the metabolite profiles for yeast grown under aerobic and anaerobic conditions – under the constraints?

Q7.  If there is a difference in the S metabolism, would there be an effect on ATP production?

 

 

Connecting extracellular metabolomic measurements to intracellular flux states in yeast

Monica L Mo [1], Bernhard Ø Palsson [1] and Markus J Herrgård [1,2]*

Author Affiliations

1 Department of Bioengineering, University of California, San Diego, La Jolla, CA 92093, USA

2 Current address: Synthetic Genomics, Inc, 11149 N Torrey Pines Rd, La Jolla, CA 92037, USA


BMC Systems Biology 2009, 3:37  doi:10.1186/1752-0509-3-37

 

The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1752-0509/3/37

 

Received: 15 December 2008
Accepted: 25 March 2009
Published: 25 March 2009

© 2009 Mo et al; licensee BioMed Central Ltd.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Background

Metabolomics has emerged as a powerful tool in the

  • quantitative identification of physiological and disease-induced biological states.

Extracellular metabolome or metabolic profiling data, in particular,

  • can provide an insightful view of intracellular physiological states in a noninvasive manner.

Results

We used an updated genome-scale

  • metabolic network model of Saccharomyces cerevisiae, iMM904, to investigate
  1. how changes in the extracellular metabolome can be used
  2. to study systemic changes in intracellular metabolic states.

The iMM904 metabolic network was reconstructed based on

  • an existing genome-scale network, iND750,
  • and includes 904 genes and 1,412 reactions.

The network model was first validated by

  • comparing 2,888 in silico single-gene deletion strain growth phenotype predictions
  • to published experimental data.

Extracellular metabolome data measured

  • in response to environmental and genetic perturbations
  • of ammonium assimilation pathways

was then integrated with the iMM904 network

  • in the form of relative overflow secretion constraints and
  • a flux sampling approach was used to characterize candidate flux distributions allowed by these constraints.

Predicted intracellular flux changes were

  • consistent with published measurements
  • on intracellular metabolite levels and fluxes.

Patterns of predicted intracellular flux changes

  • could also be used to correctly identify the regions of
  • the metabolic network that were perturbed.

Conclusion

Our results indicate that

  • integrating quantitative extracellular metabolomic profiles
  • in a constraint-based framework
  • enables inferring changes in intracellular metabolic flux states.

Similar methods could potentially be applied

  • towards analyzing biofluid metabolome variations
  • related to human physiological and disease states.

Background

“Omics” technologies are rapidly generating high amounts of data

  • at varying levels of biological detail.

In addition, there is a rapidly growing literature and

  • accompanying databases that compile this information.

This has provided the basis for the assembly of

  • genome-scale metabolic networks for various microbial and eukaryotic organisms [1-11].

These network reconstructions serve

  • as manually curated knowledge bases of
  • biological information as well as
  • mathematical representations of biochemical components and
  • interactions specific to each organism.

A genome-scale network reconstruction is

  • a structured collection of genes, proteins, biochemical reactions, and metabolites
  • determined to exist and operate within a particular organism.

This network can be converted into a predictive model

  • that enables in silico simulations of allowable network states based on
  • governing physico-chemical and genetic constraints [12,13].

A wide range of constraint-based methods have been developed and applied

  • to analyze network metabolic capabilities under
  • different environmental and genetic conditions [13].

These methods have been extensively used to

  • study genome-scale metabolic networks and have successfully predicted, for example,
  1. optimal metabolic states,
  2. gene deletion lethality, and
  3. adaptive evolutionary endpoints [14-16].

Most of these applications utilize

  • optimization-based methods such as flux balance analysis (FBA)
  • to explore the metabolic flux space.
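To make the constraint-based idea concrete, here is a minimal flux balance analysis sketch on an invented toy network; the stoichiometric matrix, reaction names, and bounds are illustrative assumptions, not the iMM904 or iND750 models discussed here. It maximizes a biomass flux subject to steady-state mass balance (S·v = 0) and flux bounds:

```python
# Toy FBA: maximize biomass export subject to S.v = 0 and flux bounds.
import numpy as np
from scipy.optimize import linprog

# Reactions: R0 uptake of A, R1: A -> B, R2: A -> C, R3: B + C -> biomass, R4: biomass export
# Metabolites (rows): A, B, C, biomass
S = np.array([
    [ 1, -1, -1,  0,  0],   # A
    [ 0,  1,  0, -1,  0],   # B
    [ 0,  0,  1, -1,  0],   # C
    [ 0,  0,  0,  1, -1],   # biomass
], dtype=float)

lower = [0, 0, 0, 0, 0]                 # all reactions irreversible in this toy model
upper = [10, 1000, 1000, 1000, 1000]    # uptake of A capped at 10 flux units

c = np.zeros(S.shape[1]); c[4] = -1.0   # maximize R4 (linprog minimizes, hence the sign)

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=list(zip(lower, upper)), method="highs")
print("optimal biomass flux:", -res.fun)      # 5.0 for this toy network
print("flux distribution:", res.x)
```

Genome-scale models such as iMM904 follow the same linear-algebra pattern, only with on the order of a thousand reactions and dedicated solvers.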

However, the behavior of genome-scale metabolic networks can also be studied

  • using unbiased approaches such as
  • uniform random sampling of steady-state flux distributions [17].

Instead of identifying a single optimal flux distribution based on

  • a given optimization criterion (e.g. biomass production),

these methods allow statistical analysis of

  • a large range of possible alternative flux solutions determined by
  • constraints imposed on the network.

Sampling methods have been previously used to study

  1. global organization of E. coli metabolism [18] as well as
  2. to identify candidate disease states in the cardiomyocyte mitochondria [19].

Network reconstructions provide a structured framework

  • to systematically integrate and analyze disparate datasets
  • including transcriptomic, proteomic, metabolomic, and fluxomic data.

Metabolomic data is one of the more relevant data types for this type of analysis as

  1. network reconstructions define the biochemical links between metabolites, and
  2. recent advancements in analytical technologies have allowed increasingly comprehensive
  • intracellular and extracellular metabolite level measurements [20,21].

The metabolome is

  1. the set of metabolites present under a given physiological condition
  2. at a particular time and is the culminating phenotype resulting from
  • various “upstream” control mechanisms of metabolic processes.

Of particular interest to this present study are

  • the quantitative profiles of metabolites that are secreted into the extracellular environment
  • by cells under different conditions.

Recent advances in profiling the extracellular metabolome (EM) have allowed

  • obtaining insightful biological information on cellular metabolism
  • without disrupting the cell itself.

This information can be obtained through various

  • analytical detection,
  • identification, and
  • quantization techniques

for a variety of systems ranging from

  • unicellular model organisms to human biofluids [20-23].

Metabolite secretion by a cell reflects its internal metabolic state, and

  • its composition varies in response to
  • genetic or experimental perturbations
  • due to changes in intracellular pathway activities
  • involved in the production and utilization of extracellular metabolites [21].

Variations in metabolic fluxes can be reflected in EM changes which can

  • provide insight into the intracellular pathway activities related to metabolite secretion.

The extracellular metabolomic approach has already shown promise

  • in a variety of applications, including
  1. capturing detailed metabolite biomarker variations related to disease and
  2. drug-induced states and
  3. characterizing gene functions in yeast [24-27].

However, interpreting changes in the extracellular metabolome can be challenging

  • due to the indirect relationship between the proximal cause of the change
    (e.g. a mutation)
  • and metabolite secretion.

Since metabolic networks describe

  • mechanistic,
  • biochemical links between metabolites,

integrating such data can allow a systematic approach

  • to identifying altered pathways linked to
  • quantitative changes in secretion profiles.

Measured secretion rates of major byproduct metabolites

  • can be applied as additional exchange flux constraints
  • that define observed metabolic behavior.

For example, a recent study integrating small-scale EM data

  • with a genome-scale yeast model
  • correctly predicted oxygen consumption and ethanol production capacities
  • in mutant strains with respiratory deficiencies [28].

The respiratory deficient mutant study

  • used high accuracy measurements for a small number of
  • major byproduct secretion rates
  • together with an optimization-based method well suited for such data.

Here, we expand the application range of the model-based method used in [28]

  • to extracellular metabolome profiles,
  • which represent a temporal snapshot of the relative abundance
  • for a larger number of secreted metabolites.

Our approach is complementary to

  • statistical (i.e. “top-down”) approaches to metabolome analysis [29]
  • and can potentially be used in applications such as biofluid-based diagnostics or
  • large-scale characterization of mutant strains using metabolite profiles.

This study implements a constraint-based sampling approach on

  • an updated genome-scale network of yeast metabolism
  • to systematically determine how EM level variations

are linked to global changes in intracellular metabolic flux states.

By using a sampling-based network approach and statistical methods (Figure 1),

  • EM changes were linked to systemic intracellular flux perturbations
    in an unbiased manner
  • without relying on defining single optimal flux distributions
  • used in the previously mentioned study [28].

The inferred perturbations in intracellular reaction fluxes were further analyzed

  • using reporter metabolite and subsystem (i.e., metabolic pathway) approaches [30]
  • in order to identify dominant metabolic features that are collectively perturbed (Figure 2).

The sampling-based approach also has the additional benefit of

  • being less sensitive to inaccuracies in metabolite secretion profiles than
  • optimization-based methods and can effectively be used in biofluid metabolome analysis.


Figure 1. Schematic illustrating the integration of exometabolomic (EM) data with the constraint-based framework.

(A) Cells are subjected to genetic and/or environmental perturbations to secrete metabolite patterns unique to that condition.
(B) EM is detected, identified, and quantified.
(C) EM data is integrated as required secretion flux constraints to define allowable solution space.
(D) Random sampling of solution space yields the range of feasible flux distributions for intracellular reactions.
(E) Sampled fluxes are compared to sampled fluxes of another condition to determine

  • which metabolic regions were altered between the two conditions (see Figure 2).

(F) Significantly altered metabolic regions are identified.

http://www.biomedcentral.com/content/figures/1752-0509-3-37-1.jpg

 


Figure 2. Schematic of sampling and scoring analysis to determine intracellular flux changes.

(A) Reaction fluxes are sampled for two conditions.
(B & C) A sample of flux differences is calculated by selecting random flux values from each condition

  • to obtain a distribution of flux differences for each reaction.

(D) Standardized reaction Z-scores are determined, which represent

  • how far the sampled flux differences deviate from a zero flux change.

Reaction scores can be used in

  1. visualizing perturbation subnetworks and
  2. analyzing reporter metabolites and subsystems.

http://www.biomedcentral.com/content/figures/1752-0509-3-37-2.jpg

This study was divided into two parts and describes:

(i) the reconstruction and validation of an expanded S. cerevisiae metabolic network, iMM904; and
(ii) the systematic inference of intracellular metabolic states from

  • two yeast EM data sets using a constraint-based sampling approach.

The first EM data set compares wild-type yeast to the gdh1/GDH2 (glutamate dehydrogenase) strain [31];

  • analysis of this data set indicated good agreement between predicted metabolic changes
  • and measured intracellular metabolite levels and fluxes [31,32].

The second EM data set focused on secreted amino acid measurements

  • from a separate study of yeast cultured in different
    ammonium and potassium concentrations [33].

We analyzed the EM data to gain further insight into

  • perturbed ammonium assimilation processes as well as
  1. metabolic states relating potassium limitation and
  2. ammonium excess conditions to one another.

The model-based analysis of both

  • separately published extracellular metabolome datasets
  • suggests a relationship between
  1. glutamate,
  2. threonine and
  3. folate metabolism,
  • which are collectively perturbed when
    ammonium assimilation processes are broadly disrupted
  1. either by environmental (excess ammonia) or
  2. genetic (gene deletion/overexpression) perturbations.

The methods herein present an approach to

  • interpreting extracellular metabolome data and
  • associating these measured secreted metabolite variations
  • to changes in intracellular metabolic network states.

Additional file 1. iMM904 network content.

The data provided represent the content description of the iMM904 metabolic network and
detailed information on the expanded content.

Format: XLS Size: 2.7MB Download file

This file can be viewed with: Microsoft Excel Viewer

Additional file 2. iMM904 model files.

The data provided are the model text files of the iMM904 metabolic network
that is compatible with the available COBRA Toolbox [13]. The model structure
can be loaded into Matlab using the ‘SimPhenyPlus’ format with GPR and compound information.

Format: ZIP Size: 163KB Download file

Conversion of the network to a predictive model

The network reconstruction was converted to a constraint-based model using established procedures [13].

Network reactions and metabolites were assembled into a stoichiometric matrix S

  • containing the stoichiometric coefficients of the reactions in the network.

The steady-state solution space containing possible flux distributions

  • is determined by calculating the null space of S, i.e. the set of vectors v satisfying S·v = 0,

where v is the reaction flux vector; a minimal numerical sketch is given below.
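To make the constraint-based setup concrete, here is a minimal sketch in Python/SciPy (not the Matlab COBRA Toolbox implementation used in the study) showing how a hypothetical three-reaction stoichiometric matrix, flux bounds, and the steady-state condition S·v = 0 define a linear program whose solutions are feasible flux distributions. The toy network and all numbers are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (metabolites x reactions); hypothetical network:
# columns: uptake (-> A), conversion (A -> B), biomass drain (B ->)
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A
    [0.0,  1.0, -1.0],   # metabolite B
])

lb = [0.0, 0.0, 0.0]           # irreversible reactions
ub = [10.0, 1000.0, 1000.0]    # uptake capped at 10 mmol/h/gDW

# FBA-style LP: maximize the biomass drain subject to S v = 0 and the bounds
c = [0.0, 0.0, -1.0]           # linprog minimizes, so negate the objective
res = linprog(c, A_eq=S, b_eq=np.zeros(2),
              bounds=list(zip(lb, ub)), method="highs")
print("feasible optimal flux distribution:", res.x)   # -> [10., 10., 10.]
```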

Minimal media conditions were set through constraints on exchange fluxes

  • corresponding to the experimentally measured substrate uptake rates.

All the model-based calculations were done using the Matlab COBRA Toolbox [13]

  • utilizing the glpk or Tomlab/CPLEX (Tomopt, Inc.) optimization solvers.

Chemostat growth simulations

The iMM904 model was initially validated by

  1. simulating wild type yeast growth in aerobic and anaerobic
    carbon-limited chemostat conditions
  2. and comparing the simulation results to published experimental data

on substrate uptake and byproduct secretion in these conditions [34].

The study was performed following the approach taken to validate the iFF708 model in a previous study [35].

The predicted glucose uptake rates were determined

  1. by setting the in silico growth rate to the measured dilution rate
     (the two are equivalent under continuous culture growth),
  2. and minimizing the glucose uptake rate, as sketched below.
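A hedged sketch of that two-step calculation, assuming a generic model (S, lb, ub) and placeholder indices for the growth and glucose-uptake reactions (these identifiers are not taken from iMM904):

```python
import numpy as np
from scipy.optimize import linprog

def min_glucose_at_dilution(S, lb, ub, growth_idx, glc_uptake_idx, dilution_rate):
    """Fix growth to the dilution rate, then minimize glucose uptake (FBA-style LP).
    Assumes uptake is written as a positive consuming flux; indices are placeholders."""
    lb, ub = np.array(lb, float), np.array(ub, float)
    lb[growth_idx] = ub[growth_idx] = dilution_rate   # chemostat steady state
    c = np.zeros(S.shape[1])
    c[glc_uptake_idx] = 1.0                           # minimize glucose uptake flux
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=list(zip(lb, ub)), method="highs")
    return res.x[glc_uptake_idx] if res.success else None
```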

The accuracy of in silico predictions of

  • substrate uptake and byproduct secretion by the iMM904 model
  • was similar to the accuracy obtained using the iFF708 model
  • and results are shown in Figure S1 [see Additional file 3].

Additional file 3. Supplemental figures. 

The file provides the supplemental figures and descriptions of S1, S2, S3, and S4.

Format: PDF Size: 513KB Download file

This file can be viewed with: Adobe Acrobat Reader

Genome-scale gene deletion phenotype predictions

The iMM904 network was further validated by

  • performing genome-scale gene lethality computations
  • following established procedures to determine growth phenotypes
  1. under minimal medium conditions and
  2. compared to published data.

A modified version of the biomass function used in previous iND750 studies

  1. was set as the objective to be maximized and
  2. gene deletions were simulated by

setting the flux through the corresponding reaction(s) to zero.
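Conceptually, a gene deletion removes a reaction only when its gene-protein-reaction (GPR) rule can no longer be satisfied, and the corresponding flux bounds are then set to zero. A minimal illustrative sketch of that step (the GPR strings and gene names below are hypothetical, not iMM904 entries):

```python
def gpr_active(gpr_rule, deleted_genes):
    """Evaluate a boolean GPR rule such as '(g1 and g2) or g3' after a deletion."""
    if not gpr_rule:                      # no gene association: reaction stays active
        return True
    tokens = gpr_rule.replace("(", " ( ").replace(")", " ) ").split()
    expr = " ".join(tok if tok in ("and", "or", "(", ")")
                    else str(tok not in deleted_genes) for tok in tokens)
    return eval(expr)                     # expression contains only True/False/and/or

def apply_deletion(lb, ub, gpr_rules, deleted_genes):
    """Zero out the bounds of every reaction whose GPR rule is no longer satisfied."""
    lb, ub = list(lb), list(ub)
    for i, rule in enumerate(gpr_rules):
        if not gpr_active(rule, deleted_genes):
            lb[i] = ub[i] = 0.0
    return lb, ub

# isozyme example: deleting g1 alone does not knock out '(g1 and g2) or g3'
print(gpr_active("(g1 and g2) or g3", {"g1"}))   # True
```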

The biomass function was based on the experimentally measured

  1. composition of major cellular constituents
  2. during exponential growth of yeast cells and
  3. was reformulated to include trace amounts of
  4. additional cofactors and metabolites
  5. with the assumed fractional contribution of 10-.

These additional biomass compounds were included

according to the biomass formulation used in the iLL672 study

  • to improve lethality predictions through
  • the inclusion of additional essential biomass components [3].

The model was constrained by limiting

  1. the carbon source uptake to 10 mmol/h/gDW
  2. and oxygen uptake to 2 mmol/h/gDW.

Ammonia, phosphate, and sulfate were assumed to be non-limiting.

The experimental phenotyping data was obtained

  • using strains that were auxotrophic for
  1. methionine,
  2. leucine,
  3. histidine, and
  4. uracil.

These auxotrophies were simulated

  1. by deleting the appropriate genes from the model and
  2. supplementing the in silico strain with the appropriate supplements
  3. at non-limiting, but low levels.

Furthermore, trace amounts of essential nutrients that are present

  • in the experimental minimal media formulation
  1. (4-aminobenzoate,
  2. biotin,
  3. inositol,
  4. nicotinate,
  5. pantothenate,
  6. thiamin)
  • were supplied in the simulations [3].

Three distinct methods to simulate the outcome of gene deletions were utilized:

  1. Flux-balance analysis (FBA) [36-38],
  2. Minimization of Metabolic Adjustment (MoMA) [39], and
  3. a linear version of MoMA (linearMoMA).

In the linearMoMA method, minimization of the quadratic objective function
of the original MoMA algorithm

  • was replaced by minimization of the corresponding 1-norm objective function
    (i.e. sum of the absolute values of the differences of wild type FBA solution
    and the knockout strain flux solution).
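As a rough illustration of how the 1-norm objective can be encoded as a linear program (a standard reformulation with auxiliary variables, not the authors' code), assuming a generic model (S, lb, ub) and a precomputed wild-type FBA flux vector:

```python
import numpy as np
from scipy.optimize import linprog

def linear_moma(S, lb, ub, v_wt):
    """Minimize sum_i |v_i - v_wt_i| subject to S v = 0 and the (knockout) bounds.
    Standard LP trick: auxiliary variables t_i >= |v_i - v_wt_i|."""
    m, n = S.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])        # minimize sum of t
    A_eq = np.hstack([S, np.zeros((m, n))])              # S v = 0 (t unconstrained here)
    I = np.eye(n)
    A_ub = np.vstack([np.hstack([I, -I]),                #  v - t <=  v_wt
                      np.hstack([-I, -I])])              # -v - t <= -v_wt
    b_ub = np.concatenate([v_wt, -v_wt])
    bounds = list(zip(lb, ub)) + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=np.zeros(m),
                  bounds=bounds, method="highs")
    return res.x[:n] if res.success else None
```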

The computed results were then compared to growth phenotype data
(viable/lethal) from a previously published experimental gene deletion study [3].

The comparison between experimental and in silico deletion phenotypes involved

  • choosing a threshold for the predicted relative growth rate of
  • a deletion strain that is considered to be viable.

We used standard ROC curve analysis

  • to assess the accuracy of different prediction methods and models
  • across the full range of the viability threshold parameter,
    results shown in Figure S2 [see Additional file 3].

The ROC curve plots the true viable rate against the false viable rate

  • allowing comparison of different models and methods
  • without requiring arbitrarily choosing this parameter a priori [40].

The optimal prediction performance corresponds to

  • the point closest to the top left corner of the ROC plot
    (i.e. 100% true viable rate, 0% false viable rate).

Table 1. Comparison of iMM904 and iLL672 gene deletion predictions and experimental data under minimal media conditions

| Media | Model | Method | True viable | False viable | False lethal | True lethal | True viable % | False viable % | MCC |
|---|---|---|---|---|---|---|---|---|---|
| Glucose | iMM904 full | FBA | 647 | 10 | 32 | 33 | 95.29 | 23.26 | 0.6 |
| Glucose | iMM904 full | linMOMA | 644 | 10 | 35 | 33 | 94.85 | 23.26 | 0.58 |
| Glucose | iMM904 full | MOMA | 644 | 10 | 35 | 33 | 94.85 | 23.26 | 0.58 |
| Glucose | iMM904 red | FBA | 440 | 9 | 28 | 33 | 94.02 | 21.43 | 0.61 |
| Glucose | iMM904 red | linMOMA | 437 | 9 | 31 | 33 | 93.38 | 21.43 | 0.6 |
| Glucose | iMM904 red | MOMA | 437 | 9 | 31 | 33 | 93.38 | 21.43 | 0.6 |
| Glucose | iLL672 full | MOMA | 433 | 9 | 35 | 33 | 92.52 | 21.43 | 0.57 |
| Galactose | iMM904 full | FBA | 595 | 32 | 36 | 59 | 94.29 | 35.16 | 0.58 |
| Galactose | iMM904 full | linMOMA | 595 | 32 | 36 | 59 | 94.29 | 35.16 | 0.58 |
| Galactose | iMM904 full | MOMA | 595 | 32 | 36 | 59 | 94.29 | 35.16 | 0.58 |
| Galactose | iMM904 red | FBA | 409 | 12 | 33 | 56 | 92.53 | 17.65 | 0.67 |
| Galactose | iMM904 red | linMOMA | 409 | 12 | 33 | 56 | 92.53 | 17.65 | 0.67 |
| Galactose | iMM904 red | MOMA | 409 | 12 | 33 | 56 | 92.53 | 17.65 | 0.67 |
| Galactose | iLL672 full | MOMA | 411 | 19 | 31 | 49 | 92.99 | 27.94 | 0.61 |
| Glycerol | iMM904 full | FBA | 596 | 43 | 36 | 47 | 94.3 | 47.78 | 0.48 |
| Glycerol | iMM904 full | linMOMA | 595 | 44 | 37 | 46 | 94.15 | 48.89 | 0.47 |
| Glycerol | iMM904 full | MOMA | 598 | 44 | 34 | 46 | 94.62 | 48.89 | 0.48 |
| Glycerol | iMM904 red | FBA | 410 | 20 | 34 | 46 | 92.34 | 30.3 | 0.57 |
| Glycerol | iMM904 red | linMOMA | 409 | 21 | 35 | 45 | 92.12 | 31.82 | 0.56 |
| Glycerol | iMM904 red | MOMA | 412 | 21 | 32 | 45 | 92.79 | 31.82 | 0.57 |
| Glycerol | iLL672 full | MOMA | 406 | 20 | 38 | 46 | 91.44 | 30.3 | 0.55 |
| Ethanol | iMM904 full | FBA | 593 | 45 | 29 | 55 | 95.34 | 45 | 0.54 |
| Ethanol | iMM904 full | linMOMA | 592 | 45 | 30 | 55 | 95.18 | 45 | 0.54 |
| Ethanol | iMM904 full | MOMA | 592 | 44 | 30 | 56 | 95.18 | 44 | 0.55 |
| Ethanol | iMM904 red | FBA | 408 | 21 | 27 | 54 | 93.79 | 28 | 0.64 |
| Ethanol | iMM904 red | linMOMA | 407 | 21 | 28 | 54 | 93.56 | 28 | 0.63 |
| Ethanol | iMM904 red | MOMA | 407 | 20 | 28 | 55 | 93.56 | 26.67 | 0.64 |
| Ethanol | iLL672 full | MOMA | 401 | 13 | 34 | 62 | 92.18 | 17.33 | 0.68 |
MCC, Matthews correlation coefficient (see Methods). Note that the iLL672 predictions were obtained directly from [3] and thus the viability threshold was not optimized using the maximum MCC approach.
Mo et al. BMC Systems Biology 2009 3:37  http://dx.doi.org/10.1186/1752-0509-3-37

 

The values reported in Table 1 correspond to selecting

  • the optimal viability threshold based on this criterion.

We summarized the overall prediction accuracy of a model and method

  • using the Matthews Correlation Coefficient (MCC) [40].

The MCC ranges from -1 (all predictions incorrect) to +1 (all predictions correct) and

  • is suitable for summarizing overall prediction performance

in our case where there are substantially more viable than lethal gene deletions.

ROC plots were produced in Matlab (Mathworks, Inc.).
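For illustration, a small Python sketch of the viability-threshold sweep and MCC scoring described above (inputs are hypothetical arrays of predicted relative growth rates and experimental viability calls, not the study's data):

```python
import numpy as np

def mcc(tp, fp, fn, tn):
    """Matthews correlation coefficient from a viable/lethal confusion matrix."""
    denom = np.sqrt(float((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

def best_viability_threshold(predicted_growth, experimentally_viable):
    """Sweep the relative-growth cutoff and return the threshold with maximal MCC."""
    best_thr, best_mcc = None, -1.0
    for thr in np.linspace(0.0, 1.0, 101):
        pred_viable = predicted_growth >= thr
        tp = np.sum(pred_viable & experimentally_viable)    # true viable
        fp = np.sum(pred_viable & ~experimentally_viable)   # false viable
        fn = np.sum(~pred_viable & experimentally_viable)   # false lethal
        tn = np.sum(~pred_viable & ~experimentally_viable)  # true lethal
        score = mcc(tp, fp, fn, tn)
        if score > best_mcc:
            best_thr, best_mcc = thr, score
    return best_thr, best_mcc
```

Sweeping the threshold in this way also generates the true-viable/false-viable pairs needed for an ROC-style comparison across models and methods.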

 


Inferring perturbed metabolic regions based on EM profiles

The method implemented in this study is shown schematically in Figures 1 and 2.

Constraining the iMM904 network 

Relative levels of quantitative EM data were incorporated into the constraint-based framework

  • as overflow secretion exchange fluxes to simulate the required low-level production of
  • experimentally observed excreted metabolites.

The primary objective of this study is to associate

  • relative metabolite levels that are generally measured for metabonomic or biofluid analyses
  • to the quantitative ranges of intracellular reaction fluxes required to produce them.

However, without detailed kinetic information or dynamic metabolite measurements available,

  • we approximated EM datasets of relative quantitative metabolite levels
  • to be proportional to the rate at which they are secreted and detected
  • (at a steady state) in the extracellular media.

This approach is analogous to approximating uptake rates based

  • on metabolite concentrations from a previous study performing sampling analysis
  • on a cardiomyocyte mitochondrial network
  • to identify differential flux distribution ranges

for various environmental (i.e. substrate uptake) conditions [19].

The raw data were normalized by the maximum value of the dataset
(thus the maximum secretion flux was 1 mmol/hr/gDW) with

  • an assumed error of 10%
  • to set the lower and upper bounds, thus
  • inherently accounting for sampling calculation sensitivity (see the sketch below).
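A minimal sketch of this normalization step, assuming the raw EM measurements are given as relative abundances (the metabolite names below are placeholders, not the measured species):

```python
def em_to_secretion_bounds(raw_levels, error=0.10):
    """raw_levels: {metabolite: relative abundance}.  Scale so the maximum is
    1 mmol/hr/gDW and apply a +/-10% assumed error as lower/upper flux bounds."""
    max_val = max(raw_levels.values())
    return {met: ((1 - error) * lvl / max_val, (1 + error) * lvl / max_val)
            for met, lvl in raw_levels.items()}

# hypothetical example:
print(em_to_secretion_bounds({"metA": 5.0, "metB": 2.5}))
# {'metA': (0.9, 1.1), 'metB': (0.45, 0.55)} (up to floating-point rounding)
```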

The gdh1/GDH2 strains were flask cultured under minimal glucose media conditions; thus,

  • glucose and oxygen uptake rates were set at 15 and 2 mmol/hr/gDW, respectively,
  • for the gdh1/GDH2 strain study.

In the anaerobic case the oxygen uptake rate was set to zero, and

  • sterols and fatty acids were provided as in silico supplements as described in [35].

For the potassium limitation/ammonium toxicity study

  • the growth rate was set at 0.17 1/h, and
  • the glucose uptake rate was minimized
  • to mimic experimental chemostat cultivation conditions.

These input constraints were constant for each perturbation and comparative wild-type condition

  • such that the calculated solution spaces between the conditions
  • differed based only on variations in the output secretion constraints.

FBA optimization of EM-constrained networks

A modified FBA method with minimization of the 1-norm objective function

  • between two optimal flux distributions was used
  • to determine optimal intracellular fluxes
  • based on the EM-constrained metabolic models.

This method determines two optimal flux distributions simultaneously

  • for two differently constrained models (e.g. wild type vs. mutant) –
  • these flux distributions maximize biomass production in each case and
  • the 1-norm distance between the distributions is as small as possible
  • given the two sets of constraints.

This approach avoids problems with

  • alternative optimal solutions when comparing two FBA-computed flux distributions
  • by assuming minimal rerouting of flux distribution between a perturbed network and its reference network.

Reaction flux changes from the FBA optimization results were determined

  • by computing the relative percentage fold change for each reaction
  • between the mutant and wild-type flux distributions.

Random sampling of the steady-state solution space

We utilized artificial centering hit-and-run (ACHR) Monte Carlo sampling [19,41]

  • to uniformly sample the metabolic flux solution space
  • defined by the constraints described above.

Reactions, and their participating metabolites, found to participate in intracellular loops [42]

  • were discarded from further analysis as these reactions can have arbitrary flux values.
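To illustrate the sampling idea, the sketch below implements a basic hit-and-run sampler over the flux polytope {v : S·v = 0, lb ≤ v ≤ ub}. It is a simplified stand-in, not the artificial centering (ACHR) algorithm used in the study, and it assumes finite bounds and a strictly feasible starting point v0 (for example, a point derived from FBA solutions).

```python
import numpy as np
from scipy.linalg import null_space

def hit_and_run(S, lb, ub, v0, n_samples=1000, rng=None):
    """Basic hit-and-run sampling of {v : S v = 0, lb <= v <= ub} (lb, ub: arrays)."""
    rng = np.random.default_rng(0) if rng is None else rng
    N = null_space(S)                       # directions that keep S v = 0
    v, samples = np.array(v0, float), []
    for _ in range(n_samples):
        d = N @ rng.standard_normal(N.shape[1])
        d /= np.linalg.norm(d)
        with np.errstate(divide="ignore", invalid="ignore"):
            q1 = np.where(d != 0, (lb - v) / d, -np.inf)
            q2 = np.where(d != 0, (ub - v) / d, np.inf)
        t_min = np.max(np.minimum(q1, q2))  # furthest feasible step backwards
        t_max = np.min(np.maximum(q1, q2))  # furthest feasible step forwards
        v = v + rng.uniform(t_min, t_max) * d
        samples.append(v.copy())
    return np.array(samples)
```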

The following sections describe the approaches used for the analysis of the different datasets.

Sampling approach used in the gdh1/GDH2 study

Due to the overall shape of the metabolic flux solution space,

  • most of the sampled flux distributions resided close to the minimally allowed growth rate
    (i.e. biomass production) and
  • corresponded to various futile cycles that utilized substrates but
  • did not produce significant biomass.

In order to study more physiologically relevant portions of the flux space

  • we restricted the sampling to the part of the solution space
  • where the growth rate was at least 50% of the maximum growth rate
  • for the condition as determined by FBA.

This assumes that cellular growth remains an important overall objective for the yeast cells

  • even in batch cultivation conditions, but
  • that the intracellular flux distributions
  • may not correspond to maximum biomass production [43].

To test the sensitivity of the results to the minimum growth rate threshold,

  • separate Monte Carlo samples were created for each minimum threshold
  • ranging from 50% to 100% at 5% increments.

We also tested the sensitivity of the results

  • to the relative magnitude of the extracellular metabolite secretion rates
  • by performing the sampling at three different relative levels

(0 corresponding to no extracellular metabolite secretion, maximum rate of 0.5 mmol/hr/gDW,
and maximum rate of 1.0 mmol/hr/gDW).

For each minimum growth rate threshold and extracellular metabolite secretion rate,

  • the ACHR sampler was run for 5 million steps and
  • a flux distribution was stored every 5000 steps.

The sensitivity analysis results are presented in Figures S3 and S4 [see Additional File 3], and

  • the results indicate that the reaction Z-scores (see below) are not significantly affected by
  1. either the portion of the solution space sampled or
  2. the exact scaling of secretion rates.

The final overall sample used was created by combining the samples for all minimum growth rate thresholds

  • for the highest extracellular metabolite secretion rate (maximum 1 mmol/hr/gDW).

This approach allowed biasing the sampling towards

  • physiologically relevant parts of the solution space
  • without imposing the requirement of strictly maximizing a predetermined objective function.

The samples obtained with no EM data were used as control samples

  • to filter reporter metabolites/subsystems whose scores were significantly high
  • due to only random differences between sampling runs.

Sampling approach used in the potassium limitation/ammonium toxicity study

Since the experimental data used in this study was generated in chemostat conditions, and

  • previous studies have indicated that chemostat flux patterns predicted by FBA are
  • close to the experimentally measured ones [43],
  • we assumed that sampling of the optimal solution space was appropriate for this study.

In order to sample a physiologically reasonable range of flux distributions,

  • samples for four different oxygen uptake rates
    (1, 2, 3, and 4 mmol/hr/gDW with 5 million steps each)
  • were combined in the final analysis.

Standardized scoring of flux differences between perturbation and control conditions

A Z-score based approach was implemented to quantify differences in flux samples between two conditions (Figure 2).
First, two flux vectors were chosen randomly,

  • one from each of the two samples to be compared and
  • the difference between the flux vectors was computed.

This approach was repeated to create a sample of 10,000 (n) flux difference vectors

  • for each pair of conditions considered (e.g. mutant or perturbed environment vs. wild type).

Based on this flux difference sample, the sample mean (μdiff,i) and standard deviation (σdiff,i)

  • between the two conditions was calculated for each reaction i. The reaction Z-score was calculated as:

 

Z_rxn,i = μ_diff,i / (σ_diff,i / √n)

which describes the sampled mean difference deviation

  • from a population mean change of zero (i.e. no flux difference
    between perturbation and wild type).
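A short sketch of this calculation, assuming `samples_a` and `samples_b` are arrays of sampled flux distributions (points × reactions) from the two conditions and using the reaction Z-score formula as reconstructed above:

```python
import numpy as np

def reaction_z_scores(samples_a, samples_b, n=10000, rng=None):
    """samples_a/samples_b: (points x reactions) flux samples for two conditions."""
    rng = np.random.default_rng(0) if rng is None else rng
    ia = rng.integers(0, samples_a.shape[0], size=n)
    ib = rng.integers(0, samples_b.shape[0], size=n)
    diffs = samples_a[ia] - samples_b[ib]      # sample of flux-difference vectors
    mu = diffs.mean(axis=0)                    # mu_diff,i
    sigma = diffs.std(axis=0, ddof=1)          # sigma_diff,i
    # reactions with zero variance give inf/NaN scores and can be filtered out
    return mu / (sigma / np.sqrt(n))           # Z_rxn,i
```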

Note that this approach allows accounting for uncertainty in the

  • flux distributions inferred based on the extracellular metabolite secretion constraints.

This is in contrast to approaches such as FBA or MoMA that would predict

  • a single flux distribution for each condition and thus potentially
  • overestimate differences between conditions.

The reaction Z-scores can then be further used in analysis

  • to identify significantly perturbed regions of the metabolic network
  • based on reporter metabolite [44] or subsystem [30] Z-scores.

These reporter regions indicate, or “report”, dominant perturbation features

  • at the metabolite and pathway levels for a particular condition.

The reporter metabolite Z-score for any metabolite j can be derived from the reaction Z-scores

  • of the reactions consuming or producing j (the set of reactions denoted as R_j) as:

Z_met,j = (m_met,j − μ_met,Nj) / σ_met,Nj

where N_j is the number of reactions in R_j and m_met,j is calculated as

m_met,j = (1/√N_j) · Σ_{i ∈ R_j} Z_rxn,i

To account and correct for the background distribution, the metabolite Z-score was normalized

  • by computing μ_met,Nj and σ_met,Nj, corresponding to the mean of m_met and
  • its standard deviation for 1,000 randomly generated reaction sets of size N_j.

Z-scores for subsystems were calculated similarly by considering the set of reactions R_k

  • that belong to each subsystem k.

Hence, positive metabolite and subsystem scores indicate a significantly perturbed metabolic region

  • relative to other regions, whereas
  • a negative score indicates regions that are not perturbed
  • more significantly than what is expected by random chance.
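A sketch of the reporter scoring with background correction, assuming a vector of reaction Z-scores and a mapping from each metabolite to the indices of its neighboring reactions (both hypothetical inputs, not the iMM904 data structures):

```python
import numpy as np

def reporter_scores(reaction_z, met_to_rxns, n_random=1000, rng=None):
    """reaction_z: vector of reaction Z-scores; met_to_rxns: {metabolite: [reaction indices]}."""
    rng = np.random.default_rng(0) if rng is None else rng
    scores = {}
    for met, rxns in met_to_rxns.items():
        Nj = len(rxns)
        m_met = reaction_z[list(rxns)].sum() / np.sqrt(Nj)
        # background: the same statistic for random reaction sets of size Nj
        rand_m = np.array([reaction_z[rng.choice(len(reaction_z), Nj, replace=False)].sum()
                           / np.sqrt(Nj) for _ in range(n_random)])
        scores[met] = (m_met - rand_m.mean()) / rand_m.std(ddof=1)
    return scores
```

Reporter subsystem scores follow the same pattern, with the reaction set taken from each subsystem instead of each metabolite's neighborhood.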

Perturbation subnetworks of reactions and connecting metabolites were visualized using Cytoscape [45].

Results and discussion

  1. Reconstruction and validation of the iMM904 network: iMM904 network content

A previously reconstructed S. cerevisiae network, iND750,

  • was used as the basis for the construction of the expanded iMM904 network.
  • Prior to its presentation here, the
    iMM904 network content was the basis for a consensus jamboree network that was recently published
  • but has not yet been adapted for FBA calculations [46].

The majority of iND750 content was carried over and

  • further expanded on to construct iMM904, which accounts for
  1. 904 genes,
  2. 1,228 individual metabolites, and
  3. 1,412 reactions of which
  • 395 are transport reactions.

Both the number of gene-associated reactions and the number of metabolites

  • increased in iMM904 compared with the iND750 network.

Additional genes and reactions included in the network primarily expanded the

  • lipid,
  • transport, and
  • carbohydrate subsystems.

The lipid subsystem includes

  • new genes and
  • reactions involving the degradation of sphingolipids and glycerolipids.

Sterol metabolism was also expanded to include

  • the formation and degradation of steryl esters, the storage form of sterols.

The majority of the new transport reactions were added

  • to connect network gaps between intracellular compartments
  • to enable the completion of known physiological functions.

We also added a number of new secretion pathways

  • based on experimentally observed secreted metabolites [31].

A number of gene-protein-reaction (GPR) relationships were modified

  • to include additional gene products that are required to catalyze a reaction.

For example, the protein compounds

  • thioredoxin and
  • ferricytochrome C

were explicitly represented as compounds in iND750 reactions, but

  • the genes encoding these proteins were not associated with their corresponding GPRs.

Other examples include glycogenin and NADPH cytochrome p450 reductases (CPRs),

  1. which are required in the assembly of glycogen and
  2. to sustain catalytic activity in cytochromes p450, respectively.

These additional proteins were included in iMM904 as

  • part of protein complexes to provide a more complete
  • representation of the genes and
  • their corresponding products necessary for a catalytic activity to occur.

Major modifications to existing reactions were in cofactor biosynthesis, namely in

  • quinone,
  • beta-alanine, and
  • riboflavin biosynthetic pathways.

Reactions from previous S. cerevisiae networks associated with

  • quinone,
  • beta-alanine, and
  • riboflavin biosynthetic pathways

were essentially inferred from known reaction mechanisms based on

  • reactions in previous network reconstructions of E. coli [2,47].

These pathways were manually reviewed

  • based on current literature and subsequently replaced by
  • reactions and metabolites specific to yeast.

Additional changes in other subsystems were also made, such as

  1. changes to the compartmental location of a gene and
  2. its corresponding reaction(s),
  3. changes in reaction reversibility and cofactor specificity, and
  4. the elucidation of particular transport mechanisms.

A comprehensive listing of iMM904 network contents as well as

  • a detailed list of changes between iND750 and iMM904 is included
    [see Additional file 1].

Predicting deletion growth phenotypes

The updated genome-scale iMM904 metabolic network was validated

  • by comparing in silico single-gene deletion predictions to
  • in vivo results from a previous study used
  • to analyze another S. cerevisiae metabolic model, iLL672 [3].

This network was constructed based on the iFF708 network [22],

  • which was also the starting point for
  • reconstructing the iND750 network [2].

The experimental data used to validate the iLL672 model consisted of

3,360 single-gene knockout strain phenotypes evaluated

  • under minimal media growth conditions with
  1. glucose,
  2. galactose,
  3. glycerol, and
  4. ethanol

as sole carbon sources. Growth phenotypes for the iMM904 network were predicted using

  1. FBA [32-34],
  2. MoMA [35], and
  3. linear MoMA methods

as described in Methods and subsequently compared to the experimental data (Table 1).

Each deleted gene growth prediction comparison was classified as

  1. true lethal,
  2. true viable,
  3. false lethal, or
  4. false viable.

The growth rate threshold for considering a prediction viable was chosen

  • for each condition and method separately
  • to optimize the tradeoff between true viable and false viable predictions
    (maximum Matthews correlation coefficient, see Methods).

Since iMM904 has 212 more genes than iLL672 with experimental data, we also present results

  • for the subset of iMM904 predictions with genes included in iLL672 (reduced iMM904 set).

When the same gene sets are compared, iMM904 improves gene lethality predictions under

  • glucose,
  • galactose, and
  • glycerol conditions

over iLL672 somewhat, but is less accurate

  • at predicting growth phenotypes under the ethanol condition.

It should be noted that the iLL672 predictions were obtained directly from [3]

  • thus the growth rate threshold was not optimized similarly to iMM904 predictions.

Overall, when the viability cutoff is chosen

  • as indicated above for each method separately,
  • the three prediction methods perform similarly
  1. (FBA,
  2. MOMA, and
  3. linear MOMA).

While the full gene complement in iMM904 greatly increased

  • the number of true viable predictions,
  • the full model also made significantly more false viable predictions
  • compared with reduced iMM904 and iLL672 predictions.

However, it is important to note that 143 reactions involved in dead-end biosynthetic pathways were actually

  • removed from iFF708 to build the iLL672 reconstruction [3].

These dead-ends are considered “knowledge gaps” in pathways

  • that have not been fully characterized and, as a result,
  • lead to false viable predictions when determining gene essentiality
  • if the pathway is in fact required for growth under a certain condition [2,26].

As more of these pathways are elucidated and

  • included in the model to
  • fill in existing network gaps,
  • we can expect false viable prediction rates to consequently decrease.

Thus, while a larger network has a temporarily reduced capacity to accurately predict gene deletion phenotypes,

  • it captures a more complete picture of currently known metabolic functions and
  • provides a framework for network expansion as new pathways are elucidated [48].

 

Inferring intracellular perturbation states from metabolic profiles – Aerobic and anaerobic gdh1/GDH2 mutant behavior

The gdh1/GDH2 mutant strain was previously developed [49,50]

  • to lower NADPH consumption in ammonia assimilation, which would
  • favor the NADPH-dependent fermentation of xylose.

In this strain, the NADPH-dependent glutamate dehydrogenase, Gdh1, was

  • deleted and the NADH-dependent form of the enzyme, Gdh2,
  •                     was overexpressed.

The net effect is to allow efficient assimilation of ammonia

  • into glutamate using NADH instead of NADPH as a cofactor.

While growth characteristics remained unaffected,

  • relative quantities of secreted metabolites differed between the wild-type and mutant strain
  • under aerobic and anaerobic conditions.

We analyzed EM data for the gdh1/GDH2 and wild-type strains reported

  • in [31] under aerobic and anaerobic conditions separately using
  • both FBA optimization and
  • sampling-based approaches as described in Methods.

43 measured extracellular and intracellular metabolites from the original dataset [31],

  • primarily of central carbon and amino acid metabolism,
  • were explicitly represented in the iMM904 network [see Additional file 4].

Extracellular metabolite levels were used

  • to formulate secretion constraints and
  • differential intracellular metabolites were used
  • to compare and validate the intracellular flux predictions.

Perturbed reactions from the FBA results were

  • determined by calculating relative flux changes, and
  • reaction Z-scores were calculated from the sampling analysis
  • to quantify flux changes between the mutant and wild-type strains,
  • with Z reaction > 1.96 corresponding to a two-tailed p-value < 0.05 and
  • considered to be significantly perturbed [see Additional file 4].
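As a quick check of the quoted cutoff, the two-tailed p-value for |Z| > 1.96 under a standard normal distribution:

```python
from scipy.stats import norm
print(2 * (1 - norm.cdf(1.96)))   # ~0.05, the two-tailed p-value for |Z| > 1.96
```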

Additional file 4. Gdh mutant aerobic and anaerobic analysis results. 

The data provided are the full results for the exometabolomic analysis of the aerobic and anaerobic gdh1/GDH2 mutant.

Format: XLS Size: 669KB Download file

This file can be viewed with: Microsoft Excel Viewer

To validate the predicted results, reaction flux changes from both FBA and sampling methods were compared to differential intracellular metabolite level data measured from the same study. Intracellular metabolites involved in highly perturbed reactions (i.e. reactants and products) predicted from FBA and sampling analyses were identified and
compared to metabolites that were experimentally identified as significantly changed (p < 0.05) between mutant and wild-type. Statistical measures of recall, accuracy, and
precision were calculated and represent the predictive sensitivity, exactness, and reproducibility respectively. From the sampling analysis, a considerably larger number of
significantly perturbed reactions are predicted in the anaerobic case (505 reactions, or 70.7% of active reactions) than in aerobic (394 reactions, or 49.8% of active reactions). The top percentile of FBA flux changes equivalent to the percentage of significantly perturbed sampling reactions were compared to the intracellular data. Results from both analyses are summarized in Table 2. Sampling predictions were considerably higher in recall than FBA predictions for both conditions, with respective ranges of 0.83–1
compared to 0.48–0.96. Accuracy was also higher in sampling predictions; however, precision was slightly better in the FBA predictions as expected due to the smaller
number of predicted changes. Overall, the sampling predictions of perturbed intracellular metabolites are strongly consistent with the experimental data and significantly
outperform the FBA optimization predictions in accurately identifying differential metabolites involved in perturbed intracellular fluxes.

Table 2. Statistical comparison of the differential intracellular metabolite data set (p < 0.05) with metabolites involved in perturbed reactions predicted by FBA optimization and sampling analyses for the aerobic and anaerobic gdh1/GDH2 mutant.

 

| | Aerobic FBA | Aerobic Sampling | Anaerobic FBA | Anaerobic Sampling | Overall FBA | Overall Sampling |
|---|---|---|---|---|---|---|
| Recall | 0.48 | 0.83 | 0.96 | 1 | 0.71 | 0.91 |
| Accuracy | 0.55 | 0.62 | 0.64 | 0.64 | 0.6 | 0.63 |
| Precision | 0.78 | 0.69 | 0.64 | 0.63 | 0.68 | 0.66 |

Overall statistics indicate combined results of both conditions.
Mo et al. BMC Systems Biology 2009 3:37  http://dx.doi.org/10.1186/1752-0509-3-37


Figure 3.
 Perturbation reaction subnetwork of the gdh1/GDH2 mutant under aerobic conditions.

Perturbation subnetworks can be drawn to visualize predicted significantly perturbed intracellular reactions and illustrate their connection to the observed secreted metabolites in the aerobic and anaerobic gdh1/GDH2 mutants.

Figure 3 shows an example of a simplified aerobic perturbation subnetwork consisting primarily of proximal pathways connected directly to a subset of major secreted
metabolites

  • glutamate,
  • proline,
  • D-lactate, and
  • 2-hydroxybutyrate.

Figure 4 displays anaerobic reactions with Z-scores of similar magnitude to the perturbed reactions in Figure 3. The same subset of metabolites is also present in the
larger anaerobic perturbation network and indicates that the NADPH/NADH balance perturbation induced by the gdh1/GDH2 manipulation has widespread effects
beyond just altering glutamate metabolism anaerobically.

Interestingly, it is clear that the majority of the secreted metabolite pathways involve connected perturbed reactions that broadly converge on glutamate.

Note that Figures 3 and 4 only show the subnetworks that consisted of two or more connected reactions; for a number of secreted metabolites, no contiguous perturbed pathway could be identified by the sampling approach. This indicates that the secreted metabolite pattern alone is not sufficient to determine which specific
production and secretion pathways are used by the cell for these metabolites.

The network in Figure 3 illustrates a simplified subset of highly perturbed reactions connected to aerobically secreted metabolites predicted from the sampling analysis of the gdh1/GDH2 mutant strain.
The major secreted metabolites

  • glutamate,
  • proline,
  • D-lactate, and
  • 2-hydroxybutyrate

were also detected in the anaerobic condition. Metabolite abbreviations are found in Additional file 1.

Figure 4.

Perturbation reaction subnetwork of gdh1/GDH2 mutant under anaerobic conditions.


The subnetwork illustrates the highly perturbed anaerobic reactions of Z-reaction magnitude similar to the reactions in Figure 3.

The significantly larger number of reactions indicates that the mutant's metabolic effects are more widespread in the anaerobic environment.
The network shows that perturbed pathways converge on glutamate, the main site at which the gdh1/GDH2 modification was introduced, which
suggests that the direct genetic perturbation effects are amplified under this environment. Metabolite abbreviations are found in Additional file 1.

To further highlight metabolic regions that have been systemically affected by the gdh1/GDH2 modification, reporter metabolite and subsystem methods [30] were used to
summarize reaction scores around specific metabolites and in specific metabolic subsystems. The top ten significant scores for metabolites/subsystems associated with more
than three reactions are summarized in Tables 3 (aerobic) and 4 (anaerobic), with Z > 1.64 corresponding to p < 0.05 for a one-tailed distribution. Full data for all reactions,
reporter metabolites, and reporter subsystems are included [see Additional file 4].

Table 3. List of the top ten significant reporter metabolite and subsystem scores for the gdh1/GDH2 vs. wild type comparison in aerobic conditions.

| Reporter metabolite | Z-score | No. of reactions* |
|---|---|---|
| L-proline [c] | 2.71 | 4 |
| Carbon dioxide [m] | 2.51 | 15 |
| Proton [m] | 2.19 | 51 |
| Glyceraldehyde 3-phosphate [c] | 1.93 | 7 |
| Ubiquinone-6 [m] | 1.82 | 5 |
| Ubiquinol-6 [m] | 1.82 | 5 |
| Ribulose-5-phosphate [c] | 1.8 | 4 |
| Uracil [c] | 1.74 | 4 |
| L-homoserine [c] | 1.72 | 4 |
| Alpha-ketoglutarate [m] | 1.71 | 8 |

| Reporter subsystem | Z-score | No. of reactions |
|---|---|---|
| Citric Acid Cycle | 4.58 | 7 |
| Pentose Phosphate Pathway | 3.29 | 12 |
| Glycine and Serine Metabolism | 2.69 | 17 |
| Alanine and Aspartate Metabolism | 2.65 | 6 |
| Oxidative Phosphorylation | 1.79 | 8 |
| Thiamine Metabolism | 1.54 | 8 |
| Arginine and Proline Metabolism | 1.44 | 20 |
| Other Amino Acid Metabolism | 1.28 | 5 |
| Glycolysis/Gluconeogenesis | 0.58 | 14 |
| Anaplerotic reactions | 0.19 | 9 |

*Number of reactions categorized in a subsystem or found to be neighboring each metabolite
Mo et al. BMC Systems Biology 2009 3:37  http://dx.doi.org/10.1186/1752-0509-3-37

Table 4. List of top ten significant reporter metabolite and subsystem scores for the gdh1/GDH2 vs. wild type comparison in anaerobic conditions.

 

| Reporter metabolite | Z-score | No. of reactions |
|---|---|---|
| Glutamate [c] | 4.52 | 35 |
| Aspartate [c] | 3.21 | 11 |
| Alpha-ketoglutarate [c] | 2.66 | 17 |
| Glycine [c] | 2.65 | 7 |
| Pyruvate [m] | 2.56 | 7 |
| Ribulose-5-phosphate [c] | 2.43 | 4 |
| Threonine [c] | 2.28 | 6 |
| 10-formyltetrahydrofolate [c] | 2.27 | 5 |
| Fumarate [c] | 2.27 | 5 |
| L-proline [c] | 2.04 | 4 |

| Reporter subsystem | Z-score | No. of reactions |
|---|---|---|
| Valine, Leucine, and Isoleucine Metabolism | 3.97 | 15 |
| Tyrosine, Tryptophan, and Phenylalanine Metabolism | 3.39 | 23 |
| Pentose Phosphate Pathway | 3.29 | 11 |
| Purine and Pyrimidine Biosynthesis | 3.08 | 40 |
| Arginine and Proline Metabolism | 2.96 | 19 |
| Threonine and Lysine Metabolism | 2.74 | 14 |
| NAD Biosynthesis | 2.66 | 7 |
| Alanine and Aspartate Metabolism | 2.65 | 6 |
| Histidine Metabolism | 2.24 | 10 |
| Cysteine Metabolism | 1.85 | 10 |

Mo et al. BMC Systems Biology 2009 3:37  http://dx.doi.org/10.1186/1752-0509-3-37

Perturbations under aerobic conditions largely consisted of pathways involved in mediating the NADH and NADPH balance. Among the highest scoring aerobic subsystems
are TCA cycle and pentose phosphate pathway – key pathways directly involved in the generation of NADH and NADPH. Reporter metabolites involved in these
subsystems –

  • glyceraldehyde-3-phosphate,
  • ribulose-5-phosphate, and
  • alpha-ketoglutarate – were also identified.

These results are consistent with flux and enzyme activity measurements

  • of the gdh1/GDH2 strain under aerobic conditions [32],
  1. which reported significant reduction in the pentose phosphate pathway flux
  2. with concomitant changes in other central metabolic pathways.

Levels of several TCA cycle intermediates (e.g. fumarate, succinate, malate) were also elevated

  • in the gdh1/GDH2 mutant according to the differential intracellular metabolite data.

Altered energy metabolism, as indicated by

  • reporter metabolites (i.e. ubiquinone-6, ubiquinol-6, mitochondrial proton)
  • and subsystem (oxidative phosphorylation),

is certainly feasible as NADH is a primary reducing agent for ATP production.

The pentose phosphate pathway and NAD biosynthesis also appear

  • among the most perturbed anaerobic subsystems, further suggesting
  • perturbed cofactor balance as a common, dominant effect under both conditions.

Glutamate dehydrogenase is a critical enzyme of amino acid biosynthesis as it acts as

  • the entry point for ammonium assimilation via glutamate.

Consequently, metabolic subsystems involved in amino acid biosynthesis were broadly perturbed

  • as a result of the gdh1/GDH2 modification in both aerobic and anaerobic conditions.

For example, the proline biosynthesis pathway that uses glutamate as a precursor

  • was significantly perturbed in both conditions,
  • with significantly changed intracellular and extracellular levels.

There were differences, however, in that more amino acid related subsystems were

  • significantly affected in the anaerobic case (Table 4),
  • further highlighting that altered ammonium assimilation in the mutant
  • has a more widespread effect under anaerobic conditions.

This effect is especially pronounced for

  • threonine and nucleotide metabolism,
  • which were predicted to be significantly perturbed only in anaerobic conditions.

Intracellular threonine levels were amongst the most significantly reduced

  • relative to other differential intracellular metabolites in the anaerobically grown gdh1/GDH2 strain
    (see [31] and Additional file 4), and
  • the relationship between threonine and nucleotide biosynthesis is further supported

by threonine’s recently discovered role as a key precursor in yeast nucleotide biosynthesis [51].

Other key anaerobic reporter metabolites are

  • glycine and 10-formyltetrahydrofolate,
  • both of which are involved in the cytosolic folate cycle (one-carbon metabolism).

Folate is intimately linked to biosynthetic pathways of

  • glycine (with threonine as its precursor) and purines
  • by mediating one-carbon reaction transfers necessary in their metabolism and
  • is a key cofactor in cellular growth [52].

Thus, the anaerobic perturbations identified in the analysis emphasize the close relationship

  • between threonine, folate, and nucleotide metabolic pathways as well as
  • their potential connection to perturbed ammonium assimilation processes.

Interestingly, this association has been previously demonstrated at the transcriptional level

  • as yeast ammonium assimilation (via glutamine synthesis) was found to be
  • co-regulated with genes involved in glycine, folate, and purine synthesis [53].

In summary, the overall differences in predicted gdh1/GDH2 mutant behavior

  • under aerobic and anaerobic conditions show that changes in flux states
  • directly related to modified ammonium assimilation pathway
  1. are amplified anaerobically whereas the
  2. indirect effects through NADH/NADPH balance are more significant aerobically.

Perturbed metabolic regions under aerobic conditions were predominantly

  • in central metabolic pathways involved in responding to the changed NADH/NADPH demand
  • and did not necessarily emphasize that glutamate dehydrogenase was the site of the genetic modification.

The majority of affected anaerobic pathways were involved directly

  • in modified ammonium assimilation as evidenced by

1) significantly perturbed amino acid subsystems,

2) a broad perturbation subnetwork converging on glutamate (Figure 4), and

3) glutamate as the most significant reporter metabolite (Table 4).

Potassium-limited and excess ammonium environments

A recent study reported that potassium limitation resulted in a significant

  • growth retardation effect in yeast due to excess ammonium uptake
  • when ammonium was provided as the sole nitrogen source [33].

The proposed mechanism for this effect was that ammonium

  • could be freely transported through potassium channels
  • when potassium concentrations were low in the media environment, thereby
  • resulting in excess ammonium uptake [33].

As a result, yeast incurred a significant metabolic cost

  • in assimilating ammonia to glutamate and
  • secreting significant amounts of glutamate and other amino acids
  • in potassium-limited conditions as a means to detoxify the excess ammonium.

A similar effect was observed when yeast was grown

  • with no potassium limitation,
  • but with excess ammonia in the environment.

While the observed effect of both environments (low potassium or excess ammonia) was similar,

  • quantitatively unique amino acid secretion profiles suggested that
  • internal metabolic states in these conditions are potentially different.

In order to elucidate the differences in internal metabolic states, we utilized

  • the iMM904 model and the EM profile analysis method to analyze amino acid secretion profiles
  • for a range of low potassium and high ammonia conditions reported in [33].

As before, we utilized amino acid secretion patterns as constraints to the iMM904 model,

  1. sampled the allowable solution space,
  2. computed reaction Z-scores for changes from a reference condition (normal potassium and ammonia), and
  3. finally summarized the resulting changes using reporter metabolites.

Figure 5 shows a clustering, across the four conditions studied, of the most significant reporter metabolites

  • (Z ≥ 1.96 in any of the conditions) obtained from this analysis.
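A clustergram along the lines of Figure 5 can be generated from a matrix of reporter metabolite Z-scores (rows: metabolites, columns: conditions) with hierarchical clustering; the sketch below uses random placeholder data rather than the study's scores.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram

rng = np.random.default_rng(0)
# placeholder reporter Z-score matrix (rows: metabolites, columns: 4 conditions);
# in practice one would first keep rows with Z >= 1.96 in at least one condition
z_matrix = rng.normal(size=(20, 4))

tree = linkage(z_matrix, method="average", metric="euclidean")
row_order = dendrogram(tree, no_plot=True)["leaves"]   # row ordering for the clustergram
print(z_matrix[row_order])
```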

Interestingly, the potassium-limited environment perturbed only a subset of

  • the significant reporter metabolites identified in the high ammonia environments.

Both low potassium environments shared a consistent pattern of

  • highly perturbed amino acids and related precursor biosynthesis metabolites
    (e.g. pyruvate, PRPP, alpha-ketoglutarate)
  • with high ammonium environments.

The amino acid perturbation pattern (indicated by red labels in Figure 5) was present in

  • the ammonium-toxic environments, although the pattern was
  • slightly weaker for the lower ammonium concentration.

Nevertheless, the results clearly indicate that a similar

  • ammonium detoxifying mechanism that primarily perturbs pathways
  • directly related to amino acid metabolism
  • exists under both types of media conditions.

Figure 5.

Clustergram of top reporter metabolites (i.e. in yellow) in ammonium-toxic and potassium-limited conditions.

Amino acid perturbation patterns (shown in red) were shown to be consistently scored across conditions, indicating that potassium-limited environments K1 (lowest
concentration) and K2 (low concentration) elicited a similar ammonium detoxification response as ammonium-toxic environments N1 (high concentration) and N2
(highest concentration). Metabolites associated with folate metabolism (highlighted in green) are also highly perturbed in ammonium-toxic conditions. Metabolite
abbreviations are found in Additional file 1.

In addition to perturbed amino acids, a secondary effect notably appears at high ammonia levels in which metabolic regions related to folate metabolism are significantly affected. As highlighted in green in Figure 5, we predicted significantly perturbed key metabolites involved in the cytosolic folate cycle. These include tetrahydrofolate derivatives and other metabolites connected to the folate pathway, namely glycine and the methionine-derived methylation cofactors S-adenosylmethionine and S-adenosyl-homocysteine. Additionally, threonine was identified to be a key perturbed metabolite in excess ammonium conditions. These results further illustrate the close
connection between threonine biosynthesis, folate metabolism involving glycine derived from its threonine precursor, and nucleotide biosynthesis [51] that was discussed in
conjunction with the gdh1/GDH2 strain data. Taken together with the anaerobic gdh1/GDH2 data, the results consistently suggest highly perturbed threonine and folate
metabolism when amino acid-related pathways are broadly affected.

In both ammonium-toxic and potassium-limited environments, impaired cellular growth was observed, which can be attributed to high energetic costs of increased
ammonium assimilation to synthesize and excrete amino acids. However, under high ammonium environments, reporter metabolites related to threonine and folate
metabolism indicated that their perturbation, and thus purine supply, may be an additional factor in decreasing cellular viability as there is a direct relationship between
intracellular folate levels and growth rate [54]. Based on these results, we concluded that while potassium-limited growth in yeast indeed shares physiological features with
growth in ammonium excess, its effects are not as detrimental as actual ammonium excess. The effects on proximal amino acid metabolic pathways are similar in both
environments as indicated by the secretion of the majority of amino acids. However, when our method was applied to analyze the physiological basis behind differences in
secretion profiles between low potassium and high ammonium conditions, ammonium excess was predicted to likely disrupt physiological ammonium assimilation processes,
which in turn potentially impacts folate metabolism and associated cellular growth.

Conclusion

This study presents an approach to connecting intracellular flux states to metabolites that are excreted under various physiological conditions. We
showed that well-curated genome-scale metabolic networks can be used to integrate and analyze quantitative EM data by systematically identifying altered intracellular
pathways related to measured changes in the extracellular metabolome. We were able to identify statistically significant metabolic regions that were altered as a result of
genetic (gdh1/GDH2 mutant) and environmental (excess ammonium and limited potassium) perturbations, and the predicted intracellular metabolic changes were consistent
with previously published experimental data including measurements of intracellular metabolite levels and metabolic fluxes. Our reanalysis of previously published EM data
on ammonium assimilation-related genetic and environmental perturbations also resulted in testable hypotheses about the role of threonine and folate pathways in mediating
broad responses to changes in ammonium utilization. These studies also demonstrated that the sampling-based method can be readily applied when only partial secreted
metabolite profiles (e.g. only amino acids) are available.

With the emergence of metabolite biofluid biomarkers as a diagnostic tool in human disease [55,56] and the availability of genome-scale human metabolic networks [1],
extensions of the present method would allow identifying potential pathway changes linked to these biomarkers. Employing such a method for studying yeast metabolism was possible because the metabolomic data were measured under controllable environmental conditions where the inputs and outputs of the system were defined. Metabolite biomarkers measured in a clinical setting, however, come from an environment that is far from controlled, with significant variations in genetic, nutritional, and environmental factors between different
patients. While there are certainly limitations for clinical applications, the method introduced here is a progressive step towards applying genome-scale metabolic networks
towards analyzing biofluid metabolome data as it 1) avoids the need to only study optimal metabolic states based on a predetermined objective function, 2) allows dealing with noisy experimental data through the sampling approach, and 3) enables analysis even with limited identification of metabolites in the data. The ability to establish potential
connections between extracellular markers and intracellular pathways would be valuable in delineating the genetic and environmental factors associated with a particular
disease.

Authors’ contributions

Conceived and designed the experiments: MLM MJH BOP. Performed experiments: MLM MJH. Analyzed the data: MLM MJH. Wrote the paper: MLM MJH BOP. All authors have read and approved the final manuscript.

Acknowledgements

We thank Jens Nielsen for providing the raw metabolome data for the mutant strain, and Jan Schellenberger and Ines Thiele for valuable discussions. This work was supported by NIH grant R01 GM071808. BOP serves on the scientific advisory board of Genomatica Inc.

 

References

  1. Duarte NC, Becker SA, Jamshidi N, Thiele I, Mo ML, Vo TD, Srivas R, Palsson BO: Global reconstruction of the human metabolic network based on genomic and bibliomic data. Proc Natl Acad Sci USA 2007, 104(6):1777-1782.
  2. Duarte NC, Herrgard MJ, Palsson B: Reconstruction and Validation of Saccharomyces cerevisiae iND750, a Fully Compartmentalized Genome-Scale Metabolic Model. Genome Res 2004, 14(7):1298-1309.
  3. Kuepfer L, Sauer U, Blank LM: Metabolic functions of duplicate genes in Saccharomyces cerevisiae. Genome Res 2005, 15(10):1421-1430.
  4. Nookaew I, Jewett MC, Meechai A, Thammarongtham C, Laoteng K, Cheevadhanarak S, Nielsen J, Bhumiratana S: The genome-scale metabolic model iIN800 of Saccharomyces cerevisiae and its validation: a scaffold to query lipid metabolism. BMC Syst Biol 2008, 2:71.
  5. Edwards JS, Palsson BO: Systems properties of the Haemophilus influenzae Rd metabolic genotype. J Biol Chem 1999, 274(25):17410-17416.
  6. Edwards JS, Palsson BO: The Escherichia coli MG1655 in silico metabolic genotype: Its definition, characteristics, and capabilities. Proc Natl Acad Sci USA 2000, 97(10):5528-5533.
  7. Thiele I, Vo TD, Price ND, Palsson B: An Expanded Metabolic Reconstruction of Helicobacter pylori (IT341 GSM/GPR): An in silico genome-scale characterization of single and double deletion mutants. J Bacteriol 2005, 187(16):5818-5830.
  8. Vo TD, Greenberg HJ, Palsson BO: Reconstruction and functional characterization of the human mitochondrial metabolic network based on proteomic and biochemical data. J Biol Chem 2004, 279(38):39532-39540.
  9. Reed JL, Vo TD, Schilling CH, Palsson BO: An expanded genome-scale model of Escherichia coli K-12 (iJR904 GSM/GPR). Genome Biology 2003, 4(9):R54.51-R54.12.
  10. Feist AM, Henry CS, Reed JL, Krummenacker M, Joyce AR, Karp PD, V H, Palsson BO: A genome-scale metabolic reconstruction for Escherichia coli K-12 MG1655 that accounts for 1261 ORFs and thermodynamic information. Molecular Systems Biology 2007, 3:121.
  11. Suthers PF, Dasika MS, Kumar VS, Denisov G, Glass JI, Maranas CD: A genome-scale metabolic reconstruction of Mycoplasma genitalium, iPS189. PLoS Comp Biol 2009, 5(2):e1000285.
  12. Price ND, Reed JL, Palsson BO: Genome-scale models of microbial cells: evaluating the consequences of constraints. Nat Rev Microbiol 2004, 2(11):886-897.
  13. Becker SA, Feist AM, Mo ML, Hannum G, Palsson BO, Herrgard MJ: Quantitative Prediction of Cellular Metabolism with Constraint-based Models: The COBRA Toolbox. Nature Protocols 2007, 2(3):727-738.
  14. Reed JL, Palsson BO: Genome-Scale In Silico Models of E. coli Have Multiple Equivalent Phenotypic States: Assessment of Correlated Reaction Subsets That Comprise Network States. Genome Res 2004, 14(9):1797-1805.
  15. Fong SS, Palsson BO: Metabolic gene deletion strains of Escherichia coli evolve to computationally predicted growth phenotypes. Nature Genetics 2004, 36(10):1056-1058.
  16. Ibarra RU, Edwards JS, Palsson BO: Escherichia coli K-12 undergoes adaptive evolution to achieve in silico predicted optimal growth. Nature 2002, 420(6912):186-189.
  17. Schellenberger J, Palsson BØ: Use of randomized sampling for analysis of metabolic networks. J Biol Chem 2009, 284(9):5457-5461.
  18. Almaas E, Kovács B, Vicsek T, Oltvai ZN, Barabási AL: Global organization of metabolic fluxes in the bacterium Escherichia coli. Nature 2004, 427(6977):839-843.
  19. Thiele I, Price ND, Vo TD, Palsson BO: Candidate metabolic network states in human mitochondria: Impact of diabetes, ischemia, and diet. J Biol Chem 2005, 280(12):11683-11695.
  20. Kell DB: Metabolomics and systems biology: making sense of the soup. Curr Opin Microbiol 2004, 7(3):296-307.
  21. Kell DB, Brown M, Davey HM, Dunn WB, Spasic I, Oliver SG: Metabolic footprinting and systems biology: the medium is the message. Nat Rev Microbiol 2005, 3(7):557-565.
  22. Goodacre R, Vaidyanathan S, Dunn WB, Harrigan GG, Kell DB: Metabolomics by numbers: acquiring and understanding global metabolite data. Trends Biotechnol 2004, 22(5):245-252.
  23. Lenz EM, Bright J, Wilson ID, Morgan SR, Nash AF: A 1H NMR-based metabonomic study of urine and plasma samples obtained from healthy human subjects. J Pharm Biomed Anal 2003, 33(5):1103-1115.
  24. Allen J, Davey HM, Broadhurst D, Heald JK, Rowland JJ, Oliver SG, Kell DB: High-throughput classification of yeast mutants for functional genomics using metabolic footprinting. Nat Biotech 2003, 21(6):692-696.
  25. Nicholson JK, Connelly J, Lindon JC, Holmes E: Metabonomics: a platform for studying drug toxicity and gene function. Nat Rev Drug Discov 2002, 1(2):153-161.
  26. Mortishire-Smith RJ, Skiles GL, Lawrence JW, Spence S, Nicholls AW, Johnson BA, Nicholson JK: Use of metabonomics to identify impaired fatty acid metabolism as the mechanism of a drug-induced toxicity. Chem Res Toxicol 2004, 17(2):165-173.
  27. Sabatine MS, Liu E, Morrow DA, Heller E, McCarroll R, Wiegand R, Berriz GF, Roth FP, Gerszten RE: Metabolomic identification of novel biomarkers of myocardial ischemia. Circulation 2005, 112(25):3868-3875.
  28. Cakir T, Efe C, Dikicioglu D, Hortaçsu AKB, Oliver SG: Flux balance analysis of a genome-scale yeast model constrained by exometabolomic data allows metabolic system identification of genetically different strains. Biotechnol Prog 2007, 23(2):320-326.
  29. Bang JW, Crockford DJ, Holmes E, Pazos F, Sternberg MJ, Muggleton SH, Nicholson JK: Integrative top-down system metabolic modeling in experimental disease states via data-driven Bayesian methods. J Proteome Res 2008, 7(2):497-503.
  30. Oliveira AP, Patil KR, Nielsen J: Architecture of transcriptional regulatory circuits is knitted over the topology of bio-molecular interaction networks. BMC Syst Biol 2008, 2:17.
  31. Villas-Boas SG, Moxley JF, Akesson M, Stephanopoulos G, Nielsen J: High-throughput metabolic state analysis: the missing link in integrated functional genomics of yeasts. Biochem J 2005, 388(Pt 2):669-677.
  32. Moreira dos Santos M, Thygesen G, Kötter P, Olsson L, Nielsen J: Aerobic physiology of redox-engineered Saccharomyces cerevisiae strains modified in the ammonium assimilation for increased NADPH availability. FEMS Yeast Res 2003, 4(1):59-68.
  33. Hess DC, Lu W, Rabinowitz JD, Botstein D: Ammonium toxicity and potassium limitation in yeast.

PLoS Biol 2006, 4(11):e351. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Nissen TL, Schulze U, Nielsen J, Villadsen J: Flux distributions in anaerobic, glucose-limited continuous cultures of Saccharomyces cerevisiae. 

Microbiology 1997, 143(Pt 1):203-218. PubMed Abstract | Publisher Full Text

  1. Famili I, Forster J, Nielsen J, Palsson BO: Saccharomyces cerevisiae phenotypes can be predicted by using constraint-based analysis of a genome-scale reconstructed metabolic network. 

Proc Natl Acad Sci USA 2003, 100(23):13134-13139. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Bonarius HPJ, Schmid G, Tramper J: Flux analysis of underdetermined metabolic networks: The quest for the missing constraints. 

Trends in Biotechnology 1997, 15(8):308-314. Publisher Full Text

  1. Edwards JS, Palsson BO: Metabolic flux balance analysis and the in silico analysis of Escherichia coli K-12 gene deletions. 

BMC Bioinformatics 2000, 1:1. PubMed Abstract | BioMed Central Full Text | PubMed Central Full Text

  1. Varma A, Palsson BO: Metabolic Flux Balancing: Basic concepts, Scientific and Practical Use. 

Nat Biotechnol 1994, 12:994-998. Publisher Full Text

  1. Segre D, Vitkup D, Church GM: Analysis of optimality in natural and perturbed metabolic networks. 

Proc Natl Acad Sci USA 2002, 99(23):15112-15117. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Baldi P, Brunak S, Chauvin Y, Andersen CA, Nielsen H: Assessing the accuracy of prediction algorithms for classification: an overview. 

Bioinformatics 2000, 16(5):412-424. PubMed Abstract | Publisher Full Text

  1. Price ND, Schellenberger J, Palsson BO: Uniform Sampling of Steady State Flux Spaces: Means to Design Experiments and to Interpret Enzymopathies. 

Biophysical Journal 2004, 87(4):2172-2186. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Price ND, Thiele I, Palsson BO: Candidate states of Helicobacter pylori’s genome-scale metabolic network upon application of “loop law” thermodynamic constraints. 

Biophysical J 2006, 90(11):3919-3928. Publisher Full Text

  1. Schuetz R, Kuepfer L, Sauer U: Systematic evaluation of objective functions for predicting intracellular fluxes in Escherichia coli. 

Mol Syst Biol 2007, 3:119. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Patil KR, Nielsen J: Uncovering transcriptional regulation of metabolism by using metabolic network topology. 

Proc Natl Acad Sci USA 2005, 102(8):2685-2689. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Shannon P, Markiel A, Ozier O, Baliga NS, Wang JT, Ramage D, Amin N, Schwikowski B, Ideker T: Cytoscape: a software environment for integrated models of biomolecular interaction networks. 

Genome Res 2003, 13(11):2498-2504. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Herrgard MJ, Swainston N, Dobson P, Dunn WB, Arga KY, Arvas M, Bluthgen N, Borger S, Costenoble R, Heinemann M, et al.: A consensus yeast metabolic network reconstruction obtained from a community approach to systems biology. 

Nat Biotech 2008, 26:1155-1160. Publisher Full Text

  1. Forster J, Famili I, Fu PC, Palsson BO, Nielsen J: Genome-Scale Reconstruction of the Saccharomyces cerevisiae Metabolic Network. 

Genome Research 2003, 13(2):244-253. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Reed JL, Patel TR, Chen KH, Joyce AR, Applebee MK, Herring CD, Bui OT, Knight EM, Fong SS, Palsson BO: Systems Approach to Genome Annotation: Prediction and Validation of Metabolic Functions. 

Proc Natl Acad Sci USA 2006, 103(46):17480-17484. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Nissen TL, Kielland-Brandt MC, Nielsen J, Villadsen J: Optimization of ethanol production in Saccharomyces cerevisiae by metabolic engineering of the ammonium assimilation. 

Metab Eng 2000, 2(1):69-77. PubMed Abstract | Publisher Full Text

  1. Roca C, Nielsen J, Olsson L: Metabolic engineering of ammonium assimilation in xylose-fermenting Saccharomyces cerevisiae improves ethanol production. 

Appl Environ Microbiol 2003, 69(8):4732-4736. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Hartman JL IV: Buffering of deoxyribonucleotide pool homeostasis by threonine metabolism. 

Proc Natl Acad Sci USA 2007, 104(28):11700-11705. PubMed Abstract | Publisher Full Text | PubMed Central Full Text

  1. Gelling CL, Piper MD, Hong SP, Kornfeld GD, Dawes IW: Identification of a novel one-carbon metabolism regulon in Saccharomyces cerevisiae. 

J Biol Chem 2004, 279(8):7072-7081. PubMed Abstract | Publisher Full Text

  1. Denis V, Daignan-Fornier B: Synthesis of glutamine, glycine and 10-formyl tetrahydrofolate is coregulated with purine biosynthesis in Saccharomyces cerevisiae. 

Mol Gen Genet 1998, 259(3):246-255. PubMed Abstract | Publisher Full Text

  1. Hjortmo S, Patring J, Andlid T: Growth rate and medium composition strongly affect folate content in Saccharomyces cerevisiae. 

Int J Food Microbiol 2008, 123(1–2):93-100. PubMed Abstract | Publisher Full Text

  1. Kussmann MRF, Affolter M: OMICS-driven biomarker discovery in nutrition and health. 

J Biotechnol 2006, 124(4):758-787. PubMed Abstract | Publisher Full Text

  1. Serkova NJ, Niemann CU: Pattern recognition and biomarker validation using quantitative 1H-NMR-based metabolomics. 

Expert Rev Mol Diagn 2006, 6(5):717-731. PubMed Abstract | Publisher Full Text

 
