Irreconcilable Dissonance in Physical Space and Cellular Metabolic Conception

Curator: Larry H. Bernstein, MD, FCAP

Pasteur Effect – Warburg Effect – What its history can teach us today. 

José Eduardo de Salles Roselino

The Warburg effect, in reality the “Pasteur effect,” was the first example of metabolic regulation ever described: a decrease in the carbon flux from the sugar molecule towards the end products of the catabolic pathway, ethanol and carbon dioxide, observed when yeast cells were transferred from an anaerobic to an aerobic environment. In Pasteur’s studies, sugar metabolism was measured mainly by the decrease in sugar concentration in the yeast growth medium over a measured period of time. The sugar concentration in the medium fell at great speed in yeast grown in anaerobiosis (oxygen deficiency), and this rate was greatly reduced when the yeast culture was transferred to an aerobic condition. This finding was very important for the French wine industry of Pasteur’s time, since most of the undesirable outcomes in the industrial use of yeast occurred when the yeast cells took a very long time to create a rather selective anaerobic condition. Such a selective culture medium was characterized by the higher carbon dioxide levels produced by fast-growing yeast cells and by a higher alcohol content in the yeast culture medium.

However, in biochemical terms, this finding was required to understand Lavoisier’s results indicating that the chemical and the biological oxidation of sugars produce the same calorimetric (heat generation) results. This observation requires a control mechanism (metabolic regulation) to prevent living cells from being burned by the rapid heat release of the biological oxidative processing of sugar (metabolism). In addition, Lavoisier’s results were the first indication that both processes occur within similar thermodynamic limits. In very condensed form, these observations indicate the major reasons that led Warburg to test for failures of the control mechanisms of cancer cells in comparison with those observed in normal cells.

[It might be added that the availability of O2 and CO2 and the climatic conditions over 750 million years (which included volcanic activity, tectonic movements of the earth’s crust, and glaciation, and more recently the use of carbon fuels and the extensive deforestation of our land masses) have had a large role in determining biological speciation over time, in the sea and on land. O2 is generated by plants utilizing energy from the sun for the conversion of CO2. Remove the plants and we tip the balance. A large source of CO2 lies beneath the earth’s surface.]

Biology inside classical thermodynamics poses some challenges to scientists. For instance, all classical thermodynamic quantities must be measured under reversible thermodynamic conditions. In a closed system, an increase in P (pressure) leads to a decrease in V (volume), all this occurring under conditions in which infinitesimal changes in one affect the other correspondingly, a continuum response. Not even a quantum of energy falls outside those parameters.
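As one simple illustration of this inverse relation (a sketch assuming an ideal gas at constant temperature and fixed amount, conditions not stated above), Boyle’s law gives

\[
PV = nRT \ \ (n, T \ \text{constant}) \quad\Longrightarrow\quad P_1 V_1 = P_2 V_2 \quad\Longrightarrow\quad V_2 = V_1 \, \frac{P_1}{P_2},
\]

so reversibly doubling the pressure halves the volume, and each infinitesimal step of the change can, in principle, be retraced.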

In a reversible system, a decrease in V, under the same conditions, will lead to an increase in P. In biochemistry, “reversible” usually indicates a reaction that goes easily either from A to B or from B to A. For instance, when it was required to search for an anti-ischemic effect of chlorpromazine in an extrahepatic obstructed liver, it was necessary to use an adequate system that increased biliary pressure in a reversible manner, in order to exclude a direct effect of this drug on the biological pressure inducer itself (bile secretion) (Braz. J. Med. Biol. Res. 1989; 22: 889-893). Frequently, these details are skipped over by those who read biology in ATGC letters.

Very important observations can be made in this regard when neutral mutations are taken into consideration, since, after several mutations that do not affect the previous activity and function, a final mutation may provide a new RNA transcript for a protein and elicit a new function. As an example, consider a lamb prion protein (PrP C) becoming similar to the bovine PrP C, while preserving its normal role in the lamb, when its ability to convert human PrP C is considered (Stanley Prusiner).

This observation is good enough to confirm one of the most important contributions of Erwin Schrödinger in his What is Life?:

“This little book arose from a course of public lectures, delivered by a theoretical physicist to an audience of about four hundred which did not substantially dwindle, though warned at the outset that the subject matter was a difficult one and that the lectures could not be termed popular, even though the physicist’s most dreaded weapon, mathematical deduction, would hardly be utilized. The reason for this was not that the subject was simple enough to be explained without mathematics, but rather that it was much too involved to be fully accessible to mathematics.”

After Hans Krebs’ description of the cyclic nature of citrate metabolism, and after his followers described its requirement for aerobic catabolism, two major lines of research started the search for an understanding of the mechanism of energy transfer that explains how ADP is converted into ATP. One followed the organic chemistry line of reasoning and therefore searched for a mechanism that could explain how the breakdown of a carbon-carbon link could have its energy transferred to ATP synthesis. One of the major leaders of this research line was Britton Chance. He took into account that, relatively early in the series of Krebs cycle reactions, two carbon atoms of the acetyl group were released as carbon dioxide (in fact, not the actual acetyl carbons but those on the opposite side of the citrate molecule). In stoichiometric terms, it was not important whether or not the released carbons were exactly those originating from glucose. His research aimed to find a proteinaceous intermediary that could act as an energy reservoir: the intermediary would store, in a phosphorylated amino acid, the energy of carbon-carbon bond breakdown, and this activated amino acid could transfer its phosphate group to ADP, producing ATP. A key intermediate involved in the transfer was identified by Kaplan and Lipmann at Johns Hopkins as acetyl coenzyme A, for which Fritz Lipmann received a Nobel Prize.

Alternatively, possibly under the influence of the excellent results of Hodgkin and Huxley, a second line of research appeared. The work of Hodgkin & Huxley indicated the storage of electrical potential energy in transmembrane ionic asymmetries and presented the explanation for the change from resting to action potential in excitable cells. This second line of research, under the leadership of Peter Mitchell, postulated a mechanism in which the transfer of the oxido-reductive power of organic molecule oxidation, through electron transfer, was the key to the energy transfer mechanism required for ATP synthesis.
This diverted attention from the high-energy (~P) phosphate bond to the transfer of electrons. During most of the harsh period of confrontation between the two points of view, Paul Boyer and his followers attempted to act as a conciliatory third party, without much success, according to personal accounts (in L.A., Latin America) heard from those few of our scientists who were able to follow the major scientific events held in the USA and who could present them to us later. Paul Boyer was eventually able to show how the energy is transduced by a molecular machine that changes conformation in a series of three steps while rotating in one direction to produce ATP, and in the opposite direction to produce ADP plus Pi from ATP (reversibility).

However, Peter Mitchell had earlier emerged victorious in the conceptual dispute, over the Britton Chance point of view, after he used E. coli mutants to show H+ gradients across the cell membrane and their use as an energy source, for which he received a Nobel Prize. Somehow, this outcome represented such a blow to Chance’s previous work that it seems to have cast a shadow over very important findings from his earlier career that should not be affected by one or another form of energy transfer mechanism. For instance, Britton Chance developed the simple and rapid polarographic assay method for oxidative phosphorylation and the idea of control of energy metabolism that brings us back to Pasteur.

This alternative metabolic result seems to have been neglected in the recent years of the obesity epidemic, which led to a search for a single molecular mechanism to explain the accumulation of chemical reserve (adipose tissue) in our body. This does not mean that the role of the central nervous system is neglected here. In short, in respiring mitochondria the rate of electron transport, and hence the rate of ATP production, is determined primarily by the relative concentrations of ADP, ATP and phosphate in the external medium (cytosol), and not by the concentration of respiratory substrate such as pyruvate. Therefore, when the yield of ATP is high, as it is in aerobiosis, and the cellular use of ATP is not changed, the oxidation of pyruvate, and therefore glycolysis, is quickly throttled down to the resting state, without any change in gene expression. The dependence of respiratory rate on ADP concentration is also seen in intact cells: a muscle at rest, using no ATP, has a very low respiratory rate. [When skeletal muscle is stressed by high exertion, the lactic acid produced is released into the circulation and is metabolized aerobically by the heart at the end of the activity.]
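As a minimal numerical sketch of this acceptor control (not taken from the text above): respiration is often described as an approximately hyperbolic function of cytosolic ADP, so a simple Michaelis-Menten form with purely illustrative parameters already reproduces the qualitative behaviour.

```python
# Minimal sketch (not from the original text): respiratory ("acceptor") control
# modeled as a Michaelis-Menten-like dependence of mitochondrial O2 consumption
# on cytosolic ADP. Parameter values are purely illustrative.

def respiration_rate(adp_uM, vmax=100.0, km_adp_uM=30.0):
    """Relative O2 consumption as a hyperbolic function of [ADP] (arbitrary units)."""
    return vmax * adp_uM / (km_adp_uM + adp_uM)

for state, adp in [("resting muscle", 5.0), ("moderate work", 50.0), ("heavy work", 300.0)]:
    print(f"{state:15s} [ADP] = {adp:5.0f} uM -> respiration = {respiration_rate(adp):5.1f}% of Vmax")
```

With these invented numbers, respiration sits near 14% of maximum in the resting state and approaches Vmax only when work regenerates ADP, which is the throttling-down the paragraph describes.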

This respiratory control of metabolism leads to the preservation of the body’s carbon reserves and, in the case of a high caloric intake, also to an increase in the fat reserves that were essential for our biological ancestors’ survival (and today for our obesity epidemic). No matter how important this observation is, it is only one focal point of metabolic control. We cannot reduce the problem of obesity to the existence of metabolic control; there are numerous other factors. On the other hand, we cannot neglect or remove this vital process in order to correct obesity, and we cannot explain obesity while ignoring this metabolic control. This topic is so neglected in modern times that we cannot follow major research lines of the past, which were interrupted by the emerging molecular biology techniques and the vain belief that a dogmatic vision of biology could replace all previous knowledge with a new one based upon ATGC readings. For instance, to display the bad consequences derived from ignorance of these old scientific facts, we can take into account how ion movements across membranes affect membrane protein conformation and therefore contradict the flawed central dogma of molecular biology. This change in protein conformation (with unchanged amino acid sequence), and/or the lack of such change, is linked to the factors that affect vital processes such as the heartbeat. This modern ignorance could also explain some major pitfalls seen in clinical trials of new drugs and, on a smaller scale, some bad medical practices.

The work of Britton Chance and of Peter Mitchell has deep and sound scientific roots: it was done with excellent techniques, supported by excellent scientific reasoning, and it produced a large series of very important intermediary scientific results. Their sole difference was that they aimed at very different scientific explanations as their goals (they had different teleologies in mind, shaped by their previous experiences). When, with the use of mutants obtained in microorganisms, P. Mitchell’s goal was found to survive and B. Chance’s to succumb to the experimental evidence, all those excellent findings of B. Chance and his followers were directed to the dustbin of scientific history, an example of lack of scientific consideration. [On the one hand, the Mitchell model used a unicellular organism; on the other, Chance’s work was with eukaryotic cells, quite relevant to the discussion.]

We can summarize the challenge faced by these two great scientists in the following form: the first conceptual unification in bioenergetics, achieved in the 1940s, is inextricably bound up with the name of Fritz Lipmann. Its central feature was the recognition that adenosine triphosphate, ATP, serves as a universal energy “currency,” much as money serves as economic currency. In a nutshell, the purpose of metabolism is to support the synthesis of ATP. In microorganisms, this is perfect! In humans, mammals, or vertebrates, for the same reason that we cannot consider gene expression to be equivalent to protein function (an error acceptable in the case of microorganisms), this oversimplifies the metabolic requirement with a huge error. However, if our concern is ATP chemistry only, metabolism produces ATP and the hydrolysis of ATP pays for the performance of almost all kinds of work. It is reasonable to presume that finding out how the flow of metabolism (carbon flow) leads to ATP production was a major focal point of research for both contenders. Consequently, what could have been a minor fall for one of the contenders, if we take into account everything found during his entire life of research, the real failure of B. Chance’s final goal, was amplified far beyond what reason may warrant!

Another aspect must be taken into account: both contenders had very sound roots in the scientific past. Metabolism may produce two forms of energy currency (I personally don’t like this expression* and use it here only because it was used by both groups to express their findings; together with simplistic thermodynamics, it conveys wrong ideas). The second kind of energy currency is the current of ions passing from one side of a membrane to the other. P. Mitchell’s scientific root undoubtedly had the work of Hodgkin & Huxley, Huxley & Huxley, and Huxley & Simmons (1940s to 1972) as a solid support. B. Chance had the enzymologists involved in clarifying how ATP could be produced directly from NADH + H+ oxido-reductive metabolic reactions or from the hydrolysis of an enolpyruvate intermediary. Both competitors had their work supported by different but sound scientific roots and produced very important scientific results while trying to establish their hypothetical points of view.

*ATP is produced under the guidance of cell needs and not by its yield. When glucose yields only 2 ATP per molecule, it is oxidized at very high speed (anaerobiosis), as required to match cellular needs. On the other hand, when it may yield (in thermodynamic terms) 38 ATP, the same molecule is oxidized at low speed. It would be similar to an investor choosing the lowest-yielding form for his investment.
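The footnote’s point can be made quantitative with simple arithmetic (using the classical textbook yields of 2 ATP per glucose in anaerobiosis and about 38 in aerobiosis; modern estimates are nearer 30-32 ATP, but the argument is unchanged). To meet the same ATP demand, the required glucose flux is

\[
J_{\text{glucose}} = \frac{J_{\text{ATP}}}{Y_{\text{ATP/glucose}}}, \qquad
\frac{J_{\text{glucose}}^{\text{anaerobic}}}{J_{\text{glucose}}^{\text{aerobic}}} \approx \frac{38}{2} = 19,
\]

so a cell restricted to fermentation must consume sugar roughly an order of magnitude faster, which is precisely the rapid anaerobic sugar consumption Pasteur measured.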

Before P. Mitchell’s winning results were displayed, one line of defense used by B. Chance’s followers was to point to a conflict between the restrictive role of proteins, expressed through the specificity of their ionic interactions, and the general ability of ionic asymmetries to be associated with mitochondrial ATP production. Protein-catalyzed chemical activities do not have perfect specificity, but an outstanding degree of selective interaction was captured by the lock-and-key model of enzyme action. A large group of outstanding “mitochondriologists” were able to show ATP synthesis associated with Na+, K+, Ca2+… asymmetries across mitochondrial membranes, and each time they did this, P. Mitchell had to demonstrate the existence of antiporters that exchange the ion in question for hydrogen ions as the final common source of the chemiosmotic energy used by mitochondria for ATP synthesis.

This conceptual battle generated an enormous body of knowledge that was laid to rest, its line of research somehow discontinued, when the final E. coli mutant studies presented convincing evidence in favor of P. Mitchell’s point of view.

Not surprisingly, a “wise anonymous” later pointed out: “No matter what you are doing, you will always be better off in case you have a mutant”

(Principles of Medical Genetics, T. D. Gelehrter & F. S. Collins, Chapter 7, 1990).

However, let us take the example of a mechanical wristwatch. When the watch is working acceptably, it is clear that its normal functioning is not the result of any one of its isolated components, nor something that can be shown by a reductionist molecular view. Usually it will be considered to be working acceptably if its accuracy falls inside a normal functional range, for instance one or two standard deviations below or above the mean value for normal function, depending upon the rigor wisely adopted. Only when it has a faulty component (a genetic inborn error) can we point to a single isolated piece as the cause of its failure (a reductionist molecular view).
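To make the “normal functional range” idea concrete, here is a minimal sketch; the reference drift values and the two-standard-deviation cutoff are invented purely for illustration.

```python
# Minimal sketch of the "normal functional range" idea above: a measurement is
# judged normal if it falls within k standard deviations of a reference mean.
# The reference drifts (seconds/day) and the test cases are invented.
from statistics import mean, stdev

reference_drifts = [2.1, -1.4, 0.8, -0.3, 1.9, -2.2, 0.5, 1.1]  # healthy watches, s/day
mu, sigma = mean(reference_drifts), stdev(reference_drifts)

def within_normal_range(drift, k=2.0):
    """True if the drift lies within mu +/- k*sigma of the reference population."""
    return abs(drift - mu) <= k * sigma

for drift in [0.9, 4.8, -30.0]:
    verdict = "normal" if within_normal_range(drift) else "outside the normal range"
    print(f"drift {drift:+6.1f} s/day -> {verdict}")
```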

In medicine we need to teach, first, the major reasons why the watch works well (without simply saying it is “automatic”). Its functions may cross the regulatory limit between reversible and irreversible faster than we can imagine. Later, when these ideas about normality are held very clearly in the mindset of medical doctors (not medical technicians), we may address the inborn errors and what we may learn from them. A modern medical technician may earn admiration when he uses an “innocent” virus to correct a faulty gene (a rather impressive technological advance). However, in case the virus later shows signs that it was not so innocent, a real medical doctor will be called upon to put things in their correct place again.

A lot about ion fluxes can be found among the missing parts of the normal evolution of biochemistry. Even those oscillatory changes in Ca2+ that were shown to affect gene expression (C. de Duve) were laid to rest, since they clearly indicate a source of biological information that, despite not changing the nucleotide order in DNA, shows a flux of biological information opposing the dogma (DNA to RNA to proteins). Another line of work showed a hierarchy in the use of the mitochondrial membrane potential: the potential is used first for Ca2+ uptake and only afterwards for the conversion of ADP into ATP (A. L. Lehninger). In fact, A. L. Lehninger’s real idea was far more complex, since according to him mitochondria work like a buffer for intracellular calcium, releasing it to the outside in case of a deep decrease in cytosolic levels or capturing it from the cytosol when facing a transient increase in Ca2+ load. As some of the Krebs cycle dehydrogenases are activated by Ca2+, this finding was used to propose a new control factor in addition to that of ADP (B. Chance). All this was discontinued with the wrong use of calculus in biochemistry (today we could place bioinformatics in a similar role), which assigned less importance to the mitochondrial role on the basis of comparative kinetics that today are seen as faulty.
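The hierarchy and buffering described above can be caricatured in a few lines of code. This is only a conceptual toy, not Lehninger’s model: the “potential budget,” set point and costs are invented numbers used solely to show the ordering (Ca2+ handling first, phosphorylation with what remains).

```python
# Conceptual toy (not a quantitative model): the membrane potential is spent
# first on buffering excess cytosolic Ca2+ and only the remainder drives
# ADP -> ATP conversion. Thresholds and units are invented for illustration.

def allocate_potential(potential_budget, cytosolic_ca, ca_setpoint=1.0, cost_per_ca=0.5):
    """Split an abstract 'potential budget' between Ca2+ uptake and ATP synthesis."""
    excess_ca = max(0.0, cytosolic_ca - ca_setpoint)   # Ca2+ load above the set point
    ca_cost = min(potential_budget, excess_ca * cost_per_ca)
    remaining_for_atp = potential_budget - ca_cost      # what is left drives phosphorylation
    return ca_cost, remaining_for_atp

for ca_load in [0.5, 1.5, 4.0]:  # low, moderate, high transient cytosolic Ca2+
    ca_cost, atp_share = allocate_potential(10.0, ca_load)
    print(f"cytosolic Ca2+ load {ca_load:3.1f} -> potential for Ca2+ uptake {ca_cost:3.1f}, "
          f"left for ATP synthesis {atp_share:3.1f}")
```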

It is important to combat dogmatic reasoning and restore sound scientific foundations in basic medical courses, which must urgently reverse the faulty trend of imposing a view that goes from the detail towards generalization, instead of the correct form, which goes from the well-understood general finding towards its molecular details. The view that led to curious subjects such as bioinformatics in medical courses, as training in sequence-finding activities, can only be explained by its commercial value. The usual form of scientific thinking respects the limits of our ability to grasp new knowledge and relies on the reproducibility of scientific results as a way to make up for the lack of a mathematical equation defining the relationship of variables and the determination of its functional domains. It also uses old scientific roots as its sound support and never replaces existing knowledge with dogmatic and/or wishful thinking. When DNA sequencing was found to be a technical advance for finding the amino acid sequence of proteins, it was just that, a technical advance. By no means could this technical advance be considered a scientific result indicating that DNA sequences alone have replaced the need to study protein chemistry and its responses to microenvironmental changes, in order to understand its multiple conformations, changes in activity and function. As E. Schrödinger correctly described, the chemical structure responsible for the coded, stored form of genetic information must have minimal interaction with its microenvironment in order to endure for hundreds and hundreds of years, as seen in the Habsburg lip. Only magical reasoning assumes that it is possible to find in non-reactive chemical structures the properties of the reactive ones.

For instance, knowledge of the reactions of the Krebs cycle clearly indicates a role for the solvent, which can no longer be considered an inert bath for the catalytic activity of the enzymes once the transfer of energy includes a role for hydrogen transport. The great increase in understanding of this change in chemical reactivity came from conformational energy.

Again, even a rather simplistic view of this atomic property (conformational energy) is enough to confirm, once more, one of the most important contributions of E. Schrödinger in his What is Life?:

“This little book arose from a course of public lectures, delivered by a theoretical physicist to an audience of about four hundred which did not substantially dwindle, though warned at the outset that the subject matter was a difficult one and that the lectures could not be termed popular, even though the physicist’s most dreaded weapon, mathematical deduction, would hardly be utilized. The reason for this was not that the subject was simple enough to be explained without mathematics, but rather that it was much too involved to be fully accessible to mathematics.”

In a very simplistic view, while energy manifests itself by the ability to perform work, conformational energy, as a property derived from our atomic structure, can be neutral, positive or negative (no effect, increased or decreased reactivity, with respect to any chemical reactivity measured as work).

Also:

“I mean the fact that we, whose total being is entirely based on a marvelous interplay of this very kind, yet possess the power of acquiring considerable knowledge about it. I think it possible that this knowledge may advance to little short of a complete understanding of the first marvel. The second may well be beyond human understanding.”

In fact, scientific knowledge allows us to understand how biological evolution may or may not have occurred, and yet it does not present proof of how it actually occurred. It will always be an indication of the possible against the highly unlikely, and never a scientifically proven fact about the real form of its occurrence.

As was the case with B. Chance and his bioenergetics findings, we may obtain very important findings that indicate wrong directions for the future, as happened in his case, or that are directed toward our past.

The Skeleton of Physical Time – Quantum Energies in Relative Space of S-labs

By Radoslav S. Bozov, Independent Researcher

WSEAS, Biology and BioSystems of Biomedicine

Space does not equate to distance, the displacement of an object by classically defined forces – electromagnetic, gravitational or inertial. In perceiving quantum open systems, a quantum, a package of energy, displaces properties of wave interference and the statistical outcomes of sums of paths of particles detected by a design of S-labs.

The notion of S-labs, space labs, deals with inherent problems of an operational module, R(i+1), where an imaginary number ‘struggles’ to work under roots of a negative sign, a reflection of an observable set of sums reaching beyond the limits of the human organ, an eye or other foundational signal-processing system.

While heavenly bodies, planets, star systems, and other exotic forms of light-reflecting and/or light-emitting objects observable with the naked eye have been deduced to operate under numerical systems that calculate the periodic displacement of one relative to another, the atomic clocks of nanospace open our eyes to ever-expanding energy spaces, where matrices of interacting variables point to the problem of the infinity of variations in scalar spaces, while nonetheless defining the properties of minute universes as a mirror image of an astronomical system. The first and foremost problem is essentially the same as in the mathematical methodologies deduced by Isaac Newton and Albert Einstein for processing a surface. I will introduce you to a surface interference method by describing undetermined objective space in terms of determined subjective time.

Therefore, the moment will be an outcome of statistical sums of a numerical system extending from near zero to near one. Three strings hold down a dual system entangled via the interference of two waves, where a single wave is a product of the momentum of three particles (today named according to either weak or strong interactions).

The above-described system emerges from duality into trinity, the objective space value of physical realities. The triangle of physical observables, charge, gravity and electromagnetism, is an outcome of the interference of particles, strings and waves, where particles are not particles, nor strings strings, nor waves waves, of an infinite character in an open system which we attempt to define in order to predict the outcomes of tomorrow’s parameters, either dependent or independent, as well as both subject to time simulations.

We now know that the aging of a biological organism cannot be defined within a singularity. Thereafter, clocks are subject to the apparatuses measuring the oscillation of defined parameters, which enable us to calculate both an amplitude and a period, which we know to be dependent on phase transitions.

The problem of phase was solved by the applicability of carbon relative systems. A piece of diamond does not get wet, yet it holds water’s light-entangled property. Water is the dark force of light. To formulate such a statement, we have been searching for truth by examining cooling objects where the Maxwell demon is translated into information, a complex data system.

Modern perspectives in computing quantum-based matrices, 0+1 = 1 and/or 0+0 = 1 and/or 1+1 = 0, will be reduced by applying the conceptual frame of Aladdin’s flying anti-gravity carpet, unwrapping both past and future by sending a photon to both, placing the present always near zero. Thus, for each parallel quantum computation of a natural system approaching the limit of a vibration of a string, 0 does not equal 0 and 1 does not equal 1. In any case, even if our method gives 1+1 = 1, still, 1 is not 1 at time i+1. This will set the fundamentals of an operational module, called the labris operator or, in simplicity, S-labs. Note that 1 as a result is an event predictable into the future, while the interacting parameters of the addition 1+1 may be both 1 as an observable past and 1 as an imaginary system, or 1+1 the displaced interactive parameters of past observable events. This is the foundation of Future Quantum Relative Systems Interference (QRSI), taking the analytical technologies of the future as a result of a data-matrix compressing principle relative to carbon as a reference matter, rational to water-based properties.

Gödel’s concept of loops therefore exists only upon discrete relative space uniting to parallel absolute continuity of time ‘lags’. (Gödel, Escher, Bach: An Eternal Golden Braid. A Metaphorical Fugue on Minds and Machines in the Spirit of Lewis Carroll. D. Hofstadter. Chapter XX: Strange Loops, Or Tangled Hierarchies. A grand windup of many of the ideas about hierarchical systems and self-reference. It is concerned with the snarls which arise when systems turn back on themselves, for example science probing science, government investigating governmental wrongdoing, art violating the rules of art, and finally humans thinking about their own brains and minds. Does Gödel’s Theorem have anything to say about this last “snarl”? Are free will and the sensation of consciousness connected to Gödel’s Theorem? The chapter ends by tying Gödel, Escher, and Bach together once again.) The struggle in between time creates dark spaces within which strings manage to obey light properties, entangled bosons of information carrying future outcomes of a system processing consciousness. Therefore, Albert Einstein was correct in his quantum time realities when rejecting a dissolving cube of sugar within a cup of tea (Henri Bergson, 19th-century philosopher; Bergson’s concept of multiplicity attempts to unify in a consistent way two contradictory features, heterogeneity and continuity, and many philosophers today think that this concept of multiplicity, despite its difficulty, is revolutionary). However, the unity of time and space could not be achieved by reducing time to the charge, gravity and electromagnetic properties of energy and mass.

Charge is further reduced to the interference of particles/strings/waves, contrary to the Hawking idea of the irreducibility of chemical-energy-carrying ‘units’, and gravity is accounted for by the intrinsic properties of anti-gravity carbon systems processing light, an electromagnetic force, which I have reduced towards ever-expanding discrete energy spaces rational to compressing mass/time. The role of loops seems to be to control formalities where the boundaries of space fluctuate as a result of what we called above dark time-spaces.

Indeed, the concept of a horizon is a constant due to ever-expanding observables. Thus, it fails to provide a rational approach towards space-time issues.

Richard Feynman has touched on issues of touching of space, sums of paths of particle traveling through time. In a way he has resolved an important paradigm, storing information and possibly studying it by opening a black box. Schroedinger’s cat is alive again, but incapable of climbing a tree when chased by a dog. Every time a cat climbs a garden tree, a fruit falls on hedgehogs carried away parallel to living wormholes whose purpose of generating information lies upon carbon units resolving light.

In order to deal with such a paradigm, we will introduce i+1 under a square root in relativity, therefore taking negative one (-1 = sqrt(i+1)), an operational module R dealing with Wheeler’s foam squeezed by light, releasing water, dark spaces. A thousand words down!

What is a number? Is it a name, or some kind of language, or both? Is the issue of number theory possibly accountable to the value of the concept of entropic timing? Light penetrating a pyramid holding bean seeds on a piece of paper and a slice of bread, a triple set, where a church mouse has taken not a teardrop but a drop of blood. What amazing physics! The magic of biology lies above egoism, above pride, and below Saints.

We will set up the twelve parameters seen through 3+1 in classical realities:

– discrete absolute energies/forces – no contradiction for now between Newtonian and Einsteinian mechanics

– mass absolute continuity – the conservation laws of physics in accordance with the weak and strong forces

– quantum relative spaces – issuing a paradox of Albert Einstein’s space-time, resolved by the uncertainty principle

– parallel continuity of multiple times/universes – resolving the uncertainty of united space and energy through evolving statistical concepts of scalar relative space expansion and vector quantum energies, by compressing the relative continuity of matter in it, ever-compressing flat surfaces, finding the inverse link between the deterministic mechanics of displacement and imaginary space, where spheres fit within the surfaces of triangles as time unwraps the past by pulling strings from the future.

To us common human beings, with an extra curiosity overloaded by real dreams, value happens to play into the intricate foundation of life, the garden of love, its carbon management in mind, collecting pieces of squeezed cooling time.

The infinite interference of each operational module with another, composing ever-emerging time constraints unified by the Solar system, objective to humanity, perhaps answers that a drop of blood and a drop of tear are united by a droplet of a substance separating negative entropy into the time courses of physical realities, as defined by an open algorithm where chasing power subdued to space becomes an issue of time.

Jose Eduardo de Salles Roselino

Some small errors: for instance, an increase in P leads to a decrease in V (not an increase in V).

Radoslav S. Bozov, Independent Researcher

If we were to use the preventive measures of medical science, the instruments of medical science would have to predict future outcomes based on observable parameters of history. There are several key issues arising: 1. Despite pinning down a difference on the genomic scale, say pieces of information, we do not know how to change it, that is, how to shift the methylome occupying genome surfaces in a precise manner. 2. The operational status quo of living systems does NOT work by the vector gravity physics of ‘building blocks’, which projects the delusional concept of a masonry trick that has never worked by cornerstones and ever-shifting momenta. Even assuming genomic assembling worked, that is, dealing with inferences through data mining and annotation, we are not in a position to read the future in real time, and we never will be, because of the rtPCR technology’s self-restriction to data-time processing. We know of existing post-translational modalities. 3. We don’t know what we don’t know, and that is foundational to future medicine, that is, dealing with biological clocks, behavior, and various daily life inputs ranging from radiation to water systems, food quality, and drugs.
