

Clinical Laboratory Challenges

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

CLINICAL LABORATORY NEWS   

The Lab and CJD: Safe Handling of Infectious Prion Proteins

Body fluids from individuals with possible Creutzfeldt-Jakob disease (CJD) present distinctive safety challenges for clinical laboratories. Sporadic, iatrogenic, and familial CJD (known collectively as classic CJD), along with variant CJD, kuru, Gerstmann-Sträussler-Scheinker syndrome, and fatal familial insomnia, are prion diseases, also known as transmissible spongiform encephalopathies. Prion diseases affect the central nervous system and, from the onset of symptoms, follow a typically rapid and progressive neurological decline. While prion diseases are rare, it is not uncommon for the most prevalent form—sporadic CJD—to be included in the differential diagnosis of individuals presenting with rapid cognitive decline. Thus, laboratories may deal with a significant number of possible CJD cases and should have protocols in place to process specimens, even if a confirmatory diagnosis of CJD is made in only a fraction of these cases.

The Lab’s Role in Diagnosis

Laboratory protocols for handling specimens from individuals with possible, probable, and definitive cases of CJD are important to ensure timely and appropriate patient management. When the differential includes CJD, an attempt should be made to rule in or rule out other causes of rapid neurological decline. Laboratories should be prepared to process blood and cerebrospinal fluid (CSF) specimens in such cases for routine analyses.

Definitive diagnosis requires identification of prion aggregates in brain tissue, which can be achieved by immunohistochemistry, a Western blot for proteinase K-resistant prions, and/or by the presence of prion fibrils. Thus, confirmatory diagnosis is typically achieved at autopsy. A probable diagnosis of CJD is supported by elevated concentration of 14-3-3 protein in CSF (a non-specific marker of neurodegeneration), EEG, and MRI findings. Thus, the laboratory may be required to process and send CSF samples to a prion surveillance center for 14-3-3 testing, as well as blood samples for sequencing of the PRNP gene (in inherited cases).

Processing Biofluids

Laboratories should follow standard protective measures when working with biofluids potentially containing abnormally folded prions, such as donning standard personal protective equipment (PPE); avoiding or minimizing the use of sharps; using single-use disposable items; and processing specimens to minimize formation of aerosols and droplets. An additional safety consideration is the use of single-use disposable PPE; otherwise, re-usable items must be either cleaned using prion-specific decontamination methods, or destroyed.

Blood. In experimental models, infectivity has been detected in the blood; however, there have been no cases of secondary transmission of classical CJD via blood product transfusions in humans. As such, blood has been classified, on epidemiological evidence by the World Health Organization (WHO), as containing “no detectable infectivity,” which means it can be processed by routine methods. Similarly, except for CSF, all other body fluids contain no infectivity and can be processed following standard procedures.

In contrast to classic CJD, there have been four cases of suspected secondary transmission of variant CJD via transfused blood products in the United Kingdom. Variant CJD, the prion disease associated with mad cow disease, is unique in its distribution of prion aggregates outside of the central nervous system, including the lymph nodes, spleen, and tonsils. For regions where variant CJD is a concern, laboratories should consult their regulatory agencies for further guidance.

CSF. Relative to highly infectious tissues of the brain, spinal cord, and eye, infectivity has been identified less often in CSF and is considered to have “low infectivity,” along with kidney, liver, and lung tissue. Since CSF can contain infectious material, WHO has recommended that analyses not be performed on automated equipment due to challenges associated with decontamination. Laboratories should perform a risk assessment of their CSF processes, and, if deemed necessary, consider using manual methods as an alternative to automated systems.

Decontamination

The infectious agent in prion disease is unlike any other infectious pathogen encountered in the laboratory; it is formed of misfolded and aggregated prion proteins. This aggregated proteinaceous material forms the infectious unit, which is incredibly resilient to degradation. Moreover, in vitro studies have demonstrated that disrupting large aggregates into smaller aggregates increases cytotoxicity. Thus, if the aim is to abolish infectivity, all aggregates must be destroyed. Disinfectant procedures used for viral, bacterial, and fungal pathogens, such as alcohol, boiling, formalin, dry heat (<300°C), autoclaving at 121°C for 15 minutes, and ionizing, ultraviolet, or microwave radiation, are either ineffective or variably effective against aggregated prions.

The only means to ensure no risk of residual infectious prions is to use disposable materials. This is not always practical, as, for instance, a biosafety cabinet cannot be discarded if there is a CSF spill in the hood. Fortunately, there are several protocols considered sufficient for decontamination. For surfaces and heat-sensitive instruments, such as a biosafety cabinet, WHO recommends flooding the surface with 2N NaOH or undiluted NaClO, letting stand for 1 hour, mopping up, and rinsing with water. If the surface cannot tolerate NaOH or NaClO, thorough cleaning will remove most infectivity by dilution. Laboratories may derive some additional benefit by using one of the partially effective methods discussed previously. Non-disposable heat-resistant items preferably should be immersed in 1N NaOH, heated in a gravity displacement autoclave at 121°C for 30 min, cleaned and rinsed in water, then sterilized by routine methods. WHO has outlined several alternate decontamination methods. Using disposable cover sheets is one simple solution to avoid contaminating work surfaces and associated lengthy decontamination procedures.

With standard PPE—augmented by a few additional safety measures and prion-specific decontamination procedures—laboratories can safely manage biofluid testing in cases of prion disease.

 

The Microscopic World Inside Us  

Emerging Research Points to Microbiome’s Role in Health and Disease

Thousands of species of microbes—bacteria, viruses, fungi, and protozoa—inhabit every internal and external surface of the human body. Collectively, these microbes, known as the microbiome, outnumber the body’s human cells by about 10 to 1 and include more than 1,000 species of microorganisms and several million genes residing in the skin, the respiratory system, and the urogenital and gastrointestinal tracts. The microbiome’s complicated relationship with its human host is increasingly considered so crucial to health that researchers sometimes call it “the forgotten organ.”

Disturbances to the microbiome can arise from nutritional deficiencies, antibiotic use, and antiseptic modern life. Imbalances in the microbiome’s diverse microbial communities, which interact constantly with cells in the human body, may contribute to chronic health conditions, including diabetes, asthma and allergies, obesity and the metabolic syndrome, digestive disorders including irritable bowel syndrome (IBS), and autoimmune disorders like multiple sclerosis and rheumatoid arthritis, research shows.

While study of the microbiome is a growing research enterprise that has attracted enthusiastic media attention and venture capital, its findings are largely preliminary. But some laboratorians are already developing a greater appreciation for the microbiome’s contributions to human biochemistry and are considering a future in which they expect to measure changes in the microbiome to monitor disease and inform clinical practice.

Pivot Toward the Microbiome

Following the National Institutes of Health (NIH) Human Genome Project, many scientists noted the considerable genetic signal from microbes in the body and the existence of technology to analyze these microorganisms. That realization led NIH to establish the Human Microbiome Project in 2007, said Lita Proctor, PhD, its program director. In the project’s first phase, researchers studied healthy adults to produce a reference set of microbiomes and a resource of metagenomic sequences of bacteria in the airways, skin, oral cavities, and the gastrointestinal and vaginal tracts, plus a catalog of microbial genome sequences of reference strains. Researchers also evaluated specific diseases associated with disturbances in the microbiome, including gastrointestinal diseases such as Crohn’s disease, ulcerative colitis, IBS, and obesity, as well as urogenital conditions, those that involve the reproductive system, and skin diseases like eczema, psoriasis, and acne.

Phase 1 studies determined the composition of many parts of the microbiome, but did not define how that composition affects health or specific disease. The project’s second phase aims to “answer the question of what microbes actually do,” explained Proctor. Researchers are now examining properties of the microbiome including gene expression, protein, and human and microbial metabolite profiles in studies of pregnant women at risk for preterm birth, the gut hormones of patients at risk for IBS, and nasal microbiomes of patients at risk for type 2 diabetes.

Promising Lines of Research

Cystic fibrosis and microbiology investigator Michael Surette, PhD, sees promising microbiome research not just in terms of evidence of its effects on specific diseases, but also in what drives changes in the microbiome. Surette is Canada research chair in interdisciplinary microbiome research in the Farncombe Family Digestive Health Research Institute at McMaster University in Hamilton, Ontario.

One type of study on factors driving microbiome change examines how alterations in composition and imbalances in individual patients relate to improving or worsening disease. “IBS, cystic fibrosis, and chronic obstructive pulmonary disease all have periods of instability or exacerbation,” he noted. Surette hopes that one day, tests will provide clinicians the ability to monitor changes in microbial composition over time and even predict when a patient’s condition is about to deteriorate. Monitoring perturbations to the gut microbiome might also help minimize collateral damage to the microbiome during aggressive antibiotic therapy for hospitalized patients, he added.

Monitoring changes to the microbiome also might be helpful for “culture negative” patients, who now may receive multiple, unsuccessful courses of different antibiotics that drive antibiotic resistance. Frustration with standard clinical microbiology diagnosis of lung infections in cystic fibrosis patients first sparked Surette’s investigations into the microbiome. He hopes that future tests involving the microbiome might also help asthma patients with neutrophilia, community-acquired pneumonia patients who harbor complex microbial lung communities lacking obvious pathogens, and hospitalized patients with pneumonia or sepsis. He envisions microbiome testing that would look for short-term changes indicating whether or not a drug is effective.

Companion Diagnostics

Daniel Peterson, MD, PhD, an assistant professor of pathology at Johns Hopkins University School of Medicine in Baltimore, believes the future of clinical testing involving the microbiome lies in companion diagnostics for novel treatments, and points to companies that are already developing and marketing tests that will require such assays.

Examples of microbiome-focused enterprises abound, including Genetic Analysis, based in Oslo, Norway, with its high-throughput test that uses 54 probes targeted to specific bacteria to measure intestinal gut flora imbalances in inflammatory bowel disease and irritable bowel syndrome patients. Paris, France-based Enterome is developing both novel drugs and companion diagnostics for microbiome-related diseases such as IBS and some metabolic diseases. Second Genome, based in South San Francisco, has developed an experimental drug, SGM-1019, that the company says blocks damaging activity of the microbiome in the intestine. Cambridge, Massachusetts-based Seres Therapeutics has received Food and Drug Administration orphan drug designation for SER-109, an oral therapeutic intended to correct microbial imbalances to prevent recurrent Clostridium difficile infection in adults.

One promising clinical use of the microbiome is fecal transplantation, which both prospective and retrospective studies have shown to be effective in patients with C. difficile infections who do not respond to front-line therapies, said James Versalovic, MD, PhD, director of Texas Children’s Hospital Microbiome Center and professor of pathology at Baylor College of Medicine in Houston. “Fecal transplants and other microbiome replacement strategies can radically change the composition of the microbiome in hours to days,” he explained.

But NIH’s Proctor discourages too much enthusiasm about fecal transplant. “Natural products like stool can have [side] effects,” she pointed out. “The [microbiome research] field needs to mature and we need to verify outcomes before anything becomes routine.”

Hurdles for Lab Testing

While he is hopeful that labs someday will use the microbiome to produce clinically useful information, Surette pointed to several problems that must be solved beforehand. First, molecular methods commonly used right now should be more quantitative and accurate. Additionally, research on the microbiome encompasses a wide variety of protocols, some of which are better at extracting particular types of bacteria and therefore can give biased views of communities living in the body. Also, tests may need to distinguish between dead and live microbes. Another hurdle is that labs using varied bioinformatic methods may produce different results from the same sample, a problem that Surette sees as ripe for a solution from clinical laboratorians, who have expertise in standardizing robust protocols and in automating tests.

One way laboratorians can prepare for future, routine microbiome testing is to expand their notion of clinical chemistry to include both microbial and human biochemistry. “The line between microbiome science and clinical science is blurring,” said Versalovic. “When developing future assays to detect biochemical changes in disease states, we must consider the contributions of microbial metabolites and proteins and how to tailor tests to detect them.” In the future, clinical labs may test for uniquely microbial metabolites in various disease states, he predicted.

 

Automated Review of Mass Spectrometry Results  

Can We Achieve Autoverification?

Author: Katherine Alexander and Andrea R. Terrell, PhD  // Date: NOV.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/november/automated-review-of-mass-spectrometry-results-can-we-achieve-autoverification

 

Paralleling the upswing in prescription drug misuse, clinical laboratories are receiving more requests for mass spectrometry (MS) testing as physicians rely on its specificity to monitor patient compliance with prescription regimens. However, as volume has increased, reimbursement has declined, forcing toxicology laboratories both to increase capacity and lower their operational costs—without sacrificing quality or turnaround time. Now, new solutions are available that enable laboratories to bring automation to MS testing and help them meet the growing demand for toxicology and other testing.

What is the typical MS workflow?

A typical workflow includes a long list of manual steps. By the time a sample is loaded onto the mass spectrometer, it has been collected, logged into the lab information management system (LIMS), and prepared for analysis using a variety of wet chemistry techniques.

Most commercial clinical laboratories receive enough samples for MS analysis to batch analyze those samples. A batch consists of a calibrator(s), quality control (QC) samples, and patient/donor samples. Historically, the method would be selected (e.g., “analysis of opiates”), sample identification information would be entered manually into the MS software, and the instrument would begin analyzing each sample. Upon successful completion of the batch, the MS operator would view all of the analytical data, ensure the QC results were acceptable, and review each patient/donor specimen, looking at characteristics such as peak shape, ion ratios, retention time, and calculated concentration.

The operator would then post acceptable results into the LIMS manually or through an interface, and unacceptable results would be rescheduled or dealt with according to lab-specific protocols. In our laboratory we perform a final certification step for quality assurance by reviewing all information about the batch again, prior to releasing results for final reporting through the LIMS.
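The review-and-post step described above amounts to a simple triage: a failed QC run invalidates the batch, and otherwise each specimen is posted or rescheduled on its per-sample checks. The sketch below is purely illustrative; the field names, acceptance criteria, and two-list output are assumptions for the example, since real LIMS interfaces and lab protocols are vendor- and lab-specific.

```python
def triage_batch(qc_ok, samples):
    """Split a reviewed batch into results to post to the LIMS and
    results to reschedule per lab-specific protocols."""
    if not qc_ok:
        # Unacceptable QC invalidates the entire batch: nothing posts,
        # every specimen is rescheduled.
        return [], [s["id"] for s in samples]
    post, reschedule = [], []
    for s in samples:
        # Per-sample review criteria from the text: peak shape,
        # ion ratios, and retention time (flags assumed precomputed).
        acceptable = (
            s["peak_shape_ok"]
            and s["ion_ratio_ok"]
            and s["retention_time_ok"]
        )
        (post if acceptable else reschedule).append(s["id"])
    return post, reschedule

posted, redo = triage_batch(
    qc_ok=True,
    samples=[
        {"id": "S1", "peak_shape_ok": True, "ion_ratio_ok": True, "retention_time_ok": True},
        {"id": "S2", "peak_shape_ok": True, "ion_ratio_ok": False, "retention_time_ok": True},
    ],
)
```

In this toy run, S1 posts and S2 is flagged for rework, mirroring the manual split between acceptable results and those handled under lab-specific protocols.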

What problems are associated with this workflow?

The workflow described above results in too many highly trained chemists performing manual data entry and reviewing perfectly acceptable analytical results. Lab managers would prefer that MS operators and certifying scientists focus on troubleshooting problem samples rather than reviewing mounds of good data. Not only is the current process inefficient, it is mundane work prone to user errors. This risks fatigue, disengagement, and complacency by our highly skilled scientists.

Importantly, manual processes also take time. In most clinical lab environments, turnaround time is critical for patient care and industry competitiveness. Lab directors and managers are looking for solutions to automate mundane, error-prone tasks to save time and costs, reduce staff burnout, and maintain high levels of quality.

How can software automate data transfer from MS systems to LIMS?

Automation is not a new concept in the clinical lab. Labs have automated processes in shipping and receiving, sample preparation, liquid handling, and data delivery to the end user. As more labs implement MS, companies have begun to develop special software to automate data analysis and review workflows.

In July 2011, AIT Labs incorporated ASCENT into our workflow, eliminating the initial manual peak review step. ASCENT is an algorithm-based peak picking and data review system designed specifically for chromatographic data. The software employs robust statistical and modeling approaches to the raw instrument data to present the true signal, which often can be obscured by noise or matrix components.

The system also uses an exponentially modified Gaussian (EMG) equation to apply a best-fit model to integrated peaks through what is often a noisy signal. In our experience, applying the EMG yields cleaner data from what might appear to be poor chromatography and ultimately allows us to reduce the number of samples we might otherwise rerun.
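The EMG referred to above is a standard model for tailed chromatographic peaks: a Gaussian convolved with an exponential decay. A minimal sketch of the function, using one common closed form (this is the textbook EMG, not ASCENT's proprietary implementation, and the peak parameters are invented for illustration):

```python
import math

def emg(t, mu, sigma, lam):
    """Exponentially modified Gaussian: a Gaussian peak (center mu,
    width sigma) convolved with an exponential decay of rate lam = 1/tau,
    the classic model for a tailing chromatographic peak."""
    arg = (lam / 2.0) * (2.0 * mu + lam * sigma ** 2 - 2.0 * t)
    return (lam / 2.0) * math.exp(arg) * math.erfc(
        (mu + lam * sigma ** 2 - t) / (math.sqrt(2.0) * sigma)
    )

# Evaluate a model peak at 0.1-min intervals: retention time 5.0 min,
# Gaussian width 0.1 min, exponential tail tau = 0.5 min (lam = 2.0).
peak = [emg(t / 10.0, mu=5.0, sigma=0.1, lam=2.0) for t in range(0, 100)]
apex = max(peak)
```

Because the exponential tail shifts mass to later times, the fitted apex lands slightly after the Gaussian center, which is exactly the asymmetry the best-fit model has to capture in noisy, tailed signals.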

How do you validate the quality of results?

We’ve developed a robust validation protocol to ensure that results are, at minimum, equivalent to results from our manual review. We begin by building the assay in ASCENT, entering assay-specific information from our internal standard operating procedure (SOP). Once the assay is configured, validation proceeds with parallel batch processing to compare results between software-reviewed data and staff-reviewed data. For new implementations we run eight to nine batches of 30–40 samples each; when we are modifying or upgrading an existing implementation we run a smaller number of batches. The parallel batches should contain multiple positive and negative results for all analytes in the method, preferably spanning the analytical measurement range of the assay.

The next step is to compare the results and calculate the percent difference between the data review methods. We require that two-thirds of the automated results fall within 20% of the manually reviewed result. In addition to validating patient sample correlation, we also test numerous quality assurance rules that should initiate a flag for further review.
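The acceptance rule in the preceding paragraph is easy to express directly. The sketch below assumes paired manual and automated results and treats paired zeros (e.g., negative samples) as agreement; the function name and that zero-handling choice are assumptions for the example, not part of the published protocol.

```python
def validation_passes(manual, automated, tolerance=0.20, required_fraction=2 / 3):
    """Parallel-batch acceptance rule: at least two-thirds of automated
    results must fall within 20% of the manually reviewed result."""
    within = 0
    for m, a in zip(manual, automated):
        if m == 0:
            # Assumed convention: paired zero results count as agreement.
            within += a == 0
        else:
            # Percent difference relative to the manually reviewed value.
            within += abs(a - m) / abs(m) <= tolerance
    return within / len(manual) >= required_fraction

# Two of three results agree within 20%, so the comparison passes.
ok = validation_passes([10.0, 20.0, 30.0], [11.0, 19.0, 45.0])
```

The same loop generalizes to the quality-assurance flags mentioned above: each rule that should trigger further review can be asserted against the parallel batches in the same pass.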

What are the biggest challenges during implementation and continual improvement initiatives?

On the technological side, our largest hurdle was loading the sequence files into ASCENT. We had created an in-house mechanism for our chemists to upload the 96-well plate map for their batch into the MS software. We had some difficulty transferring this information to ASCENT, but once we resolved this issue, the technical workflow proceeded fairly smoothly.

The greater challenge was changing our employees’ mindset from one of fear that automation would displace them, to a realization that learning this new technology would actually make them more valuable. Automating a non-mechanical process can be a difficult concept for hands-on scientists, so managers must be patient and help their employees understand that this kind of technology leverages the best attributes of software and people to create a powerful partnership.

We recommend that labs considering automated data analysis engage staff in the validation and implementation to spread the workload and the knowledge. As is true with most technology, it is best not to rely on just one or two super users. We also found it critical to add supervisor level controls on data file manipulation, such as removing a sample that wasn’t run from the sequence table. This can prevent inadvertent deletion of a file, requiring reinjection of the entire batch!

 

Understanding Fibroblast Growth Factor 23

Author: Damien Gruson, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/understanding-fibroblast-growth-factor-23

What is the relationship of FGF-23 to heart failure?

Heart failure (HF) is an increasingly common syndrome associated with high morbidity, elevated hospital readmission rates, and high mortality. Improving diagnosis, prognosis, and treatment of HF requires a better understanding of its different sub-phenotypes. As researchers gained a comprehensive understanding of neurohormonal activation—one of the hallmarks of HF—they discovered several biomarkers, including natriuretic peptides, which now are playing an important role in sub-phenotyping HF and in driving more personalized management of this chronic condition.

Like the natriuretic peptides, fibroblast growth factor 23 (FGF-23) could become important in risk-stratifying and managing HF patients. Produced by osteocytes, FGF-23 is a key regulator of phosphorus homeostasis. It binds to renal and parathyroid FGF-Klotho receptor heterodimers, resulting in phosphate excretion, decreased 1-α-hydroxylation of 25-hydroxyvitamin D, and decreased parathyroid hormone (PTH) secretion. The relationship to PTH is important because impaired homeostasis of cations and decreased glomerular filtration rate might contribute to the rise of FGF-23. The amino-terminal portion of FGF-23 (amino acids 1-24) serves as a signal peptide allowing secretion into the blood, and the carboxyl-terminal portion (aa 180-251) participates in its biological action.

How might FGF-23 improve HF risk assessment?

Studies have shown that FGF-23 is related to the risk of cardiovascular diseases and mortality. It was first demonstrated that FGF-23 levels were independently associated with left ventricular mass index and hypertrophy as well as mortality in patients with chronic kidney disease (CKD). FGF-23 also has been associated with left ventricular dysfunction and atrial fibrillation in coronary artery disease subjects, even in the absence of impaired renal function.

FGF-23 and FGF receptors are both expressed in the myocardium. It is possible that FGF-23 has direct effects on the heart and participates in the pathophysiology of cardiovascular diseases and HF. Experiments have shown that FGF-23 stimulates pathological hypertrophy in cultured rat cardiomyocytes by activating the calcineurin-NFAT pathway, and that intramyocardial or intravenous injection of FGF-23 in wild-type mice resulted in left ventricular hypertrophy. As such, FGF-23 appears to be a potential stimulus of myocardial hypertrophy, and increased levels may contribute to the worsening of heart failure and long-term cardiovascular death.

Researchers have documented that HF patients have elevated FGF-23 circulating levels. They have also found a significant correlation between plasma levels of FGF-23 and B-type natriuretic peptide, a biomarker related to ventricular stretch and cardiac hypertrophy, in patients with left ventricular hypertrophy. As such, measuring FGF-23 levels might be a useful tool to predict long-term adverse cardiovascular events in HF patients.

Interestingly, researchers have documented a significant relationship between FGF-23 and PTH in both CKD and HF patients. As PTH stimulates FGF-23 expression, it could be that in HF patients, increased PTH levels increase the bone expression of FGF-23, which enhances its effects on the heart.

 

The Past, Present, and Future of Western Blotting in the Clinical Laboratory

Author: Curtis Balmer, PhD  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/the-past-present-and-future-of-western-blotting-in-the-clinical-laboratory

Much of the discussion about Western blotting centers around its performance as a biological research tool. This isn’t surprising. Since its introduction in the late 1970s, the Western blot has been adopted by biology labs of virtually every stripe, and become one of the most widely used techniques in the research armamentarium. However, Western blotting has also been employed in clinical laboratories to aid in the diagnosis of various diseases and disorders—an equally important and valuable application. Yet there has been relatively little discussion of its use in this context, or of how advances in Western blotting might affect its future clinical use.

Highlighting the clinical value of Western blotting, Stanley Naides, MD, medical director of Immunology at Quest Diagnostics observed that, “Western blotting has been a very powerful tool in the laboratory and for clinical diagnosis. It’s one of many various methods that the laboratorian brings to aid the clinician in the diagnosis of disease, and the selection and monitoring of therapy.” Indeed, Western blotting has been used at one time or the other to aid in the diagnosis of infectious diseases including hepatitis C (HCV), HIV, Lyme disease, and syphilis, as well as autoimmune disorders such as paraneoplastic disease and myositis conditions.

However, Naides was quick to point out that the choice of assays to use clinically is based on their demonstrated sensitivity and performance, and that the search for something better is never-ending. “We’re constantly looking for methods that improve detection of our target [protein],” Naides said. “There have been a number of instances where we’ve moved away from Western blotting because another method proves to be more sensitive.” But this search can also lead back to Western blotting. “We’ve gone away from other methods because there’s been a Western blot that’s been developed that’s more sensitive and specific. There’s that constant movement between methods as new tests are developed.”

In recent years, this quest has been leading clinical laboratories away from Western blotting toward more sensitive and specific diagnostic assays, at least for some diseases. Using confirmatory diagnosis of HCV infection as an example, Sai Patibandla, PhD, director of the immunoassay group at Siemens Healthcare Diagnostics, explained that movement away from Western blotting for confirmatory diagnosis of HCV infection began with a technical modification called Recombinant Immunoblotting Assay (RIBA). RIBA streamlines the conventional Western blot protocol by spotting recombinant antigen onto strips which are used to screen patient samples for antibodies against HCV. This approach eliminates the need to separate proteins and transfer them onto a membrane.

The RIBA HCV assay was initially manufactured by Chiron Corporation (acquired by Novartis Vaccines and Diagnostics in 2006). It received Food and Drug Administration (FDA) approval in 1999, and was marketed as Chiron RIBA HCV 3.0 Strip Immunoblot Assay. Patibandla explained that, at the time, the Chiron assay “…was the only FDA-approved confirmatory testing for HCV.” In 2013 the assay was discontinued and withdrawn from the market due to reports that it was producing false-positive results.

Since then, clinical laboratories have continued to move away from Western blot-based assays for confirmation of HCV in favor of the more sensitive technique of nucleic acid testing (NAT). “The migration is toward NAT for confirmation of HCV [diagnosis]. We don’t use immunoblots anymore. We don’t even have a blot now to confirm HCV,” Patibandla said.

Confirming HIV infection has followed a similar path. Indeed, in 2014 the Centers for Disease Control and Prevention issued updated recommendations for HIV testing that, in part, replaced Western blotting with NAT. This change was in response to the recognition that the HIV-1 Western blot assay was producing false-negative or indeterminate results early in the course of HIV infection.

At this juncture it is difficult to predict if this trend away from Western blotting in clinical laboratories will continue. One thing that is certain, however, is that clinicians and laboratorians are infinitely pragmatic, and will eagerly replace current techniques with ones shown to be more sensitive, specific, and effective. This raises the question of whether any of the many efforts currently underway to improve Western blotting will produce an assay that exceeds the sensitivity of currently employed techniques such as NAT.

Some of the most exciting and groundbreaking work in this area is being done by Amy Herr, PhD, a professor of bioengineering at University of California, Berkeley. Herr’s group has taken on some of the most challenging limitations of Western blotting, and is developing techniques that could revolutionize the assay. For example, the Western blot is semi-quantitative at best. This weakness dramatically limits the types of answers it can provide about changes in protein concentrations under various conditions.

To make Western blotting more quantitative, Herr’s group is, among other things, identifying losses of protein sample mass during the assay protocol. About this, Herr explains that the conventional Western blot is an “open system” that involves lots of handling of assay materials, buffers, and reagents that makes it difficult to account for protein losses. Or, as Kevin Lowitz, a senior product manager at Thermo Fisher Scientific, described it, “Western blot is a [simple] technique, but a really laborious one, and there are just so many steps and so many opportunities to mess it up.”

Herr’s approach is to reduce the open aspects of Western blot. “We’ve been developing these more closed systems that allow us at each stage of the assay to account for [protein mass] losses. We can’t do this exactly for every target of interest, but it gives us a really good handle [on protein mass losses],” she said. One of the major mechanisms Herr’s lab is using to accomplish this is to secure proteins to the blot matrix with covalent bonding rather than with the much weaker hydrophobic interactions that typically keep the proteins in place on the membrane.

Herr’s group also has been developing microfluidic platforms that allow Western blotting to be done on single cells, “In our system we’re doing thousands of independent Westerns on single cells in four hours. And, hopefully, we’ll cut that down to one hour over the next couple years.”

Other exciting modifications that stand to dramatically increase the sensitivity, quantitation, and throughput of Western blotting also are being developed and explored. For example, the use of capillary electrophoresis—in which proteins are conveyed through a small electrolyte-filled tube and separated according to size and charge before being deposited onto a blotting membrane—dramatically reduces the amount of protein required for Western blot analysis, and thereby allows Westerns to be run on proteins from rare cells or from samples in extremely limited quantities.

Jillian Silva, PhD, an associate specialist at the University of California, San Francisco Helen Diller Family Comprehensive Cancer Center, explained that advances in detection are also extending the capabilities of Western blotting. “With the advent of fluorescence detection we have a way to quantitate Westerns, and it is now more quantitative than it’s ever been,” said Silva.

Whether or not these advances produce an assay that is adopted by clinical laboratories remains to be seen. The emphasis on Western blotting as a research rather than a clinical tool may bias advances in favor of the needs and priorities of researchers rather than clinicians, and as Patibandla pointed out, “In the research world Western blotting has a certain purpose. [Researchers] are always coming up with new things, and are trying to nail down new proteins, so you cannot take Western blotting away.” In contrast, she suggested that for now, clinical uses of Western blotting remain “limited.”

 

Adapting Next Generation Technologies to Clinical Molecular Oncology Service

Author: Ronald Carter, PhD, DVM  // Date: OCT.1.2015  // Source: Clinical Laboratory News

https://www.aacc.org/publications/cln/articles/2015/october/adapting-next-generation-technologies-to-clinical-molecular-oncology-service

Next generation technologies (NGT) deliver huge improvements in cost efficiency, accuracy, robustness, and in the amount of information they provide. Microarrays, high-throughput sequencing platforms, digital droplet PCR, and other technologies all offer unique combinations of desirable performance.

As stronger evidence of genetic testing’s clinical utility influences patterns of patient care, demand for NGT testing is increasing. This presents several challenges to clinical laboratories, including increased urgency, clinical importance, and breadth of application in molecular oncology, as well as more integration of genetic tests into synoptic reporting. Laboratories need to add NGT-based protocols while still providing old tests, and the pace of change is increasing. What follows is one viewpoint on the major challenges in adopting NGTs into diagnostic molecular oncology service.

Choosing a Platform

Instrument selection is a critical decision that has to align with intended test applications, sequencing chemistries, and analytical software. Although multiple platforms are available, a mainstream standard has not emerged. Depending on their goals, laboratories might set up NGTs for improved accuracy of mutation detection, massively higher sequencing capacity per test, massively more targets combined in one test (multiplexing), greater range in sequencing read length, much lower cost per base pair assessed, and economy of specimen volume.

When high-throughput instruments first made their appearance, laboratories paid more attention to the accuracy of base-reading: Less accurate sequencing meant more data cleaning and resequencing (1). Now, new instrument designs have narrowed the differences, and test chemistry can have a comparatively large impact on analytical accuracy (Figure 1). The robustness of technical performance can also vary significantly depending upon specimen type. For example, Life Technologies’ sequencing platforms appear to be comparatively more tolerant of low DNA quality and concentration, which is an important consideration for fixed and processed tissues.

https://www.aacc.org/~/media/images/cln/articles/2015/october/carter_fig1_cln_oct15_ed.jpg

Figure 1 Comparison of Sequencing Chemistries

Sequence pile-ups of the same target sequence (2 large genes), all performed on the same analytical instrument. Results from 4 different chemistries, as designed and supplied by reagent manufacturers prior to optimization in the laboratory. Red lines represent limits of exons. Height of blue columns proportional to depth of coverage. In this case, the intent of the test design was to provide high depth of coverage so that reflex Sanger sequencing would not be necessary. Courtesy B. Sadikovic, U. of Western Ontario.

 

In addition, batching, robotics, workload volume patterns, maintenance contracts, software licenses, and platform lifetime affect the cost per analyte and per specimen considerably. Royalties and reagent contracts also factor into the cost of operating NGT: In some applications, fees for intellectual property can represent more than 50% of the bench cost of performing a given test, and increase substantially without warning.

Laboratories must also deal with the problem of obsolescence. Investing in a new platform brings the angst of knowing that better machines and chemistries are just around the corner. Laboratories are buying bigger pieces of equipment with shorter service lives. Before NGTs, major instruments could confidently be expected to remain current for at least 6 to 8 years. Now, a major instrument is obsolete much sooner, often within 2 to 3 years. This means that keeping it in service might cost more than investing in a new platform. Lease-purchase arrangements help mitigate year-to-year fluctuations in capital equipment costs, and maximize the value of old equipment at resale.

One Size Still Does Not Fit All

Laboratories face numerous technical considerations to optimize sequencing protocols, but the test has to be matched to the performance criteria needed for the clinical indication (2). For example, measuring response to treatment depends first upon the diagnostic recognition of mutation(s) in the tumor clone; the marker(s) then have to be quantifiable and indicative of tumor volume throughout the course of disease (Table 1).

As a result, diagnostic tests need to cover many different potential mutations, yet accurately identify any clinically relevant mutations actually present. On the other hand, tests for residual disease need to provide standardized, sensitive, and accurate quantification of a selected marker mutation against the normal background. A diagnostic panel might need 1% to 3% sensitivity across many different mutations. But quantifying early response to induction, and later assessing minimal residual disease, needs a test that is reliably accurate in the 10⁻⁴ to 10⁻⁵ range for a specific analyte.
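The gap between these two requirements can be made concrete with a rough Poisson sketch (illustrative only; the thresholds, step sizes, and function names here are assumptions, not from the article) of the read depth needed to observe at least a few variant-supporting reads at a given allele fraction:

```python
import math

def detection_probability(depth, vaf, min_reads=3):
    # P(seeing >= min_reads variant reads) when the true variant
    # allele fraction is `vaf`, using a Poisson approximation to
    # the binomial (reasonable because vaf << 1).
    lam = depth * vaf
    p_below = sum(math.exp(-lam) * lam**k / math.factorial(k)
                  for k in range(min_reads))
    return 1.0 - p_below

def depth_for_detection(vaf, target_prob=0.95, min_reads=3):
    # Smallest depth (searched in steps of 100 reads) at which the
    # detection probability reaches target_prob.
    depth = 100
    while detection_probability(depth, vaf, min_reads) < target_prob:
        depth += 100
    return depth

# A 1% variant is visible at depths typical of diagnostic panels,
# but a 1-in-100,000 residual-disease target needs orders of
# magnitude more sequencing of the same locus.
print(depth_for_detection(0.01))   # on the order of hundreds of reads
print(depth_for_detection(1e-5))   # on the order of hundreds of thousands
```

The exact numbers depend on the error model and the minimum read count demanded, but the roughly 1/vaf scaling is why residual-disease assays are designed so differently from diagnostic panels.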

Covering all types of mutations in one diagnostic test is not yet possible. For example, subtyping of acute myeloid leukemia is both old school (karyotype, fluorescent in situ hybridization, and/or PCR-based or array-based testing for fusion rearrangements, deletions, and segmental gains) and new school (NGT-based panel testing for molecular mutations).

Chemistries that cover both structural variants and copy number variants are not yet in general use, but the advantages of NGTs compared to traditional methods are becoming clearer, such as in colorectal cancer (3). Researchers are also using cell-free DNA (cfDNA) to quantify residual disease and detect resistance mutations (4). Once a clinically significant clone is identified, enrichment techniques help enable extremely sensitive quantification of residual disease (5).

Validation and Quality Assurance

Beyond choosing a platform, two distinct challenges arise in bringing NGTs into the lab. The first is assembling the resources for validation and quality assurance. The second is keeping tests up-to-date as new analytes are needed. Even if a given test chemistry has the flexibility to add analytes without revalidating the entire panel, keeping up with clinical advances is a constant priority.

Due to their throughput and multiplexing capacities, NGT platforms typically require considerable upfront investment to adopt, and training staff to perform testing takes even more time. Proper validation is harder to document: Assembling positive controls, documenting test performance criteria, developing quality assurance protocols, and conducting proficiency testing are all demanding. Labs meet these challenges in different ways. Laboratory-developed tests (LDTs) allow self-determined choice in design, innovation, and control of the test protocol, but can be very expensive to set up.

Food and Drug Administration (FDA)-approved methods are attractive but not always an option. More FDA-approved methods will be marketed, but FDA approval itself brings other trade-offs. There is a cost premium compared to LDTs, and the test methodologies are locked down and not modifiable. This is particularly frustrating for NGTs, which have the specific attraction of extensive multiplexing capacity and accommodating new analytes.

IT and the Evolution of Molecular Oncology Reporting Standards

The options for information technology (IT) pipelines for NGTs are improving rapidly. At the same time, recent studies still show significant inconsistencies and lack of reproducibility when it comes to interpreting variants in array comparative genomic hybridization, panel testing, tumor expression profiling, and tumor genome sequencing. It can be difficult to duplicate published performances in clinical studies because of a lack of sufficient information about the protocol (chemistry) and software. Building bioinformatics capacity is a key requirement, yet skilled people are in short supply and the qualifications needed to work as a bioinformatician in a clinical service are not yet clearly defined.

Tumor biology brings another level of complexity. Bioinformatic analysis must distinguish tumor-specific variants from germline genomic variants. Sequencing of paired normal tissue is often performed as a control, but virtual normal controls may have intriguing advantages (6). One of the biggest challenges is to reproducibly interpret the clinical significance of interactions between different mutations, even with commonly known, well-defined mutations (7). For multiple-analyte panels, such as predictive testing for breast cancer, only the performance of the whole panel in a population of patients can be compared; individual patients may be scored into different risk categories by different tests, all for the same test indication.

In large-scale sequencing of tumor genomes, which types of mutations are most informative for detecting, quantifying, and predicting the behavior of the tumor over time? The amount and complexity of mutation varies considerably across different tumor types, and while some mutations are more common, stable, and clinically informative than others, the utility of a given tumor marker varies in different clinical situations. And, for a given tumor, treatment effect and metastasis lead to retesting for changes in drug sensitivities.

These complexities mean that IT must be designed into the process from the beginning. Like robotics, IT represents a major ancillary decision. One approach many labs choose is licensed technologies with shared databases that are updated in real time. These are attractive, despite their cost and licensing fees. New tests that incorporate proprietary IT with NGT platforms link the genetic signatures of tumors to clinically significant considerations like tumor classification, recommended methodologies for monitoring response, predicted drug sensitivities, eligible clinical trials, and prognostic classifications. In-house development of such solutions will be difficult, so licensing platforms from commercial partners is more likely to be the norm.

The Commercial Value of Health Records and Test Data

The future of cancer management likely rests on large-scale databases that link hereditary and somatic tumor testing with clinical outcomes. Multiple centers have such large studies underway, and data extraction and analysis is providing increasingly refined interpretations of clinical significance.

Extracting health outcomes to correlate with molecular test results is commercially valuable, as the pharmaceutical, insurance, and healthcare sectors focus on companion diagnostics, precision medicine, and evidence-based health technology assessment. Laboratories that can develop tests based on large-scale integration of test results to clinical utility will have an advantage.

NGTs do offer opportunities for net reductions in the cost of healthcare. But the lag between availability of a test and peer-evaluated demonstration of clinical utility can be considerable. Technical developments arise faster than evidence of clinical utility. For example, immunohistochemistry, estrogen receptor/progesterone receptor status, HER2/neu, and histology are still the major pathological criteria for prognostic evaluation of breast cancer at diagnosis, even though multiple-analyte tumor profiling has been described for more than 15 years. Healthcare systems need a more concerted assessment of clinical utility if they are to take advantage of the promises of NGTs in cancer care.

Disruptive Advances

Without a doubt, “disruptive” is an appropriate buzzword in molecular oncology, and new technical advances are about to change how, where, and for whom testing is performed.

• Predictive Testing

Besides cost per analyte, one of the drivers for taking up new technologies is that they enable multiplexing many more analytes with less biopsy material. Single-analyte sequential testing for epidermal growth factor receptor (EGFR), anaplastic lymphoma kinase, and other targets on small biopsies is not sustainable when many more analytes are needed, and even now, a significant proportion of test requests cannot be completed due to lack of suitable biopsy material. Large panels incorporating all the mutations needed to cover multiple tumor types are replacing individual tests in companion diagnostics.

• Cell-Free Tumor DNA

Challenges of cfDNA include standardizing the collection and processing methodologies, timing sampling to minimize the effect of therapeutic toxicity on analytical accuracy, and identifying the most informative sample (DNA, RNA, or protein). But for more and more tumor types, it will be possible to differentiate benign versus malignant lesions, perform molecular subtyping, predict response, monitor treatment, or screen for early detection—all without a surgical biopsy.

cfDNA technologies can also be integrated into core laboratory instrumentation. For example, blood-based EGFR analysis for lung cancer is being developed on the Roche cobas 4800 platform, which will be a significant change from the current standard of testing based upon single tests of DNA extracted from formalin-fixed, paraffin-embedded sections selected by a pathologist (8).

• Whole Genome and Whole Exome Sequencing

Whole genome and whole exome tumor sequencing approaches provide a wealth of biologically important information, and will replace individual or multiple gene test panels as the technical cost of sequencing declines and interpretive accuracy improves (9). Laboratories can apply informatics selectively or broadly to extract much more information at relatively little increase in cost, and the interpretation of individual analytes will be improved by the context of the whole sequence.

• Minimal Residual Disease Testing

Massive resequencing and enrichment techniques can be used to detect minimal residual disease, and will provide an alternative to flow cytometry as costs decline. The challenge is to develop robust analytical platforms that can reliably produce results in a high proportion of patients with a given tumor type, despite using post-treatment specimens with therapy-induced degradation, and a very low proportion of target (tumor) sequence to benign background sequence.

The tumor markers should remain informative for the burden of disease despite clonal evolution across serial samples taken over the clinical course and treatment. Quantification needs to be accurate and sensitive down to the 10⁻⁵ range, and cost competitive with flow cytometry.
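Depth is not the only constraint on that 10⁻⁵ target: the number of genome copies in the specimen sets a hard floor on sensitivity, since a clone present below one copy in the assayed input can never be seen. A back-of-the-envelope sketch (illustrative; it assumes the common figure of ~3.3 pg of DNA per haploid human genome, and the function names are mine):

```python
PG_PER_HAPLOID_GENOME = 3.3  # ~3.3 pg DNA per haploid human genome copy

def genome_copies(ng_dna):
    # Approximate haploid genome equivalents in ng_dna nanograms of DNA.
    return ng_dna * 1000.0 / PG_PER_HAPLOID_GENOME

def floor_sensitivity(ng_dna):
    # Best-case detectable clone fraction given input mass alone,
    # ignoring sequencing errors and library-preparation losses.
    return 1.0 / genome_copies(ng_dna)

for ng in (10, 100, 1000):
    print(f"{ng:>5} ng input -> sensitivity floor ~{floor_sensitivity(ng):.1e}")
```

By this estimate, reliably reaching the 10⁻⁵ range requires several hundred nanograms of input DNA before assay chemistry even enters the picture, which is one reason post-treatment specimens with therapy-induced degradation are so challenging.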

• Point-of-Care Test Methodologies

Small, rapid, cheap, and single use point-of-care (POC) sequencing devices are coming. Some can multiplex with analytical times as short as 20 minutes. Accurate and timely testing will be possible in places like pharmacies, oncology clinics, patient service centers, and outreach programs. Whether physicians will trust and act on POC results alone, or will require confirmation by traditional laboratory-based testing, remains to be seen. However, in the simplest type of application, such as a patient known to have a particular mutation, the advantages of POC-based testing to quantify residual tumor burden are clear.

Conclusion

Molecular oncology is moving rapidly from an esoteric niche of diagnostics to a mainstream, required component of integrated clinical laboratory services. While NGTs are markedly reducing the cost per analyte and per specimen, and will certainly broaden the scope and volume of testing performed, the resources required to choose, install, and validate these new technologies are daunting for smaller labs. More rapid obsolescence and increased regulatory scrutiny for LDTs also present significant challenges. Aligning test capacity with approved clinical indications will require careful and constant attention to ensure competitiveness.

References

1. Liu L, Li Y, Li S, et al. Comparison of next-generation sequencing systems. J Biomed Biotechnol 2012; doi:10.1155/2012/251364.

2. Brownstein CA, Beggs AH, Homer N, et al. An international effort towards developing standards for best practices in analysis, interpretation and reporting of clinical genome sequencing results in the CLARITY Challenge. Genome Biol 2014;15:R53.

3. Haley L, Tseng LH, Zheng G, et al. Performance characteristics of next-generation sequencing in clinical mutation detection of colorectal cancers. Mod Pathol 2015; doi:10.1038/modpathol.2015.86 [Epub ahead of print].

4. Butler TM, Johnson-Camacho K, Peto M, et al. Exome sequencing of cell-free DNA from metastatic cancer patients identifies clinically actionable mutations distinct from primary disease. PLoS One 2015;10:e0136407.

5. Castellanos-Rizaldos E, Milbury CA, Guha M, et al. COLD-PCR enriches low-level variant DNA sequences and increases the sensitivity of genetic testing. Methods Mol Biol 2014;1102:623–39.

6. Hiltemann S, Jenster G, Trapman J, et al. Discriminating somatic and germline mutations in tumor DNA samples without matching normals. Genome Res 2015;25:1382–90.

7. Lammers PE, Lovly CM, Horn L. A patient with metastatic lung adenocarcinoma harboring concurrent EGFR L858R, EGFR germline T790M, and PIK3CA mutations: The challenge of interpreting results of comprehensive mutational testing in lung cancer. J Natl Compr Canc Netw 2015;12:6–11.

8. Weber B, Meldgaard P, Hager H, et al. Detection of EGFR mutations in plasma and biopsies from non-small cell lung cancer patients by allele-specific PCR assays. BMC Cancer 2014;14:294.

9. Vogelstein B, Papadopoulos N, Velculescu VE, et al. Cancer genome landscapes. Science 2013;339:1546–58.

10. Heitzer E, Auer M, Gasch C, et al. Complex tumor genomes inferred from single circulating tumor cells by array-CGH and next-generation sequencing. Cancer Res 2013;73:2965–75.

11. Healy B. BRCA genes — Bookmaking, fortunetelling, and medical care. N Engl J Med 1997;336:1448–9.

 

 

 



Diarrheas – Bacterial and Nonbacterial

Writer and Curator: Larry H. Bernstein, MD, FCAP 

 

Introduction

Diarrheal diseases are among the most common health problems of societies worldwide. However, the prevalent causes of diarrhea differ depending on location, water quality, food source, age, and psychological stress factors.

Microbial and Parasitic Diseases

A Systematic Review on Neglected Important Protozoan Zoonoses

Yibeltal Muhie Mekonen and Simenew Keskes Melaku
Int. J. Adv. Res. Biol.Sci. 2(1): (2015): 53–65

Infectious protozoan parasites are transmitted to humans through several routes, including contaminated food and water, inadequately treated sewage/sewage products, and livestock and domestic pet handling. Several enteric protozoa cause severe morbidity and mortality in both humans and animals worldwide. In developed settings, enteric protozoa are often ignored as a cause of diarrheal illness due to better hygiene conditions, and as such, very little effort is used toward laboratory diagnosis. Although these protozoa contribute to the high burden of infectious diseases, estimates of their true prevalence are sometimes affected by the lack of sensitive diagnostic techniques to detect them in clinical and environmental specimens. Despite recent advances in the epidemiology, molecular biology, and treatment of protozoan illnesses, gaps in knowledge still exist, requiring further research. There is evidence that climate-related changes will contribute to their burden due to displacement of ecosystems and human and animal populations, increases in atmospheric temperature, flooding and other environmental conditions suitable for transmission, and the need for the reuse of alternative water sources to meet growing population needs. This review discusses the common enteric protozoa from a public health perspective, highlighting their epidemiology, modes of transmission, prevention and control and epidemiological pictures in Ethiopia. It also discusses the potential impact of climate changes on their epidemiology and the issues surrounding waterborne transmission and suggests a multidisciplinary approach to their prevention and control.

Approximately 60 percent of all human pathogens are zoonoses: microbes naturally transmitted between animals and humans. Neglect of their control persists because of a lack of information and awareness about their distribution, a lack of suitable tools and managerial capacity for their diagnosis, and a lack of appropriate and sustainable strategies for their prevention and control. Furthermore, many of the most affected countries have poor or non-existent veterinary public health infrastructures. This situation has marginalized control of zoonoses into the gap between veterinary responsibilities and medical needs, generating a false perception that their burden and impact on society are low. As a result, neither the human and animal health resources nor the research needed for their control are available, spawning a category of neglected zoonotic diseases (Choffnes and Relman, 2011).

The neglected tropical diseases (NTDs) are the most common conditions affecting the poorest 500 million people living in sub-Saharan Africa (SSA), and together produce a burden of disease that may be equivalent to up to one-half of SSA’s malaria disease burden and more than double that caused by tuberculosis (Hotez and Kamath, 2009). Starting with an initial set of 13-15 diseases, more than 40 helminth, protozoal, bacterial, viral, fungal, and ectoparasitic infections are now covered under the brand name of neglected tropical diseases. Gaps in our understanding of the epidemiology and control of many of the neglected tropical diseases remain, which calls for additional funding for innovative research (Jürg et al, 2012).

The health and socioeconomic impacts of zoonotic parasitic and related food-borne diseases are growing continuously and are increasingly felt, most particularly by developing countries. Apart from causing human morbidity and mortality, they hamper agricultural production, decrease the availability of food, and create barriers to international trade (Solaymani-Mohammadi and Petri, 2006). The problem of zoonoses has spread from predominantly restricted rural areas into regional and, in some cases, worldwide epidemics. This is due to the great changes of recent decades, especially increasing urbanization, much of it inadequately planned. In addition, large movements of populations, the opening up of badly needed new areas for food production, the increasing trade in meat, milk, and other products of animal origin, the increasing number and speed of vehicles, and even tourism have contributed to expanding the impact of zoonotic diseases. The challenges of food-borne, waterborne, and zoonotic protozoan diseases associated with climate change are expected to increase, creating a need for active surveillance systems, some of which have already been initiated by several developed countries. However, very little such effort is under way in the developing world, which bears the main burden.

The prevalence rates are generally higher in immunodeficient than in immunocompetent patients. However, most studies on prevalence have been carried out in developed countries, where laboratory and clinical infrastructure are more readily available. Protozoan pathogens and HIV interact in their host, modifying the immunopathology of disease and complicating therapeutic intervention. Disease prevalence and distribution and population movements greatly affect HIV/protozoan parasite co-infections (Andreani et al, 2012).

In Ethiopia there are few reports regarding protozoan zoonoses, yet clinics and hospitals continue to report these diseases as major issues of concern. This review examines published data on the neglected protozoan pathogens in Ethiopia and analyzes their current importance to public health.

Important but Neglected Protozoan Zoonoses Addressed in This Review

  • Amebiasis

This disease is caused by single-celled protozoan parasites of the genus Entamoeba (E. histolytica, E. polecki). Invasive amebiasis is one of the world’s most prevalent and fatal infectious diseases: around 500 million people are infected worldwide, and 75,000 die of the disease annually. Behind malaria and schistosomiasis, amebiasis ranks third on the list of parasitic causes of death worldwide. The infection is common in developing countries and predominantly affects individuals with poor socioeconomic conditions, unhygienic practices, and malnutrition (Stanley, 2003).

Surveys and routine diagnoses in Ethiopia indicate that amebiasis is one of the most widely distributed diseases. In a countrywide survey of amebiasis in 97 communities, the overall prevalence of Entamoeba histolytica infection, as measured by the rate of cyst-passers, was 15.0% in schoolchildren and 3.5% in non-school communities (Erko et al, 1995). A study of children in Legedini, Adada, and Legebira in the Dire-Dawa administrative region found a prevalence of Entamoeba histolytica/dispar of 33.7% (Dawit, 2006, unpublished MSc thesis).

  • Giardiasis

Giardiasis is caused by Giardia lamblia (also known as Giardia duodenalis or G. intestinalis), a unicellular, flagellated intestinal protozoan parasite of humans isolated worldwide and ranked among the top 10 parasites of man (Farthing and Kelly, 2005). Its occurrence is worldwide (Figure 1), and prevalence is very high in areas with poor sanitation and in institutions. Human infections usually originate from other humans but may result from contact with dogs, cats, rodents, beavers, or nonhuman primates. The prevalence of the disease varies from 2% to 5% in developed countries to 20% to 30% in developing countries. The variation in prevalence might be attributed to factors such as the geographical area, the urban or rural setting of the society, the age-group composition, and the socioeconomic conditions of the study subjects.

Figure: Risk of disease caused by Giardia species, with different degrees of risk. Source: Esch and Petersen (2013)

According to a countrywide survey of giardiasis by Birrie and Erko (1995), the overall prevalence among schoolchildren and residents was 8.9% and 3.1%, respectively, and that among non-school children was 4.4%. A recent report indicates that the prevalence of Giardia lamblia among diarrhea patients referred to EHNRI (Ethiopian Health and Nutrition Research Institute) was 8.6%. In a study conducted in southwestern Ethiopia, the prevalence of giardiasis was 13.7%. A study of the prevalence of giardiasis and cryptosporidiosis among children in relation to water sources in selected villages of Pawi Special District in the Benishangul-Gumuz Region, northwestern Ethiopia, found that of the 384 children examined, 102 were positive for giardiasis.
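Prevalence figures like these are simple proportions, and their precision depends on sample size. As an illustrative sketch (not part of the review; the function name is mine), the Pawi finding of 102 positives out of 384 children can be given a Wilson 95% confidence interval:

```python
import math

def wilson_ci(positives, n, z=1.96):
    # Wilson score confidence interval for a binomial proportion;
    # z = 1.96 gives an approximate 95% interval.
    p = positives / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

prevalence = 102 / 384          # the Pawi district finding
low, high = wilson_ci(102, 384)
print(f"prevalence {prevalence:.1%}, 95% CI {low:.1%}-{high:.1%}")
```

This puts the point estimate near 27% with an interval of roughly 22-31%, a useful reminder of how much uncertainty a sample of a few hundred children carries.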

  • Leishmaniasis

Leishmaniasis is an ancient disease caused by protozoans from the Leishmania genus and transmitted by the bite of a sand fly. It has four subtypes of varying severity, which include cutaneous and visceral infections. Cutaneous infection results in the formation of disfiguring lesions which frequently occur on the face, arms and legs. Lesions may last anywhere from a few weeks to over a year; secondary lesions may also occur years after the initial lesion has healed. Visceral cases can result in anemia, fever, debility and death if left untreated.

About 20 species of Leishmania infect mammals and many of them can cause human leishmaniasis. Motile infective forms of the parasite (metacyclic promastigotes with a long free flagellum) develop in the guts of competent sand fly vectors, which inoculate them into mammalian skin. Infections can spread, often via the lymphatic system, to cause secondary dermal lesions with forms and tissue tropisms in humans that show some parasite species specificity. Leishmaniasis can visceralize (for example Leishmania (Leishmania) tropica, which normally causes Oriental sore), but only two species of the subgenus Leishmania routinely do so, and these are the causative agents of most human visceral leishmaniasis (VL) worldwide.

Figure: Global burden of Leishmania, adapted from the “Leishmaniases and Leishmania/HIV co-infection” WHO fact sheet No. 116, May 2000

New World visceral leishmaniasis occurs in Latin America and the southern United States; the visceral form is also common in Asia, Africa, and Europe. Both VL and CL are important endemic vector-borne diseases in Ethiopia. The Federal Ministry of Health (FMoH) estimates the annual burden of VL at between 4,500 and 5,000 cases (FMoH Ethiopia, 2006, unpublished). Known VL endemic foci are in the arid southwest and the Humera and Metema lowlands in the northwest. That about 2-12% of all visceral leishmaniasis cases involve HIV coinfection underlines the synergy between the two diseases; the proportion may reach 40%, as in Humera, northwest Ethiopia (WHO, 2007), where coinfections have increased two-fold in the last decade (Andreani et al, 2012).

  • Cryptosporidiosis

This disease is caused by Cryptosporidium spp. (C. parvum, possibly others). In humans it produces abdominal pain, nausea, and watery diarrhea lasting 3-4 days. In immunodeficient or immunosuppressed people, the disease is severe, with persistent diarrhea (6-25 evacuations per day) and malabsorption of nutrients. In normal persons the disease is self-limiting; in immunocompromised individuals, disease is severe and the case fatality rate may be high. In animals, clinical disease is normally seen only in young neonates; in ruminants, gastroenteritis and diarrhea are common.

  • Toxoplasmosis

Toxoplasmosis is among the major global zoonotic diseases and the third leading cause of food-related deaths in the USA. It is caused by Toxoplasma gondii, an apicomplexan protozoan parasite, with cats as the definitive host. Cats are considered key in the transmission of Toxoplasma gondii to humans because they are the only hosts that excrete the environmentally resistant oocysts in their feces.

Human seroprevalence of Toxoplasma gondii. Esch and Petersen (2013)

The clinical impact of zoonotic enteric protozoan infections is greatest in the developing world, where inadequate sanitation, poor hygiene, and proximity to zoonotic reservoirs, particularly companion animals and livestock, are most pronounced. In such circumstances it is not surprising that infections with more than one species of enteric protozoan are common; indeed, single infections are rare.

Impact of animal disease on human health

The protozoan zoonoses circulating in Ethiopia are a major burden on public health and wellbeing. The magnitude and scope of this burden vary for each of the protozoan parasites discussed in this manuscript. Apart from causing human morbidity and mortality, they hamper agricultural production, decrease the availability of food, and create barriers to international trade. Although these parasitic infections are distributed worldwide, their prevalence is generally higher in developing than in developed countries; yet the relative importance of zoonotic infections, especially in developing countries, has not been studied in detail. These protozoan zoonoses are among the most neglected diseases despite their importance to human and veterinary health; cryptosporidiosis carries the largest share, with giardiasis, toxoplasmosis, leishmaniasis, and amebiasis among the other major protozoan zoonoses.

Clinical Microbiology: Past, Present, and Future

Henry D. Isenberg
J Clin Microbiol, Mar. 2003; 41(3):917–918
http://dx.doi.org/10.1128/JCM.41.3.917-918.2003

During the last two decades of the 19th century, a plethora of bacteria were isolated and designated etiological agents of human infectious diseases. As with many instances at the interface between cause and effective therapy, the further characterization of these alleged pathogens remained in the hands of a few devoted investigators until drugs with therapeutic potential became available. This vague period before the advent of proper cures for infections explains the shadowy origin of clinical or diagnostic microbiology. But, as R. Porter has stated, “history should be rooted in detail and as messy as life itself”; this is an undeniable description of the history of clinical microbiology, long the stepchild, frequently denied legitimacy, among the many siblings that constitute the science of microbiology. Yet the practice of clinical microbiology is the application of knowledge gained to the betterment of the human condition, the goal of clinical microbiologists.

The advances in the grouping and typing of streptococci, salmonellae, and shigellae, the separation of Staphylococcus aureus on the basis of the coagulase reaction, and the growing awareness of the need for safe water and uncontaminated food items established the need for laboratories to assume these responsibilities. It was only logical that microbiology should join endeavors such as chemistry, hematology, and serology under the rubric of clinical pathology. Differential media especially designed to sequester species increased dramatically during World War II; military hospitals developed clinical microbiology sections devoted not only to recognizing agents endangering the health of troops in camps, in battle, and in foreign environments but also to assessing the responses of certain of the microorganisms isolated to several sulfonamides and that hitherto unknown agent, penicillin. The subsequent explosion of antimicrobial agents—streptomycin, chloramphenicol, tetracyclines, and erythromycin—suggested to the reigning powers of medical facilities that clinical microbiologists could be phased out, since infectious disease would disappear before the onslaught of agents discovered through human ingenuity.

In the interim, cotton plugs gave way to Bakelite, polypropylene, glass, metal, and plastic closures; in-house medium preparation was relieved in part by the beginnings of commercially manufactured ready-to-use media especially for mycobacteria and antimicrobial susceptibility testing. Alcohol, Bunsen, and Tyril burners were replaced by microincinerators, eventually followed by disposable loops and transfer needles. The prescient wisdom of hospital boards soon was shattered by the genetic versatility of the microbial world, dramatically demonstrated by the pandemic of S. aureus 80/81 in the late 1950s and early 1960s and the emergence of gram-negative rods that demonstrated the superiority of the bacterial physiology over the commercially prepared secondary microbial metabolites that initially appeared so promising. To be sure, the tug of war between antimicrobial agents—natural and synthetic—and the microorganisms continues unabated, with signs that the evolutionary potential of the microbial world will succeed in the long run.

Since the 1960s, numerous ingenious innovations have been introduced. Molecular biology techniques promise to revolutionize the diagnosis of infectious disease—to date a promise still in its infancy.  Systems approaches began to replace the single test tube with but one substrate. Perhaps the first was double sugar iron agar for the recognition of so-called enteric pathogens, followed by triple sugar iron agar and the next tentative shortcut, the r/b tube. Rollender and Beckford, the inventors of the r/b tube, must be credited with initiating manufacturers’ efforts to teach laboratory staffs the vagaries and problems of new system approaches. Shortly thereafter, the API system was introduced in the United States, bringing a novel numerical approach first to the identification of Enterobacteriaceae (enteric – gut bacteria) and then to that of several other categories of microorganisms. Similarly, the Roche Enterotube used fewer reaction substrates to decrease the time needed to identify isolates to the species level; initially it was used for members of the Enterobacteriaceae and eventually for other microbial representatives. All systems eventually addressed yeasts and nutritionally demanding bacteria, obviating the multiple-tube approaches in use.

Clinical microbiologists are acutely aware of the constantly emerging intruders into the intimate human biosphere. These agents appear as the traditional scourges of humanity are brought under control. But the application of antimicrobial agents to the food chain, cosmetics, and over-the-counter medications, and the advances in medical science, sparing individuals afflicted with a variety of diseases but accompanied by impaired immunity—all these factors have combined to increase nosocomial infections, placing the medical facility at the very apex of the selective-pressure pyramid. The selection results in colonization by microbiota with a minority of antimicrobial-tolerant or -resistant constituents; administration of antimicrobial therapy converts these organisms to a majority. These selected prokaryotes and eukaryotes,
along with the emerging viruses, coccidia, yeasts, and molds, pose a dynamic challenge to the clinical microbiologist and promise a continued need for her or his services. But these challenges must be met by the expansion of technical skills brought to bear on the changing nature of the challenging microbiota and the willingness of clinical microbiologists
to adopt and practice evolving technologies, to gain knowledge in addition to information, and to remain in the forefront of innovation and invention.

Gut microbiota: next frontier in understanding human health and development of biotherapeutics

Satya Prakash, L Rodes, M Coussa-Charley, C Tomaro-Duchesneau
Biologics: Targets and Therapy 2011:5 71–86
http://dx.doi.org/10.2147/BTT.S19099

The human gastrointestinal tract houses a huge microbial ecosystem, the gut microbiota. This intestinal ecosystem is partially responsible for maintaining human health. However, particular changes in the ecosystem might contribute to the development of certain diseases. With this in mind, there is a need for an exhaustive review on the functions of the gut microbiota, occurrence of gut dysbiosis (alteration of the microbiota), mechanisms by which intestinal bacteria can trigger development of disease, how this ecosystem can be exploited for understanding human health, development of biotherapeutics, expert opinion on current biotherapeutics, and future perspectives. This review presents a descriptive and comprehensive analysis on “the good, the bad, and the ugly” of the gut microbiota, and methods to study these and their modulation of human health.

The gut microbiota is a remarkable asset for human health. As a key element in the development and prevention of specific diseases, its study has yielded a new field of promising biotherapeutics. This review provides comprehensive and updated knowledge of the human gut microbiota, its implications in health and disease, the potentials and limitations of its modification by currently available biotherapeutics to treat, prevent, and/or restore human health, and future directions. Homeostasis of the gut microbiota maintains various functions which are vital to the maintenance of human health. Disruption of the intestinal ecosystem equilibrium (gut dysbiosis) is associated with a plethora of human diseases, including autoimmune and allergic diseases, colorectal cancer, metabolic diseases, and bacterial infections. Relevant underlying mechanisms by which specific intestinal bacteria populations might trigger the development of disease in susceptible hosts are being explored across the globe. Beneficial modulation of the gut microbiota using biotherapeutics, such as prebiotics, probiotics, and antibiotics, may favor health-promoting populations of bacteria and can be exploited in the development of biotherapeutics. Other technologies, such as human gut models, bacterial screening, and delivery formulations (e.g., microencapsulated probiotics), may contribute significantly in the near future. The human gut microbiota is therefore a legitimate therapeutic target to treat and/or prevent various diseases. A clear understanding of the technologies needed to exploit the gut microbiota is urgently required.

Seven bacterial divisions constitute the gut microbiota, i.e., Firmicutes, Bacteroides, Proteobacteria, Fusobacteria, Verrucomicrobia, Cyanobacteria, and Actinobacteria, with Firmicutes and Bacteroides being the most abundant. Bacterial communities exhibit quantitative and qualitative variations along the length of the gastrointestinal tract due to host factors (e.g., pH, transit time, bile acids, digestive enzymes, and mucus), nonhost factors (e.g., nutrients, medication, and environmental factors), and bacterial factors (e.g., adhesion capacity, enzymes, and metabolic capacity).

Until recently, the analysis of bacterial ecosystems was performed by growth on defined media, an approach with clear limitations: it is labor-intensive and, more importantly, as much as 80% of stool bacteria cannot be cultivated. As a consequence, new molecular techniques have been developed. In terms of qualitative measurements of the microbiota, techniques such as fingerprinting (denaturing gradient gel electrophoresis), terminal restriction fragment length polymorphism, ribosomal intergenic spacer analysis, and 16S ribosomal RNA sequencing are widely used. Genome sequencing in particular has provided tremendous information about the microbial world, spearheading technologies such as microarrays. New automated parallel sequencing technologies, based on the 16S ribosomal RNA gene present in all prokaryotes, can offer a cost-effective solution for rapid sequencing and identification of bacterial species of the gut.
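As a toy illustration of the idea behind sequence-based identification, the sketch below scores a query fragment against reference sequences by simple positional identity and reports the closest match. The sequences, species names, and identity metric are invented placeholders; real 16S pipelines align reads against curated reference databases rather than doing ungapped comparison.

```python
# Hypothetical sketch of sequence-based identification: score a query
# fragment against reference sequences by ungapped positional identity.
# Sequences and species names are invented placeholders, not real 16S data.

def identity(seq_a: str, seq_b: str) -> float:
    """Fraction of matching positions over the shorter sequence (no alignment)."""
    n = min(len(seq_a), len(seq_b))
    if n == 0:
        return 0.0
    return sum(a == b for a, b in zip(seq_a, seq_b)) / n

def best_match(query: str, references: dict) -> tuple:
    """Return (species, score) for the reference closest to the query."""
    return max(((sp, identity(query, seq)) for sp, seq in references.items()),
               key=lambda pair: pair[1])

# Illustrative reference "database" of two made-up fragments.
references = {
    "Species A": "ACGTACGTGGCCTTAA",
    "Species B": "ACGTTCGTGGACTTAC",
}

species, score = best_match("ACGTACGTGGCCTTAT", references)
```

In practice the reference set would hold thousands of curated 16S sequences and the comparison would be a true alignment, but the decision rule, i.e., report the highest-scoring reference, is the same.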

Essential metabolic functions

Metabolic functions of the gut microbiota include vitamin production, amino acid synthesis, and bile acid biotransformation. Bile acid biotransformations, performed by microbial enzymes, have implications for cholesterol and glucose metabolism. Importantly, the microbiome provides biochemical pathways required for the fermentation of nondigestible substrates and endogenous mucus. Through fermentation, bacterial growth is stimulated, producing short-chain fatty acids and gases. The major short-chain fatty acids produced are acetate, butyrate, and propionate. Other bacterial end products include lactate, ethanol, succinate, formate, valerate, caproate, isobutyrate, 2-methyl-butyrate,
and isovalerate. Bacterial fermentation is present in the cecum and colon, where the short-chain fatty acids are absorbed, stimulating the absorption of salts and water.

Ensures protection

Pathogen displacement or “colonization resistance” is an accepted function of the gut microbiota. Commensal organisms prevent pathogenic colonization by competing for attachment sites and nutrients, and also through the production and secretion of antimicrobials. These mechanisms are relevant for reducing the levels of lipopolysaccharides, peptidoglycans, bacterial CpG-DNA motifs, and superantigens, which can all be detrimental to the host. The indigenous microbiota is also essential for development of the immune system. Short-chain fatty acids, such as butyrate, may exert potent immunomodulatory effects by suppressing nuclear factor-κB activation and/or by acting on G-coupled receptors, as demonstrated with acetate. These concepts illustrate a dynamic relationship between the immune system and the microbiota. The intestinal mucosa averts threats by signaling to the innate immune system through pattern recognition receptors, such as toll-like receptors. Pattern recognition receptors recognize and bind to specific microbial macromolecules, referred to as microbial-associated molecular patterns. These include lipopolysaccharide, flagellin, peptidoglycan, and N-formylated peptides.

Structural and histological function

The microbiota ensures intestinal structure and function. Firstly, the mucus layer, which reflects the balance between mucus secretion and bacterial degradation, constitutes an obstacle to the uptake of antigens and proinflammatory molecules. Secondly, some bacterial communities may strengthen the barrier at the level of the tight junctions, i.e., protein clusters that form a barrier between the lumen and the lamina propria. Moreover, the gut microbiota is involved in cell and tissue development. Butyrate regulates cell growth and differentiation, inhibiting transformed cell growth while encouraging reversion of cells from a neoplastic to a non-neoplastic phenotype. The microbiota thus contributes to, and helps manage, much of the structural and morphological development of the gut.

Dysbiosis is a state in which the microbiota becomes altered as a consequence of an alteration in the composition of the microbiota, a change in bacterial metabolic activity, and/or a shift in local distribution of communities. Many factors can alter the gastrointestinal ecosystem, including antibiotics, psychological and physical stresses, radiation, altered peristalsis, and dietary changes. At present, the focus is on the description of dysbiosis in a plethora of human disorders.

  • Autoimmune disease

Autoimmune diseases occur when the body’s immune system attacks and destroys healthy cells and tissues, as is the case in type 1 diabetes mellitus, celiac disease, inflammatory bowel diseases, and allergic asthma. Most often, the immune response is initiated by unknown factors. Alteration of the gut microbiota as a result of modern lifestyles is an attractive hypothesis to explain the rise in prevalence of celiac disease, type 1 diabetes mellitus, and inflammatory bowel diseases.

Celiac disease is an inflammatory disease of the small intestine that is triggered and maintained by the storage proteins of wheat, barley, and rye. Studies have investigated the composition of the microbiota in patients with celiac disease. Fecal samples from patients with celiac disease showed reduced proportions of Bifidobacterium, Clostridium histolyticum, Clostridium lituseburense, and Faecalibacterium prausnitzii, and increased proportions of Bacteroides/Prevotella.

Type 1 diabetes mellitus, characterized by insulin deficiency resulting from immune-mediated destruction of pancreatic β cells, is thought to be triggered by environmental factors in genetically susceptible individuals. Given that antibiotics prevented type 1 diabetes mellitus in biobreeding diabetes-prone rats and in nonobese diabetic mice, alteration of the microbiota has been associated with progression of type 1 diabetes mellitus. Evidence shows that bacterial communities from biobreeding diabetes-prone and diabetes-resistant rats differ, marked by a higher abundance of Lactobacillus and Bifidobacterium in diabetes-resistant rats.

Inflammatory bowel diseases include ulcerative colitis and Crohn’s disease. Crohn’s disease is characterized by patchy and transmural inflammation that may affect any part of the gastrointestinal tract, while ulcerative colitis is a chronic episodic inflammatory condition that involves only the large bowel. There is evidence that species belonging to the normal gut microbiota are involved in the etiology and/or maintenance of inflammatory processes. Reduced microbial diversity, increased Bacteroidetes and Enterobacteriaceae, and decreased Firmicutes were all observed in patients with inflammatory bowel diseases.

  • Irritable bowel syndrome

Irritable bowel syndrome is characterized by abdominal pain, bloating, and changes in bowel habit, in the absence of any overt mucosal abnormality. Observations have directed attention towards the gut microbiota: a postinfectious variant of the syndrome has been identified, there is evidence that an antibiotic-induced reduction in the microbiota may be a risk factor, and some patients are proposed to have bacterial overgrowth in the small bowel. Studies have demonstrated that patients with irritable bowel syndrome have fewer intestinal Bifidobacteria, Collinsella aerofaciens, Coprococcus eutactus, and Clostridium cocleatum, and an increase in Veillonella and Enterobacteriaceae.

  • Bacterial infection

It is well established that a disruption in the commensal microbiota increases susceptibility to enteric infections. Antibiotic-treated mice are particularly useful for studying colitis induced by Salmonella spp, Shigella spp, and E. coli infections. In addition, in murine Citrobacter rodentium infections, pathogen colonization is associated with a reduced total density and a relative increase in γ-Proteobacteria. Furthermore, elderly patients with C. difficile-associated diarrhea demonstrate reduced numbers of Bacteroides, Prevotella, and Bifidobacteria, and a greater diversity of facultative species, i.e., Lactobacilli and Clostridia. The evidence suggests an association between disruption of the gut microbiota and bacterial infections, which further accentuates the dysbiosis.

Altered composition of the human gastrointestinal ecosystem can lead to physiological changes in the intestinal environment, disrupting the functions of the microbiota and having serious consequences for human health.

  1. Altered gut microbiota may trigger serious immune deregulation
  2. Specific gut dysbiosis can engender metabolic endotoxemia
  3. Bacterial infection might be promoted by gut dysbiosis
  4. Abnormal bacterial metabolite levels may trigger cancer

An altered microbial balance in the gut can lead to A) an increase in immune mediated disorders and B) chronic low-grade inflammation.

A mechanism based on the triggering of the host’s immune defenses was elucidated using models of C. rodentium infection (mimicking diarrheal pathogen-associated inflammation), Campylobacter jejuni infection, and chemically and genetically induced intestinal inflammation, all of which are used to investigate altered microbiota. An overgrowth of Enterobacteriaceae was observed in all models, indicating that inflammation-induced microbiota changes support colonization by aerotolerant bacteria.

Many etiological bacterial mechanisms have been hypothesized to promote carcinogenesis. Amongst those, hydrogen sulfide, a product of bacterial sulfate reduction, appears to be linked to the incidence of chronic disorders, such as ulcerative colitis and colorectal cancer. Because DNA strand breaks are associated with mutation and promotion of carcinogenesis, bacterial hydrogen sulfide may be responsible for the induction of mutations in the development of sporadic colorectal cancer.

  • Gut microbiota alters energy and lipid metabolism

Conventionally raised mice have more body and gonadal fat than germ-free mice, despite reduced chow consumption. The increase in fat was accompanied by increased fasting glucose and insulin levels and an insulin-resistant state.

– Prebiotics

Prebiotics are “nondigestible food ingredients that beneficially affect the host by selectively stimulating the growth and/or the activity of one or a limited number of bacteria in the colon, and thus improves host health”. A prebiotic should not be hydrolyzed by human intestinal enzymes, but selectively fermented by bacteria, benefiting the host.

The relationship between health and the gastrointestinal system is well established. Given the inherent plasticity of the microbiota, one can consider exploiting it to develop biotherapeutics.

Gut Microbiota Regulates Bile Acid Metabolism by Reducing the Levels of Tauro-beta-muricholic Acid, a Naturally Occurring FXR Antagonist

Sama I. Sayin, Annika Wahlstrom, Jenny Felin, Sirkku Jantti, et al.
Cell Metabolism  Feb 5, 2013; 17, 225–235
http://dx.doi.org/10.1016/j.cmet.2013.01.003

Bile acids are synthesized from cholesterol in the liver and further metabolized by the gut microbiota into secondary bile acids. Bile acid synthesis is under negative feedback control through activation of the nuclear receptor farnesoid X receptor (FXR) in the ileum and liver. Here we profiled the bile acid composition throughout the enterohepatic system in germfree (GF) and conventionally raised (CONV-R) mice.

We confirmed a dramatic reduction in muricholic acid, but not cholic acid, levels in CONV-R mice. Rederivation of Fxr-deficient mice as GF demonstrated that the gut microbiota regulated expression of fibroblast growth factor 15 in the ileum and cholesterol 7α-hydroxylase (CYP7A1) in the liver by FXR-dependent mechanisms. Importantly, we identified tauroconjugated β- and α-muricholic acids as FXR antagonists. These studies suggest that the gut microbiota not only regulates secondary bile acid metabolism but also inhibits bile acid synthesis in the liver by alleviating FXR inhibition in the ileum.

 

Diet rapidly and reproducibly alters the human gut microbiome.

Lawrence A David, Corinne F Maurice, Rachel N Carmody, et al.
Nature 12/2013; http://dx.doi.org/10.1038/nature12820

Long-term dietary intake influences the structure and activity of the trillions of microorganisms residing in the human gut, but it remains unclear how rapidly and reproducibly the human gut microbiome responds to short-term macronutrient change. Here we show that the short-term consumption of diets composed entirely of animal or plant products alters microbial community structure and overwhelms inter-individual differences in microbial gene expression. The animal-based diet increased the abundance of bile-tolerant microorganisms (Alistipes, Bilophila and Bacteroides) and decreased the levels of Firmicutes that metabolize dietary plant polysaccharides (Roseburia, Eubacterium rectale and Ruminococcus bromii). Microbial activity mirrored differences between herbivorous and carnivorous mammals, reflecting trade-offs between carbohydrate and protein fermentation. Foodborne microbes from both diets transiently colonized the gut, including bacteria, fungi and even viruses. Finally, increases in the abundance and activity of Bilophila wadsworthia on the animal-based diet support a link between dietary fat, bile acids and the outgrowth of microorganisms capable of triggering inflammatory bowel disease. In concert, these results demonstrate that the gut microbiome can rapidly respond to altered diet, potentially facilitating the diversity of human dietary lifestyles.

Honing in on enteric fever

Lyle R McKinnon and Quarraisha Abdool Karim
eLife 2014;3:e03545. http://dx.doi.org/10.7554/eLife.03545

Enteric fever, also known as typhoid, is a disease that affects about 22 million people and causes about 200,000 deaths every year, according to conservative estimates. Enteric fever is spread by bacteria belonging to the Salmonella genus, with two sub-species—Salmonella Typhi and Salmonella Paratyphi A—being responsible for most cases of the disease. And although the number of cases of enteric fever has fallen significantly over recent decades, there is a clear need for a diagnostic test for Salmonella that is rapid, affordable and accurate. Moreover, it is important to be able to distinguish between enteric fever caused by Salmonella Typhi and enteric fever caused by Salmonella Paratyphi A in order to ensure that the correct drugs are prescribed and to combat the development of antibiotic resistance.

The application of metabolomics is relatively new in infectious diseases research compared to the application of genomics and proteomics. Despite this, screening the metabolome in blood plasma has identified useful prognostic profiles of several diseases, including sepsis.  Using a combination of gas chromatography and mass spectrometry, Näsström et al. identified 695 distinct peaks that were associated with different metabolites: from these they selected six peaks that had significantly different heights in the three groups of patients. This meant that they were able to tell if the patient had S. Typhi, S. Paratyphi A, or neither. That this mass spectrometric analysis was able to distinguish two Salmonella groups that share many similarities is remarkable. Moreover, in addition to its diagnostic potential, this new approach might also provide insights into the antigenic and physiological differences between the two strains.
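The peak-selection step described above can be caricatured in a few lines: rank peaks by how far apart their mean heights lie in two patient groups and keep the top candidates. The peak names and heights below are invented for illustration and do not reproduce the actual statistics used by Näsström et al.

```python
# Hypothetical sketch: rank metabolite peaks by the absolute difference in
# mean height between two patient groups. All values are illustrative.
from statistics import mean

def rank_peaks(group_a: dict, group_b: dict) -> list:
    """Return peak names sorted by decreasing between-group mean difference."""
    scores = {peak: abs(mean(group_a[peak]) - mean(group_b[peak]))
              for peak in group_a}
    return sorted(scores, key=scores.get, reverse=True)

# Made-up peak heights for two groups of three patients each.
typhi     = {"peak_1": [5.0, 5.2, 4.8], "peak_2": [1.0, 1.1, 0.9], "peak_3": [2.0, 2.1, 1.9]}
paratyphi = {"peak_1": [1.0, 1.2, 0.8], "peak_2": [1.0, 0.9, 1.1], "peak_3": [3.0, 3.2, 2.8]}

ranked = rank_peaks(typhi, paratyphi)  # most discriminative peaks first
```

A real analysis would also test each difference for statistical significance rather than ranking raw mean gaps, but the principle, i.e., retain the peaks that best separate the groups, is the same.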

http://emedicine.medscape.com/article/186458-overview

Clostridium difficile colitis results from a disturbance of the normal bacterial flora of the colon, colonization by C difficile, and the release of toxins that cause mucosal inflammation and damage. Antibiotic therapy is the key factor that alters the colonic flora. C difficile infection (CDI) occurs primarily in hospitalized patients.

Essential update: CDC promotes improving inpatient antibiotic prescribing to reduce drug resistance and increase patient safety

In a CDC analysis of data regarding antibiotic prescribing in hospitalized patients, Fridkin and colleagues estimated that a 30% reduction in the use of broad-spectrum antibiotics would result in a 26% reduction in C difficile infections (CDIs).[1, 2] In addition, improving physicians’ antibiotic prescribing habits to curb overuse and incorrect use would also help reduce antibiotic resistance.
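The cited estimate is simple proportional arithmetic; the sketch below applies it to a made-up baseline case count (not CDC data) to show what a 26% reduction would mean in absolute terms.

```python
# Back-of-the-envelope arithmetic for the Fridkin et al. estimate: a 30%
# cut in broad-spectrum antibiotic use was projected to cut CDIs by 26%.
# The baseline case count is a hypothetical example, not CDC data.
baseline_cdi_cases = 1000        # hypothetical annual CDI cases at a facility
projected_reduction = 0.26       # 26% reduction, from the cited estimate

cases_averted = round(baseline_cdi_cases * projected_reduction)
remaining_cases = baseline_cdi_cases - cases_averted
```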

The authors recommend the following[2]:

Promptly initiate antibiotics for a presumed infection, but first obtain any recommended cultures.

Document and specify the drug’s indication, dose, and expected duration of use in the patient’s medical chart.

Reassess the patient within 48 hours based on test results and patient examination; adjust the antibiotic regimen (dose, duration) and/or the agent, or end the antibiotic treatment, as needed.

Signs and symptoms

Symptoms of C difficile colitis often include the following:

Mild to moderate watery diarrhea that is rarely bloody

Cramping abdominal pain

Anorexia

Malaise

Physical examination may reveal the following in patients with the disorder:

Fever: Especially in more severe cases

Dehydration

Lower abdominal tenderness

Rebound tenderness: Raises the possibility of colonic perforation and peritonitis

Regulatory T-cells in autoimmune diseases: Challenges, controversies and—yet—unanswered questions

Charlotte R. Grant, Rodrigo Liberal, Giorgina Mieli-Vergani, et al.
Autoimmunity Reviews 14 (2015) 105–116
http://dx.doi.org/10.1016/j.autrev.2014.10.012

Regulatory T cells (Tregs) are central to the maintenance of self-tolerance and tissue homeostasis. Markers commonly used to define human Tregs in the research setting include high expression of CD25, FOXP3 positivity and low expression/negativity for CD127. Many other markers have been proposed, but none unequivocally identifies bona fide Tregs. Tregs are equipped with an array of mechanisms of suppression, including the modulation of antigen presenting cell maturation and function, the killing of target cells, the disruption of metabolic pathways and the production of anti-inflammatory cytokines. Treg impairment has been reported in a number of human autoimmune conditions and includes Treg numerical and functional defects and conversion into effector cells in response to inflammation. In addition to intrinsic Treg impairment, resistance of effector T cells to Treg control has been described. Discrepancies in the literature are common, reflecting differences in the choice of study participants and the technical challenges associated with investigating this cell population. Studies differ in terms of the methodology used to define and isolate putative regulatory cells and to assess their suppressive function. In this review we outline studies describing Treg frequency and suppressive function in systemic and organ specific autoimmune diseases, with a specific focus on the challenges faced when investigating Tregs in these conditions.

Role of dendritic cells in the initiation, progress and modulation of systemic autoimmune diseases

Juan Pablo Mackern-Oberti, Carolina Llanos, Fabián Vega, et al.
Autoimmunity Reviews 14 (2015) 127–139
http://dx.doi.org/10.1016/j.autrev.2014.10.010

Dendritic cells (DCs) play a key role in the activation of the immune response against pathogens, as well as in the modulation of peripheral tolerance to self-antigens (Ags). Furthermore, an imbalance in the activating/inhibitory receptors expressed on the surface of DCs has been linked to increased susceptibility to develop autoimmune diseases, underscoring their immunogenic potential. It has been described that modulation of activating or inhibitory molecules expressed by DCs, such as CD86, TLRs, PDL-1, and FcγRs, can define the immunogenic phenotype. On the other hand, T cell tolerance can be achieved by tolerogenic DCs, which have the capacity of blocking undesired autoimmune responses in several experimental models, mainly by inducing T cell anergy, expanding regulatory T cells, and limiting B cell responses. Due to the lack of specific therapies to treat autoimmune disorders and the tolerogenic capacity of DCs shown in experimental autoimmune disease models, autologous tol-DCs are a potential therapeutic strategy for fine-tuning the immune system and reestablishing tolerance in human autoimmune diseases. New advances in the role of DCs in systemic lupus erythematosus (SLE) pathogenesis and the identification of pathogenic self-Ags may favor the development of novel tol-DC-based therapies with a major clinical impact. In this review, we discuss recent data relative to the role of DCs in systemic autoimmune pathogenesis and their use as a therapy to restore tolerance.

T cell subsets and their signature cytokines in autoimmune and inflammatory diseases

Itay Raphael, Saisha Nalawade, Todd N. Eagar, Thomas G. Forsthuber
Cytokine xxx (2014) xxx–xxx
http://dx.doi.org/10.1016/j.cyto.2014.09.011

CD4+ T helper (Th) cells are critical for proper immune cell homeostasis and host defense, but are also major contributors to pathology of autoimmune and inflammatory diseases. Since the discovery of the Th1/Th2 dichotomy, many additional Th subsets were discovered, each with a unique cytokine profile, functional properties, and presumed role in autoimmune tissue pathology. This includes Th1, Th2, Th17, Th22, Th9, and Treg cells which are characterized by specific cytokine profiles. Cytokines produced by these Th subsets play a critical role in immune cell differentiation, effector subset commitment, and in directing the effector response. Cytokines are often categorized into proinflammatory and anti-inflammatory cytokines and linked to Th subsets expressing them. This article reviews the different Th subsets in terms of cytokine profiles, how these cytokines influence and shape the immune response, and their relative roles in promoting pathology in autoimmune and inflammatory diseases. Furthermore, we will discuss whether Th cell pathogenicity can be defined solely based on their cytokine profiles and whether rigid definition of a Th cell subset by its cytokine profile is helpful.

Irritable Bowel Syndrome and Gluten Sensitivity Without Celiac Disease: Separating the Wheat from the Chaff

Biesiekierski JR, Newnham ED, Irving PM, et al. Gluten causes gastrointestinal symptoms in subjects without celiac disease: a double-blind randomized placebo controlled trial. Am J Gastroenterol 2011;106:508–514.

Courtney C. Ferch, William D. Chey
Gastroenterology 2012; 142:664–673

Over the past several years, there has been increasing discussion of gluten sensitivity as a cause of irritable bowel syndrome (IBS) symptoms in patients for whom celiac disease has been excluded. Biesiekierski et al performed a double-blind, placebo-controlled, dietary rechallenge trial to better understand the role of gluten ingestion in the development of gastrointestinal (GI) and non-GI symptoms in patients diagnosed with IBS. This study included a sample of 34 patients diagnosed with IBS by the Rome III criteria who had experienced symptom improvement with a gluten-free diet for 6 weeks before study enrollment. Celiac disease had been excluded in all study participants by either a negative HLA-DQ2/HLA-DQ8 haplotype or a normal duodenal biopsy. Patients with potentially important confounders such as cirrhosis, inflammatory bowel disease, nonsteroidal anti-inflammatory drug ingestion, or excessive alcohol use were excluded from the study.

Upon completion of the study period, a significantly greater proportion of patients in the gluten group than in the gluten-free group answered “no” to the primary outcome question (68% vs 40%; P < .001). Compared with the gluten group, those who remained gluten free also reported significant improvements in pain (P = .016), bloating (P = .031), satisfaction with stool consistency (P = .024), and tiredness (P = .001), but showed no significant differences in wind (P = .053) or nausea (P = .69). Celiac antibody results at baseline and after the dietary intervention were similar. Intestinal permeability, measured by the urine lactulose-to-rhamnose ratio, was also unchanged by the dietary intervention. Fecal lactoferrin levels were persistently undetectable in all but 1 patient during the treatment period. High-sensitivity C-reactive protein levels remained normal before and after the dietary intervention. There were no differences in the likelihood of symptomatic response between those with and without HLA-DQ2 and HLA-DQ8 alleles, arguing against undiagnosed celiac disease as a cause of symptom response to a gluten-free diet.

The authors felt that these data support the existence of non–celiac-associated gluten sensitivity. They concluded that gluten is indeed associated with overall IBS symptoms, bloating, dissatisfaction with stool consistency, abdominal pain, and fatigue in a subset of patients.

A recent meta-analysis of studies from around the world found that patients with IBS symptoms were significantly more likely to have celiac disease than controls (Arch Intern Med 2009;169:651–658). As such, the American College of Gastroenterology Task Force has recommended that routine serologic screening for celiac sprue be pursued in patients with diarrhea-predominant IBS and IBS with a mixed bowel pattern (grade 1B recommendation; Am J Gastroenterol 2009;104[Suppl 1]:S1–S35). Although much of the recent discussion around the potential role of food in IBS symptoms has focused on celiac disease, it is important to note that data from the available US studies have not shown a significantly greater risk for celiac disease among patients with IBS symptoms and no warning signs (Am J Gastroenterol 2008;103[Suppl 1]:S472; Gastroenterology 2011;141:1187–1193). A recent prospective US study reported a 0.4% prevalence of biopsy-proven celiac disease in 492 patients with IBS symptoms and 458 asymptomatic persons undergoing colonoscopy for colorectal cancer screening or surveillance (Gastroenterology 2011;141:1187–1193). Although not significantly different, it is interesting that 7.3% of the IBS group and 4.8% of controls had an abnormal celiac serology test result (adjusted odds ratio, 1.49; 95% confidence interval, 0.76–2.90; P = .25). Thus, this study suggests that an abnormal immunologic response to gluten is orders of magnitude more common than biopsy-proven celiac disease in IBS patients and controls from the United States. It has been suggested that ~20% of the general population reports symptoms in association with the ingestion of gluten. Such patients have been said to suffer from “gluten sensitivity.”
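Effect sizes like the adjusted odds ratio quoted above are derived from 2×2 counts of cases and non-cases by exposure status. A minimal sketch of the arithmetic, using hypothetical counts (not the cited study's data):

```python
import math

def odds_ratio_wald(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% Wald CI from a 2x2 table:
       a = exposed cases,   b = exposed non-cases
       c = unexposed cases, d = unexposed non-cases"""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical counts: 20/100 IBS patients vs 10/100 controls
# with an abnormal serology result.
or_, (lo, hi) = odds_ratio_wald(20, 80, 10, 90)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 2.25 0.99 5.09
```

Note how a confidence interval that spans 1 (as in this toy example and in the cited study) corresponds to a non-significant difference.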

It is also interesting to consider the potential effects of food on gut immune function beyond celiac disease. There is emerging evidence that immune activation and/or low-grade inflammation may play a role in the pathogenesis of IBS (Gastroenterol Clin North Am 2011;40:65–85). The data are currently conflicting, but alterations in the number of mast cells in close proximity to afferent neurons, in mucosal lymphocytes, and in certain pro-inflammatory or anti-inflammatory cytokines have been identified in a subset of patients with IBS. It is not difficult to envision that alterations in the gut immune system could occur as a consequence of an acute GI infection in a genetically susceptible individual. It is also interesting to speculate that other environmental factors, such as an altered gut microbiota, physical or emotional abuse, stress, or food, might result in abnormal gut immune function that translates clinically into IBS symptoms.

How differences in gut immune function, the microbiome, and fermentation might influence the development of IBS symptoms in association with the ingestion of gluten deserves further investigation. The study by Biesiekierski et al is the first randomized, controlled trial to suggest that nonceliac IBS patients might benefit from a gluten-free diet. Although these results are certainly intriguing and hypothesis generating, they require validation in larger, randomized, controlled trials in other parts of the world. What is clear and important for providers to understand is that gluten sensitivity is here to stay, and that they are significantly more likely to encounter it in day-to-day practice than celiac disease.

No Effects of Gluten in Patients With Self-Reported Non-Celiac Gluten Sensitivity After Dietary Reduction of Fermentable, Poorly Absorbed, Short-Chain Carbohydrates

Jessica R. Biesiekierski, Simone L. Peters, Evan D. Newnham, et al.
Gastroenterology 2013;145:320–328
http://dx.doi.org/10.1053/j.gastro.2013.04.051

Background & Aims: Patients with non-celiac gluten sensitivity (NCGS) do not have celiac disease, but their symptoms improve when they are placed on gluten-free diets. We investigated the specific effects of gluten after dietary reduction of fermentable, poorly absorbed, short-chain carbohydrates (fermentable oligo-, di-, and monosaccharides and polyols [FODMAPs]) in subjects believed to have NCGS. Methods: We performed a double-blind crossover trial of 37 subjects (aged 24–61 y, 6 men) with NCGS and irritable bowel syndrome (based on Rome III criteria), but not celiac disease. Participants were randomly assigned to groups given a 2-week diet of reduced FODMAPs, and were then placed on high-gluten (16 g gluten/d), low-gluten (2 g gluten/d and 14 g whey protein/d), or control (16 g whey protein/d) diets for 1 week, followed by a washout period of at least 2 weeks. We assessed serum and fecal markers of intestinal inflammation/injury and immune activation, and indices of fatigue. Twenty-two participants then crossed over to groups given gluten (16 g/d), whey (16 g/d), or control (no additional protein) diets for 3 days. Symptoms were evaluated by visual analogue scales. Results: In all participants, gastrointestinal symptoms consistently and significantly improved during reduced FODMAP intake, but significantly worsened to a similar degree when their diets included gluten or whey protein. Gluten-specific effects were observed in only 8% of participants. There were no diet-specific changes in any biomarker. During the 3-day rechallenge, participants’ symptoms increased by similar levels among groups. Gluten-specific gastrointestinal effects were not reproduced. An order effect was observed. Conclusions: In a placebo-controlled, cross-over rechallenge study, we found no evidence of specific or dose-dependent effects of gluten in patients with NCGS placed on diets low in FODMAPs. www.anzctr.org.au; ACTRN12610000524099.

Infection, inflammation, and the irritable bowel syndrome

R. Spiller, K. Garsed
Digestive and Liver Disease 41 (2009) 844–849
http://dx.doi.org/10.1016/j.dld.2009.07.007

Infectious diarrhea is one of the commonest afflictions of mankind. Worldwide, most of the burden (about 1 billion cases a year) is seen in children <5 years old, the vast majority in the developing world, in communities where access to clean water and adequate sanitation is restricted. Here a child can expect to have 6–7 episodes per year, compared to 1–2 in the developed world. Following recovery from an episode of gastroenteritis (GE), the vast majority of healthy adults and children develop some degree of immunity to the organism responsible and return to normal functioning. However, 7–31% develop post-infectious irritable bowel syndrome (PI-IBS). The proportion of unselected IBS that is post-infectious varies from 6% to 17% in the USA and Europe, but whether this differs in the developing world is unknown, though previous enteric infection is a known risk factor for IBS in Southern China.

This review will compare the epidemiology of infectious diarrhea in the developing and developed world and the link between mucosal inflammation and the development of IBS symptoms. The available evidence suggests that the acquisition of immunity in early childhood reduces the severity of subsequent gastroenteritis in adulthood. Since severe gastroenteritis is a known risk factor for developing PI-IBS, we hypothesize that this may underlie some of the regional differences in the incidence of both infection and IBS.

Gastrointestinal infection is ubiquitous worldwide, though the pattern of infection varies widely. Poor hygiene and lack of piped water are associated with a high incidence of childhood infection, both viral and bacterial. However, in developed countries bacterial infection is commoner in young adults. Studies of bacterial infections in developed countries suggest that 75% of adults fully recover; however, around 25% have long-lasting changes in bowel habit, and a smaller number develop the irritable bowel syndrome (IBS). Whether the incidence is similar in developing countries is unknown. Post-infective IBS (PI-IBS) shares many features with unselected IBS but, by having a defined onset, allows better definition of risk factors. These are, in order of importance: severity of initial illness, smoking, female gender, and adverse psychological factors. Symptoms may last many years, for reasons which are unclear. They are likely to include genetic factors controlling the immune response, alterations in serotonin signaling, low-grade mucosal inflammation maintained by psychological stressors, and alterations in gut microbiota. As yet there are no proven specific treatments, though 5HT3 receptor antagonists, anti-inflammatory agents, and probiotics are all logical treatments which should be examined in large, well-designed, randomized, placebo-controlled trials.

There are three key questions. Firstly, is the incidence of IBS lower in the developing world; secondly, is the incidence increasing with the adoption of a western urban lifestyle; and finally, is the disease itself different? The answer to all three is probably yes, though interpretation of cross-cultural surveys is fraught with problems relating to the imprecise translation of questions into different cultures. Initial reports from small uncontrolled studies suggested that IBS was very uncommon and predominantly affected a subpopulation who pursued a “western lifestyle”. More recent and robust work gives a range of prevalence values, from very low in Iran and India (just 5.8% and 4.2%, respectively) to values in developed Asian countries that are generally lower than, but not dissimilar to, those seen in the west. The key factors associated with rapid westernization that underlie this increase in numbers are unclear but could include the effects of improved hygiene, increased overcrowding, stress, and changes in diet. The best evidence comes from studies in which the same populations have been studied over a number of years, as has been done in Singapore, where after a decade of steady industrial growth the prevalence of IBS rose from 2.3% to 8.6%.

This raises a most important question: why should these differences occur? It is clear that major differences in the epidemiology of gut infection exist between the west and the developing world. This is illustrated by Campylobacter jejuni enteritis, which causes a shorter, less severe illness in childhood than in adulthood, which is when most Europeans and North Americans are infected. The greater degree of inflammation which adults experience may increase the risk of developing subsequent PI-IBS, which might partly account for the higher prevalence of IBS in westernized nations.

Worldwide, the average number of episodes of infection annually per person is 3. A poorly nourished child living in cramped conditions without access to sewerage and running water will have 8 or more infections in the first year of life, most frequently with enteric bacteria and parasites, whereas a child in better sanitary conditions would have fewer infections, and these would be more likely to be viral in origin. Even in England, an estimated 1 in 5 people per year have an episode of diarrhea in the community, adding up to 9.4 million cases in total a year, largely unreported since only 1 in 30 present to their doctor. It seems here that viral infections predominate in the very young, with bacterial infection, particularly Campylobacter spp., being most common in adolescence and early adulthood. PCR analysis of stool in the same study showed that Norovirus and Rotavirus were the commonest pathogens detected across all age groups. Campylobacter spp. were most commonly found in the 30–39 age group (16%, compared to 6.7% of those aged 1–4).
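The English figures quoted above are internally consistent, as a quick back-of-envelope check shows (the population size is back-derived here, not stated in the source):

```python
# Back-of-envelope check of the quoted English community diarrhea figures.
annual_cases = 9.4e6          # community episodes per year (quoted)
attack_rate = 1 / 5           # 1 in 5 people affected per year (quoted)
presenting_fraction = 1 / 30  # 1 in 30 present to a doctor (quoted)

implied_population = annual_cases / attack_rate             # 47 million
cases_seen_by_doctors = annual_cases * presenting_fraction  # ~313,000/year

print(f"{implied_population/1e6:.0f} million people; "
      f"{cases_seen_by_doctors:,.0f} presentations/year")
```

The implied denominator of ~47 million is of the right order for England's population at the time, so the 9.4 million figure follows directly from the 1-in-5 attack rate.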

Infectious diarrhea results from either an increase in fluid and electrolyte secretion, predominantly in the small intestine, or a decrease in absorption which can involve both the small and large bowel. During a diarrheal illness these two mechanisms frequently co-exist. Enterotoxins from Vibrio cholerae or enterotoxigenic E. coli induce profuse secretion while decreased intestinal absorption can be induced by mucosal injury caused by enteroinvasive organisms (e.g., Salmonella, Shigella, and Yersinia spp.). These invasive infections injure cells and excite an immune response and activate enteric nerves and mast cells resulting in an acute inflammatory infiltrate with the release of pro-inflammatory mediators and stimulation of secretion. Clinically the patient will have an acutely inflamed mucosa with ulceration and bleeding.

Campylobacter jejuni produces a range of toxins, including cytolethal distending toxin. The illness first manifests as a secretory diarrhea in the small intestine, after which there is invasion of the distal ileum and colon to produce an inflammatory ileocolitis, which can extend all the way to the rectum. The disease is less severe in developing countries than in developed countries, with watery stool, fever, abdominal pain, vomiting, and dehydration predominating, as opposed to the severe abdominal pain, weight loss, fever, and bloody stool seen more frequently in infections in the west. Infants usually have milder disease with less fever and pain, which in some cases is due to immunity acquired during previous infection. The reasons for these differences between the developed and developing world are unclear.

The composition of the resident intestinal microbiota is highly variable between individuals but relatively stable for each individual, though IBS patients show a more unstable microbiota. This instability may be due to antibiotic therapy or alterations in diet, both of which are commoner in IBS. Patients given antibiotics are 4 times more likely than untreated controls to report bowel symptoms 4 months later, and antibiotic use is a risk factor for developing IBS, with an adjusted OR of 3.70 (1.80–7.60). Antibiotic use increases the incidence of post-infective functional diseases following both Salmonella enteritidis infection and travellers’ diarrhea, in whom antibiotic treatment gave a relative risk of developing PI-IBS of 4.1 (1.1–15.3) compared with those not receiving treatment.
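A relative risk like the 4.1 quoted for antibiotic-treated travellers' diarrhea is computed from cohort counts. A minimal sketch with hypothetical counts (not the cited study's data) shows the arithmetic and why such estimates carry wide confidence intervals when event counts are small:

```python
import math

def relative_risk(a, n_exposed, c, n_unexposed, z=1.96):
    """Relative risk and approximate 95% CI (log method).
    a / n_exposed   = events among exposed (e.g., antibiotic-treated)
    c / n_unexposed = events among unexposed."""
    rr = (a / n_exposed) / (c / n_unexposed)
    se = math.sqrt(1/a - 1/n_exposed + 1/c - 1/n_unexposed)  # SE of log(RR)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, (lo, hi)

# Hypothetical cohort: 8/40 treated vs 5/100 untreated develop PI-IBS.
rr, (lo, hi) = relative_risk(8, 40, 5, 100)
print(round(rr, 1), round(lo, 1), round(hi, 1))  # 4.0 1.4 11.5
```

With only 13 events in total, the interval spans nearly an order of magnitude, much like the 1.1–15.3 reported above.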

During acute infectious diarrhea there is a decrease in anaerobes. Mice infected with Citrobacter rodentium or C. jejuni, or subjected to a chemically induced colitis, show a significant reduction in the total numbers of the microbiota, which is mainly due to activation of the host immune response and only to a much lesser degree to bacterial factors. This loss of anaerobes is associated with a depletion of short-chain fatty acids and an increase in the pH of the stool, allowing overgrowth of other organisms which may contribute to disturbed bowel function.

The study of patients with PI-IBS has yielded many new insights, for several reasons. Firstly, the patients are a more homogeneous group than unselected IBS, most having diarrhea and fewer psychological problems. Secondly, the direction of causation is easier to ascertain, as they represent a “natural experiment”, with subjects “randomized” to receive an infection, thus producing an unbiased study group. Finally, the onset of symptoms on a clearly defined date in a previously well patient provides an opportunity to examine the prior host and bacterial factors that predispose to developing IBS.

The severity of injury is mediated not only by factors related to the infecting organism but also by the host’s own immune response, which develops in early life and declines in old age. However, little is known about the incidence of PI-IBS in the pediatric population and whether it differs from the condition seen in adults. Functional bowel disorders are common in children, with IBS affecting 14% of high school and 6% of middle school students in a US community study, and are classified according to the main complaints made by parents or children rather than in an organ-specific way. This makes comparisons with the adult population difficult; however, a single recent study reports a very high incidence of post-infectious symptoms in 88 children with positive bacterial stool culture results presenting to a single institution. These children had a 36% prevalence of functional gastrointestinal disorders, compared to 11% in age- and sex-matched healthy controls. This is much higher than in most adult studies, with the exception of the Walkerton outbreak. Unlike in adults, female gender is not a risk factor for PI-IBS in children, suggesting the gender effect depends on hormonal and/or psychosocial factors rather than being genetic.

Despite uncertainty about PI-IBS in childhood, we do know that age in adulthood does affect the likelihood of developing PI-IBS. A meta-analysis indicates that patients who develop PI-IBS are slightly younger, and one study showed that increasing age was protective, with age >60 years giving a relative risk of PI-IBS of 0.36 (0.1–0.9), though not all studies have shown this.

Why should this inflammation persist in some and not others? As we have already discussed, adverse life events, anxiety, and depression may play a part; however, less psychological morbidity is seen in PI-IBS than in IBS, indicating the presence of other factors which predispose to an exaggerated or prolonged inflammatory response. These factors might be genetic, since a larger proportion of IBS patients than controls have the high-producing heterozygous TNF-α G/A polymorphism at position −308. Some PI-IBS patients were included in this study, but too few to examine as a subgroup. This study did not confirm an earlier finding of a decrease in the presumed immunoregulatory high-IL-10-producing phenotype in IBS.

Although it is likely from animal work that infection does alter the gut microbiota, there are no data on this in PI-IBS. There is some indirect evidence that an altered microbiota may be important in IBS, since fecal serine protease activity, which may be of bacterial origin, is increased in diarrhea-predominant IBS (D-IBS). This is of great interest because these proteases can increase visceral sensitivity in rats, acting via the protease-activated receptor-2 (PAR-2) group of receptors found in the mucosa and enteric nerves.

A recent small randomized placebo-controlled trial of mesalazine suggested it could reduce mast cell numbers and improve symptoms, a finding which needs to be replicated in larger studies. Given the increase in 5HT availability and the effectiveness of 5HT3 receptor antagonists in animal studies and in unselected IBS-D patients, a trial of a 5HT3 receptor antagonist would also be logical.

Gut motility and enteroendocrine secretion

Tongzhi Wu, Christopher K Rayner, Richard L Young and Michael Horowitz
Current Opinion in Pharmacology 2013, 13:928–934
http://dx.doi.org/10.1016/j.coph.2013.09.002

The motility of the gastrointestinal (GI) tract is modulated by complex neural and hormonal networks; the latter include gut peptides released from enteroendocrine cells during both the interdigestive and postprandial periods. Conversely, it is increasingly recognised that GI motility is an important determinant of gut hormone secretion, in that the transit of luminal contents influences the degree of nutrient stimulation of enteroendocrine cells in different gut regions, as well as the overall length of gut exposed to nutrient. Of particular interest is the relationship between gallbladder emptying and enteroendocrine secretion. The inter-relationships between GI motility and enteroendocrine secretion are central to blood glucose homeostasis, where an understanding is fundamental to the development of novel strategies for the management of diabetes mellitus.

Enteroendocrine cells account for release of more than 30 known peptides, including motilin and ghrelin during the interdigestive period, and cholecystokinin (CCK), glucose-dependent insulinotropic polypeptide (GIP), glucagon-like peptide-1 (GLP-1) and peptide YY (PYY) after meals. The latter are key mediators of the shift from an interdigestive to a postprandial GI motor pattern. Conversely, the delivery of luminal contents to be sensed by enteroendocrine cells in various gut regions is dependent on GI motor activity.

During the interdigestive period, both the stomach and small intestine undergo a cyclical motor pattern — the ‘migrating motor complex (MMC)’ — consisting of a quiescent phase (~40 min, phase I), a phase of irregular contraction (~50 min, phase II), and a period of maximum contraction (5–10 min, phase III). The MMC migrates from the stomach (or proximal small intestine) to the terminal ileum, and acts to sweep small intestinal contents (including bile, digestive juice and indigestible debris) towards the large intestine. Phase III of the MMC is also associated with spontaneous gallbladder emptying.
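The cyclical phase structure described above can be sketched as a simple repeating schedule. This is a toy illustration of the timing only, not a physiological model; phase III is taken as ~7 min, the middle of the quoted 5–10 min range:

```python
# Approximate MMC phase durations in minutes, per the description above.
# Phase III is set to ~7 min (midpoint of the quoted 5-10 min range).
MMC_PHASES = [("I", 40), ("II", 50), ("III", 7)]

def mmc_phase_at(t_min):
    """Return the MMC phase active t_min minutes into the cycle."""
    cycle_len = sum(d for _, d in MMC_PHASES)  # ~97 min per full cycle
    t = t_min % cycle_len
    for name, duration in MMC_PHASES:
        if t < duration:
            return name
        t -= duration
    raise AssertionError("unreachable")

print([mmc_phase_at(t) for t in (0, 45, 92, 100)])  # ['I', 'II', 'III', 'I']
```

The modulo arithmetic captures the key point of the text: in the fasted state the pattern repeats roughly every 1.5–2 hours until a meal interrupts it.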

The cyclical occurrence of MMC activity during the interdigestive state closely parallels the secretion of motilin and, to a lesser degree, ghrelin. Increases in plasma motilin concentrations immediately follow each episode of spontaneous gallbladder emptying, while after phase III there is a decrease in motilin. The latter might be associated with the relative absence of luminal content due to the ‘house-keeping’ effect of phase III. Patients with gallstones have defective gallbladder emptying, lack the cyclical profile of motilin concentrations, and exhibit a reduced frequency of phase III activity.

GI motility has a major impact on enteroendocrine secretion; conversely, enteroendocrine hormones play a pivotal role in the regulation of interdigestive and postprandial GI motility. The significance of these interrelationships is increasingly recognized as being central to the regulation of postprandial glycemia. Slowing gastric emptying and intestinal transit, accelerating gallbladder emptying and intestinal exposure to bile acids, and stimulating postprandial enteroendocrine hormones, all represent novel therapeutic approaches for the management of type 2 diabetes.

Enteroendocrine cell types revisited

Maja S Engelstoft, Kristoffer L Egerod, Mari L Lund and Thue W Schwartz
Current Opinion in Pharmacology 2013, 13:912–921
http://dx.doi.org/10.1016/j.coph.2013.09.018

The GI tract is profoundly involved in the control of metabolism through peptide hormones secreted from enteroendocrine cells scattered throughout the gut mucosa. A large number of recently generated transgenic reporter mice have allowed direct characterization of the biochemical and cell biological properties of these previously highly elusive enteroendocrine cells. In particular, the surprisingly broad co-expression of six functionally related hormones in intestinal enteroendocrine cells indicates that it should be possible to control not only hormone secretion but also the type and number of enteroendocrine cells. However, this will require a deeper understanding of the factors controlling differentiation, gene expression, and specification of the enteroendocrine cells during their weekly renewal from progenitor cells in the crypts of the mucosa.

Go with the flow — membrane transport in the gut

Editorial overview, David T Thwaites
Current Opinion in Pharmacology 2013, 13:843–846
http://dx.doi.org/10.1016/j.coph.2013.09.019

The primary function of the gastrointestinal tract is the assimilation of nutrients from the diet. The final stages of digestion and almost all absorption take place in the small intestine and, to a lesser extent, the large intestine. The intestinal epithelium is the single layer of polarized, differentiated cells that lines the wall of the intestine. It sits at the interface between the outside world and the internal environment of the human body. It is across this epithelial barrier that all essential nutrients, vitamins, electrolytes, and fluid are absorbed. Many toxins and waste products can be secreted directly across the intestinal epithelium or excreted through the biliary route. The gastrointestinal tract is of great interest to the pharmacologist, and to the pharmaceutical industry beyond, because most patients, if given the opportunity, would choose to take medication orally rather than have it delivered by any other route. In addition, many drugs and metabolites are lost from the body by active secretion from the intestine and liver. Thus, the intestinal epithelium is a major target for clinical intervention to improve bioavailability and modulate gut function.

To allow net transport in either the absorptive or secretory direction, the polarised cells in the small intestine (enterocytes), large intestine (colonocytes), and liver (hepatocytes) express a distinct set of membrane transport proteins in their apical and basolateral membrane domains. Each epithelial cell type mediates net solute and ion movement through the coordinated activity of an array of membrane transport proteins (primary active transporters or pumps, secondary active cotransporters or antiporters, and channels).

Chloride channel-targeted therapy for secretory diarrheas

Jay R Thiagarajah and AS Verkman
Current Opinion in Pharmacology 2013, 13:888–894
http://dx.doi.org/10.1016/j.coph.2013.08.005

Secretory diarrheas caused by bacterial and viral enterotoxins remain a significant cause of morbidity and mortality. Enterocyte Cl channels represent an attractive class of targets for diarrhea therapy, as they are the final, rate-limiting step in enterotoxin-induced fluid secretion in the intestine. Activation of cyclic nucleotide and/or Ca2+ signaling pathways in secretory diarrheas increases the conductance of Cl channels at the enterocyte luminal membrane, which include the cystic fibrosis transmembrane conductance regulator (CFTR) and Ca2+-activated Cl channels (CaCCs). High-throughput screens have yielded several chemical classes of small molecule CFTR and CaCC inhibitors that show efficacy in animal models of diarrheas. Natural-product diarrhea remedies with Cl channel inhibition activity have also been identified, with one product recently receiving FDA approval for HIV-associated diarrhea.

The intestinal epithelium consists of villi and crypts, with absorption occurring mainly in villi and secretion in crypts. Fluid absorption in the small intestine is driven by the luminal Na+/H+ exchanger (NHE3), the Na+-glucose cotransporter (SGLT1), and the Cl−/HCO3− exchanger (DRA) (Figure 1, not shown). As in all epithelia, the electrochemical driving force is established by a basolateral Na+/K+-ATPase pump. The pro-absorptive solute transporters are constitutively active, though they can be modulated by second messengers including cAMP and Ca2+. NHE3, SGLT1, and DRA are thus potential membrane transporter targets to increase intestinal fluid absorption. In the colon, fluid absorption is also facilitated by the epithelial Na+ channel (ENaC) and short-chain fatty acid (SCFA) transporters (SMCT1).
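The rationale for targeting Cl channels as the rate-limiting secretory step can be caricatured as a toy flux balance: net water movement is pro-absorptive flux minus Cl-driven secretion, and an inhibitor scales down the secretory term. All parameters below are arbitrary units invented for illustration, not physiological values:

```python
def net_fluid_flux(absorptive_flux, cl_conductance, secretory_drive,
                   inhibitor_fraction=0.0):
    """Toy model: net absorption = pro-absorptive flux minus Cl-driven
    secretion; a Cl channel inhibitor scales down the secretory term.
    Negative values represent net secretion (diarrhea)."""
    secretion = cl_conductance * secretory_drive * (1 - inhibitor_fraction)
    return absorptive_flux - secretion

baseline = net_fluid_flux(1.0, cl_conductance=1.0, secretory_drive=0.5)
toxin    = net_fluid_flux(1.0, cl_conductance=1.0, secretory_drive=3.0)
treated  = net_fluid_flux(1.0, cl_conductance=1.0, secretory_drive=3.0,
                          inhibitor_fraction=0.9)

print(baseline, toxin, treated)  # 0.5 -2.0 0.7
```

The sketch makes the article's point explicit: because enterotoxins act upstream through cAMP/Ca2+ but converge on luminal Cl conductance, blocking that final step restores net absorption regardless of which signaling pathway was activated.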

Figure (not shown): Intestinal signaling pathways controlling fluid secretion. (a) Signaling pathways in CFTR activation by bacterial enterotoxins. Cholera toxin and heat-stable enterotoxin (STa) bind to membrane receptors (GM1 ganglioside receptor, guanylin receptor), causing increases in cyclic nucleotides (cAMP, cGMP) and neurotransmitters, resulting in CFTR activation. EC, enterochromaffin cells; 5-HT, 5-hydroxytryptamine; VIP, vasoactive intestinal peptide; ENS, enteric nervous system. (b) Signaling pathways in CaCC activation by rotavirus. Rotavirus releases NSP4 (non-structural protein 4), which elevates cytoplasmic Ca2+ either directly via binding to a membrane receptor (integrin α1β2), via the neuropeptide galanin, or through activation of enteric nerves. Gal1-R, galanin 1 receptor. (c) Cross-talk between Ca2+ and cAMP pathways in intestinal epithelial cells. Epac, exchange protein directly activated by cAMP; PDE, phosphodiesterase; AC, adenylate cyclase; CaSR, calcium-sensing receptor.

Natural-product Cl channel inhibitors: natural products have been identified with antidiarrheal efficacy in humans and a putative mechanism of action involving Cl channel inhibition. Crofelemer, a heterogeneous proanthocyanidin oligomer extracted from the bark latex of the South American tree Croton lechleri, was approved recently for HIV-associated diarrhea following clinical trials showing efficacy in reducing the number and severity of diarrhea episodes. Whether CaCC inhibition by crofelemer can explain its efficacy in HIV-associated diarrhea is unclear.

Following a natural product screen that identified tannic acid as a general CaCC inhibitor, we found that red wines containing polyphenolic gallotannins fully inhibited intestinal CaCC without effect on CFTR. In recent follow-up work, we generated an alcohol-free red wine extract with potent CaCC inhibition activity, and showed its efficacy in a neonatal mouse model of rotaviral diarrhea (unpublished data). The wine extract inhibited intestinal Ca2+-activated Cl current and fluid secretion without affecting rotaviral infection of intestinal epithelial cells. CaCC inhibition may account for anecdotal reports of antidiarrheal action of red wines. Motivated by the possibility that known herbal antidiarrheal remedies might act by Cl channel inhibition, we recently screened a selection of diarrhea remedies from sources worldwide and identified a commonly used Thai herbal remedy that fully inhibited both CFTR and CaCC (unpublished observations). The herbal remedy showed efficacy in mouse models of cholera and rotaviral diarrhea.

Clinical relevance of drug efflux pumps in the gut

Shingen Misaka, Fabian Muller and Martin F Fromm
Current Opinion in Pharmacology 2013, 13:847–852
http://dx.doi.org/10.1016/j.coph.2013.08.010

Important export pumps expressed in the apical membrane of enterocytes are P-glycoprotein (P-gp), breast cancer resistance protein (BCRP), and multidrug resistance protein 2 (MRP2). They are believed to be a crucial part of the body's defense mechanisms against potentially toxic, orally administered xenobiotics. In particular, P-gp and BCRP also limit the bioavailability of drugs. Inhibition of these intestinal export pumps by concomitantly administered drugs leads to increased plasma concentrations, whereas induction can reduce absorption of substrate drugs and decrease plasma concentrations. The role of polymorphisms in the genes encoding these transporters is also discussed. Taken together, this review focuses on the role of intestinal export pumps, using selected examples from clinical studies in humans.

P-gp (gene: ABCB1) is a protein consisting of two homologous halves, each containing six transmembrane helices and one nucleotide-binding domain. The protein expression of P-gp has been shown to increase from proximal to distal parts of the intestine. P-gp generally tends to transport hydrophobic, amphipathic, or cationic compounds. Clinically important P-gp substrates include anticancer agents, cardiovascular drugs, and immunosuppressants. It is of note that most of the listed drugs are also substrates of CYP3A4; thus intestinal P-gp and intestinal CYP3A4 collaborate efficiently to enhance the removal of their substrates. ABCB1 mRNA expression is regulated by several nuclear receptors, such as pregnane X receptor (PXR), constitutive androstane receptor (CAR), thyroid hormone receptor, and vitamin D receptor (VDR).

Human intestinal P-gp limits the bioavailability of drugs, and induction and inhibition of intestinal P-gp are important mechanisms underlying drug–drug interactions in humans. Direct evidence for these processes in humans was largely generated in studies of healthy volunteers who received P-gp substrates with negligible drug metabolism, such as digoxin and talinolol. Further work is required on the importance of intestinal P-gp for drug disposition and drug–drug interactions for the majority of P-gp substrates, which are also metabolized, for example, by intestinal and hepatic CYP3A4, since inducers or inhibitors of P-gp frequently also affect CYP3A4 expression or function. For intestinal BCRP and intestinal MRP2, only a limited number of examples with specific drugs exist so far that indicate their clinical importance in humans.

Gastrointestinal HCO3− transport and epithelial protection in the gut: new techniques, transport pathways and regulatory pathways

Ursula E Seidler
Current Opinion in Pharmacology 2013, 13:900–908
http://dx.doi.org/10.1016/j.coph.2013.10.001

The concept of a protective alkaline gastric and duodenal mucus layer is a century old, yet it is remarkable how much new information on HCO3− transport pathways has emerged recently, made possible by the extensive use of gene-deleted and transgenic mice and novel techniques to study HCO3− transport. This review highlights recent findings regarding the importance of HCO3− for mucosal protection of the duodenum and other gastrointestinal epithelia against luminal acid and other damaging factors. Recently, methods have been developed to visualize HCO3− transport in vivo by assessing the surface pH in the mucus layer as well as the epithelial pH. New information about HCO3− transport pathways, and emerging concepts about the intricate regulatory network that governs duodenal HCO3− secretion, are described, and new perspectives for drug therapy are discussed.

The lack of HCO3− ions in the pancreatic secretions of children with cystic fibrosis was recognized in the 1960s, and its significance for impaired mucus release was discussed. It is now evident that CFTR expression is essential for HCO3− secretion in most gastrointestinal epithelia, such as the esophagus, the small intestine, the biliary tract, and the pancreatic ducts, as well as the reproductive tract and the airways. The low pH in the acinar-ductal unit after release of the zymogen granules needs to be quickly neutralized to prevent acinar damage. Similarly, the bile ducts need a 'biliary HCO3− umbrella' to keep toxic bile acids ionized and thereby membrane-impermeable, and the esophagus needs HCO3− secretion to protect the epithelial surface from acid reflux, possibly also mediated by CFTR-dependent mechanisms. HCO3− is essential for the release and proper expansion of mucin molecules. CF patients and CFTR-deficient mice have impaired lipid absorption, which in mice has been experimentally linked with the duodenal HCO3− deficit. Thus the HCO3− secretory defect of cystic fibrosis patients is closely linked to many of the pathophysiological GI manifestations of CF.

Fluid and electrolyte secretion in the inflamed gut: novel targets for treatment of inflammation-induced diarrhea

Melanie G Gareau and Kim E Barrett
Current Opinion in Pharmacology 2013, 13:895–899
http://dx.doi.org/10.1016/j.coph.2013.08.014

Diarrheal disease can occur in the context of both inflammatory and infectious challenges. Inflammation can result in changes in ion transporter expression or simply mislocalization of the protein. In addition to development of diarrhea, an altered secretory state can lead to changes in mucus secretion and luminal pH. Bacterial infection can lead to subversion of host cell signaling, leading to transporter mislocalization and hyposecretion, promoting bacterial colonization. Novel therapeutic strategies are currently being developed to ameliorate transporter defects in the setting of inflammation or bacterial infection including, for example, administration of probiotics and fecal microbiota transplantation. This review will highlight recent findings in the literature detailing these aspects of ion transport in the inflamed gut.

Inflammatory diarrhea can occur in many different pathological conditions, including IBD, comprising Crohn's disease (CD) and ulcerative colitis (UC). The resulting inflammation triggers production of cytokines, including TNFα and IFNγ, that can directly modulate ion transporters, including the Na+/K+-ATPase and Na+/H+ exchanger (NHE)-1 (SLC9A1), and decrease barrier function. Inflammation can activate several potential mechanisms that can underlie diarrheal symptoms via distinct pathways.

The presence of immune cells, such as T cells, results in the production of cytokines that can inhibit Na+ absorption, activate Cl secretion, and cause mucosal barrier dysfunction, resulting in diarrhea. In the IL-10 deficient mouse model of colitis, inflammation is characterized by T cells and macrophages, and high levels of pro-inflammatory cytokines, including TNFα. This was accompanied by dysfunctional NHE3 (SLC9A3) transport activity in the absence of overall changes in gene expression and protein localization. A decrease in expression of PDZ adaptor proteins (NHERF2 and PDZK1 scaffolding proteins), which modulate NHE3 activity by regulating transporter interactions and signal transduction, was also observed.

Ion transporters and their regulatory mechanisms represent potential therapeutic targets for the treatment of inflammatory diarrhea. Probiotics (live microorganisms provided in adequate amounts to confer a benefit on the host beyond their inherent nutrition) have been demonstrated to provide a beneficial effect in various GI diseases, including diarrhea. Acute administration of Lactobacillus acidophilus to Caco-2 cells in vitro and to mice in vivo increased DRA expression. Administration of Bifidobacterium breve, but not Lactobacillus rhamnosus or Eubacterium rectale, to HT29 cells down-regulated both Ca2+-mediated (carbachol [CCh]) and cAMP-mediated (forskolin [FSK]) Cl− secretion. This effect of B. breve did not come at the expense of monolayer integrity or tight junction function, occurred downstream of Ca2+ mobilization, and was hypothesized to occur via CFTR, based on the observation that a CFTR inhibitor could block the effects of CCh. In contrast, administration of the probiotic strain Enterococcus faecium improved intestinal barrier function in piglets, as measured by mannitol flux rates, whereas prostaglandin E2-induced short-circuit current was increased, suggesting an increased secretory state.

Differing degrees of susceptibility to infection with C. rodentium among different strains of mice have been well established and characterized; however, the precise mechanisms involved are not well defined. A decrease in DRA was found in C3H and FVB mice, which succumb to C. rodentium infection, compared with resistant C57BL/6 mice. It was recently demonstrated that gavaging C3H mice with the colonic microbiota of C57BL/6 mice, following antibiotic administration, could transfer protection against death following infection with C. rodentium. Survival was accompanied by restoration of DRA gene expression and of other transporters known to be involved in protection from diarrhea. While these findings are extremely preliminary, fecal microbiota transplantation may serve as an alternative in a subset of cases of infectious diarrhea, beyond the well-established data on C. difficile.

Phospholipids are increasingly recognized for their signaling roles in addition to their traditional structural roles in the cell. Lysophosphatidic acid (LPA) is a naturally occurring glycerophospholipid that can serve as a signaling molecule by binding to its G-protein-coupled receptors LPA1, LPA2, and LPA3. In colonic Caco-2 cells, administration of LPA for 24 hours induced DRA expression via LPA2, increasing its Cl−/HCO3− exchange activity via a PI3 kinase pathway. The ability of LPA to increase ion transporter activity in the setting of inflammation or infection needs to be tested directly, but these findings at least suggest that LPA may serve as a useful antidiarrheal agent. Studies in bronchial epithelial cells suggest that LPA can also ameliorate lipopolysaccharide-induced barrier dysfunction, suggesting a similar effect may be present in the intestinal tract. The ability of LPA to increase migration and proliferation of intestinal epithelial cells, however, warrants some concern with long-term administration and would need to be carefully assessed.

Intestinal ion transporters represent a valid physiological target for limiting inflammatory and infectious diarrhea. Their ability to regulate both water secretion and absorption allows bidirectional mechanisms to be exploited, creating a wide range of possible therapeutic targets.

Discovery and Development of Antisecretory Drugs for Treating Diarrheal Diseases

Jay R. Thiagarajah, Eun–A Ko, L Tradtrantip, M Donowitz, and A. S. Verkman
Clinical Gastroenterology and Hepatology 2014;12:204–209
http://dx.doi.org/10.1016/j.cgh.2013.12.001

Diarrheal diseases constitute a significant global health burden and are a major cause of childhood mortality and morbidity. Treatment of diarrheal disease has centered on the replacement of fluid and electrolyte losses using oral rehydration solutions. Although oral rehydration solutions have been highly successful, significant mortality and morbidity due to diarrheal disease remain. Secretory diarrheas, such as those caused by bacterial and viral enterotoxins, result from activation of cyclic nucleotide and/or Ca2+ signaling pathways in intestinal epithelial cells (enterocytes), which increase the permeability of Cl− channels at the lumen-facing membrane. Additionally, there is often a parallel reduction in intestinal Na+ absorption. Inhibition of enterocyte Cl− channels, including the cystic fibrosis transmembrane conductance regulator and Ca2+-activated Cl− channels, represents an attractive strategy for antisecretory drug therapy. High-throughput screening of synthetic small-molecule collections has identified several classes of Cl− channel inhibitors that show efficacy in animal models of diarrhea but remain to be tested clinically. In addition, several natural product extracts with Cl− channel inhibition activity have shown efficacy in diarrhea models. However, a number of challenges remain to translate the promising bench science into clinically useful therapeutics, including efficiently targeting orally administered drugs to enterocytes during diarrhea, funding development costs, and carrying out informative clinical trials. Nonetheless, Cl− channel inhibitors may prove to be effective adjunctive therapy in a broad spectrum of clinical diarrheas, including acute infectious and drug-related diarrheas, short bowel syndrome, and congenital enteropathies.

Cl− channels as targets for therapy of secretory diarrheas

Figure legend (figure not shown): This diagram of the fluid secretory mechanism in enterocytes lining intestinal crypts and villi illustrates active Cl− transport from the blood or submucosa to the intestinal lumen, facilitated by the luminal membrane CFTR and CaCC channels.

Natural products represent a potentially attractive source of antidiarrheal therapeutics, because they are generally inexpensive and have the potential for rapid translation to the clinic. In addition, there is a long history of anecdotal evidence of efficacy of various antidiarrheal remedies in many parts of the world.

A number of hurdles remain in the translation of antidiarrheal drug candidates to widely used therapy. Although a number of compounds have been advanced through preclinical testing in murine models, new high throughput model systems of enterocyte fluid secretion, such as human intestinal enteroids, or genetically tractable systems, such as zebrafish, warrant development to identify novel compounds and antidiarrheal drug targets. A major translational roadblock, however, is the difficulty in designing and funding informative clinical trials.

Barriers to diarrheal drug development in developing countries include the need for very low manufacture cost, high stability in hot and humid environments, and obtaining funding to support commercial development of new chemical entities with relatively low profit potential.

For drugs targeting the enterocyte extracellular surface, an additional challenge is convective washout, in which secreted fluid in intestinal crypts washes away inhibitor drugs. A mathematical model of intestinal convection-diffusion concluded that in severe secretory diarrheas, such as cholera, the antisecretory efficacy of an orally administered, surface-targeted inhibitor requires high inhibitor affinity for its target (low-nanomolar Kd), a sufficiently high luminal inhibitor concentration (>100-fold Kd), and sustained high luminal inhibitor concentration or slow inhibitor dissociation. Washout is a significant concern for small-molecule CFTR glycine hydrazides, such as iOWH032, and potentially for several of the natural product agents.
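The >100-fold-Kd requirement can be illustrated with simple receptor-occupancy arithmetic. The sketch below is a hypothetical illustration, not the published convection-diffusion model: assuming ordinary 1:1 equilibrium binding, the fraction of channels blocked is C/(C + Kd), so near-complete (>99%) inhibition demands a luminal concentration about 100-fold above Kd — and convective washout continually erodes that margin.

```python
# Hypothetical illustration (not the published model): fractional channel
# inhibition by a surface-targeted blocker under simple 1:1 binding.
# The Kd value below is an assumed low-nanomolar affinity, not from the paper.

def fraction_inhibited(conc_nM: float, kd_nM: float) -> float:
    """Equilibrium occupancy for 1:1 binding: f = C / (C + Kd)."""
    return conc_nM / (conc_nM + kd_nM)

kd = 10.0  # nM, assumed low-nanomolar affinity
for fold in (1, 10, 100, 1000):
    c = fold * kd
    # prints 0.500, 0.909, 0.990, 0.999 for 1x, 10x, 100x, 1000x Kd
    print(f"{fold:>5} x Kd -> {fraction_inhibited(c, kd):.3f} inhibited")
```

The jump from 91% block at 10× Kd to 99% at 100× Kd is why the model sets such a high bar: a washout that dilutes luminal drug tenfold can move an inhibitor from nearly complete to therapeutically marginal blockade.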

Current and emerging therapies in irritable bowel syndrome: from pathophysiology to treatment

Joseph Y. Chang and Nicholas J. Talley
Trends in Pharmacological Sciences 31 (2010) 326–334
http://dx.doi.org/10.1016/j.tips.2010.04.008

Irritable bowel syndrome is a common functional gastrointestinal disorder with characteristic symptoms of abdominal pain/discomfort with a concurrent disturbance in defecation. It accounts for a significant healthcare burden, and symptoms may be debilitating for some patients. Traditional symptom-based therapies have been found to be ineffective in the treatment of the entire syndrome complex, and do not modify the natural history of the disorder. Although the exact etiopathogenesis of IBS is incompletely understood, recent advances in the elucidation of the pathophysiology and molecular mechanisms of IBS have resulted in the development of novel therapies, as well as potential future therapeutic targets. This article reviews current and emerging therapies in IBS based upon: IBS as a serotonergic disorder; stimulating intestinal chloride channels; modulation of visceral hypersensitivity; altering low-grade intestinal inflammation; and modulation of the gut microbiota.

Irritable bowel syndrome (IBS) is a functional gastrointestinal (GI) disorder characterized by abdominal pain or discomfort associated with disturbances in defecation; bloating is common, and the symptoms are not explained by structural abnormalities. Prevalence estimates for North America are 10–15%. Only a minority of patients seek care for their symptoms, but IBS has a dramatic impact on patients and on the utilization of healthcare resources. It is estimated that IBS accounts for 3.5 million physician visits annually in the USA and is associated with annual direct costs of $1.6 billion and indirect costs of $19.2 billion; patients with IBS consistently report lower health-related quality of life (HRQOL).

Serotonin, or 5-hydroxytryptamine (5-HT), is a neurotransmitter largely stored in the enterochromaffin cells of the gut that plays a critical part in the motility, sensation, and secretion of the GI tract. There is growing evidence that a serotonergic mechanism may be involved in the pathophysiology of IBS. Notable findings include: increased postprandial levels of circulating 5-HT in subjects with diarrhea-predominant IBS (D-IBS); elevated platelet-depleted plasma 5-HT levels in D-IBS subjects in both fasting and fed states; a decreased mucosal 5-hydroxyindoleacetic acid (5-HIAA)/5-HT ratio in subjects with constipation-predominant IBS (C-IBS); and a lack of increase in plasma 5-HT levels after meal ingestion in those with C-IBS. These findings suggest that a subset of IBS may be a disorder of serotonin disequilibrium, with 5-HT excess responsible for the symptoms of D-IBS and insufficient release of 5-HT into the circulation responsible for the features of C-IBS. However, not all studies support this disease model.

Given the possible role of serotonin in IBS, several 5-HT receptor-modulating agents have been developed as disease-specific therapeutic agents. The 5-HT3 antagonist alosetron has been shown in multiple randomized clinical trials as well as meta-analyses to be an effective agent in the treatment of D-IBS with improvements in global IBS symptoms, relief of abdominal pain, improvement of the consistency and frequency of bowel movements, and reduced fecal urgency. Furthermore, alosetron has been reported to inhibit intestinal secretion, delay colonic transit time, increase colonic compliance in response to distention, and have central effects that result in its beneficial effects on sensation in IBS.

Current and emerging therapies in irritable bowel syndrome

Serotonergic mechanisms

·       Alosetron

·       Tegaserod

·       Prucalopride

Chloride channel activators

·       Lubiprostone

·       Linaclotide

Chloride channel inhibitors

·       Crofelemer

Visceral hypersensitivity

·       Tricyclic antidepressants (TCAs)

·       Selective serotonin reuptake inhibitors (SSRIs)

·       γ-Aminobutyric acid analog (pregabalin)

·       κ-Opioid receptor agonists (asimadoline)

·       Corticotropin-releasing factor (CRF) receptor antagonists

Modulation of immune activation and inflammation

·       5-Aminosalicylic acid

·       Corticosteroids (?)

Modulating intestinal flora

·       Probiotics (bifidobacteria)

·       Prebiotics

·       Antibiotics (rifaximin)

Fiber supplementation

·       Psyllium

Antispasmodics

·       Hyoscine

·       Cimetropium

·       Pinaverium

·       Peppermint oil

Alternative therapies: dietary factors and modification

·       Food elimination diet (based on IgG antibodies)

·       Low fermentable oligosaccharides, disaccharides, monosaccharides, and polyols (FODMAP) diet

·       Gluten-free diet
Agonists of 5-HT4 receptors have been found to be effective in the treatment of C-IBS. 5-HT4 receptor agonists accelerate intestinal transit in the small intestine and colon. Tegaserod is an aminoguanidine indole and selective partial agonist of the 5-HT4 receptor that has been shown to provide improvements in global IBS symptoms and to improve constipation in female C-IBS patients. Reports have supported the efficacy of tegaserod in C-IBS in terms of global symptom improvement as well as improvement of constipation.

The GI tract contains numerous chloride channels that have an integral role in the transport and secretion of fluids. Type-2 chloride channels (ClC-2) have been investigated with respect to their role in C-IBS and constipation. The ClC-2 channel is an α-helical transmembrane protein located on the apical cell membrane of the intestines; it is highly selective for chloride ions and is involved in the transport and secretion of fluids as well as in maintaining cellular membrane potential. Activation of ClC-2 channels through second messenger-induced phosphorylation causes an efflux of chloride ions into the lumen of the GI tract, followed by an efflux of sodium ions to preserve electrical balance. It is this efflux of sodium that drives the efflux of water into the lumen, maintaining isotonicity through the paracellular pathway. This resulting increase in intestinal secretion and fluid volume has been of interest in the development of chloride channel-directed therapies for C-IBS and constipation.

TCAs and SSRIs have been of interest in the treatment of IBS for their modulation of hyperalgesia rather than for their psychotropic effects. TCAs have been demonstrated to be effective in the treatment of neuropathic pain, whereas SSRIs have been suggested to enhance the effectiveness of endogenous pain inhibition systems, and both have been effective in the treatment of various chronic pain disorders. Despite the analgesic effects of these agents, some authors have cited the lack of evidence from well-designed, large clinical trials of these agents in IBS as reason for caution.

γ-Aminobutyric acid (GABA) analog: pregabalin

Pregabalin is a novel second-generation α2δ ligand that is structurally related to γ-aminobutyric acid (GABA). It has been shown to be effective in the treatment of inflammatory and neuropathic pain. Its precise mechanism of action is incompletely understood because it does not appear to have GABA-related functional activity or metabolites; it is believed to decrease depolarization-induced calcium influx at nerve terminals, and thereby inhibit release of excitatory neurotransmitters, by acting on the α2δ auxiliary proteins associated with voltage-gated calcium channels. Its potential role in IBS is based upon a recent study demonstrating normalization of rectal distension sensory thresholds in IBS patients with rectal hypersensitivity. Placebo-controlled trials of pregabalin for IBS are currently ongoing.

Potential advances in the visceral modulation of IBS have come from studies of the role of opiate receptors in visceral pain. Specifically, peripheral κ-opioid receptor agonists are of great interest because they are involved in the inhibition of noxious stimuli from the gut and are devoid of many of the adverse side effects (e.g. constipation, opioid dependence) seen with other opioid agonists that bind to µ receptors; κ receptors are found most abundantly in the stomach and colon, as well as in the brain. Asimadoline, a novel selective κ-opioid receptor agonist, may be promising in the treatment of IBS. Its low blood–brain barrier permeability and low distribution in the central nervous system (CNS) suggest that its analgesic effects are mediated by reduced excitability of nociceptors on peripheral nerve endings. Human pharmacodynamic studies of asimadoline demonstrated attenuation of visceral sensation without affecting gut motor function, a decrease in satiation and postprandial fullness independent of effects on gastric volume, and attenuation of pain intensity to colonic distension in IBS subjects. These findings led to investigation of the possible role of asimadoline in IBS.
