
Archive for the ‘Intelligent Information Systems’ Category


The BioPharma Industry’s Unrealized Wealth of Data, by Ben Szekely, Vice President, Cambridge Semantics

Reporter: Aviva Lev-Ari, PhD, RN

 

 

The BioPharma Industry’s Unrealized Wealth of Data

by Ben Szekely, Vice President of Solutions and Pre-sales, Cambridge Semantics

 

The keys to solving the great medical challenges of our time reside within patient data. Clinical trial data, real-world evidence, patient feedback, genetic data, wearables data and adverse event reports all contain signals that can target medicines at the right patient populations, improve overall safety, and uncover the next blockbuster therapy for unmet medical needs.

However, these data sources are large, diverse, multi-structured, messy and highly regulated, presenting numerous challenges. As a result, extracting value from the data is slow and requires manual work or long-poll dependencies on IT and Data Science teams.

Fortunately, new approaches are being adopted to take better advantage of the ever-growing volumes of patient data. Called ‘Smart’ Patient Data Lakes (SPDL), these tools create an Enterprise Knowledge Graph built upon foundational, open Semantic Web technology standards, providing rich descriptions of data and end-to-end flexibility. With the SPDL, biopharma researchers can:

  • Quickly on-board new data from any source without up-front modeling or mapping, instead of spending weeks or months on preparation
  • Dynamically map and prepare data at analytics time
  • Horizontally scale cloud or on-premises infrastructure to hundreds of nodes, allowing billions of facts to be analyzed, queried and explored in real time
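To make the underlying approach concrete, here is a minimal, illustrative sketch (not a representation of any vendor’s product) of the open Semantic Web standards an Enterprise Knowledge Graph builds on: patient facts from different sources are on-boarded as RDF triples with no up-front relational schema, and the mapping happens at analytics time through a SPARQL query. It uses the open-source rdflib library, and every vocabulary term (ex:Patient, ex:hasAdverseEvent, and so on) is a hypothetical placeholder.

```python
# Illustrative sketch only: a toy patient knowledge graph built on open Semantic Web
# standards (RDF + SPARQL) via rdflib. The ex: vocabulary is hypothetical.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/spdl/")
g = Graph()
g.bind("ex", EX)

# On-board facts from different sources without an up-front schema:
# each fact is simply a subject-predicate-object triple.
g.add((EX.patient42, RDF.type, EX.Patient))
g.add((EX.patient42, EX.enrolledIn, EX.trialNCT001))                  # clinical trial data
g.add((EX.patient42, EX.hasAdverseEvent, EX.nausea))                  # adverse event report
g.add((EX.patient42, EX.hasVariant, Literal("BRCA1 c.68_69delAG")))   # genetic data

# Map and prepare at analytics time: find trial patients that have both a
# reported adverse event and a recorded variant.
query = """
SELECT ?patient ?trial ?event ?variant WHERE {
  ?patient a ex:Patient ;
           ex:enrolledIn ?trial ;
           ex:hasAdverseEvent ?event ;
           ex:hasVariant ?variant .
}
"""
for row in g.query(query, initNs={"ex": EX}):
    print(row.patient, row.trial, row.event, row.variant)
```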

The world’s BioPharma and research institutions are sitting on a wealth of highly differentiating and life-saving data and should begin to realize its value via Smart Patient Data Lakes (SPDL).

 

 

CONTACT: Nadia Haidar

Global Results Communications ∙ 949-278-7328 ∙ nhaidar@globalresultspr.com

 




Dr. Doudna: RNA synthesis capabilities of Synthego’s team represent a significant leap forward for Synthetic Biology

Reporter: Aviva Lev-Ari, PhD, RN

 

Synthego Raises $41 Million From Investors, Including a Top Biochemist

Synthego also drew in Dr. Doudna, who had crossed paths with the company’s head of synthetic biology at various industry conferences. According to Mr. Dabrowski, the money from her trust represents the single-biggest check from a non-institutional investor that the start-up has raised.

Synthego’s new funds will help the company take its products to a more global customer base, as well as broaden its offerings. The longer-term goal, Mr. Dabrowski said, is to help fully automate biotech research and take care of much of the laboratory work that scientists currently handle themselves.

The model is cloud technology, where companies rent out powerful remote server farms to handle their computing needs rather than rely on their own hardware.

“We’ll be able to do their full research workflow,” he said. “If you look at how cloud computing developed, it used to be that every company handled their server farm. Now it’s all handled in the cloud.”

SOURCE

Other related articles published in this Open Access Online Scientific Journal include the following:

UPDATED – Status “Interference — Initial memorandum” – CRISPR/Cas9 – The Biotech Patent Fight of the Century: UC, Berkeley and Broad Institute @MIT

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2016/01/06/status-interference-initial-memorandum-crisprcas9-the-biotech-patent-fight-of-the-century/

 



Imaging of Cancer Cells

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Microscope uses nanosecond-speed laser and deep learning to detect cancer cells more efficiently

April 13, 2016

Scientists at the California NanoSystems Institute at UCLA have developed a new technique for identifying cancer cells in blood samples faster and more accurately than the current standard methods.

In one common approach to testing for cancer, doctors add biochemicals to blood samples. Those biochemicals attach biological “labels” to the cancer cells, and those labels enable instruments to detect and identify them. However, the biochemicals can damage the cells and render the samples unusable for future analyses. There are other current techniques that don’t use labeling but can be inaccurate because they identify cancer cells based only on one physical characteristic.

Time-stretch quantitative phase imaging (TS-QPI) and analytics system

The new technique images cells without destroying them and can identify 16 physical characteristics — including size, granularity and biomass — instead of just one.

The new technique combines two components that were invented at UCLA:

A “photonic time stretch” microscope, which is capable of quickly imaging cells in blood samples. Invented by Bahram Jalali, professor and Northrop-Grumman Optoelectronics Chair in electrical engineering, it works by taking pictures of flowing blood cells using laser bursts (similar to how a camera uses a flash). Each flash only lasts nanoseconds (billionths of a second) to avoid damage to cells, but that normally means the images are both too weak to be detected and too fast to be digitized by normal instrumentation. The new microscope overcomes those challenges by using specially designed optics that amplify and boost the clarity of the images, and simultaneously slow them down enough to be detected and digitized at a rate of 36 million images per second.

A deep learning computer program, which identifies cancer cells with more than 95 percent accuracy. Deep learning is a form of artificial intelligence that uses complex algorithms to extract patterns and knowledge from rich multidimensional datasets, with the goal of achieving accurate decision making.

The study was published in the open-access journal Nature Scientific Reports. The researchers write in the paper that the system could lead to data-driven diagnoses by cells’ physical characteristics, which could allow quicker and earlier diagnoses of cancer, for example, and better understanding of the tumor-specific gene expression in cells, which could facilitate new treatments for disease.

The research was supported by NantWorks, LLC.

 

Abstract of Deep Learning in Label-free Cell Classification

Label-free cell analysis is essential to personalized genomics, cancer diagnostics, and drug development as it avoids adverse effects of staining reagents on cellular viability and cell signaling. However, currently available label-free cell assays mostly rely only on a single feature and lack sufficient differentiation. Also, the sample size analyzed by these assays is limited due to their low throughput. Here, we integrate feature extraction and deep learning with high-throughput quantitative imaging enabled by photonic time stretch, achieving record high accuracy in label-free cell classification. Our system captures quantitative optical phase and intensity images and extracts multiple biophysical features of individual cells. These biophysical measurements form a hyperdimensional feature space in which supervised learning is performed for cell classification. We compare various learning algorithms including artificial neural network, support vector machine, logistic regression, and a novel deep learning pipeline, which adopts global optimization of receiver operating characteristics. As a validation of the enhanced sensitivity and specificity of our system, we show classification of white blood T-cells against colon cancer cells, as well as lipid accumulating algal strains for biofuel production. This system opens up a new path to data-driven phenotypic diagnosis and better understanding of the heterogeneous gene expressions in cells.

references:

Claire Lifan Chen, Ata Mahjoubfar, Li-Chia Tai, Ian K. Blaby, Allen Huang, Kayvan Reza Niazi & Bahram Jalali. Deep Learning in Label-free Cell Classification. Scientific Reports 6, Article number: 21471 (2016); doi:10.1038/srep21471 (open access)

Supplementary Information

 

Deep Learning in Label-free Cell Classification

Claire Lifan Chen, Ata Mahjoubfar, Li-Chia Tai, Ian K. Blaby, Allen Huang, Kayvan Reza Niazi & Bahram Jalali

Scientific Reports 6, Article number: 21471 (2016)    http://dx.doi.org/10.1038/srep21471

Deep learning extracts patterns and knowledge from rich multidimensional datasets. While it is extensively used for image recognition and speech processing, its application to label-free classification of cells has not been exploited. Flow cytometry is a powerful tool for large-scale cell analysis due to its ability to measure anisotropic elastic light scattering of millions of individual cells as well as emission of fluorescent labels conjugated to cells1,2. However, each cell is represented by a single value per detection channel (forward scatter, side scatter, and emission bands) and often requires labeling with specific biomarkers for acceptable classification accuracy1,3. Imaging flow cytometry4,5 on the other hand captures images of cells, revealing significantly more information about the cells. For example, it can distinguish clusters and debris that would otherwise result in false positive identification in a conventional flow cytometer based on light scattering6.

In addition to classification accuracy, the throughput is another critical specification of a flow cytometer. Indeed, high throughput, typically 100,000 cells per second, is needed to screen a large enough cell population to find rare abnormal cells that are indicative of early stage diseases. However, there is a fundamental trade-off between throughput and accuracy in any measurement system7,8. For example, imaging flow cytometers face a throughput limit imposed by the speed of the CCD or the CMOS cameras, a number that is approximately 2000 cells/s for present systems9. Higher flow rates lead to blurred cell images due to the finite camera shutter speed. Many applications of flow analyzers such as cancer diagnostics, drug discovery, biofuel development, and emulsion characterization require classification of large sample sizes with a high degree of statistical accuracy10. This has fueled research into alternative optical diagnostic techniques for characterization of cells and particles in flow.

Recently, our group has developed a label-free imaging flow-cytometry technique based on coherent optical implementation of the photonic time stretch concept11. This instrument overcomes the trade-off between sensitivity and speed by using Amplified Time-stretch Dispersive Fourier Transform12,13,14,15. In time stretched imaging16, the object’s spatial information is encoded in the spectrum of laser pulses within a pulse duration of sub-nanoseconds (Fig. 1). Each pulse representing one frame of the camera is then stretched in time so that it can be digitized in real-time by an electronic analog-to-digital converter (ADC). The ultra-fast pulse illumination freezes the motion of high-speed cells or particles in flow to achieve blur-free imaging. Detection sensitivity is challenged by the low number of photons collected during the ultra-short shutter time (optical pulse width) and the drop in the peak optical power resulting from the time stretch. These issues are solved in time stretch imaging by implementing a low noise-figure Raman amplifier within the dispersive device that performs time stretching8,11,16. Moreover, warped stretch transform17,18 can be used in time stretch imaging to achieve optical image compression and nonuniform spatial resolution over the field-of-view19. In the coherent version of the instrument, the time stretch imaging is combined with spectral interferometry to measure quantitative phase and intensity images in real-time and at high throughput20. Integrated with a microfluidic channel, the coherent time stretch imaging system in this work measures both quantitative optical phase shift and loss of individual cells as a high-speed imaging flow cytometer, capturing 36 million images per second at flow rates as high as 10 meters per second, reaching up to 100,000 cells per second throughput.
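As a quick, purely illustrative consistency check of the figures quoted above (36 million line scans per second, flow up to 10 meters per second, up to 100,000 cells per second), the arithmetic works out roughly as follows:

```python
# Back-of-envelope check of the quoted throughput figures (illustrative only).
line_scan_rate = 36e6      # one-dimensional frames (rainbow flashes) per second
flow_speed = 10.0          # meters per second, the maximum quoted flow rate
cell_rate = 100_000        # cells per second throughput

line_spacing_um = flow_speed / line_scan_rate * 1e6   # spacing between line scans
lines_per_cell = line_scan_rate / cell_rate           # average line scans per cell

print(f"~{line_spacing_um:.2f} micrometers between consecutive line scans")  # ~0.28
print(f"~{lines_per_cell:.0f} line scans available per cell")                # ~360
```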

Figure 1: Time stretch quantitative phase imaging (TS-QPI) and analytics system; A mode-locked laser followed by a nonlinear fiber, an erbium doped fiber amplifier (EDFA), and a wavelength-division multiplexing (WDM) filter generate and shape a train of broadband optical pulses. http://www.nature.com/article-assets/npg/srep/2016/160315/srep21471/images_hires/m685/srep21471-f1.jpg

 

Box 1: The pulse train is spatially dispersed into a train of rainbow flashes illuminating the target as line scans. The spatial features of the target are encoded into the spectrum of the broadband optical pulses, each representing a one-dimensional frame. The ultra-short optical pulse illumination freezes the motion of cells during high speed flow to achieve blur-free imaging with a throughput of 100,000 cells/s. The phase shift and intensity loss at each location within the field of view are embedded into the spectral interference patterns using a Michelson interferometer. Box 2: The interferogram pulses were then stretched in time so that spatial information could be mapped into time through time-stretch dispersive Fourier transform (TS-DFT), and then captured by a single pixel photodetector and an analog-to-digital converter (ADC). The loss of sensitivity at high shutter speed is compensated by stimulated Raman amplification during time stretch. Box 3: (a) Pulse synchronization; the time-domain signal carrying serially captured rainbow pulses is transformed into a series of one-dimensional spatial maps, which are used for forming line images. (b) The biomass density of a cell leads to a spatially varying optical phase shift. When a rainbow flash passes through the cells, the changes in refractive index at different locations will cause phase walk-off at interrogation wavelengths. Hilbert transformation and phase unwrapping are used to extract the spatial phase shift. (c) Decoding the phase shift in each pulse at each wavelength and remapping it into a pixel reveals the protein concentration distribution within cells. The optical loss induced by the cells, embedded in the pulse intensity variations, is obtained from the amplitude of the slowly varying envelope of the spectral interferograms. Thus, quantitative optical phase shift and intensity loss images are captured simultaneously. Both images are calibrated based on the regions where the cells are absent. Cell features describing morphology, granularity, biomass, etc are extracted from the images. (d) These biophysical features are used in a machine learning algorithm for high-accuracy label-free classification of the cells.
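The sketch below is a minimal, generic illustration of the phase-recovery step described in Box 3(b): form the analytic signal of an interferogram line with a Hilbert transform, take its angle, and unwrap the phase. It uses NumPy/SciPy on a synthetic fringe pattern with an invented carrier and phase bump; it is not the authors’ processing code.

```python
# Minimal sketch of Hilbert-transform phase extraction and unwrapping (Box 3b).
# Generic NumPy/SciPy stand-in; the signal parameters below are invented.
import numpy as np
from scipy.signal import hilbert

def extract_phase(interferogram_line: np.ndarray) -> np.ndarray:
    """Return the unwrapped spatial phase along one line scan."""
    analytic = hilbert(interferogram_line)   # complex analytic signal
    wrapped = np.angle(analytic)             # phase wrapped to (-pi, pi]
    return np.unwrap(wrapped)                # remove 2*pi discontinuities

# Synthetic example: carrier fringes perturbed by a hypothetical cell-induced phase bump.
x = np.linspace(0.0, 1.0, 2048)
cell_phase = 2.5 * np.exp(-((x - 0.5) ** 2) / 0.005)
signal = np.cos(2 * np.pi * 80 * x + cell_phase)

recovered = extract_phase(signal) - 2 * np.pi * 80 * x   # subtract the linear carrier
# 'recovered' now approximates cell_phase away from the edges of the window.
```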

On another note, surface markers used to label cells, such as EpCAM21, are unavailable in some applications; for example, melanoma or pancreatic circulating tumor cells (CTCs) as well as some cancer stem cells are EpCAM-negative and will escape EpCAM-based detection platforms22. Furthermore, large-population cell sorting opens the doors to downstream operations, where the negative impacts of labels on cellular behavior and viability are often unacceptable23. Cell labels may cause activating/inhibitory signal transduction, altering the behavior of the desired cellular subtypes, potentially leading to errors in downstream analysis, such as DNA sequencing and subpopulation regrowth. Thus, quantitative phase imaging (QPI) methods24,25,26,27 that categorize unlabeled living cells with high accuracy are needed. Coherent time stretch imaging is a method that enables quantitative phase imaging at ultrahigh throughput for non-invasive label-free screening of large numbers of cells.

In this work, the information from quantitative optical loss and phase images is fused into expert-designed features, leading to a record label-free classification accuracy when combined with deep learning. Image mining techniques are applied, for the first time, to time stretch quantitative phase imaging to measure biophysical attributes including protein concentration, optical loss, and morphological features of single cells at an ultrahigh flow rate and in a label-free fashion. These attributes differ widely28,29,30,31 among cells and their variations reflect important information about genotypes and physiological stimuli32. The multiplexed biophysical features thus lead to an information-rich hyper-dimensional representation of the cells for label-free classification with high statistical precision.

We further improved the accuracy, repeatability, and the balance between sensitivity and specificity of our label-free cell classification by a novel machine learning pipeline, which harnesses the advantages of multivariate supervised learning, as well as unique training by evolutionary global optimization of receiver operating characteristics (ROC). To demonstrate sensitivity, specificity, and accuracy of multi-feature label-free flow cytometry using our technique, we classified (1) OT-II hybridoma T-lymphocytes and SW-480 colon cancer epithelial cells, and (2) Chlamydomonas reinhardtii algal cells (herein referred to as Chlamydomonas) based on their lipid content, which is related to the yield in biofuel production. Our preliminary results show that compared to classification by individual biophysical parameters, our label-free hyperdimensional technique improves the detection accuracy from 77.8% to 95.5%, or in other words, reduces the classification inaccuracy by about five times.     ……..

 

Feature Extraction

The decomposed components of sequential line scans form pairs of spatial maps, namely, optical phase and loss images as shown in Fig. 2 (see Section Methods: Image Reconstruction). These images are used to obtain biophysical fingerprints of the cells8,36. With domain expertise, raw images are fused and transformed into a suitable set of biophysical features, listed in Table 1, which the deep learning model further converts into learned features for improved classification.


Figure 2 (quantitative optical phase and loss images): http://www.nature.com/article-assets/npg/srep/2016/160315/srep21471/images_hires/m685/srep21471-f2.jpg

The optical loss images of the cells are affected by the attenuation of multiplexed wavelength components passing through the cells. The attenuation itself is governed by the absorption of the light in cells as well as the scattering from the surface of the cells and from the internal cell organelles. The optical loss image is derived from the low frequency component of the pulse interferograms. The optical phase image is extracted from the analytic form of the high frequency component of the pulse interferograms using Hilbert Transformation, followed by a phase unwrapping algorithm. Details of these derivations can be found in Section Methods. Also, supplementary Videos 1 and 2 show measurements of cell-induced optical path length difference by TS-QPI at four different points along the rainbow for OT-II and SW-480, respectively.

Table 1: List of extracted features (columns: Feature Name, Description, Category).

 

Figure 3: Biophysical features formed by image fusion.

(a) Pairwise correlation matrix visualized as a heat map. The map depicts the correlation between all 16 major features extracted from the quantitative images. Diagonal elements of the matrix represent correlation of each parameter with itself, i.e. the autocorrelation. The subsets in box 1, box 2, and box 3 show high correlation because they are mainly related to morphological, optical phase, and optical loss feature categories, respectively. (b) Ranking of biophysical features based on their AUCs in single-feature classification. Blue bars show performance of the morphological parameters, which include diameter along the interrogation rainbow, diameter along the flow direction, tight cell area, loose cell area, perimeter, circularity, major axis length, orientation, and median radius. As expected, morphology contains the most information, but other biophysical features can contribute to improved performance of label-free cell classification. Orange bars show optical phase shift features, i.e. optical path length differences and refractive index difference. Green bars show optical loss features representing scattering and absorption by the cell. The best-performing feature in each of these three categories is marked in red.
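A minimal sketch of the two analyses behind Figure 3, written against a hypothetical feature matrix (synthetic data, not the study’s measurements): (a) the pairwise correlation matrix of the 16 biophysical features, and (b) a ranking of features by the AUC each achieves on its own.

```python
# Illustrative sketch of Figure 3's analyses on synthetic placeholder data.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_cells = 1000
y = rng.integers(0, 2, size=n_cells)      # 0 = one cell type, 1 = the other
X = rng.normal(size=(n_cells, 16))        # 16 biophysical features per cell
X[:, :3] += 0.8 * y[:, None]              # make a few features mildly informative

corr = np.corrcoef(X, rowvar=False)       # (a) 16 x 16 correlation matrix (heat-map values)

# (b) single-feature AUCs, flipping features that rank in the inverse direction
aucs = [max(roc_auc_score(y, X[:, j]), 1.0 - roc_auc_score(y, X[:, j]))
        for j in range(X.shape[1])]
ranking = np.argsort(aucs)[::-1]          # best-performing single feature first
print([f"feature {j}: AUC={aucs[j]:.2f}" for j in ranking[:5]])
```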

Figure 4: Machine learning pipeline. Information from the quantitative optical phase and loss images is fused to extract multivariate biophysical features of each cell, which are fed into a fully-connected neural network.

The neural network maps input features by a chain of weighted sum and nonlinear activation functions into learned feature space, convenient for classification. This deep neural network is globally trained via area under the curve (AUC) of the receiver operating characteristics (ROC). Each ROC curve corresponds to a set of weights for connections to an output node, generated by scanning the weight of the bias node. The training process maximizes AUC, pushing the ROC curve toward the upper left corner, which means improved sensitivity and specificity in classification.
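The sketch below restates that idea in generic scikit-learn terms: a small fully-connected network scores each cell from its fused features, and the ROC curve is traced by sweeping the decision threshold on the output score (a simple stand-in for scanning the bias-node weight); the resulting AUC is the quantity the training seeks to maximize. The data, architecture, and training method here are illustrative assumptions, not the paper’s pipeline.

```python
# Illustrative sketch: score cells with a small neural network, trace the ROC, report AUC.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (500, 16)),    # hypothetical cell class A
               rng.normal(0.7, 1.0, (500, 16))])   # hypothetical cell class B
y = np.array([0] * 500 + [1] * 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
net.fit(X_tr, y_tr)

scores = net.predict_proba(X_te)[:, 1]     # output-node score for each test cell
fpr, tpr, _ = roc_curve(y_te, scores)      # ROC traced by sweeping the threshold
print("AUC =", auc(fpr, tpr))              # pushing this toward 1 improves sensitivity and specificity
```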

….   How to cite this article: Chen, C. L. et al. Deep Learning in Label-free Cell Classification.

Sci. Rep. 6, 21471; http://dx.doi.org/10.1038/srep21471

 

Computer Algorithm Helps Characterize Cancerous Genomic Variations

http://www.genengnews.com/gen-news-highlights/computer-algorithm-helps-characterize-cancerous-genomic-variations/81252626/

To better characterize the functional context of genomic variations in cancer, researchers developed a new computer algorithm called REVEALER. [UC San Diego Health]

Scientists at the University of California San Diego School of Medicine and the Broad Institute say they have developed a new computer algorithm—REVEALER—to better characterize the functional context of genomic variations in cancer. The tool, described in a paper (“Characterizing Genomic Alterations in Cancer by Complementary Functional Associations”) published in Nature Biotechnology, is designed to help researchers identify groups of genetic variations that together associate with a particular way cancer cells get activated, or how they respond to certain treatments.

REVEALER is available for free to the global scientific community via the bioinformatics software portal GenePattern.org.

“This computational analysis method effectively uncovers the functional context of genomic alterations, such as gene mutations, amplifications, or deletions, that drive tumor formation,” said senior author Pablo Tamayo, Ph.D., professor and co-director of the UC San Diego Moores Cancer Center Genomics and Computational Biology Shared Resource.

Dr. Tamayo and team tested REVEALER using The Cancer Genome Atlas (TCGA), the NIH’s database of genomic information from more than 500 human tumors representing many cancer types. REVEALER revealed gene alterations associated with the activation of several cellular processes known to play a role in tumor development and response to certain drugs. Some of these gene mutations were already known, but others were new.

For example, the researchers discovered new activating genomic abnormalities for beta-catenin, a cancer-promoting protein, and for the oxidative stress response that some cancers hijack to increase their viability.

REVEALER requires as input high-quality genomic data and a significant number of cancer samples, which can be a challenge, according to Dr. Tamayo. But REVEALER is more sensitive at detecting similarities between different types of genomic features and less dependent on simplifying statistical assumptions, compared to other methods, he adds.

“This study demonstrates the potential of combining functional profiling of cells with the characterizations of cancer genomes via next-generation sequencing,” said co-senior author Jill P. Mesirov, Ph.D., professor and associate vice chancellor for computational health sciences at UC San Diego School of Medicine.

 

Characterizing genomic alterations in cancer by complementary functional associations

Jong Wook Kim, Olga B Botvinnik, Omar Abudayyeh, Chet Birger, et al.

Nature Biotechnology (2016)              http://dx.doi.org/10.1038/nbt.3527

Systematic efforts to sequence the cancer genome have identified large numbers of mutations and copy number alterations in human cancers. However, elucidating the functional consequences of these variants, and their interactions to drive or maintain oncogenic states, remains a challenge in cancer research. We developed REVEALER, a computational method that identifies combinations of mutually exclusive genomic alterations correlated with functional phenotypes, such as the activation or gene dependency of oncogenic pathways or sensitivity to a drug treatment. We used REVEALER to uncover complementary genomic alterations associated with the transcriptional activation of β-catenin and NRF2, MEK-inhibitor sensitivity, and KRAS dependency. REVEALER successfully identified both known and new associations, demonstrating the power of combining functional profiles with extensive characterization of genomic alterations in cancer genomes.
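As a heavily simplified, purely illustrative sketch of the kind of search REVEALER performs (not the published algorithm, which uses a conditional information coefficient and is distributed via GenePattern), the code below greedily accumulates binary alterations whose union best tracks a continuous target profile; the data are synthetic and plain correlation stands in for the real association measure.

```python
# Toy greedy search for complementary genomic alterations (illustrative stand-in only).
import numpy as np

def greedy_complementary_search(target, alterations, n_rounds=3):
    """target: (n_samples,) profile; alterations: dict name -> binary (n_samples,) array."""
    selected = []
    summary = np.zeros_like(target)
    for _ in range(n_rounds):
        best_name, best_assoc = None, -np.inf
        for name, a in alterations.items():
            if name in selected:
                continue
            candidate = np.maximum(summary, a)        # union with already-selected alterations
            if candidate.std() == 0:
                continue
            assoc = abs(np.corrcoef(candidate, target)[0, 1])
            if assoc > best_assoc:
                best_name, best_assoc = name, assoc
        if best_name is None:
            break
        selected.append(best_name)
        summary = np.maximum(summary, alterations[best_name])
    return selected

rng = np.random.default_rng(0)
n = 200
alt = {f"gene{i}": (rng.random(n) < 0.1).astype(float) for i in range(20)}
target = 2.0 * np.maximum(alt["gene0"], alt["gene1"]) + rng.normal(0, 0.5, n)
print(greedy_complementary_search(target, alt))   # expected to pick gene0 and gene1 early
```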

 

Figure 2: REVEALER results for transcriptional activation of β-catenin in cancer.

(a) This heatmap illustrates the use of the REVEALER approach to find complementary genomic alterations that match the transcriptional activation of β-catenin in cancer. The target profile is a TCF4 reporter that provides an estimate of…

 

An imaging-based platform for high-content, quantitative evaluation of therapeutic response in 3D tumour models

Jonathan P. Celli, Imran Rizvi, Adam R. Blanden, Iqbal Massodi, Michael D. Glidden, Brian W. Pogue & Tayyaba Hasan

Scientific Reports 4, 3751 (2014)    http://dx.doi.org/10.1038/srep03751

While it is increasingly recognized that three-dimensional (3D) cell culture models recapitulate drug responses of human cancers with more fidelity than monolayer cultures, a lack of quantitative analysis methods limits their implementation for reliable and routine assessment of emerging therapies. Here, we introduce an approach based on computational analysis of fluorescence image data to provide high-content readouts of dose-dependent cytotoxicity, growth inhibition, treatment-induced architectural changes and size-dependent response in 3D tumour models. We demonstrate this approach in adherent 3D ovarian and pancreatic multiwell extracellular matrix tumour overlays subjected to a panel of clinically relevant cytotoxic modalities and appropriately designed controls for reliable quantification of fluorescence signal. This streamlined methodology reads out the high density of information embedded in 3D culture systems, while maintaining a level of speed and efficiency traditionally achieved with global colorimetric reporters in order to facilitate broader implementation of 3D tumour models in therapeutic screening.

The attrition rates for preclinical development of oncology therapeutics are particularly dismal due to a complex set of factors which includes 1) the failure of pre-clinical models to recapitulate determinants of in vivo treatment response, and 2) the limited ability of available assays to extract treatment-specific data integral to the complexities of therapeutic responses1,2,3. Three-dimensional (3D) tumour models have been shown to restore crucial stromal interactions which are missing in the more commonly used 2D cell culture and that influence tumour organization and architecture4,5,6,7,8, as well as therapeutic response9,10, multicellular resistance (MCR)11,12, drug penetration13,14, hypoxia15,16, and anti-apoptotic signaling17. However, such sophisticated models can only have an impact on therapeutic guidance if they are accompanied by robust quantitative assays, not only for cell viability but also for providing mechanistic insights related to the outcomes. While numerous assays for drug discovery exist18, they are generally not developed for use in 3D systems and are often inherently unsuitable. For example, colorimetric conversion products have been noted to bind to extracellular matrix (ECM)19 and traditional colorimetric cytotoxicity assays reduce treatment response to a single number reflecting a biochemical event that has been equated to cell viability (e.g. tetrazolium salt conversion20). Such approaches fail to provide insight into the spatial patterns of response within colonies, morphological or structural effects of drug response, or how overall culture viability may be obscuring the status of sub-populations that are resistant or partially responsive. Hence, the full benefit of implementing 3D tumour models in therapeutic development has yet to be realized for lack of analytical methods that describe the very aspects of treatment outcome that these systems restore.

Motivated by these factors, we introduce a new platform for quantitative in situ treatment assessment (qVISTA) in 3D tumour models based on computational analysis of information-dense biological image datasets (bioimage-informatics)21,22. This methodology provides software end-users with multiple levels of complexity in output content, from rapidly-interpreted dose response relationships to higher content quantitative insights into treatment-dependent architectural changes, spatial patterns of cytotoxicity within fields of multicellular structures, and statistical analysis of nodule-by-nodule size-dependent viability. The approach introduced here is cognizant of tradeoffs between optical resolution, data sampling (statistics), depth of field, and widespread usability (instrumentation requirement). Specifically, it is optimized for interpretation of fluorescent signals for disease-specific 3D tumour micronodules that are sufficiently small that thousands can be imaged simultaneously with little or no optical bias from widefield integration of signal along the optical axis of each object. At the core of our methodology is the premise that the copious numerical readouts gleaned from segmentation and interpretation of fluorescence signals in these image datasets can be converted into usable information to classify treatment effects comprehensively, without sacrificing the throughput of traditional screening approaches. It is hoped that this comprehensive treatment-assessment methodology will have significant impact in facilitating more sophisticated implementation of 3D cell culture models in preclinical screening by providing a level of content and biological relevance impossible with existing assays in monolayer cell culture in order to focus therapeutic targets and strategies before costly and tedious testing in animal models.

Using two different cell lines and as depicted in Figure 1, we adopt an ECM overlay method pioneered originally for 3D breast cancer models23, and developed in previous studies by us to model micrometastatic ovarian cancer19,24. This system leads to the formation of adherent multicellular 3D acini in approximately the same focal plane atop a laminin-rich ECM bed, implemented here in glass-bottom multiwell imaging plates for automated microscopy. The 3D nodules resultant from restoration of ECM signaling5,8 are heterogeneous in size24, in contrast to other 3D spheroid methods, such as rotary or hanging drop cultures10, in which cells are driven to aggregate into uniformly sized spheroids due to lack of an appropriate substrate to adhere to. Although the latter processes are also biologically relevant, it is the adherent tumour populations characteristic of advanced metastatic disease that are more likely to be managed with medical oncology, which are the focus of therapeutic evaluation herein. The heterogeneity in 3D structures formed via ECM overlay is validated here by endoscopic imaging of in vivo tumours in orthotopic xenografts derived from the same cells (OVCAR-5).

 

Figure 1: A simplified schematic flow chart of imaging-based quantitative in situ treatment assessment (qVISTA) in 3D cell culture.

(This figure was prepared in Adobe Illustrator® software by MD Glidden, JP Celli and I Rizvi). A detailed breakdown of the image processing (Step 4) is provided in Supplemental Figure 1.

A critical component of the imaging-based strategy introduced here is the rational tradeoff of image-acquisition parameters for field of view, depth of field and optical resolution, and the development of image processing routines for appropriate removal of background, scaling of fluorescence signals from more than one channel and reliable segmentation of nodules. Obtaining depth-resolved 3D structures for each nodule at sub-micron lateral resolution using a laser-scanning confocal system would require ~40 hours (at approximately 100 fields for each well with a 20× objective, times 1 minute/field for a coarse z-stack, times 24 wells) to image a single plate with the same coverage achieved in this study. Even if the resources were available to devote to such time-intensive image acquisition, not to mention the processing, the optical properties of the fluorophores would change during the required time frame for image acquisition, even with environmental controls to maintain culture viability during such extended imaging. The approach developed here, with a mind toward adaptation into high throughput screening, provides a rational balance of speed, requiring less than 30 minutes/plate, and statistical rigour, providing images of thousands of nodules in this time, as required for the high-content analysis developed in this study. These parameters can be further optimized for specific scenarios. For example, we obtain the same number of images in a 96 well plate as for a 24 well plate by acquiring only a single field from each well, rather than 4 stitched fields. This quadruples the number of conditions assayed in a single run, at the expense of the number of nodules per condition, and therefore the ability to obtain statistical data sets for size-dependent response, Dfrac and other segmentation-dependent numerical readouts.
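A minimal sketch of the kind of segmentation and per-nodule readout described above, written with scikit-image under illustrative assumptions (two fluorescence channels, a global Otsu threshold, and a simple live/dead intensity ratio); it is a generic sketch, not the qVISTA implementation.

```python
# Illustrative per-nodule readout: segment nodules and report size and a viability ratio.
import numpy as np
from skimage.filters import gaussian, threshold_otsu
from skimage.measure import label, regionprops

def nodule_readouts(live_channel: np.ndarray, dead_channel: np.ndarray):
    """Segment nodules on the live channel; return per-nodule area and viability."""
    smoothed = gaussian(live_channel, sigma=2)        # suppress background noise
    mask = smoothed > threshold_otsu(smoothed)        # global foreground threshold
    labels = label(mask)                              # one integer label per nodule
    results = []
    for region in regionprops(labels):
        rr, cc = region.coords[:, 0], region.coords[:, 1]
        live = float(live_channel[rr, cc].sum())
        dead = float(dead_channel[rr, cc].sum())
        results.append({
            "area_px": region.area,                       # size-dependent response
            "viability": live / (live + dead + 1e-9),     # hypothetical live/dead ratio
        })
    return results
```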

 

We envision that the system for high-content interrogation of therapeutic response in 3D cell culture could have widespread impact in multiple arenas from basic research to large scale drug development campaigns. As such, the treatment assessment methodology presented here does not require extraordinary optical instrumentation or computational resources, making it widely accessible to any research laboratory with an inverted fluorescence microscope and modestly equipped personal computer. And although we have focused here on cancer models, the methodology is broadly applicable to quantitative evaluation of other tissue models in regenerative medicine and tissue engineering. While this analysis toolbox could have impact in facilitating the implementation of in vitro 3D models in preclinical treatment evaluation in smaller academic laboratories, it could also be adopted as part of the screening pipeline in large pharma settings. With the implementation of appropriate temperature controls to handle basement membranes in current robotic liquid handling systems, our analyses could be used in ultra high-throughput screening. In addition to removing non-efficacious potential candidate drugs earlier in the pipeline, this approach could also yield the additional economic advantage of minimizing the use of costly time-intensive animal models through better estimates of dose range, sequence and schedule for combination regimens.

 

Microscope Uses AI to Find Cancer Cells More Efficiently

Thu, 04/14/2016 – by Shaun Mason

http://www.mdtmag.com/news/2016/04/microscope-uses-ai-find-cancer-cells-more-efficiently

Scientists at the California NanoSystems Institute at UCLA have developed a new technique for identifying cancer cells in blood samples faster and more accurately than the current standard methods.

In one common approach to testing for cancer, doctors add biochemicals to blood samples. Those biochemicals attach biological “labels” to the cancer cells, and those labels enable instruments to detect and identify them. However, the biochemicals can damage the cells and render the samples unusable for future analyses.

There are other current techniques that don’t use labeling but can be inaccurate because they identify cancer cells based only on one physical characteristic.

The new technique images cells without destroying them and can identify 16 physical characteristics — including size, granularity and biomass — instead of just one. It combines two components that were invented at UCLA: a photonic time stretch microscope, which is capable of quickly imaging cells in blood samples, and a deep learning computer program that identifies cancer cells with over 95 percent accuracy.

Deep learning is a form of artificial intelligence that uses complex algorithms to extract meaning from data with the goal of achieving accurate decision making.

The study, which was published in the journal Nature Scientific Reports, was led by Bahram Jalali, professor and Northrop-Grumman Optoelectronics Chair in electrical engineering; Claire Lifan Chen, a UCLA doctoral student; and Ata Mahjoubfar, a UCLA postdoctoral fellow.

Photonic time stretch was invented by Jalali, and he holds a patent for the technology. The new microscope is just one of many possible applications; it works by taking pictures of flowing blood cells using laser bursts in the way that a camera uses a flash. This process happens so quickly — in nanoseconds, or billionths of a second — that the images would be too weak to be detected and too fast to be digitized by normal instrumentation.

The new microscope overcomes those challenges using specially designed optics that boost the clarity of the images and simultaneously slow them enough to be detected and digitized at a rate of 36 million images per second. It then uses deep learning to distinguish cancer cells from healthy white blood cells.

“Each frame is slowed down in time and optically amplified so it can be digitized,” Mahjoubfar said. “This lets us perform fast cell imaging that the artificial intelligence component can distinguish.”

Normally, taking pictures in such minuscule periods of time would require intense illumination, which could destroy live cells. The UCLA approach also eliminates that problem.

“The photonic time stretch technique allows us to identify rogue cells in a short time with low-level illumination,” Chen said.

The researchers write in the paper that the system could lead to data-driven diagnoses by cells’ physical characteristics, which could allow quicker and earlier diagnoses of cancer, for example, and better understanding of the tumor-specific gene expression in cells, which could facilitate new treatments for disease.   …..  see also http://www.nature.com/article-assets/npg/srep/2016/160315/srep21471/images_hires/m685/srep21471-f1.jpg

Chen, C. L. et al. Deep Learning in Label-free Cell Classification.    Sci. Rep. 6, 21471;   http://dx.doi.org/10.1038/srep21471

 

 



How do we address medical diagnostic errors?

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

It is my observation over the course of 50 years in medicine that there is no simple resolution to this frequently identified problem, which has grown more serious with the implementation of hospital-wide and system-wide health information technology. Hospital staff, nurses and physicians have been reduced while patient loads have increased, even as the technology has advanced significantly. Despite this investment in technology and manpower, hospital staff have become dependent on the medical record, and a large portion of their time is drawn away from patient care. This has not improved substantially in the last decade. The problem has been a lack of focus on the correct design of these systems: they should be interactive and should incorporate anticipatory logic. Instead, they are not much more than large electronic filing cabinets. The large sources of information in laboratory, radiology and pharmacy remain siloed. When one drives a car, one does not need to look under the hood, a sentiment that has been expressed by a real expert.

 

Diagnostic Errors Get the Attention of the Institute of Medicine, Reinforcing Efforts by Nation’s Clinical Pathology Laboratory Scientists to Improve Patient Safety

March 25, 2016         DARK DAILY, info@darkreport.com

 

Along with its assessment of the rate of errors in diagnosis, the IOM has a plan to improve, but will doctors accept the IOM’s advice, or continue business as usual?

Diagnostic errors in the American healthcare system are a problem that is now on the radar screen of policymakers at the Institute of Medicine (IOM). Pathologists and clinical laboratory professionals will welcome this development, because recommendations from the IOM carry weight with Congress.

Thus, should the IOM develop specific action items intended to reduce medical errors, not only are these suggestions likely to involve more effective use of medical laboratory tests by physicians, but there is a strong probability that Congress might eventually write these recommendations into future healthcare legislation.

The Institute of Medicine is a division of the National Academies of Sciences, Engineering, and Medicine. The IOM recently convened a committee that released a list of recommendations to address the problem of diagnostic errors in medicine. Those recommendations, however, are running up against ingrained mindsets and overconfidence on the part of physicians who are reluctant to include decision-support technology in the diagnostic process.

Misdiagnoses in healthcare have led to tragic consequences. Over the years, the IOM has worked to increase the public’s awareness of the problem of misdiagnosis. It has also encouraged health information technology (HIT) developers to provide clinicians with the decision-support systems they need to prevent misdiagnosis from happening in the first place. But getting physicians to use the new tools has become a challenge.

IOM Dedicated to Improving Diagnostic Accuracy 

Preventing diagnostic errors has been a primary target in medical reform for many years. A 1999 report by the U.S. Institute of Medicine (IOM) titled “To Err Is Human: Building a Safer Health System” first drew attention to the rate of preventable medical errors in the U.S.

Since then, much has been written about the problem of diagnostic inaccuracy. Although many provider organizations are working to improve it, progress has been slow.


Edward Hoffer, MD, FACP, FACC, FACMI, is a faculty member at the Massachusetts General Hospital Laboratory of Computer Science. He told Modern Healthcare that getting doctors to use diagnostic software is a hard sell. “The main problem we face is trying to convince physicians that they actually need to use it,” he concluded. (Image copyright: Massachusetts General Hospital, Laboratory of Computer Science.)

Solutions for Preventing Errors Face Many Challenges

From the first evaluation of the patient, to the clinical laboratory, to physician follow-up, errors can occur throughout the chain of care. And a culture that discourages reporting of medical errors can make it very difficult for healthcare organizations to resolve these issues.

For example, an investigation by the Milwaukee Journal Sentinel uncovered what they say is a “secretive system [that] hides [medical] lab errors from the public and puts patients at risk.”

To make matters worse, according to the Journal Sentinel article, federal agencies that are tasked with oversight of the more than 35,000 clinical laboratories in the U.S. (such as the Joint Commission) are overloaded and thus may be missing many violations that would lead to sanctions or outright loss of accreditation.

The Milwaukee Journal Sentinel stated that, “even when serious violations are identified, offending labs are rarely sanctioned except in the most extreme cases.” And that “in 2013, just 90 sanctions were issued—accounting for not even 1% of the 35,000 labs that do high-level lab testing in the United States.”

Additionally, there are different types of misdiagnoses, which also complicates matters. A study by three professors at the Dartmouth Center for Healthcare Delivery Science investigated the cost of what they termed “silent misdiagnoses.”

According to a Dartmouth Now article outlining the study, the disparity between “the treatments patients want and what doctors think they want” accounts for a great number of misdiagnoses. The study’s authors call these misdiagnoses “silent” because they are rarely reported or recorded.


Albert G. Mulley, Jr., MD, MPP, is Managing Director, Global Health Care Delivery Science, The Dartmouth Institute for Health Policy and Clinical Practice; Professor of Medicine, Geisel School of Medicine at Dartmouth; and the lead author of the Dartmouth study. Mulley says that listening to the patient is more important than ever. “Today, the rise in treatment options makes this even more critical, not only to reach a correct medical diagnosis but also to understand fully the patients’ preferences, and reduce the huge waste in time and money that comes from delivery of services that patients often neither want nor need.” (Photo copyright: Dartmouth Now.)

IOM Committee on Diagnostic Error in Healthcare Releases Recommendations

In September 2015, the IOM released the latest report in its series on medical errors and diagnostic problems, a series that goes back 17 years. The series started in 1999 with “To Err Is Human: Building a Safer Health System.” This was followed in 2001 by “Crossing the Quality Chasm: A New Health System for the 21st Century.”

This newest report is titled “Improving Diagnosis in Health Care.” It comes from the National Academies of Medicine’s Committee on Diagnostic Error in Healthcare. The study lists recommendations that the IOM says will help reduce the rate of diagnostic errors. Included are:

• Facilitate more effective teamwork in the diagnostic process;

• Enhance education in the healthcare community on the topic of improving diagnostics;

• Incorporate diagnostic support into electronic health records (EHRs), and make EHRs fully interoperable;

• Set up systems to identify, learn from, and reduce diagnostic errors;

• Establish a culture that encourages investigation of mistakes;

• Develop a collaborative reporting system that involves clinicians, government agencies, and liability insurers that allows everyone to learn from diagnostic errors and near misses;

• Design a payment system that supports correct diagnosis; and

• Provide funding for research into how to improve the diagnostic process.


The IOM publicly released the Committee on Diagnostic Error in Healthcare’s report, “Improving Diagnosis in Health Care,” in September 2015. A full text of the report can be downloaded at http://iom.nationalacademies.org/reports/2015/improving-diagnosis-in-healthcare.

Technology Exists to Support Accurate Diagnoses

One area of development that offers real hope for lowering the rate of misdiagnosis is medical decision-support software—specifically in clinical informatics, where the integration of health data with health information technology (HIT) supports physician decision making at the point of care.

Whether that software is part of an organization’s EHR system, or is a standalone program dedicated to reducing diagnostic errors, can make a difference. For example, a research study published in the Journal of Clinical Bioinformatics suggests that health information technology has an important role to play in improving diagnostic accuracy.

The challenge is that, although the technology currently exists, diagnostic error rates remain high. A research study published in JAMA Internal Medicine in 2013 may explain why. It states that overconfidence could prevent some physicians from re-evaluating questionable diagnoses.

In a Modern Healthcare article, Edward Hoffer, MD, FACP, FACC, FACMI, a faculty member at the Massachusetts General Hospital Laboratory of Computer Science, said that getting doctors to use diagnostic software is a hard sell. “The main problem we face is trying to convince physicians that they actually need to use it,” he concluded.

All of the building blocks for improving the diagnostic process are either in place or, as in the case of EHR interoperability, in the works. Clinical laboratory personnel are uniquely positioned to assist physicians in improving the diagnostic process through communication and the careful use of technology.

—Dava Stewart

Related Information:

To Err Is Human

Weak Oversight Allows Lab Failures to Put Patients at Risk  

Dartmouth Study: ‘Silent’ Misdiagnoses by Doctors Are Common, and Come at Great Cost 

Press Release: Doctors ‘Silent’ Misdiagnoses Cost Patients Dearly . . . And US Health Care Billions

Improving Diagnosis in Health Care

Clinical Decision Support Systems for Improving Diagnostic Accuracy and Achieving Precision Medicine 

Using Software To Avoid Misdiagnoses 

In Conversation with…Mark L. Graber, MD 

 

“All of the building blocks for improving the diagnostic process are either in place or, as in the case of EHR interoperability, in the works.” Nothing could be further from reality than this statement. There is no minimization of keystrokes. There is no anticipatory logic. There is no automated electronic comparison of medications and diagnoses. There is no check of the consistency of the laboratory results against the probability of a diagnosis, and no automated identification of the tests that would clarify the situation.
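To make the criticism concrete, here is a purely hypothetical sketch of the sort of anticipatory consistency check being described; the diagnoses, medications, and rules are invented examples for illustration, not clinical logic.

```python
# Hypothetical anticipatory-logic rules: flag inconsistencies among diagnoses,
# medications, and laboratory results, and suggest a clarifying test.
def consistency_flags(diagnoses, medications, labs):
    flags = []
    if "type_2_diabetes" in diagnoses:
        if labs.get("hba1c") is None:
            flags.append("Diabetes coded but no HbA1c on record; suggest ordering one.")
        if not any(m in medications for m in ("metformin", "insulin")):
            flags.append("Diabetes coded but no glucose-lowering medication listed.")
    if "warfarin" in medications and labs.get("inr") is None:
        flags.append("Warfarin listed but no INR result; suggest INR.")
    return flags

print(consistency_flags(
    diagnoses={"type_2_diabetes"},
    medications={"lisinopril", "warfarin"},
    labs={"hba1c": None},
))
```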



Middleware

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

 

Broadening the productivity spectrum with middleware

Anne Paxton

March 2016—As James Beck, MT(ASCP), remembers it, middleware was introduced at his institution about the same time that the nursing department decided connectivity should be the province of the laboratory.

When the concept of docking and interfacing glucose testing devices came on the scene around the turn of the millennium, that was a turning point, says Beck, who is point-of-care testing coordinator for the University of Pittsburgh Medical Center–St. Margaret, which uses the Telcor middleware solution QML. “What had previously been a nursing-run program turned into a lab-run program just because of their unfamiliarity with the electronics and connectivity issues. Our nursing department, at least, felt, okay, this is the end of line for us—you take it on.”

At that point, middleware was considered revolutionary in its ability just to handle getting point-of-care data to the laboratory information system, never mind to the hospital information system. But between its beginnings a couple of decades ago and today, middleware has charted a more multidimensional role. It is increasingly being called upon as an agile and resourceful productivity manager, software vendors and users say.

It was about 2006–2007 that middleware’s role started to evolve from connectivity to more “productivity opportunities,” Maureen Marentette recalls. Middleware vendor Data Innovations (DI), where Marentette is director of North American sales, started in 1989 with the interfacing of instruments to the LIS as its focus. Now, products like DI’s Instrument Manager take the information that instruments provide plus what the LIS provides, and “they combine those two pieces to provide powerful management of results at a completely different level.”

With drivers available from more than 1,000 different instruments in anatomic pathology, molecular, microbiology, immunoassay, and more, DI has vastly multiplied the number of parameters it can manage, Marentette says. “We’ve got over 473 data fields that we can use to create very specific rules that are patient- and physician-specific as well as specimen-specific.”

For example, if an instrument is giving a bilirubin result, depending on the hospital, it might be a result on a newborn, a 10-year-old child, or an adult. “Whereas most LISs only handle reference ranges in years, using a middleware product you’re able to drill down to a reference range based on number of hours old, because that parameter can be significant for a newborn from a diagnostic perspective.”
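As an illustration of that kind of rule, the sketch below selects a bilirubin reference range from the patient’s age in hours and decides whether the result can be autoverified; the cutoffs are made-up placeholders, not Data Innovations rules or clinical reference intervals.

```python
# Illustrative middleware-style rule: age-in-hours reference range for bilirubin,
# then an autoverify/hold decision. All cutoffs below are invented placeholders.
def bilirubin_rule(result_mg_dl: float, age_hours: float) -> str:
    if age_hours < 24:
        upper = 6.0        # placeholder limit, first 24 hours of life
    elif age_hours < 48:
        upper = 10.0       # placeholder limit, 24-48 hours
    elif age_hours < 24 * 365:
        upper = 12.0       # placeholder limit, under one year
    else:
        upper = 1.2        # placeholder adult limit
    return "autoverify to LIS" if result_mg_dl <= upper else "hold for review"

print(bilirubin_rule(result_mg_dl=8.5, age_hours=30))   # -> autoverify to LIS
print(bilirubin_rule(result_mg_dl=8.5, age_hours=20))   # -> hold for review
```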

Some LIS companies still use DI for their interface engine for point-to-point interfaces from each new instrument a hospital purchases into their LIS. But when a laboratory purchases Instrument Manager, Marentette notes, “if you switch from, say, a Beckman to a Sysmex hematology instrument, you can simply repurpose the connection you had for the first instrument and move it over to the new one, then just do the editing needed to recognize the codes that are unique to each instrument. It’s really just a tweaking rather than having to recreate the entire interface. So we’ve evolved from connectivity to being able to help labs with productivity and optimizing their workflow.”

These capabilities become particularly important for laboratories dealing with staffing constraints. Some Instrument Manager clients, for example, have 95 percent autoverification of their core chemistry lab results, she says, so those tests go straight into the LIS once released, and then into the HIS for interpretation by the physician.

Many labs are still using LIS autoverification protocols only about 40 percent of the time but could be doing much more, Marentette believes. “People kind of look at it over the short term. There is change management that goes into moving into a middleware product initially. It’s a cost thing, and there’s a knowledge piece, and it takes time to implement and get people to understand how much better off they would be with middleware.”

Focused specialization is what Sysmex’s middleware product WAM (Work Area Management) offers for hematology labs, says Anne Tate, MT(ASCP), IT/automation group manager at Sysmex America. “We concentrate on managing everything around the lavender top.” As a best-of-breed solution, WAM provides hematology-specific rules honed over the past 10 years and now used by more than 300 hematology labs serving 1,100 hospital sites, Tate says.

Sixty percent of Sysmex WAM clients are large integrated health networks, many of which use Sysmex instruments such as the DI-60 and its slidemaker/stainers and sorters, as well as Bio-Rad and Cellavision instruments. “We have instrument lines with one or two instruments on them, up to large systems with two automation lines. So we support both complex and the high-end markets.”

Multisite capability is one of WAM’s key features. “It’s a server-based system, so whatever results you have on site A, you’ll be able to see them on site B. So you can get delta checking and previous results and be able to correlate data from one site to another, but also apply the same standard rules. Doctors and other professionals are assured predictability in how the results are looked at and managed between sites,” Tate says.

She considers middleware as the “production floor” of the lab. “We manage all the results coming out of the instruments, applying rules and logic to either autovalidate to the LIS, or hold up results to do something further such as rerun, reflex, review, or maybe move to another Work Area Manager.”

More and more customers are demanding middleware when they purchase automation, Tate points out, and middleware is increasingly becoming an essential part of the lab. “We’ve always been important. But the LIS really depends on us to manage these combinations and multiple sites because that’s where our expertise is.”

Tate believes LISs have to concentrate on other issues. “LIS vendors really can’t know everything about vendors’ automation and all the changes and parts that go with it. LISs are concentrating their dollars on interfacing more with the EMR and medical necessity and other modules, but they depend on us for decision logic to process results for small to large automation.”

Keeping track of operator competency is another WAM capability. “With our management reports, you can search by user and track almost everything users are doing on the system, with competency based on whatever your criteria are.”

What Sysmex customers most like about the WAM middleware is “they don’t have to look at every result. It automates the process so they are assured that all the logic they would have done manually is being done consistently without a verification.”

At Dartmouth-Hitchcock Medical Center in Lebanon, NH, which includes a main hospital, several clinics, and a network of regional lab partners, Sysmex’s WAM is what allowed the laboratory to bring on the clinics and increase its volume in 2009, says Dorothy Martin, MT(ASCP), hematology supervisor.


The facility started with the Sysmex WAM 3.0, which communicates to Dartmouth-Hitchcock’s Cerner LIS and its Epic HIS. After upgrading to WAM 5.0 in November 2015, “we’ve increased our autovalidation to 87.5 percent, so that’s another 2.5 percent that we can autovalidate. Not only that, but we decreased our manual differential rate. Before, it was a little over four percent, then we brought it down to two percent. Because those manual diffs take longer, that’s less tech time we’re spending at the scope.”
“Originally we were sending many tests out to a reference lab because we could not handle the volume, but Sysmex allowed us to expand and then bring on the regional partners. This is helping us to drive cost out of the system for our patients.” Her lab also has middleware for urinalysis and is exploring use of middleware in coagulation testing as a 2017 goal.

The 5.0 version also offers manager reports. “Before, in order to know my autoverification rate, I had to use my Cerner system and pull a number of different reports and export that data into Excel spreadsheets. Now, with a click of a button or two, I can mine all of that data out of my WAM system. We can create reports based on rules, based on our turnaround time, based on technologist. So it’s great. It’s quite amazing.”
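
For readers who want a sense of what such report mining involves, the short sketch below computes an autoverification rate and median turnaround time per technologist from a hypothetical CSV export; the column names and file are assumptions, not the actual WAM report format.

```python
# Sketch of the kind of report mining described above: autoverification rate
# and turnaround time (TAT) by technologist from an exported result log.
# Column names and the CSV layout are assumptions for illustration only.

import csv
from collections import defaultdict
from statistics import median

def summarize(log_path: str) -> None:
    per_tech = defaultdict(lambda: {"total": 0, "autoverified": 0, "tat_min": []})
    with open(log_path, newline="") as fh:
        for row in csv.DictReader(fh):
            stats = per_tech[row["technologist"]]
            stats["total"] += 1
            if row["autoverified"].strip().lower() == "yes":
                stats["autoverified"] += 1
            stats["tat_min"].append(float(row["receipt_to_verify_minutes"]))
    for tech, s in sorted(per_tech.items()):
        rate = 100.0 * s["autoverified"] / s["total"]
        print(f"{tech}: {s['total']} results, "
              f"{rate:.1f}% autoverified, median TAT {median(s['tat_min']):.0f} min")

# summarize("wam_export.csv")  # hypothetical export file
```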

Martin works with database coordinator Kari Agan who does the “rules building” and integration work with Cerner. “We actually work with Sysmex to create our rules so they do exactly what we want. We want them customized to be based on critical values in our facility.” For example, “I have different platforms in the main lab than we have in the smaller facilities. I have rules for our hematology-oncology population that are different than rules we would have for a patient who comes in for their primary care visit. WAM allows us that flexibility.”

The WAM has been a key contributor to Dartmouth-Hitchcock’s reputation as an innovator, Martin believes. “We’ve been on this Lean Six Sigma journey for a number of years, and I think that middleware and automation have helped us get to those Lean processes.” The distance among the Lebanon, Nashua, and Manchester laboratories has been a non-issue. “All the labs, even though they are as much as 70 miles apart, are tied into the same middleware on the same server and perform the same way. There’s no issue with networking at long distances and it’s done through the Dartmouth-Hitchcock secure network, so all the patient data is protected.”

With the standardization the middleware has allowed among the technologists, and the growth of the network and technical staff that it has facilitated, “we are really proud of how much Sysmex has helped us improve the processes in our lab,” Martin says.

Kerstin Halverson, point-of-care coordinator for Children’s Hospitals and Clinics of Minnesota, uses Telcor’s QML Data Management and Connectivity Solution to interface five instruments and seven manual device types for the roughly 120,000 POC tests per year, encompassing glucose, hemoglobin, rapid strep, pregnancy, blood gases, and urine dipsticks. “We have Sunquest as our LIS and Cerner as our HIS, and everything point of care flows through Telcor from either a device or via Telcor’s offshoot product called WebMRE for manual result entry.”

She handles most every issue involving the connection between point of care, middleware, and the LIS. “I’m the point person when we have to upgrade servers or add devices. I work pretty specifically with the vendors to make sure everything gets established and connected properly.” Over the years, her lab has gone from having only glucoses and blood gases interfaced to having five device types interfaced, and manual testing is electronically entered now too.


Children’s Hospitals and Clinics, with its 2,200 credentialed operators, was the first of Telcor’s customers to go live with an e-learning interface between Telcor and the hospital’s e-learning software, PeopleSoft. “As a piece of this interface, tracking attendance at the annual competency fair that all staff must attend is now done electronically. This allows us to monitor point-of-care testing competency efficiently in one spot,” Halverson says.

This project required considerable lead time. “It was not something we switched on one day. It was probably a three-and-a-half-year project with Telcor to get all my POC courses in a table and then on a daily basis, using PeopleSoft, have that information be updated with all the additions, deletions, department changes, course passing and failing, and so on.” The whole process is now automatic, Halverson says, which is useful for dealing with so many operators over 11 different device types.
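
The general shape of such a competency interface can be sketched as a daily feed of completed courses plus a lockout check at test time. Everything below (data shapes, course names, the one-year renewal rule) is hypothetical and is not Telcor’s or PeopleSoft’s actual interface.

```python
# Sketch of a daily competency sync and operator lockout check, loosely modeled
# on the workflow described above. The data shapes, course names, and lockout
# policy are hypothetical; Telcor's and PeopleSoft's actual interfaces differ.

from datetime import date, timedelta

# Hypothetical feed from the e-learning system: operator -> completed courses.
ELEARNING_FEED = {
    "op1001": {"POC Glucose": date(2016, 3, 1), "Urine Dipstick": date(2016, 2, 15)},
    "op1002": {"POC Glucose": date(2014, 1, 10)},
}

COMPETENCY_VALID_FOR = timedelta(days=365)  # placeholder annual requirement

def operator_may_test(operator_id: str, device_course: str, today: date) -> bool:
    """Lock out operators whose competency for this device type has lapsed."""
    completed = ELEARNING_FEED.get(operator_id, {})
    done_on = completed.get(device_course)
    return done_on is not None and (today - done_on) <= COMPETENCY_VALID_FOR

today = date(2016, 6, 1)
print(operator_may_test("op1001", "POC Glucose", today))   # True
print(operator_may_test("op1002", "POC Glucose", today))   # False -> locked out
print(operator_may_test("op9999", "POC Glucose", today))   # False, unknown operator
```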

In her previous job, more than 13 years ago, “we did not have any middleware,” she notes. “At that time, we had glucometers connected through fax machines, not on the hospital’s network. But results weren’t electronically transferred to charts yet; the nurses would just see results on the device and chart them. There was still a lot of discussion about whether to bill for point of care back then, so some institutions probably did not worry about interfacing and getting things electronically recorded the way things have evolved now.” These days, “IT is one of my major focuses. I can’t do point of care without it.”

In her work as POC coordinator, competency takes a huge chunk of her time—for example, keeping records of diplomas for anyone who is doing moderately complex testing. She says Telcor is working on a way to store those records centrally in its database. “After competency, I would say I spend the most time making sure we’re meeting all the regulations on a daily basis, and by that I mean monitoring QC, making sure the instrumentation is up and functioning properly, and troubleshooting issues.”

A change she has greatly appreciated was moving the Telcor software from a single PC at her desk to a server. “I had two different campuses to travel between, and as much as I try to clone myself, I can’t. So having it moved to a server and having access to the server if I’m somewhere else has been a huge jump forward. I can do troubleshooting pretty much anywhere I need to go now.”

Halverson likens POC to a spiderweb with middleware at the center. “For me, the Telcor middleware ends up being the center of the web to help pull everything into one place where I can manage it. It passes everything along to the LIS, then eventually to the HIS, but it helps cut down on a lot of potential problems by locking people out who aren’t trained and keeping the interfaces up. It’s a one-stop shop.”

Today, some POC devices have their own data-management systems and others don’t, but Telcor is able to interface with them either way, says UPMC’s James Beck. “I can’t say that Telcor makes those systems obsolete. But from my perspective, the fewer systems I need to get into, the easier it is to do what I need to do. Which is basically oversee what’s going on in the system, make sure quality checks are being done, and make sure data is moving through the system and getting to the end users. The fewer systems there are for that data, the more likely the data is going to end up on the chart so that clinical decisions can be made on it.”

Most devices, he has found, are interfaceable through Telcor, which is installed in about 1,900 U.S. and Canadian hospitals. UPMC uses Abbott’s Precision Xceed Pro for glucose, and has seen unprecedented growth in the number of tests it conducts using Abbott i-Stat in the past nine months. The connectivity issues have been for the most part already resolved—“we have just been adding a lot more tests to the menu.”

The ABL blood gas analyzers, Beck says, are considered by many to be laboratory instruments, not point-of-care devices, because they are tabletops. But “this is another example of Telcor flexibility, because before, many of the instruments were being run by respiratory therapists and they had a hard time internalizing how to use that lab interface. They didn’t get enough repeat experience with it, the way a lab person would, and the standard lab interface was really failing them. They could not seem to get the hang of it as far as the flow of data.”

Siemens’ Clinitek automatically uploads standard urinalysis dipstick and urine hCG testing results to Telcor and in turn to the EMR, Beck says. “So a person who is multitasking does not have to stay there. They can load up a device and walk away and know that the result is going to automatically upload.” Similarly, Accriva’s Hemochron Signature Elite is used for activated clotting time, and the Avoximeter 1000E, another Accriva device, is used for co-oximetry testing in cardiac catheterization labs. Both also connect to Telcor.

Beck approached Telcor about taking over the interface, and the company appointed a dedicated analyst to work it out. “Then our next hurdle was actually to convince the LIS folks to allow us to do it. I said, ‘I can make life so much easier for these respiratory therapists if we can send the information through a middleware system like Telcor as opposed to a standard lab interface, which is full of rules and things to remember. I can cut 20-plus steps and mouse clicks and verifications out of their process.’ So it took a lot of convincing but finally they said, okay, go ahead and try it. And that actually got us a patient safety award for our facility improvement process. It was copied at other hospitals within our health system because they had the same dilemmas and we had a proven model that worked.”

There has been downtime, he says, but not due to the middleware. “Recently, our centralized information support actually disconnected our Telcor server so it stopped transmission unexpectedly. It ended up getting reported by our ED where a nurse was doing a urine dip and knew it should have crossed by a certain amount of time.” So temporarily, no data were moving.

But, he says, “The beauty of the computer age is that everything is basically held in a buffer so there’s no data loss. Everything performed during a downtime does get captured ultimately.” However, with one system processing information from 19 hospitals, catching up can take quite a while. “In fact, one of the things we requested was that they expand memory so the catch-up is quicker when there is some backlog of processing that needs to be done.”
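
The buffering behavior Beck describes is a classic store-and-forward pattern; a generic sketch, not Telcor’s implementation, might look like this:

```python
# Sketch of the store-and-forward pattern described above: results queue up in
# a local buffer while the link is down and are drained when it returns.
# This is a generic illustration, not Telcor's implementation.

from collections import deque

class ResultForwarder:
    def __init__(self, send_fn):
        self._send = send_fn          # callable that delivers one result upstream
        self._buffer = deque()        # holds results captured during downtime

    def submit(self, result: dict, link_up: bool) -> None:
        if link_up:
            self._send(result)
        else:
            self._buffer.append(result)  # nothing is lost, only delayed

    def catch_up(self, batch_size: int = 100) -> int:
        """Drain the buffer once the interface is restored; returns count sent."""
        sent = 0
        while self._buffer and sent < batch_size:
            self._send(self._buffer.popleft())
            sent += 1
        return sent

fwd = ResultForwarder(send_fn=lambda r: print("forwarded:", r["test"], r["value"]))
fwd.submit({"test": "urine dip", "value": "negative"}, link_up=False)  # buffered
fwd.submit({"test": "glucose", "value": 98}, link_up=False)            # buffered
print("caught up:", fwd.catch_up())  # drains both once the link is back
```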

Some operators are actually physicians—typically anesthesiologists—and Beck has found there is better appreciation among physicians generally about middleware. “Definitely five or 10 years ago, they could afford to be more distant from middleware awareness, but now they are aware in making sure the data gets to the chart.”

It’s a good thing, too, because point of care is growing. In fact, “we’re seeing explosions in point of care that are often exceeding the ability of one person to oversee.” For this reason, Beck is pleased that his middleware vendor is able to keep pace with the state of technology and even stay one step ahead. Middleware capability like that has become necessary, he says.

The potential customer base for middleware is wide-ranging, says Data Innovations’ Marentette. DI has diverse customers that include the Department of Defense and Department of Veterans Affairs, large academic health science centers, and reference labs. But the company is finding a market as well in smaller and midrange hospitals, which are increasingly looking at DI’s productivity and quality suite modules and its “management through metrics” data mining.

On the quality assurance side, Marentette points to Instrument Manager’s “moving averages” program which helps monitor how instruments are performing between quality control challenges. “If there is a problem when the moving averages go out, you can first of all stop testing and prevent those results from being released to the LIS. Therefore you have fewer edited reports. That really helps on the quality side. And using our specimen management workspace for autoverification, you can put in instructions on what to do when a particular value comes up.”

The data mining involves pulling data out of the specimen management database to conduct population or research studies. “We’ve now added our laboratory intelligence, which enables real-time metrics monitoring, so you can look at what’s going on in your lab five minutes ago, identify issues, and drill down to what the challenge is.”
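
As a generic illustration of the moving-averages idea (not Data Innovations’ algorithm or validated settings), the sketch below tracks a rolling mean of patient results and raises an alert when it drifts outside limits so testing can be held before results reach the LIS.

```python
# Sketch of a "moving averages" QC monitor: track the rolling mean of patient
# results and raise a flag when it drifts outside limits, so testing can be
# halted before results reach the LIS. Window size and limits are placeholders,
# not Data Innovations' algorithm or validated settings.

from collections import deque

class MovingAverageMonitor:
    def __init__(self, window: int, low: float, high: float):
        self.values = deque(maxlen=window)
        self.low, self.high = low, high

    def add(self, value: float) -> str:
        self.values.append(value)
        if len(self.values) < self.values.maxlen:
            return "accumulating"
        mean = sum(self.values) / len(self.values)
        if not (self.low <= mean <= self.high):
            return f"ALERT: moving average {mean:.2f} outside {self.low}-{self.high}; hold instrument"
        return f"ok (moving average {mean:.2f})"

# Hypothetical sodium results drifting upward after a calibration problem.
monitor = MovingAverageMonitor(window=5, low=138.0, high=142.0)
for result in [139, 140, 141, 140, 141, 144, 145, 146, 147, 148]:
    print(monitor.add(result))
```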

Middleware’s return on investment is twofold, Marentette believes. “It’s not just within the lab itself, but also what it means to the rest of the hospital organization and the ability to enhance the patient experience. For example, quicker turnaround time on tests for the emergency department lets physicians make faster patient diagnostic decisions, bringing better patient experience and contributing to the overall revenue of the organization. Then, within the lab itself, if you are using the autoverification, you can redeploy your staff and take on more work by bringing on new tests or doing some of the centralized testing you maybe sent to another lab before.”

Through its JResultNet software, which DI purchased from Dawning Technologies more than two years ago, the same middleware benefits are available to physician office labs, and Laboratory Production Manager is DI’s parallel product in the European marketplace. DI itself was acquired in 2015 by Roper Technologies, which also owns LIS giant Sunquest Information Systems. But DI is still a standalone company, Marentette emphasizes, noting that a range of LISs (including Epic’s Beaker) as well as IVD manufacturers use Instrument Manager as their connectivity and middleware solution.

Instrument Manager’s usefulness also becomes apparent when hospitals or integrated delivery networks do acquisitions. “We have the ability to interface multiple LISs into the same IM database, so if a hospital acquires another hospital and wants to put things together on the same system, you can do that seamlessly through one of our interfaces. The middleware is able to keep orders and results reporting management straight between systems. So it does make consolidation much simpler and streamlined.”

With the changing configuration of IT in health care, Sysmex’s Anne Tate sees a changing role for middleware ahead. “WAM provides everything the LIS needs to report immediately to the EMR. In some instances, we’re seeing where the WAM can go into clinics or small labs that don’t have an LIS, so eventually we may have to connect the WAM directly to the EMR. We’ve never done it yet, but we’re seeing movement in that respect.” She doesn’t expect it in the hospital environment. “But in standalone clinics, in nontraditional or non-hospital–supported environments, I do see it happening. We’ve already had requests for that.”

Middleware does come with a cost, but customers understand middleware’s value, Tate says. “We don’t get much pushback anymore on the cost of middleware because they see the value and return on investment: They’ll get vastly improved turnaround time, they can reallocate FTEs, and they get standardized results and less error.”

Read Full Post »


BioMEMS The Market aspects of Oligonucleotide-Chips, Products and Applications, Competition, January 21, 2016

Curator: Gérard LOISEAU, ESQ

 

BioMEMS

The Market aspects of Oligonucleotide-Chips, Products, Applications, Competition 

January 21, 2016

2015-2020

The oligonucleotide synthesis market is expected to grow from USD 1,078.1 million in 2015 to USD 1,918.6 million (about USD 1.92 billion) by 2020, at a CAGR of 10.1%.

SOURCE

MARKETSANDMARKETS marketsandmarkets.com/

 

PLAYERS

  • Agilent Technologies Inc.
  • BioAutomation Corp.
  • Biosearch Technologies
  • Gen9 Inc.
  • GenScript Inc.
  • Illumina Inc.
  • Integrated DNA Technologies
  • New England Biolabs Inc.
  • Nitto Denko Avecia Inc.
  • OriGene Technologies Inc.
  • Sigma-Aldrich Corporation
  • Thermo Fisher Scientific Inc.
  • TriLink Biotechnologies

 

Agilent Technologies
 CA NYSE: A


http://www.agilent.com/

  • Agilent was created as a spin off from Hewlett-Packard Company in 1999.
  • Agilent Technologies Inc. is engaged in the life sciences, diagnostics and applied chemical markets. The Company provides application focused solutions that include instruments, software, services and consumables for the entire laboratory workflow. The Company has three business segments:

the life sciences and applied markets business,

the diagnostics and genomics business, and

the Agilent Cross Lab business

  • The Company’s life sciences and applied markets business segment brings together the Company’s analytical laboratory instrumentation and informatics.
  • The Company’s diagnostics and genomics business segment consists of three businesses: the Dako business, the genomics business and the nucleic acid solutions business.
  • The Company’s Agilent Cross Lab business segment combines its analytical laboratory services and consumables business

SOURCE

http://reuters.com/

PRODUCTS AND SERVICES

https://www.agilent.com/en-us/default#collapse-0

  • October 09, 2015 03:21 PM Eastern Daylight Time
  • CARPINTERIA, Calif.–(BUSINESS WIRE)–Dako, an Agilent Technologies company and a worldwide provider of cancer diagnostics, today announced the U.S. Food and Drug Administration has approved a new test that can identify PD-L1 expression levels on the surface of non-small cell lung cancer tumor cells and provide information on the survival benefit with OPDIVO® (nivolumab) for patients with non-squamous NSCLC.

SOURCE

BUSINESS WIRE businesswire.com/

 

BioAutomation Corp.

 TX


 

http://bioautomation.com/

          PRODUCTS AND SERVICES

  • DNA and RNA synthesis reagents for the MerMades

 

Note: The MerMade 192E Oligonucleotide synthesizer is designed to synthesize DNA, RNA & LNA oligonucleotides in a column format

          PARTNERSHIPS

  • HONGENE BIOTECH : BIOAUTOMATION is the exclusive distributor for the Americas
  • EMD MILLIPORE
  • BIOSEARCH TECHNOLOGIES

 

DISTRIBUTORS

  • LINK TECHNOLOGIES : UK
  • AME BIOSCIENCE : UK
  • BOSUNG SCIENCE : KOREA
  • DNA CHEM : CHINA
  • WAKO : JAPAN
  • ACE PROBE : INDIA

SOURCE

bioautomation.com/

 

Biosearch Technologies
 CA


http://biosearchtech.com/

          PRODUCTS

  • qPCR & SNP Genotyping
  • Custom Oligonucleotides
  • – highly sophisticated oligonucleotides
  • – simple PCR primers
  • Oligos in Plates
  • RNA FISH
  • Synthesis Reagents
  • Immunochemicals
  • Primers
  • Probes
  • Large-Scale Synthesis Oligos
  • Intermediate-Scale Synthesis Oligos

          SERVICES

  • GMP & Commercial Services
  • OEM & Kit Manufacturing
  • qPCR Design Collaborations

          DISTRIBUTORS

Argentina | Australia | Austria | Brazil | Canada |Chile | China | Colombia | Czech Republic | Denmark | Ecuador | Finland | Germany |Hong Kong | Israel | Italy | Japan | Korea | Malaysia | Mexico | New Zealand | Norway | Paraguay | Peru| Philippines | Poland | Romania | Singapore | South Africa | Spain | Sweden |Switzerland | Taiwan ROC | Thailand | Turkey | United Kingdom | Uruguay | Vietnam

SOURCE

biosearchtech.com/

 

Gen9 Inc.
 MA 


http://www.gen9bio.com/

          PRODUCTS

Gen9 is building on advances in synthetic biology to power a scalable fabrication capability that will significantly increase the world’s capacity to produce DNA content. The privately held company’s next-generation gene synthesis technology allows for the high-throughput, automated production of DNA constructs at lower cost and higher accuracy than previous methods on the market. Founded by world leaders in synthetic biology, Gen9 aims to ensure the constructive application of synthetic biology in industries ranging from enzyme and chemical production to pharmaceuticals and biofuels.

          SERVICES

  • Synthetic Biology
  • Gene Synthesis Services
  • Variant Libraries
  • Gene Sequence Design Services

         INVESTORS

  • Agilent Technologies : Private Equity
  • CAMBRIDGE, Mass. and SANTA CLARA, Calif. — April 24, 2013 —Gen9 Receives $21 Million Strategic Investment from Agilent Technologies

SOURCE

gen9bio.com/

 

GenScript Inc.
 NJ 


http://www.genscript.com/

  • GenScript is the largest gene synthesis provider in the USA
  • GenScript Corporation, a biology contract research organization, provides biological research and drug discovery services to pharmaceutical companies, biotech firms, and research institutions in the United States, Europe, and Japan. It offers bio-reagent, custom molecular biology, custom peptide, protein production, custom antibody production, drug candidates testing, assay development and screening, lead optimization, antibody drug development, gene synthesis, and assay-ready cell line production services.
  • The company also offers molecular biology, peptide, protein, immunoassay, chemicals, and cell biology products. It offers its products through distributors in Tokyo, Japan; and Seoul, Korea. GenScript Corporation has a strategic partnership with Immunologix, Inc. The company was founded in 2002 and is based in Piscataway, New Jersey. It has subsidiaries in France, Japan, and China.

 

Note: As of October 24, 2011, Immunologix, Inc. was acquired by Intrexon Corporation. Immunologix, Inc. develops and produces antibody-based therapeutics for various biological targets. It produces human monoclonal antibodies against viral, bacterial, and tumor antigens, as well as human auto antigens.

Intrexon Corporation, founded in 1998, is a leader in synthetic biology focused on collaborating with companies in Health, Food, Energy, Environment and Consumer sectors to create biologically based products that improve quality of life and the health of the planet.

 

 

             PRODUCTS AND SERVICES

  • Gene synthesis
  • Antibody services
  • Protein Services
  • Peptide services

 

               INVESTORS


Note: The Balloch Group (‘TBG’) was established in 2001 by Howard Balloch (Canada‘s ambassador to China from 1996 to 2001). TBG has since grown from a market-entry consultancy working with North American clients in China to a leading advisory and merchant banking firm serving both domestic Chinese companies and multinational corporations. TBG was ranked as the number one boutique investment bank in China by ChinaVenture in 2008.

Kleiner Perkins Caufield & Byers

 

Illumina Inc.
 CA


http://illumina.com/

 

Monica Heger : SAN FRANCISCO (GenomeWeb) – Illumina today announced two new next-generation sequencing platforms, a targeted sequencing system called MiniSeq and a semiconductor sequencer that is still under development.

Illumina disclosed the initiatives during a presentation at the JP Morgan Healthcare conference held here today. During the presentation, Illumina CEO Jay Flatley also announced a new genotyping array called Infinium XT; a partnership with Bio-Rad to develop a single-cell sequencing workflow; preliminary estimates of its fourth-quarter 2015 revenues; and an update on existing products. The presentation followed the company’s announcement on Sunday that it has launched a new company called Grail to develop a next-generation sequencing test for early cancer detection from patient blood samples.

The MiniSeq system, which is based on Illumina’s current sequencing technology, will begin shipping early this quarter and has a list price of $49,500. It can perform a variety of targeted DNA and RNA applications, from single-gene to pathway sequencing, and promises “all-in” prices, including library prep and sequencing, of $200 to $300 per sample, Flatley said during the JP Morgan presentation.

SOURCES

https://www.genomeweb.com/sequencing-technology/illumina-unveils-mini-targeted-sequencer-semiconductor-sequencing-project-jp

http://investor.biospace.com/biospace/quote?Symbol=ILMN

 

              PRODUCTS AND SERVICES

  •               Mid to large scale manufacturing assets
  •               Analytical Labs
  •               Pre-clinical
  •               Clinical
  •               Launched products

 

              COMPETITORS

https://finance.yahoo.com/q/co?s=ILMN+Competitors Tue, Feb 2, 2016, 2:16pm EST – US Markets

                         ILMN       PVT1       AFFX      LMNX      Industry
Market Cap:              22.75B     N/A        1.13B     835.66M   134.14M
Employees:               3,700      10,000     1,200     745       45.00
Qtrly Rev Growth (yoy):  0.14       N/A        -0.01     0.07      0.18
Revenue (ttm):           2.14B      3.80B1     357.74M   235.37M   8.47M
Gross Margin (ttm):      0.73       N/A        0.63      0.71      0.58
EBITDA (ttm):            770.84M    N/A        46.64M    52.99M    -12.31M
Operating Margin (ttm):  0.30       N/A        0.08      0.17      -1.62
Net Income (ttm):        510.36M    430.90M1   11.22M    39.29M    N/A
EPS (ttm):               3.42       N/A        0.13      0.93      -0.34
P/E (ttm):               45.43      N/A        104.40    20.91     25.33
PEG (5 yr expected):     2.68       N/A        4.66      0.55      N/A
P/S (ttm):               10.87      N/A        3.13      3.45      13.65

 

Pvt1 = Life Technologies Corporation (privately held)

AFFX = Affymetrix Inc.

LMNX = Luminex Corporation

 

 

Integrated DNA Technologies (IDT)
IOWA + CA

http://www.idtdna.com/

 

Integrated DNA Technologies, Inc. (IDT), the global leader in nucleic acid synthesis, serving all areas of life sciences research and development, offers products for a broad range of genomics applications. IDT’s primary business is the production of custom, synthetic nucleic acids for molecular biology applications, including qPCR, sequencing, synthetic biology, and functional genomics. The company manufactures and ships an average of 44,000 custom nucleic acids per day to more than 82,000 customers worldwide. For more information, visit idtdna.com.

 

               PRODUCTS AND SERVICES

               https://eu.idtdna.com/site

  • DNA & RNA Synthesis
  • Custom DNA Oligos 96- & 384-Well Plates Ultramer Oligos Custom RNA Oligos SameDay Oligos HotPlates ReadyMade Primers Oligo Modifications Freedom
  • Dyes GMP for Molecular Diagnostics Large Scale Oligo Synthesis

 

Note : Skokie, IL – December 1, 2015. Integrated DNA Technologies Inc. (“IDT”), the global leader in custom nucleic acid synthesis, has entered into a definitive agreement to acquire the oligonucleotide synthesis business of AITbiotech Pte. Ltd. in Singapore (“AITbiotech”). With this acquisition, IDT expands its customer base across Southeast Asia making it possible for these additional customers to now have access to its broad range of products for genomic applications. AITbiotech will continue operations in its other core business areas.

 

New England Biolabs Inc.
 MA 


http://www.neb.com/

 

                PRODUCTS AND SERVICES

  •                 Restriction Endonucleases
  •                 PCR, Polymerases & Amplification Technologies
  •                 DNA Modifying Enzymes
  •                 Library Preparation for Next Generation Sequencing
  •                 Nucleic Acid Purification
  •                 Markers & Ladders
  •                 RNA Reagents
  •                 Gene Expression
  •                 Cellular Analysis

SOURCE

neb.com/

 

Nitto Denko Avecia Inc.
 MA


http://avecia.com/

 

With over 20 years of experience in oligonucleotide development and production, and over 1000 sequences manufactured, Avecia has played an integral role in the advancing oligo therapeutic market. Our mission is to continue to build value for our customers, as they progress through drug development into commercialization. And as a member of the Nitto Denko Corporation (nitto.com), Avecia is committed to the future of the oligonucleotide market. We are driven by innovative ideas and flexible solutions, designed to provide our customers with the best in service, quality, and technology.

 

SOURCE

http://avecia.com/

 

Note : 1918 Nitto Electric Industrial Co., Ltd. forms in Ohsaki, Tokyo, to produce electrical insulating materials in Japan.

2011 Acquires Avecia Biotechnology Inc. in the U.S.A.

 

 

OriGene Technologies Inc.
 MD

http://www.origene.com/

 

OriGene Technologies, Inc. develops, manufactures, and sells genome wide research and diagnostic products for pharmaceutical, biotechnology, and academic research applications. The company offers cDNA clones, including TrueORF cDNA, viral ORF, destination vectors, TrueClones (human), TrueClones (mouse), organelle marker plasmids, MicroRNA tools, mutant and variant clones, plasmid purification kits, transfection reagents, and gene synthesis service; and HuSH shRNA, siRNA, miRNA, qPCR reagents, plasmid purification products, transfection reagents, PolyA+ and total RNA products, first-strand cDNA synthesis, and CRISPR/Cas9 genome products. It also provides proteins and lysates, such as purified human proteins, over-expression cell lysates, mass spectrometry standard proteins, and protein purification reagents; UltraMAB IHC antibodies, TrueMAB primary antibodies, anti-tag and fluorescent proteins, ELISA antibodies, luminex antibodies, secondary antibodies, and controls and others; and anatomic pathology products, including IHC antibodies, detection systems, and IHC accessories

The company offers luminex and ELISA antibody pairs, autoantibody profiling arrays, ELISA kits, cell assay kits, assay reagents, custom development, and fluorogenic cell assays; TissueFocus search tools; tissue sections; tissue microarrays, cancer protein lysate arrays, TissueScan cDNA arrays, tissue blocks, and quality control products, as well as tissue RNA, DNA, and protein lysates; and lab essentials. Its research areas include cancer biomarker research, RNAi, pathology IHC, stem cell research, ion channels, and protein kinase products. The company provides gene synthesis and molecular biology services, genome editing, custom cloning, custom shRNA, purified protein, monoclonal antibody development, and assay development. It sells its products through distributors worldwide, as well as online. OriGene Technologies, Inc. was incorporated in 1995 and is based in Rockville, Maryland.

SOURCE

http://BLOOMBERG.com

               PRODUCTS AND SERVICES

  •                cDNA Clones
Human, mouse, rat
Expression validated
  •                RNAi
shRNA, siRNA
microRNA & 3’UTR clones
  •                Gene Synthesis
Codon optimization
Variant libraries
  •                Real-time PCR
Primer pairs, panels
SYBR green reagents
  •                Lab Essentials
DNA/RNA purification kits
Transfection reagents
  •                Anatomic Pathology
UltraMAB antibodies
Specificity validated
  •                Recombinant Proteins
10,000 human proteins
from mammalian system
  •                Antibodies
TrueMAB primary antibodies
Anti-tag antibodies
  •                Assays and Kits
ELISA & Luminex antibodies
Autoantibody Profiling Array
  •                Cancer & Normal Tissues
Pathologist verified
gDNA, RNA, sections, arrays

SOURCE

origene.com/

 

Sigma-Aldrich Corporation 
MO


http://www.sigmaaldrich.com/

St. Louis, MO – November 18, 2015 Merck KGaA, Darmstadt, Germany, Completes Sigma-Aldrich Acquisition

Merck KGaA today announced the completion of its $17 billion acquisition of Sigma-Aldrich, creating one of the leaders in the $130 billion global industry to help solve the toughest problems in life science.

Press Release: 18-Nov-2015

Letter to our Life Science Customers from Dr. Udit Batra

The life science business of Merck KGaA, Darmstadt, Germany brings together the world-class products and services, innovative capabilities and exceptional talent of EMD Millipore and Sigma-Aldrich to create a global leader in the life science industry.

Everything we do starts with our shared purpose – to solve the toughest problems in life science by collaborating with the global scientific community. 

This combination is built on complementary strengths, which will enable us to serve you even better as one organization than either company could alone.

This means providing a broader portfolio with a catalog of more than 300,000 products, including many of the most respected brands in the industry, greater geographic reach, and an unmatched combination of industry-leading capabilities.

                PRODUCTS AND SERVICES

                http://www.sigmaaldrich.com/configurator/servlet/DesignCenter?btnOpen_0.x=1

                http://www.sigmaaldrich.com/content/dam/sigma-aldrich/common/quality-products.jpg

 

Thermo Fisher Scientific Inc.
 MA 
NYSE: TMO


http://thermofisher.com/

Thermo Fisher Scientific Inc. is a provider of analytical instruments, equipment, reagents and consumables, software and services for research, manufacturing, analysis, discovery and diagnostics. The company operates through four segments: Life Sciences Solutions, provides reagents, instruments and consumables used in biological and medical research, discovery and production of new drugs and vaccines as well as diagnosis of disease; Analytical Instruments, provides instruments, consumables, software and services that are used in the laboratory; Specialty Diagnostics, offers diagnostic test kits, reagents, culture media, instruments and associated products, and Laboratory Products and Services, offers self-manufactured and sourced products for the laboratory.

SOURCE

http://REUTERS.com

 

                PRODUCTS AND SERVICES

  •                 Oligos Value – Standard – Plate
  •                 Primers
  •                 Probes
  •                 Nucleotides

 

                BRANDS

  1.                THERMO SCIENTIFIC
  2.                 APPLIED BIOSYSTEMS
  3.                 INVITROGEN
  4.                 FISHER SCIENTIFIC
  5.                 UNITY LAB SERVICES

 

                 PARTNERSHIPS

AFFYMETRIX : NASDAQ : AFFX : affymetrix.com/

WALTHAM, Mass. & SANTA CLARA, Calif.–(BUSINESS WIRE)–Jan. 8, 2016– Thermo Fisher Scientific Inc. (NYSE:TMO), the world leader in serving science, and Affymetrix Inc. (NASDAQ:AFFX), a leading provider of cellular and genetic analysis products, today announced that their boards of directors have unanimously approved Thermo Fisher’s acquisition of Affymetrix for $14.00 per share in cash. The transaction represents a purchase price of approximately $1.3 billion.

SOURCE

http://BUSINESSWIRE.com

 

TriLink Biotechnologies
 CA 


http://www.trilinkbiotech.com/

 

              PRODUCTS

              Oligonucleotides

  •               DNA Oligos
  •               RNA Oligos
  •               Modified Oligos
  •               Specialty Oligos

              Nucleotides

  •               NTPs (Nucleoside Triphosphates)
  •               Biphosphates
  •               Monophosphates

 

              SERVICES

  •              Custom Chemistry
  •              Reagents
  •              Aptamers

 

             PARTNERSHIPS

  • LIFE TECHNOLOGIES,
  • THERMO FISHER SCIENTIFIC since July 2015 thermofisher.com/
  • GENMARK genmarkdx.com/

SOURCE

http://trilinkbiotech.com/

 

Other related articles published in this Open Access Online Scientific Journal include the following:

Gene Editing: The Role of Oligonucleotide Chips

https://pharmaceuticalintelligence.com/2016/01/07/gene-editing-the-role-of-oligonucleotide-chips/

Gene Editing for Exon 51: Why CRISPR Snipping might be better than Exon Skipping for DMD

https://pharmaceuticalintelligence.com/2016/01/23/gene-editing-for-exon-51-why-crispr-snipping-might-be-better-than-exon-skipping-for-dmd/

Read Full Post »


Mindful Discoveries

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Schizophrenia and the Synapse

Genetic evidence suggests that overactive synaptic pruning drives development of schizophrenia.

By Ruth Williams | January 27, 2016 (… more follows)

http://www.the-scientist.com/?articles.view/articleNo/45189/title/Schizophrenia-and-the-Synapse/

http://www.the-scientist.com/images/News/January2016/Schizophrenia.jpg

C4 (green) at synapses of human neurons

Compared to the brains of healthy individuals, those of people with schizophrenia have higher expression of a gene called C4, according to a paper published in Nature today (January 27). The gene encodes an immune protein that moonlights in the brain as an eradicator of unwanted neural connections (synapses). The findings, which suggest increased synaptic pruning is a feature of the disease, are a direct extension of genome-wide association studies (GWASs) that pointed to the major histocompatibility (MHC) locus as a key region associated with schizophrenia risk.

“The MHC [locus] is the first and the strongest genetic association for schizophrenia, but many people have said this finding is not useful,” said psychiatric geneticist Patrick Sullivan of the University of North Carolina School of Medicine who was not involved in the study. “The value of [the present study is] to show that not only is it useful, but it opens up new and extremely interesting ideas about the biology and therapeutics of schizophrenia.”

Schizophrenia has a strong genetic component—it runs in families—yet, because of the complex nature of the condition, no specific genes or mutations have been identified. The pathological processes driving the disease remain a mystery.

Researchers have turned to GWASs in the hope of finding specific genetic variations associated with schizophrenia, but even these have not provided clear candidates.

“There are some instances where genome-wide association will literally hit one base [in the DNA],” explained Sullivan. While a 2014 schizophrenia GWAS highlighted the MHC locus on chromosome 6 as a strong risk area, the association spanned hundreds of possible genes and did not reveal specific nucleotide changes. In short, any hope of pinpointing the MHC association was going to be “really challenging,” said geneticist Steve McCarroll of Harvard who led the new study.

Nevertheless, McCarroll and colleagues zeroed in on the particular region of the MHC with the highest GWAS score—the C4 gene—and set about examining how the area’s structural architecture varied in patients and healthy people.

The C4 gene can exist in multiple copies (from one to four) on each copy of chromosome 6, and has four different forms: C4A-short, C4B-short, C4A-long, and C4B-long. The researchers first examined the “structural alleles” of the C4 locus—that is, the combinations and copy numbers of the different C4 forms—in healthy individuals. They then examined how these structural alleles related to expression of both C4A and C4B messenger RNAs (mRNAs) in postmortem brain tissues. From this the researchers had a clear picture of how the architecture of the C4 locus affected expression of C4A and C4B.

Next, they compared DNA from roughly 30,000 schizophrenia patients with that from 35,000 healthy controls, and a correlation emerged: the alleles most strongly associated with schizophrenia were also those that were associated with the highest C4A expression. Measuring C4A mRNA levels in the brains of 35 schizophrenia patients and 70 controls then revealed that, on average, C4A levels in the patients’ brains were 1.4-fold higher.

C4 is an immune system “complement” factor—a small secreted protein that assists immune cells in the targeting and removal of pathogens. The discovery of C4’s association to schizophrenia, said McCarroll, “would have seemed random and puzzling if it wasn’t for work . . . showing that other complement components regulate brain wiring.” Indeed, complement protein C3 locates at synapses that are going to be eliminated in the brain, explained McCarroll, “and C4 was known to interact with C3 . . . so we thought well, actually, this might make sense.”

McCarroll’s team went on to perform studies in mice that revealed C4 is necessary for C3 to be deposited at synapses. They also showed that the more copies of the C4 gene present in a mouse, the more the animal’s neurons were pruned.

Synaptic pruning is a normal part of development and is thought to reflect the process of learning, where the brain strengthens some connections and eradicates others. Interestingly, the brains of deceased schizophrenia patients exhibit reduced neuron density. The new results, therefore, “make a lot of sense,” said Cardiff University’s Andrew Pocklington who did not participate in the work. They also make sense “in terms of the time period when synaptic pruning is occurring, which sort of overlaps with the period of onset for schizophrenia: around adolescence and early adulthood,” he added.
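
For readers who want to see the logic of the analysis in miniature, here is an illustrative sketch only, not the authors’ pipeline: it assigns each hypothetical structural allele a made-up C4A expression weight, simulates a toy case-control sample in which risk rises with predicted expression, and then compares mean predicted expression between cases and controls.

```python
# Illustrative sketch only (not the authors' actual analysis): relate C4
# structural alleles to predicted C4A expression and test whether higher
# predicted expression associates with case status. Effect sizes, allele
# weights, and the toy data below are invented for demonstration.

import random
import math

# Hypothetical per-allele contributions to brain C4A expression (arbitrary units).
C4A_EXPRESSION_WEIGHTS = {"AL": 1.0, "AS": 0.8, "BL": 0.2, "BS": 0.1}

def predicted_c4a(diplotype):
    """Sum expression weights over all C4 gene copies on both chromosomes."""
    return sum(C4A_EXPRESSION_WEIGHTS[g] for hap in diplotype for g in hap)

# Simulate a toy case-control sample where risk rises with predicted C4A.
random.seed(0)
haplotypes = [("AL", "BL"), ("AL", "BS"), ("AS", "BL"), ("BS",), ("AL", "AL", "BL")]
samples = []
for _ in range(2000):
    diplo = (random.choice(haplotypes), random.choice(haplotypes))
    expr = predicted_c4a(diplo)
    p_case = 1 / (1 + math.exp(-(-2.0 + 0.6 * expr)))   # toy logistic risk model
    samples.append((expr, 1 if random.random() < p_case else 0))

# Compare mean predicted C4A expression in cases vs controls.
cases = [e for e, y in samples if y == 1]
controls = [e for e, y in samples if y == 0]
print(f"mean predicted C4A: cases {sum(cases)/len(cases):.2f}, "
      f"controls {sum(controls)/len(controls):.2f}")
```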

“[C4] has not been on anybody’s radar for having anything to do with schizophrenia, and now it is and there’s a whole bunch of really neat stuff that could happen,” said Sullivan. For one, he suggested, “this molecule could be something that is amenable to therapeutics.”

A. Sekar et al., “Schizophrenia risk from complex variation of complement component 4,” Nature, http://dx.doi.org/10.1038/nature16549, 2016.


 

Schizophrenia: From genetics to physiology at last

Ryan S. Dhindsa & David B. Goldstein

Nature (2016)  http://dx.doi.org/10.1038/nature16874

The identification of a set of genetic variations that are strongly associated with the risk of developing schizophrenia provides insights into the neurobiology of this destructive disease.

http://www.nytimes.com/2016/01/28/health/schizophrenia-cause-synaptic-pruning-brain-psychiatry.html

 

Genetic study provides first-ever insight into biological origin of schizophrenia

Suspect gene may trigger runaway synaptic pruning during adolescence — NIH-funded study

NIH/NATIONAL INSTITUTE OF MENTAL HEALTH

IMAGE

http://media.eurekalert.org/multimedia_prod/pub/web/107629_web.jpg

The site in Chromosome 6 harboring the gene C4 towers far above other risk-associated areas on schizophrenia’s genomic “skyline,” marking its strongest known genetic influence. The new study is the first to explain how specific gene versions work biologically to confer schizophrenia risk.  CREDIT  Psychiatric Genomics Consortium

Versions of a gene linked to schizophrenia may trigger runaway pruning of the teenage brain’s still-maturing communications infrastructure, NIH-funded researchers have discovered. People with the illness show fewer such connections between neurons, or synapses. The gene switched on more in people with the suspect versions, who faced a higher risk of developing the disorder, characterized by hallucinations, delusions and impaired thinking and emotions.

“Normally, pruning gets rid of excess connections we no longer need, streamlining our brain for optimal performance, but too much pruning can impair mental function,” explained Thomas Lehner, Ph.D., director of the Office of Genomics Research Coordination of the NIH’s National Institute of Mental Health (NIMH), which co-funded the study along with the Stanley Center for Psychiatric Research at the Broad Institute and other NIH components. “It could help explain schizophrenia’s delayed age-of-onset of symptoms in late adolescence/early adulthood and shrinkage of the brain’s working tissue. Interventions that put the brakes on this pruning process-gone-awry could prove transformative.”

The gene, called C4 (complement component 4), sits in by far the tallest tower on schizophrenia’s genomic “skyline” (see graph below) of more than 100 chromosomal sites harboring known genetic risk for the disorder. Affecting about 1 percent of the population, schizophrenia is known to be as much as 90 percent heritable, yet discovering how specific genes work to confer risk has proven elusive, until now.

A team of scientists led by Steve McCarroll, Ph.D., of the Broad Institute and Harvard Medical School, Boston, leveraged the statistical power conferred by analyzing the genomes of 65,000 people, 700 postmortem brains, and the precision of mouse genetic engineering to discover the secrets of schizophrenia’s strongest known genetic risk. C4’s role represents the most compelling evidence, to date, linking specific gene versions to a biological process that could cause at least some cases of the illness.

“Since schizophrenia was first described over a century ago, its underlying biology has been a black box, in part because it has been virtually impossible to model the disorder in cells or animals,” said McCarroll. “The human genome is providing a powerful new way in to this disease. Understanding these genetic effects on risk is a way of prying open that block box, peering inside and starting to see actual biological mechanisms.”

McCarroll’s team, including Harvard colleagues Beth Stevens, Ph.D., Michael Carroll, Ph.D., and Aswin Sekar, report on their findings online Jan. 27, 2016 in the journal Nature.

A swath of chromosome 6 encompassing several genes known to be involved in immune function emerged as the strongest signal associated with schizophrenia risk in genome-wide analyses by the NIMH-funded Psychiatric Genomics Consortium over the past several years. Yet conventional genetics failed to turn up any specific gene versions there linked to schizophrenia.

To discover how the immune-related site confers risk for the mental disorder, McCarroll’s team mounted a search for “cryptic genetic influences” that might generate “unconventional signals.” C4, a gene with known roles in immunity, emerged as a prime suspect because it is unusually variable across individuals. It is not unusual for people to have different numbers of copies of the gene and distinct DNA sequences that result in the gene working differently.

The researchers dug deeply into the complexities of how such structural variation relates to the gene’s level of expression and how that, in turn, might relate to schizophrenia. They discovered structurally distinct versions that affect expression of two main forms of the gene in the brain. The more a version resulted in expression of one of the forms, called C4A, the more it was associated with schizophrenia. The more a person had the suspect versions, the more C4 switched on and the higher their risk of developing schizophrenia. Moreover, in the human brain, the C4 protein turned out to be most prevalent in the cellular machinery that supports connections between neurons.

Adapting mouse molecular genetics techniques for studying synaptic pruning and C4’s role in immune function, the researchers also discovered a previously unknown role for C4 in brain development. During critical periods of postnatal brain maturation, C4 tags a synapse for pruning by depositing a sister protein in it called C3. Again, the more C4 got switched on, the more synapses got eliminated.

In humans, such streamlining/pruning occurs as the brain develops to full maturity in the late teens/early adulthood – conspicuously corresponding to the age-of-onset of schizophrenia symptoms.

Future treatments designed to suppress excessive levels of pruning by counteracting runaway C4 in at risk individuals might nip in the bud a process that could otherwise develop into psychotic illness, suggest the researchers. And thanks to the head start gained in understanding the role of such complement proteins in immune function, such agents are already in development, they note.

“This study marks a crucial turning point in the fight against mental illness. It changes the game,” added acting NIMH director Bruce Cuthbert, Ph.D. “Thanks to this genetic breakthrough, we can finally see the potential for clinical tests, early detection, new treatments and even prevention.”

###

VIDEO: Opening Schizophrenia’s Black Box https://youtu.be/s0y4equOTLg

Reference: Sekar A, Bialas AR, de Rivera H, Davis A, Hammond TR, Kamitaki N, Tooley K, Presumey J, Baum M, Van Doren V, Genovese G, Rose SA, Handsaker RE, Schizophrenia Working Group of the Psychiatric Genomics Consortium, Daly MJ, Carroll MC, Stevens B, McCarroll SA. Schizophrenia risk from complex variation of complement component 4. Nature. Jan 27, 2016. DOI: 10.1038/nature16549.

 

Schizophrenia risk from complex variation of complement component 4

Aswin Sekar, Allison R. Bialas, Heather de Rivera, Avery Davis, Timothy R. Hammond, …, Michael C. Carroll, Beth Stevens & Steven A. McCarroll

Nature (2016)   http://dx.doi.org/10.1038/nature16549

Schizophrenia is a heritable brain illness with unknown pathogenic mechanisms. Schizophrenia’s strongest genetic association at a population level involves variation in the major histocompatibility complex (MHC) locus, but the genes and molecular mechanisms accounting for this have been challenging to identify. Here we show that this association arises in part from many structurally diverse alleles of the complement component 4 (C4) genes. We found that these alleles generated widely varying levels of C4A and C4B expression in the brain, with each common C4 allele associating with schizophrenia in proportion to its tendency to generate greater expression of C4A. Human C4 protein localized to neuronal synapses, dendrites, axons, and cell bodies. In mice, C4 mediated synapse elimination during postnatal development. These results implicate excessive complement activity in the development of schizophrenia and may help explain the reduced numbers of synapses in the brains of individuals with schizophrenia.

Figure 1: Structural variation of the complement component 4 (C4) gene.

http://www.nature.com/nature/journal/vaop/ncurrent/carousel/nature16549-f1.jpg

a, Location of the C4 genes within the major histocompatibility complex (MHC) locus on human chromosome 6. b, Human C4 exists as two paralogous genes (isotypes), C4A and C4B; the encoded proteins are distinguished at a key site

http://www.nature.com/nature/journal/vaop/ncurrent/carousel/nature16549-f3.jpg

http://www.nature.com/nature/journal/vaop/ncurrent/carousel/nature16549-sf8.jpg

Gene Study Points Toward Therapies for Common Brain Disorders

University of Edinburgh    http://www.dddmag.com/news/2016/01/gene-study-points-toward-therapies-common-brain-disorders

Scientists have pinpointed the cells that are likely to trigger common brain disorders, including Alzheimer’s disease, Multiple Sclerosis and intellectual disabilities.

It is the first time researchers have been able to identify the particular cell types that malfunction in a wide range of brain diseases.

Scientists say the findings offer a roadmap for the development of new therapies to target the conditions.

The researchers from the University of Edinburgh’s Centre for Clinical Brain Sciences used advanced gene analysis techniques to investigate which genes were switched on in specific types of brain cells.

They then compared this information with genes that are known to be linked to each of the most common brain conditions — Alzheimer’s disease, anxiety disorders, autism, intellectual disability, multiple sclerosis, schizophrenia and epilepsy.

Their findings reveal that for some conditions, the support cells rather than the neurons that transmit messages in the brain are most likely to be the first affected.

Alzheimer’s disease, for example, is characterised by damage to the neurons. Previous efforts to treat the condition have focused on trying to repair this damage.

The study found that a different cell type — called microglial cells — are responsible for triggering Alzheimer’s and that damage to the neurons is a secondary symptom of disease progression.

Researchers say that developing medicines that target microglial cells could offer hope for treating the illness.

The approach could also be used to find new treatment targets for other diseases that have a genetic basis, the researchers say.

Dr Nathan Skene, who carried out the study with Professor Seth Grant, said: “The brain is the most complex organ made up from a tangle of many cell types and sorting out which of these cells go wrong in disease is of critical importance to developing new medicines.”

Professor Seth Grant said: “We are in the midst of scientific revolution where advanced molecular methods are disentangling the Gordian Knot of the brain and completely unexpected new pathways to solving diseases are emerging. There is a pressing need to exploit the remarkable insights from the study.”

 

Quantitative multimodal multiparametric imaging in Alzheimer’s disease

Qian Zhao, Xueqi Chen, Yun Zhou      Brain Informatics  http://link.springer.com/article/10.1007/s40708-015-0028-9

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder that causes changes in memory, thinking, and other brain functions, and more and more people are suffering from the disease. Early neuroimaging techniques for AD need to be developed. This review provides a preliminary summary of the various neuroimaging techniques that have been explored for in vivo imaging of AD. Recent advances in magnetic resonance (MR) techniques, such as functional MR imaging (fMRI) and diffusion MRI, provide opportunities to display not only anatomy and atrophy of the medial temporal lobe, but also microstructural alterations or perfusion disturbance within AD lesions. Positron emission tomography (PET) imaging has become the subject of intense research for the diagnosis of AD and the facilitation of drug development, in both animal models and human trials, owing to its non-invasive and translational character. Fluorodeoxyglucose (FDG) PET and amyloid PET are applied in clinics and research departments. Amyloid beta (Aβ) imaging using PET has been recognized as one of the most important methods for the early diagnosis of AD, and numerous candidate compounds have been tested for Aβ imaging. Besides in vivo imaging methods, many ex vivo modalities are being used in AD research; multiphoton laser scanning microscopy, neuroimaging of metals, and several metal bioimaging methods are also mentioned here. Multimodal and multiparametric neuroimaging techniques should improve our understanding of brain function and open new insights into the pathophysiology of AD. We expect exciting results to emerge from new neuroimaging applications that will provide scientific and medical benefits.

Keywords: Alzheimer’s disease, Neuroimaging, PET, MRI, Amyloid beta, Multimodal

Alzheimer’s disease (AD) is a progressive neurodegenerative disorder that gradually destroys brain cells, causing changes in memory, thinking, and other brain functions [1]. AD is considered to have a prolonged preclinical stage in which neuropathological changes precede the clinical symptoms [2]. An estimated 35 million people worldwide are living with this disease. If effective treatments are not discovered in a timely fashion, the number of AD cases is anticipated to rise to 113 million by 2050 [3].

Amyloid beta (Aβ) and tau are two of the major biomarkers of AD, and they play important and distinct roles in the progression of AD pathophysiology. Jack et al. established hypothetical models of the major biomarkers of AD. By revising and updating the models, they found that the two major proteinopathies underlying AD biomarker changes, Aβ and tau, may be initiated independently in late-onset AD, and they hypothesize that an incident Aβ pathophysiology can accelerate an antecedent limbic and brainstem tauopathy [4]. MRI was used in that work, revealing that the level of Aβ load was associated with a shorter time to progression of AD [5]. This creates an urgent need to develop early neuroimaging techniques for AD neuropathology that can detect and predict the disease before the onset of dementia and monitor therapeutic efficacy in halting or slowing progression in the earlier stages of the disease.

There have been various reports on the imaging assessment of AD. Some measurements reflect the pathology of AD directly, including positron emission tomography (PET) amyloid imaging and cerebrospinal fluid (CSF) beta-amyloid 42 (Aβ42), while others reflect neuronal injury associated with AD indirectly, including CSF tau (total and phosphorylated tau), fluorodeoxy-d-glucose (FDG)-PET, and MRI. The goal of the AD Neuroimaging Initiative (ADNI) has been to establish the optimal panel of clinical assessments, MRI and PET imaging measures, and other biomarkers from blood and CSF to inform clinical trial design for AD therapeutic development. At the same time, it has been highly productive in generating a wealth of data for elucidating disease mechanisms occurring during the early stages of preclinical and prodromal AD [6].

A single neuroimaging modality often provides only limited information about AD. As a result, multimodal neuroimaging is widely used in neuroscience research, as it overcomes the limitations of individual modalities. Multimodal multiparametric imaging refers to the combination of different imaging techniques, such as PET and MRI, acquired simultaneously or separately (e.g., PET/CT and PET/MRI), and enables the visualization and quantitative analysis of alterations in brain structure and function [7]. In this review article, we summarize and discuss the main applications, findings, and perspectives, as well as the advantages and challenges, of different neuroimaging modalities in AD, especially MRI and PET imaging.

2 Magnetic resonance imaging

MRI demonstrates specific volume loss and cortical atrophy patterns with disease progression in AD patients [8–10]. Several MRI techniques and analysis methods are used in clinical and scientific research on AD. Recent advances in MR techniques, such as functional MRI (fMRI) and diffusion MRI, depict not only the anatomy and atrophy of the medial temporal lobe (MTL), but also microstructural alterations and perfusion disturbances within this region.

2.1 Functional MRI

Because of cognitive reserve (CR), the severity of brain damage in AD patients does not always parallel the corresponding clinical symptoms [11, 12]. Recently, resting-state fMRI (RS-fMRI) has become popular for its ability to map brain functional connectivity non-invasively [13]. Using RS-fMRI, Bozzali et al. reported that CR modulates the effect of AD pathology on default mode network functional connectivity, which may account for the variable clinical symptoms of AD [14]. Moreover, AD patients with higher educational attainment were able to recruit compensatory neural mechanisms, which can be measured using RS-fMRI. Arterial spin-labeled (ASL) MRI is another functional brain imaging modality; it measures cerebral blood flow (CBF) by using magnetically labeled arterial blood water flowing through the carotid and vertebral arteries as an endogenous contrast medium. Several studies have characterized CBF changes in AD patients using ASL-MRI [15–17].

At some point, sufficient brain damage accumulates to result in cognitive symptoms and impairment. Mild cognitive impairment (MCI) is a condition in which subjects are usually only mildly impaired in memory, with relative preservation of other cognitive domains and functional activities, and do not meet the criteria for dementia [18]; it is also described as the prodromal state of AD [19]. MCI patients are at higher risk of developing AD, and up to 15 % convert to AD per year [18]. Binnewijzend et al. reported that pseudocontinuous ASL could distinguish both MCI and AD from healthy controls and could be used in the early diagnosis of AD [20]. In a follow-up study, they used quantitative whole-brain pseudocontinuous ASL to compare regional CBF (rCBF) distribution patterns in different types of dementia, and concluded that ASL-MRI could be a non-invasive and easily accessible alternative to FDG-PET imaging for assessing CBF in AD patients [21].
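
For readers unfamiliar with how the ASL signal becomes a perfusion number, the sketch below applies a simplified single-compartment quantification model of the kind commonly used for pseudocontinuous ASL. It is a minimal illustration only: the parameter values (labeling duration, post-labeling delay, blood T1, labeling efficiency, blood–brain partition coefficient) and the synthetic image arrays are assumptions, not the acquisition settings or processing pipelines of the studies cited above.

```python
import numpy as np

def pcasl_cbf(control, label, m0, pld=1.8, tau=1.8,
              t1_blood=1.65, alpha=0.85, lam=0.9):
    """Voxel-wise CBF in ml/100g/min from pCASL control/label images,
    using a simplified single-compartment model (illustrative only)."""
    dm = control - label                              # perfusion-weighted difference
    denom = 2.0 * alpha * t1_blood * m0 * (1.0 - np.exp(-tau / t1_blood))
    return 6000.0 * lam * dm * np.exp(pld / t1_blood) / denom

# Synthetic example: hypothetical 64x64 slice with ~0.8 % labeling effect
rng = np.random.default_rng(0)
m0 = np.full((64, 64), 4000.0)                        # proton-density reference
control = m0 * (1.0 + 0.01 * rng.standard_normal((64, 64)))
label = control - 0.008 * m0                          # assumed difference signal
cbf_map = pcasl_cbf(control, label, m0)
print(f"mean CBF estimate: {cbf_map.mean():.1f} ml/100g/min")
```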

2.2 Structural MRI

Structural MRI (sMRI) is already a reliable imaging method in the clinical diagnosis of AD, characterized by gray matter reduction and ventricular enlargement on standard T1-weighted sequences [9]. Degeneration of the locus coeruleus (LC) and substantia nigra (SN) is seen in AD. Using a new quantitative calculation method, Chen et al. presented a quantitative neuromelanin MRI approach for simultaneous measurement of the LC and SN in the brainstem of living human subjects [22]. Their approach demonstrated advantages in image acquisition, pre-processing, and quantitative analysis. Numerous transgenic animal models of amyloidosis are available, which recapitulate many neuropathological features of AD progression arising from the deposition of β-amyloid [23]. Braakman et al. demonstrated the dynamics of amyloid plaque formation and development in a serial MRI study in a transgenic mouse model [24]. Increased iron accumulation in gray matter is frequently observed in AD. Because of the paramagnetic nature of iron, MRI shows considerable potential for investigating iron levels in AD [25]. Quantitative MRI has shown high sensitivity and specificity in mapping cerebral iron deposition, aiding research on AD diagnosis [26].

Imaging patterns are closely associated with pathologic changes, such as specific protein markers. Spencer et al. demonstrated the relationship between quantitative T1 and T2 relaxation time changes and three immunohistochemical markers: β-amyloid, neuron-specific nuclear protein (a marker of neuronal cell load), and myelin basic protein (a marker of myelin load), in AD transgenic mice [27].

High-field MRI has been successfully applied to imaging plaques in transgenic mice for over a decade without contrast agents [24, 28–30]. Sillerud et al. devised a method using blood–brain barrier penetrating, amyloid-targeted, superparamagnetic iron oxide nanoparticles (SPIONs) for better imaging of amyloid plaques [31]. They then successfully used SPION-MRI to assess drug efficacy on the 3D distribution of Aβ plaques in a transgenic AD mouse model [32].

2.3 Diffusion MRI

Diffusion-weighted imaging (DWI) is a sensitive tool that allows quantification of physiologic alterations in water diffusion, which result from microscopic structural changes.

Diffusion tensor imaging (DTI) is a well-established and commonly employed diffusion MRI technique in clinical and research neuroimaging studies, based on a Gaussian model of diffusion processes [33]. In general, AD is associated with widespread reductions in fractional anisotropy (FA) and increases in mean diffusivity (MD) relative to healthy controls in several regions, most prominently in the frontal and temporal lobes and along the cingulum, corpus callosum, uncinate fasciculus, superior longitudinal fasciculus, and MTL-associated tracts [34–37]. Acosta-Cabronero et al. reported increased axial diffusivity and MD in the splenium as the earliest abnormalities in AD [38]. FA and radial diffusivity (DR) differences in the corpus callosum, cingulum, and fornix were found to separate individuals with MCI who converted to AD from non-converters [39]. DTI was also found to be a better predictor of AD-specific MTL atrophy than CSF biomarkers [40]. These findings suggest the potential clinical utility of DTI measures as early biomarkers of AD and its progression. However, an increase in MD and DR and a decrease in FA with advancing age in selective brain regions have been previously reported [41, 42]. Diffusion MRI can also be used to classify the various stages of AD; a multimodal classification method combining fMRI and DTI separated MCI patients from healthy controls better than either approach alone [43].
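
Since FA, MD, axial, and radial diffusivity recur throughout this section, a minimal sketch of how these scalars are derived from the eigenvalues of a fitted diffusion tensor may help; the eigenvalues used here are hypothetical, chosen only to mimic the reported pattern of reduced FA and increased MD.

```python
import numpy as np

def dti_scalars(eigenvalues):
    """MD, FA, axial (AxD) and radial (RD) diffusivity from the three
    eigenvalues (l1 >= l2 >= l3) of a diffusion tensor."""
    l1, l2, l3 = sorted(eigenvalues, reverse=True)
    lam = np.array([l1, l2, l3], dtype=float)
    md = lam.mean()                                    # mean diffusivity
    fa = np.sqrt(1.5) * np.sqrt(((lam - md) ** 2).sum()) / np.sqrt((lam ** 2).sum())
    return md, fa, l1, (l2 + l3) / 2.0                 # MD, FA, AxD, RD

# Hypothetical eigenvalues (mm^2/s): a healthy-looking white-matter voxel
# versus a degenerated tract with reduced anisotropy and raised diffusivity
for name, eig in [("control-like", (1.7e-3, 0.3e-3, 0.3e-3)),
                  ("AD-like", (1.5e-3, 0.9e-3, 0.9e-3))]:
    md, fa, axd, rd = dti_scalars(eig)
    print(f"{name:12s} MD={md:.2e}  FA={fa:.2f}  AxD={axd:.2e}  RD={rd:.2e}")
```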

In recent years, tau has emerged as a potential target for therapeutic intervention. Tau plays a critical role in the neurodegenerative process, forming neurofibrillary tangles, a major hallmark of AD that correlates with clinical disease progression. Wells et al. applied multiparametric MRI, comprising high-resolution structural MRI (sMRI), a novel chemical exchange saturation transfer (CEST) MRI, DTI, ASL, and glucose CEST, to measure changes in tau pathology in a transgenic AD mouse model [44].

Besides DWI, perfusion-weighted imaging (PWI) is another advanced MR technique, which measures cerebral hemodynamics at the capillary level. Zimny et al. evaluated the MTL using both DWI and PWI in AD and MCI patients [45].

3 Positron emission tomography

PET is an imaging technique applied in research on brain function and neurochemistry in small animals, medium-sized animals, and human subjects [46–48]. As a brain imaging technique, PET has become the subject of intense research for the diagnosis of AD and the facilitation of drug development, in both animal models and human trials, owing to its non-invasive and translational characteristics. PET with various radiotracers is considered a standard non-invasive quantitative imaging technique to measure CBF, glucose metabolism, and β-amyloid and tau deposition.

3.1 FDG-PET

To date, 18F-FDG is one of the best and most widely used PET neuroimaging tracers, employed for research and clinical assessment of AD [49]. Reduced FDG metabolism is typically seen in the precuneus, posterior cingulate, and temporal and parietal cortices, progressing to whole-brain reductions as the disease advances [50, 51]. FDG-PET reflects cerebral glucose metabolism and neuronal injury, providing indirect evidence on cognitive function and progression that amyloid PET imaging cannot provide.
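
One common way to express such regional hypometabolism quantitatively is a z-score of each region's intensity-normalized uptake relative to a control group. The sketch below is purely illustrative: the regional values, control statistics, and the z < -2 flag are hypothetical assumptions, not values from the cited studies.

```python
# Hypothetical intensity-normalized regional FDG uptake for one patient
patient = {"precuneus": 0.82, "posterior cingulate": 0.80,
           "parietal": 0.88, "temporal": 0.90, "frontal": 1.00}

# Hypothetical control-group statistics (mean, standard deviation) per region
controls = {"precuneus": (1.00, 0.06), "posterior cingulate": (1.00, 0.06),
            "parietal": (1.00, 0.07), "temporal": (1.00, 0.07),
            "frontal": (1.00, 0.08)}

# Region-wise z-scores; strongly negative values indicate hypometabolism
for region, value in patient.items():
    mu, sd = controls[region]
    z = (value - mu) / sd
    flag = "hypometabolic" if z < -2.0 else "within normal range"
    print(f"{region:20s} z = {z:+.2f}  ({flag})")
```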

Schraml et al. [52] identified a significant association between the hypometabolic convergence index and phenotypes using ADNI data. Some researchers have also used 18F-FDG-PET to analyze genetic information together with multiple biomarkers to classify AD status and to predict cognitive decline or MCI-to-AD conversion [53–55]. Trzepacz et al. [56] reported a multimodal AD neuroimaging study using MRI, 11C-PiB PET, and 18F-FDG-PET imaging, together with APOE genotype, to predict MCI conversion to AD. Zhang et al. [57] compared the genetic modality of single-nucleotide polymorphisms (SNPs) with sMRI, 18F-FDG-PET, and CSF biomarkers for differentiating healthy controls, MCI, and AD, and found that FDG-PET was the most accurate single modality.

3.2 Amyloid beta PET

Aβ, the primary constituent of senile plaques, and tau tangles are hypothesized to play a primary role in the pathogenesis of AD, but the fundamental mechanisms remain difficult to identify [58–60]. Aβ plaque in the brain is one of the pathological hallmarks of AD [61, 62]. Accumulation of Aβ peptide in the cerebral cortex is considered one cause of dementia in AD [63]. Numerous studies have used in vivo PET imaging to assess cortical β-amyloid burden [64–66].

Aβ imaging using PET has been recognized as one of the most important methods for the early diagnosis of AD [67]. Numerous candidate compounds have been tested for Aβ imaging, such as 11C-PiB [68], 18F-FDDNP [69], 11C-SB-13 [70], 18F-BAY94-9172 [71], 18F-AV-45 [72], 18F-flutemetamol [73, 74], 11C-AZD2184 [75], 18F-ADZ4694 [76], 11C-BF227, and 18F-FACT [77].

Several amyloid PET studies have examined genotypes, phenotypes, or gene–gene interactions. Ramanan et al. [78] reported, for the first time, GWAS results with 18F-AV-45 reflecting cerebral amyloid deposition in AD. Swaminathan et al. [79] revealed an association between plasma Aβ from peripheral blood and cortical amyloid deposition on 11C-PiB. Hohman et al. [80] reported relationships between SNPs involved in amyloid and tau pathophysiology and 18F-AV-45 PET.

Among the PET tracers, 11C-PiB, which has high affinity for fibrillar Aβ, is a reliable biomarker of underlying AD pathology [68, 81], and its cortical uptake parallels AD pathology well [82, 83]. 18F-Florbetapir was the first amyloid tracer approved for clinical use by the Food and Drug Administration (FDA, April 2012) and the European Medicines Agency (January 2013); 18F-GE-067 (flutemetamol) and 18F-BAY94-9172 (florbetaben) have also been approved by the US FDA in the last 2 years [84, 85].

18F-Florbetapir (also known as 18F-AV-45) exhibits high-affinity, specific binding to amyloid plaques and labels Aβ plaques in brain sections from patients with pathologically confirmed AD [72].

Several research groups have reported that 18F-AV-45 PET imaging provides reliable qualitative and quantitative assessments in AD patients, with Aβ positivity increasing with diagnostic category (healthy control < MCI < AD) [82, 86, 87]. Johnson et al. used 18F-AV-45 PET imaging to evaluate amyloid deposition in both MCI and AD patients qualitatively and quantitatively, and found that amyloid burden increased with diagnostic category (MCI < AD), age, and APOE ε4 carrier status [88]. Payoux et al. reported that equivocal 18F-AV-45 amyloid PET scans were associated with a specific pattern of clinical signs in a large population of non-demented adults older than 70 years [89].
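
A widely used quantitative readout in studies like these is the standardized uptake value ratio (SUVR): mean tracer uptake in target cortical regions divided by uptake in a reference region such as the whole cerebellum. The sketch below is a minimal illustration with hypothetical ROI values; the positivity threshold shown is an assumed cut-off, since actual thresholds depend on the tracer, reference region, and processing pipeline.

```python
import numpy as np

def cortical_suvr(target_roi_means, reference_mean):
    """Composite SUVR: mean uptake over target cortical ROIs divided by
    uptake in a reference region (e.g., whole cerebellum)."""
    return float(np.mean(list(target_roi_means.values())) / reference_mean)

# Hypothetical regional uptake values (arbitrary units) from an amyloid scan
target_rois = {"frontal": 1.45, "precuneus": 1.60, "parietal": 1.50,
               "temporal": 1.40, "anterior cingulate": 1.55}
cerebellum = 1.05

suvr = cortical_suvr(target_rois, cerebellum)
threshold = 1.11    # assumed cut-off; real thresholds are tracer/pipeline specific
status = "Abeta-positive" if suvr > threshold else "Abeta-negative"
print(f"composite SUVR = {suvr:.2f} -> {status}")
```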

Increasingly, researchers combine and compare multiple PET tracers targeting amyloid plaques. Bruck et al. compared the prognostic ability of 11C-PiB PET, 18F-FDG-PET, and quantitative hippocampal volumes measured with MR imaging in predicting MCI-to-AD conversion, and found that FDG-PET and 11C-PiB PET were the better predictors [90]. Hatashita et al. used 11C-PiB and FDG-PET imaging to identify MCI due to AD; 11C-PiB showed a higher sensitivity of 96.6 %, and FDG-PET added diagnostic value in predicting AD over a short period [91].

In addition, new Aβ imaging agents have been radiosynthesized. Yousefi et al. radiosynthesized a new Aβ imaging agent, 18F-FIBT, and compared three different Aβ-targeted radiopharmaceuticals for PET imaging: 18F-FIBT, 18F-florbetaben, and 11C-PiB [92]. 11C-AZD2184 is another new PET tracer developed for imaging amyloid senile plaques; its kinetic behavior is suitable for quantitative analysis, and it can be used in clinical examination without an input function [75, 93, 94].

4 Multimodality imaging: PET/MRI

Several diagnostic techniques, including MRI and PET, are employed for the diagnosis and monitoring of AD [95]. Multimodal imaging can provide more information on the formation and key molecular events of AD than any single method. It drives progress in neuroimaging research, owing to recognition of the clinical benefits of multimodal data [96] and improved access to hybrid devices such as PET/MRI [97].

Maier et al. evaluated the dynamics of 11C-PiB PET, 15O-H2O-PET, and ASL-MRI in transgenic AD mice and concluded that the AD-related decline of rCBF was caused by cerebral Aβ angiopathy [98]. Edison et al. systematically compared 11C-PiB PET and MRI in AD patients, MCI patients, and controls, concluding that 11C-PiB PET was adequate for clinical diagnostic purposes, while MRI remained more appropriate for clinical research [99]. Zhou et al. investigated multimodal PET/MRI in elderly patients with MCI or AD and in healthy controls, and confirmed the value of combining amyloid PET and MRI in the early diagnosis of AD [100]. Kim et al. reported that Aβ-weighted cortical thickness, which incorporates data from both MRI and amyloid PET imaging, is a consistent and objective imaging biomarker in AD [101].

5 Other imaging modalities

Multiphoton non-linear optical microscope imaging systems using ultrafast lasers have powerful advantages such as label-free detection, deep penetration of thick samples, high sensitivity, subcellular spatial resolution, 3D optical sectioning, chemical specificity, and minimal sample destruction [102, 103]. Coherent anti-Stokes Raman scattering (CARS), two-photon excited fluorescence (TPEF), and second-harmonic generation (SHG) microscopy are the most widely used of these biomedical imaging techniques [104–106].

 

Quantitative electroencephalographic and neuropsychological investigation of an alternative measure of frontal lobe executive functions: the Figure Trail Making Test

Paul S. Foster, Valeria Drago, Brad J. Ferguson, Patti Kelly Harrison, David W. Harrison

Brain Informatics    http://dx.doi.org/10.1007/s40708-015-0025-z    http://link.springer.com/article/10.1007/s40708-015-0025-z/fulltext.html

The most frequently used measures of executive functioning are sensitive either to left frontal lobe functioning or to bilateral frontal functioning. Relatively little is known about right frontal lobe contributions to executive functioning, given the paucity of measures sensitive to right frontal functioning. The present investigation reports the development and initial validation of a new measure designed to be sensitive to right frontal lobe functioning, the Figure Trail Making Test (FTMT). The FTMT, the classic Trail Making Test (TMT), and the Ruff Figural Fluency Test (RFFT) were administered to 42 right-handed men. The results indicated a significant relationship between the FTMT and both the TMT and the RFFT. Performance on the FTMT was also related to high beta EEG over the right frontal lobe. Thus, the FTMT appears to be an equivalent measure of executive functioning that may be sensitive to right frontal lobe functioning. Applications for use in frontotemporal dementia, Alzheimer’s disease, and other patient populations are discussed.

Keywords – Frontal lobes, Executive functioning, Trail making test, Sequencing, Behavioral speed, Designs, Nonverbal, Neuropsychological assessment, Regulatory control, Effortful control

A recent survey indicated that the vast majority of neuropsychologists frequently assess executive functioning as part of their neuropsychological evaluations [1]. Surveys of neuropsychologists have indicated that the Trail Making Test (TMT), Controlled Oral Word Association Test (COWAT), Wisconsin Card Sorting Test (WCST), and the Stroop Color-Word Test (SCWT) are among the most commonly used instruments [1, 2]. Further, the Rabin et al. [1] survey indicated that these same tests are among the most frequently used by neuropsychologists when specifically assessing executive or frontal lobe functioning. The frequent use of the TMT, WCST, and SCWT, as well as the assumption that they are measures of executive functioning, led Demakis (2003–2004) to conduct a series of meta-analyses to determine the sensitivity of these tests in detecting frontal lobe dysfunction, particularly lateralized frontal lobe dysfunction. The findings indicated that the SCWT and Part A of the TMT [3], as well as the WCST [4], were all sensitive to frontal lobe dysfunction. However, only the SCWT differentiated between left and right frontal lobe dysfunction, with the worst performance among those with left frontal lobe dysfunction [3].

The finding of the Demakis [4] meta-analysis, that the WCST was not sensitive to lateralized frontal lobe dysfunction, is not surprising given the equivocal findings that have been reported. Whereas performance on the WCST is sensitive to frontal lobe dysfunction [5, 6], demonstration of lateralized frontal dysfunction has been quite problematic. Unilateral left or right dorsolateral frontal dysfunction has been associated with impaired performance on the WCST [6]. Fallgatter and Strik [7] found bilateral frontal lobe activation during performance of the WCST. However, other imaging studies have found right lateralized frontal lobe activation [8] and left lateralized frontal activation [9] in response to performance on the WCST. Further, left frontal lobe alpha power is negatively correlated with performance on the WCST [10]. Finally, patients with left frontal lobe tumors exhibit more impaired performance on the WCST than those with right frontal tumors [11].

Unlike the data for the WCST, more consistent findings have been reported regarding lateralized frontal lobe functioning for the other commonly used measures of executive functioning. For instance, as with the Demakis [3] study, many investigations have found the SCWT to be sensitive to left frontal lobe functioning, although the precise localization within the left frontal lobe has varied. Impaired performance on the SCWT results from left frontal lesions [12], and specifically from lesions localized to the left dorsolateral frontal lobe [13, 14], though bilateral frontal lesions have also yielded impaired performance [13, 14]. Further, studies using neuroimaging to investigate the neural basis of performance on the SCWT have indicated involvement of the left anterior cingulate cortex [15], left lateral prefrontal cortex [16], left inferior precentral sulcus [17], and the left dorsolateral frontal lobe [18].

Wide agreement exists among investigations of the frontal lateralization of verbal or lexical fluency to confrontation. Specifically, patients with left frontal lobe lesions are known to exhibit impaired performance on lexical fluency to confrontation tasks, relative to either patients with right frontal lesions [12, 19, 20] or controls [21]. A recent meta-analysis also indicated that the largest deficits in performance on measures of lexical fluency are associated with left frontal lobe lesions [22]. Troster et al. [23] found that, relative to patients with right pallidotomy, patients with left pallidotomy exhibited more impaired lexical fluency. Several neuroimaging investigations have further supported the role of the left frontal lobe in lexical fluency tasks [15, 24–27]. Performance on lexical fluency tasks also varies as a function of lateral frontal lobe asymmetry, as assessed by electroencephalography [28].

The Trail Making Test is certainly among the most widely used tests [1] and perhaps the most widely researched. Various norms exist for the TMT (see [29]), with Tombaugh [30] providing the most recent comprehensive set of normative data. Different methods of analyzing and interpreting the data have also been proposed and used, including error analysis [13, 14, 31–33], subtraction scores [13, 14, 34], and ratio scores [13, 14, 35].

Several different language versions of the test have been developed and reported, including Arabic [36], Chinese [37, 38], Greek [39], and Hebrew [40]. Numerous alternative versions of the TMT have been developed to address perceived shortcomings of the original TMT. For instance, the Symbol Trail Making Test [41] was developed to reduce the cultural confounds associated with the use of the Arabic numeral system and English alphabet in the original TMT. The Color Trails Test (CTT; [42]) was also developed to control for cultural confounds, although mixed results have been reported regarding whether the CTT is indeed analogous to the TMT [43–45]. A version of the TMT for preschool children, the TRAILS-P, has also been reported [46].

Additionally, the Comprehensive Trail Making Test [47] was developed to control for perceived psychometric shortcomings of the original TMT (for a review see [48]), and the Oral Trail Making Test (OTMT; [49]) was developed to reduce confounds associated with motor speed and visual search abilities, with research supporting the OTMT as an equivalent measure [50, 51]. Alternate forms of the TMT have also been developed to permit successive administrations [32, 52] and to assess the relative contributions of the requisite cognitive skills [53].

Delis et al. [54] stated that the continued development of new instrumentation for improving diagnosis and treatment is a critical undertaking in all health-related fields. Further, in their view, the field of neuropsychology has recognized the importance of continually striving to develop new clinical measures. Delis and colleagues developed the extensive Delis-Kaplan Executive Functioning System (D-KEFS; [55]) in the spirit of advancing the instrumentation of neuropsychology. The D-KEFS includes a Trail Making Test consisting of five separate conditions. The Number-Letter Switching condition involves a sequencing procedure similar to that of the classic TMT. The other four conditions are designed to assess the component processes involved in completing the Number-Letter Switching condition so that a precise analysis of the nature of any underlying dysfunction may be accomplished. Specifically, these additional components include Visual Scanning, Number Sequencing, Letter Sequencing, and Motor Speed.

Given that the TMT comprises numbers and letters and is a measure of executive functioning, it may preferentially involve the left frontal lobe. Although the literature is somewhat controversial, neuropsychological and neuroimaging studies seem to provide support for the sensitivity of the TMT to detect left frontal dysfunction [56]. Recent clinically oriented studies investigating frontal lobe involvement of the TMT using transcranial magnetic stimulation (TMS) and near-infrared spectroscopy (NIRS) also support this localization [57]. Performance on Part B of the TMT improved following repetitive TMS applied to the left dorsolateral frontal lobe [57].

With 9–13-year-old boys performing TMT Part B, Weber et al. [58] found a left-lateralized increase in deoxygenated hemoglobin in the prefrontal cortex, an indicator of increased oxygen consumption. Moll et al. [59] demonstrated increased activation specific to the prefrontal cortex, especially the left prefrontal region, in healthy controls performing Part B of the TMT. Foster et al. [60] found a significant positive correlation between performance on Part A of the TMT and low beta (13–21 Hz) magnitude (μV) at the left lateral frontal lobe, but not at the right lateral frontal lobe. Finally, Stuss et al. [13, 14] found that patients with left dorsolateral frontal dysfunction made more errors than patients with lesions in other areas of the frontal lobes, and that patients with left frontal lesions were the slowest to complete the test.

Taken together, the possibility exists that the aforementioned tests are largely associated with left frontal lobe activity, and the TMT, in particular, provides information concerning mental processing speed as well as cognitive flexibility and set-shifting. While some studies have found that deficits in visuomotor set-shifting are specific to frontal lobe damage [61], other investigators have reported such impairment in patients with posterior brain lesions and widespread cerebral dysfunction, including cerebellar damage [62] and Alzheimer’s disease [63]. Thus, it remains unclear whether impairments in visuomotor set-shifting are specific to frontal lobe dysfunction or whether they are non-specific and can result from more posterior or widespread brain dysfunction.

Compared to the collective knowledge we have regarding the cognitive roles of the left frontal lobe, relatively little is known about right frontal lobe contributions to executive functioning. This is likely a result of the dearth of tests that are associated with right frontal activity. The Ruff Figural Fluency Test (RFFT; [64]) is among the few standardized tests of right frontal lobe functioning and was listed as the 14th most commonly used instrument to assess executive functioning in the Rabin et al. [1] survey. The RFFT is known to be sensitive to right frontal lobe functioning ([65, 66]; see also [67], pp. 297–298), as is a measure based on the RFFT [19].

The present investigation, with the same intent and spirit as that reported by Delis et al. [54], sought to develop and initially validate a measure of right frontal lobe functioning in an effort to attain a greater understanding of right frontal contributions to executive functioning and to advance the instrumentation of neuropsychology. To meet this objective, a version of the Trail Making Test comprising figures, as opposed to numbers and letters, was developed. The TMT was used as a model for the new test, referred to as the Figure Trail Making Test (FTMT), due to the high frequency of use, the volume of research conducted, and the ease of administration of the TMT. Given that the TMT and the FTMT are both measuring executive functioning, we felt that a moderate correlation would exist between these two measures. Specifically, we hypothesized that performance on the FTMT would be positively correlated with performance on the TMT, in terms of the total time required to complete each part of the tests, an additive and subtractive score, and a ratio score. The total time required to complete each part of the FTMT was also hypothesized to be negatively correlated with the total number of unique designs produced on the RFFT and positively correlated with the number of perseverative errors committed on the RFFT and the perseverative error ratio. We also sought to determine whether the TMT and the FTMT were measuring different constructs by conducting a factor analysis, anticipating that the two tests would load on separate factors.

Additionally, we sought to obtain neurophysiological evidence that the FTMT is sensitive to right frontal lobe functioning. Specifically, we used quantitative electroencephalography (QEEG) to measure electrical activity over the left and right frontal lobes. A previous investigation we conducted found that performance on Part A of the TMT was related to left frontal lobe (F7) low beta magnitude [60]. For the present investigation, we predicted that significant negative correlations would exist between performance on Parts A and B of the TMT and both low and high beta magnitude at the F7 electrode site. We further predicted that significant negative correlations would exist between performance on Parts C and D of the FTMT and both low and high beta magnitude at the F8 electrode site.
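
To make the QEEG prediction concrete, the sketch below shows one way such a test could be run: estimate band-limited beta magnitude at a single electrode from resting EEG and correlate it with test completion times. The sampling rate, the high-beta band edges, and all of the data are hypothetical assumptions, and this is not the authors' actual processing pipeline.

```python
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

FS = 256                      # assumed sampling rate (Hz)
HIGH_BETA = (21.0, 32.0)      # assumed high-beta band edges (Hz)

def band_magnitude(signal, fs, band):
    """Mean square-root spectral density of one EEG channel within a band,
    used here as a rough stand-in for band magnitude."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return float(np.sqrt(psd[mask]).mean())

# Hypothetical data: 60 s of F8 EEG per subject and FTMT Part D times
rng = np.random.default_rng(1)
n_subjects = 42
f8_eeg = rng.standard_normal((n_subjects, FS * 60))
part_d_time = rng.normal(120.0, 25.0, n_subjects)        # seconds

high_beta_f8 = np.array([band_magnitude(x, FS, HIGH_BETA) for x in f8_eeg])
r, p = pearsonr(high_beta_f8, part_d_time)
print(f"r = {r:.2f}, p = {p:.3f}   (a negative r would support the prediction)")
```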

3 Discussion

The need for additional measures of executive functions, especially instruments that may carry implications for cerebral laterality, is clear. In particular, there remains a void of neuropsychological instruments using a TMT format that can provide information about the functional integrity of the right frontal region. Consistent with the hypotheses forwarded, significant correlations were found between performance on the TMT and the FTMT, in terms of the raw time required to complete each respective part of the tests as well as the additive and subtraction scores. The fact that the ratio scores were not significantly correlated is not surprising, given that research has generally indicated a lack of clinical utility for this score [13, 14, 35]. Given the present findings, the TMT and the FTMT appear to be equivalent measures of executive functioning. Further, the present findings not only suggest that the FTMT may be a measure of executive functioning but also extend the realm of executive functioning to the sequencing and set-shifting of nonverbal stimuli.

However, the finding of significant correlations between the TMT and the FTMT represents somewhat of a caveat, in that the TMT has been found to be sensitive to left frontal lobe functioning [13, 14, 57, 59]. This would seem to suggest the possibility that the FTMT is also sensitive to left frontal lobe functioning. The possibility that the FTMT is related to left frontal lobe functioning is tempered, though, by the fact that many of the hypothesized correlations between performance on the RFFT and the FTMT were also significant. Performance on the RFFT is related to right frontal lobe functioning [65, 66]. Thus, the significant correlations between the RFFT and the FTMT suggest that the FTMT may also be sensitive to right frontal lobe functioning. Additionally, it should be noted that the TMT was not significantly correlated with performance on the RFFT, with the exception of the significant correlation between performance on TMT Part A and the total number of unique designs produced on the RFFT. Taken together, the results suggest that the FTMT may be a measure of right frontal executive functioning.

Additional support for the sensitivity of the FTMT to right frontal lobe functioning is provided by the finding of a significant negative correlation between performance on Part D of the FTMT and high beta magnitude. We have previously used QEEG to provide neurophysiological validation of the RFFT [65] and the Rey Auditory Verbal Learning Test [70], and the present findings provide further support for the use of QEEG in validating neuropsychological tests. The lack of significant correlations between the TMT and either low or high beta magnitude may be related to a restricted range of scores on the TMT. As a whole, performance on the FTMT was more variable than performance on the TMT, and this relatively restricted range for the TMT may have affected the obtained correlations. Given the present findings, together with those of the Foster et al. [65, 70] investigations, further support is also provided for the use of EEG in establishing neurophysiological validation for neuropsychological tests.

The results of the factor analysis support the contention that the FTMT may be a measure of right frontal lobe activity and also provide initial discriminant validity for the FTMT. Specifically, Parts C and D of the FTMT were found to load on the same factor as the number of designs generated on the RFFT, although the time required to complete Part A of the TMT was also included. Additionally, the number of errors committed on Parts C and D of the FTMT comprised a single factor, separate from either the TMT or the RFFT. Although these results support the FTMT as a measure of nonverbal executive functioning, it would be helpful to conduct an additional factor analysis including further measures of right frontal functioning, and perhaps other measures of right hemisphere functioning, as marker variables.
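
As an illustration of the kind of factor analysis described here, the sketch below extracts two latent factors from a set of test scores and inspects the loadings. The scores are randomly generated stand-ins, and the varimax-rotated maximum-likelihood model is only one of several defensible choices; the authors' exact extraction and rotation methods are not reproduced.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Hypothetical score matrix: rows are participants, columns are test variables
rng = np.random.default_rng(2)
n = 42
scores = pd.DataFrame({
    "TMT_A_time": rng.normal(30, 8, n),
    "TMT_B_time": rng.normal(70, 20, n),
    "FTMT_C_time": rng.normal(60, 18, n),
    "FTMT_D_time": rng.normal(110, 30, n),
    "RFFT_unique_designs": rng.normal(90, 20, n),
})

# Standardize, extract two latent factors, and inspect the rotated loadings;
# the substantive question would be whether TMT and FTMT variables separate
z = StandardScaler().fit_transform(scores)
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
fa.fit(z)
loadings = pd.DataFrame(fa.components_.T, index=scores.columns,
                        columns=["factor_1", "factor_2"])
print(loadings.round(2))
```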

We sought to develop a measure sensitive to right frontal lobe functioning due to the paucity of such tests and the potentially important uses that right frontal lobe tests may have clinically. Tests of right frontal lobe functioning may, for instance, be useful in identifying and distinguishing left versus right frontotemporal dementia (FTD). Research has indicated that FTD is associated with cerebral atrophy at the right dorsolateral frontal and left premotor cortices [71]. Fukui and Kertesz [72] found right frontal lobe volume reduction in FTD relative to Alzheimer’s disease and progressive nonfluent aphasia. Some have suggested that FTD should not be considered as a unitary disorder and that neuropsychological testing may aid in differentially diagnosing left versus right FTD [73].

Whereas right FTD has been associated with more errors and perseverative responses on the Wisconsin Card Sorting Test (WCST), left FTD has been associated with significantly worse performance on the Boston Naming Test (BNT) and the Stroop Color-Word Test [73]. Razani et al. [74] also distinguished between left and right FTD, finding that left FTD patients performed worse on the BNT while right FTD patients performed worse on the WCST. However, as noted earlier, the WCST has been associated with left frontal activity [9], right frontal activation [8], and bilateral frontal activation [7]. Further, patients with left frontal tumors perform worse than those with right frontal tumors [11].

Patients with FTD that predominantly involves the right frontotemporal region show behavioral and emotional abnormalities, whereas those with predominantly left frontotemporal damage show a loss of lexical semantic knowledge. Patients in whom neural degeneration begins on the left side often present to clinicians at an early stage of the disease because of language abnormalities, but maintain their emotion-processing abilities as long as the right anterior temporal lobe is preserved. As the disease advances, however, it may progress to the right frontotemporal regions. Tests sensitive to right frontal lobe functioning may be useful tools for anticipating the course of the disease, enabling prompt and specific treatment and informing caregivers about its likely future course.

A potentially more important use of tests sensitive to right frontal lobe functioning, though, may be in predicting dementia patients that will develop significant and disruptive behavioral deficits. Research has found that approximately 92 % of right-sided FTD patients exhibit socially undesirable behaviors as their initial symptom, as compared to only 11 % of left-sided FTD patients [75]. Behavioral deficits in FTD are associated with gray matter loss at the dorsomedial frontal region, particularly on the right [76].

Alzheimer’s disease (AD) is also often associated with significant behavioral disturbances. Even AD patients with mild dementia are noted to exhibit behavioral deficits such as delusions, hallucinations, agitation, dysphoria, anxiety, apathy, and irritability [77]. Indeed, Shimabukuro et al. [77] found that regardless of dementia severity, over half of all AD patients exhibited apathy, delusions, irritability, dysphoria, and anxiety. Delusions in AD patients are associated with relative right frontal hypoperfusion as indicated by SPECT imaging [78, 79]. Further, positron emission tomography (PET) has indicated that AD patients exhibiting delusions exhibit hypometabolism at the right superior dorsolateral frontal and right inferior frontal pole [80].

Although research clearly implicates right frontal lobe dysfunction in the expression of behavioral deficits, data from neuropsychological testing are not as clear. Negative symptoms in patients with AD and FTD have been related to measures of nonverbal and verbal executive functioning as well as verbal memory [81]. Positive symptoms, in contrast, were related to constructional skills and attention. However, Staff et al. [78] failed to dissociate patients with delusions from those without delusions based on neuropsychological test performance, despite significant differences existing in right frontal and limbic functioning as revealed by functional imaging. The inclusion of other measures of right frontal lobe functioning may result in improved neuropsychological differentiation of dementia patients with and without significant behavioral disturbances. Further, it may be possible to predict early in the disease process those patients that will ultimately develop behavioral disturbances with improved measures of right frontal functioning. Predicting those that may develop behavioral problems will permit earlier treatment and will provide the family with more time to prepare for the potential emergence of such difficulties. Certainly, future research needs to be conducted that incorporates measures of right and left frontal lobe functioning in regression analyses to determine the plausibility of such prediction.
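
The regression analyses proposed in the preceding paragraph might look something like the sketch below, in which hypothetical right- and left-frontal test scores are used to predict a binary behavioral-disturbance outcome. All data are simulated and the model choice is an assumption; the sketch only illustrates the shape of such an analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Simulated dataset: right- and left-frontal measures plus a binary outcome
# (later behavioral disturbance). Everything here is hypothetical.
rng = np.random.default_rng(4)
n = 120
ftmt_d_time = rng.normal(110, 30, n)     # stand-in right-frontal measure
tmt_b_time = rng.normal(75, 20, n)       # stand-in left-frontal measure
X = np.column_stack([ftmt_d_time, tmt_b_time])
# Outcome loosely tied to the right-frontal measure, for illustration only
y = (ftmt_d_time + rng.normal(0, 25, n) > 125).astype(int)

model = LogisticRegression()
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")
```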

Tests sensitive to right frontal lobe functioning may also be useful in identifying more subtle right frontal lobe dysfunction and the cognitive and behavioral changes that follow. The right frontal lobe mediates language melody or prosody, forms cohesive discourse, interprets abstract communication in spoken and written language, and interprets the inferred relationships involved in communication. Subtle difficulties in interpreting abstract meaning in communication, comprehending metaphors, and even understanding jokes, which are often seen in right frontal lobe stroke patients, may not be detected by the family and may also be underdiagnosed by clinicians [82]. Further, patients with right frontal lobe lesions are generally more euphoric and unconcerned, often minimizing their symptoms [82] or denying the illness, which may delay referral to a clinician and diagnosis.

Attention deficit hyperactivity disorder (ADHD) is a neurological disorder characterized by deficits in motor inhibition, problems with cognitive flexibility, social disruption, and emotional disinhibition [83, 84]. Functional MRI studies reveal reduced right prefrontal activation during “frontal tasks,” such as go/no-go [85], Stroop [86], and attention tasks [87]. The right frontal lobe deficit hypothesis is further supported by structural studies [88, 89]. Tests of right frontal lobe functioning may be useful in further characterizing the nature of this deficit and in specifying the likely hemispheric locus of dysfunction.

To summarize, we feel that right frontal lobe functioning has been relatively neglected in neuropsychological assessment and that many uses for such tests exist. Our intent was to develop a test purportedly sensitive to right frontal functioning that would be easy and quick to administer in a clinical setting. However, we are certainly not meaning to assert that our FTMT would be applicable in all the aforementioned conditions. Additional research should be conducted to determine the precise clinical utility of the FTMT.

Further validation of the FTMT should also be undertaken. Establishing convergent validation may involve correlating tests measuring the same domain, such as executive functioning. This was initially accomplished in the present investigation through the significant correlations between the TMT and the FTMT. Additionally, convergent validation may also involve correlating tests that purportedly measure the same region of the brain. This was also initially accomplished in the present investigation through the significant correlations between the FTMT and the RFFT. However, additional convergent validation certainly needs to be obtained, as well as validation using patient populations and neurophysiological validation.

We are currently collecting data that hopefully will provide neurophysiological validation of the FTMT. Certainly, though, it is hoped that the present investigation will not only stimulate further research seeking to validate the FTMT and provide more comprehensive normative data, but also stimulate research investigating whether the FTMT or other measures of right frontal lobe functioning may be used to predict patients that will develop behavioral disturbances.

 

World’s Greatest Literature Reveals Multifractals, Cascades of Consciousness

http://www.scientificcomputing.com/news/2016/01/worlds-greatest-literature-reveals-multifractals-cascades-consciousness

http://www.scientificcomputing.com/sites/scientificcomputing.com/files/Worlds_Greatest_Literature_Reveals_Multifractals_Cascades_of_Consciousness_440.jpg

Multifractal analysis of Finnegan’s Wake by James Joyce. The ideal shape of the graph is virtually indistinguishable from the results for purely mathematical multifractals. The horizontal axis represents the degree of singularity, and the vertical axis shows the spectrum of singularity. Courtesy of IFJ PAN

Arthur Conan Doyle, Charles Dickens, James Joyce, William Shakespeare and JRR Tolkien. Regardless of the language they were working in, some of the world’s greatest writers appear to be, in some respects, constructing fractals. Statistical analysis, however, revealed something even more intriguing. The composition of works from within a particular genre was characterized by the exceptional dynamics of a cascading (avalanche) narrative structure. This type of narrative turns out to be multifractal. That is, fractals of fractals are created.

As far as many bookworms are concerned, advanced equations and graphs are the last things which would hold their interest, but there’s no escape from the math. Physicists from the Institute of Nuclear Physics of the Polish Academy of Sciences (IFJ) in Cracow, Poland, performed a detailed statistical analysis of more than one hundred famous works of world literature, written in several languages and representing various literary genres. The books, tested for revealing correlations in variations of sentence length, proved to be governed by the dynamics of a cascade. This means that the construction of these books is, in fact, a fractal. In the case of several works, their mathematical complexity proved to be exceptional, comparable to the structure of complex mathematical objects considered to be multifractal. Interestingly, in the analyzed pool of all the works, one genre turned out to be exceptionally multifractal in nature.

Fractals are self-similar mathematical objects: when we begin to expand one fragment or another, what eventually emerges is a structure that resembles the original object. Typical fractals, especially those widely known as the Sierpinski triangle and the Mandelbrot set, are monofractals, meaning that the pace of enlargement in any place of a fractal is the same, linear: if they at some point were rescaled x number of times to reveal a structure similar to the original, the same increase in another place would also reveal a similar structure.

Multifractals are more highly advanced mathematical structures: fractals of fractals. They arise from fractals ‘interwoven’ with each other in an appropriate manner and in appropriate proportions. Multifractals are not simply the sum of fractals and cannot be divided to return back to their original components, because the way they weave is fractal in nature. The result is that, in order to see a structure similar to the original, different portions of a multifractal need to expand at different rates. A multifractal is, therefore, non-linear in nature.

“Analyses on multiple scales, carried out using fractals, allow us to neatly grasp information on correlations among data at various levels of complexity of tested systems. As a result, they point to the hierarchical organization of phenomena and structures found in nature. So, we can expect natural language, which represents a major evolutionary leap of the natural world, to show such correlations as well. Their existence in literary works, however, had not yet been convincingly documented. Meanwhile, it turned out that, when you look at these works from the proper perspective, these correlations appear to be not only common, but in some works they take on a particularly sophisticated mathematical complexity,” says Professor Stanislaw Drozdz, IFJ PAN, Cracow University of Technology.

The study involved 113 literary works written in English, French, German, Italian, Polish, Russian and Spanish by such famous figures as Honore de Balzac, Arthur Conan Doyle, Julio Cortazar, Charles Dickens, Fyodor Dostoevsky, Alexandre Dumas, Umberto Eco, George Eliot, Victor Hugo, James Joyce, Thomas Mann, Marcel Proust, Wladyslaw Reymont, William Shakespeare, Henryk Sienkiewicz, JRR Tolkien, Leo Tolstoy and Virginia Woolf, among others. The selected works were no less than 5,000 sentences long, in order to ensure statistical reliability.

To convert the texts to numerical sequences, sentence length was measured by the number of words (an alternative method of counting characters in the sentence turned out to have no major impact on the conclusions). Dependences were then searched for in the data — beginning with the simplest, i.e. linear. The question posed was: if a sentence of a given length is x times longer than sentences of a different length, is the same proportion preserved when looking at sentences that are respectively longer or shorter?
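
A minimal sketch of that first step, converting a text into a numerical series of sentence lengths, is shown below. The regex-based sentence splitter is a naive stand-in and does not reproduce the tokenization rules used in the study.

```python
import re

def sentence_lengths(text):
    """Turn a text into a sequence of sentence lengths (word counts).
    The regex splitter is a naive stand-in for proper sentence tokenization."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [len(s.split()) for s in sentences if s.strip()]

sample = ("Call me Ishmael. Some years ago, having little or no money in my "
          "purse, I thought I would sail about a little. It is a way I have "
          "of driving off the spleen.")
print(sentence_lengths(sample))    # one word count per sentence
```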

“All of the examined works showed self-similarity in terms of the organization of sentence lengths. Some were more expressive — here The Ambassadors by Henry James stood out — while in others it was far less extreme, as in the case of the French seventeenth-century romance Artamene ou le Grand Cyrus. However, correlations were evident and, therefore, these texts were fractal in construction,” comments Dr. Pawel Oswiecimka (IFJ PAN), who also noted that the fractality of a literary text will, in practice, never be as perfect as in the world of mathematics. It is possible to magnify mathematical fractals up to infinity, while the number of sentences in each book is finite and, at a certain stage of scaling, there will always be a cut-off in the form of the end of the dataset.
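
The fractal scaling described here is typically quantified with detrended fluctuation analysis and its multifractal extension (MFDFA). The sketch below is a stripped-down, forward-pass-only MFDFA applied to a synthetic sentence-length series; the scale range, q values, and detrending order are arbitrary choices, and the code is not the authors' implementation. A monofractal series yields a nearly constant generalized Hurst exponent h(q), whereas a multifractal one shows a clear spread across q.

```python
import numpy as np

def mfdfa_hurst(series, scales, q_values, order=1):
    """Minimal forward-pass MFDFA: returns generalized Hurst exponents h(q).
    A monofractal series gives a roughly flat h(q); multifractality shows up
    as a clear dependence of h on q."""
    profile = np.cumsum(np.asarray(series, float) - np.mean(series))
    h = []
    for q in q_values:
        log_f, log_s = [], []
        for s in scales:
            n_seg = len(profile) // s
            if n_seg < 4:
                continue
            segs = profile[:n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            var = np.array([np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
                            for seg in segs])
            if abs(q) < 1e-9:                          # q -> 0 limit
                fq = np.exp(0.5 * np.mean(np.log(var)))
            else:
                fq = np.mean(var ** (q / 2.0)) ** (1.0 / q)
            log_f.append(np.log(fq))
            log_s.append(np.log(s))
        h.append(np.polyfit(log_s, log_f, 1)[0])       # slope = h(q)
    return np.array(h)

# Synthetic, roughly monofractal stand-in for a book's sentence-length series
rng = np.random.default_rng(3)
lengths = rng.poisson(15, 8000).astype(float)
scales = np.unique(np.logspace(1.2, 3.0, 15).astype(int))
q_values = np.arange(-4, 5)
hq = mfdfa_hurst(lengths, scales, q_values)
print("h(q) values:", np.round(hq, 2))
print("spread max(h) - min(h):", round(float(hq.max() - hq.min()), 3))
```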

Things took a particularly interesting turn when physicists from IFJ PAN began tracking non-linear dependence, which in most of the studied works was present to a slight or moderate degree. However, more than a dozen works revealed a very clear multifractal structure, and almost all of these proved to be representative of one genre, that of stream of consciousness. The only exception was the Bible, specifically the Old Testament, which has, so far, never been associated with this literary genre.

“The absolute record in terms of multifractality turned out to be Finnegan’s Wake by James Joyce. The results of our analysis of this text are virtually indistinguishable from ideal, purely mathematical multifractals,” says Drozdz.

The most multifractal works also included A Heartbreaking Work of Staggering Genius by Dave Eggers, Rayuela by Julio Cortazar, The US Trilogy by John Dos Passos, The Waves by Virginia Woolf, 2666 by Roberto Bolano, and Joyce’s Ulysses. At the same time, a lot of works usually regarded as stream of consciousness turned out to show little correlation to multifractality, as it was hardly noticeable in books such as Atlas Shrugged by Ayn Rand and A la recherche du temps perdu by Marcel Proust.

“It is not entirely clear whether stream of consciousness writing actually reveals the deeper qualities of our consciousness, or rather the imagination of the writers. It is hardly surprising that ascribing a work to a particular genre is, for whatever reason, sometimes subjective. We see, moreover, the possibility of an interesting application of our methodology: it may someday help in a more objective assignment of books to one genre or another,” notes Drozdz.

Multifractal analyses of literary texts carried out by the IFJ PAN have been published in Information Sciences, the journal of computer science. The publication has undergone rigorous verification: given the interdisciplinary nature of the subject, editors immediately appointed up to six reviewers.

Citation: “Quantifying origin and character of long-range correlations in narrative texts” S. Drożdż, P. Oświęcimka, A. Kulig, J. Kwapień, K. Bazarnik, I. Grabska-Gradzińska, J. Rybicki, M. Stanuszek; Information Sciences, vol. 331, 32–44, 20 February 2016; DOI: 10.1016/j.ins.2015.10.023

 

New Quantum Approach to Big Data could make Impossibly Complex Problems Solvable

David L. Chandler, MIT

http://www.scientificcomputing.com/news/2016/01/new-quantum-approach-big-data-could-make-impossibly-complex-problems-solvable

 

http://www.scientificcomputing.com/sites/scientificcomputing.com/files/New_Quantum_Approach_to_Big_Data_could_make_Impossibly_Complex_Problems_Solvable_440.jpg

This diagram demonstrates the simplified results that can be obtained by using quantum analysis on enormous, complex sets of data. Shown here are the connections between different regions of the brain in a control subject (left) and a subject under the influence of the psychedelic compound psilocybin (right). This demonstrates a dramatic increase in connectivity, which explains some of the drug’s effects (such as “hearing” colors or “seeing” smells). Such an analysis, involving billions of brain cells, would be too complex for conventional techniques, but could be handled easily by the new quantum approach, the researchers say. Courtesy of the researchers

From gene mapping to space exploration, humanity continues to generate ever-larger sets of data — far more information than people can actually process, manage or understand.

Machine learning systems can help researchers deal with this ever-growing flood of information. Some of the most powerful of these analytical tools are based on a strange branch of geometry called topology, which deals with properties that stay the same even when something is bent and stretched every which way.

Such topological systems are especially useful for analyzing the connections in complex networks, such as the internal wiring of the brain, the U.S. power grid, or the global interconnections of the Internet. But even with the most powerful modern supercomputers, such problems remain daunting and impractical to solve. Now, a new approach that would use quantum computers to streamline these problems has been developed by researchers at MIT, the University of Waterloo, and the University of Southern California.

The team describes their theoretical proposal this week in the journal Nature Communications. Seth Lloyd, the paper’s lead author and the Nam P. Suh Professor of Mechanical Engineering, explains that algebraic topology is key to the new method. This approach, he says, helps to reduce the impact of the inevitable distortions that arise every time someone collects data about the real world.

In a topological description, basic features of the data (How many holes does it have? How are the different parts connected?) are considered the same no matter how much they are stretched, compressed, or distorted. Lloyd explains that it is often these fundamental topological attributes “that are important in trying to reconstruct the underlying patterns in the real world that the data are supposed to represent.”

It doesn’t matter what kind of dataset is being analyzed, he says. The topological approach to looking for connections and holes “works whether it’s an actual physical hole, or the data represents a logical argument and there’s a hole in the argument. This will find both kinds of holes.”
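
Concretely, the "holes" being counted are captured by Betti numbers (connected components, loops, voids, and so on). The toy computation below finds them for a tiny hand-built simplicial complex by taking ranks of boundary matrices with real coefficients; it is exactly this kind of computation, trivial here but combinatorially explosive for large datasets, that the proposed quantum algorithm targets. The complex and the coefficient choice are illustrative assumptions, not part of the Nature Communications paper.

```python
import numpy as np

def boundary_matrix(k_simplices, km1_simplices):
    """Boundary operator from k-simplices to (k-1)-simplices, real coefficients."""
    index = {s: i for i, s in enumerate(km1_simplices)}
    D = np.zeros((len(km1_simplices), len(k_simplices)))
    for j, simplex in enumerate(k_simplices):
        for i in range(len(simplex)):                  # drop the i-th vertex
            face = simplex[:i] + simplex[i + 1:]
            D[index[face], j] = (-1) ** i
    return D

def betti_numbers(complex_by_dim):
    """Betti numbers b_k = dim C_k - rank d_k - rank d_(k+1)
    (real coefficients, adequate for torsion-free toy examples)."""
    betti = []
    for k in sorted(complex_by_dim):
        ck = complex_by_dim[k]
        rank_dk = 0 if k == 0 else np.linalg.matrix_rank(
            boundary_matrix(ck, complex_by_dim[k - 1]))
        rank_dk1 = (np.linalg.matrix_rank(boundary_matrix(complex_by_dim[k + 1], ck))
                    if k + 1 in complex_by_dim else 0)
        betti.append(len(ck) - rank_dk - rank_dk1)
    return betti

# A hollow triangle has one component and one loop: b0 = 1, b1 = 1.
hollow = {0: [(0,), (1,), (2,)], 1: [(0, 1), (0, 2), (1, 2)]}
# Filling the triangle with a 2-simplex closes the hole: b1 drops to 0.
filled = {**hollow, 2: [(0, 1, 2)]}
print("hollow triangle:", betti_numbers(hollow))   # [1, 1]
print("filled triangle:", betti_numbers(filled))   # [1, 0, 0]
```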

Using conventional computers, that approach is too demanding for all but the simplest situations. Topological analysis “represents a crucial way of getting at the significant features of the data, but it’s computationally very expensive,” Lloyd says. “This is where quantum mechanics kicks in.” The new quantum-based approach, he says, could exponentially speed up such calculations.

Lloyd offers an example to illustrate that potential speedup: If you have a dataset with 300 points, a conventional approach to analyzing all the topological features in that system would require “a computer the size of the universe,” he says. That is, it would take 2^300 (two to the 300th power) processing units — approximately the number of all the particles in the universe. In other words, the problem is simply not solvable in that way.

“That’s where our algorithm kicks in,” he says. Solving the same problem with the new system, using a quantum computer, would require just 300 quantum bits — and a device this size may be achieved in the next few years, according to Lloyd.

“Our algorithm shows that you don’t need a big quantum computer to kick some serious topological butt,” he says.

There are many important kinds of huge datasets where the quantum-topological approach could be useful, Lloyd says, for example understanding interconnections in the brain. “By applying topological analysis to datasets gleaned by electroencephalography or functional MRI, you can reveal the complex connectivity and topology of the sequences of firing neurons that underlie our thought processes,” he says.

The same approach could be used for analyzing many other kinds of information. “You could apply it to the world’s economy, or to social networks, or almost any system that involves long-range transport of goods or information,” Lloyd says. But the limits of classical computation have prevented such approaches from being applied before.

While this work is theoretical, “experimentalists have already contacted us about trying prototypes,” he says. “You could find the topology of simple structures on a very simple quantum computer. People are trying proof-of-concept experiments.”

Ignacio Cirac, a professor at the Max Planck Institute of Quantum Optics in Munich, Germany, who was not involved in this research, calls it “a very original idea, and I think that it has a great potential.” He adds “I guess that it has to be further developed and adapted to particular problems. In any case, I think that this is top-quality research.”

The team also included Silvano Garnerone of the University of Waterloo in Ontario, Canada, and Paolo Zanardi of the Center for Quantum Information Science and Technology at the University of Southern California. The work was supported by the Army Research Office, Air Force Office of Scientific Research, Defense Advanced Research Projects Agency, Multidisciplinary University Research Initiative of the Office of Naval Research, and the National Science Foundation.

 

Beyond Chess: Computer Beats Human in Ancient Chinese Game

http://www.rdmag.com/news/2016/01/beyond-chess-computer-beats-human-ancient-chinese-game

http://www.rdmag.com/sites/rdmag.com/files/rd1601_chess.jpg

A player places a black stone while his opponent waits to place a white one as they play Go, a game of strategy, in the Seattle Go Center, Tuesday, April 30, 2002. The game, which originated in China more than 2,500 years ago, involves two players who take turns putting markers on a grid. The object is to surround more area on the board with the markers than one’s opponent, as well as capturing the opponent’s pieces by surrounding them. A paper released Wednesday, Jan. 27, 2016 describes how a computer program has beaten a human master at the complex board game, marking significant advance for development of artificial intelligence. (AP Photo/Cheryl Hatch)

 

A computer program has beaten a human champion at the ancient Chinese board game Go, marking a significant advance in the development of artificial intelligence.

The program had taught itself how to win, and its developers say its learning strategy may someday let computers help solve real-world problems like making medical diagnoses and pursuing scientific research.

The program and its victory are described in a paper released Wednesday by the journal Nature.

Computers have previously surpassed humans at other games, including chess, checkers and backgammon. But among classic games, Go has long been viewed as the most challenging for artificial intelligence to master.

Go, which originated in China more than 2,500 years ago, involves two players who take turns putting markers on a checkerboard-like grid. The object is to surround more area on the board with the markers than one’s opponent, as well as capturing the opponent’s pieces by surrounding them.

While the rules are simple, playing it well is not. It’s “probably the most complex game ever devised by humans,” Demis Hassabis of Google DeepMind in London, one of the study authors, told reporters Tuesday.
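One way to see why Go resisted computers for so long is a rough game-tree comparison. The branching factors and game lengths in the sketch below are commonly cited ballpark figures (about 35 legal moves and 80 plies per game for chess versus about 250 moves and 150 plies for Go), used here only to illustrate the gap in scale.

```python
import math

# Rough, commonly cited game-tree estimates (branching factor ** typical plies).
# The exact numbers are approximations used only to illustrate the gap in scale.
games = {
    "chess": (35, 80),     # ~35 legal moves per position, ~80 plies per game
    "go":    (250, 150),   # ~250 legal moves per position, ~150 plies per game
}

for name, (branching, plies) in games.items():
    log10_tree = plies * math.log10(branching)
    print(f"{name:5s}: ~10^{log10_tree:.0f} possible game sequences")

# Prints roughly 10^124 for chess and 10^360 for Go, which is why brute-force
# search alone cannot master Go and learned position evaluation is needed.
```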

The new program, AlphaGo, defeated the European champion in all five games of a match in October, the Nature paper reports.

In March, AlphaGo will face legendary player Lee Sedol in Seoul, South Korea, for a $1 million prize, Hassabis said.

Martin Mueller, a computing science professor at the University of Alberta in Canada who has worked on Go programs for 30 years but didn’t participate in the AlphaGo project, said the new program “is really a big step up from everything else we’ve seen…. It’s a very, very impressive piece of work.”

 

 

Biological Origin of Schizophrenia

Excessive ‘pruning’ of connections between neurons in brain predisposes to disease

http://hms.harvard.edu/sites/default/files/uploads/news/McCarroll_C4_600x400.jpg

Imaging studies showed C4 (in green) located at the synapses of primary human neurons. Image: Heather de Rivera, McCarroll lab

 PAUL GOLDSMITH    http://hms.harvard.edu/news/biological-origin-schizophrenia

The risk of schizophrenia increases if a person inherits specific variants in a gene related to “synaptic pruning”—the elimination of connections between neurons—according to a study from Harvard Medical School, the Broad Institute and Boston Children’s Hospital. The findings were based on genetic analysis of nearly 65,000 people.

The study represents the first time that the origin of this psychiatric disease has been causally linked to specific gene variants and a biological process.


It also helps explain two decades-old observations: synaptic pruning is particularly active during adolescence, which is the typical period of onset for symptoms of schizophrenia, and the brains of schizophrenic patients tend to show fewer connections between neurons.

The gene, complement component 4 (C4), plays a well-known role in the immune system. It has now been shown to also play a key role in brain development and schizophrenia risk. The insight may allow future therapeutic strategies to be directed at the disorder’s roots, rather than just its symptoms.

The study, which appears online Jan. 27 in Nature, was led by HMS researchers at the Broad Institute’s Stanley Center for Psychiatric Research and Boston Children’s. They include senior author Steven McCarroll, HMS associate professor of genetics and director of genetics for the Stanley Center; Beth Stevens, HMS assistant professor of neurology at Boston Children’s and institute member at the Broad; Michael Carroll, HMS professor of pediatrics at Boston Children’s; and first author Aswin Sekar, an MD-PhD student at HMS.

The study has the potential to reinvigorate translational research on a debilitating disease. Schizophrenia afflicts approximately 1 percent of people worldwide and is characterized by hallucinations, emotional withdrawal and a decline in cognitive function. These symptoms most frequently begin in patients when they are teenagers or young adults.

“These results show that it is possible to go from genetic data to a new way of thinking about how a disease develops—something that has been greatly needed.”

First described more than 130 years ago, schizophrenia lacks highly effective treatments and has seen few biological or medical breakthroughs over the past half-century.

In the summer of 2014, an international consortium led by researchers at the Stanley Center identified more than 100 regions in the human genome that carry risk factors for schizophrenia.

The newly published study now reports the discovery of the specific gene underlying the strongest of these risk factors and links it to a specific biological process in the brain.

“Since schizophrenia was first described over a century ago, its underlying biology has been a black box, in part because it has been virtually impossible to model the disorder in cells or animals,” said McCarroll. “The human genome is providing a powerful new way in to this disease. Understanding these genetic effects on risk is a way of prying open that black box, peering inside and starting to see actual biological mechanisms.”

“This study marks a crucial turning point in the fight against mental illness,” said Bruce Cuthbert, acting director of the National Institute of Mental Health. “Because the molecular origins of psychiatric diseases are little-understood, efforts by pharmaceutical companies to pursue new therapeutics are few and far between. This study changes the game. Thanks to this genetic breakthrough we can finally see the potential for clinical tests, early detection, new treatments and even prevention.”

The path to discovery

The discovery involved the collection of DNA from more than 100,000 people, detailed analysis of complex genetic variation in more than 65,000 human genomes, development of an innovative analytical strategy, examination of postmortem brain samples from hundreds of people and the use of animal models to show that a protein from the immune system also plays a previously unsuspected role in the brain.

Over the past five years, Stanley Center geneticists and collaborators around the world collected more than 100,000 human DNA samples from 30 different countries to locate regions of the human genome harboring genetic variants that increase the risk of schizophrenia. The strongest signal by far was on chromosome 6, in a region of DNA long associated with infectious disease. This caused some observers to suggest that schizophrenia might be triggered by an infectious agent. But researchers had no idea which of the hundreds of genes in the region was actually responsible or how it acted.

Based on analyses of the genetic data, McCarroll and Sekar focused on a region containing the C4 gene. Unlike most genes, C4 has a high degree of structural variability. Different people have different numbers of copies and different types of the gene.

McCarroll and Sekar developed a new molecular technique to characterize the C4 gene structure in human DNA samples. They also measured C4 gene activity in nearly 700 post-mortem brain samples.

They found that the C4 gene structure (DNA) could predict the C4 gene activity (RNA) in each person’s brain. They then used this information to infer C4 gene activity from genome data from 65,000 people with and without schizophrenia.

These data revealed a striking correlation. People who had particular structural forms of the C4 gene showed higher expression of that gene and, in turn, had a higher risk of developing schizophrenia.
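The logic of that two-step analysis can be sketched in a few lines of Python. Everything below is illustrative: the structural forms, expression values and cohort are simulated, and this is not the study’s code. The point is simply the workflow the article describes: learn expression per C4 structural form from measured brain samples, impute expression across a genotyped cohort, then compare cases with controls.

```python
import numpy as np

rng = np.random.default_rng(1)

# Step 1 (illustrative): expression levels learned from brain samples, keyed by
# C4 structural form. The forms and numbers here are made up for illustration.
expression_by_form = {"AL-AL": 1.4, "AL-BL": 1.0, "BL-BL": 0.7}

# Step 2 (illustrative): a synthetic cohort in which higher imputed C4A
# expression modestly raises disease probability, echoing the reported pattern.
forms = rng.choice(list(expression_by_form), size=20_000, p=[0.3, 0.5, 0.2])
expr = np.array([expression_by_form[f] for f in forms])
p_case = 0.01 * expr / expr.mean()            # toy dose-response risk model
is_case = rng.random(expr.size) < p_case

# Step 3: compare imputed expression between cases and controls.
print(f"mean imputed expression, cases:    {expr[is_case].mean():.2f}")
print(f"mean imputed expression, controls: {expr[~is_case].mean():.2f}")
```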

Connecting cause and effect through neuroscience

But how exactly does C4—a protein known to mark infectious microbes for destruction by immune cells—affect the risk of schizophrenia?

Answering this question required synthesizing genetics and neurobiology.

Stevens, a recent recipient of a MacArthur Foundation “genius grant,” had found that other complement proteins in the immune system also played a role in brain development. These results came from studying an experimental model of synaptic pruning in the mouse visual system.

“This discovery enriches our understanding of the complement system in brain development and in disease, and we could not have made that leap without the genetics.”

Carroll had long studied C4 for its role in immune disease, and developed mice with different numbers of copies of C4.

The three labs set out to study the role of C4 in the brain.

They found that C4 played a key role in pruning synapses during maturation of the brain. In particular, they found that C4 was necessary for another protein—a complement component called C3—to be deposited onto synapses as a signal that the synapses should be pruned. The data also suggested that the more C4 activity an animal had, the more synapses were eliminated in its brain at a key time in development.

The findings may help explain the longstanding mystery of why the brains of people with schizophrenia tend to have a thinner cerebral cortex (the brain’s outer layer, responsible for many aspects of cognition) with fewer synapses than do brains of unaffected individuals. The work may also help explain why the onset of schizophrenia symptoms tends to occur in late adolescence.

The human brain normally undergoes widespread synapse pruning during adolescence, especially in the cerebral cortex. Excessive synaptic pruning during adolescence and early adulthood, due to increased complement (C4) activity, could lead to the cognitive symptoms seen in schizophrenia.

“Once we had the genetic findings in front of us we started thinking about the possibility that complement molecules are excessively tagging synapses in the developing brain,” Stevens said.

“This discovery enriches our understanding of the complement system in brain development and in disease, and we could not have made that leap without the genetics,” she said. “We’re far from having a treatment based on this, but it’s exciting to think that one day we might be able to turn down the pruning process in some individuals and decrease their risk.”

Opening a path toward early detection and potential therapies

Beyond providing the first insights into the biological origins of schizophrenia, the work raises the possibility that therapies might someday be developed that could turn down the level of synaptic pruning in people who show early symptoms of schizophrenia.

This would be a dramatically different approach from current medical therapies, which address only a specific symptom of schizophrenia—psychosis—rather than the disorder’s root causes, and which do not stop cognitive decline or other symptoms of the illness.

The researchers emphasize that therapies based on these findings are still years down the road. Still, the fact that much is already known about the role of complement proteins in the immune system means that researchers can tap into a wealth of existing knowledge to identify possible therapeutic approaches. For example, anticomplement drugs are already under development for treating other diseases.

“In this area of science, our dream has been to find disease mechanisms that lead to new kinds of treatments,” said McCarroll. “These results show that it is possible to go from genetic data to a new way of thinking about how a disease develops—something that has been greatly needed.”

This work was supported by the Broad Institute’s Stanley Center for Psychiatric Research and by the National Institutes of Health (grants U01MH105641, R01MH077139 and T32GM007753).

Adapted from a Broad Institute news release.

 

Scientists open the ‘black box’ of schizophrenia with dramatic genetic discovery

Amy Ellis Nutt    https://www.washingtonpost.com/news/speaking-of-science/wp/2016/01/27/scientists-open-the-black-box-of-schizophrenia-with-dramatic-genetic-finding/

 

Scientists Prune Away Schizophrenia’s Hidden Genetic Mechanisms

http://www.genengnews.com/gen-news-highlights/scientists-prune-away-schizophrenia-s-hidden-genetic-mechanisms/81252297/

https://youtu.be/s0y4equOTLg

A landmark study has revealed that a person’s risk of schizophrenia is increased if they inherit specific variants in a gene related to “synaptic pruning”—the elimination of connections between neurons. The findings represent the first time that the origin of this devastating psychiatric disease has been causally linked to specific gene variants and a biological process.

http://www.genengnews.com/Media/images/GENHighlight/thumb_107629_web2209513618.jpg

The site in Chromosome 6 harboring the gene C4 towers far above other risk-associated areas on schizophrenia’s genomic “skyline,” marking its strongest known genetic influence. The new study is the first to explain how specific gene versions work biologically to confer schizophrenia risk. [Psychiatric Genomics Consortium]

  • A new study by researchers at the Broad Institute’s Stanley Center for Psychiatric Research, Harvard Medical School, and Boston Children’s Hospital genetically analyzed nearly 65,000 people and revealed that an individual’s risk of schizophrenia is increased if they inherited distinct variants in a gene related to “synaptic pruning”—the elimination of connections between neurons. This new data represents the first time that the origin of this psychiatric disease has been causally linked to particular gene variants and a biological process.

The investigators discovered that versions of a gene commonly thought to be involved in immune function might trigger a runaway pruning of an adolescent brain’s still-maturing communications infrastructure. The researchers described a scenario where patients with schizophrenia show fewer such connections between neurons or synapses.

“Normally, pruning gets rid of excess connections we no longer need, streamlining our brain for optimal performance, but too much pruning can impair mental function,” explained Thomas Lehner, Ph.D., director of the Office of Genomics Research Coordination at the NIH’s National Institute of Mental Health (NIMH), which co-funded the study along with the Stanley Center for Psychiatric Research at the Broad Institute and other NIH components. “It could help explain schizophrenia’s delayed age-of-onset of symptoms in late adolescence and early adulthood and shrinkage of the brain’s working tissue. Interventions that put the brakes on this pruning process-gone-awry could prove transformative.”

The gene the research team implicated, dubbed C4 (complement component 4), was associated with the largest risk for the disorder. C4’s role represents some of the most compelling evidence, to date, linking specific gene versions to a biological process that could cause at least some cases of the illness.

The findings from this study were published recently in Nature through an article entitled “Schizophrenia risk from complex variation of complement component 4.”

“Since schizophrenia was first described over a century ago, its underlying biology has been a black box, in part because it has been virtually impossible to model the disorder in cells or animals,” noted senior study author Steven McCarroll, Ph.D., director of genetics for the Stanley Center and an associate professor of genetics at Harvard Medical School. “The human genome is providing a powerful new way into this disease. Understanding these genetic effects on risk is a way of prying open that black box, peering inside and starting to see actual biological mechanisms.”

Dr. McCarroll and his colleagues found that a stretch of chromosome 6 encompassing several genes known to be involved in immune function emerged as the strongest signal associated with schizophrenia risk in genome-wide analyses. Yet conventional genetics failed to turn up any specific gene versions there that were linked to schizophrenia.

In order to uncover how the immune-related site confers risk for the mental disorder, the scientists mounted a search for cryptic genetic influences that might generate unconventional signals. C4, a gene with known roles in immunity, emerged as a prime suspect because it is unusually variable across individuals.

Upon further investigation into the complexities of how such structural variation relates to the gene’s level of expression and how that, in turn, might link to schizophrenia, the team discovered structurally distinct versions that affect expression of two main forms of the gene within the brain. The more a version resulted in expression of one of the forms, called C4A, the more it was associated with schizophrenia. The greater number of copies an individual had of the suspect versions, the more C4 switched on and the higher their risk of developing schizophrenia. Furthermore, the C4 protein turned out to be most prevalent within the cellular machinery that supports connections between neurons.

“Once we had the genetic findings in front of us we started thinking about the possibility that complement molecules are excessively tagging synapses in the developing brain,” remarked co-author Beth Stevens, Ph.D. a neuroscientist and assistant professor of neurology at Boston Children’s Hospital and institute member at the Broad. “This discovery enriches our understanding of the complement system in brain development and disease, and we could not have made that leap without the genetics. We’re far from having a treatment based on this, but it’s exciting to think that one day we might be able to turn down the pruning process in some individuals and decrease their risk.”

“This study marks a crucial turning point in the fight against mental illness,” added acting NIMH director Bruce Cuthbert, Ph.D. “Because the molecular origins of psychiatric diseases are little-understood, efforts by pharmaceutical companies to pursue new therapeutics are few and far between. This study changes the game. Thanks to this genetic breakthrough, we can finally see the potential for clinical tests, early detection, new treatments, and even prevention.”

 


 

 

https://img.washingtonpost.com/wp-apps/imrs.php?src=https://img.washingtonpost.com/rf/image_908w/2010-2019/WashingtonPost/2011/09/27/Production/Sunday/SunBiz/Images/mental2b.jpg&w=1484


For the first time, scientists have pinned down a molecular process in the brain that helps to trigger schizophrenia. The researchers involved in the landmark study, which was published Wednesday in the journal Nature, say the discovery of this new genetic pathway probably reveals what goes wrong neurologically in a young person diagnosed with the devastating disorder.

The study marks a watershed moment, with the potential for early detection and new treatments that were unthinkable just a year ago, according to Steven Hyman, director of the Stanley Center for Psychiatric Research at the Broad Institute of MIT and Harvard. Hyman, a former director of the National Institute of Mental Health, calls it “the most significant mechanistic study about schizophrenia ever.”

“I’m a crusty, old, curmudgeonly skeptic,” he said. “But I’m almost giddy about these findings.”

The researchers, chiefly from the Broad Institute, Harvard Medical School and Boston Children’s Hospital, found that a person’s risk of schizophrenia is dramatically increased if they inherit variants of a gene important to “synaptic pruning” — the healthy reduction during adolescence of brain cell connections that are no longer needed.


In patients with schizophrenia, a variation in a single position in the DNA sequence marks too many synapses for removal and that pruning goes out of control. The result is an abnormal loss of gray matter.

The genes involved coat the neurons with “eat-me signals,” said study co-author Beth Stevens, a neuroscientist at Children’s Hospital and Broad. “They are tagging too many synapses. And they’re gobbled up.”

The Institute’s founding director, Eric Lander, believes the research represents an astonishing breakthrough. “It’s taking what has been a black box…and letting us peek inside for the first time. And that is amazingly consequential,” he said.

The timeline for this discovery has been relatively fast. In July 2014, Broad researchers published the results of the largest genomic study on the disorder and found more than 100 genetic locations linked to schizophrenia. Based on that research, Harvard and Broad geneticist Steven McCarroll analyzed data from about 29,000 schizophrenia cases, 36,000 controls and 700 post mortem brains. The information was drawn from dozens of studies performed in 22 countries, all of which contribute to the worldwide database called the Psychiatric Genomics Consortium.


One area in particular, when graphed, showed the strongest association. It was dubbed the “Manhattan plot” for its resemblance to New York City’s towering buildings. The highest peak was on chromosome 6, where McCarroll’s team discovered the gene variant. C4 was “a dark corner of the human genome,” he said, an area difficult to decipher because of its “astonishing level” of diversity.
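For readers unfamiliar with the term, a Manhattan plot simply draws every tested variant’s -log10 p-value in genome order, so strongly associated regions stand out as tall peaks. The sketch below is a generic, simulated example (not the consortium’s data or code) with an artificial peak planted on chromosome 6 to echo the result described above.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)

# Simulated GWAS summary statistics: 2,000 variants per chromosome, 22 autosomes.
chroms = np.repeat(np.arange(1, 23), 2_000)
pvals = rng.uniform(size=chroms.size)

# Plant an artificial association peak on chromosome 6 for illustration.
peak = np.flatnonzero(chroms == 6)[:50]
pvals[peak] = rng.uniform(1e-12, 1e-8, size=peak.size)

# Plot -log10(p) in genome order; tall "buildings" mark associated regions.
x = np.arange(chroms.size)
plt.figure(figsize=(10, 3))
plt.scatter(x, -np.log10(pvals), s=2, c=chroms % 2, cmap="bwr")
plt.axhline(-np.log10(5e-8), linestyle="--", color="grey")  # genome-wide significance
plt.xlabel("variants in chromosome order")
plt.ylabel("-log10(p)")
plt.title("Simulated Manhattan plot with a planted chromosome 6 peak")
plt.tight_layout()
plt.savefig("manhattan_sketch.png")
```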

C4 and numerous other genes reside in a region of chromosome 6 involved in the immune system, which clears out pathogens and similar cellular debris from the brain. The study’s researchers found that one of C4’s variants, C4A, was most associated with a risk for schizophrenia.

More than 25 million people around the globe are affected by schizophrenia, according to the World Health Organization, including 2 million to 3 million Americans. Highly hereditable, it is one of the most severe mental illnesses, with an annual economic burden in this country of tens of billions of dollars.

“This paper is really exciting,” said Jacqueline Feldman, associate medical director of the National Alliance on Mental Illness. “We as scientists and physicians have to temper our enthusiasm because we’ve gone down this path before. But this is profoundly interesting.”

There have been hundreds of theories about schizophrenia over the years, but one of the enduring mysteries has been how three prominent findings related to each other: the apparent involvement of immune molecules, the disorder’s typical onset in late adolescence and early adulthood, and the thinning of gray matter seen in autopsies of patients.


“The thing about this result,” said McCarroll, the lead author, “it makes a lot of other things understandable. To have a result to connect to these observations and to have a molecule and strong level of genetic evidence from tens of thousands of research participants, I think that combination sets [this study] apart.”

The authors stressed that their findings, which combine basic science with large-scale analysis of genetic studies, depended on an unusual level of cooperation among experts in genetics, molecular biology, developmental neurobiology and immunology.

“This could not have been done five years ago,” said Hyman. “This required the ability to reference a very large dataset. … When I was [NIMH] director, people really resisted collaborating. They were still in the Pharaoh era. They wanted to be buried with their data.”

The study offers a new approach to schizophrenia research, which has been largely stagnant for decades.  Most psychiatric drugs seek to interrupt psychotic thinking, but experts agree that psychosis is just a single symptom — and a late-occurring one at that. One of the chief difficulties for psychiatric researchers, setting them apart from most other medical investigators, is that they can’t cut schizophrenia out of the brain and look at it under a microscope. Nor are there any good animal models.

All that now has changed, according to Stevens. “We now have a strong molecular handle, a pathway and a gene, to develop better models,” she said.

Which isn’t to say a cure is right around the corner.

“This is the first exciting clue, maybe even the most important we’ll ever have, but it will be decades” before a true cure is found, Hyman said. “Hope is a wonderful thing. False promise is not.”

Insight Pharma Report

This report focuses heavily on three neurodegenerative disorders: Alzheimer’s Disease/Mild Cognitive Impairment, Parkinson’s Disease, and Amyotrophic Lateral Sclerosis. Part II of the report covers all three of these disorders, detailing their background, history, and development. Deeper into the chapters, the report examines biomarkers under investigation, genetic targets, and an analysis of multiple studies investigating these elements.

Experts interviewed in these chapters include:

  • Dr. Jens Wendland, Head of Neuroscience Genetics, Precision Medicine, Clinical Research, Pfizer Worldwide R&D
  • Dr. Howard J. Federoff, Executive Vice President for Health Sciences, Georgetown University
  • Dr. Andrew West, Associate Professor of Neurology and Neurobiology and Co-Director, Center for Neurodegeneration and Experimental Therapeutics
  • Dr. Merit Ester Cudkowicz, Chief of Neurology at Massachusetts General Hospital

Part III of the report makes a shift from neurobiomarkers to neurodiagnostics. This section highlights several diagnostics in play and in the making from a number of companies, identifying company strategies, research underway, hypotheses, and institution goals. Elite researchers and companies highlighted in this part include:

  • Dr. Xuemei Huang, Professor and Vice Chair, Department of Neurology; Professor of Neurosurgery, Radiology, Pharmacology, and Kinesiology; Director, Hershey Brain Analysis Research Laboratory for Neurodegenerative Disorders, Penn State University-Milton S. Hershey Medical Center Department of Neurology
  • Dr. Andreas Jeromin, CSO and President of Atlantic Biomarkers
  • Julien Bradley, Senior Director, Sales & Marketing, Quanterix
  • Dr. Scott Marshall, Head of Bioanalytics, and Dr. Jared Kohler, Head of Biomarker Statistics, BioStat Solutions, Inc.

Further analysis appears in Part IV. This section includes a survey conducted exclusively for this report. With over 30 figures and graphics and an in-depth analysis, this part offers insight into targets under investigation, challenges, advantages, and desired features of future diagnostic applications. Furthermore, the survey covers more than just the neurodegenerative disorders featured in this report, expanding to Multiple Sclerosis and Huntington’s Disease.

Finally, Insight Pharma Reports concludes this report with clinical trial and pipeline data featuring targets and products from over 300 companies working in Alzheimer’s Disease, Parkinson’s Disease and Amyotrophic Lateral Sclerosis.

 

Epigenome Tapped to Understand Rise of Subtype of Brain Medulloblastoma

http://www.genengnews.com/gen-news-highlights/epigenome-tapped-to-understand-rise-of-subtype-of-brain-medulloblastoma/81252294/

Scientists have identified the cells that likely give rise to the brain tumor subtype Group 4 medulloblastoma. [V. Yakobchuk/ Fotolia]

http://www.genengnews.com/Media/images/GENHighlight/thumb_Jan_28_2016_Fotolia_6761569_ColorfulBrain_4412824411.jpg

An international team of scientists say they have identified the cells that likely give rise to the brain tumor subtype Group 4 medulloblastoma. They believe their study (“Active medulloblastoma enhancers reveal subgroup-specific cellular origins”), published in Nature, removes a barrier to developing more effective targeted therapies against the brain tumor’s most common subtype.

Medulloblastoma occurs in infants, children, and adults, but it is the most common malignant pediatric brain tumor. The disease includes four biologically and clinically distinct subtypes, of which Group 4 is the most common. In children, about half of medulloblastoma patients are of the Group 4 subtype. Efforts to improve patient outcomes, particularly for those with high-risk Group 4 medulloblastoma, have been hampered by the lack of accurate animal models.

Evidence from this study suggests Group 4 tumors begin in neural stem cells that are born in a region of the developing cerebellum called the upper rhombic lip (uRL), according to the researchers.

“Pinpointing the cell(s) of origin for Group 4 medulloblastoma will help us to better understand normal cerebellar development and dramatically improve our chances of developing genetically faithful preclinical mouse models. These models are desperately needed for learning more about Group 4 medulloblastoma biology and evaluating rational, molecularly targeted therapies to improve patient outcomes,” said Paul Northcott, Ph.D., an assistant member of the St. Jude department of developmental neurobiology. Dr. Northcott, Stefan Pfister, M.D., of the German Cancer Research Center (DKFZ), and James Bradner, M.D., of Dana-Farber Cancer Institute, are the corresponding authors.

The discovery and other findings about the missteps fueling tumor growth came from studying the epigenome. Researchers used the analytic technique ChIP-seq to identify and track medulloblastoma subtype differences based on the activity of epigenetic regulators, which included proteins known as master regulator transcription factors. These factors bind to DNA enhancers and super-enhancers. The master regulator transcription factors and super-enhancers work together to regulate the expression of critical genes, such as those responsible for cell identity.

Those and other tools helped investigators identify more than 3,000 super-enhancers in 28 medulloblastoma tumors as well as evidence that the activity of super-enhancers varied by subtype. The super-enhancers switched on known cancer genes, including genes like ALK, MYC, SMO, and OTX2 that are associated with medulloblastoma, the researchers reported.
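Super-enhancer calls like these are typically made by ranking all enhancers by their ChIP-seq signal and flagging the few that sit past the sharp bend at the top of the curve, in the spirit of the widely used ROSE approach. The sketch below is a simplified, generic version of that ranking heuristic on simulated signal values; it is not the study’s pipeline.

```python
import numpy as np

def call_super_enhancers(signal: np.ndarray) -> np.ndarray:
    """Rank enhancers by ChIP-seq signal and flag the top of the curve.

    Simplified ranking heuristic: scale rank and signal to [0, 1], then call
    everything beyond the point where the curve's slope exceeds 1 (the
    "elbow") a super-enhancer.
    """
    order = np.argsort(signal)                 # ascending by signal
    y = signal[order] / signal.max()           # scaled signal
    x = np.arange(signal.size) / (signal.size - 1)
    slope = np.gradient(y, x)                  # numerical slope of the curve
    cutoff_rank = np.argmax(slope > 1.0)       # first rank where slope > 1
    is_super = np.zeros(signal.size, dtype=bool)
    is_super[order[cutoff_rank:]] = True
    return is_super

# Simulated H3K27ac-like signal: many modest enhancers plus a heavy tail.
rng = np.random.default_rng(3)
signal = rng.lognormal(mean=1.0, sigma=1.2, size=20_000)

flags = call_super_enhancers(signal)
print(f"{flags.sum()} of {flags.size} enhancers called as super-enhancers")
```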

Knowledge of the subtype super-enhancers led to identification of the transcription factors that regulate their activity. Using computational methods, researchers applied that information to reconstruct the transcription factor networks responsible for medulloblastoma subtype diversity and identity, providing previously unknown insights into the regulatory landscape and transcriptional output of the different medulloblastoma subtypes.

The approach helped to discover and nominate Lmx1A as a master regulator transcription factor of Group 4 tumors, which led to the identification of the likely Group 4 tumor cells of origin. Lmx1A was known to play an important role in normal development of cells in the uRL and cerebellum. Additional studies performed in mice with and without Lmx1A in this study supported uRL cells as the likely source of Group 4 tumors.

“By studying the epigenome, we also identified new pathways and molecular dependencies not apparent in previous gene expression and mutational studies,” explained Dr. Northcott. “The findings open new therapeutic avenues, particularly for the Group 3 and 4 subtypes where patient outcomes are inferior for the majority of affected children.”

For example, researchers identified increased enhancer activity targeting the TGFbeta pathway. The finding adds to evidence that the pathway may drive Group 3 medulloblastoma, currently the subtype with the worst prognosis. The pathway regulates cell growth, cell death, and other functions that are often disrupted in cancer, but its role in medulloblastoma is poorly understood.

The analysis included samples from 28 medulloblastoma tumors representing the four subtypes. Researchers believe it is the largest epigenetic study yet for any single cancer type and, importantly, the first to use a large cohort of primary patient tumor tissues instead of cell lines grown in the laboratory. Previous studies have suggested that cell lines may be of limited use for studying the tumor epigenome. The three Group 3 medulloblastoma cell lines used in this study reinforced the observation, highlighting significant differences in epigenetic regulators at work in medulloblastoma cell lines versus tumor samples.

 

 

Read Full Post »
