
Archive for the ‘Cell Level’ Category


Topical Solution for Combination Oncology Drug Therapy: Patch that delivers Drug, Gene, and Light-based Therapy to Tumor

Reporter: Aviva Lev-Ari, PhD, RN

 

Self-assembled RNA-triple-helix hydrogel scaffold for microRNA modulation in the tumour microenvironment

Affiliations

  1. Massachusetts Institute of Technology, Institute for Medical Engineering and Science, Harvard-MIT Division for Health Sciences and Technology, Cambridge, Massachusetts 02139, USA
    • João Conde,
    • Nuria Oliva,
    • Mariana Atilano,
    • Hyun Seok Song &
    • Natalie Artzi
  2. School of Engineering and Materials Science, Queen Mary University of London, London E1 4NS, UK
    • João Conde
  3. Grup d'Enginyeria de Materials, Institut Químic de Sarrià-Universitat Ramon Llull, Barcelona 08017, Spain
    • Mariana Atilano
  4. Division of Bioconvergence Analysis, Korea Basic Science Institute, Yuseong, Daejeon 169-148, Republic of Korea
    • Hyun Seok Song
  5. Broad Institute of MIT and Harvard, Cambridge, Massachusetts 02142, USA
    • Natalie Artzi
  6. Department of Medicine, Biomedical Engineering Division, Brigham and Women's Hospital, Harvard Medical School, Boston, Massachusetts 02115, USA
    • Natalie Artzi

Contributions

J.C. and N.A. conceived the project and designed the experiments. J.C., N.O., H.S.S. and M.A. performed the experiments, collected and analysed the data. J.C. and N.A. co-wrote the manuscript. All authors discussed the results and reviewed the manuscript.

Nature Materials 15, 353–363 (2016). doi:10.1038/nmat4497
Received 22 April 2015; Accepted 26 October 2015; Published online 07 December 2015

The therapeutic potential of miRNA (miR) in cancer is limited by the lack of efficient delivery vehicles. Here, we show that a self-assembled dual-colour RNA-triple-helix structure comprising two miRNAs—a miR mimic (tumour suppressor miRNA) and an antagomiR (oncomiR inhibitor)—provides outstanding capability to synergistically abrogate tumours. Conjugation of RNA triple helices to dendrimers allows the formation of stable triplex nanoparticles, which form an RNA-triple-helix adhesive scaffold upon interaction with dextran aldehyde, the latter able to chemically interact and adhere to natural tissue amines in the tumour. We also show that the self-assembled RNA-triple-helix conjugates remain functional in vitro and in vivo, and that they lead to nearly 90% levels of tumour shrinkage two weeks post-gel implantation in a triple-negative breast cancer mouse model. Our findings suggest that the RNA-triple-helix hydrogels can be used as an efficient anticancer platform to locally modulate the expression of endogenous miRs in cancer.

SOURCE

http://www.nature.com/nmat/journal/v15/n3/abs/nmat4497.html#author-information

 

 

Patch that delivers drug, gene, and light-based therapy to tumor sites shows promising results

In mice, device destroyed colorectal tumors and prevented remission after surgery.

Helen Knight | MIT News Office
July 25, 2016

Approximately one in 20 people will develop colorectal cancer in their lifetime, making it the third-most prevalent form of the disease in the U.S. In Europe, it is the second-most common form of cancer.

The most widely used first line of treatment is surgery, but this can result in incomplete removal of the tumor. Cancer cells can be left behind, potentially leading to recurrence and increased risk of metastasis. Indeed, while many patients remain cancer-free for months or even years after surgery, tumors are known to recur in up to 50 percent of cases.

Conventional therapies used to prevent tumors from recurring after surgery do not sufficiently differentiate between healthy and cancerous cells, leading to serious side effects.

In a paper published today in the journal Nature Materials, researchers at MIT describe an adhesive patch that can stick to the tumor site, either before or after surgery, to deliver a triple-combination of drug, gene, and photo (light-based) therapy.

Releasing this triple combination therapy locally, at the tumor site, may increase the efficacy of the treatment, according to Natalie Artzi, a principal research scientist at MIT’s Institute for Medical Engineering and Science (IMES) and an assistant professor of medicine at Brigham and Women’s Hospital, who led the research.

The general approach to cancer treatment today is the use of systemic, or whole-body, therapies such as chemotherapy drugs. But the lack of specificity of anticancer drugs means they produce undesired side effects when systemically administered.

What’s more, only a small portion of the drug reaches the tumor site itself, meaning the primary tumor is not treated as effectively as it should be.

Indeed, recent research in mice has found that only 0.7 percent of nanoparticles administered systemically actually found their way to the target tumor.

“This means that we are treating both the source of the cancer — the tumor — and the metastases resulting from that source, in a suboptimal manner,” Artzi says. “That is what prompted us to think a little bit differently, to look at how we can leverage advancements in materials science, and in particular nanotechnology, to treat the primary tumor in a local and sustained manner.”

The researchers have developed a triple-therapy hydrogel patch, which can be used to treat tumors locally. This is particularly effective as it can treat not only the tumor itself but any cells left at the site after surgery, preventing the cancer from recurring or metastasizing in the future.

Firstly, the patch contains gold nanorods, which heat up when near-infrared radiation is applied to the local area. This is used to thermally ablate, or destroy, the tumor.

These nanorods are also equipped with a chemotherapy drug, which is released when they are heated, to target the tumor and its surrounding cells.

Finally, gold nanospheres that do not heat up in response to the near-infrared radiation are used to deliver RNA, or gene therapy, to the site, in order to silence an important oncogene in colorectal cancer. Oncogenes are genes that can cause healthy cells to transform into tumor cells.

The researchers envision that a clinician could remove the tumor, and then apply the patch to the inner surface of the colon, to ensure that no cells that are likely to cause cancer recurrence remain at the site. As the patch degrades, it will gradually release the various therapies.

The patch can also serve as a neoadjuvant, a therapy designed to shrink tumors prior to their resection, Artzi says.

When the researchers tested the treatment in mice, they found that in 40 percent of cases where the patch was not applied after tumor removal, the cancer returned.

But when the patch was applied after surgery, the treatment resulted in complete remission.

Indeed, even when the tumor was not removed, the triple-combination therapy alone was enough to destroy it.

The technology is an extraordinary and unprecedented synergy of three concurrent modalities of treatment, according to Mauro Ferrari, president and CEO of the Houston Methodist Research Institute, who was not involved in the research.

“What is particularly intriguing is that by delivering the treatment locally, multimodal therapy may be better than systemic therapy, at least in certain clinical situations,” Ferrari says.

Unlike existing colorectal cancer surgery, this treatment can also be applied in a minimally invasive manner. In the next phase of their work, the researchers hope to move to experiments in larger models, in order to use colonoscopy equipment not only for cancer diagnosis but also to inject the patch to the site of a tumor, when detected.

“This administration modality would enable, at least in early-stage cancer patients, the avoidance of open field surgery and colon resection,” Artzi says. “Local application of the triple therapy could thus improve patients’ quality of life and therapeutic outcome.”

Artzi is joined on the paper by João Conde, Nuria Oliva, and Yi Zhang, of IMES. Conde is also at Queen Mary University of London.

SOURCE

http://news.mit.edu/2016/patch-delivers-drug-gene-light-based-therapy-tumor-0725

Other related articles published in this Open Access Online Scientific Journal include the following:

The Development of siRNA-Based Therapies for Cancer

Author: Ziv Raviv, PhD

https://pharmaceuticalintelligence.com/2013/05/09/the-development-of-sirna-based-therapies-for-cancer/

 

Targeted Liposome Based Delivery System to Present HLA Class I Antigens to Tumor Cells: Two papers

Reporter: Stephen J. Williams, Ph.D.

https://pharmaceuticalintelligence.com/2016/07/20/targeted-liposome-based-delivery-system-to-present-hla-class-i-antigens-to-tumor-cells-two-papers/

 

Blast Crisis in Myeloid Leukemia and the Activation of a microRNA-editing Enzyme called ADAR1

Curator: Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2016/06/10/blast-crisis-in-myeloid-leukemia-and-the-activation-of-a-microrna-editing-enzyme-called-adar1/

 

First challenge to make use of the new NCI Cloud Pilots – Somatic Mutation Challenge – RNA: Best algorithms for detecting all of the abnormal RNA molecules in a cancer cell

Reporter: Aviva Lev-Ari, PhD, RN

https://pharmaceuticalintelligence.com/2016/07/17/first-challenge-to-make-use-of-the-new-nci-cloud-pilots-somatic-mutation-challenge-rna-best-algorithms-for-detecting-all-of-the-abnormal-rna-molecules-in-a-cancer-cell/

 

miRNA Therapeutic Promise

Curator: Larry H. Bernstein, MD, FCAP

https://pharmaceuticalintelligence.com/2016/05/01/mirna-therapeutic-promise/




Imaging of Cancer Cells

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Microscope uses nanosecond-speed laser and deep learning to detect cancer cells more efficiently

April 13, 2016

Scientists at the California NanoSystems Institute at UCLA have developed a new technique for identifying cancer cells in blood samples faster and more accurately than the current standard methods.

In one common approach to testing for cancer, doctors add biochemicals to blood samples. Those biochemicals attach biological “labels” to the cancer cells, and those labels enable instruments to detect and identify them. However, the biochemicals can damage the cells and render the samples unusable for future analyses. There are other current techniques that don’t use labeling but can be inaccurate because they identify cancer cells based only on one physical characteristic.

Time-stretch quantitative phase imaging (TS-QPI) and analytics system

The new technique images cells without destroying them and can identify 16 physical characteristics — including size, granularity and biomass — instead of just one.

The new technique combines two components that were invented at UCLA:

A “photonic time stretch” microscope, which is capable of quickly imaging cells in blood samples. Invented by Bahram Jalali, professor and Northrop-Grumman Optoelectronics Chair in electrical engineering, it works by taking pictures of flowing blood cells using laser bursts (similar to how a camera uses a flash). Each flash only lasts nanoseconds (billionths of a second) to avoid damage to cells, but that normally means the images are both too weak to be detected and too fast to be digitized by normal instrumentation. The new microscope overcomes those challenges by using specially designed optics that amplify and boost the clarity of the images, and simultaneously slow them down enough to be detected and digitized at a rate of 36 million images per second.

A deep learning computer program, which identifies cancer cells with more than 95 percent accuracy. Deep learning is a form of artificial intelligence that uses complex algorithms to extract patterns and knowledge from rich multidimensional datasets, with the goal of achieving accurate decision making.
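To make the classification step concrete, here is a minimal, self-contained sketch (not the authors' code) of training a small fully connected network on 16 biophysical features to separate two cell classes. The data, labels, and network size below are synthetic assumptions for illustration only.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_cells, n_features = 2000, 16                     # 16 features: size, granularity, biomass, ...
X = rng.normal(size=(n_cells, n_features))
y = (X[:, :4].sum(axis=1) + 0.5 * rng.normal(size=n_cells) > 0).astype(int)  # synthetic labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))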

The study was published in the open-access journal Scientific Reports. The researchers write in the paper that the system could lead to data-driven diagnoses by cells’ physical characteristics, which could allow quicker and earlier diagnoses of cancer, for example, and better understanding of the tumor-specific gene expression in cells, which could facilitate new treatments for disease.

The research was supported by NantWorks, LLC.

 

Abstract of Deep Learning in Label-free Cell Classification

Label-free cell analysis is essential to personalized genomics, cancer diagnostics, and drug development as it avoids adverse effects of staining reagents on cellular viability and cell signaling. However, currently available label-free cell assays mostly rely only on a single feature and lack sufficient differentiation. Also, the sample size analyzed by these assays is limited due to their low throughput. Here, we integrate feature extraction and deep learning with high-throughput quantitative imaging enabled by photonic time stretch, achieving record high accuracy in label-free cell classification. Our system captures quantitative optical phase and intensity images and extracts multiple biophysical features of individual cells. These biophysical measurements form a hyperdimensional feature space in which supervised learning is performed for cell classification. We compare various learning algorithms including artificial neural network, support vector machine, logistic regression, and a novel deep learning pipeline, which adopts global optimization of receiver operating characteristics. As a validation of the enhanced sensitivity and specificity of our system, we show classification of white blood T-cells against colon cancer cells, as well as lipid accumulating algal strains for biofuel production. This system opens up a new path to data-driven phenotypic diagnosis and better understanding of the heterogeneous gene expressions in cells.

References:

Claire Lifan Chen, Ata Mahjoubfar, Li-Chia Tai, Ian K. Blaby, Allen Huang, Kayvan Reza Niazi & Bahram Jalali. Deep Learning in Label-free Cell Classification. Scientific Reports 6, Article number: 21471 (2016); doi:10.1038/srep21471 (open access)

Supplementary Information

 

Deep Learning in Label-free Cell Classification

Claire Lifan Chen, Ata Mahjoubfar, Li-Chia Tai, Ian K. Blaby, Allen Huang,Kayvan Reza Niazi & Bahram Jalali

Scientific Reports 6, Article number: 21471 (2016). http://dx.doi.org/10.1038/srep21471

Deep learning extracts patterns and knowledge from rich multidimensional datasets. While it is extensively used for image recognition and speech processing, its application to label-free classification of cells has not been exploited. Flow cytometry is a powerful tool for large-scale cell analysis due to its ability to measure anisotropic elastic light scattering of millions of individual cells as well as emission of fluorescent labels conjugated to cells1,2. However, each cell is represented with single values per detection channel (forward scatter, side scatter, and emission bands) and often requires labeling with specific biomarkers for acceptable classification accuracy1,3. Imaging flow cytometry4,5, on the other hand, captures images of cells, revealing significantly more information about the cells. For example, it can distinguish clusters and debris that would otherwise result in false positive identification in a conventional flow cytometer based on light scattering6.

In addition to classification accuracy, the throughput is another critical specification of a flow cytometer. Indeed high throughput, typically 100,000 cells per second, is needed to screen a large enough cell population to find rare abnormal cells that are indicative of early stage diseases. However there is a fundamental trade-off between throughput and accuracy in any measurement system7,8. For example, imaging flow cytometers face a throughput limit imposed by the speed of the CCD or the CMOS cameras, a number that is approximately 2000 cells/s for present systems9. Higher flow rates lead to blurred cell images due to the finite camera shutter speed. Many applications of flow analyzers such as cancer diagnostics, drug discovery, biofuel development, and emulsion characterization require classification of large sample sizes with a high-degree of statistical accuracy10. This has fueled research into alternative optical diagnostic techniques for characterization of cells and particles in flow.

Recently, our group has developed a label-free imaging flow-cytometry technique based on coherent optical implementation of the photonic time stretch concept11. This instrument overcomes the trade-off between sensitivity and speed by using Amplified Time-stretch Dispersive Fourier Transform12,13,14,15. In time stretched imaging16, the object’s spatial information is encoded in the spectrum of laser pulses within a pulse duration of sub-nanoseconds (Fig. 1). Each pulse representing one frame of the camera is then stretched in time so that it can be digitized in real-time by an electronic analog-to-digital converter (ADC). The ultra-fast pulse illumination freezes the motion of high-speed cells or particles in flow to achieve blur-free imaging. Detection sensitivity is challenged by the low number of photons collected during the ultra-short shutter time (optical pulse width) and the drop in the peak optical power resulting from the time stretch. These issues are solved in time stretch imaging by implementing a low noise-figure Raman amplifier within the dispersive device that performs time stretching8,11,16. Moreover, warped stretch transform17,18 can be used in time stretch imaging to achieve optical image compression and nonuniform spatial resolution over the field-of-view19. In the coherent version of the instrument, the time stretch imaging is combined with spectral interferometry to measure quantitative phase and intensity images in real-time and at high throughput20. Integrated with a microfluidic channel, the coherent time stretch imaging system in this work measures both quantitative optical phase shift and loss of individual cells as a high-speed imaging flow cytometer, capturing 36 million images per second in flow rates as high as 10 meters per second, reaching up to 100,000 cells per second throughput.
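A quick back-of-the-envelope check of the numbers quoted above (36 million line scans per second, 10 m/s flow, 100,000 cells per second) shows why the images stay blur-free; the values in this short calculation are taken from the paragraph, not from additional sources.

line_rate = 36e6          # line scans per second
flow_speed = 10.0         # metres per second
cell_rate = 100_000       # cells per second

spacing_um = flow_speed / line_rate * 1e6   # distance a cell travels between consecutive line scans
lines_per_cell = line_rate / cell_rate      # average number of line scans available per cell
print(f"{spacing_um:.2f} um between line scans, ~{lines_per_cell:.0f} line scans per cell")
# roughly 0.28 um sampling along the flow direction and ~360 line scans per cell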

Figure 1: Time stretch quantitative phase imaging (TS-QPI) and analytics system; A mode-locked laser followed by a nonlinear fiber, an erbium doped fiber amplifier (EDFA), and a wavelength-division multiplexing (WDM) filter generate and shape a train of broadband optical pulses. http://www.nature.com/article-assets/npg/srep/2016/160315/srep21471/images_hires/m685/srep21471-f1.jpg

 

Box 1: The pulse train is spatially dispersed into a train of rainbow flashes illuminating the target as line scans. The spatial features of the target are encoded into the spectrum of the broadband optical pulses, each representing a one-dimensional frame. The ultra-short optical pulse illumination freezes the motion of cells during high speed flow to achieve blur-free imaging with a throughput of 100,000 cells/s. The phase shift and intensity loss at each location within the field of view are embedded into the spectral interference patterns using a Michelson interferometer. Box 2: The interferogram pulses were then stretched in time so that spatial information could be mapped into time through time-stretch dispersive Fourier transform (TS-DFT), and then captured by a single pixel photodetector and an analog-to-digital converter (ADC). The loss of sensitivity at high shutter speed is compensated by stimulated Raman amplification during time stretch. Box 3: (a) Pulse synchronization; the time-domain signal carrying serially captured rainbow pulses is transformed into a series of one-dimensional spatial maps, which are used for forming line images. (b) The biomass density of a cell leads to a spatially varying optical phase shift. When a rainbow flash passes through the cells, the changes in refractive index at different locations will cause phase walk-off at interrogation wavelengths. Hilbert transformation and phase unwrapping are used to extract the spatial phase shift. (c) Decoding the phase shift in each pulse at each wavelength and remapping it into a pixel reveals the protein concentration distribution within cells. The optical loss induced by the cells, embedded in the pulse intensity variations, is obtained from the amplitude of the slowly varying envelope of the spectral interferograms. Thus, quantitative optical phase shift and intensity loss images are captured simultaneously. Both images are calibrated based on the regions where the cells are absent. Cell features describing morphology, granularity, biomass, etc are extracted from the images. (d) These biophysical features are used in a machine learning algorithm for high-accuracy label-free classification of the cells.
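The phase-recovery step in Box 3(b) can be sketched with standard signal-processing tools. The snippet below simulates a single interferogram with an assumed fringe frequency and a Gaussian phase bump standing in for a cell, then recovers the phase with a Hilbert transform and phase unwrapping; it illustrates the principle only and is not the instrument's actual processing chain.

import numpy as np
from scipy.signal import hilbert

t = np.linspace(0, 1e-9, 4096)                          # time axis within one stretched pulse
carrier = 2 * np.pi * 20e9 * t                          # interference fringes (assumed 20 GHz)
cell_phase = 1.5 * np.exp(-((t - 0.5e-9) / 1e-10) ** 2) # phase bump where a "cell" sits
interferogram = 1 + 0.8 * np.cos(carrier + cell_phase)

analytic = hilbert(interferogram - interferogram.mean())    # analytic signal of the AC component
unwrapped = np.unwrap(np.angle(analytic))                   # Hilbert transform + phase unwrapping
recovered = unwrapped - carrier                             # remove the linear fringe term
recovered -= np.median(recovered)                           # calibrate against cell-free regions
print(f"recovered peak phase shift: {recovered.max():.2f} rad (ground truth: 1.5 rad)")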

On another note, surface markers used to label cells, such as EpCAM21, are unavailable in some applications; for example, melanoma or pancreatic circulating tumor cells (CTCs) as well as some cancer stem cells are EpCAM-negative and will escape EpCAM-based detection platforms22. Furthermore, large-population cell sorting opens the doors to downstream operations, where the negative impacts of labels on cellular behavior and viability are often unacceptable23. Cell labels may cause activating/inhibitory signal transduction, altering the behavior of the desired cellular subtypes, potentially leading to errors in downstream analysis, such as DNA sequencing and subpopulation regrowth. In this way, quantitative phase imaging (QPI) methods24,25,26,27 that categorize unlabeled living cells with high accuracy are needed. Coherent time stretch imaging is a method that enables quantitative phase imaging at ultrahigh throughput for non-invasive label-free screening of large number of cells.

In this work, the information of quantitative optical loss and phase images are fused into expert designed features, leading to a record label-free classification accuracy when combined with deep learning. Image mining techniques are applied, for the first time, to time stretch quantitative phase imaging to measure biophysical attributes including protein concentration, optical loss, and morphological features of single cells at an ultrahigh flow rate and in a label-free fashion. These attributes differ widely28,29,30,31 among cells and their variations reflect important information of genotypes and physiological stimuli32. The multiplexed biophysical features thus lead to information-rich hyper-dimensional representation of the cells for label-free classification with high statistical precision.

We further improved the accuracy, repeatability, and the balance between sensitivity and specificity of our label-free cell classification by a novel machine learning pipeline, which harnesses the advantages of multivariate supervised learning, as well as unique training by evolutionary global optimization of receiver operating characteristics (ROC). To demonstrate sensitivity, specificity, and accuracy of multi-feature label-free flow cytometry using our technique, we classified (1) OT-II hybridoma T-lymphocytes and SW-480 colon cancer epithelial cells, and (2) Chlamydomonas reinhardtii algal cells (herein referred to as Chlamydomonas) based on their lipid content, which is related to the yield in biofuel production. Our preliminary results show that compared to classification by individual biophysical parameters, our label-free hyperdimensional technique improves the detection accuracy from 77.8% to 95.5%, or in other words, reduces the classification inaccuracy by about five times.     ……..
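The reported jump from 77.8% (best single feature) to 95.5% (fused features) can be illustrated with a toy comparison on synthetic data. The feature values, weights, and classifier choice below are assumptions for illustration, not the study's pipeline.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_cells = 3000
X = rng.normal(size=(n_cells, 16))                   # 16 biophysical features per cell
weights = rng.uniform(0.2, 1.0, size=16)             # each feature is only weakly informative
y = ((X * weights).sum(axis=1) + rng.normal(scale=2.0, size=n_cells) > 0).astype(int)

acc_single = cross_val_score(LogisticRegression(max_iter=1000), X[:, [0]], y, cv=5).mean()
acc_fused = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(f"single feature: {acc_single:.3f}   all 16 fused features: {acc_fused:.3f}")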

 

Feature Extraction

The decomposed components of sequential line scans form pairs of spatial maps, namely, optical phase and loss images as shown in Fig. 2 (see Section Methods: Image Reconstruction). These images are used to obtain biophysical fingerprints of the cells8,36. With domain expertise, raw images are fused and transformed into a suitable set of biophysical features, listed in Table 1, which the deep learning model further converts into learned features for improved classification.
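A simplified sketch of this "image fusion" step for a single cell is shown below, using synthetic phase and loss images. The feature names loosely follow Table 1, and the segmentation threshold and image shapes are assumptions, not values from the paper.

import numpy as np

# Synthetic stand-ins for one reconstructed cell; shapes and values are assumed.
yy, xx = np.mgrid[:128, :128]
r = np.hypot(yy - 64, xx - 64)
phase_img = 1.2 * np.exp(-(r / 20) ** 2)        # optical phase image (radians)
loss_img = 0.3 * np.exp(-(r / 25) ** 2)         # optical loss image (arbitrary units)

mask = phase_img > 0.2                          # crude segmentation of the cell (assumed threshold)
interior = (mask & np.roll(mask, 1, axis=0) & np.roll(mask, -1, axis=0)
                 & np.roll(mask, 1, axis=1) & np.roll(mask, -1, axis=1))
area = int(mask.sum())
perimeter = int((mask & ~interior).sum())       # boundary pixel count as a rough perimeter

features = {
    "tight_cell_area": area,
    "circularity": 4 * np.pi * area / perimeter ** 2,
    "mean_phase_shift": float(phase_img[mask].mean()),   # proxy for protein concentration
    "integrated_phase": float(phase_img[mask].sum()),    # proxy for dry biomass
    "mean_optical_loss": float(loss_img[mask].mean()),   # scattering/absorption proxy
}
print(features)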


 

Figure 2 (optical phase and loss images): http://www.nature.com/article-assets/npg/srep/2016/160315/srep21471/images_hires/m685/srep21471-f2.jpg

The optical loss images of the cells are affected by the attenuation of multiplexed wavelength components passing through the cells. The attenuation itself is governed by the absorption of the light in cells as well as the scattering from the surface of the cells and from the internal cell organelles. The optical loss image is derived from the low frequency component of the pulse interferograms. The optical phase image is extracted from the analytic form of the high frequency component of the pulse interferograms using Hilbert Transformation, followed by a phase unwrapping algorithm. Details of these derivations can be found in Section Methods. Also, supplementary Videos 1 and 2 show measurements of cell-induced optical path length difference by TS-QPI at four different points along the rainbow for OT-II and SW-480, respectively.

Table 1: List of extracted features (columns: Feature Name, Description, Category).

 

Figure 3: Biophysical features formed by image fusion.

(a) Pairwise correlation matrix visualized as a heat map. The map depicts the correlation between all 16 major features extracted from the quantitative images. Diagonal elements of the matrix represent correlation of each parameter with itself, i.e. the autocorrelation. The subsets in box 1, box 2, and box 3 show high correlation because they are mainly related to morphological, optical phase, and optical loss feature categories, respectively. (b) Ranking of biophysical features based on their AUCs in single-feature classification. Blue bars show performance of the morphological parameters, which include diameter along the interrogation rainbow, diameter along the flow direction, tight cell area, loose cell area, perimeter, circularity, major axis length, orientation, and median radius. As expected, morphology contains most information, but other biophysical features can contribute to improved performance of label-free cell classification. Orange bars show optical phase shift features, i.e. optical path length differences and refractive index difference. Green bars show optical loss features representing scattering and absorption by the cell. The best-performing feature in each of these three categories is marked in red.

Figure 4: Machine learning pipeline. Information of quantitative optical phase and loss images are fused to extract multivariate biophysical features of each cell, which are fed into a fully-connected neural network.

The neural network maps input features by a chain of weighted sum and nonlinear activation functions into learned feature space, convenient for classification. This deep neural network is globally trained via area under the curve (AUC) of the receiver operating characteristics (ROC). Each ROC curve corresponds to a set of weights for connections to an output node, generated by scanning the weight of the bias node. The training process maximizes AUC, pushing the ROC curve toward the upper left corner, which means improved sensitivity and specificity in classification.
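As a small numerical illustration of this training signal (synthetic scores, not the study's network): sweeping the decision threshold, which is what shifting the bias-node weight effectively does, traces out an ROC curve whose area under the curve is the quantity being maximized.

import numpy as np

rng = np.random.default_rng(2)
scores = np.concatenate([rng.normal(1.0, 1.0, 500),    # network outputs for positive cells
                         rng.normal(-1.0, 1.0, 500)])  # network outputs for negative cells
labels = np.concatenate([np.ones(500), np.zeros(500)])

thresholds = np.linspace(scores.min(), scores.max(), 200)
tpr = np.array([(scores[labels == 1] >= t).mean() for t in thresholds])
fpr = np.array([(scores[labels == 0] >= t).mean() for t in thresholds])
auc = np.trapz(tpr[::-1], fpr[::-1])                   # integrate with FPR increasing
print(f"AUC = {auc:.3f}")                              # larger AUC = ROC pushed toward the upper left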

….   How to cite this article: Chen, C. L. et al. Deep Learning in Label-free Cell Classification.

Sci. Rep. 6, 21471; http://dx.doi.org/10.1038/srep21471

 

Computer Algorithm Helps Characterize Cancerous Genomic Variations

http://www.genengnews.com/gen-news-highlights/computer-algorithm-helps-characterize-cancerous-genomic-variations/81252626/

To better characterize the functional context of genomic variations in cancer, researchers developed a new computer algorithm called REVEALER. [UC San Diego Health]

Scientists at the University of California San Diego School of Medicine and the Broad Institute say they have developed a new computer algorithm—REVEALER—to better characterize the functional context of genomic variations in cancer. The tool, described in a paper (“Characterizing Genomic Alterations in Cancer by Complementary Functional Associations”) published in Nature Biotechnology, is designed to help researchers identify groups of genetic variations that together associate with a particular way cancer cells get activated, or how they respond to certain treatments.

REVEALER is available for free to the global scientific community via the bioinformatics software portal GenePattern.org.

“This computational analysis method effectively uncovers the functional context of genomic alterations, such as gene mutations, amplifications, or deletions, that drive tumor formation,” said senior author Pablo Tamayo, Ph.D., professor and co-director of the UC San Diego Moores Cancer Center Genomics and Computational Biology Shared Resource.

Dr. Tamayo and team tested REVEALER using The Cancer Genome Atlas (TCGA), the NIH’s database of genomic information from more than 500 human tumors representing many cancer types. REVEALER revealed gene alterations associated with the activation of several cellular processes known to play a role in tumor development and response to certain drugs. Some of these gene mutations were already known, but others were new.

For example, the researchers discovered new activating genomic abnormalities for beta-catenin, a cancer-promoting protein, and for the oxidative stress response that some cancers hijack to increase their viability.

REVEALER requires as input high-quality genomic data and a significant number of cancer samples, which can be a challenge, according to Dr. Tamayo. But REVEALER is more sensitive at detecting similarities between different types of genomic features and less dependent on simplifying statistical assumptions, compared to other methods, he adds.

“This study demonstrates the potential of combining functional profiling of cells with the characterizations of cancer genomes via next-generation sequencing,” said co-senior author Jill P. Mesirov, Ph.D., professor and associate vice chancellor for computational health sciences at UC San Diego School of Medicine.

 

Characterizing genomic alterations in cancer by complementary functional associations

Jong Wook Kim, Olga B Botvinnik, Omar Abudayyeh, Chet Birger, et al.

Nature Biotechnology (2016). http://dx.doi.org/10.1038/nbt.3527

Systematic efforts to sequence the cancer genome have identified large numbers of mutations and copy number alterations in human cancers. However, elucidating the functional consequences of these variants, and their interactions to drive or maintain oncogenic states, remains a challenge in cancer research. We developed REVEALER, a computational method that identifies combinations of mutually exclusive genomic alterations correlated with functional phenotypes, such as the activation or gene dependency of oncogenic pathways or sensitivity to a drug treatment. We used REVEALER to uncover complementary genomic alterations associated with the transcriptional activation of β-catenin and NRF2, MEK-inhibitor sensitivity, and KRAS dependency. REVEALER successfully identified both known and new associations, demonstrating the power of combining functional profiles with extensive characterization of genomic alterations in cancer genomes
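The abstract describes searching for mutually exclusive (complementary) alterations whose combination tracks a functional phenotype. The toy greedy search below captures that idea on synthetic data; it uses plain Pearson correlation in place of REVEALER's information-coefficient statistic and is not the published implementation (which is available via GenePattern.org).

import numpy as np

rng = np.random.default_rng(3)
n_samples, n_alterations = 200, 50
alterations = (rng.random((n_alterations, n_samples)) < 0.08).astype(int)   # binary alteration matrix
# synthetic "pathway activation" driven by the union of the first three alterations
target = np.clip(alterations[:3].sum(axis=0), 0, 1) + 0.3 * rng.normal(size=n_samples)

def assoc(summary, target):
    # Pearson correlation as a simple stand-in for REVEALER's information coefficient
    return 0.0 if summary.std() == 0 else np.corrcoef(summary, target)[0, 1]

selected, summary = [], np.zeros(n_samples, dtype=int)
for _ in range(3):                                  # greedily add complementary alterations
    gains = [assoc(np.maximum(summary, alt), target) for alt in alterations]
    for s in selected:
        gains[s] = -np.inf                          # do not pick the same alteration twice
    best = int(np.argmax(gains))
    selected.append(best)
    summary = np.maximum(summary, alterations[best])
print("selected alteration indices:", selected)     # should recover most of 0, 1, 2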

 

Figure 2: REVEALER results for transcriptional activation of β-catenin in cancer.

(a) This heatmap illustrates the use of the REVEALER approach to find complementary genomic alterations that match the transcriptional activation of β-catenin in cancer. The target profile is a TCF4 reporter that provides an estimate of…

 

An imaging-based platform for high-content, quantitative evaluation of therapeutic response in 3D tumour models

Jonathan P. Celli, Imran Rizvi, Adam R. Blanden, Iqbal Massodi, Michael D. Glidden, Brian W. Pogue & Tayyaba Hasan

Scientific Reports 4, 3751 (2014). http://dx.doi.org/10.1038/srep03751

While it is increasingly recognized that three-dimensional (3D) cell culture models recapitulate drug responses of human cancers with more fidelity than monolayer cultures, a lack of quantitative analysis methods limit their implementation for reliable and routine assessment of emerging therapies. Here, we introduce an approach based on computational analysis of fluorescence image data to provide high-content readouts of dose-dependent cytotoxicity, growth inhibition, treatment-induced architectural changes and size-dependent response in 3D tumour models. We demonstrate this approach in adherent 3D ovarian and pancreatic multiwell extracellular matrix tumour overlays subjected to a panel of clinically relevant cytotoxic modalities and appropriately designed controls for reliable quantification of fluorescence signal. This streamlined methodology reads out the high density of information embedded in 3D culture systems, while maintaining a level of speed and efficiency traditionally achieved with global colorimetric reporters in order to facilitate broader implementation of 3D tumour models in therapeutic screening.

The attrition rates for preclinical development of oncology therapeutics are particularly dismal due to a complex set of factors which includes 1) the failure of pre-clinical models to recapitulate determinants of in vivo treatment response, and 2) the limited ability of available assays to extract treatment-specific data integral to the complexities of therapeutic responses1,2,3. Three-dimensional (3D) tumour models have been shown to restore crucial stromal interactions which are missing in the more commonly used 2D cell culture and that influence tumour organization and architecture4,5,6,7,8, as well as therapeutic response9,10, multicellular resistance (MCR)11,12, drug penetration13,14, hypoxia15,16, and anti-apoptotic signaling17. However, such sophisticated models can only have an impact on therapeutic guidance if they are accompanied by robust quantitative assays, not only for cell viability but also for providing mechanistic insights related to the outcomes. While numerous assays for drug discovery exist18, they are generally not developed for use in 3D systems and are often inherently unsuitable. For example, colorimetric conversion products have been noted to bind to extracellular matrix (ECM)19 and traditional colorimetric cytotoxicity assays reduce treatment response to a single number reflecting a biochemical event that has been equated to cell viability (e.g. tetrazolium salt conversion20). Such approaches fail to provide insight into the spatial patterns of response within colonies, morphological or structural effects of drug response, or how overall culture viability may be obscuring the status of sub-populations that are resistant or partially responsive. Hence, the full benefit of implementing 3D tumour models in therapeutic development has yet to be realized for lack of analytical methods that describe the very aspects of treatment outcome that these systems restore.

Motivated by these factors, we introduce a new platform for quantitative in situ treatment assessment (qVISTA) in 3D tumour models based on computational analysis of information-dense biological image datasets (bioimage-informatics)21,22. This methodology provides software end-users with multiple levels of complexity in output content, from rapidly-interpreted dose response relationships to higher content quantitative insights into treatment-dependent architectural changes, spatial patterns of cytotoxicity within fields of multicellular structures, and statistical analysis of nodule-by-nodule size-dependent viability. The approach introduced here is cognizant of tradeoffs between optical resolution, data sampling (statistics), depth of field, and widespread usability (instrumentation requirement). Specifically, it is optimized for interpretation of fluorescent signals for disease-specific 3D tumour micronodules that are sufficiently small that thousands can be imaged simultaneously with little or no optical bias from widefield integration of signal along the optical axis of each object. At the core of our methodology is the premise that the copious numerical readouts gleaned from segmentation and interpretation of fluorescence signals in these image datasets can be converted into usable information to classify treatment effects comprehensively, without sacrificing the throughput of traditional screening approaches. It is hoped that this comprehensive treatment-assessment methodology will have significant impact in facilitating more sophisticated implementation of 3D cell culture models in preclinical screening by providing a level of content and biological relevance impossible with existing assays in monolayer cell culture in order to focus therapeutic targets and strategies before costly and tedious testing in animal models.
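To illustrate the kind of per-nodule, image-based readout described here, the sketch below segments nodules in a synthetic two-channel fluorescence field (a live stain and a dead stain) and reports nodule-by-nodule viability. The channel definitions, thresholds, and nodule sizes are assumptions, not the published qVISTA parameters.

import numpy as np
from scipy import ndimage as ndi

rng = np.random.default_rng(4)
live = np.zeros((512, 512))
dead = np.zeros((512, 512))
yy, xx = np.ogrid[:512, :512]
for _ in range(40):                                  # scatter synthetic nodule projections
    cy, cx = rng.integers(30, 482, size=2)
    radius = rng.integers(5, 20)
    disk = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
    f_live = rng.uniform(0.2, 1.0)                   # fraction of viable signal in this nodule
    live[disk] += f_live
    dead[disk] += 1 - f_live

labels, n_nodules = ndi.label(live + dead > 0.1)     # segment nodules on total fluorescence
sizes, viability = [], []
for i in range(1, n_nodules + 1):
    m = labels == i
    sizes.append(int(m.sum()))
    viability.append(live[m].sum() / (live[m].sum() + dead[m].sum()))
print(f"{n_nodules} nodules; mean viability {np.mean(viability):.2f}; "
      f"largest nodule viability {viability[int(np.argmax(sizes))]:.2f}")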

Using two different cell lines and as depicted in Figure 1, we adopt an ECM overlay method pioneered originally for 3D breast cancer models23, and developed in previous studies by us to model micrometastatic ovarian cancer19,24. This system leads to the formation of adherent multicellular 3D acini in approximately the same focal plane atop a laminin-rich ECM bed, implemented here in glass-bottom multiwell imaging plates for automated microscopy. The 3D nodules resultant from restoration of ECM signaling5,8, are heterogeneous in size24, in contrast to other 3D spheroid methods, such as rotary or hanging drop cultures10, in which cells are driven to aggregate into uniformly sized spheroids due to lack of an appropriate substrate to adhere to. Although the latter processes are also biologically relevant, it is the adherent tumour populations characteristic of advanced metastatic disease that are more likely to be managed with medical oncology, which are the focus of therapeutic evaluation herein. The heterogeneity in 3D structures formed via ECM overlay is validated here by endoscopic imaging of in vivo tumours in orthotopic xenografts derived from the same cells (OVCAR-5).

 

Figure 1: A simplified schematic flow chart of imaging-based quantitative in situ treatment assessment (qVISTA) in 3D cell culture.

(This figure was prepared in Adobe Illustrator® software by MD Glidden, JP Celli and I Rizvi). A detailed breakdown of the image processing (Step 4) is provided in Supplemental Figure 1.

A critical component of the imaging-based strategy introduced here is the rational tradeoff of image-acquisition parameters for field of view, depth of field and optical resolution, and the development of image processing routines for appropriate removal of background, scaling of fluorescence signals from more than one channel and reliable segmentation of nodules. In order to obtain depth-resolved 3D structures for each nodule at sub-micron lateral resolution using a laser-scanning confocal system, it would require ~ 40 hours (at approximately 100 fields for each well with a 20× objective, times 1 minute/field for a coarse z-stack, times 24 wells) to image a single plate with the same coverage achieved in this study. Even if the resources were available to devote to such time-intensive image acquisition, not to mention the processing, the optical properties of the fluorophores would change during the required time frame for image acquisition, even with environmental controls to maintain culture viability during such extended imaging. The approach developed here, with a mind toward adaptation into high throughput screening, provides a rational balance of speed, requiring less than 30 minutes/plate, and statistical rigour, providing images of thousands of nodules in this time, as required for the high-content analysis developed in this study. These parameters can be further optimized for specific scenarios. For example, we obtain the same number of images in a 96 well plate as for a 24 well plate by acquiring only a single field from each well, rather than 4 stitched fields. This quadruples the number of conditions assayed in a single run, at the expense of the number of nodules per condition, and therefore the ability to obtain statistical data sets for size-dependent response, Dfrac and other segmentation-dependent numerical readouts.
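The imaging-time comparison in this paragraph is simple arithmetic; restated explicitly with the values given above:

fields_per_well = 100       # ~100 fields per well with a 20x objective
minutes_per_field = 1       # coarse confocal z-stack per field
wells = 24

confocal_hours = fields_per_well * minutes_per_field * wells / 60
print(f"laser-scanning confocal: ~{confocal_hours:.0f} hours per plate")   # ~40 hours
print("widefield qVISTA acquisition: < 0.5 hours per plate for the same coverage")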

 

We envision that the system for high-content interrogation of therapeutic response in 3D cell culture could have widespread impact in multiple arenas from basic research to large scale drug development campaigns. As such, the treatment assessment methodology presented here does not require extraordinary optical instrumentation or computational resources, making it widely accessible to any research laboratory with an inverted fluorescence microscope and modestly equipped personal computer. And although we have focused here on cancer models, the methodology is broadly applicable to quantitative evaluation of other tissue models in regenerative medicine and tissue engineering. While this analysis toolbox could have impact in facilitating the implementation of in vitro 3D models in preclinical treatment evaluation in smaller academic laboratories, it could also be adopted as part of the screening pipeline in large pharma settings. With the implementation of appropriate temperature controls to handle basement membranes in current robotic liquid handling systems, our analyses could be used in ultra high-throughput screening. In addition to removing non-efficacious potential candidate drugs earlier in the pipeline, this approach could also yield the additional economic advantage of minimizing the use of costly time-intensive animal models through better estimates of dose range, sequence and schedule for combination regimens.

 

Microscope Uses AI to Find Cancer Cells More Efficiently

Thu, 04/14/2016 – by Shaun Mason

http://www.mdtmag.com/news/2016/04/microscope-uses-ai-find-cancer-cells-more-efficiently

Scientists at the California NanoSystems Institute at UCLA have developed a new technique for identifying cancer cells in blood samples faster and more accurately than the current standard methods.

In one common approach to testing for cancer, doctors add biochemicals to blood samples. Those biochemicals attach biological “labels” to the cancer cells, and those labels enable instruments to detect and identify them. However, the biochemicals can damage the cells and render the samples unusable for future analyses.

There are other current techniques that don’t use labeling but can be inaccurate because they identify cancer cells based only on one physical characteristic.

The new technique images cells without destroying them and can identify 16 physical characteristics — including size, granularity and biomass — instead of just one. It combines two components that were invented at UCLA: a photonic time stretch microscope, which is capable of quickly imaging cells in blood samples, and a deep learning computer program that identifies cancer cells with over 95 percent accuracy.

Deep learning is a form of artificial intelligence that uses complex algorithms to extract meaning from data with the goal of achieving accurate decision making.

The study, which was published in the journal Scientific Reports, was led by Bahram Jalali, professor and Northrop-Grumman Optoelectronics Chair in electrical engineering; Claire Lifan Chen, a UCLA doctoral student; and Ata Mahjoubfar, a UCLA postdoctoral fellow.

Photonic time stretch was invented by Jalali, and he holds a patent for the technology. The new microscope is just one of many possible applications; it works by taking pictures of flowing blood cells using laser bursts in the way that a camera uses a flash. This process happens so quickly — in nanoseconds, or billionths of a second — that the images would be too weak to be detected and too fast to be digitized by normal instrumentation.

The new microscope overcomes those challenges using specially designed optics that boost the clarity of the images and simultaneously slow them enough to be detected and digitized at a rate of 36 million images per second. It then uses deep learning to distinguish cancer cells from healthy white blood cells.

“Each frame is slowed down in time and optically amplified so it can be digitized,” Mahjoubfar said. “This lets us perform fast cell imaging that the artificial intelligence component can distinguish.”

Normally, taking pictures in such minuscule periods of time would require intense illumination, which could destroy live cells. The UCLA approach also eliminates that problem.

“The photonic time stretch technique allows us to identify rogue cells in a short time with low-level illumination,” Chen said.

The researchers write in the paper that the system could lead to data-driven diagnoses by cells’ physical characteristics, which could allow quicker and earlier diagnoses of cancer, for example, and better understanding of the tumor-specific gene expression in cells, which could facilitate new treatments for disease.   …..  see also http://www.nature.com/article-assets/npg/srep/2016/160315/srep21471/images_hires/m685/srep21471-f1.jpg

Chen, C. L. et al. Deep Learning in Label-free Cell Classification. Sci. Rep. 6, 21471; http://dx.doi.org/10.1038/srep21471

 

 



Human Factor Engineering: New Regulations Impact Drug Delivery, Device Design And Human Interaction

Curator: Stephen J. Williams, Ph.D.

The Institute of Medicine report brought medical errors to the forefront of healthcare and the American public (Kohn, Corrigan, & Donaldson, 1999) and estimated that between 44,000 and 98,000 Americans die each year as a result of medical errors.

An obstetric nurse connects a bag of pain medication intended for an epidural catheter to the mother’s intravenous (IV) line, resulting in a fatal cardiac arrest. Newborns in a neonatal intensive care unit are given full-dose heparin instead of low-dose flushes, leading to three deaths from intracranial bleeding. An elderly man experiences cardiac arrest while hospitalized, but when the code blue team arrives, they are unable to administer a potentially life-saving shock because the defibrillator pads and the defibrillator itself cannot be physically connected.

Human factors engineering is the discipline that attempts to identify and address these issues. It is the discipline that takes into account human strengths and limitations in the design of interactive systems that involve people, tools and technology, and work environments to ensure safety, effectiveness, and ease of use.

 

FDA says drug delivery devices need human factors validation testing

Several drug delivery devices are on a draft list of med tech that will be subject to a final guidance calling for the application of human factors and usability engineering to medical devices. The guidance calls for validation testing of devices, with data collected through interviews, observation, knowledge testing, and, in some cases, usability testing of a device under actual conditions of use. The drug delivery devices on the list include anesthesia machines, autoinjectors, dialysis systems, infusion pumps (including implanted ones), hemodialysis systems, insulin pumps and negative pressure wound therapy devices intended for home use. Studies have consistently shown that patients struggle to properly use drug delivery devices such as autoinjectors, which are becoming increasingly prevalent due to the rise of self-administered injectable biologics. The trend toward home healthcare is another driver of usability issues on the patient side, while professionals sometimes struggle with unclear interfaces or instructions for use.

 

Human-factors engineering, also called ergonomics or human engineering, is the science dealing with the application of information on physical and psychological characteristics to the design of devices and systems for human use. (For more detail, see Britannica.com.)

The term human-factors engineering is used to designate equally a body of knowledge, a process, and a profession. As a body of knowledge, human-factors engineering is a collection of data and principles about human characteristics, capabilities, and limitations in relation to machines, jobs, and environments. As a process, it refers to the design of machines, machine systems, work methods, and environments to take into account the safety, comfort, and productiveness of human users and operators. As a profession, human-factors engineering includes a range of scientists and engineers from several disciplines that are concerned with individuals and small groups at work.

The terms human-factors engineering and human engineering are used interchangeably on the North American continent. In Europe, Japan, and most of the rest of the world the prevalent term is ergonomics, a word made up of the Greek words, ergon, meaning “work,” and nomos, meaning “law.” Despite minor differences in emphasis, the terms human-factors engineering and ergonomics may be considered synonymous. Human factors and human engineering were used in the 1920s and ’30s to refer to problems of human relations in industry, an older connotation that has gradually dropped out of use. Some small specialized groups prefer such labels as bioastronautics, biodynamics, bioengineering, and manned-systems technology; these represent special emphases whose differences are much smaller than the similarities in their aims and goals.

The data and principles of human-factors engineering are concerned with human performance, behaviour, and training in man-machine systems; the design and development of man-machine systems; and systems-related biological or medical research. Because of its broad scope, human-factors engineering draws upon parts of such social or physiological sciences as anatomy, anthropometry, applied physiology, environmental medicine, psychology, sociology, and toxicology, as well as parts of engineering, industrial design, and operations research.

Source: Britannica.com

The human-factors approach to design

Two general premises characterize the approach of the human-factors engineer in practical design work. The first is that the engineer must solve the problems of integrating humans into machine systems by rigorous scientific methods and not rely on logic, intuition, or common sense. In the past the typical engineer tended either to ignore the complex and unpredictable nature of human behaviour or to deal with it summarily with educated guesses. Human-factors engineers have tried to show that with appropriate techniques it is possible to identify man-machine mismatches and that it is usually possible to find workable solutions to these mismatches through the use of methods developed in the behavioral sciences.

The second important premise of the human-factors approach is that, typically, design decisions cannot be made without a great deal of trial and error. There are only a few thousand human-factors engineers out of the vast number of engineers in the world who are designing novel machines, machine systems, and environments much faster than behavioral scientists can accumulate data on how humans will respond to them. More problems, therefore, are created than there are ready answers for them, and the human-factors specialist is almost invariably forced to resort to trying things out with various degrees of rigour to find solutions. Thus, while human-factors engineering aims at substituting scientific method for guesswork, its specific techniques are usually empirical rather than theoretical.

[Image: general human factors engineering illustration]

The Man-Machine Model: Human-factors engineers regard humans as an element in systems

The simple man-machine model provides a convenient way for organizing some of the major concerns of human engineering: the selection and design of machine displays and controls; the layout and design of workplaces; design for maintainability; and the work environment.

Components of the Man-Machine Model

  1. The human operator first has to sense what is referred to as a machine display, a signal that tells him something about the condition or the functioning of the machine.
  2. Having sensed the display, the operator interprets it, perhaps performs some computation, and reaches a decision. In so doing, the worker may use a number of human abilities; psychologists commonly refer to these activities as higher mental functions, while human-factors engineers generally refer to them as information processing.
  3. Having reached a decision, the human operator normally takes some action. This action is usually exercised on some kind of a control—a pushbutton, lever, crank, pedal, switch, or handle.
  4. Action upon one or more of these controls exerts an influence on the machine and on its output, which in turn changes the display, so that the cycle is continuously repeated.

 

Driving an automobile is a familiar example of a simple man-machine system. In driving, the operator receives inputs from outside the vehicle (sounds and visual cues from traffic, obstructions, and signals) and from displays inside the vehicle (such as the speedometer, fuel indicator, and temperature gauge). The driver continually evaluates this information, decides on courses of action, and translates those decisions into actions upon the vehicle’s controls—principally the accelerator, steering wheel, and brake. Finally, the driver is influenced by such environmental factors as noise, fumes, and temperature.
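For readers who prefer a concrete model, the display-decision-control loop above can be written as a tiny simulation. This is an illustrative toy, not something from the source: a "driver" reads the speedometer (display), decides, and acts on the accelerator (control) at each step, and the vehicle responds.

def simulate_driver(target=25.0, steps=60, dt=0.5, gain=0.4, drag=0.05):
    """Proportional 'driver' holding a target speed; returns the speed history."""
    speed, history = 0.0, []
    for _ in range(steps):
        error = target - speed                    # operator senses the display and decides
        throttle = max(0.0, gain * error)         # action on the control (accelerator)
        speed += (throttle - drag * speed) * dt   # machine responds; the display updates
        history.append(speed)
    return history

print(f"speed after 30 s: {simulate_driver()[-1]:.1f} m/s (settles near, not exactly at, 25 m/s)")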

 

[Image: human factors considerations and outcomes]

How BD Uses Human Factors to Design Drug-Delivery Systems

Posted in Design Services by Jamie Hartford on August 30, 2013

 Human factors testing has been vital to the success of the company’s BD Physioject Disposable Autoinjector.

Improving the administration and compliance of drug delivery is a common lifecycle strategy employed to enhance short- and long-term product adoption in the biotechnology and pharmaceutical industries. With increased competition in the industry and heightened regulatory requirements for end-user safety, significant advances in product improvements have been achieved in the injectable market, for both healthcare professionals and patients. Injection devices that facilitate preparation, ease administration, and ensure safety are increasingly prevalent in the marketplace.

Traditionally, human factors engineering addresses individualized aspects of development for each self-injection device, including the following:

  • Task analysis and design.
  • Device evaluation and usability.
  • Patient acceptance, compliance, and concurrence.
  • Anticipated training and education requirements.
  • System resilience and failure.

To achieve this, human factors scientists and engineers study the disease, patient, and desired outcome across multiple domains, including cognitive and organizational psychology, industrial and systems engineering, human performance, and economic theory—including formative usability testing that starts with the exploratory stage of the device and continues through all stages of conceptual design. Validation testing performed with real users is conducted as the final stage of the process.

To design the BD Physioject Disposable Autoinjector System, BD conducted multiple human factors and clinical studies to assess all aspects of performance, safety, efficiency, patient acceptance, and ease of use, including pain perception compared with prefilled syringes [5]. The studies provided essential insights into the overall user-product interface and highlighted that patients had a strong and positive response to both the product design and the user experience.

As a result of human factors testing, the BD Physioject Disposable Autoinjector System provides multiple features designed to aid patient safety and ease of use, allowing the patient to control the start of the injection once the autoinjector is placed on the skin and the cap is removed. Specific design features of the BD Physioject Disposable Autoinjector System include the following:

  • Ergonomic design that is easy to handle and use, especially in patients with limited dexterity.
  • A 360° view of the drug and injection process, allowing the patient to confirm full dose delivery.
  • A simple, one-touch injection button for activation.
  • A hidden needle before and during injection to reduce needle-stick anxiety.
  • A protected needle before and after injection to reduce the risk of needle stick injury.

 

YouTube VIDEO: Integrating Human Factors Engineering (HFE) into Drug Delivery

 


 

 

The following is a SlideShare presentation on Parenteral Drug Delivery Issues in the Future

 The Dangers of Medical Devices

The FDA receives on average 100,000 medical device incident reports per year, and more than a third involve user error.

In an FDA recall study, 44% of medical device recalls were attributed to design problems, and user error is often linked to the poor design of a product.

Drug developers need to take safe drug dosage into consideration, which requires thorough processes for Risk Management and Human Factors Engineering (HFE).
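As a sketch of what such a risk-management process can look like in practice, the example below ranks a few invented use-error failure modes for a hypothetical autoinjector using a failure mode and effects analysis (FMEA) style risk priority number. FMEA is a widely used industry technique, but it is offered here only as an illustration; the failure modes and scores are made up and are not taken from the text above.

```python
# Hypothetical FMEA-style ranking of use-error failure modes for an autoinjector.
# Severity, occurrence, and detectability are each scored 1 (best) to 10 (worst);
# the risk priority number (RPN) is their product. All entries are invented.

failure_modes = [
    # (description,                                         severity, occurrence, detectability)
    ("Device lifted from skin before full dose delivered",         7,          5,             6),
    ("Needle-stick injury while handling used device",             8,          3,             4),
    ("Injection attempted with cap still in place (no dose)",      6,          6,             3),
    ("Device activated against the wrong body site",               5,          4,             5),
]

ranked = sorted(
    ((description, s * o * d) for description, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)

for description, rpn in ranked:
    print(f"RPN {rpn:4d}  {description}")

# Higher-RPN items are addressed first, ideally through changes to the device's
# user interface rather than through labeling or training alone.
```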

Although unintended, medical devices can sometimes harm patients or the people administering the healthcare. The potential harm arises from two main sources:

  1. failure of the device, and
  2. actions of the user, or user-related errors. A number of factors can lead to these user-induced errors, including the facts that medical devices are often used under stressful conditions and that users may think differently than the device designer.

Human Factors: Identifying the Root Causes of Use Errors

Instead of blaming test participants for use errors, look more carefully at your device’s design.

A great post on why typical design flaws creep into medical devices and where a company should integrate fixes in product design.
Posted in Design Services by Jamie Hartford on July 8, 2013

 

 

YouTube VIDEO: Integrating Human Factors Engineering into Medical Devices

 

 


 

 Regulatory Considerations

  • Unlike other medication dosage forms, combination products require user interaction.
  • Combination products are unique in that their safety profile and product efficacy depend on user interaction.

Human Factors Review: FDA Outlines Highest Priority Devices

Posted 02 February 2016 by Zachary Brennan on http://www.raps.org/Regulatory-Focus/News/2016/02/02/24233/Human-Factors-Review-FDA-Outlines-Highest-Priority-Devices/

The US Food and Drug Administration (FDA) on Tuesday released new draft guidance to inform medical device manufacturers which device types should have human factors data included in premarket submissions, as well as final guidance from 2011 on applying human factors and usability engineering to medical devices.

FDA said it believes these device types have “clear potential for serious harm resulting from use error and that review of human factors data in premarket submissions will help FDA evaluate the safety and effectiveness and substantial equivalence of these devices.”

Manufacturers should provide FDA with a report that summarizes the human factors or usability engineering processes they have followed, including any preliminary analyses and evaluations and human factors validation testing, results and conclusions, FDA says.

The list was based on knowledge obtained through Medical Device Reports (MDRs) and recall data, and includes:

  • Ablation generators (associated with ablation systems, e.g., LPB, OAD, OAE, OCM, OCL)
  • Anesthesia machines (e.g., BSZ)
  • Artificial pancreas systems (e.g., OZO, OZP, OZQ)
  • Auto injectors (when CDRH is lead Center; e.g., KZE, KZH, NSC)
  • Automated external defibrillators
  • Duodenoscopes with elevator channels (with respect to reprocessing; e.g., FDT)
  • Gastroenterology-urology endoscopic ultrasound systems with elevator channels (with respect to reprocessing; e.g., ODG)
  • Hemodialysis and peritoneal dialysis systems (e.g., FKP, FKT, FKX, KDI, KPF, ODX, ONW)
  • Implanted infusion pumps (e.g., LKK, MDY)
  • Infusion pumps (e.g., FRN, LZH, MEA, MRZ)
  • Insulin delivery systems (e.g., LZG, OPP)
  • Negative-pressure wound therapy (e.g., OKO, OMP) intended for home use
  • Robotic catheter manipulation systems (e.g., DXX)
  • Robotic surgery devices (e.g., NAY)
  • Ventilators (e.g., CBK, NOU, ONZ)
  • Ventricular assist devices (e.g., DSQ, PCK)

Final Guidance

In addition to the draft list, FDA finalized guidance from 2011 on applying human factors and usability engineering to medical devices.

The agency said it received over 600 comments on the draft guidance, which deals mostly with design and user interface, “which were generally supportive of the draft guidance document, but requested clarification in a number of areas. The most frequent types of comments requested revisions to the language or structure of the document, or clarification on risk mitigation and human factors testing methods, user populations for testing, training of test participants, determining the appropriate sample size in human factors testing, reporting of testing results in premarket submissions, and collecting human factors data as part of a clinical study.”

In response to these comments, FDA said it revised the guidance, which supersedes guidance from 2000 entitled “Medical Device Use-Safety: Incorporating Human Factors Engineering into Risk Management,” to clarify “the points identified and restructured the information for better readability and comprehension.”

Details

The goal of the guidance, according to FDA, is to ensure that the device user interface has been designed so that use errors that could cause harm or degrade medical treatment are either eliminated or reduced to the extent possible.

FDA said the most effective strategies to employ during device design to reduce or eliminate use-related hazards involve modifications to the device user interface, which should be logical and intuitive.

In its conclusion, FDA also outlined the ways that device manufacturers were able to save money through the use of human factors engineering (HFE) and usability engineering (UE).


 

Please see an FDA PowerPoint on Human Factors Regulatory Issues for Combination Drug/Device Products here: MFStory_RAPS 2011 – HF of ComboProds_v4

 

 

 

 

Read Full Post »


Reposted from GEN News Highlights

Nov 18, 2015

RNA-Based Drugs Turn CRISPR/Cas9 On and Off

  • This image depicts a conventional CRISPR-Cas9 system. The Cas9 enzyme acts like a wrench, and specific RNA guides act as different socket heads. Conventional CRISPR-Cas9 systems act continuously, raising the risk of off-target effects. But CRISPR-Cas9 systems that incorporate specially engineered RNAs could act transiently, potentially reducing unwanted changes. [Ernesto del Aguila III, NHGRI]

    By removing parts of the CRISPR/Cas9 gene-editing system and replacing them with specially engineered molecules, researchers at the University of California, San Diego (UCSD) and Isis Pharmaceuticals hope to limit the CRISPR/Cas9 system’s propensity for off-target effects. The researchers say that CRISPR/Cas9 needn’t remain continuously active. Instead, it could be transiently activated and deactivated. Such on/off control could prevent residual gene-editing activity that might go awry and could also be exploited for therapeutic purposes.

    The key, report the scientists, is the introduction of RNA-based drugs that can replace the guide RNA that usually serves to guide the Cas9 enzyme to a particular DNA sequence. When Cas9 is guided by a synthetic RNA-based drug, its cutting action can be suspended whenever the RNA-based drug is cleared. The Cas9’s cutting action can be stopped even more quickly if a second, chemically modified RNA drug is added, provided that it is engineered to direct inactivation of the gene encoding the Cas9 enzyme.

    Details about temporarily activated CRISPR/Cas9 systems appeared November 16 in the Proceedings of the National Academy of Sciences, in a paper entitled “Synthetic CRISPR RNA-Cas9–guided genome editing in human cells.” The paper’s senior author, UCSD’s Don Cleveland, Ph.D., noted that the RNA-based drugs described in the study “provide many advantages over the current CRISPR/Cas9 system,” such as increased editing efficiency and potential selectivity.

    “Here we develop a chemically modified, 29-nucleotide synthetic CRISPR RNA (scrRNA), which in combination with unmodified transactivating crRNA (tracrRNA) is shown to functionally replace the natural guide RNA in the CRISPR-Cas9 nuclease system and to mediate efficient genome editing in human cells,” wrote the authors of the PNAS paper. “Incorporation of rational chemical modifications known to protect against nuclease digestion and stabilize RNA–RNA interactions in the tracrRNA hybridization region of CRISPR RNA (crRNA) yields a scrRNA with enhanced activity compared with the unmodified crRNA and comparable gene disruption activity to the previously published single guide RNA.”

    Not only did the synthetic RNA functionally replace the natural crRNA, it produced enhanced cleavage activity at a target DNA site with apparently reduced off-target cleavage. These findings, Dr. Cleveland explained, could provide a platform for multiple therapeutic applications, especially for nervous system diseases, using successive application of cell-permeable, synthetic CRISPR RNAs to activate and then silence Cas9 activity. “In addition,” he said, “[these designer RNAs] can be synthesized efficiently, on an industrial scale and in a commercially feasible manner today.”
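For readers unfamiliar with how an RNA guide directs Cas9 to a specific DNA sequence, the sketch below scans a DNA string for candidate target sites: a 20-nucleotide protospacer immediately followed by an NGG PAM, the motif recognized by the commonly used S. pyogenes Cas9. This is a generic illustration of target-site selection, not the chemically modified 29-nt scrRNA design reported in the paper, and the example sequence is invented.

```python
# Generic illustration: find candidate SpCas9 target sites (20-nt protospacer
# followed by an NGG PAM) on the forward strand of a DNA sequence.
# The sequence below is invented; real guide designs also check the reverse
# strand, GC content, and predicted off-target sites.

import re

def find_cas9_sites(dna: str, protospacer_len: int = 20):
    dna = dna.upper()
    sites = []
    # A zero-width lookahead reports overlapping candidates as well.
    pattern = r"(?=([ACGT]{%d})[ACGT]GG)" % protospacer_len
    for match in re.finditer(pattern, dna):
        protospacer = match.group(1)
        pam = dna[match.start() + protospacer_len : match.start() + protospacer_len + 3]
        sites.append((match.start(), protospacer, pam))
    return sites

example = ("ATGCGTACCGGATTACCTGAAGGTTCAGCTGACCGGTTAGC"
           "GGCCATTGACCTGAAGGAGTACCGGAAGGCTT")

for start, protospacer, pam in find_cas9_sites(example):
    print(f"pos {start:3d}  protospacer {protospacer}  PAM {pam}")
```

In the approach described above, the spacer portion of the guide would then be synthesized (with or without chemical modifications) to match the chosen protospacer.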

 

 

Read Full Post »


Research on Scaffolds to support Stem Cells prior to Implantation

Reporter: Aviva Lev-Ari, PhD, RN

 

 

Fibrous Scaffolds with Varied Fiber Chemistry and Growth Factor Delivery Promote Repair in a Porcine Cartilage Defect Model

Iris L. Kim, Christian G. Pfeifer, Matthew B. Fisher, Vishal Saxena, Gregory R. Meloni, Mi Y. Kwon, Minwook Kim, David R. Steinberg, Robert L. Mauck, Jason A. Burdick

Tissue Engineering Part A. November 2015: 2680-2690.

Abstract | Full Text PDF or HTML | Supplementary Material | Reprints | Permissions

 

Hydrogel Microencapsulated Insulin-Secreting Cells Increase Keratinocyte Migration, Epidermal Thickness, Collagen Fiber Density, and Wound Closure in a Diabetic Mouse Model of Wound Healing

Ayesha Aijaz, Renea Faulknor, François Berthiaume, Ronke M. Olabisi

Tissue Engineering Part A. November 2015: 2723-2732.

Abstract | Full Text PDF or HTML | Reprints | Permissions

 

Bone Regeneration Using Hydroxyapatite Sponge Scaffolds with In Vivo Deposited Extracellular Matrix

Reiza Dolendo Ventura, Andrew Reyes Padalhin, Young-Ki Min, Byong-Taek Lee

Tissue Engineering Part A. November 2015: 2649-2661.

Abstract | Full Text PDF or HTML | Reprints | Permissions

 

In Vivo Evaluation of Adipose-Derived Stromal Cells Delivered with a Nanofiber Scaffold for Tendon-to-Bone Repair

Justin Lipner, Hua Shen, Leonardo Cavinatto, Wenying Liu, Necat Havlioglu, Younan Xia, Leesa M. Galatz, Stavros Thomopoulos

Tissue Engineering Part A. November 2015: 2766-2774.

Abstract | Full Text PDF or HTML | Supplementary Material | Reprints | Permissions

 

The Effects of Platelet-Rich Plasma on Cell Proliferation and Adipogenic Potential of Adipose-Derived Stem Cells

Han Tsung Liao, Isaac B. James, Kacey G. Marra, J. Peter Rubin

Tissue Engineering Part A. November 2015: 2714-2722.

Abstract | Full Text PDF or HTML | Reprints | Permissions

 

Ligament Tissue Engineering Using a Novel Porous Polycaprolactone Fumarate Scaffold and Adipose Tissue-Derived Mesenchymal Stem Cells Grown in Platelet Lysate

Eric R. Wagner, Dalibel Bravo, Mahrokh Dadsetan, Scott M. Riester, Steven Chase, Jennifer J. Westendorf, Allan B. Dietz, Andre J. van Wijnen, Michael J. Yaszemski, Sanjeev Kakar

Tissue Engineering Part A. November 2015: 2703-2713.

Abstract | Full Text PDF or HTML | Reprints | Permissions

 

Read Full Post »


FDA Guidance On Source Animal, Product, Preclinical and Clinical Issues Concerning the Use of Xenotransplantation Products in Humans – Implications for 3D BioPrinting of Regenerative Tissue

Reporter: Stephen J. Williams, Ph.D.

 

The FDA has issued final guidance on the use of xenotransplanted animal tissue, products, and cells in humans and on their use in medical procedures. Although the draft guidance was intended to expand on previous guidelines to prevent the introduction, transmission, and spread of communicable diseases, the updated guidance may also have implications for the use of such tissue in the emerging medical 3D printing field.

This document is to provide guidance on the production, testing and evaluation of products intended for use in xenotransplantation. The guidance includes scientific questions that should be addressed by sponsors during protocol development and during the preparation of submissions to the Food and Drug Administration (FDA), e.g., Investigational New Drug Application (IND) and Biologics License Application (BLA). This guidance document finalizes the draft guidance of the same title dated February 2001.

For the purpose of this document, xenotransplantation refers to any procedure that involves the transplantation, implantation, or infusion into a human recipient of either (a) live cells, tissues, or organs from a nonhuman animal source, or (b) human body fluids, cells, tissues or organs that have had ex vivo contact with live nonhuman animal cells, tissues or organs. For the purpose of this document, xenotransplantation products include live cells, tissues or organs used in xenotransplantation. (See Definitions in section I.C.)

This document presents issues that should be considered in addressing the safety of viable materials obtained from animal sources and intended for clinical use in humans. The potential threat to both human and animal welfare from zoonotic or other infectious agents warrants careful characterization of animal sources of cells, tissues, and organs. This document addresses issues such as the characterization of source animals, source animal husbandry practices, characterization of xenotransplantation products, considerations for the xenotransplantation product manufacturing facility, appropriate preclinical models for xenotransplantation protocols, and monitoring of recipients of xenotransplantation products. This document recommends specific practices intended to prevent the introduction and spread of infectious agents of animal origin into the human population. FDA expects that new methods proposed by sponsors to address specific issues will be scientifically rigorous and that sufficient data will be presented to justify their use.

Examples of procedures involving xenotransplantation products include:

  • transplantation of xenogeneic hearts, kidneys, or pancreatic tissue to treat organ failure,
  • implantation of neural cells to ameliorate neurological degenerative diseases,
  • administration of human cells previously cultured ex vivo with live nonhuman animal antigen-presenting or feeder cells, and
  • extracorporeal perfusion of a patient’s blood or blood component perfused through an intact animal organ or isolated cells contained in a device to treat liver failure.

The guidance addresses issues such as:

  1. Clinical Protocol Review
  2. Xenotransplantation Site
  3. Criteria for Patient Selection
  4. Risk/Benefit Assessment
  5. Screening for Infectious Agents
  6. Patient Follow-up
  7. Archiving of Patient Plasma and Tissue Specimens
  8. Health Records and Data Management
  9. Informed Consent
  10. Responsibility of the Sponsor in Informing the Patient of New Scientific Information

A full copy of the PDF can be found below for reference:

fdaguidanceanimalsourcesxenotransplatntation

An example of the need for this guidance in conjunction with 3D printing technology can be seen in the article below (source: http://www.geneticliteracyproject.org/2015/09/03/pig-us-xenotransplantation-new-age-chimeric-organs/).

Pig in us: Xenotransplantation and new age of chimeric organs

David Warmflash | September 3, 2015 | Genetic Literacy Project

Imagine stripping out the failing components of an old car — the engine, transmission, exhaust system and all of those parts — leaving just the old body and other structural elements. Replace those old mechanical parts with a brand new electric, hydrogen powered, biofuel, nuclear or whatever kind of engine you want and now you have a brand new car. It has an old frame, but that’s okay. The frame wasn’t causing the problem, and it can live on for years, undamaged.

When challenged to design internal organs, tissue engineers are taking a similar approach, particularly with the most complex organs, like the heart, liver and kidneys. These organs have three dimensional structures that are elaborate, not just at the gross anatomic level, but in microscopic anatomy too. Some day, their complex connective tissue scaffolding, the stroma, might be synthesized from the needed collagen proteins with advanced 3-D printing. But biomedical engineering is not there yet, so right now the best candidate for organ scaffolding comes from one of humanity’s favorite farm animals: the pig.

Chimera alarmists connecting with anti-biotechnology movements might cringe at the thought of building new human organs starting with pig tissue, but if you’re using only the organ scaffolding and building a working organ from there, pig organs may actually be more desirable than those donated by humans.

How big is the anti-chimerite movement?

Unlike anti-GMO and anti-vaccination activists, there really aren’t too many anti-chimerites around. Nevertheless, there is a presence on the web of people who express concern about the mixing of humans and non-human animals. Presently, much of their concern is focused on the growing of human organs inside non-human animals, pigs included. One anti-chimerite has written that it could be a problem for the following reason:

Once a human organ is grown inside a pig, that pig is no longer fully a pig. And without a doubt, that organ will no longer be a fully human organ after it is grown inside the pig. Those receiving those organs will be allowing human-animal hybrid organs to be implanted into them. Most people would be absolutely shocked to learn some of the things that are currently being done in the name of science.

The blog goes on to express alarm about the use of human genes in rice and from there morphs into an off-the-shelf, garden-variety anti-GMO tirade, though with an anti-chimeric current running through it. The concern about making pigs a little bit human and humans a little bit pig becomes a concern about making rice a little bit human. But the concern about fusing tissues and genes of humans and other species does not fit with the trend in modern medicine.

Utilization of pig tissue enters a new age 


A porcine human ear for xenotransplantation. Source: The Scientist

For decades, pig, bovine, and other non-human tissues have been used in medicine. People are walking around with pig and cow heart valves. Diabetics used to get a lot of insulin from pigs and cows, although today, thanks to genetic engineering, they get human insulin produced by genetically modified microorganisms, which is safer and more effective.

When it comes to building new organs from old ones, however, pig organs could actually be superior for a couple of reasons. For one thing, there is no availability problem with pigs. Their hearts and other organs also have all of the crucial components of the extracellular matrix that makes up an organ’s scaffolding. But unlike human organs, pig organs don’t tend to carry or transfer human diseases, a major advantage that makes them ideal starting material. There is another advantage as well: the hearts of human cadavers are typically damaged, either because heart disease is what killed the owner or because resuscitation efforts aimed at restarting the heart of a dying person, using electrical jolts and powerful drugs, damaged the organ.

Rebuilding an old organ into a new one

How then does the process work? Whether starting with a donated human or pig organ, there are several possible methods, but what they all have in common is that only the scaffolding of the original organ is retained. Just like the engine and transmission of the old car, the working tissue is removed, usually using detergents. One promising technique that has been applied to engineer new hearts is being tested by researchers at the University of Pittsburgh. Detergents are pumped into the aorta attached to a donated heart (from a human cadaver, pig, or cow). The pressure keeps the aortic valve closed, so the detergents flow into the coronary arteries and through the myocardial (heart muscle) and endocardial (lining over the muscle inside the heart chambers) tissue, which thus gets dissolved over the course of days. What’s left is just the stromal tissue, forming a scaffold. But that scaffold retains signaling factors that enable embryonic stem cells, or specially programmed adult pluripotent cells, to become all of the needed cells for a new heart.

Eventually, 3-D printing technology may reach the point where no donated scaffolding is needed, but that is not the case quite yet; moreover, with a pig scaffold, all of the needed signaling factors are there, and they work just as well as those in a human heart scaffold. All of this can lead to a scenario, possibly very soon, in which organs are made using off-the-shelf scaffolding from pig organs, ready to produce a custom-made heart using stem or other cells donated by the new organ’s recipient.

David Warmflash is an astrobiologist, physician, and science writer. Follow @CosmicEvolution to read what he is saying on Twitter.

And a Great Article in The Scientist by Dr. Ed Yong Entitled

Replacement Parts

To cope with a growing shortage of hearts, livers, and lungs suitable for transplant, some scientists are genetically engineering pigs, while others are growing organs in the lab.

By Ed Yong | August 1, 2012

Source: http://www.the-scientist.com/?articles.view/articleNo/32409/title/Replacement-Parts/

… in which Joseph Vacanti and David Cooper describe using

“engineered pigs without the α-1,3-galactosyltransferase gene that produces the α-gal residues. In addition, the pigs carry human cell-membrane proteins such as CD55 and CD46 that prevent the host’s complement system from assembling and attacking the foreign cells”

thereby limiting rejection of the xenotransplanted tissue.

In addition to issues related to animal virus transmission the issue of optimal scaffolds for organs as well as the advantages which 3D Printing would have in mass production of organs is discussed:

To Vacanti, artificial scaffolds are the future of organ engineering, and the only way in which organs for transplantation could be mass-produced. “You should be able to make them on demand, with low-cost materials and manufacturing technologies,” he says. That is relatively simple for organs like tracheas or bladders, which are just hollow tubes or sacs. Even though it is far more difficult for the lung or liver, which have complicated structures, Vacanti thinks it will be possible to simulate their architecture with computer models, and fabricate them with modern printing technology. (See “3-D Printing,” The Scientist, July 2012.) “They obey very ordered rules, so you can reduce it down to a series of algorithms, which can help you design them,” he says. But Taylor says that even if the architecture is correct, the scaffold would still need to contain the right surface molecules to guide the growth of any added cells. “It seems a bit of an overkill when nature has already done the work for us,” she says.
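As a purely illustrative footnote to the idea of reducing scaffold architecture "down to a series of algorithms," the sketch below voxelizes a block of material with a regular grid of spherical pores and reports its porosity. The geometry and numbers are invented; real scaffold design involves far more sophisticated models of vasculature, mechanics, and surface chemistry.

```python
# Toy illustration of describing a scaffold architecture algorithmically:
# build a voxelized block with a regular grid of spherical pores and report
# its porosity. Pore size, spacing, and resolution are invented for
# illustration and have no connection to any real scaffold design.

import numpy as np

def porous_block(shape=(60, 60, 60), pore_radius=6, pore_spacing=15):
    """Return a boolean voxel array: True = solid material, False = pore."""
    z, y, x = np.indices(shape)
    solid = np.ones(shape, dtype=bool)
    centers = range(pore_spacing // 2, max(shape), pore_spacing)
    for cz in centers:
        for cy in centers:
            for cx in centers:
                pore = (z - cz) ** 2 + (y - cy) ** 2 + (x - cx) ** 2 <= pore_radius ** 2
                solid &= ~pore   # carve the spherical pore out of the block
    return solid

block = porous_block()
porosity = 1.0 - block.mean()    # fraction of the volume occupied by pores
print(f"porosity: {porosity:.1%}")
```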

Read Full Post »