Healthcare analytics and AI solutions for biological big data: an AI platform for the biotech, life sciences, medical and pharmaceutical industries, as well as for related technological approaches, i.e., curation and text analysis with machine learning and other AI applications in these industries.
Core member and chair of the faculty, Broad Institute of MIT and Harvard; director, Klarman Cell Observatory, Broad Institute of MIT and Harvard; professor of biology, MIT; investigator, Howard Hughes Medical Institute; founding co-chair, Human Cell Atlas.
With millions of genome variants, tens of thousands of disease-associated genes, thousands of cell types and an almost unimaginable number of ways they can combine, we had to approximate a best starting point—choose one target, guess the cell, simplify the experiment.
By 2020, we expect advances in polygenic risk scores, in understanding the cells and modules of action of genes through genome-wide association studies (GWAS), and in predicting the impact of combinations of interventions.
we need algorithms to make better computational predictions of experiments we have never performed in the lab or in clinical trials.
Human Cell Atlas and the International Common Disease Alliance—and in new experimental platforms: data platforms and algorithms. But we also need a broader ecosystem of partnerships in medicine that engages interaction between clinical experts and mathematicians, computer scientists and engineers
Feng Zhang, PhD
investigator, Howard Hughes Medical Institute; core member, Broad Institute of MIT and Harvard; James and Patricia Poitras Professor of Neuroscience, McGovern Institute for Brain Research, MIT.
fundamental shift in medicine away from treating symptoms of disease and toward treating disease at its genetic roots.
With gene therapy now clinically feasible, improved delivery methods and the development of robust molecular technologies for gene editing in human cells, together with affordable genome sequencing, have accelerated our ability to identify the genetic causes of disease.
1,000 clinical trials testing gene therapies are ongoing, and the pace of clinical development is likely to accelerate.
refine molecular technologies for gene editing, to push our understanding of gene function in health and disease forward, and to engage with all members of society
Elizabeth Jaffee, PhD
Dana and Albert “Cubby” Broccoli Professor of Oncology, Johns Hopkins School of Medicine; deputy director, Sidney Kimmel Comprehensive Cancer Center at Johns Hopkins.
A single blood test could inform individuals of the diseases they are at risk of developing (diabetes, cancer, heart disease, etc.), and safe interventions will be available.
developing cancer vaccines. Vaccines targeting the causative agents of cervical and hepatocellular cancers have already proven to be effective. With these technologies and the wealth of data that will become available as precision medicine becomes more routine, new discoveries identifying the earliest genetic and inflammatory changes occurring within a cell as it transitions into a pre-cancer can be expected. With these discoveries, the opportunities to develop vaccine approaches preventing cancer development will grow.
shape how the culture of research will develop over the next 25 years, a culture that cares more about what is achieved than how it is achieved.
building a creative, inclusive and open research culture will unleash greater discoveries with greater impact.
John Nkengasong, PhD
Director, Africa Centres for Disease Control and Prevention.
To meet its health challenges by 2050, the continent will have to be innovative in order to leapfrog toward solutions in public health.
Precision medicine will need to take center stage in a new public health order, whereby a more precise and targeted approach to screening, diagnosis, treatment and, potentially, cure is based on each patient’s unique genetic and biologic make-up.
Eric Topol, MD
Executive vice-president, Scripps Research Institute; founder and director, Scripps Research Translational Institute.
In 2045, we can expect a planetary health infrastructure based on deep, longitudinal, multimodal human data, ideally collected from and accessible to as many as possible of the 9+ billion people projected to then inhabit the Earth.
This infrastructure will provide enhanced capabilities to perform functions that are not feasible now.
AI machines’ ability to ingest and process biomedical text at scale—such as the corpus of up-to-date medical literature—will be used routinely by physicians and patients.
the concept of a learning health system will be redefined by AI.
Linda Partridge, PhD
Professor, Max Planck Institute for Biology of Ageing.
Geroprotective drugs, which target the underlying molecular mechanisms of ageing, are coming over the scientific and clinical horizons, and may help to prevent the most intractable age-related disease, dementia.
Trevor Mundel, MD
President of Global Health, Bill & Melinda Gates Foundation.
Finding new ways to share clinical data that are as open as possible and as closed as necessary.
Moving beyond drug donations toward a new era of corporate social responsibility that encourages biotechnology and pharmaceutical companies to offer their best minds and their most promising platforms.
Working with governments and multilateral organizations much earlier in the product life cycle to finance the introduction of new interventions and to ensure the sustainable development of the health systems that will deliver them.
Together, these efforts can deliver on the promise of global health equity.
Josep Tabernero, MD, PhD
Vall d’Hebron Institute of Oncology (VHIO); president, European Society for Medical Oncology (2018–2019).
genomic-driven analysis will continue to broaden the impact of personalized medicine in healthcare globally.
Precision medicine will continue to deliver its new paradigm in cancer care and reach more patients.
Immunotherapy will deliver on its promise to dismantle cancer’s armory across tumor types.
AI will help guide the development of individually matched genetic patient screenings.
Liquid biopsy will deliver on its promise of policing disease.
Pardis Sabeti, PhD
Professor, Harvard University & Harvard T.H. Chan School of Public Health and Broad Institute of MIT and Harvard; investigator, Howard Hughes Medical Institute.
the development and integration of tools into an early-warning system embedded into healthcare systems around the world could revolutionize infectious disease detection and response.
But this will only happen with a commitment from the global community.
Els Toreele, PhD
Executive director, Médecins Sans Frontières Access Campaign.
we need a paradigm shift such that medicines are no longer lucrative market commodities but are global public health goods—available to all those who need them.
This will require members of the scientific community to go beyond their role as researchers and actively engage in R&D policy reform mandating health research in the public interest and ensuring that the results of their work benefit many more people.
The global research community can lead the way toward public-interest driven health innovation, by undertaking collaborative open science and piloting not-for-profit R&D strategies that positively impact people’s lives globally.
Use of 3D Bioprinting for Development of Toxicity Prediction Models
Curator: Stephen J. Williams, PhD
SOT FDA Colloquium on 3D Bioprinted Tissue Models: Tuesday, April 9, 2019
The Society of Toxicology (SOT) and the U.S. Food and Drug Administration (FDA) will hold a workshop on “Alternative Methods for Predictive Safety Testing: 3D Bioprinted Tissue Models” on Tuesday, April 9, at the FDA Center for Food Safety and Applied Nutrition in College Park, Maryland. This workshop is the latest in the series, “SOT FDA Colloquia on Emerging Toxicological Science: Challenges in Food and Ingredient Safety.”
Human 3D bioprinted tissues represent a valuable in vitro approach for chemical, personal care product, cosmetic, and preclinical toxicity/safety testing. Bioprinting of skin, liver, and kidney is already appearing in toxicity testing applications for chemical exposures and disease modeling. The use of 3D bioprinted tissues and organs may provide future alternative approaches for testing that may more closely resemble and simulate intact human tissues to more accurately predict human responses to chemical and drug exposures.
A synopsis of the schedule and related works from the speakers is given below:
8:40 AM–9:20 AM
Overview and Challenges of Bioprinting
Sharon Presnell, Amnion Foundation, Winston-Salem, NC
9:20 AM–10:00 AM
Putting 3D Bioprinting to the Use of Tissue Model Fabrication
Y. Shrike Zhang, Brigham and Women’s Hospital, Harvard Medical School and Harvard-MIT Division of Health Sciences and Technology, Boston, MA
10:00 AM–10:20 AM
Break
10:20 AM–11:00 AM
Uses of Bioprinted Liver Tissue in Drug Development
Jean-Louis Klein, GlaxoSmithKline, Collegeville, PA
11:00 AM–11:40 AM
Biofabrication of 3D Tissue Models for Disease Modeling and Chemical Screening
Marc Ferrer, National Center for Advancing Translational Sciences, NIH, Rockville, MD
Dr. Sharon Presnell was most recently the Chief Scientific Officer at Organovo, Inc., and the President of their wholly-owned subsidiary, Samsara Sciences. She received a Ph.D. in Cell & Molecular Pathology from the Medical College of Virginia and completed her undergraduate degree in biology at NC State. In addition to her most recent roles, Presnell has served as the director of cell biology R&D at Becton Dickinson’s corporate research center in RTP, and as the SVP of R&D at Tengion. Her roles have always involved the commercial and clinical translation of basic research and early development in the cell biology space. She serves on the board of the Coulter Foundation at the University of Virginia and is a member of the College of Life Sciences Foundation Board at NC State. In January 2019, Dr. Presnell will begin a new role as President of the Amnion Foundation, a non-profit organization in Winston-Salem.
Integrating Kupffer cells into a 3D bioprinted model of human liver recapitulates fibrotic responses to certain toxicants in a time- and context-dependent manner. This work establishes that Kupffer cells and macrophages are important mediators of fibrotic responses to certain hepatotoxins, and that both should be incorporated into bioprinted human liver models for toxicology testing.
Abstract: Modeling clinically relevant tissue responses using cell models poses a significant challenge for drug development, in particular for drug-induced liver injury (DILI). This is mainly because existing liver models lack longevity and tissue-level complexity, which limits their utility in predictive toxicology. In this study, we established and characterized novel bioprinted human liver tissue mimetics comprised of patient-derived hepatocytes and non-parenchymal cells in a defined architecture. Scaffold-free assembly of different cell types in an in vivo-relevant architecture allowed for histologic analysis that revealed distinct intercellular hepatocyte junctions, CD31+ endothelial networks, and desmin-positive, smooth muscle actin-negative quiescent stellates. Unlike what was seen in 2D hepatocyte cultures, the tissues maintained levels of ATP and albumin, as well as expression and drug-induced enzyme activity of cytochrome P450s, over 4 weeks in culture. To assess the ability of the 3D liver cultures to model tissue-level DILI, dose responses of Trovafloxacin, a drug whose hepatotoxic potential could not be assessed by standard pre-clinical models, were compared to the structurally related non-toxic drug Levofloxacin. Trovafloxacin induced significant, dose-dependent toxicity at clinically relevant doses (≤4 µM). Interestingly, Trovafloxacin toxicity was observed without lipopolysaccharide stimulation and in the absence of resident macrophages, in contrast to earlier reports. Together, these results demonstrate that 3D bioprinted liver tissues can both effectively model DILI and distinguish between highly related compounds with differential toxicity profiles. Thus, the combination of patient-derived primary cells with bioprinting technology here for the first time demonstrates superior performance in terms of mimicking human drug response in a known target organ at the tissue level.
A great interview with Dr. Presnell and the 3D Models 2017 Symposium is located here:
Please click here for the web-based and PDF versions of the interview
Some highlights of the interview include
Exciting advances in the field show that we can model complex tissue-level disease-state phenotypes that develop in response to chronic, long-term injury or exposure
She sees the field developing a means to converge the biology and physiology of tissues, namely modeling the connectivity between tissues, such as fluid flow
Future work will need to be dedicated to developing comprehensive analytics for 3D tissue analysis. As she states, “we are very conditioned to get information in a simple way from biochemical readouts in two dimension, monocellular systems”; however, how we address the complexity of various cellular responses in a 3D multicellular environment will be pertinent
Additional challenges include the scalability of such systems and making such systems accessible in a larger way
Shrike Zhang, Brigham and Women’s Hospital, Harvard Medical School and Harvard-MIT Division of Health Sciences and Technology
Dr. Zhang currently holds an Assistant Professor position at Harvard Medical School and is an Associate Bioengineer at Brigham and Women’s Hospital. His research interests include organ-on-a-chip, 3D bioprinting, biomaterials, regenerative engineering, biomedical imaging, biosensing, nanomedicine, and developmental biology. His scientific contributions have been recognized by >40 international, national, and regional awards. He has been invited to deliver >70 lectures worldwide, and has served as reviewer for >400 manuscripts for >30 journals. He is serving as Editor-in-Chief for Microphysiological Systems, and Associate Editor for Bio-Design and Manufacturing. He is also on Editorial Board of Bioprinting, Heliyon, BMC Materials, and Essays in Biochemistry, and on Advisory Panel of Nanotechnology.
Skardal A, Murphy SV, Devarasetty M, Mead I, Kang HW, Seol YJ, Shrike Zhang Y, Shin SR, Zhao L, Aleman J, Hall AR, Shupe TD, Kleensang A, Dokmeci MR, Jin Lee S, Jackson JD, Yoo JJ, Hartung T, Khademhosseini A, Soker S, Bishop CE, Atala A.
Sci Rep. 2017 Aug 18;7(1):8837. doi: 10.1038/s41598-017-08879-x.
Bhise NS, Manoharan V, Massa S, Tamayol A, Ghaderi M, Miscuglio M, Lang Q, Shrike Zhang Y, Shin SR, Calzone G, Annabi N, Shupe TD, Bishop CE, Atala A, Dokmeci MR, Khademhosseini A.
Biofabrication. 2016 Jan 12;8(1):014101. doi: 10.1088/1758-5090/8/1/014101.
Marc Ferrer, National Center for Advancing Translational Sciences, NIH
Marc Ferrer is a team leader in the NCATS Chemical Genomics Center, which was part of the National Human Genome Research Institute when Ferrer began working there in 2010. He has extensive experience in drug discovery, both in the pharmaceutical industry and academic research. Before joining NIH, he was director of assay development and screening at Merck Research Laboratories. For 10 years at Merck, Ferrer led the development of assays for high-throughput screening of small molecules and small interfering RNA (siRNA) to support programs for lead and target identification across all disease areas.
At NCATS, Ferrer leads the implementation of probe development programs, discovery of drug combinations and development of innovative assay paradigms for more effective drug discovery. He advises collaborators on strategies for discovering small molecule therapeutics, including assays for screening and lead identification and optimization. Ferrer has experience implementing high-throughput screens for a broad range of disease areas with a wide array of assay technologies. He has led and managed highly productive teams by setting clear research strategies and goals and by establishing effective collaborations between scientists from diverse disciplines within industry, academia and technology providers.
Ferrer has a Ph.D. in biological chemistry from the University of Minnesota, Twin Cities, and completed postdoctoral training at Harvard University’s Department of Molecular and Cellular Biology. He received a B.Sc. degree in organic chemistry from the University of Barcelona in Spain.
Wilson KM, Mathews-Griner LA, Williamson T, Guha R, Chen L, Shinn P, McKnight C, Michael S, Klumpp-Thomas C, Binder ZA, Ferrer M, Gallia GL, Thomas CJ, Riggins GJ.
SLAS Technol. 2019 Feb;24(1):28-40. doi: 10.1177/2472630318803749. Epub 2018 Oct 5.
Can Blockchain Technology and Artificial Intelligence Cure What Ails Biomedical Research and Healthcare, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)
Can Blockchain Technology and Artificial Intelligence Cure What Ails Biomedical Research and Healthcare
Curator: Stephen J. Williams, Ph.D.
Updated 12/18/2018
In the effort to reduce healthcare costs, provide increased accessibility of services for patients, and drive biomedical innovation, many healthcare and biotechnology professionals have looked to advances in digital technology to determine how IT can drive and extract greater value from the healthcare industry. Two areas of recent interest have focused on how best to use blockchain and artificial intelligence technologies to drive greater efficiencies in our healthcare and biotechnology industries.
More importantly, with the substantial increase in ‘omic data generated both in research and in the clinical setting, it has become imperative to develop ways to securely store and disseminate the massive amounts of ‘omic data to the relevant parties (researchers or clinicians) in an efficient manner, while protecting personal privacy and adhering to international regulations. This is where blockchain technologies may play an important role.
A recent Oncotarget paper by Mamoshina et al. (1) discussed the possibility that next-generation artificial intelligence and blockchain technologies could synergize to accelerate biomedical research, give patients new tools to control and profit from their personal healthcare data, and assist patients with their healthcare monitoring needs. According to the abstract:
The authors introduce new concepts to appraise and evaluate personal records, including the combination-, time- and relationship value of the data. They also present a roadmap for a blockchain-enabled decentralized personal health data ecosystem to enable novel approaches for drug discovery, biomarker development, and preventative healthcare. In this system, blockchain and deep learning technologies would provide the secure and transparent distribution of personal data in a healthcare marketplace, and would also be useful to resolve challenges faced by the regulators and return control over personal data including medical records to the individual.
The review discusses:
Recent achievements in next-generation artificial intelligence
Basic concepts of highly distributed storage systems (HDSS) as a preferred method for medical data storage
The open-source blockchain framework Exonum and its application to a healthcare marketplace
A blockchain-based platform allowing patients to have control of their data and manage access
How advances in deep learning can improve data quality, especially in an era of big data
Advances in Artificial Intelligence
Integrative analysis of the vast amount of health-associated data from a multitude of large-scale global projects has proven to be highly problematic, as high-quality biomedical data is highly complex and heterogeneous in nature, which necessitates special preprocessing and analysis.
Increased computing processing power and algorithm advances have led to significant advances in machine learning, especially machine learning involving Deep Neural Networks (DNNs), which are able to capture high-level dependencies in healthcare data. Some examples of the uses of DNNs are:
Prediction of drug properties (2, 3) and toxicities (4)
Generative Adversarial Networks (GANs) (https://arxiv.org/abs/1406.2661): require good datasets for extensive training but have been used to determine the tumor growth inhibition capabilities of various molecules (7)
Recurrent Neural Networks (RNNs): originally made for sequence analysis, RNNs have proved useful in analyzing text and time-series data, and thus would be very useful for electronic record analysis. They have also been useful in predicting blood glucose levels of type 1 diabetic patients using data obtained from continuous glucose monitoring devices (8)
Transfer Learning: focused on transferring information learned on one domain or larger dataset to another, smaller domain, to reduce the dependence on the large training datasets that RNNs, GANs, and DNNs require. Biomedical imaging is an example application of transfer learning.
One- and Zero-Shot Learning: retains the ability to work with restricted datasets, like transfer learning. One-shot learning aims to recognize new data points based on a few examples from the training set, while zero-shot learning aims to recognize new objects without seeing examples of those classes within the training set.
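As a toy illustration of the one-shot idea (a minimal sketch, not from the review; the labels and embeddings are invented): a new data point is classified by comparing its embedding to a single labeled "support" example per class and taking the nearest one.

```python
# Toy one-shot classification by nearest-neighbor matching: each class is
# represented by a single support embedding, and a query is assigned the
# label of the closest support vector.
import numpy as np

def one_shot_classify(support, query):
    """support: dict mapping label -> 1D embedding; query: 1D embedding."""
    labels = list(support)
    dists = [np.linalg.norm(query - support[label]) for label in labels]
    return labels[int(np.argmin(dists))]

# One labeled example per class (e.g., embeddings from a pretrained network).
support = {"toxic": np.array([1.0, 0.0]), "non-toxic": np.array([0.0, 1.0])}
print(one_shot_classify(support, np.array([0.9, 0.2])))  # closest to "toxic"
```

In practice the embeddings would come from a network pretrained on a larger dataset, which is where the overlap with transfer learning lies.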
Highly Distributed Storage Systems (HDSS)
The explosion in data generation has necessitated the development of better systems for data storage and handling. HDSS systems need to be reliable, accessible, scalable, and affordable. This involves storing data across different nodes; the data stored on these nodes is replicated, which makes access rapid. However, data consistency and affordability remain big challenges.
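The replication idea can be sketched in a few lines (a hypothetical toy model, not any particular HDSS): each record is written to several nodes chosen by hashing the key, so a read can be served by any surviving replica even after a node fails.

```python
# Toy sketch of replicated storage across nodes: writes go to
# `replication_factor` consecutive nodes chosen by hashing the key,
# so reads survive the loss of individual nodes.
import hashlib

class ReplicatedStore:
    def __init__(self, n_nodes=5, replication_factor=3):
        self.nodes = [dict() for _ in range(n_nodes)]
        self.rf = replication_factor

    def _replicas(self, key):
        start = int(hashlib.sha256(key.encode()).hexdigest(), 16) % len(self.nodes)
        return [(start + i) % len(self.nodes) for i in range(self.rf)]

    def put(self, key, value):
        for i in self._replicas(key):      # write to every replica node
            self.nodes[i][key] = value

    def get(self, key):
        for i in self._replicas(key):      # read from the first live replica
            if key in self.nodes[i]:
                return self.nodes[i][key]
        raise KeyError(key)

store = ReplicatedStore()
store.put("patient-42:genome", "VCF blob")
store.nodes[store._replicas("patient-42:genome")[0]].clear()  # simulate node loss
print(store.get("patient-42:genome"))  # still readable from another replica
```

The consistency challenge mentioned above arises exactly here: if two writers update different replicas concurrently, the copies diverge, which is the problem blockchain's consensus mechanism addresses.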
Blockchain is a distributed database used to maintain a growing list of records, in which records are grouped into blocks linked together by a cryptographic algorithm to maintain data consistency. Each block contains a timestamp and a link to the previous block in the chain. Blockchain is a distributed ledger of blocks, meaning it is owned, shared, and accessible by everyone. This allows a verifiable, secure, and consistent history of a record of events.
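The structure described above can be sketched in a few lines (an illustrative toy, not a production implementation): each block carries a timestamp, a payload, and the hash of the previous block, so altering any record breaks every later link in the chain.

```python
# Minimal blockchain sketch: blocks are linked by hashing, so tampering
# with any block invalidates the chain from that point onward.
import hashlib
import json
import time

def make_block(data, prev_hash):
    block = {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    # Hash a canonical serialization of the block contents.
    payload = json.dumps({k: block[k] for k in ("timestamp", "data", "prev_hash")},
                         sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def chain_is_valid(chain):
    # Every block must point at the hash of its predecessor.
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev_hash"] != prev["hash"]:
            return False
    return True

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block({"record": "lab result A"}, chain[-1]["hash"]))
chain.append(make_block({"record": "lab result B"}, chain[-1]["hash"]))
print(chain_is_valid(chain))   # True
chain[1]["hash"] = "tampered"
print(chain_is_valid(chain))   # False: the link to the next block is broken
```

This is what gives the ledger its verifiable, consistent history: a tampered record cannot be hidden without rewriting every subsequent block on every copy of the ledger.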
Data Privacy and Regulatory Issues
The establishment of the Health Insurance Portability and Accountability Act (HIPAA) in 1996 has provided much-needed regulatory guidance and a framework for clinicians and all concerned parties within the healthcare and health data chain. The HIPAA act has already provided much-needed guidance for the latest technologies impacting healthcare, most notably the use of social media and mobile communications (discussed in this article: Can Mobile Health Apps Improve Oral-Chemotherapy Adherence? The Benefit of Gamification.). The advent of blockchain technology in healthcare offers its own unique challenges; however, HIPAA offers a basis for developing a regulatory framework in this regard. The special standards regarding electronic data transfer are explained in HIPAA’s Privacy Rule, which regulates how certain entities (covered entities) use and disclose individually identifiable health information (Protected Health Information, PHI), and protects the transfer of such information over any medium or electronic data format. However, some of the benefits of blockchain that may revolutionize the healthcare system may be in direct contradiction with HIPAA rules, as outlined below:
Issues of Privacy Specific In Use of Blockchain to Distribute Health Data
Blockchain was designed as a distributed database, maintained by multiple independent parties, and decentralized
Linkage timestamping: although useful for time-dependent data, proof that third parties have not interfered with the process would have to be established, including accountability measures
Blockchain uses a consensus algorithm, even though end users may have their own private key
Applied cryptography measures and routines are used to decentralize authentication (publicly available)
Blockchain users are divided into three main categories: 1) maintainers of the blockchain infrastructure; 2) external auditors, who store a replica of the blockchain; and 3) end users or clients, who may have access to only a relatively small portion of the blockchain but whose software may use cryptographic proofs to verify the authenticity of data.
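The kind of cryptographic proof an end user's software can apply is typically a Merkle proof (a minimal sketch for illustration, with made-up record names): a client holding only one record plus a handful of sibling hashes can recompute the root hash kept by full nodes, without downloading the whole ledger.

```python
# Merkle-proof sketch: a light client verifies that one record belongs to
# the ledger by recomputing the root from the record and its sibling hashes.
import hashlib

def h(x):
    return hashlib.sha256(x).digest()

def merkle_root(leaves):
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                  # duplicate last hash if odd count
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes needed to recompute the root for leaves[index]."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1                 # partner hash in this pair
        proof.append((level[sibling], index % 2 == 0))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    acc = h(leaf)
    for sibling, leaf_is_left in proof:
        acc = h(acc + sibling) if leaf_is_left else h(sibling + acc)
    return acc == root

records = [b"record-0", b"record-1", b"record-2", b"record-3"]
root = merkle_root(records)
proof = merkle_proof(records, 2)
print(verify(b"record-2", proof, root))   # True: record is in the ledger
print(verify(b"record-X", proof, root))   # False: a forged record fails
```

The proof size grows only logarithmically with the number of records, which is why clients need to hold only a small portion of the chain.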
YouTube video on How #Blockchain Will Transform Healthcare in 25 Years (please click below)
In Big Data for Better Outcomes, BigData@Heart, DO->IT, EHDN, the EU data Consortia, and yes, even concepts like pay for performance, Richard Bergström has had a hand in their creation. The former Director General of EFPIA, and now the head of health both at SICPA and their joint venture blockchain company Guardtime, Richard is always ahead of the curve. In fact, he’s usually the one who makes the curve in the first place.
Please click on the following link for a podcast on Big Data, Blockchain and Pharma/Healthcare by Richard Bergström:
Mamoshina, P., Ojomoko, L., Yanovich, Y., Ostrovski, A., Botezatu, A., Prikhodko, P., Izumchenko, E., Aliper, A., Romantsov, K., Zhebrak, A., Ogu, I. O., and Zhavoronkov, A. (2018) Converging blockchain and next-generation artificial intelligence technologies to decentralize and accelerate biomedical research and healthcare, Oncotarget 9, 5665-5690.
Aliper, A., Plis, S., Artemov, A., Ulloa, A., Mamoshina, P., and Zhavoronkov, A. (2016) Deep Learning Applications for Predicting Pharmacological Properties of Drugs and Drug Repurposing Using Transcriptomic Data, Molecular Pharmaceutics 13, 2524-2530.
Wen, M., Zhang, Z., Niu, S., Sha, H., Yang, R., Yun, Y., and Lu, H. (2017) Deep-Learning-Based Drug-Target Interaction Prediction, Journal of Proteome Research 16, 1401-1409.
Gao, M., Igata, H., Takeuchi, A., Sato, K., and Ikegaya, Y. (2017) Machine learning-based prediction of adverse drug effects: An example of seizure-inducing compounds, Journal of Pharmacological Sciences 133, 70-78.
Putin, E., Mamoshina, P., Aliper, A., Korzinkin, M., Moskalev, A., Kolosov, A., Ostrovskiy, A., Cantor, C., Vijg, J., and Zhavoronkov, A. (2016) Deep biomarkers of human aging: Application of deep neural networks to biomarker development, Aging 8, 1021-1033.
Vandenberghe, M. E., Scott, M. L., Scorer, P. W., Soderberg, M., Balcerzak, D., and Barker, C. (2017) Relevance of deep learning to facilitate the diagnosis of HER2 status in breast cancer, Scientific Reports 7, 45938.
Kadurin, A., Nikolenko, S., Khrabrov, K., Aliper, A., and Zhavoronkov, A. (2017) druGAN: An Advanced Generative Adversarial Autoencoder Model for de Novo Generation of New Molecules with Desired Molecular Properties in Silico, Molecular Pharmaceutics 14, 3098-3104.
Ordonez, F. J., and Roggen, D. (2016) Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition, Sensors (Basel) 16.
December 5, 2018 | The boom of blockchain and distributed ledger technologies has inspired healthcare organizations to test the capabilities of their data. Quest Diagnostics, in partnership with Humana, MultiPlan, and UnitedHealth Group’s Optum and UnitedHealthcare, has launched a pilot program that applies blockchain technology to improve data quality and reduce administrative costs associated with changes to healthcare provider demographic data.
The collective body, called Synaptic Health Alliance, explores how blockchain can ensure that only the most current healthcare provider information is available in health plan provider directories. The alliance plans to share its progress in the first half of 2019.
Providing consumers looking for care with accurate information when they need it is essential to a high-functioning overall healthcare system, Jason O’Meara, Senior Director of Architecture at Quest Diagnostics, told Clinical Informatics News in an email interview.
“We were intentional about calling ourselves an alliance as it speaks to the shared interest in improving health care through better, collaborative use of an innovative technology,” O’Meara wrote. “Our large collective dataset and national footprints enable us to prove the value of data sharing across company lines, which has been limited in healthcare to date.”
O’Meara said Quest Diagnostics has been investing time and resources the past year or two in understanding blockchain, its ability to drive purpose within the healthcare industry, and how to leverage it for business value.
“Many health care and life science organizations have cast an eye toward blockchain’s potential to inform their digital strategies,” O’Meara said. “We recognize it takes time to learn how to leverage a new technology. We started exploring the technology in early 2017, but we quickly recognized the technology’s value is in its application to business to business use cases: to help transparently share information, automate mutually-beneficial processes and audit interactions.”
Quest began discussing the potential for an alliance with the four other companies a year ago, O’Meara said. Each company shared traits that would allow them to prove the value of data sharing across company lines.
“While we have different perspectives, each member has deep expertise in healthcare technology, a collaborative culture, and desire to continuously improve the patient/customer experience,” said O’Meara. “We also recognize the value of technology in driving efficiencies and quality.”
Following its initial launch in April, Synaptic Health Alliance is deploying a multi-company, multi-site, permissioned blockchain. According to a whitepaper published by Synaptic Health, the choice to use a permissioned blockchain rather than an anonymous one is crucial to the alliance’s success.
“This is a more effective approach, consistent with enterprise blockchains,” an alliance representative wrote. “Each Alliance member has the flexibility to deploy its nodes based on its enterprise requirements. Some members have elected to deploy their nodes within their own data centers, while others are using secured public cloud services such as AWS and Azure. This level of flexibility is key to growing the Alliance blockchain network.”
As the pilot moves forward, O’Meara says the Alliance plans to open membership to other organizations. Earlier this week, Aetna and Ascension announced that they had joined the project.
“I am personally excited by the amount of cross-company collaboration facilitated by this project,” O’Meara says. “We have already learned so much from each other and are using that knowledge to really move the needle on improving healthcare.”
November 29, 2018 | The US Department of Health and Human Services (HHS) is making waves in the blockchain space. The agency’s Division of Acquisition (DA) has developed a new system, called Accelerate, which gives acquisition teams detailed information on pricing, terms, and conditions across HHS in real-time. The department’s Associate Deputy Assistant Secretary for Acquisition, Jose Arrieta, gave a presentation and live demo of the blockchain-enabled system at the Distributed: Health event earlier this month in Nashville, Tennessee.
Accelerate is still in the prototype phase, Arrieta said, with hopes that the new system will be deployed at the end of the fiscal year.
HHS spends around $25 billion a year in contracts, Arrieta said. That’s 100,000 contracts a year with over one million pages of unstructured data managed through 45 different systems. Arrieta and his team wanted to modernize the system.
“But if you’re going to change the way a workforce of 20,000 people do business, you have to think your way through how you’re going to do that,” said Arrieta. “We didn’t disrupt the existing systems: we cannibalized them.”
The cannibalization process resulted in Accelerate. According to Arrieta, the system functions by creating a record of data rather than storing it, leveraging machine learning, artificial intelligence (AI), and robotic process automation (RPA), all through blockchain data.
“We’re using that data record as a mechanism to redesign the way we deliver services through micro-services strategies,” Arrieta said. “Why is that important? Because if you have a single application or data use that interfaces with 55 other applications in your business network, it becomes very expensive to make changes to one of the 55 applications.”
Accelerate distributes the data to the workforce, making it available to them one business process at a time.
“We’re building those business processes without disrupting the existing systems,” said Arrieta, and that’s key. “We’re not shutting off those systems. We’re using human-centered design sessions to rebuild value exchange off of that data.”
The first application for the system, Arrieta said, can be compared to department stores price-matching their online competitors.
It takes HHS close to a month to collect and amalgamate data from existing systems, whether that be terms and conditions that drive certain price points, or software licenses.
“The micro-service we built actually analyzes that data, and provides that information to you within one second,” said Arrieta. “This is distributed to the workforce, to the 5,000 people that do the contracting, to the 15,000 people that actually run the programs at [HHS].”
This simple micro-service is replicated on every node related to HHS’s internal workforce. If somebody wants to change the algorithm to fit their needs, they can do that in a distributed manner.
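The "record of data" plus per-node micro-service pattern described above can be illustrated with a minimal sketch. This is a hypothetical toy, not HHS's actual system: an append-only, hash-chained ledger that each node replicates, with a small pricing micro-service layered on top of the replica.

```python
import hashlib
import json

# Minimal sketch (all names hypothetical) of an append-only, hash-chained
# "record of data": each record commits to the previous one, so tampering
# with history is detectable on every replica.
class Ledger:
    def __init__(self):
        self.records = []

    def append(self, payload: dict) -> dict:
        prev_hash = self.records[-1]["hash"] if self.records else "0" * 64
        body = {"payload": payload, "prev_hash": prev_hash}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        record = {**body, "hash": digest}
        self.records.append(record)
        return record

    def verify(self) -> bool:
        # Recompute every hash; any edit to an earlier record breaks the chain.
        prev = "0" * 64
        for r in self.records:
            body = {"payload": r["payload"], "prev_hash": r["prev_hash"]}
            if r["prev_hash"] != prev:
                return False
            if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != r["hash"]:
                return False
            prev = r["hash"]
        return True

# A per-node "micro-service": query historical pricing from the local replica
# instead of pulling from 45 source systems.
def best_price(ledger: Ledger, item: str) -> float:
    return min(r["payload"]["price"] for r in ledger.records
               if r["payload"].get("item") == item)

ledger = Ledger()
ledger.append({"item": "software license", "price": 120.0})
ledger.append({"item": "software license", "price": 95.0})
assert ledger.verify()
print(best_price(ledger, "software license"))  # 95.0
```

Because each node holds the full chain, a team that wants a different analysis can swap in its own function over the same replicated records, which is the "change the algorithm in a distributed manner" idea.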
Arrieta hopes to use Accelerate to save researchers money at the point of purchase. The program uses blockchain to simplify the process of acquisition.
“How many of you work with the federal government?” Arrieta asked the audience. “Do you get sick of reentering the same information over and over again? Every single business opportunity you apply for, you have to resubmit your financial information. You constantly have to check for validation and verification, constantly have to resubmit capabilities.”
Wouldn’t it be better to have historical notes available for each transaction? said Arrieta. This would allow clinical researchers to be able to focus on “the things they’re really good at,” instead of red tape.
“If we had the top cancer researcher in the world, would you really want her spending her time learning about federal regulations as to how to spend money, or do you want her trying to solve cancer?” Arrieta said. “What we’re doing is providing that data to the individual in a distributed manner so they can read the information of historical purchases that support activity, and they can focus on the objectives and risks they see as it relates to their programming and their objectives.”
Blockchain also creates transparency among researchers, Arrieta said, which he says creates an “uncomfortable reality”: they have to make decisions regarding data, fundamentally changing value exchange.
“The beauty of our business model is internal investment,” Arrieta said. For instance, the HHS could take all the sepsis data that exists in their system, put it into a distributed ledger, and share it with an external source.
“Maybe that could fuel partnership,” Arrieta said. “I can make data available to researchers in the field in real-time so they can actually test their hypothesis, test their intuition, and test their imagination as it relates to solving real-world problems.”
Blockchain-based genomic data hub platform Shivom recently reached its $35 million hard cap within 15 seconds of opening its main token sale. Shivom received funding from a number of crypto VC funds, including Collinstar, Lateral, and Ironside.
The goal is to create the world’s largest store of genomic data while offering an open web marketplace for patients, data donors, and providers — such as pharmaceutical companies, research organizations, governments, patient-support groups, and insurance companies.
“Disrupting the whole of the health care system as we know it has to be the most exciting use of such large DNA datasets,” Shivom CEO Henry Ines told me. “We’ll be able to stratify patients for better clinical trials, which will help to advance research in precision medicine. This means we will have the ability to make a specific drug for a specific patient based on their DNA markers. And what with the cost of DNA sequencing getting cheaper by the minute, we’ll also be able to sequence individuals sooner, so young children or even newborn babies could be sequenced from birth and treated right away.”
While there are many solutions examining DNA data to explain heritage, intellectual capabilities, health, and fitness, the potential of genomic data has largely yet to be unlocked. A few companies hold the monopoly on genomic data and make sizeable profits from selling it to third parties, usually without sharing the earnings with the data donor. Donors are also not informed if and when their information is shared, nor do they have any guarantee that their data is secure from hackers.
Shivom wants to change that by creating a decentralized platform that will break these monopolies, democratizing the processes of sharing and utilizing the data.
“Overall, large DNA datasets will have the potential to aid in the understanding, prevention, diagnosis, and treatment of every disease known to mankind, and could create a future where no diseases exist, or those that do can be cured very easily and quickly,” Ines said. “Imagine that, a world where people do not get sick or are already aware of what future diseases they could fall prey to and so can easily prevent them.”
Shivom’s use of blockchain technology and smart contracts ensures that all genomic data shared on the platform will remain anonymous and secure, while its OmiX token incentivizes users to share their data for monetary gain.
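The incentive mechanism described above (anonymous sharing, token rewards) can be sketched in a few lines. This is a hypothetical illustration of the general pattern, not Shivom's actual smart contract; the reward amount, function names, and data references are all made up.

```python
import hashlib

TOKEN_REWARD = 10  # hypothetical OmiX tokens credited per data access

# Toy model of a data-marketplace contract: buyers only ever see a
# pseudonymous donor ID, and every access credits the donor's balance.
class DataContract:
    def __init__(self):
        self.balances = {}   # pseudonymous donor ID -> token balance
        self.datasets = {}   # pseudonymous donor ID -> genomic data reference

    def donate(self, donor_name: str, data_ref: str) -> str:
        # Pseudonymize the donor; the raw identity never enters the registry.
        donor_id = hashlib.sha256(donor_name.encode()).hexdigest()[:16]
        self.datasets[donor_id] = data_ref
        self.balances.setdefault(donor_id, 0)
        return donor_id

    def access(self, donor_id: str) -> str:
        # Each access automatically pays the donor.
        self.balances[donor_id] += TOKEN_REWARD
        return self.datasets[donor_id]

contract = DataContract()
anon = contract.donate("alice", "vcf://sample-001")
contract.access(anon)
print(contract.balances[anon])  # 10
```

On a real chain this logic would live in a smart contract so that neither the platform nor the buyer can skip the payout step.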
Blockchain will secure a DNA database for 50 million citizens of Andhra Pradesh, the eighth-largest state in India. The state government signed a Memorandum of Understanding with Shivom, a German genomics and precision medicine start-up, which announced it will start the pilot project soon. The move falls in line with a trend of governments turning to population genomics while securing the sensitive data through blockchain.
Andhra Pradesh, DNA, and blockchain
Storing sensitive genetic information safely and securely is a big challenge. Shivom builds a genomic data-hub powered by blockchain technology. It aims to connect researchers with DNA data donors thus facilitating medical research and the healthcare industry.
With regards to Andhra Pradesh, the start-up will first launch a trial to determine the viability of their technology for moving from a proactive to a preventive approach in medicine, and towards precision health. “Our partnership with Shivom explores the possibilities of providing an efficient way of diagnostic services to patients of Andhra Pradesh by maintaining the privacy of the individual data through blockchain technologies,” said J A Chowdary, IT Advisor to Chief Minister, Government of Andhra Pradesh.
Other Articles in this Open Access Journal on Digital Health include:
Live Conference Coverage @Medcitynews Converge 2018 Philadelphia: Liquid Biopsy and Gene Testing vs Reimbursement Hurdles
Reporter: Stephen J. Williams, PhD
9:25- 10:15 Liquid Biopsy and Gene Testing vs. Reimbursement Hurdles
Genetic testing, whether broad-scale or single gene-testing, is being ordered by an increasing number of oncologists, but in many cases, patients are left to pay for these expensive tests themselves. How can this dynamic be shifted? What can be learned from the success stories?
Moderator: Shoshannah Roth, Assistant Director of Health Technology Assessment and Information Services, ECRI Institute @Ecri_Institute
Speakers:
Rob Dumanois, Manager, Reimbursement Strategy, Thermo Fisher Scientific
Eugean Jiwanmall, Senior Research Analyst for Medical Policy & Technology Evaluation, Independence Blue Cross @IBX
Michael Nall, President and Chief Executive Officer, Biocept
Michael: There is a wide range of liquid biopsy services out there. There are screening companies, but they are young and need lots of data to develop a pan-diagnostic test. Most liquid biopsy is used for predictive analysis, especially therapeutic monitoring. Sometimes solid biopsies are impossible, limited, or unreliable due to metastasis or hard-to-biopsy tissues like lung.
Eugean: Circulating tumor cells and ctDNA are the only FDA-approved liquid biopsies. However you then choose to evaluate the liquid biopsy (PCR, NGS, FISH, etc.) helps determine what reimbursement options are available.
Rob: Adoption of reimbursement for liquid biopsy is moving faster in Europe than the US. It is possible in US that there may be changes to the payment in one to two years though.
Michael: China is adopting liquid biopsy rapidly. Patients are demanding this in China.
Reimbursement
Eugean: For IBX to make better decisions, we need more clinical trials correlating results with treatment outcomes. Most of the major cancer networks, like NCCN, ASCO, and CAP, have only recommendations, not approved guidelines, at this point. From his perspective on lung cancer, NCCN just makes a suggestion regarding EGFR mutations; only the companion diagnostic is approved by the FDA.
Michael: Fine-needle biopsies are usually needed by the pathologist anyway before going to liquid biopsy, since one needs to know the underlying mutations in the original tumor; that is just how it is done in most cancer centers.
Eugean: Whatever the established way of doing things, you have to outperform the clinical results of the old method for adoption of a newer method.
Reimbursement issues have driven a need for more research into the clinical validity and utility of predictive and therapeutic markers for liquid biopsies. However, although many academic centers try to partner with Biocept, Biocept has limited funds and must concentrate on only a few trials. The different payers use different evidence-based methods to evaluate liquid biopsy markers. ECRI also has a database of LB markers built on evidence-based criteria. IBX does see consistency among payers as far as decisions and policy.
NGS in liquid biopsy
Rob: There is a path to coverage, especially through the FDA. If you have an FDA-cleared NGS test, it will be covered. These are long and difficult paths to reimbursement for NGS, but it is feasible. The Medicare line of IBX covers this testing; however, on the commercial side they cannot cover it. @IBX: for colon cancer, only KRAS or NRAS has clinical utility, plus only a handful of other cancer-related genes for other cancers. For a companion diagnostic built into that Dx, do the other markers in the panel cost too much?
Please follow on Twitter using the following #hash tags and @pharma_BI
Live Conference Coverage @Medcitynews Converge 2018 Philadelphia: Early Diagnosis Through Predictive Biomarkers, NonInvasive Testing, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)
5:00 – 5:45 PM Early Diagnosis Through Predictive Biomarkers, NonInvasive Testing
Reporter: Stephen J. Williams, Ph.D.
Diagnosing cancer early is often the difference between survival and death. Hear from experts regarding the new and emerging technologies that form the next generation of cancer diagnostics.
Bonnie Anderson’s company, Veracyte, produces genomic tests for thyroid and other cancer diagnoses. Kevin Hrusovsky’s Precision Health uses peer-reviewed, evidence-based medicine to inform precision medicine decisions.
Bonnie: The aim is to get a true diagnosis. Getting tumor tissue is paramount, as is properly preserved tissue. They use deep RNA sequencing and machine learning in their clinically approved tests.
Kevin: A serial biospace entrepreneur. Cancer and neurologic disease have been the two areas in which it has been hardest to get reproducible, validated biomarkers of early disease. He concentrates on protein biomarkers.
Heather: FDA has recently approved drugs for early disease intervention. However the use of biomarkers can go beyond patient stratification in clinical trials.
Kevin: There are 15 approved drugs for MS, but the markers are scans looking for brain atrophy, which is too late an endpoint. We need biomarkers of early disease progression, which pharma can target and/or use as clinical trial endpoints.
Bonnie: It is an exciting time in the early diagnostics field. She prefers transcriptomics to DNA-based methods such as WES or WGS (whole-exome or whole-genome sequencing). It was critical to show data on the cost savings imparted by their transcriptomic thyroid cancer diagnostic test for payers to consider the test eligible for reimbursement.
Kevin: There have been 20 million CAT scans for cancer, but it is estimated that 90% of these scans led to misdiagnosis. Biomarker development has revolutionized diagnostics in this disease area. They have developed a breakthrough panel of ten protein biomarkers in serum, which he estimates may replace 5 million mammograms.
All panelists agreed on the importance of regulatory compliance and that new research should focus on early detection. In addition, they believe that Dr. Gottlieb’s appointment to the FDA is a positive for the biomarker development field, as he understands the potential and importance of early detection and prevention of disease. Kevin also felt Dr. Gottlieb understands the importance of incorporating biomarkers as endpoints in clinical trials. Over 750 phase 1, 2, and 3 clinical trials use biomarker endpoints, but pharma companies still need to prove the biomarkers’ clinical relevance to the FDA. They also agreed it would be helpful to involve advocacy groups in putting more pressure on healthcare providers and policy makers regarding the importance of diagnostics as a preventative measure.
In addition, the discovery and use of biomarkers as disease endpoints has led to a resurgence of Alzheimer’s disease drug development by companies which have previously given up on these type of neurodegenerative diseases.
Kevin feels proteomics offers great advantages over DNA-based diagnostics, especially in cancer such as ovarian cancer, where a high degree of specificity for a diagnostic test is required to ascertain if a woman should undergo prophylactic oophorectomy. He suggests that a new blood-based protein biomarker panel is being developed for early detection of some forms of ovarian cancer.
Please follow on Twitter using the following #hash tags and @pharma_BI
#MCConverge
#cancertreatment
#healthIT
#innovation
#precisionmedicine
#healthcaremodels
#personalizedmedicine
#healthcaredata
And at the following handles:
@pharma_BI
@medcitynews
Please see related articles on Live Coverage of Previous Meetings on this Open Access Journal
SNP-based Study on high BMI exposure confirms CVD and DM Risks – no associations with Stroke, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)
SNP-based Study on high BMI exposure confirms CVD and DM Risks – no associations with Stroke
Reporter: Aviva Lev-Ari, PhD, RN
Genes Affirm: High BMI Carries Weighty Heart, Diabetes Risk – Mendelian randomization study adds to ‘burgeoning evidence’
by Crystal Phend, Senior Associate Editor, MedPage Today, July 05, 2017
The “genetically instrumented” measure of high BMI exposure — calculated based on 93 single-nucleotide polymorphisms associated with BMI in prior genome-wide association studies — was associated with the following risks (odds ratios given per standard deviation higher BMI):
Hypertension (OR 1.64, 95% CI 1.48-1.83)
Coronary heart disease (CHD; OR 1.35, 95% CI 1.09-1.69)
Type 2 diabetes (OR 2.53, 95% CI 2.04-3.13)
Systolic blood pressure (β 1.65 mm Hg, 95% CI 0.78-2.52 mm Hg)
Diastolic blood pressure (β 1.37 mm Hg, 95% CI 0.88-1.85 mm Hg)
However, there were no associations with stroke, Donald Lyall, PhD, of the University of Glasgow, and colleagues reported online in JAMA Cardiology.
“The main advantage of an MR approach is that certain types of study bias can be minimized,” the team noted. “Because DNA is stable and randomly inherited, genetic variation can be used as a proxy for lifetime BMI, mitigating limitations such as reverse causality and confounding that hamper observational analyses of obesity and its consequences.”
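The "genetically instrumented" BMI measure above is, at its core, a weighted genetic risk score: a sum of effect-allele dosages weighted by the per-allele effect sizes from GWAS. A toy sketch follows; the SNP names and weights here are invented for illustration, whereas the study used 93 real BMI-associated SNPs.

```python
# Toy weighted genetic risk score of the kind used as an instrument in
# Mendelian randomization. SNP IDs and per-allele weights are made up.
snp_weights = {"rs0001": 0.10, "rs0002": 0.05, "rs0003": 0.08}

def risk_score(genotype: dict) -> float:
    # genotype maps each SNP to its effect-allele dosage (0, 1, or 2)
    return sum(snp_weights[snp] * dose for snp, dose in genotype.items())

person = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(round(risk_score(person), 2))  # 0.25
```

Because the alleles are assigned at conception, this score is (ideally) independent of lifestyle confounders, which is what lets the odds ratios above be read as closer to causal effects of BMI.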
Detecting Multiple Types of Cancer With a Single Blood Test
Reporter and Curator: Irina Robu, PhD
Monitoring cancer patients and evaluating their response to treatment can sometimes involve invasive procedures, including surgery.
Liquid biopsies have become something of a Holy Grail in cancer treatment among physicians, researchers, and companies gambling big on the technology. Unlike traditional biopsies, which involve invasive surgery, liquid biopsies rely on an ordinary blood draw. Developments in sequencing the human genome, permitting researchers to detect the genetic mutations of cancers, have made the tests conceivable. Some 38 companies in the US alone are working on liquid biopsies, trying to analyze blood for fragments of DNA shed by dying tumor cells.
Early research on the liquid biopsy has concentrated heavily on patients with later-stage cancers who have undergone treatments, including chemotherapy, radiation, surgery, immunotherapy, or drugs that target molecules involved in the growth, progression, and spread of cancer. For cancer patients undergoing treatment, liquid biopsies could spare them some of the painful, expensive, and risky tissue tumor biopsies and reduce reliance on CT scans. The tests can rapidly evaluate the efficacy of surgery or other treatment, while conventional biopsies and CT scans can remain inconclusive because of scar tissue near the tumor site.
As recently as a few years ago, liquid biopsies were hardly used outside of research. At the moment, thousands of the tests are being used in clinical practices in the United States and abroad, including at the M.D. Anderson Cancer Center in Houston; the University of California, San Diego; the University of California, San Francisco; the Duke Cancer Institute; and several other cancer centers.
For patients from whom physicians cannot get a tissue biopsy, the liquid biopsy could prove a safe and effective alternative that helps determine whether treatment is eradicating the cancer. A startup, Miroculus, has developed a cheap, open-source device that can test blood for several types of cancer at once. The platform, called Miriam, finds cancer by extracting RNA from blood and spreading it across plates that look for specific types of microRNA. The device is then connected to a smartphone, which sends the information to an online database and compares the microRNA found in the patient’s blood to known patterns indicating different types of early-stage cancer, which can reduce unnecessary cancer screenings.
Nevertheless, experts warn that more studies are needed to determine the accuracy of the test, exactly which cancers it can detect, at what stages, and whether it improves care or survival rates.
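The pattern-matching step behind a platform like Miriam can be sketched simply: compare a patient's microRNA expression profile against known cancer signatures and report the best match. The profiles, signatures, and similarity metric below are illustrative assumptions, not Miroculus's actual method.

```python
import math

# Hypothetical microRNA signatures: each value is a normalized expression
# level for one microRNA. Real panels use many more markers.
signatures = {
    "lung": [0.9, 0.1, 0.7, 0.2],
    "colon": [0.2, 0.8, 0.1, 0.9],
}

def correlation(a, b):
    # Pearson correlation between two expression profiles.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def classify(profile):
    # Report the signature the patient's profile resembles most.
    return max(signatures, key=lambda k: correlation(profile, signatures[k]))

patient = [0.85, 0.15, 0.65, 0.25]
print(classify(patient))  # lung
```

In practice the comparison is done server-side against a large reference database, and a single best match would be reported with a confidence level rather than as a hard call.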
Accurate detection of somatic mutations has proven to be challenging in cancer NGS analysis, due to tumor heterogeneity and cross-contamination between tumor and matched normal samples. Oftentimes, a somatic caller that performs well for one tumor may not for another.
In this webinar we will introduce SomaticSeq, a tool within the Bina Genomic Management Solution (Bina GMS) designed to boost the accuracy of somatic mutation detection with a machine learning approach. You will learn:
Benchmarking of leading somatic callers, namely MuTect, SomaticSniper, VarScan2, JointSNVMix2, and VarDict
Integration of such tools and how accuracy is achieved using a machine learning classifier that incorporates over 70 features with SomaticSeq
Accuracy validation including results from the ICGC-TCGA DREAM Somatic Mutation Calling Challenge, in which Bina placed 1st in indel calling and 2nd in SNV calling in stage 5
Creation of a new SomaticSeq classifier utilizing your own dataset
Review of the somatic workflow within the Bina Genomic Management Solution
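The core SomaticSeq idea in the list above is to treat the verdicts of several somatic callers, together with other variant-level features, as inputs to a machine-learning classifier trained against a truth set. The sketch below hand-rolls a tiny logistic regression on synthetic data to show the shape of the approach; the real tool uses 70+ features derived from MuTect, SomaticSniper, VarScan2, JointSNVMix2, and VarDict, and these feature values and labels are made up.

```python
import math

# Each row: [MuTect call, SomaticSniper call, VarScan2 call, allele frequency]
X = [[1, 1, 1, 0.40], [1, 0, 1, 0.25], [0, 0, 1, 0.02],
     [0, 0, 0, 0.01], [1, 1, 0, 0.30], [0, 1, 0, 0.03]]
y = [1, 1, 0, 0, 1, 0]  # truth labels from a validated call set (synthetic)

# Train a logistic regression with plain stochastic gradient descent.
w = [0.0] * 4
b = 0.0
for _ in range(2000):
    for xi, yi in zip(X, y):
        p = 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
        g = p - yi                      # gradient of the log-loss
        w = [wj - 0.1 * g * xj for wj, xj in zip(w, xi)]
        b -= 0.1 * g

def predict(features):
    z = sum(wj * xj for wj, xj in zip(w, features)) + b
    return 1 if z > 0 else 0

# A candidate where the callers disagree: the learned model arbitrates.
print(predict([1, 0, 1, 0.35]))
```

The benefit over simple majority voting is that the classifier learns how much to trust each caller (and each supporting feature) from data, which is how an ensemble can outperform every individual caller.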
Speakers:
Li Tai Fang
Sr. Bioinformatics Scientist
Bina Technologies, Part of
Roche Sequencing
Anoop Grewal
Product Marketing Manager
Bina Technologies, Part of
Roche Sequencing
An open mind and collaborative spirit have taken Hans Clevers on a journey from medicine to developmental biology, gastroenterology, cancer, and stem cells.
“I have had to talk a lot about my science recently and it’s made me think about how science works,” says Hans Clevers. “Scientists are trained to think science is driven by hypotheses, but for [my lab], hypothesis-driven research has never worked. Instead, it has been about trying to be as open-minded as possible—which is not natural for our brains,” adds the Utrecht University molecular genetics professor. “The human mind is such that it tries to prove it’s right, so pursuing a hypothesis can result in disaster. My advice to my own team and others is to not preformulate an answer to a scientific question, but just observe and never be afraid of the unknown. What has worked well for us is to keep an open mind and do the experiments. And find a collaborator if it is outside our niche.”
“One thing I have learned is that hypothesis-driven research tends not to be productive when you are in an unknown territory.”
Clevers entered medical school at Utrecht University in The Netherlands in 1978 while simultaneously pursuing a master’s degree in biology. Drawn to working with people in the clinic, Clevers had a training position in pediatrics lined up after medical school, but then mentors persuaded him to spend an additional year converting the master’s degree to a PhD in immunology. “At the end of that year, looking back, I got more satisfaction from the research than from seeing patients.” Clevers also had an aptitude for benchwork, publishing four papers from his PhD year. “They were all projects I had made up myself. The department didn’t do the kind of research I was doing,” he says. “Now that I look back, it’s surprising that an inexperienced PhD student could come up with a project and publish independently.”
Clevers studied T- and B-cell signaling; he set up assays to visualize calcium ion flux and demonstrated that the ions act as messengers to activate human B cells, signaling through antibodies on the cell surface. “As soon as the experiment worked, I got T cells from the lab next door and did the same experiment. That was my strategy: as soon as something worked, I would apply it elsewhere and didn’t stop just because I was a B-cell biologist and not a T-cell biologist. What I learned then, that I have continued to benefit from, is that a lot of scientists tend to adhere to a niche. They cling to these niches and are not that flexible. You think scientists are, but really most are not.”
Here, Clevers talks about promoting a collaborative spirit in research, the art of doing a pilot experiment, and growing miniature organs in a dish.
Clevers Creates
Re-search? Clevers was born in Eindhoven, in the south of The Netherlands. The town was headquarters to Philips Electronics, where his father worked as a businessman, and his mother took care of Clevers and his three brothers. Clevers did well in school but his passion was sports, especially tennis and field hockey, “a big thing in Holland.” Then in 1975, at age 18, he moved to Utrecht University, where he entered an intensive, biology-focused program. “I knew I wanted to be a biology researcher since I was young. In Dutch, the word for research is ‘onderzoek’ and I knew the English word ‘research’ and had wondered why there was the ‘re’ in the word, because I wanted to search but I didn’t want to do re-search—to find what someone else had already found.”
Opportunity to travel. “I was very disappointed in my biology studies, which were old-fashioned and descriptive,” says Clevers. He thought medicine might be more interesting and enrolled in medical school while still pursuing a master’s degree in biology at Utrecht. For the master’s, Clevers had to do three rotations. He spent a year at the International Laboratory for Research on Animal Diseases (ILRAD) in Nairobi, Kenya, and six months in Bethesda, Maryland, at the National Institutes of Health. “Holland is really small, so everyone travels.” Clevers saw those two rotations more as travel explorations. In Nairobi, he went on safaris and explored the country in Land Rovers borrowed from the institute. While in Maryland in 1980, Clevers—with the consent of his advisor, who thought it was a good idea for him to get a feel for the U.S.—flew to Portland, Oregon, and drove back to Boston with a musician friend along the Canadian border. He met the fiancé of political activist and academic Angela Davis in New York City and even stayed in their empty apartment there.
Life and lab lessons. Back in Holland, Clevers joined Rudolf Eugène Ballieux’s lab at Utrecht University to pursue his PhD, for which he studied immune cell signaling. “I didn’t learn much science from him, but I learned that you always have to create trust and to trust people around you. This became a major theme in my own lab. We don’t distrust journals or reviewers or collaborators. We trust everyone and we share. There will be people who take advantage, but there have only been a few of those. So I learned from Ballieux to give everyone maximum trust and then change this strategy only if they fail that trust. We collaborate easily because we give out everything and we also easily get reagents and tools that we may need. It’s been valuable to me in my career. And it is fun!”
Clevers Concentrates
On a mission. “Once I decided to become a scientist, I knew I needed to train seriously. Up to that point, I was totally self-trained.” From an extensive reading of the immunology literature, Clevers became interested in how T cells recognize antigens, and headed off to spend a postdoc studying the problem in Cox Terhorst’s lab at Dana-Farber Cancer Institute in Boston. “Immunology was young, but it was very exciting and there was a lot to discover. I became a professional scientist there and experienced how tough science is.” In 1988, Clevers cloned and characterized the gene for a component of the T-cell receptor (TCR) called CD3-epsilon, which binds antigen and activates intracellular signaling pathways.
On the fast track in Holland. Clevers returned to Utrecht University in 1989 as a professor of immunology. Within one month of setting up his lab, he had two graduate students and a technician, and the lab had cloned the first T cell–specific transcription factor, which they called TCF-1, in human T cells. When his former thesis advisor retired, Clevers was asked, at age 33, to become head of the immunology department. While the appointment was high-risk for him and for the department, Clevers says, he was chosen because he was good at multitasking and because he got along well with everyone.
Problem-solving strategy. “My strategy in research has always been opportunistic. One thing I have learned is that hypothesis-driven research tends not to be productive when you are in an unknown territory. I think there is an art to doing pilot experiments. So we have always just set up systems in which something happens and then you try and try things until a pattern appears and maybe you formulate a small hypothesis. But as soon as it turns out not to be exactly right, you abandon it. It’s a very open-minded type of research where you question whether what you are seeing is a real phenomenon without spending a year on doing all of the proper controls.”
Trial and error. Clevers’s lab found that while TCF-1 bound to DNA, it did not alter gene expression, despite the researchers’ tinkering with promoter and enhancer assays. “For about five years this was a problem. My first PhD students were leaving and they thought the whole TCF project was a failure,” says Clevers. His lab meanwhile cloned TCF homologs from several model organisms and made many reagents including antibodies against these homologs. To try to figure out the function of TCF-1, the lab performed a two-hybrid screen and identified components of the Wnt signaling pathway as binding partners of TCF-1. “We started to read about Wnt and realized that you study Wnt not in T cells but in frogs and flies, so we rapidly transformed into a developmental biology lab. We showed that we held the key for a major issue in developmental biology, the final protein in the Wnt cascade: TCF-1 binds β-catenin when β-catenin becomes available and activates transcription.” In 1996, Clevers published the mechanism of how the TCF-1 homolog in Xenopus embryos, called XTcf-3, is integrated into the Wnt signaling pathway.
Clevers Catapults
COURTESY OF HANS CLEVERS AND JEROEN HUIJBEN, NYMUS 3D
Crypt building and colon cancer.
Clevers next collaborated with Bert Vogelstein’s lab at Johns Hopkins, linking TCF to Wnt signaling in colon cancer. In colon cancer cell lines with mutated forms of the tumor suppressor gene APC, the APC protein can’t rein in β-catenin, which accumulates in the cytoplasm, forms a complex with TCF-4 (later renamed TCF7L2) in the nucleus, and can initiate colon cancer by changing gene expression. Then, the lab showed that Wnt signaling is necessary for self-renewal of adult stem cells, as mice missing TCF-4 do not have intestinal crypts, the site in the gut where stem cells reside. “This was the first time Wnt was shown to play a role in adults, not just during development, and to be crucial for adult stem cell maintenance,” says Clevers. “Then, when I started thinking about studying the gut, I realized it was by far the best way to study stem cells. And I also realized that almost no one in the world was studying the healthy gut. Almost everyone who researched the gut was studying a disease.” The main advantages of the murine model are rapid cell turnover and the presence of millions of stereotypic crypts throughout the entire intestine.
Against the grain. In 2007, Nick Barker, a senior scientist in the Clevers lab, identified the Wnt target gene Lgr5 as a unique marker of adult stem cells in several epithelial organs, including the intestine, hair follicle, and stomach. In the intestine, the gene codes for a plasma membrane protein on crypt stem cells that enable the intestinal epithelium to self-renew, but can also give rise to adenomas of the gut. Upon making mice with adult stem cell populations tagged with a fluorescent Lgr5-binding marker, the lab helped to overturn assumptions that “stem cells are rare, impossible to find, quiescent, and divide asymmetrically.”
On to organoids. Once the lab could identify adult stem cells within the crypts of the gut, postdoc Toshiro Sato discovered that a single stem cell, in the presence of Matrigel and just three growth factors, could generate a miniature crypt structure—what is now called an organoid. “Toshi is very Japanese and doesn’t always talk much,” says Clevers. “One day I had asked him, while he was at the microscope, if the gut stem cells were growing, and he said, ‘Yes.’ Then I looked under the microscope and saw the beautiful structures and said, ‘Why didn’t you tell me?’ and he said, ‘You didn’t ask.’ For three months he had been growing them!” The lab has since also grown mini-pancreases, -livers, -stomachs, and many other mini-organs.
Tumor Organoids. Clevers showed that organoids can be grown from diseased patients’ samples, a technique that could be used in the future to screen drugs. The lab is also building biobanks of organoids derived from tumor samples and adjacent normal tissue, which could be especially useful for monitoring responses to chemotherapies. “It’s a similar approach to getting a bacterium cultured to identify which antibiotic to take. The most basic goal is not to give a toxic chemotherapy to a patient who will not respond anyway,” says Clevers. “Tumor organoids grow slower than healthy organoids, which seems counterintuitive, but with cancer cells, often they try to divide and often things go wrong because they don’t have normal numbers of chromosomes and [have] lots of mutations. So, I am not yet convinced that this approach will work for every patient. Sometimes, the tumor organoids may just grow too slowly.”
Selective memory. “When I received the Breakthrough Prize in 2013, I invited everyone who has ever worked with me to Amsterdam, about 100 people, and the lab organized a symposium where many of the researchers gave an account of what they had done in the lab,” says Clevers. “In my experience, my lab has been a straight line from cloning TCF-1 to where we are now. But when you hear them talk it was ‘Hans told me to try this and stop this’ and ‘Half of our knockout mice were never published,’ and I realized that the lab is an endless list of failures,” Clevers recalls. “The one thing we did well is that we would start something and, as soon as it didn’t look very good, we would stop it and try something else. And the few times when we seemed to hit gold, I would regroup my entire lab. We just tried a lot of things, and the 10 percent of what worked, those are the things I remember.”
Greatest Hits
Cloned the first T cell–specific transcription factor, TCF-1, and identified homologous genes in model organisms including the fruit fly, frog, and worm
Found that transcriptional activation by the abundant β-catenin/TCF-4 [TCF7L2] complex drives cancer initiation in colon cells missing the tumor suppressor protein APC
First to extend the role of Wnt signaling from developmental biology to adult stem cells by showing that the two Wnt pathway transcription factors, TCF-1 and TCF-4, are necessary for maintaining the stem cell compartments in the thymus and in the crypt structures of the small intestine, respectively
Identified Lgr5 as an adult stem cell marker of many epithelial stem cells including those of the colon, small intestine, hair follicle, and stomach, and found that Lgr5-expressing crypt cells in the small intestine divide constantly and symmetrically, disproving the common belief that stem cell division is asymmetrical and uncommon
Established a three-dimensional, stable model, the “organoid,” grown from adult stem cells, to study diseased patients’ tissues from the gut, stomach, liver, and prostate
Regenerative Medicine Comes of Age
“Anti-Aging Medicine” Sounds Vaguely Disreputable, So Serious Scientists Prefer to Speak of “Regenerative Medicine”
Induced pluripotent stem cells (iPSCs) and genome-editing techniques have facilitated manipulation of living organisms in innumerable ways at the cellular and genetic levels, respectively, and will underpin many aspects of regenerative medicine as it continues to evolve.
An attitudinal change is also occurring. Experts in regenerative medicine have increasingly begun to embrace the view that comprehensively repairing the damage of aging is a practical and feasible goal.
A notable proponent of this view is Aubrey de Grey, Ph.D., a biomedical gerontologist who has pioneered a regenerative medicine approach called Strategies for Engineered Negligible Senescence (SENS). He works to “develop, promote, and ensure widespread access to regenerative medicine solutions to the disabilities and diseases of aging” as CSO and co-founder of the SENS Research Foundation. He is also the editor-in-chief of Rejuvenation Research, published by Mary Ann Liebert.
Dr. de Grey points out that stem cell treatments for age-related conditions such as Parkinson’s are already in clinical trials, and immune therapies to remove molecular waste products in the extracellular space, such as amyloid in Alzheimer’s, have succeeded in such trials. Recently, there has been progress in animal models in removing toxic cells that the body is failing to kill. The most encouraging work is in cancer immunotherapy, which is rapidly advancing after decades in the doldrums.
Many damage-repair strategies are at an early stage of research. Although these strategies look promising, they are handicapped by a lack of funding. If that does not change soon, the scientific community is at risk of failing to capitalize on the relevant technological advances.
Regenerative medicine has moved beyond boutique applications. In degenerative disease, cells lose their function or suffer elimination because they harbor genetic defects. iPSC therapies have the potential to be curative, replacing the defective cells and eliminating symptoms in their entirety. One of the biggest hurdles to commercialization of iPSC therapies is manufacturing.
Building Stem Cell Factories
Cellular Dynamics International (CDI) has been developing clinically compatible induced pluripotent stem cells (iPSCs) and iPSC-derived human retinal pigment epithelial (RPE) cells. CDI’s MyCell Retinal Pigment Epithelial Cells are part of a possible therapy for macular degeneration. They can be grown on bioengineered, nanofibrous scaffolds, and then the RPE cell–enriched scaffolds can be transplanted into patients’ eyes. In this pseudo-colored image, RPE cells are shown growing over the nanofibers. Each cell has thousands of “tongue” and “rod” protrusions that could naturally support rod and cone cells in the eye.
“Now that an infrastructure is being developed to make unlimited cells for the tools business, new opportunities are being created. These cells can be employed in a therapeutic context, and they can be used to understand the efficacy and safety of drugs,” asserts Chris Parker, executive vice president and CBO, Cellular Dynamics International (CDI). “CDI has the capability to make a lot of cells from a single iPSC line that represents one person (a capability termed scale-up) as well as the capability to do it in parallel for multiple individuals (a capability termed scale-out).”
Minimally manipulated adult stem cells have progressed relatively quickly to the clinic. In this scenario, cells are taken out of the body, expanded unchanged, then reintroduced. More preclinical rigor applies to potential iPSC therapy. In this case, hematopoietic blood cells are used to make stem cells, which are manufactured into the cell type of interest before reintroduction. Preclinical tests must demonstrate that iPSC-derived cells perform as intended, are safe, and possess little or no off-target activity.
For example, CDI developed a Parkinsonian model in which iPSC-derived dopaminergic neurons were introduced to primates. The model showed engraftment and innervation, and it appeared to be free of proliferative stem cells.
“You will see iPSCs first used in clinical trials as a surrogate to understand efficacy and safety,” notes Mr. Parker. “In an ongoing drug-repurposing trial with GlaxoSmithKline and Harvard University, iPSC-derived motor neurons will be produced from patients with amyotrophic lateral sclerosis and tested in parallel with the drug.” CDI has three cell-therapy programs in their commercialization pipeline focusing on macular degeneration, Parkinson’s disease, and postmyocardial infarction.
Keeping an Eye on Aging Eyes
The California Project to Cure Blindness is evaluating a stem cell–based treatment strategy for age-related macular degeneration. The strategy involves growing retinal pigment epithelium (RPE) cells on a biostable, synthetic scaffold, then implanting the RPE cell–enriched scaffold to replace RPE cells that are dying or dysfunctional. One of the project’s directors, Dennis Clegg, Ph.D., a researcher at the University of California, Santa Barbara, provided this image, which shows stem cell–derived RPE cells. Cell borders are green, and nuclei are red.
The eye has multiple advantages over other organ systems for regenerative medicine. Advanced surgical methods can access the back of the eye, noninvasive imaging methods can follow the transplanted cells, good outcome parameters exist, and relatively few cells are needed.
These advantages have attracted many groups to tackle ocular disease, in particular age-related macular degeneration, the leading cause of blindness in the elderly in the United States. Most cases of age-related macular degeneration are thought to be due to the death or dysfunction of cells in the retinal pigment epithelium (RPE). RPE cells are crucial support cells for the rod and cone photoreceptors. When RPE cells stop working or die, the photoreceptors die and a vision deficit results.
A regenerated and restored RPE might prevent the irreversible loss of photoreceptors, possibly via the transplantation of functionally polarized RPE monolayers derived from human embryonic stem cells. This approach is being explored by the California Project to Cure Blindness, a collaborative effort involving the University of Southern California (USC), the University of California, Santa Barbara (UCSB), the California Institute of Technology, City of Hope, and Regenerative Patch Technologies.
The project, which is funded by the California Institute of Regenerative Medicine (CIRM), started in 2010, and an IND was filed in early 2015. Clinical trial recruitment has begun.
One of the project’s leaders is Dennis Clegg, Ph.D., Wilcox Family Chair in BioMedicine, UCSB. His laboratory developed the protocol to turn undifferentiated H9 embryonic stem cells into a homogenous population of RPE cells.
“These are not easy experiments,” remarks Dr. Clegg. “Figuring out the biology and how to make the cell of interest is a challenge that everyone in regenerative medicine faces. About 100,000 RPE cells will be grown as a sheet on a 3 × 5 mm biostable, synthetic scaffold, and then implanted in the patients to replace the cells that are dying or dysfunctional. The idea is to preserve the photoreceptors and to halt disease progression.”
Moving therapies such as this RPE treatment from concept to clinic is a huge team effort and requires various kinds of expertise. Besides benefitting from Dr. Clegg’s contribution, the RPE project incorporates the work of Mark Humayun, M.D., Ph.D., co-director of the USC Eye Institute, director of the USC Institute for Biomedical Therapeutics, and recipient of the National Medal of Technology and Innovation, and David Hinton, Ph.D., a researcher at USC who has studied how activated RPE cells can alter the local retinal microenvironment.
(Left): Control rabbit brain, showing neuropil near the CA1 band in the hippocampus. (Right): Vitrified rabbit brain, same location. Synapses, vesicles, and microfilaments are clear. The myelinated axon shows excellent preservation. (credit: Robert L. McIntyre and Gregory M. Fahy/Cryobiology)
The Brain Preservation Foundation (BPF) has announced that a team at 21st Century Medicine led by Robert McIntyre, Ph.D., has won the Small Mammal Brain Preservation Prize, which carries an award of $26,735.
The Small Mammal Brain Preservation Prize was awarded after the determination that the protocol developed by McIntyre, termed Aldehyde-Stabilized Cryopreservation, was able to preserve an entire rabbit brain with well-preserved ultrastructure, including cell membranes, synapses, and intracellular structures such as synaptic vesicles (full protocol here).
The judges for the prize were Kenneth Hayworth, Ph.D., Brain Preservation Foundation president and neuroscientist at the Howard Hughes Medical Institute, and Sebastian Seung, Ph.D., professor at the Princeton Neuroscience Institute and Computer Science Department.
First preservation of the connectome
“This is a milestone in the development of brain preservation techniques: it is the first time that the preservation of the connectome has been demonstrated in a whole brain (prior to this only small parts of brains have been preserved to this level of detail),” said the BPF announcement.
“Current models of the brain suggest that the connectome contains all the information necessary for personal identity (i.e., memory and personality). This technique is not the same as conventional cryonics (rapidly freezing the brain), which has never demonstrated preservation of the ultrastructure of the brain. Thus the winning of this prize represents a significant advance in the methods available for large-scale studies of the connectome and could lead to procedures that preserve a complete human brain.”
Kenneth Hayworth (KH), president of the Brain Preservation Foundation (BPF), and Michael Shermer, member of the BPF advisory board, witnessed (on Sept. 25, 2015) the full Aldehyde-Stabilized Cryopreservation surgical procedure performed on this rabbit at the laboratories of 21st Century Medicine under the direction of 21CM lead researcher Robert McIntyre. This included the live rabbit’s carotid arteries being perfused with glutaraldehyde and subsequent perfusion with cryoprotectant agent (CPA). KH witnessed this rabbit brain being put in -135 degrees C storage, its removal from storage the following day (verifying that it had vitrified solid), and all subsequent tissue processing steps involved in the evaluation process. (credit: The Brain Preservation Foundation)
“The key breakthrough was the rapid perfusion of a deadly chemical fixative (glutaraldehyde) through the brain’s vascular system, instantly stopping metabolic decay and fixing all proteins in place by covalent crosslinks. This stabilized the tissue and vasculature so that cryoprotectant could be perfused at an optimal temperature and rate. The result was an intact rabbit brain filled with such a high concentration of cryoprotectants that it could be stored as a solid ‘vitrified’ block at a temperature of -135 degrees Celsius.”
Winning the award also required that the procedure be published in a peer-reviewed scientific publication. McIntyre satisfied this requirement and published the protocol in an open-access paper in the journal Cryobiology.
3D microscope evaluation of the rabbit brain tissue preservation (credit: Brain Preservation Foundation)
The Brain Preservation Foundation plans to continue to promote the idea that brain preservation following legal death, by using scientifically validated techniques, is a reasonable choice for consenting individuals to make. Focus now shifts to the final Large Mammal phase of the contest, which requires an intact pig brain to be preserved with similar fidelity in a manner that could be directly adapted to terminal patients in a hospital setting.
The 21st Century Medicine team has recently submitted to the BPF such a preserved pig brain for official evaluation. Lead researcher Robert McIntyre has started Nectome to further develop this method.
“Of course, [the demonstrated brain preservation procedure] is only useful if you think all the relevant information is preserved in the fixation,” said Anders Sandberg, Ph.D., of the Future of Humanity Institute/Oxford Martin School. “Protein states and small molecule chemical information may be messed up.”
Proponents of cryonics have long sought a technique that could put terminal patients into long-term stasis, the goal being a form of medical time travel in which patients are stabilized against decay with the hope of being biologically revived and cured by future technologies. Despite decades of research, this goal of reversible cryopreservation remains far out of reach: too much damage occurs during the cryopreservation itself.
This has led a new generation of researchers to focus on a more achievable and demonstrable goal: preservation of brain structure only. Specifically, preservation of the delicate pattern of synaptic connections (the “connectome”) which neuroscience contends encodes a person’s memory and identity. Instead of biological revival, these new researchers often envision a future “synthetic revival” comprising nanometer-scale scanning of the preserved brain to serve as the basis for mind uploading.
This shift in focus toward “synthetic” revival has completely transformed the cryonics debate, opening up new avenues of research and bringing it squarely within the purview of today’s scientific investigation. Hundreds of neuroscience papers have detailed how memory and personality are encoded structurally in synaptic connections, and recent advances in connectome imaging and brain simulation can be seen as a preview of the synthetic revival technologies to come.
Until now, the crucial unanswered questions were “How well does cryonics preserve the brain’s connectome?” and “Are there alternatives/modifications to cryonics that might preserve the connectome better and in a manner that could be demonstrated today?” The Brain Preservation Prize was put forward in 2010 to spur research that could definitively answer these questions. Now, five years later, these questions have been answered: Traditional cryonics procedures were not able to demonstrate (to the BPF’s satisfaction) preservation of the connectome, but the newly invented “Aldehyde-Stabilized Cryopreservation” technique was.
This result directly answers what has for decades been the main skeptical and scientific criticism against cryonics: that it does not provably preserve the delicate synaptic circuitry of the brain. As such, this research sets the stage for renewed interest within the scientific community, and offers a potential challenge to medical researchers to develop a human surgical procedure based on these successful animal experiments.
Abstract of Aldehyde-stabilized cryopreservation
We describe here a new cryobiological and neurobiological technique, aldehyde-stabilized cryopreservation (ASC), which demonstrates the relevance and utility of advanced cryopreservation science for the neurobiological research community. ASC is a new brain-banking technique designed to facilitate neuroanatomic research such as connectomics research, and has the unique ability to combine stable long term ice-free sample storage with excellent anatomical resolution. To demonstrate the feasibility of ASC, we perfuse-fixed rabbit and pig brains with a glutaraldehyde-based fixative, then slowly perfused increasing concentrations of ethylene glycol over several hours in a manner similar to techniques used for whole organ cryopreservation. Once 65% w/v ethylene glycol was reached, we vitrified brains at −135 °C for indefinite long-term storage. Vitrified brains were rewarmed and the cryoprotectant removed either by perfusion or gradual diffusion from brain slices. We evaluated ASC-processed brains by electron microscopy of multiple regions across the whole brain and by Focused Ion Beam Milling and Scanning Electron Microscopy (FIB-SEM) imaging of selected brain volumes. Preservation was uniformly excellent: processes were easily traceable and synapses were crisp in both species. Aldehyde-stabilized cryopreservation has many advantages over other brain-banking techniques: chemicals are delivered via perfusion, which enables easy scaling to brains of any size; vitrification ensures that the ultrastructure of the brain will not degrade even over very long storage times; and the cryoprotectant can be removed, yielding a perfusable aldehyde-preserved brain which is suitable for a wide variety of brain assays.
Totally weird – IOW those “covalent bonds” act like a preservation matrix. So this brain indeed has been “fixed” – just at a smaller scale and level.
A couple of other factors:
* Quite a lot of the brain that counts (memory) may be on a larger scale than this – and may be preserved. While it is not, per the Connectome idea, at the macro axon scale – it is a general idea that at the molecular scale, something “plays” through the consciousness mechanism (Search = Hameroff Memory.)
I personally suspect a DNA like encoding in an as yet unproven language software. Perhaps even multiple “scale” functionality that would be a combination of organelle specialization (perhaps time perception) and THEN the inter-connectedness.
* As for personality, I know that that is entirely reproducible – in spite of such extreme complexity – but that is a proof for another day.
Just for kicks, note how the “search” code above results in prefabricated libraries being sent to your mind.
Gorden Russell –
You had me until I got to this part: “…a deadly chemical fixative (glutaraldehyde) through the brain’s vascular system…”
So this process perfectly preserves your brain after killing it dead.
So in the future it can be scanned and printed out into a perfect copy — but the copy won’t be you, it’ll be somebody else who is just like you. You will still be dead.
I’d rather be a live brain in a jar atop a robot wired into the spinal column so that I could still have all of my senses while awaiting the time a human body can be regrown.
CT
We have to differentiate how we define “me” or “you”. Do we mean our memories (data) or consciousness (process)? Our memories, personality, knowledge… alone (e.g. while we sleep and are unconscious) are like fixed data until the brain (or a computer) begins to run and consciousness comes into existence.
We could copy the data to a computer (through scanning), which in the next step (after the simulation is beginning to operate) would create consciousness as well (defining itself as “me” or “you”). It wouldn’t be the same consciousness (process) due to other environmental inputs (and over time other memory/data- background). But the same is true for a biological based consciousness. My consciousness right now is not the consciousness anymore I had last year. It’s always a unique set-up.
From my point of view, the sentiment that there is some kind of metaphysical soul over an entire lifetime is an illusion based on the fact that we have memories, knowledge and personality (which we would have after the scanning process of our brain as well), that were formed in the past, and we are able to recreate and remember them (in subjectively altered form) in our current state of consciousness. As a result we conclude that we are/have the same state of consciousness as the past me, which is (as I see it) an illusion.
So if we would be able to make a perfect copy of our brain that is able to create consciousness (in any kind of computer substrate, digital, analog or quantum) it wouldn’t be more or less the me (the consciousness) at the present than my future me in 5 minutes or years would be (in its biological form). From my point of view, the status quo wouldn’t change.
It is a copy because maybe one day they can do it without killing the original. The only way out of this conundrum was explained to me on this web site a while back in comments: if they substituted every neuron in my brain one at a time over a certain timescale so that eventually my brain would be synthetic, “I” probably wouldn’t even notice.
But you are dreaming during your sleep.
Glutaraldehyde will put an end to all of your dreams.
A printed copy of you may have similar dreams, but not your dreams.