
Archive for the ‘ISO 10993 for Product Registration: FDA & CE Mark for Development of Medical Devices and Diagnostics’ Category

World’s first artificial pancreas

Reporter: Irina Robu, PhD

Diabetes is a lifelong condition in which the body either does not produce enough insulin (Type 1) or cannot use the insulin it produces effectively. Since there is no cure for diabetes, the artificial pancreas system comes as a relief for patients suffering from the disease.

The artificial pancreas, Medtronic’s MiniMed 670G hybrid closed-loop system, is the first FDA-approved device that measures glucose levels and delivers the appropriate dose of basal insulin. The system comprises the MiniMed 670G insulin pump, which is strapped to the body; an infusion patch that delivers insulin via catheter from the pump; and a sensor that measures glucose levels under the skin and can be worn for seven days at a time. While the device regulates basal, or background, insulin, patients must still manually request bolus insulin at mealtimes.

The device is intended for people age 14 or older with Type 1 diabetes and regulates insulin levels with “little to no input” from the patient. The artificial pancreas measures blood sugar levels using a continuous glucose monitor (CGM) and communicates the readings to an insulin pump, which calculates and releases the required amount of insulin into the body, just as the pancreas does in people without diabetes.
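The feedback loop described above can be sketched as a simple control rule. This is a minimal illustration only; Medtronic’s actual MiniMed 670G algorithm is proprietary, and the target, gain, and rate limits below are hypothetical values:

```python
# Toy sketch of a hybrid closed-loop basal adjustment.
# The real MiniMed 670G algorithm is proprietary; the target,
# gain, and limits here are invented illustration values.

def adjust_basal(glucose_mg_dl, target=120.0, base_rate=1.0,
                 gain=0.01, min_rate=0.0, max_rate=3.0,
                 suspend_below=70.0):
    """Return a basal insulin rate (units/hour) from a CGM reading."""
    if glucose_mg_dl < suspend_below:   # safety: suspend delivery on lows
        return 0.0
    # Proportional feedback: deliver more insulin when glucose runs high.
    rate = base_rate + gain * (glucose_mg_dl - target)
    return max(min_rate, min(max_rate, rate))

# Every few minutes the CGM reports a reading and the pump updates:
for reading in [65, 110, 180, 250]:
    print(reading, "->", round(adjust_basal(reading), 2))
```

A real system layers safety logic and model-based prediction on top of such a rule; the point here is only the feedback structure, with mealtime boluses still requested manually, as the article notes.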

The FDA approved the device in 2016 after a review of just three months, a record for any medical device. The agency evaluated data from a clinical trial in which 123 patients with Type 1 diabetes used the system’s hybrid closed-loop feature as often as possible over a three-month period. The trial showed the device to be safe for use in those 14 and older, with no serious adverse events. The system has been on sale since spring 2017.

While further clinical research is needed to ensure that the device performs consistently in different settings, several researchers support the view that “artificial pancreas systems are a safe and effective treatment approach for people with type 1 diabetes.” Medtronic counts this device as a step toward a fully automated, closed-loop system.

SOURCE

https://www.fiercebiotech.com/medical-devices/fda-approves-medtronic-s-artificial-pancreas-world-s-first


The Regulatory challenge in adopting AI

Author and Curator: Dror Nir, PhD


3.4.3   The Regulatory challenge in adopting AI, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 2: CRISPR for Gene Editing and DNA Repair

In the last couple of years we have been witnessing a surge of AI applications in healthcare. It is now clear that AI and its wide range of health applications are about to revolutionize disease pathways and the way the many stakeholders in this market interact.

Not surprisingly, this developing surge has woken the regulatory watchdogs, who are now debating how to manage the introduction of such applications into healthcare. Mapping these applications onto familiar regulatory checkboxes such as safety and efficacy is proving to be a complex exercise. How to align manufacturers’ claims, use cases, users’ expectations and public expectations is unclear. A recent demonstration of this is the so-called “failure” of AI in social-network applications like Facebook and Twitter to handle harmful material.

‘Advancing AI in the NHS’ is a report covering the challenges and opportunities of AI in the NHS. It is a modest contribution to the debate in such a timely and fast-moving field! I reproduce the report’s preface and executive summary below; whoever is interested in reading the full 50 pages can follow this link: f53ce9_e4e9c4de7f3c446fb1a089615492ba8c


Acknowledgements

We and Polygeia as a whole are grateful to Dr Dror Nir, Director, RadBee, whose insights were valuable throughout the research, conceptualisation, and writing phases of this work; and to Dr Giorgio Quer, Senior Research Scientist, Scripps Research Institute; Dr Matt Willis, Oxford Internet Institute, University of Oxford; Professor Eric T. Meyer, Oxford Internet Institute, University of Oxford; Alexander Hitchcock, Senior Researcher, Reform; Windi Hari, Vice President Clinical, Quality & Regulatory, HeartFlow; Jon Holmes, co-founder and Chief Technology Officer, Vivosight; and Claudia Hartman, School of Anthropology & Museum Ethnography, University of Oxford for their advice and support.

Author affiliations

Lev Tankelevitch, University of Oxford

Alice Ahn, University of Oxford

Rachel Paterson, University of Oxford

Matthew Reid, University of Oxford

Emily Hilbourne, University of Oxford

Bryan Adriaanse, University of Oxford

Giorgio Quer, Scripps Research Institute

Dror Nir, RadBee

Parth Patel, University of Cambridge

All affiliations are at the time of writing.

Polygeia

Polygeia is an independent, non-party, and non-profit think-tank focusing on health and its intersection with technology, politics, and economics. Our aim is to produce high-quality research on global health issues and policies. With branches in Oxford, Cambridge, London and New York, our work has led to policy reports, peer-reviewed publications, and presentations at the House of Commons and the European Parliament. http://www.polygeia.com @Polygeia © Polygeia 2018. All rights reserved.

Foreword

Almost every day, as MP for Cambridge, I am told of new innovations and developments that show that we are on the cusp of a technological revolution across the sectors. This technology is capable of revolutionising the way we work; incredible innovations which could increase our accuracy, productivity and efficiency and improve our capacity for creativity and innovation.

But huge change, particularly through the adoption of new technology, can be difficult to communicate to the public, and if we do not explain carefully the real benefits of such technologies we risk a backlash. Despite good intentions, the care.data programme failed to win public trust, with widespread worries that the appropriate safeguards weren’t in place, and a failure to properly explain the potential benefits to patients. It is vital that the checks and balances we put in place are robust enough to soothe public anxiety, and prevent problems which could lead to steps back, rather than forwards.

Previous attempts to introduce digital innovation into the NHS also teach us that cross-disciplinary and cross-sector collaboration is essential. Realising this technological revolution in healthcare will require industry, academia and the NHS to work together and share their expertise to ensure that technical innovations are developed and adopted in ways that prioritise patient health, rather than innovation for its own sake. Alongside this, we must make sure that the NHS workforce whose practice will be altered by AI are on side. Consultation and education are key, and this report details well the skills that will be vital to NHS adoption of AI. Technology is only as good as those who use it, and for this, we must listen to the medical and healthcare professionals who will rightly know best the concerns both of patients and their colleagues. The new Centre for Data Ethics and Innovation, the ICO and the National Data Guardian will be key in working alongside the NHS to create both a regulatory framework and the communications which win society’s trust. With this, and with real leadership from the sector and from politicians, focused on the rights and concerns of individuals, AI can be advanced in the NHS to help keep us all healthy.

Daniel Zeichner

MP for Cambridge

Chair, All-Party Parliamentary Group on Data Analytics

Executive summary

Artificial intelligence (AI) has the potential to transform how the NHS delivers care. From enabling patients to self-care and manage long-term conditions, to advancing triage, diagnostics, treatment, research, and resource management, AI can improve patient outcomes and increase efficiency. Achieving this potential, however, requires addressing a number of ethical, social, legal, and technical challenges. This report describes these challenges within the context of healthcare and offers directions forward.

Data governance

AI-assisted healthcare will demand better collection and sharing of health data between NHS, industry and academic stakeholders. This requires a data governance system that ensures ethical management of health data and enables its use for the improvement of healthcare delivery. Data sharing must be supported by patients. The recently launched NHS data opt-out programme is an important starting point, and will require monitoring to ensure that it has the transparency and clarity to avoid exploiting the public’s lack of awareness and understanding. Data sharing must also be streamlined and mutually beneficial. Current NHS data sharing practices are disjointed and difficult to negotiate from both industry and NHS perspectives. This issue is complicated by the increasing integration of ’traditional’ health data with that from commercial apps and wearables. Finding approaches to valuate data, and considering how patients, the NHS and its partners can benefit from data sharing is key to developing a data sharing framework. Finally, data sharing should be underpinned by digital infrastructure that enables cybersecurity and accountability.

Digital infrastructure

Developing and deploying AI-assisted healthcare requires high quantity and quality digital data. This demands effective digitisation of the NHS, especially within secondary care, involving not only the transformation of paper-based records into digital data, but also improvement of quality assurance practices and increased data linkage. Beyond data digitisation, broader IT infrastructure also needs upgrading, including the use of innovations such as wearable technology and interoperability between NHS sectors and institutions. This would not only increase data availability for AI development, but also provide patients with seamless healthcare delivery, putting the NHS at the vanguard of healthcare innovation.

Standards

The recent advances in AI and the surrounding hype have meant that the development of AI-assisted healthcare remains haphazard across the industry, with quality being difficult to determine or varying widely. Without adequate product validation, including in real-world settings, there is a risk of unexpected or unintended performance, such as sociodemographic biases or errors arising from inappropriate human-AI interaction. There is a need to develop standardised ways to probe training data, to agree upon clinically-relevant performance benchmarks, and to design approaches to enable and evaluate algorithm interpretability for productive human-AI interaction. In all of these areas, standardised does not necessarily mean one-size-fits-all. These issues require addressing the specifics of AI within a healthcare context, with consideration of users’ expertise, their environment, and products’ intended use. This calls for a fundamentally interdisciplinary approach, including experts in AI, medicine, ethics, cognitive science, usability design, and ethnography.

Regulations

Despite the recognition of AI-assisted healthcare products as medical devices, current regulatory efforts by the UK Medicines and Healthcare Products Regulatory Agency and the European Commission have yet to be accompanied by detailed guidelines which address questions concerning AI product classification, validation, and monitoring. This is compounded by the uncertainty surrounding Brexit and the UK’s future relationship with the European Medicines Agency. The absence of regulatory clarity risks compromising patient safety and stalling the development of AI-assisted healthcare. Close working partnerships involving regulators, industry members, healthcare institutions, and independent AI-related bodies (for example, as part of regulatory sandboxes) will be needed to enable innovation while ensuring patient safety.

The workforce

AI will be a tool for the healthcare workforce. Harnessing its utility to improve care requires an expanded workforce with the digital skills necessary for both developing AI capability and for working productively with the technology as it becomes commonplace.

Developing capability for AI will involve finding ways to increase the number of clinician-informaticians who can lead the development, procurement and adoption of AI technology while ensuring that innovation remains tied to the human aspect of healthcare delivery. More broadly, healthcare professionals will need to complement their socio-emotional and cognitive skills with training to appropriately interpret information provided by AI products and communicate it effectively to co-workers and patients.

Although much effort has gone into predicting how many jobs will be affected by AI-driven automation, understanding the impact on the healthcare workforce will require examining how jobs will change, not simply how many will change.

Legal liability

AI-assisted healthcare has implications for the legal liability framework: who should be held responsible in the case of a medical error involving AI? Addressing the question of liability will involve understanding how healthcare professionals’ duty of care will be impacted by use of the technology. This is tied to the lack of training standards for healthcare professionals to safely and effectively work with AI, and to the challenges of algorithm interpretability, with “black-box” systems forcing healthcare professionals to blindly trust or distrust their output. More broadly, it will be important to examine the legal liability of healthcare professionals, NHS trusts and industry partners, raising questions about how responsibility should be apportioned among them.

Recommendations

  1. The NHS, the Centre for Data Ethics and Innovation, and industry and academic partners should conduct a review to understand the obstacles that the NHS and external organisations face around data sharing. They should also develop health data valuation protocols which consider the perspectives of patients, the NHS, commercial organisations, and academia. This work should inform the development of a data sharing framework.
  2. The National Data Guardian and the Department of Health should monitor the NHS data opt-out programme and its approach to transparency and communication, evaluating how the public understands commercial and non-commercial data use and the handling of data at different levels of anonymisation.
  3. The NHS, patient advocacy groups, and commercial organisations should expand public engagement strategies around data governance, including discussions about the value of health data for improving healthcare; public and private sector interactions in the development of AI-assisted healthcare; and the NHS’s strategies around data anonymisation, accountability, and commercial partnerships. Findings from this work should inform the development of a data sharing framework.
  4. The NHS Digital Security Operations Centre should ensure that all NHS organisations comply with cybersecurity standards, including having up-to-date technology.
  5. NHS Digital, the Centre for Data Ethics and Innovation, and the Alan Turing Institute should develop technological approaches to data privacy, auditing, and accountability that could be implemented in the NHS. This should include learning from Global Digital Exemplar trusts in the UK and from international examples such as Estonia.
  6. The NHS should continue to increase the quantity, quality, and diversity of digital health data across trusts. It should consider targeted projects, in partnership with professional medical bodies, that quality-assure and curate datasets for more deployment-ready AI technology. It should also continue to develop its broader IT infrastructure, focusing on interoperability between sectors, institutions, and technologies, and including the end users as central stakeholders.
  7. The Alan Turing Institute, the Ada Lovelace Institute, and academic and industry partners in medicine and AI should develop ethical frameworks and technological approaches for the validation of training data in the healthcare sector, including methods to minimise performance biases and validate continuously-learning algorithms.
  8. The Alan Turing Institute, the Ada Lovelace Institute, and academic and industry partners in medicine and AI should develop standardised approaches for evaluating product performance in the healthcare sector, with consideration for existing human performance standards and products’ intended use.
  9. The Alan Turing Institute, the Ada Lovelace Institute, and academic and industry partners in medicine and AI should develop methods of enabling and evaluating algorithm interpretability in the healthcare sector. This work should involve experts in AI, medicine, ethics, usability design, cognitive science, and ethnography, among others.
  10. Developers of AI products and NHS Commissioners should ensure that usability design remains a top priority in their respective development and procurement of AI-assisted healthcare products.
  11. The Medicines and Healthcare Products Regulatory Agency should establish a digital health unit with expertise in AI and digital products that will work together with manufacturers, healthcare bodies, notified bodies, AI-related organisations, and international forums to advance clear regulatory approaches and guidelines around AI product classification, validation, and monitoring. This should address issues including training data and biases, performance evaluation, algorithm interpretability, and usability.
  12. The Medicines and Healthcare Products Regulatory Agency, the Centre for Data Ethics and Innovation, and industry partners should evaluate regulatory approaches, such as regulatory sandboxing, that can foster innovation in AI-assisted healthcare, ensure patient safety, and inform on-going regulatory development.
  13. The NHS should expand innovation acceleration programmes that bridge healthcare and industry partners, with a focus on increasing validation of AI products in real-world contexts and informing the development of a regulatory framework.
  14. The Medicines and Healthcare Products Regulatory Agency and other Government bodies should arrange a post-Brexit agreement ensuring that UK regulations of medical devices, including AI-assisted healthcare, are aligned as closely as possible to the European framework and that the UK can continue to help shape Europe-wide regulations around this technology.
  15. The General Medical Council, the Medical Royal Colleges, Health Education England, and AI-related bodies should partner with industry and academia on comprehensive examinations of the healthcare sector to assess which, when, and how jobs will be impacted by AI, including analyses of the current strengths, limitations, and workflows of healthcare professionals and broader NHS staff. They should also examine how AI-driven workforce changes will impact patient outcomes.
  16. The Federation of Informatics Professionals and the Faculty of Clinical Informatics should continue to lead and expand standards for health informatics competencies, integrating the relevant aspects of AI into their training, accreditation, and professional development programmes for clinician-informaticians and related professions.
  17. Health Education England should expand training programmes to advance digital and AI-related skills among healthcare professionals. Competency standards for working with AI should be identified for each role and established in accordance with professional registration bodies such as the General Medical Council. Training programmes should ensure that ”un-automatable” socio-emotional and cognitive skills remain an important focus.
  18. The NHS Digital Academy should expand recruitment and training efforts to increase the number of Chief Clinical Information Officers across the NHS, and ensure that the latest AI ethics, standards, and innovations are embedded in their training programme.
  19. Legal experts, ethicists, AI-related bodies, professional medical bodies, and industry should review the implications of AI-assisted healthcare for legal liability. This includes understanding how healthcare professionals’ duty of care will be affected, the role of workforce training and product validation standards, and the potential role of NHS Indemnity and no-fault compensation systems.
  20. AI-related bodies such as the Ada Lovelace Institute, patient advocacy groups and other healthcare stakeholders should lead a public engagement and dialogue strategy to understand the public’s views on liability for AI-assisted healthcare.


Artificial Pancreas – Medtronic Receives FDA Approval for World’s First Hybrid Closed Loop System for People with Type 1 Diabetes

Reporter: Aviva Lev-Ari, PhD, RN

 

Press Release


DNA-based nanomotor and chemomechanical crosstalk

Curators: Larry H. Bernstein, MD, FCAP and Aviva Lev-Ari, PhD, RN

LPBI

 

Nano-walkers take speedy leap forward with first rolling DNA-based motor

“Ours is the first rolling DNA motor, making it far faster and more robust,” says Khalid Salaita, the Emory chemist who led the research. (Photos by Bryan Meltz, Emory Photo/Video.)

Physical chemists have devised a rolling DNA-based motor that’s 1,000 times faster than any other synthetic DNA motor, giving it potential for real-world applications, such as disease diagnostics. Nature Nanotechnology is publishing the finding.

“Unlike other synthetic DNA-based motors, which use legs to ‘walk’ like tiny robots, ours is the first rolling DNA motor, making it far faster and more robust,” says Khalid Salaita, the Emory University chemist who led the research. “It’s like the biological equivalent of the invention of the wheel for the field of DNA machines.”

The speed of the new DNA-based motor, which is powered by ribonuclease H, means a simple smart phone microscope can capture its motion through video. The researchers have filed an invention disclosure patent for the concept of using the particle motion of their rolling molecular motor as a sensor for everything from a single DNA mutation in a biological sample to heavy metals in water.

“Our method offers a way of doing low-cost, low-tech diagnostics in settings with limited resources,” Salaita says.

The field of synthetic DNA-based motors, also known as nano-walkers, is about 15 years old. Researchers are striving to duplicate the action of nature’s nano-walkers. Myosins, for example, are tiny biological motors that “walk” on filaments to carry nutrients throughout the human body.

“It’s the ultimate in science fiction,” Salaita says of the quest to create tiny robots, or nano-bots, that could be programmed to do your bidding. “People have dreamed of sending in nano-bots to deliver drugs or to repair problems in the human body.”

So far, however, mankind’s efforts have fallen far short of nature’s myosin, which speeds effortlessly about its biological errands. “The ability of myosin to convert chemical energy into mechanical energy is astounding,” Salaita says. “They are the most efficient motors we know of today.”

Some synthetic nano-walkers move on two legs. They are essentially enzymes made of DNA, powered by the catalyst RNA. These nano-walkers tend to be extremely unstable, due to the high levels of Brownian motion at the nano-scale. Other versions with four, and even six, legs have proved more stable, but much slower. In fact, their pace is glacial: A four-legged DNA-based motor would need about 20 years to move one centimeter.

Kevin Yehl, a post-doctoral fellow in the Salaita lab, had the idea of constructing a DNA-based motor using a micron-sized glass sphere. Hundreds of DNA strands, or “legs,” are allowed to bind to the sphere. These DNA legs are placed on a glass slide coated with the reactant: RNA.

The DNA legs are drawn to the RNA, but as soon as they set foot on it they destroy it through the activity of an enzyme called RNase H. As the legs bind and then release from the substrate, they guide the sphere along, allowing more of the DNA legs to keep binding and pulling.

“It’s called a burnt-bridge mechanism,” Salaita explains. “Wherever the DNA legs step, they trample and destroy the reactant. They have to keep moving and step where they haven’t stepped in order to find more reactant.”
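The burnt-bridge mechanism can be illustrated with a toy one-dimensional simulation (my own sketch, not the authors’ model; the real motor is a sphere with hundreds of legs, whereas here a single walker stands in for it). Because visited sites are destroyed and cannot be re-bound, random stepping becomes ratcheted, directed motion:

```python
import random

def burnt_bridge_walk(n_steps, seed=0):
    """Toy 1D burnt-bridge walker: each step is attempted at random,
    but only succeeds onto a site whose substrate is still intact;
    visited sites are 'burnt' and can never be re-bound."""
    rng = random.Random(seed)
    pos = 0
    burnt = {0}          # sites whose substrate has been destroyed
    path = [0]
    for _ in range(n_steps):
        trial = pos + rng.choice([-1, 1])
        if trial not in burnt:   # fresh substrate: the step sticks
            pos = trial
            burnt.add(pos)
        path.append(pos)
    return path

path = burnt_bridge_walk(2000)
print("net displacement after 2000 attempts:", abs(path[-1]))
```

Because each successful step burns the site behind it, the walker cannot retrace its path: its displacement grows roughly linearly with the number of attempts rather than diffusively, which is the stability-and-speed advantage the article describes.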

The combination of the rolling motion, and the speed of the RNase H enzyme on a substrate, gives the new DNA motor its stability and speed.

“Our DNA-based motor can travel one centimeter in seven days, instead of 20 years, making it 1,000 times faster than the older versions,” Salaita says. “In fact, nature’s myosin motors are only 10 times faster than ours, and it took them billions of years to evolve.”
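The quoted speeds are consistent with simple unit arithmetic (a sketch; the ∼1 nm/min figure for conventional walkers comes from the paper’s abstract):

```python
CM_IN_NM = 1e7                       # 1 cm = 10^7 nm

# Conventional DNA walker: ~1 nm/min (figure from the paper's abstract)
walker_speed = 1.0                   # nm per minute
walker_years = CM_IN_NM / walker_speed / (60 * 24 * 365)
print(f"walker: ~{walker_years:.0f} years per cm")    # ~19 years

# Rolling motor: 1 cm in 7 days
roller_speed = CM_IN_NM / (7 * 24 * 60)               # nm per minute
print(f"roller: ~{roller_speed:.0f} nm/min, "
      f"~{roller_speed / walker_speed:.0f}x faster")  # ~992 nm/min
```

This recovers both claims: roughly 20 years per centimetre for a walker, and a roughly 1,000-fold speed-up for the roller.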


Emory post-doctoral fellow Kevin Yehl sets up a smart-phone microscope to get a readout for the particle motion of the rolling DNA-based motor.

The researchers demonstrated that their rolling motors can be used to detect a single DNA mutation by measuring particle displacement. They simply glued lenses from two inexpensive laser pointers to the camera of a smart phone to turn the phone into a microscope and capture videos of the particle motion.

“Using a smart phone, we can get a readout for anything that’s interfering with the enzyme-substrate reaction, because that will change the speed of the particle,” Salaita says. “For instance, we can detect a single mutation in a DNA strand.”

This simple, low-tech method could come in handy for doing diagnostic sensing of biological samples in the field, or anywhere with limited resources.
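In practice the smartphone readout boils down to extracting a particle’s speed from its tracked positions and flagging slowdowns. A hypothetical sketch of that calculation (the tracking data, threshold, and helper function are invented for illustration):

```python
import math

def mean_speed(positions, dt):
    """Mean speed (distance units per time unit) from a 2D track
    sampled at fixed interval dt."""
    total = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        total += math.hypot(x1 - x0, y1 - y0)
    return total / (dt * (len(positions) - 1))

# A mutation that disrupts the DNA-RNA duplex slows RNase H turnover,
# so the particle rolls more slowly than a reference track.
# Positions in microns, sampled once per minute (invented data):
reference = [(0, 0), (3, 4), (6, 8), (9, 12)]
sample    = [(0, 0), (1, 0), (2, 1), (2, 2)]

v_ref = mean_speed(reference, dt=1.0)
v_smp = mean_speed(sample, dt=1.0)
print("interference detected:", v_smp < 0.5 * v_ref)
```

Anything that perturbs the enzyme-substrate reaction shows up as a change in this one number, which is why a phone-camera microscope suffices as the detector.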

The proof that the motors roll came by accident, Salaita adds. During their experiments, two of the glass spheres occasionally became stuck together, or dimerized. Instead of making a wandering trail, they left a pair of straight, parallel tracks across the substrate, like a lawn mower cutting grass. “It’s the first example of a synthetic molecular motor that goes in a straight line without a track or a magnetic field to guide it,” Salaita says.

In addition to Salaita and Yehl, the co-authors on the Nature Nanotechnology paper include Emory researchers Skanda Vivek, Yang Liu, Yun Zhang, Mengzhen Fan, Eric Weeks and Andrew Mugler (who is now at Purdue University).

Related:
Chemists reveal the force within you
Molecular beacons shine light on how cells ‘crawl’

 

High-speed DNA-based rolling motors powered by RNase H

Kevin Yehl, Andrew Mugler, Skanda Vivek, Yang Liu, Yun Zhang, Mengzhen Fan, Eric R. Weeks & Khalid Salaita
Nature Nanotechnology 11, 184–190 (2016). doi:10.1038/nnano.2015.259

DNA-based machines that walk by converting chemical energy into controlled motion could be of use in applications such as next-generation sensors, drug-delivery platforms and biological computing. Despite their exquisite programmability, DNA-based walkers are challenging to work with because of their low fidelity and slow rates (∼1 nm min⁻¹). Here we report DNA-based machines that roll rather than walk, and consequently have a maximum speed and processivity that is three orders of magnitude greater than the maximum for conventional DNA motors. The motors are made from DNA-coated spherical particles that hybridize to a surface modified with complementary RNA; the motion is achieved through the addition of RNase H, which selectively hydrolyses the hybridized RNA. The spherical motors can move in a self-avoiding manner, and anisotropic particles, such as dimerized or rod-shaped particles, can travel linearly without a track or external force. We also show that the motors can be used to detect single nucleotide polymorphism by measuring particle displacement using a smartphone camera.

 

 

http://www.nature.com/nnano/journal/v11/n2/full/nnano.2015.259.html

 

T cells use ‘handshakes’ to sort friends from foes

http://esciencecommons.blogspot.ca/2016/05/t-cells-use-handshakes-to-sort-friends.html

 

A 3-D rendering of a fluorescence image mapping the piconewton forces applied by T cells. The height and color indicates the magnitude of the applied force. (Microscopy image by Yang Liu.)

By Carol Clark

 

T cells, the security guards of the immune system, use a kind of mechanical “handshake” to test whether a cell they encounter is a friend or foe, a new study finds.

The Proceedings of the National Academy of Sciences (PNAS) published the study, led by Khalid Salaita, a physical chemist at Emory University who specializes in the mechanical forces of cellular processes.

“We’ve provided the first direct evidence that a T cell gives precise mechanical tugs to other cells,” Salaita says. “And we’ve shown that these tugs are central to a T cell’s process of deciding whether to mount an immune response. A tug that releases easily, similar to a casual handshake, signals a friend. A stronger grip indicates a foe.”

Salaita, from Emory’s Department of Chemistry, collaborated on the research with Brian Evavold in the Emory School of Medicine’s Department of Microbiology and Immunology.

T cells continuously patrol through the body in search of foreign invaders. They have molecules known as T-cell receptors (TCR) that can recognize specific antigenic peptides on the surface of a pathogenic or cancerous cell. When a T cell detects an antigen-presenting cell (APC), its TCR connects to a ligand, or binding molecule, of the APC. If the T cell determines the ligand is foreign, it becomes activated and starts pumping calcium. The calcium is part of a signaling chain that recruits other cells to come and help mount an immune response.

Scientists have known about this process for decades, but they have not fully understood how the T cell distinguishes small modifications to the antigenic ligand and how it decides to respond. “If you view this T cell response purely as a chemical process, it does not fully explain the remarkable specificity of the binding,” Salaita says. “When you take the two components – the TCR and the ligand on the surface of cells – and just let them chemically bind in a solution, for example, you can’t predict what will trigger a strong or a weak immune response.”

The researchers hypothesized that mechanical strain might also play a role in a T cell response, since the T cell continues to move even as it locks into a bind with an antigenic ligand.

To test this idea, the Salaita lab developed DNA-based gold nanoparticle tension sensors that light up, or fluoresce, in response to a minuscule mechanical force of a piconewton – about one million-millionth the weight of an apple.

The researchers designed experiments using T cells from a mouse, allowing them to test ligands containing eight-amino-acid peptides with slight mutations.

“We swapped out the fourth amino acid position to create really subtle chemical changes in the ligand that would be very difficult to distinguish without a mechanical component,” Salaita says.

Some of the mutated ligands were given a firmer anchor to give them a tighter “grip” to the moving TCR.

Through the experiments, captured on microscopy video, the researchers were able to see, record and measure the responses of the T cells as they moved across the ligands.

“As a T cell moves across a cell’s surface and encounters a ligand, it pulls on it,” Salaita explains. “It doesn’t pull very hard, it’s a very precise and tiny tug that is not sustained. The T cell pulls and stops, pulls and stops, all across the surface. It’s like the T cell is doing a mechanical test of the ligand.”

During the experiments, the T cells did not activate fully when they encountered ligands with weak anchors. In contrast, when a T cell encountered a ligand with a firm anchor, the T cell became activated, showing that it experienced a piconewton level of resistance.

The amount of force that was applied by the T cell was mapped by using tension probes of different stiffness. Probes that responded to 19 piconewtons did not fluoresce, while softer, 12-piconewton probes produced high signal.
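The force-mapping logic described above – stiffer 19-piconewton probes staying dark while softer 12-piconewton probes light up – amounts to a mechanical threshold test. A minimal sketch of that readout (function names and the 15 pN tug value are illustrative assumptions, not the authors' analysis code):

```python
def probe_fluoresces(applied_force_pN, threshold_pN):
    """A DNA tension probe unfolds and fluoresces only when the force
    transmitted through it reaches the probe's unfolding threshold."""
    return applied_force_pN >= threshold_pN

# A TCR tug between the two probe classes (15 pN is a hypothetical
# illustrative value) opens the softer 12-pN probes but not the
# stiffer 19-pN probes, matching the pattern reported in the text.
tcr_tug_pN = 15.0
for threshold_pN in (12.0, 19.0):
    print(f"{threshold_pN} pN probe fluoresces:",
          probe_fluoresces(tcr_tug_pN, threshold_pN))
```

Because each probe gives an all-or-nothing answer, a pair of thresholds is enough to bracket the TCR force between 12 and 19 pN.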

Following the fluorescence of the probe, the T cells switched on their calcium pumps and increased the calcium concentration within the cell, indicating that the T cell was mounting an immune response.

“We were able to map out the order of the cascade of chemical and mechanical reactions,” Salaita says. “First, the T cell uses a very specific and finely tuned mechanical tug to distinguish friend from foe. And when it senses a precise, piconewton level of force in response to that tug, the T cell realizes that it has encountered a foreign body and gives the signal for attack.”

The discovery could help in the search for treatments of auto-immune diseases and the development of immune therapies for cancer.

“Cancer cells have an extra molecule that can make T cell security guards ‘drunk’ or ‘sleepy’ so that they are not able to function properly,” Salaita says. “Learning more about the mechanical forces involved in an effective immune response may help us develop ways to evade this defense system of cancer cells.”

Co-authors on the study include Yang Liu, Victor Pui-Yan Ma, Kornelia Galior and Zheng Liu (from the Salaita lab); and Lori Blanchfield and Rakieb Andargachew (from the Evavold lab).

Related:
Chemists reveal the force within you
Molecular beacons shine light on how cells ‘crawl’
Nano-walkers take speedy leap forward with first rolling DNA-based motor

 

DNA-based nanoparticle tension sensors reveal that T-cell receptors transmit defined pN forces to their antigens for enhanced fidelity

Yang Liu, Lori Blanchfield, Victor Pui-Yan Ma, Rakieb Andargachew, Kornelia Galior, Zheng Liu, Brian Evavold, and Khalid Salaita
PNAS May 2016; http://dx.doi.org/10.1073/pnas.1600163113

Significance

T cells protect the body against pathogens and cancer by recognizing specific foreign peptides on the cell surface. Because antigen recognition occurs at the junction between a migrating T cell and an antigen-presenting cell (APC), it is likely that cellular forces are generated and transmitted through T-cell receptor (TCR)-ligand bonds. Here we develop a DNA-based nanoparticle tension sensor producing the first molecular maps of TCR-ligand forces during T cell activation. We find that TCR forces are orchestrated in space and time, requiring the participation of CD8 coreceptor and adhesion molecules. Loss or damping of TCR forces results in weakened antigen discrimination, showing that T cells harness mechanics to optimize the specificity of response to ligand.

T cells are triggered when the T-cell receptor (TCR) encounters its antigenic ligand, the peptide-major histocompatibility complex (pMHC), on the surface of antigen presenting cells (APCs). Because T cells are highly migratory and antigen recognition occurs at an intermembrane junction where the T cell physically contacts the APC, there are long-standing questions of whether T cells transmit defined forces to their TCR complex and whether chemomechanical coupling influences immune function. Here we develop DNA-based gold nanoparticle tension sensors to provide, to our knowledge, the first pN tension maps of individual TCR-pMHC complexes during T-cell activation. We show that naïve T cells harness cytoskeletal coupling to transmit 12–19 pN of force to their TCRs within seconds of ligand binding and preceding initial calcium signaling. CD8 coreceptor binding and lymphocyte-specific kinase signaling are required for antigen-mediated cell spreading and force generation. Lymphocyte function-associated antigen 1 (LFA-1) mediated adhesion modulates TCR-pMHC tension by intensifying its magnitude to values >19 pN and spatially reorganizes the location of TCR forces to the kinapse, the zone located at the trailing edge of migrating T cells, thus demonstrating chemomechanical crosstalk between TCR and LFA-1 receptor signaling. Finally, T cells display a dampened and poorly specific response to antigen agonists when TCR forces are chemically abolished or physically “filtered” to a level below ∼12 pN using mechanically labile DNA tethers. Therefore, we conclude that T cells tune TCR mechanics with pN resolution to create a checkpoint of agonist quality necessary for specific immune response.

Larry H. Bernstein, MD, FCAP, Curator
LPBI
The two articles above are connected in an interesting way: cellular forces are generated and transmitted through T-cell receptor (TCR)-ligand bonds. The T-cell receptor (TCR) encounters its antigenic ligand, the peptide-major histocompatibility complex (pMHC), on the surface of antigen presenting cells (APCs). The movement detected by the fluorescent sensor may hinge on only a single amino acid of the cell-surface ligand. The result is chemomechanical crosstalk between TCR and LFA-1 receptor signaling.

Read Full Post »

CRISPR/Cas9, Familial Amyloid Polyneuropathy (FAP) and Neurodegenerative Disease

CRISPR/Cas9, Familial Amyloid Polyneuropathy (FAP) and Neurodegenerative Disease, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 2: CRISPR for Gene Editing and DNA Repair

Curator: Larry H. Bernstein, MD, FCAP

 

CRISPR/Cas9 and Targeted Genome Editing: A New Era in Molecular Biology

https://www.neb.com/tools-and-resources/feature-articles/crispr-cas9-and-targeted-genome-editing-a-new-era-in-molecular-biology

The development of efficient and reliable ways to make precise, targeted changes to the genome of living cells is a long-standing goal for biomedical researchers. Recently, a new tool based on a bacterial CRISPR-associated protein-9 nuclease (Cas9) from Streptococcus pyogenes has generated considerable excitement (1). This follows several attempts over the years to manipulate gene function, including homologous recombination (2) and RNA interference (RNAi) (3). RNAi, in particular, became a laboratory staple enabling inexpensive and high-throughput interrogation of gene function (4, 5), but it is hampered by providing only temporary inhibition of gene function and unpredictable off-target effects (6). Other recent approaches to targeted genome modification – zinc-finger nucleases [ZFNs (7)] and transcription activator-like effector nucleases [TALENs (8)] – enable researchers to generate permanent mutations by introducing double-stranded breaks to activate repair pathways. These approaches are costly and time-consuming to engineer, limiting their widespread use, particularly for large-scale, high-throughput studies.

The Biology of Cas9

The functions of CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) and CRISPR-associated (Cas) genes are essential in adaptive immunity in select bacteria and archaea, enabling the organisms to respond to and eliminate invading genetic material. These repeats were initially discovered in the 1980s in E. coli (9), but their function wasn’t confirmed until 2007 by Barrangou and colleagues, who demonstrated that S. thermophilus can acquire resistance against a bacteriophage by integrating a genome fragment of an infectious virus into its CRISPR locus (10).

Three types of CRISPR mechanisms have been identified, of which type II is the most studied. In this case, invading DNA from viruses or plasmids is cut into small fragments and incorporated into a CRISPR locus amidst a series of short repeats (around 20 bps). The loci are transcribed, and transcripts are then processed to generate small RNAs (crRNA – CRISPR RNA), which are used to guide effector endonucleases that target invading DNA based on sequence complementarity (Figure 1) (11).

Figure 1. Cas9 in vivo: Bacterial Adaptive Immunity

https://www.neb.com/~/media/NebUs/Files/Feature%20Articles/Images/FA_Cas9_Fig1_Cas9InVivo.png

In the acquisition phase, foreign DNA is incorporated into the bacterial genome at the CRISPR loci. The CRISPR loci are then transcribed and processed into crRNA during crRNA biogenesis. During interference, Cas9 endonuclease complexed with a crRNA and separate tracrRNA cleaves foreign DNA containing a 20-nucleotide crRNA-complementary sequence adjacent to the PAM sequence. (Figure not drawn to scale.)
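The crRNA biogenesis step summarized above – the array transcript is cut at the direct repeats, releasing one spacer-derived guide per captured invader – can be sketched with toy sequences. Both the repeat and the spacers below are made-up illustrations (real repeats are longer, and processing is enzymatic, via RNase III and Cas9 in type II systems):

```python
def extract_spacers(array_seq, repeat):
    """Split a CRISPR array into its spacers by cutting at every copy
    of the direct repeat. Illustrative only: real arrays have longer,
    partially degenerate repeats and are processed enzymatically."""
    return [segment for segment in array_seq.split(repeat) if segment]

repeat = "GTTTTAGAGC"  # hypothetical direct-repeat sequence
array = repeat + "ACGTACGTAAGG" + repeat + "TTGGCCAATCGA" + repeat
print(extract_spacers(array, repeat))  # two spacer sequences
```

Each recovered spacer corresponds to one crRNA that can later guide interference against a matching invader.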

https://www.neb.com/~/media/NebUs/Files/Feature%20Articles/Images/FA_Cas9_GenomeEditingGlossary.png

One Cas protein, Cas9 (also known as Csn1), has been shown, through knockdown and rescue experiments, to be a key player in certain CRISPR mechanisms (specifically type II CRISPR systems). The type II CRISPR mechanism is unique compared to other CRISPR systems, as only one Cas protein (Cas9) is required for gene silencing (12). In type II systems, Cas9 participates in the processing of crRNAs (12), and is responsible for the destruction of the target DNA (11). Cas9’s function in both of these steps relies on the presence of two nuclease domains, a RuvC-like nuclease domain located at the amino terminus and an HNH-like nuclease domain that resides in the mid-region of the protein (13).

To achieve site-specific DNA recognition and cleavage, Cas9 must be complexed with both a crRNA and a separate trans-activating crRNA (tracrRNA or trRNA) that is partially complementary to the crRNA (11). The tracrRNA is required for crRNA maturation from a primary transcript encoding multiple pre-crRNAs. This occurs in the presence of RNase III and Cas9 (12).

During the destruction of target DNA, the HNH and RuvC-like nuclease domains cut both DNA strands, generating double-stranded breaks (DSBs) at sites defined by a 20-nucleotide target sequence within an associated crRNA transcript (11, 14). The HNH domain cleaves the complementary strand, while the RuvC domain cleaves the noncomplementary strand.

The double-stranded endonuclease activity of Cas9 also requires a short conserved sequence (2–5 nt), known as the protospacer adjacent motif (PAM), which follows immediately 3´ of the crRNA-complementary sequence (15). In fact, even fully complementary sequences are ignored by Cas9-RNA in the absence of a PAM sequence (16).
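The targeting rule described above – a 20-nucleotide crRNA-complementary protospacer immediately followed by a PAM (5´-NGG for S. pyogenes Cas9) – reduces to a simple sequence scan. A minimal single-strand sketch (function name hypothetical; a real design tool also scans the reverse complement and scores each guide):

```python
import re

def find_spcas9_sites(dna, protospacer_len=20):
    """Scan one strand for candidate SpCas9 target sites: a 20-nt
    protospacer immediately followed by an NGG PAM. Returns
    (start, protospacer, pam) tuples. The lookahead lets overlapping
    candidates all be reported."""
    pattern = rf"(?=([ACGT]{{{protospacer_len}}})([ACGT]GG))"
    return [(m.start(), m.group(1), m.group(2))
            for m in re.finditer(pattern, dna)]

seq = "TTACGGCATCGATTGCAATCGGATCCAGGTTTT"  # toy sequence
for start, protospacer, pam in find_spcas9_sites(seq):
    print(start, protospacer, pam)
```

Note that a perfect protospacer match without the trailing PAM is simply never reported, mirroring the observation that Cas9-RNA ignores fully complementary sequences lacking a PAM.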

Cas9 and CRISPR as a New Tool in Molecular Biology

The simplicity of the type II CRISPR nuclease, with only three required components (Cas9 along with the crRNA and trRNA), makes this system amenable to adaptation for genome editing. This potential was realized in 2012 by the Doudna and Charpentier labs (11). Based on the type II CRISPR system described previously, the authors developed a simplified two-component system by combining trRNA and crRNA into a single synthetic single guide RNA (sgRNA). sgRNA-programmed Cas9 was shown to be as effective as Cas9 programmed with separate trRNA and crRNA in guiding targeted gene alterations (Figure 2A).

To date, three different variants of the Cas9 nuclease have been adopted in genome-editing protocols. The first is wild-type Cas9, which can site-specifically cleave double-stranded DNA, resulting in the activation of the double-strand break (DSB) repair machinery. DSBs can be repaired by the cellular Non-Homologous End Joining (NHEJ) pathway (17), resulting in insertions and/or deletions (indels) which disrupt the targeted locus. Alternatively, if a donor template with homology to the targeted locus is supplied, the DSB may be repaired by the homology-directed repair (HDR) pathway, allowing for precise replacement mutations to be made (Figure 2A) (17, 18).

Cong and colleagues (1) took the Cas9 system a step further towards increased precision by developing a mutant form, known as Cas9D10A, with only nickase activity. This means it cleaves only one DNA strand, and does not activate NHEJ. Instead, when provided with a homologous repair template, DNA repairs are conducted via the high-fidelity HDR pathway only, resulting in reduced indel mutations (1, 11, 19). Cas9D10A is even more appealing in terms of target specificity when loci are targeted by paired Cas9 complexes designed to generate adjacent DNA nicks (20) (see further details about “paired nickases” in Figure 2B).

The third variant is a nuclease-deficient Cas9 (dCas9, Figure 2C) (21). Mutations H840A in the HNH domain and D10A in the RuvC domain inactivate cleavage activity, but do not prevent DNA binding (11, 22). Therefore, this variant can be used to sequence-specifically target any region of the genome without cleavage. Instead, by fusing with various effector domains, dCas9 can be used either as a gene silencing or activation tool (21, 23–26). Furthermore, it can be used as a visualization tool. For instance, Chen and colleagues used dCas9 fused to Enhanced Green Fluorescent Protein (EGFP) to visualize repetitive DNA sequences with a single sgRNA or nonrepetitive loci using multiple sgRNAs (27).

Figure 2. CRISPR/Cas9 System Applications

https://www.neb.com/~/media/NebUs/Files/Feature%20Articles/Images/FA_Cas9_Fig2_Cas9forGenomeEditing.png?device=modal

  A. Wild-type Cas9 nuclease site-specifically cleaves double-stranded DNA, activating the double-strand break repair machinery. In the absence of a homologous repair template, non-homologous end joining can result in indels disrupting the target sequence. Alternatively, precise mutations and knock-ins can be made by providing a homologous repair template and exploiting the homology directed repair pathway.
  B. Mutated Cas9 makes a site-specific single-strand nick. Two sgRNAs can be used to introduce a staggered double-stranded break, which can then undergo homology directed repair.
  C. Nuclease-deficient Cas9 can be fused with various effector domains allowing specific localization; for example, transcriptional activators, repressors, and fluorescent proteins.

Targeting Efficiency and Off-target Mutations

Targeting efficiency, or the percentage of desired mutation achieved, is one of the most important parameters by which to assess a genome-editing tool. The targeting efficiency of Cas9 compares favorably with more established methods, such as TALENs or ZFNs (8). For example, in human cells, custom-designed ZFNs and TALENs could only achieve efficiencies ranging from 1% to 50% (29–31). In contrast, the Cas9 system has been reported to have efficiencies up to >70% in zebrafish (32) and plants (33), and ranging from 2–5% in induced pluripotent stem cells (34). In addition, Zhou and colleagues were able to improve genome targeting up to 78% in one-cell mouse embryos, and achieved effective germline transmission through the use of dual sgRNAs to simultaneously target an individual gene (35).

A widely used method to identify mutations is the T7 Endonuclease I mutation detection assay (36, 37) (Figure 3). This assay detects heteroduplex DNA that results from the annealing of a DNA strand bearing a desired mutation with a wild-type DNA strand (37).

Figure 3. T7 Endonuclease I Targeting Efficiency Assay

https://www.neb.com/~/media/NebUs/Files/Feature%20Articles/Images/FA_Cas9_Fig3_T7Assay_TargetEfficiency.png

Genomic DNA is amplified with primers bracketing the modified locus. PCR products are then denatured and re-annealed yielding 3 possible structures. Duplexes containing a mismatch are digested by T7 Endonuclease I. The DNA is then electrophoretically separated and fragment analysis is used to calculate targeting efficiency.
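The fragment-analysis step above is typically converted into an editing efficiency with a binomial correction: if strands re-anneal at random, the uncleaved fraction equals the square of the unmodified allele fraction. A sketch under that assumption (this formula is commonly used with T7E1-type assays but is not spelled out in this article; the band intensities below are hypothetical):

```python
def t7e1_editing_efficiency(fraction_cleaved):
    """Estimate percent modified alleles from a T7E1 assay.

    fraction_cleaved = (sum of cleavage-product band intensities) /
                       (total band intensity).
    Assumes random re-annealing and that every mismatch-containing
    duplex is cut, so 1 - f = (1 - p)**2 for allele-modification
    fraction p, giving p = 1 - sqrt(1 - f).
    """
    return 100.0 * (1.0 - (1.0 - fraction_cleaved) ** 0.5)

# Hypothetical gel quantification: parental band 60, fragments 25 + 15.
f_cut = (25 + 15) / (60 + 25 + 15)               # cleaved fraction = 0.4
print(round(t7e1_editing_efficiency(f_cut), 1))  # ~22.5% modified alleles
```

The square-root correction matters: reporting the raw 40% cleaved fraction would nearly double the apparent targeting efficiency.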

Another important parameter is the incidence of off-target mutations. Such mutations are likely to appear in sites that have differences of only a few nucleotides compared to the original sequence, as long as they are adjacent to a PAM sequence. This occurs as Cas9 can tolerate up to 5 base mismatches within the protospacer region (36) or a single base difference in the PAM sequence (38). Off-target mutations are generally more difficult to detect, requiring whole-genome sequencing to rule them out completely.
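The tolerance just described – up to about five protospacer mismatches next to a valid PAM – underlies naive off-target screens that simply count mismatches between the guide and each candidate site. A hedged sketch (names, sequences, and the default threshold are illustrative; real predictors also weight PAM-proximal "seed" mismatches more heavily):

```python
def count_mismatches(guide, site):
    """Number of mismatched positions between a guide sequence and an
    equal-length genomic protospacer."""
    if len(guide) != len(site):
        raise ValueError("guide and site must be the same length")
    return sum(g != s for g, s in zip(guide, site))

def is_potential_off_target(guide, site, max_mismatches=5):
    """Flag a PAM-adjacent site as a potential off-target using the
    rough tolerance noted in the text (up to ~5 mismatches)."""
    return count_mismatches(guide, site) <= max_mismatches

guide = "GACGCATAAAGATGAGACGC"  # hypothetical 20-nt guide
site  = "GACGCATTAAGATGAGACGC"  # hypothetical site with one mismatch
print(count_mismatches(guide, site), is_potential_off_target(guide, site))
```

A simple count like this explains why whole-genome sequencing is needed to rule out off-target edits: even one genome typically contains many near-matching, PAM-adjacent sites.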

Recent improvements to the CRISPR system for reducing off-target mutations have been made through the use of truncated gRNA (truncated within the crRNA-derived sequence) or by adding two extra guanine (G) nucleotides to the 5´ end (28, 37). Another way researchers have attempted to minimize off-target effects is with the use of “paired nickases” (20). This strategy uses D10A Cas9 and two sgRNAs complementary to the adjacent area on opposite strands of the target site (Figure 2B). While this induces DSBs in the target DNA, it is expected to create only single nicks in off-target locations and, therefore, result in minimal off-target mutations.

By leveraging computation to reduce off-target mutations, several groups have developed web-based tools to facilitate the identification of potential CRISPR target sites and assess their potential for off-target cleavage. Examples include the CRISPR Design Tool (38) and the ZiFiT Targeter, Version 4.2 (39, 40).

Applications as a Genome-editing and Genome Targeting Tool

Following its initial demonstration in 2012 (11), the CRISPR/Cas9 system has been widely adopted. It has already been successfully used to target important genes in many cell lines and organisms, including human (34), bacteria (41), zebrafish (32), C. elegans (42), plants (34), Xenopus tropicalis (43), yeast (44), Drosophila (45), monkeys (46), rabbits (47), pigs (42), rats (48) and mice (49). Several groups have now taken advantage of this method to introduce single point mutations (deletions or insertions) in a particular target gene, via a single gRNA (14, 21, 29). Using a pair of gRNA-directed Cas9 nucleases instead, it is also possible to induce large deletions or genomic rearrangements, such as inversions or translocations (50). A recent exciting development is the use of the dCas9 version of the CRISPR/Cas9 system to target protein domains for transcriptional regulation (26, 51, 52), epigenetic modification (25), and microscopic visualization of specific genome loci (27).

The CRISPR/Cas9 system requires only the redesign of the crRNA to change target specificity. This contrasts with other genome editing tools, including zinc finger and TALENs, where redesign of the protein-DNA interface is required. Furthermore, CRISPR/Cas9 enables rapid genome-wide interrogation of gene function by generating large gRNA libraries (51, 53) for genomic screening.

The Future of CRISPR/Cas9

The rapid progress in developing Cas9 into a set of tools for cell and molecular biology research has been remarkable, likely due to the simplicity, high efficiency and versatility of the system. Of the designer nuclease systems currently available for precision genome engineering, the CRISPR/Cas system is by far the most user friendly. It is now also clear that Cas9’s potential reaches beyond DNA cleavage, and its usefulness for genome locus-specific recruitment of proteins will likely only be limited by our imagination.

 

Scientists urge caution in using new CRISPR technology to treat human genetic disease

By Robert Sanders, Media relations | MARCH 19, 2015
http://news.berkeley.edu/2015/03/19/scientists-urge-caution-in-using-new-crispr-technology-to-treat-human-genetic-disease/

http://news.berkeley.edu/wp-content/uploads/2015/03/crispr350.jpg

The bacterial enzyme Cas9 is the engine of RNA-programmed genome engineering in human cells. (Graphic by Jennifer Doudna/UC Berkeley)

A group of 18 scientists and ethicists today warned that a revolutionary new tool to cut and splice DNA should be used cautiously when attempting to fix human genetic disease, and strongly discouraged any attempts at making changes to the human genome that could be passed on to offspring.

Among the authors of this warning is Jennifer Doudna, the co-inventor of the technology, called CRISPR-Cas9, which is driving a new interest in gene therapy, or “genome engineering.” She and colleagues co-authored a perspective piece that appears in the March 20 issue of Science, based on discussions at a meeting that took place in Napa on Jan. 24. The same issue of Science features a collection of recent research papers, commentary and news articles on CRISPR and its implications.    …..

A prudent path forward for genomic engineering and germline gene modification

David Baltimore, Paul Berg, …, Jennifer A. Doudna, et al.
http://science.sciencemag.org/content/early/2015/03/18/science.aab1028.full
Science 19 Mar 2015. http://dx.doi.org/10.1126/science.aab1028

 

Correcting genetic defects

Scientists today are changing DNA sequences to correct genetic defects in animals as well as cultured tissues generated from stem cells, strategies that could eventually be used to treat human disease. The technology can also be used to engineer animals with genetic diseases mimicking human disease, which could lead to new insights into previously enigmatic disorders.

The CRISPR-Cas9 tool is still being refined to ensure that genetic changes are precisely targeted, Doudna said. Nevertheless, the authors met “… to initiate an informed discussion of the uses of genome engineering technology, and to identify proactively those areas where current action is essential to prepare for future developments. We recommend taking immediate steps toward ensuring that the application of genome engineering technology is performed safely and ethically.”

 

Amyloid CRISPR Plasmids and si/shRNA Gene Silencers

http://www.scbt.com/crispr/table-amyloid.html

Santa Cruz Biotechnology, Inc. offers a broad range of gene silencers in the form of siRNAs, shRNA Plasmids and shRNA Lentiviral Particles as well as CRISPR/Cas9 Knockout and CRISPR Double Nickase plasmids. Amyloid gene silencers are available as Amyloid siRNA, Amyloid shRNA Plasmid, Amyloid shRNA Lentiviral Particles and Amyloid CRISPR/Cas9 Knockout plasmids. Amyloid CRISPR/dCas9 Activation Plasmids and CRISPR Lenti Activation Systems for gene activation are also available. Gene silencers and activators are useful for gene studies in combination with antibodies used for protein detection.

Amyloid CRISPR Knockout, HDR and Nickase Knockout Plasmids

 

CRISPR-Cas9-Based Knockout of the Prion Protein and Its Effect on the Proteome


Mehrabian M, Brethour D, MacIsaac S, Kim JK, Gunawardana CG, Wang H, et al.
PLoS ONE 2014; 9(12): e114594. http://dx.doi.org/10.1371/journal.pone.0114594

The molecular function of the cellular prion protein (PrPC) and the mechanism by which it may contribute to neurotoxicity in prion diseases and Alzheimer’s disease are only partially understood. Mouse neuroblastoma Neuro2a cells and, more recently, C2C12 myocytes and myotubes have emerged as popular models for investigating the cellular biology of PrP. Mouse epithelial NMuMG cells might become attractive models for studying the possible involvement of PrP in a morphogenetic program underlying epithelial-to-mesenchymal transitions. Here we describe the generation of PrP knockout clones from these cell lines using CRISPR-Cas9 knockout technology. More specifically, knockout clones were generated with two separate guide RNAs targeting recognition sites on opposite strands within the first hundred nucleotides of the Prnp coding sequence. Several PrP knockout clones were isolated and genomic insertions and deletions near the CRISPR-target sites were characterized. Subsequently, deep quantitative global proteome analyses that recorded the relative abundance of >3000 proteins (data deposited to ProteomeXchange Consortium) were undertaken to begin to characterize the molecular consequences of PrP deficiency. The levels of ∼120 proteins were shown to reproducibly correlate with the presence or absence of PrP, with most of these proteins belonging to extracellular components, cell junctions or the cytoskeleton.

http://journals.plos.org/plosone/article/figure/image?size=inline&id=info:doi/10.1371/journal.pone.0114594.g001

http://journals.plos.org/plosone/article/figure/image?size=inline&id=info:doi/10.1371/journal.pone.0114594.g003

 

Development and Applications of CRISPR-Cas9 for Genome Engineering

Patrick D. Hsu,1,2,3 Eric S. Lander,1 and Feng Zhang1,2,*
Cell. 2014 Jun 5; 157(6): 1262–1278.   doi:  10.1016/j.cell.2014.05.010

Recent advances in genome engineering technologies based on the CRISPR-associated RNA-guided endonuclease Cas9 are enabling the systematic interrogation of mammalian genome function. Analogous to the search function in modern word processors, Cas9 can be guided to specific locations within complex genomes by a short RNA search string. Using this system, DNA sequences within the endogenous genome and their functional outputs are now easily edited or modulated in virtually any organism of choice. Cas9-mediated genetic perturbation is simple and scalable, empowering researchers to elucidate the functional organization of the genome at the systems level and establish causal linkages between genetic variations and biological phenotypes. In this Review, we describe the development and applications of Cas9 for a variety of research or translational applications while highlighting challenges as well as future directions. Derived from a remarkable microbial defense system, Cas9 is driving innovative applications from basic biology to biotechnology and medicine.

The development of recombinant DNA technology in the 1970s marked the beginning of a new era for biology. For the first time, molecular biologists gained the ability to manipulate DNA molecules, making it possible to study genes and harness them to develop novel medicine and biotechnology. Recent advances in genome engineering technologies are sparking a new revolution in biological research. Rather than studying DNA taken out of the context of the genome, researchers can now directly edit or modulate the function of DNA sequences in their endogenous context in virtually any organism of choice, enabling them to elucidate the functional organization of the genome at the systems level, as well as identify causal genetic variations.

Broadly speaking, genome engineering refers to the process of making targeted modifications to the genome, its contexts (e.g., epigenetic marks), or its outputs (e.g., transcripts). The ability to do so easily and efficiently in eukaryotic and especially mammalian cells holds immense promise to transform basic science, biotechnology, and medicine (Figure 1).

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4343198/bin/nihms659174f1.jpg

For life sciences research, technologies that can delete, insert, and modify the DNA sequences of cells or organisms enable dissecting the function of specific genes and regulatory elements. Multiplexed editing could further allow the interrogation of gene or protein networks at a larger scale. Similarly, manipulating transcriptional regulation or chromatin states at particular loci can reveal how genetic material is organized and utilized within a cell, illuminating relationships between the architecture of the genome and its functions. In biotechnology, precise manipulation of genetic building blocks and regulatory machinery also facilitates the reverse engineering or reconstruction of useful biological systems, for example, by enhancing biofuel production pathways in industrially relevant organisms or by creating infection-resistant crops. Additionally, genome engineering is stimulating a new generation of drug development processes and medical therapeutics. Perturbation of multiple genes simultaneously could model the additive effects that underlie complex polygenic disorders, leading to new drug targets, while genome editing could directly correct harmful mutations in the context of human gene therapy (Tebas et al., 2014).

Eukaryotic genomes contain billions of DNA bases and are difficult to manipulate. One of the breakthroughs in genome manipulation has been the development of gene targeting by homologous recombination (HR), which integrates exogenous repair templates that contain sequence homology to the donor site (Figure 2A) (Capecchi, 1989). HR-mediated targeting has facilitated the generation of knockin and knockout animal models via manipulation of germline-competent stem cells, dramatically advancing many areas of biological research. However, although HR-mediated gene targeting produces highly precise alterations, the desired recombination events occur extremely infrequently (1 in 10⁶–10⁹ cells) (Capecchi, 1989), presenting enormous challenges for large-scale applications of gene-targeting experiments.

Genome Editing Technologies Exploit Endogenous DNA Repair Machinery

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4343198/bin/nihms659174f2.gif

To overcome these challenges, a series of programmable nuclease-based genome editing technologies have been developed in recent years, enabling targeted and efficient modification of a variety of eukaryotic and particularly mammalian species. Of the current generation of genome editing technologies, the most rapidly developing is the class of RNA-guided endonucleases known as Cas9 from the microbial adaptive immune system CRISPR (clustered regularly interspaced short palindromic repeats), which can be easily targeted to virtually any genomic location of choice by a short RNA guide. Here, we review the development and applications of the CRISPR-associated endonuclease Cas9 as a platform technology for achieving targeted perturbation of endogenous genomic elements and also discuss challenges and future avenues for innovation.   ……

Figure 4   Natural Mechanisms of Microbial CRISPR Systems in Adaptive Immunity

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4343198/bin/nihms659174f4.gif

……  A key turning point came in 2005, when systematic analysis of the spacer sequences separating the individual direct repeats suggested their extrachromosomal and phage-associated origins (Mojica et al., 2005; Pourcel et al., 2005; Bolotin et al., 2005). This insight was tremendously exciting, especially given previous studies showing that CRISPR loci are transcribed (Tang et al., 2002) and that viruses are unable to infect archaeal cells carrying spacers corresponding to their own genomes (Mojica et al., 2005). Together, these findings led to the speculation that CRISPR arrays serve as an immune memory and defense mechanism, and individual spacers facilitate defense against bacteriophage infection by exploiting Watson-Crick base-pairing between nucleic acids (Mojica et al., 2005; Pourcel et al., 2005). Despite these compelling realizations that CRISPR loci might be involved in microbial immunity, the specific mechanism of how the spacers act to mediate viral defense remained a challenging puzzle. Several hypotheses were raised, including thoughts that CRISPR spacers act as small RNA guides to degrade viral transcripts in an RNAi-like mechanism (Makarova et al., 2006) or that CRISPR spacers direct Cas enzymes to cleave viral DNA at spacer-matching regions (Bolotin et al., 2005).   …..

As the pace of CRISPR research accelerated, researchers quickly unraveled many details of each type of CRISPR system (Figure 4). Building on an earlier speculation that protospacer adjacent motifs (PAMs) may direct the type II Cas9 nuclease to cleave DNA (Bolotin et al., 2005), Moineau and colleagues highlighted the importance of PAM sequences by demonstrating that PAM mutations in phage genomes circumvented CRISPR interference (Deveau et al., 2008). Additionally, for types I and II, the lack of PAM within the direct repeat sequence within the CRISPR array prevents self-targeting by the CRISPR system. In type III systems, however, mismatches between the 5′ end of the crRNA and the DNA target are required for plasmid interference (Marraffini and Sontheimer, 2010).  …..

In 2013, a pair of studies simultaneously showed how to successfully engineer type II CRISPR systems from Streptococcus thermophilus (Cong et al., 2013) and Streptococcus pyogenes (Cong et al., 2013; Mali et al., 2013a) to accomplish genome editing in mammalian cells. Heterologous expression of mature crRNA-tracrRNA hybrids (Cong et al., 2013) as well as sgRNAs (Cong et al., 2013; Mali et al., 2013a) directs Cas9 cleavage within the mammalian cellular genome to stimulate NHEJ- or HDR-mediated genome editing. Multiple guide RNAs can also be used to target several genes at once. Since these initial studies, Cas9 has been used by thousands of laboratories for genome editing applications in a variety of experimental model systems (Sander and Joung, 2014). ……
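The targeting rule these studies exploit (for S. pyogenes Cas9, a roughly 20-nt protospacer immediately followed by an NGG PAM) can be sketched as a simple forward-strand scan. This is an illustrative toy on a made-up sequence, not a production guide-design tool, which would also scan the reverse strand and score off-targets:

```python
import re

def find_sgrna_sites(seq, guide_len=20, pam=r"[ACGT]GG"):
    """Scan the forward strand for candidate Cas9 target sites:
    a guide-length protospacer immediately followed by an NGG PAM.
    A zero-width lookahead is used so overlapping sites are reported."""
    pattern = re.compile(r"(?=([ACGT]{%d})(%s))" % (guide_len, pam))
    return [
        {"start": m.start(), "protospacer": m.group(1), "pam": m.group(2)}
        for m in pattern.finditer(seq.upper())
    ]

# Hypothetical 30-nt sequence; one valid site (protospacer + AGG PAM) is found.
demo = "TTGACGCATGCATGCATGCATGCAGGTACG"
for site in find_sgrna_sites(demo):
    print(site)
```

A real design pipeline would additionally check GC content and genome-wide uniqueness of each candidate protospacer.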

The majority of CRISPR-based technology development has focused on the signature Cas9 nuclease from type II CRISPR systems. However, there remains a wide diversity of CRISPR types and functions. Cas RAMP module (Cmr) proteins identified in Pyrococcus furiosus and Sulfolobus solfataricus (Hale et al., 2012) constitute an RNA-targeting CRISPR immune system, forming a complex guided by small CRISPR RNAs that target and cleave complementary RNA instead of DNA. Cmr protein homologs can be found throughout bacteria and archaea, typically relying on a 5′ site tag sequence on the target-matching crRNA for Cmr-directed cleavage.

Unlike RNAi, which is targeted largely by a 6 nt seed region and to a lesser extent 13 other bases, Cmr crRNAs contain 30–40 nt of target complementarity. Cmr-CRISPR technologies for RNA targeting are thus a promising target for orthogonal engineering and minimal off-target modification. Although the modularity of Cmr systems for RNA-targeting in mammalian cells remains to be investigated, Cmr complexes native to P. furiosus have already been engineered to target novel RNA substrates (Hale et al., 2009, 2012).   ……

Although Cas9 has already been widely used as a research tool, a particularly exciting future direction is the development of Cas9 as a therapeutic technology for treating genetic disorders. For a monogenic recessive disorder due to loss-of-function mutations (such as cystic fibrosis, sickle-cell anemia, or Duchenne muscular dystrophy), Cas9 may be used to correct the causative mutation. This has many advantages over traditional methods of gene augmentation that deliver functional genetic copies via viral vector-mediated overexpression—particularly that the newly functional gene is expressed in its natural context. For dominant-negative disorders in which the affected gene is haplosufficient (such as transthyretin-related hereditary amyloidosis or dominant forms of retinitis pigmentosa), it may also be possible to use NHEJ to inactivate the mutated allele to achieve therapeutic benefit. For allele-specific targeting, one could design guide RNAs capable of distinguishing between single-nucleotide polymorphism (SNP) variations in the target gene, such as when the SNP falls within the PAM sequence.
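The allele-specific idea in the last sentence, that a SNP disrupting the NGG PAM on one allele restricts cleavage to the other, can be illustrated with a minimal check. The 23-nt windows below are hypothetical sequences invented for the example, not real disease alleles:

```python
def pam_intact(window23):
    """Given a 23-nt window (20-nt protospacer + 3-nt PAM),
    return True if the PAM matches NGG and Cas9 could cleave here."""
    pam = window23[20:23].upper()
    return len(pam) == 3 and pam[1:] == "GG"

# Hypothetical alleles differing only at the last PAM base: the mutant
# allele keeps an AGG PAM, while a SNP on the wild-type allele changes
# it to AGA, so a guide targeting this window cleaves only the mutant.
mutant   = "GCTAGCTAGCTAGCTAGCTAAGG"
wildtype = "GCTAGCTAGCTAGCTAGCTAAGA"
print(pam_intact(mutant), pam_intact(wildtype))  # → True False
```

The same discrimination logic applies when the SNP falls in the seed region of the protospacer rather than the PAM, though PAM-disrupting SNPs give the cleanest allele selectivity.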

 

 

CRISPR/Cas9: a powerful genetic engineering tool for establishing large animal models of neurodegenerative diseases

Zhuchi Tu, Weili Yang, Sen Yan, Xiangyu Guo and Xiao-Jiang Li

Molecular Neurodegeneration 2015; 10:35  http://dx.doi.org/10.1186/s13024-015-0031-x

Animal models are extremely valuable to help us understand the pathogenesis of neurodegenerative disorders and to find treatments for them. Since large animals are more like humans than rodents, they make good models to identify the important pathological events that may be seen in humans but not in small animals; large animals are also very important for validating effective treatments or confirming therapeutic targets. Due to the lack of embryonic stem cell lines from large animals, it has been difficult to use traditional gene targeting technology to establish large animal models of neurodegenerative diseases. Recently, CRISPR/Cas9 was used successfully to genetically modify genomes in various species. Here we discuss the use of CRISPR/Cas9 technology to establish large animal models that can more faithfully mimic human neurodegenerative diseases.

Neurodegenerative diseases — Alzheimer’s disease (AD), Parkinson’s disease (PD), amyotrophic lateral sclerosis (ALS), Huntington’s disease (HD), and frontotemporal dementia (FTD) — are characterized by age-dependent and selective neurodegeneration. As the life expectancy of humans lengthens, there is a greater prevalence of these neurodegenerative diseases; however, the pathogenesis of most of these neurodegenerative diseases remains unclear, and we lack effective treatments for these important brain disorders.

CRISPR/Cas9,  Non-human primates,  Neurodegenerative diseases,  Animal model

There are a number of excellent reviews covering different types of neurodegenerative diseases and their genetic mouse models [8–12]. Investigations of different mouse models of neurodegenerative diseases have revealed a common pathology shared by these diseases. First, the development of neuropathology and neurological symptoms in genetic mouse models of neurodegenerative diseases is age dependent and progressive. Second, all the mouse models show an accumulation of misfolded or aggregated proteins resulting from the expression of mutant genes. Third, despite the widespread expression of mutant proteins throughout the body and brain, neuronal function appears to be selectively or preferentially affected. All these facts indicate that mouse models of neurodegenerative diseases recapitulate important pathologic features also seen in patients with neurodegenerative diseases.

However, it seems that mouse models cannot recapitulate the full range of neuropathology seen in patients with neurodegenerative diseases. Overt neurodegeneration, which is the most important pathological feature in patient brains, is absent in genetic rodent models of AD, PD, and HD. Many rodent models that express transgenic mutant proteins under the control of different promoters do not replicate overt neurodegeneration, which is likely due to their short life spans and the different aging processes of small animals. Also important are the remarkable differences in brain development between rodents and primates. For example, the mouse brain takes 21 days to fully develop, whereas the formation of primate brains requires more than 150 days [13]. The rapid development of the brain in rodents may render neuronal cells resistant to misfolded protein-mediated neurodegeneration. Another difficulty in using rodent models is how to analyze cognitive and emotional abnormalities, which are the early symptoms of most neurodegenerative diseases in humans. Differences in neuronal circuitry, anatomy, and physiology between rodent and primate brains may also account for the behavioral differences between rodent and primate models.

 

Mitochondrial dynamics–fusion, fission, movement, and mitophagy–in neurodegenerative diseases

Hsiuchen Chen and David C. Chan
Human Molec Gen 2009; 18, Review Issue 2 R169–R176
http://dx.doi.org/10.1093/hmg/ddp326

Neurons are metabolically active cells with high energy demands at locations distant from the cell body. As a result, these cells are particularly dependent on mitochondrial function, as reflected by the observation that diseases of mitochondrial dysfunction often have a neurodegenerative component. Recent discoveries have highlighted that neurons are reliant particularly on the dynamic properties of mitochondria. Mitochondria are dynamic organelles by several criteria. They engage in repeated cycles of fusion and fission, which serve to intermix the lipids and contents of a population of mitochondria. In addition, mitochondria are actively recruited to subcellular sites, such as the axonal and dendritic processes of neurons. Finally, the quality of a mitochondrial population is maintained through mitophagy, a form of autophagy in which defective mitochondria are selectively degraded. We review the general features of mitochondrial dynamics, incorporating recent findings on mitochondrial fusion, fission, transport and mitophagy. Defects in these key features are associated with neurodegenerative disease. Charcot-Marie-Tooth type 2A, a peripheral neuropathy, and dominant optic atrophy, an inherited optic neuropathy, result from a primary deficiency of mitochondrial fusion. Moreover, several major neurodegenerative diseases—including Parkinson’s, Alzheimer’s and Huntington’s disease—involve disruption of mitochondrial dynamics. Remarkably, in several disease models, the manipulation of mitochondrial fusion or fission can partially rescue disease phenotypes. We review how mitochondrial dynamics is altered in these neurodegenerative diseases and discuss the reciprocal interactions between mitochondrial fusion, fission, transport and mitophagy.

 

Applications of CRISPR–Cas systems in Neuroscience

Matthias Heidenreich  & Feng Zhang
Nature Rev Neurosci 2016; 17:36–44   http://dx.doi.org/10.1038/nrn.2015.2

Genome-editing tools, and in particular those based on CRISPR–Cas (clustered regularly interspaced short palindromic repeat (CRISPR)–CRISPR-associated protein) systems, are accelerating the pace of biological research and enabling targeted genetic interrogation in almost any organism and cell type. These tools have opened the door to the development of new model systems for studying the complexity of the nervous system, including animal models and stem cell-derived in vitro models. Precise and efficient gene editing using CRISPR–Cas systems has the potential to advance both basic and translational neuroscience research.
Cellular neuroscience, DNA recombination, Genetic engineering, Molecular neuroscience

Figure 3: In vitro applications of Cas9 in human iPSCs.

http://www.nature.com/nrn/journal/v17/n1/carousel/nrn.2015.2-f3.jpg

a | Evaluation of disease candidate genes from large-population genome-wide association studies (GWASs). Human primary cells, such as neurons, are not easily available and are difficult to expand in culture. By contrast, induced pluripo…

  1. Genome-editing Technologies for Gene and Cell Therapy

Molecular Therapy 12 Jan 2016

  2. Systematic quantification of HDR and NHEJ reveals effects of locus, nuclease, and cell type on genome-editing

Scientific Reports 31 Mar 2016

  3. Controlled delivery of β-globin-targeting TALENs and CRISPR/Cas9 into mammalian cells for genome editing using microinjection

Scientific Reports 12 Nov 2015

 

Alzheimer’s Disease: Medicine’s Greatest Challenge in the 21st Century

https://www.physicsforums.com/insights/can-gene-editing-eliminate-alzheimers-disease/

The development of the CRISPR/Cas9 system has made gene editing a relatively simple task. While CRISPR and other gene editing technologies stand to revolutionize biomedical research and offer many promising therapeutic avenues (such as in the treatment of HIV), a great deal of debate exists over whether CRISPR should be used to modify human embryos. As I discussed in my previous Insight article, we lack enough fundamental biological knowledge to enhance many traits like height or intelligence, so we are not near a future with genetically enhanced super babies. However, scientists have identified a few rare genetic variants that protect against disease. One such protective variant is a mutation in the APP gene that protects against Alzheimer’s disease and cognitive decline in old age. If we can perfect gene editing technologies, is this mutation one that we should be regularly introducing into embryos? In this article, I explore the potential for using gene editing as a way to prevent Alzheimer’s disease in future generations.

[Image caption: Can gene editing be the missing piece in the battle against Alzheimer’s? (Source: bostonbiotech.org)]

I chose to assess the benefit of germline gene editing in the context of Alzheimer’s disease because this disease is one of the biggest challenges medicine faces in the 21st century. Alzheimer’s disease is a chronic neurodegenerative disease responsible for the majority of the cases of dementia in the elderly. The disease symptoms begin with short-term memory loss and progress to more severe symptoms – problems with language, disorientation, mood swings, behavioral issues – eventually leading to the loss of bodily functions and death. Because of the dementia the disease causes, Alzheimer’s patients require a great deal of care, and the world spends ~1% of its total GDP on caring for those with Alzheimer’s and related disorders. Because the prevalence of the disease increases with age, the situation will worsen as life expectancies around the globe increase: worldwide cases of Alzheimer’s are expected to grow from 35 million today to over 115 million by 2050.

Despite much research, the exact causes of Alzheimer’s disease remains poorly understood. The disease seems to be related to the accumulation of plaques made of amyloid-β peptides that form on the outside of neurons, as well as the formation of tangles of the protein tau inside of neurons. Although many efforts have been made to target amyloid-β or the enzymes involved in its formation, we have so far been unsuccessful at finding any treatment that stops the disease or reverses its progress. Some researchers believe that most attempts at treating Alzheimer’s have failed because, by the time a patient shows symptoms, the disease has already progressed past the point of no return.

While research towards a cure continues, researchers have sought effective ways to prevent Alzheimer’s disease. Although some studies show that mental and physical exercise may lower one’s risk of Alzheimer’s disease, approximately 60–80% of the risk for Alzheimer’s disease appears to be genetic. Thus, if we’re serious about prevention, we may have to act at the genetic level. And because the brain is difficult to access surgically for gene therapy in adults, this means using gene editing on embryos.


 

Utilising CRISPR to Generate Predictive Disease Models: a Case Study in Neurodegenerative Disorders


Dr. Bhuvaneish.T. Selvaraj  – Scottish Centre for Regenerative Medicine

http://www.crisprsummit.com/utilising-crispr-to-generate-predictive-disease-models-a-case-study-in-neurodegenerative-disorders

  • Introducing the latest developments in predictive model generation
  • Discover how CRISPR is being used to develop disease models to study and treat neurodegenerative disorders
  • In-depth Q&A session to answer your most pressing questions

 

Turning On Genes, Systematically, with CRISPR/Cas9

http://www.genengnews.com/gen-news-highlights/turning-on-genes-systematically-with-crispr-cas9/81250697/

 

Scientists based at MIT assert that they can reliably turn on any gene of their choosing in living cells. [Feng Zhang and Steve Dixon]

http://www.genengnews.com/media/images/GENHighlight/Dec12_2014_CRISPRCas9GeneActivationSystem7838101231.jpg

With the latest CRISPR/Cas9 advance, the exhortation “turn on, tune in, drop out” comes to mind. The CRISPR/Cas9 gene-editing system was already a well-known means of “tuning in” (inserting new genes) and “dropping out” (knocking out genes). But when it came to “turning on” genes, CRISPR/Cas9 had little potency. That is, it had demonstrated only limited success as a way to activate specific genes.

A new CRISPR/Cas9 approach, however, appears capable of activating genes more effectively than older approaches. The new approach may allow scientists to more easily determine the function of individual genes, according to Feng Zhang, Ph.D., a researcher at MIT and the Broad Institute. Dr. Zhang and colleagues report that the new approach permits multiplexed gene activation and rapid, large-scale studies of gene function.

The new technique was introduced in the December 10 online edition of Nature, in an article entitled, “Genome-scale transcriptional activation by an engineered CRISPR-Cas9 complex.” The article describes how Dr. Zhang, along with the University of Tokyo’s Osamu Nureki, Ph.D., and Hiroshi Nishimasu, Ph.D., overhauled the CRISPR/Cas9 system. The research team based their work on their analysis (published earlier this year) of the structure formed when Cas9 binds to the guide RNA and its target DNA. Specifically, the team used the structure’s 3D shape to rationally improve the system.

In previous efforts to revamp CRISPR/Cas9 for gene activation purposes, scientists had tried to attach the activation domains to either end of the Cas9 protein, with limited success. From their structural studies, the MIT team realized that two small loops of the RNA guide poke out from the Cas9 complex and could be better points of attachment because they allow the activation domains to have more flexibility in recruiting transcription machinery.

Using their revamped system, the researchers activated about a dozen genes that had proven difficult or impossible to turn on using the previous generation of Cas9 activators. Each gene showed at least a twofold boost in transcription, and for many genes, the researchers found multiple orders of magnitude increase in activation.

After investigating single-guide RNA targeting rules for effective transcriptional activation, demonstrating multiplexed activation of 10 genes simultaneously, and upregulating long intergenic noncoding RNA transcripts, the research team decided to undertake a large-scale screen. This screen was designed to identify genes that confer resistance to a melanoma drug called PLX-4720.

“We … synthesized a library consisting of 70,290 guides targeting all human RefSeq coding isoforms to screen for genes that, upon activation, confer resistance to a BRAF inhibitor,” wrote the authors of the Nature paper. “The top hits included genes previously shown to be able to confer resistance, and novel candidates were validated using individual [single-guide RNA] and complementary DNA overexpression.”

A gene signature based on the top screening hits, the authors added, correlated with a gene expression signature of BRAF inhibitor resistance in cell lines and patient-derived samples. It was also suggested that large-scale screens such as the one demonstrated in the current study could help researchers discover new cancer drugs that prevent tumors from becoming resistant.
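Screens of this kind are typically scored by comparing normalized guide abundance before and after drug selection; the exact pipeline behind the Nature paper is not described here, so the following is a generic log2 fold-change sketch with invented read counts and hypothetical guide names:

```python
import math

def log2_enrichment(before, after, pseudocount=1.0):
    """Per-guide log2 fold change of normalized read counts,
    after selection vs before. A pseudocount avoids log(0)."""
    total_before = sum(before.values())
    total_after = sum(after.values())
    lfc = {}
    for guide, count in before.items():
        frac_before = (count + pseudocount) / total_before
        frac_after = (after.get(guide, 0) + pseudocount) / total_after
        lfc[guide] = math.log2(frac_after / frac_before)
    return lfc

# Hypothetical counts: guide_EGFR expands under BRAF-inhibitor selection,
# suggesting its target gene confers resistance when activated.
before = {"guide_EGFR": 100, "guide_CTRL": 100, "guide_GFP": 100}
after  = {"guide_EGFR": 800, "guide_CTRL": 90,  "guide_GFP": 60}
lfc = log2_enrichment(before, after)
print(max(lfc, key=lfc.get))  # → guide_EGFR, the top screening hit
```

Real screen analyses (e.g. with replicate samples) add statistical tests on top of this fold-change core, but the ranking logic is the same.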

More at –  http://www.genengnews.com/gen-news-highlights/turning-on-genes-systematically-with-crispr-cas9/81250697/

 

Susceptibility and modifier genes in Portuguese transthyretin V30M amyloid polyneuropathy: complexity in a single-gene disease
Miguel L. Soares1,2, Teresa Coelho3,6, Alda Sousa4,5, …, Maria Joa˜o Saraiva2,5 and Joel N. Buxbaum1
Human Molec Gen 2005; 14(4): 543–553   http://dx.doi.org/10.1093/hmg/ddi051
https://www.researchgate.net/profile/Isabel_Conceicao/publication/8081351_Susceptibility_and_modifier_genes_in_Portuguese_transthyretin_V30M_amyloid_polyneuropathy_complexity_in_a_single-gene_disease/links/53e123d70cf2235f352733b3.pdf

Familial amyloid polyneuropathy type I is an autosomal dominant disorder caused by mutations in the transthyretin (TTR ) gene; however, carriers of the same mutation exhibit variability in penetrance and clinical expression. We analyzed alleles of candidate genes encoding non-fibrillar components of TTR amyloid deposits and a molecule metabolically interacting with TTR [retinol-binding protein (RBP)], for possible associations with age of disease onset and/or susceptibility in a Portuguese population sample with the TTR V30M mutation and unrelated controls. We show that the V30M carriers represent a distinct subset of the Portuguese population. Estimates of genetic distance indicated that the controls and the classical onset group were furthest apart, whereas the late-onset group appeared to differ from both. Importantly, the data also indicate that genetic interactions among the multiple loci evaluated, rather than single-locus effects, are more likely to determine differences in the age of disease onset. Multifactor dimensionality reduction indicated that the best genetic model for classical onset group versus controls involved the APCS gene, whereas for late-onset cases, one APCS variant (APCSv1) and two RBP variants (RBPv1 and RBPv2) are involved. Thus, although the TTR V30M mutation is required for the disease in Portuguese patients, different genetic factors may govern the age of onset, as well as the occurrence of anticipation.

Autosomal dominant disorders may vary in expression even within a given kindred. The basis of this variability is uncertain and can be attributed to epigenetic factors, environment or epistasis. We have studied familial amyloid polyneuropathy (FAP), an autosomal dominant disorder characterized by peripheral sensorimotor and autonomic neuropathy. It exhibits variation in cardiac, renal, gastrointestinal and ocular involvement, as well as age of onset. Over 80 missense mutations in the transthyretin gene (TTR) result in autosomal dominant disease (http://www.ibmc.up.pt/~mjsaraiv/ttrmut.html). The presence of deposits consisting entirely of wild-type TTR molecules in the hearts of 10–25% of individuals over age 80 reveals its inherent in vivo amyloidogenic potential (1).

FAP was initially described in Portuguese (2) where, until recently, the TTR V30M has been the only pathogenic mutation associated with the disease (3,4). Later reports identified the same mutation in Swedish and Japanese families (5,6). The disorder has since been recognized in other European countries and in North American kindreds in association with V30M, as well as other mutations (7).

TTR V30M produces disease in only 5–10% of Swedish carriers of the allele (8), a much lower degree of penetrance than that seen in Portuguese carriers (80%) (9) or in Japanese carriers of the same mutation. The actual penetrance in Japanese carriers has not been formally established, but appears to resemble that seen in Portuguese. Portuguese and Japanese carriers show considerable variation in the age of clinical onset (10,11). In both populations, the first symptoms had originally been described as typically occurring before age 40 (so-called ‘classical’ or early-onset); however, in recent years, more individuals developing symptoms late in life have been identified (11,12). Hence, present data indicate that the distribution of the age of onset in Portuguese is continuous, but asymmetric with a mean around age 35 and a long tail into the older age group (Fig. 1) (9,13). Further, DNA testing in Portugal has identified asymptomatic carriers over age 70 belonging to a subset of very late-onset kindreds in whose descendants genetic anticipation is frequent. The molecular basis of anticipation in FAP, which is not mediated by trinucleotide repeat expansions in the TTR or any other gene (14), remains elusive.

Variation in penetrance, age of onset and clinical features are hallmarks of many autosomal dominant disorders including the human TTR amyloidoses (7). Some of these clearly reflect specific biological effects of a particular mutation or a class of mutants. However, when such phenotypic variability is seen with a single mutation in the gene encoding the same protein, it suggests an effect of modifying genetic loci and/or environmental factors contributing differentially to the course of disease. We have chosen to examine age of onset as an example of a discrete phenotypic variation in the presence of the particular autosomal dominant disease-associated mutation TTR V30M. Although the role of environmental factors cannot be excluded, the existence of modifier genes involved in TTR amyloidogenesis is an attractive hypothesis to explain the phenotypic variability in FAP. ….

ATTR (TTR amyloid), like all amyloid deposits, contains several molecular components, in addition to the quantitatively dominant fibril-forming amyloid protein, including heparan sulfate proteoglycan 2 (HSPG2 or perlecan), SAP, a plasma glycoprotein of the pentraxin family (encoded by the APCS gene) that undergoes specific calcium-dependent binding to all types of amyloid fibrils, and apolipoprotein E (ApoE), also found in all amyloid deposits (15). The ApoE4 isoform is associated with an increased frequency and earlier onset of Alzheimer’s disease (Aβ), the most common form of brain amyloid, whereas the ApoE2 isoform appears to be protective (16). ApoE variants could exert a similar modulatory effect in the onset of FAP, although early studies on a limited number of patients suggested this was not the case (17).

In at least one instance of senile systemic amyloidosis, small amounts of AA-related material were found in TTR deposits (18). These could reflect either a passive co-aggregation or a contributory involvement of protein AA, encoded by the serum amyloid A (SAA ) genes and the main component of secondary (reactive) amyloid fibrils, in the formation of ATTR.

Retinol-binding protein (RBP), the serum carrier of vitamin A, circulates in plasma bound to TTR. Vitamin A-loaded RBP and L-thyroxine, the two natural ligands of TTR, can act alone or synergistically to inhibit the rate and extent of TTR fibrillogenesis in vitro, suggesting that RBP may influence the course of FAP pathology in vivo (19). We have analyzed coding and non-coding sequence polymorphisms in the RBP4 (serum RBP, 10q24), HSPG2 (1p36.1), APCS (1q22), APOE (19q13.2), SAA1 and SAA2 (11p15.1) genes with the goal of identifying chromosomes carrying common and functionally significant variants. At the time these studies were performed, the full human genome sequence was not completed and systematic single-nucleotide polymorphism (SNP) analyses were not available for any of the suspected candidate genes. We identified new SNPs in APCS and RBP4 and utilized polymorphisms in SAA, HSPG2 and APOE that had already been characterized and shown to have potential pathophysiologic significance in other disorders (16,20–22). The genotyping data were analyzed for association with the presence of the V30M amyloidogenic allele (FAP patients versus controls) and with the age of onset (classical- versus late-onset patients). Multilocus analyses were also performed to examine the effects of simultaneous contributions of the six loci for determining the onset of the first symptoms.  …..
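As a toy illustration of the single-locus association tests described above (not the authors' actual multilocus or MDR analysis), a 2x2 chi-square on hypothetical genotype counts can be computed with the standard closed-form formula:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 contingency table
    [[a, b], [c, d]], e.g. variant carriers vs non-carriers (rows)
    by classical- vs late-onset class (columns). With 1 degree of
    freedom, chi2 > 3.84 corresponds to p < 0.05."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts of a candidate variant among classical-onset
# (30 carriers, 20 non-carriers) vs late-onset (10 carriers, 40 non-carriers).
chi2 = chi_square_2x2(30, 20, 10, 40)
print(round(chi2, 2))  # → 16.67, well above 3.84, i.e. a significant association
```

Genuine analyses of this kind would use exact tests for small cells and correct for the multiple loci being tested.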

The potential for different underlying models for classical and late onset is supported by the MDR analysis, which produces two distinct models when comparing each class with the controls. One could view the two onset classes as unique diseases. If this is the case, then the failure to detect a single predictive genetic model is consistent with two related, but different, diseases. This is exactly what would be expected in such a case of genetic heterogeneity (28). Using this approach, a major gene effect can be viewed as a necessary, but not sufficient, condition to explain the course of the disease. Analyzing the cases but omitting from the analysis of phenotype the necessary allele, in this case TTR V30M, can then reveal a variety of important modifiers that are distinct between the phenotypes.

The significant comparisons obtained in our study cohort indicate that the combined effects mainly result from two and three-locus interactions involving all loci except SAA1 and SAA2 for susceptibility to disease. A considerable number of four-site combinations modulate the age of onset with SAA1 appearing in a majority of significant combinations in late-onset disease, perhaps indicating a greater role of the SAA variants in the age of onset of FAP.

The correlation between genotype and phenotype in so-called simple Mendelian disorders is often incomplete, as only a subset of all mutations can reliably predict specific phenotypes (34). This is because non-allelic genetic variations and/or environmental influences underlie these disorders whose phenotypes behave as complex traits. A few examples include the identification of the role of homozygosity for the SAA1.1 allele in conferring the genetic susceptibility to renal amyloidosis in FMF (20) and the association of an insertion/deletion polymorphism in the ACE gene with disease severity in familial hypertrophic cardiomyopathy (35). In these disorders, the phenotypes arise from mutations in MEFV and β-MHC, but are modulated by independently inherited genetic variation. In this report, we show that interactions among multiple genes, whose products are confirmed or putative constituents of ATTR deposits, or metabolically interact with TTR, modulate the onset of the first symptoms and predispose individuals to disease in the presence of the V30M mutation in TTR. The exact nature of the effects identified here requires further study with potential application in the development of genetic screening with prognostic value pertaining to the onset of disease in the TTR V30M carriers.

If the effects of additional single or interacting genes dictate the heterogeneity of phenotype, as reflected in variability of onset and clinical expression (with the same TTR mutation), the products encoded by alleles at such loci could contribute to the process of wild-type TTR deposition in elderly individuals without a mutation (senile systemic amyloidosis), a phenomenon not readily recognized as having a genetic basis because of the insensitivity of family history in the elderly.

 

Safety and Efficacy of RNAi Therapy for Transthyretin Amyloidosis

Coelho T, Adams D, Silva A, et al.
N Engl J Med 2013;369:819-29.    http://dx.doi.org/10.1056/NEJMoa1208760

Transthyretin amyloidosis is caused by the deposition of hepatocyte-derived transthyretin amyloid in peripheral nerves and the heart. A therapeutic approach mediated by RNA interference (RNAi) could reduce the production of transthyretin.

Methods We identified a potent antitransthyretin small interfering RNA, which was encapsulated in two distinct first- and second-generation formulations of lipid nanoparticles, generating ALN-TTR01 and ALN-TTR02, respectively. Each formulation was studied in a single-dose, placebo-controlled phase 1 trial to assess safety and effect on transthyretin levels. We first evaluated ALN-TTR01 (at doses of 0.01 to 1.0 mg per kilogram of body weight) in 32 patients with transthyretin amyloidosis and then evaluated ALN-TTR02 (at doses of 0.01 to 0.5 mg per kilogram) in 17 healthy volunteers.

Results Rapid, dose-dependent, and durable lowering of transthyretin levels was observed in the two trials. At a dose of 1.0 mg per kilogram, ALN-TTR01 suppressed transthyretin, with a mean reduction at day 7 of 38%, as compared with placebo (P=0.01); levels of mutant and nonmutant forms of transthyretin were lowered to a similar extent. For ALN-TTR02, the mean reductions in transthyretin levels at doses of 0.15 to 0.3 mg per kilogram ranged from 82.3 to 86.8%, with reductions of 56.6 to 67.1% at 28 days (P<0.001 for all comparisons). These reductions were shown to be RNAi mediated. Mild-to-moderate infusion-related reactions occurred in 20.8% and 7.7% of participants receiving ALN-TTR01 and ALN-TTR02, respectively.

Conclusions ALN-TTR01 and ALN-TTR02 suppressed the production of both mutant and nonmutant forms of transthyretin, establishing proof of concept for RNAi therapy targeting messenger RNA transcribed from a disease-causing gene.

 

Alnylam May Seek Approval for TTR Amyloidosis Rx in 2017 as Other Programs Advance


https://www.genomeweb.com/rnai/alnylam-may-seek-approval-ttr-amyloidosis-rx-2017-other-programs-advance

Officials from Alnylam Pharmaceuticals last week provided updates on the two drug candidates from the company’s flagship transthyretin-mediated amyloidosis program, stating that the intravenously delivered agent patisiran is proceeding toward a possible market approval in three years, while a subcutaneously administered version called ALN-TTRsc is poised to enter Phase III testing before the end of the year.

Meanwhile, Alnylam is set to advance a handful of preclinical therapies into human studies in short order, including ones for complement-mediated diseases, hypercholesterolemia, and porphyria.

The officials made their comments during a conference call held to discuss Alnylam’s second-quarter financial results.

ATTR, which is characterized by the accumulation of amyloid deposits in various tissues, is caused by a mutation in the TTR gene; the gene normally produces a protein that acts as a carrier for retinol binding protein. Alnylam’s drugs are designed to silence both the mutant and wild-type forms of TTR.

Patisiran, which is delivered using lipid nanoparticles developed by Tekmira Pharmaceuticals, is currently in a Phase III study in patients with a form of ATTR called familial amyloid polyneuropathy (FAP) affecting the peripheral nervous system. Running at over 20 sites in nine countries, that study is set to enroll up to 200 patients and compare treatment to placebo based on improvements in neuropathy symptoms.

According to Alnylam Chief Medical Officer Akshay Vaishnaw, Alnylam expects to have final data from the study in two to three years, which would put patisiran on track for a new drug application filing in 2017.

Meanwhile, ALN-TTRsc, which is under development for a version of ATTR that affects cardiac tissue called familial amyloidotic cardiomyopathy (FAC) and uses Alnylam’s proprietary GalNAc conjugate delivery technology, is set to enter Phase III by year-end as Alnylam holds “active discussions” with US and European regulators on the design of that study, CEO John Maraganore noted during the call.

In the interim, Alnylam continues to enroll patients in a pilot Phase II study of ALN-TTRsc, which is designed to test the drug’s efficacy for FAC or senile systemic amyloidosis (SSA), a condition caused by the idiopathic accumulation of wild-type TTR protein in the heart.

Based on “encouraging” data thus far, Vaishnaw said that Alnylam has upped the expected enrollment in this study to 25 patients from 15. Available data from the trial is slated for release in November, he noted, stressing that “any clinical endpoint result needs to be considered exploratory given the small sample size and the very limited duration of treatment of only six weeks” in the trial.

Vaishnaw added that an open-label extension (OLE) study for patients in the ALN-TTRsc study will kick off in the coming weeks, allowing the company to gather long-term dosing tolerability and clinical activity data on the drug.

Enrollment in an OLE study of patisiran has been completed with 27 patients, he said, and, “as of today, with up to nine months of therapy … there have been no study drug discontinuations.” Clinical endpoint data from approximately 20 patients in this study will be presented at the American Neurological Association meeting in October.

As part of its ATTR efforts, Alnylam has also been conducting natural history of disease studies in both FAP and FAC patients. Data from the 283-patient FAP study was presented earlier this year and showed a rapid progression in neuropathy impairment scores and a high correlation of this measurement with disease severity.

During last week’s conference call, Vaishnaw said that clinical endpoint and biomarker data on about 400 patients with either FAC or SSA have already been collected in a natural history study on cardiac ATTR. Maraganore said that these findings would likely be released sometime next year.

Alnylam Presents New Phase II, Preclinical Data from TTR Amyloidosis Programs
https://www.genomeweb.com/rnai/alnylam-presents-new-phase-ii-preclinical-data-ttr-amyloidosis-programs

 

Amyloid disease drug approved

Nature Biotechnology 2012; 30(2):121    http://dx.doi.org/10.1038/nbt0212-121b

The first medication for a rare and often fatal protein misfolding disorder has been approved in Europe. On November 16, the European Commission gave a green light to Pfizer’s Vyndaqel (tafamidis) for treating transthyretin amyloidosis in adult patients with stage 1 polyneuropathy symptoms. [Jeffery Kelly, La Jolla]

 

 

Alnylam’s RNAi therapy targets amyloid disease

Ken Garber
Nature Biotechnology 2015; 33:577    http://dx.doi.org/10.1038/nbt0615-577a

RNA interference’s silencing of target genes could result in potent therapeutics.


The most clinically advanced RNA interference (RNAi) therapeutic achieved a milestone in April when Alnylam Pharmaceuticals in Cambridge, Massachusetts, reported positive results for patisiran, a small interfering RNA (siRNA) oligonucleotide targeting transthyretin for treating familial amyloidotic polyneuropathy (FAP).  …


 

Translational Neuroscience: Toward New Therapies

https://books.google.com/books?isbn=0262029863

Karoly Nikolich, Steven E. Hyman – 2015 – Medical

Tafamidis for Transthyretin Familial Amyloid Polyneuropathy: A Randomized, Controlled Trial. … Multiplex Genome Engineering Using CRISPR/Cas Systems.

 

Is CRISPR a Solution to Familial Amyloid Polyneuropathy?

Author and Curator: Larry H. Bernstein, MD, FCAP

Originally published as

https://pharmaceuticalintelligence.com/2016/04/13/is-crispr-a-solution-to-familial-amyloid-polyneuropathy/

 

http://scholar.aci.info/view/1492518a054469f0388/15411079e5a00014c3d

FAP is characterized by the systemic deposition of amyloidogenic variants of the transthyretin protein, especially in the peripheral nervous system, causing a progressive sensory and motor polyneuropathy.

FAP is caused by a mutation of the TTR gene, located on human chromosome 18q12.1-11.2.[5] A replacement of valine by methionine at position 30 (TTR V30M) is the mutation most commonly found in FAP.[1] The variant TTR is mostly produced by the liver. The transthyretin protein is a tetramer.    ….

 

 

Read Full Post »

Boston Scientific implant designed to occlude the heart’s left atrial appendage implicated with embolization – Device Sales in Europe halts

Reporter: Aviva Lev-Ari, PhD, RN

 

Boston Scientific halts EU sales of next-gen Watchman FLX anti-stroke device

Boston Scientific (NYSE:BSX) reportedly halted European sales of the next generation of its anti-stroke device, the Watchman FLX, after receiving reports of device embolization.

Spokeswoman Trish Backes told TCTMD that there were 6 device embolizations in 207 (2.9%) European implantations of the Watchman FLX, an implant designed to occlude the heart’s left atrial appendage. One of those patients died from complications related to an infection suffered after the device was retrieved.

The 1st-generation Watchman device showed a 30-day embolization rate of 0 to 0.7% in trials, and a post-approval registry called Ewolution showed a rate of 0.2%. The Watchman FLX device won CE Mark approval in the European Union last November; the original iteration won FDA approval in March 2015.
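The gap between these rates is easy to check from the figures reported here; a quick sketch (all numbers taken from the text above):

```python
# Embolization rates for the Watchman devices, using the figures reported above.
flx_events, flx_implants = 6, 207       # Watchman FLX embolizations in Europe
flx_rate = flx_events / flx_implants
ewolution_rate = 0.002                  # 0.2% rate in the Ewolution post-approval registry

print(f"Watchman FLX embolization rate: {flx_rate:.1%}")   # ~2.9%, as reported
print(f"Ratio vs. Ewolution registry: {flx_rate / ewolution_rate:.0f}x")
```

Even the upper end of the 1st-generation trial range (0.7%) is roughly a quarter of the FLX rate, which is what prompted the sales halt.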

Watchman FLX will be taken off the shelves until Boston Scientific can determine what’s causing the unexpectedly high embolism rate, Backes told the website.

“With [the original] Watchman, we’re really confident. We’ve seen really low embolization rates,” she said. “With the robust clinical training program that we have in place for physicians before they start implanting the device, we feel really good about that. This doesn’t impact what we’re doing in the U.S. or what we’re doing with the current Watchman device. It’s not raising any concerns for us for the current device.”

Medical officers with the Marlborough, Mass.-based company, speaking at the annual conference of the American College of Cardiology, said they’ll look at whether physician training or implant technique are factors. The company said the sales halt for Watchman FLX will not affect its structural heart sales forecast of $175 million to $200 million this year.

Boston Scientific said earlier this week at ACC 2016 that a review of the 1st 1,000 Watchman patients found results similar to those in pre-market trials.

Material from Reuters was used in this report.

SOURCE

http://www.massdevice.com/boston-scientific-halts-eu-sales-of-next-gen-watchman-flx-anti-stroke-device/

Read Full Post »

Human Factor Engineering: New Regulations Impact Drug Delivery, Device Design And Human Interaction

Curator: Stephen J. Williams, Ph.D.

The Institute of Medicine report brought medical errors to the forefront of healthcare and the American public (Kohn, Corrigan, & Donaldson, 1999), estimating that between 44,000 and 98,000 Americans die each year as a result of medical errors.

An obstetric nurse connects a bag of pain medication intended for an epidural catheter to the mother’s intravenous (IV) line, resulting in a fatal cardiac arrest. Newborns in a neonatal intensive care unit are given full-dose heparin instead of low-dose flushes, leading to three deaths from intracranial bleeding. An elderly man experiences cardiac arrest while hospitalized, but when the code blue team arrives, they are unable to administer a potentially life-saving shock because the defibrillator pads and the defibrillator itself cannot be physically connected.

Human factors engineering is the discipline that attempts to identify and address these issues. It is the discipline that takes into account human strengths and limitations in the design of interactive systems that involve people, tools and technology, and work environments to ensure safety, effectiveness, and ease of use.

 

FDA says drug delivery devices need human factors validation testing

Several drug delivery devices are on a draft list of med tech that will be subject to a final guidance calling for the application of human factors and usability engineering to medical devices. The guidance calls for validation testing of devices, with data to be collected through interviews, observation, knowledge testing, and in some cases, usability testing of a device under actual conditions of use. The drug delivery devices on the list include anesthesia machines, autoinjectors, dialysis systems, infusion pumps (including implanted ones), hemodialysis systems, insulin pumps and negative pressure wound therapy devices intended for home use. Studies have consistently shown that patients struggle to properly use drug delivery devices such as autoinjectors, which are becoming increasingly prevalent due to the rise of self-administered injectable biologics. The trend toward home healthcare is another driver of usability issues on the patient side, while professionals sometimes struggle with unclear interfaces or instructions for use.

 

Human-factors engineering, also called ergonomics or human engineering, is the science dealing with the application of information on physical and psychological characteristics to the design of devices and systems for human use. (For more detail, see Britannica.com.)

The term human-factors engineering is used to designate equally a body of knowledge, a process, and a profession. As a body of knowledge, human-factors engineering is a collection of data and principles about human characteristics, capabilities, and limitations in relation to machines, jobs, and environments. As a process, it refers to the design of machines, machine systems, work methods, and environments to take into account the safety, comfort, and productiveness of human users and operators. As a profession, human-factors engineering includes a range of scientists and engineers from several disciplines that are concerned with individuals and small groups at work.

The terms human-factors engineering and human engineering are used interchangeably on the North American continent. In Europe, Japan, and most of the rest of the world the prevalent term is ergonomics, a word made up of the Greek words, ergon, meaning “work,” and nomos, meaning “law.” Despite minor differences in emphasis, the terms human-factors engineering and ergonomics may be considered synonymous. Human factors and human engineering were used in the 1920s and ’30s to refer to problems of human relations in industry, an older connotation that has gradually dropped out of use. Some small specialized groups prefer such labels as bioastronautics, biodynamics, bioengineering, and manned-systems technology; these represent special emphases whose differences are much smaller than the similarities in their aims and goals.

The data and principles of human-factors engineering are concerned with human performance, behaviour, and training in man-machine systems; the design and development of man-machine systems; and systems-related biological or medical research. Because of its broad scope, human-factors engineering draws upon parts of such social or physiological sciences as anatomy, anthropometry, applied physiology, environmental medicine, psychology, sociology, and toxicology, as well as parts of engineering, industrial design, and operations research.

Source: Britannica.com

The human-factors approach to design

Two general premises characterize the approach of the human-factors engineer in practical design work. The first is that the engineer must solve the problems of integrating humans into machine systems by rigorous scientific methods and not rely on logic, intuition, or common sense. In the past the typical engineer tended either to ignore the complex and unpredictable nature of human behaviour or to deal with it summarily with educated guesses. Human-factors engineers have tried to show that with appropriate techniques it is possible to identify man-machine mismatches and that it is usually possible to find workable solutions to these mismatches through the use of methods developed in the behavioral sciences.

The second important premise of the human-factors approach is that, typically, design decisions cannot be made without a great deal of trial and error. There are only a few thousand human-factors engineers, while the world’s many thousands of engineers are designing novel machines, machine systems, and environments much faster than behavioral scientists can accumulate data on how humans will respond to them. More problems, therefore, are created than there are ready answers for them, and the human-factors specialist is almost invariably forced to resort to trying things out with various degrees of rigour to find solutions. Thus, while human-factors engineering aims at substituting scientific method for guesswork, its specific techniques are usually empirical rather than theoretical.

The Man-Machine Model: Human-factors engineers regard humans as an element in systems

The simple man-machine model provides a convenient way for organizing some of the major concerns of human engineering: the selection and design of machine displays and controls; the layout and design of workplaces; design for maintainability; and the work environment.

Components of the Man-Machine Model

  1. The human operator first has to sense what is referred to as a machine display, a signal that tells him something about the condition or functioning of the machine.
  2. Having sensed the display, the operator interprets it, perhaps performs some computation, and reaches a decision. In so doing, the worker may use a number of human abilities. Psychologists commonly refer to these activities as higher mental functions; human-factors engineers generally refer to them as information processing.
  3. Having reached a decision, the human operator normally takes some action. This action is usually exercised on some kind of a control—a pushbutton, lever, crank, pedal, switch, or handle.
  4. Action upon one or more of these controls exerts an influence on the machine and on its output, which in turn changes the display, so that the cycle is continuously repeated.
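The steps above form a continuous feedback loop. A minimal sketch in Python (the proportional decision rule, target value, and machine model are illustrative assumptions, not any specific device):

```python
# Minimal sketch of the man-machine feedback cycle described above:
# display -> sense -> interpret/decide -> act on control -> machine output -> display ...
# All names and values here are illustrative stand-ins, not from any real system.

def operator_decides(display_value, target=100.0):
    """Step 2: interpret the display and reach a decision (a simple proportional rule)."""
    return 0.5 * (target - display_value)  # positive = push the control up

def machine_responds(state, control_input):
    """Step 4: the control input influences the machine, changing its output."""
    return state + control_input

state = 80.0  # initial machine output shown on the display
for cycle in range(5):
    display = state                      # Step 1: the display reflects machine state
    action = operator_decides(display)   # Steps 2-3: decision, exercised on a control
    state = machine_responds(state, action)
    print(f"cycle {cycle}: display={display:.1f}, action={action:+.2f}")
# The displayed value converges toward the target as the cycle repeats.
```

Driving an automobile, discussed next, is exactly this loop with the speedometer as display and the accelerator, wheel, and brake as controls.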

 

Driving an automobile is a familiar example of a simple man-machine system. In driving, the operator receives inputs from outside the vehicle (sounds and visual cues from traffic, obstructions, and signals) and from displays inside the vehicle (such as the speedometer, fuel indicator, and temperature gauge). The driver continually evaluates this information, decides on courses of action, and translates those decisions into actions upon the vehicle’s controls—principally the accelerator, steering wheel, and brake. Finally, the driver is influenced by such environmental factors as noise, fumes, and temperature.

 


How BD Uses Human Factors to Design Drug-Delivery Systems

Posted in Design Services by Jamie Hartford on August 30, 2013

 Human factors testing has been vital to the success of the company’s BD Physioject Disposable Autoinjector.

Improving the administration and compliance of drug delivery is a common lifecycle strategy employed to enhance short- and long-term product adoption in the biotechnology and pharmaceutical industries. With increased competition in the industry and heightened regulatory requirements for end-user safety, significant advances in product improvements have been achieved in the injectable market, for both healthcare professionals and patients. Injection devices that facilitate preparation, ease administration, and ensure safety are increasingly prevalent in the marketplace.

Traditionally, human factors engineering addresses individualized aspects of development for each self-injection device, including the following:

  • Task analysis and design.
  • Device evaluation and usability.
  • Patient acceptance, compliance, and concurrence.
  • Anticipated training and education requirements.
  • System resilience and failure.

To achieve this, human factors scientists and engineers study the disease, patient, and desired outcome across multiple domains, including cognitive and organizational psychology, industrial and systems engineering, human performance, and economic theory—including formative usability testing that starts with the exploratory stage of the device and continues through all stages of conceptual design. Validation testing performed with real users is conducted as the final stage of the process.

To design the BD Physioject Disposable Autoinjector System, BD conducted multiple human factors studies and clinical studies to assess all aspects of performance safety, efficiency, patient acceptance, and ease of use, including pain perception compared with prefilled syringes [5]. The studies provided essential insights regarding the overall user-product interface and highlighted that patients had a strong and positive response to both the product design and the user experience.

As a result of human factors testing, the BD Physioject Disposable Autoinjector System provides multiple features designed to aid patient safety and ease of use, allowing the patient to control the start of the injection once the autoinjector is placed on the skin and the cap is removed. Specific design features included in the BD Physioject Disposable Autoinjector System include the following:

  • Ergonomic design that is easy to handle and use, especially in patients with limited dexterity.
  • A 360° view of the drug and injection process, allowing the patient to confirm full dose delivery.
  • A simple, one-touch injection button for activation.
  • A hidden needle before and during injection to reduce needle-stick anxiety.
  • A protected needle before and after injection to reduce the risk of needle stick injury.

 

YouTube VIDEO: Integrating Human Factors Engineering (HFE) into Drug Delivery

 


The following is a slideshare presentation on Parenteral Drug Delivery Issues in the Future

 The Dangers of Medical Devices

The FDA receives on average 100,000 medical device incident reports per year, and more than a third involve user error.

In an FDA recall study, 44% of medical device recalls are due to design problems, and user error is often linked to the poor design of a product.

Drug developers need to take safe drug dosage into consideration, and this consideration requires the application of thorough processes for Risk Management and Human Factors Engineering (HFE).

Although unintended, medical devices can sometimes harm patients or the people administering the healthcare. The potential harm arises from two main sources:

  1. failure of the device, and
  2. actions of the user (user-related errors). A number of factors can lead to these user-induced errors, including that medical devices are often used under stressful conditions and that users may think differently than the device designer.

Human Factors: Identifying the Root Causes of Use Errors

Instead of blaming test participants for use errors, look more carefully at your device’s design.

Great posting on reasons typical design flaws creep up in medical devices and where a company should integrate fixes in product design.
Posted in Design Services by Jamie Hartford on July 8, 2013

 

 

YouTube VIDEO: Integrating Human Factors Engineering into Medical Devices

 

 


 Regulatory Considerations

  • Unlike other medication dosage forms, combination products require user interaction
  •  Combination products are unique in that their safety profile and product efficacy depend on user interaction

Human Factors Review: FDA Outlines Highest Priority Devices

Posted 02 February 2016 by Zachary Brennan on http://www.raps.org/Regulatory-Focus/News/2016/02/02/24233/Human-Factors-Review-FDA-Outlines-Highest-Priority-Devices/

The US Food and Drug Administration (FDA) on Tuesday released new draft guidance to inform medical device manufacturers which device types should have human factors data included in premarket submissions, as well as final guidance from 2011 on applying human factors and usability engineering to medical devices.

FDA said it believes these device types have “clear potential for serious harm resulting from use error and that review of human factors data in premarket submissions will help FDA evaluate the safety and effectiveness and substantial equivalence of these devices.”

Manufacturers should provide FDA with a report that summarizes the human factors or usability engineering processes they have followed, including any preliminary analyses and evaluations and human factors validation testing, results and conclusions, FDA says.

The list was based on knowledge obtained through Medical Device Reporting (MDRs) and recall data, and includes:

  • Ablation generators (associated with ablation systems, e.g., LPB, OAD, OAE, OCM, OCL)
  • Anesthesia machines (e.g., BSZ)
  • Artificial pancreas systems (e.g., OZO, OZP, OZQ)
  • Auto injectors (when CDRH is lead Center; e.g., KZE, KZH, NSC)
  • Automated external defibrillators
  • Duodenoscopes (on the reprocessing; e.g., FDT) with elevator channels
  • Gastroenterology-urology endoscopic ultrasound systems (on the reprocessing; e.g., ODG) with elevator channels
  • Hemodialysis and peritoneal dialysis systems (e.g., FKP, FKT, FKX, KDI, KPF, ODX, ONW)
  • Implanted infusion pumps (e.g., LKK, MDY)
  • Infusion pumps (e.g., FRN, LZH, MEA, MRZ)
  • Insulin delivery systems (e.g., LZG, OPP)
  • Negative-pressure wound therapy (e.g., OKO, OMP) intended for home use
  • Robotic catheter manipulation systems (e.g., DXX)
  • Robotic surgery devices (e.g., NAY)
  • Ventilators (e.g., CBK, NOU, ONZ)
  • Ventricular assist devices (e.g., DSQ, PCK)

Final Guidance

In addition to the draft list, FDA finalized guidance from 2011 on applying human factors and usability engineering to medical devices.

The agency said it received over 600 comments on the draft guidance, which deals mostly with design and user interface, “which were generally supportive of the draft guidance document, but requested clarification in a number of areas. The most frequent types of comments requested revisions to the language or structure of the document, or clarification on risk mitigation and human factors testing methods, user populations for testing, training of test participants, determining the appropriate sample size in human factors testing, reporting of testing results in premarket submissions, and collecting human factors data as part of a clinical study.”

In response to these comments, FDA said it revised the guidance, which supersedes guidance from 2000 entitled “Medical Device Use-Safety: Incorporating Human Factors Engineering into Risk Management,” to clarify “the points identified and restructured the information for better readability and comprehension.”

Details

The goal of the guidance, according to FDA, is to ensure that the device user interface has been designed such that use errors that occur during use of the device that could cause harm or degrade medical treatment are either eliminated or reduced to the extent possible.

FDA said the most effective strategies to employ during device design to reduce or eliminate use-related hazards involve modifications to the device user interface, which should be logical and intuitive.

In its conclusion, FDA also outlined the ways that device manufacturers were able to save money through the use of human factors engineering (HFE) and usability engineering (UE).


 

Please see an FDA PowerPoint on Human Factors Regulatory Issues for Combination Drug/Device Products here: MFStory_RAPS 2011 – HF of ComboProds_v4

 

 

 

 

Read Full Post »

Cancer Companion Diagnostics

Curator: Larry H. Bernstein, MD, FCAP

 

Companion Diagnostics for Cancer: Will NGS Play a Role?

Patricia Fitzpatrick Dimond, Ph.D.

http://www.genengnews.com/insight-and-intelligence/companion-diagnostics-for-cancer/77900554/

Companion diagnostics (CDx), in vitro diagnostic devices or imaging tools that provide information essential to the safe and effective use of a corresponding therapeutic product, have become indispensable tools for oncologists. As a result, analysts expect the global CDx market to reach $8.73 billion by 2019, up from $3.14 billion in 2014.

Use of CDx during a clinical trial to guide therapy can improve treatment responses and patient outcomes by identifying and predicting patient subpopulations most likely to respond to a given treatment.

These tests not only indicate the presence of a molecular target, but can also reveal the off-target effects of a therapeutic, predicting toxicities and adverse effects associated with a drug.

For pharma manufacturers, using CDx during drug development improves the success rate of drugs being tested in clinical trials. In a study estimating the risk of clinical trial failure during non-small cell lung cancer drug development between 1998 and 2012, investigators analyzed trial data from 676 clinical trials with 199 unique drug compounds.

The data showed that Phase III trial failure proved the biggest obstacle to drug approval, with an overall success rate of only 28%. But in biomarker-guided trials, the success rate reached 62%. The investigators concluded from their data analysis that the use of a CDx assay during Phase III drug development substantially improves a drug’s chances of clinical success.
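The figures above imply a sizable relative improvement; a quick check (rates taken from the text):

```python
# Phase III success rates from the NSCLC drug-development analysis cited above.
overall_success = 0.28        # all Phase III trials
biomarker_guided = 0.62       # trials using a biomarker/CDx strategy

relative_improvement = biomarker_guided / overall_success
print(f"Biomarker-guided trials succeed {relative_improvement:.1f}x as often")  # ~2.2x
```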

The Regulatory Perspective

According to Patricia Keegen, M.D., supervisory medical officer in the FDA’s Division of Oncology Products II, the agency requires a companion diagnostic test if a new drug works on a specific genetic or biological target that is present in some, but not all, patients with a certain cancer or disease. The test identifies individuals who would benefit from the treatment, and may identify patients who would not benefit but could also be harmed by use of a certain drug for treatment of their disease. The agency classifies companion diagnostics as Class III devices, the class subject to the most stringent FDA review, a Premarket Approval Application (PMA).

On August 6, 2014, the FDA finalized its long-awaited “Guidance for Industry and FDA Staff: In Vitro Companion Diagnostic Devices,” originally issued in July 2011. The final guidance stipulates that FDA generally will not approve any therapeutic product that requires an IVD companion diagnostic device for its safe and effective use before the IVD companion diagnostic device is approved or cleared for that indication.

Close collaboration between drug developers and diagnostics companies has been a key driver in recent simultaneous pharmaceutical-CDx FDA approvals, and partnerships between in vitro diagnostics (IVD) companies have proliferated as a result. Major test developers include Roche Diagnostics, Abbott Laboratories, Agilent Technologies, QIAGEN, Thermo Fisher Scientific, and Myriad Genetics.

But an NGS-based test has yet to make it to market as a CDx for cancer. All approved tests rely on PCR-based methods, immunohistochemistry, or in situ hybridization technology. And despite the very recent decision by the FDA to grant marketing authorization for Illumina’s MiSeqDx instrument platform for screening and diagnosis of cystic fibrosis, “There still seems to be a number of challenges that must be overcome before we see NGS for targeted cancer drugs,” commented Jan Trøst Jørgensen, a consultant to DAKO, on presentations at the European Symposium of Biopathology in June 2013.

Illumina received premarket clearance from the FDA for its MiSeqDx system, two cystic fibrosis assays, and a library prep kit that enables laboratories to develop their own diagnostic test. The designation marked the first time a next-generation sequencing system received FDA premarket clearance. The FDA reviewed the Illumina MiSeqDx instrument platform through its de novo classification process, a regulatory pathway for some novel low-to-moderate risk medical devices that are not substantially equivalent to an already legally marketed device.

Dr. Jørgensen further noted that “We are slowly moving away from the ‘one biomarker: one drug’ scenario, which has characterized the first decades of targeted cancer drug development, toward a more integrated approach with multiple biomarkers and drugs. This ‘new paradigm’ will likely pave the way for the introduction of multiplexing strategies in the clinic using gene expression arrays and next-generation sequencing.”

The future of CDxs therefore may be heading in the same direction as cancer therapy, aimed at staying ahead of the tumor drug resistance curve, and acknowledging the reality of the shifting genomic landscape of individual tumors. In some cases, NGS will be applied to diseases for which a non-sequencing CDx has already been approved.

Illumina believes that NGS presents an ideal solution to transforming the tumor profiling paradigm from a series of single-gene tests to a multi-analyte approach to delivering precision oncology. Mya Thomae, Illumina’s vice president, regulatory affairs, said in a statement that Illumina has formed partnerships with several drug companies to develop a universal next-generation sequencing-based oncology test system. The collaborations with AstraZeneca, Janssen, Sanofi, and Merck-Serono, announced in 2014 and 2015, seek to “redefine companion diagnostics for oncology,” focusing on developing a system for use in targeted therapy clinical trials, with the goal of developing and commercializing a multigene panel for therapeutic selection.

On January 16, 2014, Illumina and Amgen announced that they would collaborate on the development of a next-generation sequencing-based companion diagnostic for the colorectal cancer antibody Vectibix (panitumumab). Illumina will develop the companion test on its MiSeqDx instrument.

In 2012, the agency approved Qiagen’s Therascreen KRAS RGQ PCR Kit to identify best responders to Erbitux (cetuximab), another antibody drug in the same class as Vectibix. The label for Vectibix, an EGFR-inhibiting monoclonal antibody, advises against use of the drug in metastatic colorectal cancer patients who harbor KRAS mutations or whose KRAS status is unknown.

The U.S. FDA, Illumina said, hasn’t yet approved a companion diagnostic that gauges KRAS mutation status specifically in those considering treatment with Vectibix. Illumina plans to gain regulatory approval in the U.S. and in Europe for an NGS-based companion test that can identify patients’ RAS mutation status. Illumina and Amgen will validate the test platform, and Illumina will commercialize the test.

Treatment Options

Foundation Medicine says its approach to cancer genomic characterization will help physicians reveal the alterations driving the growth of a patient’s cancer and identify targeted treatment options that may not have been otherwise considered.

FoundationOne, the first clinical product from Foundation Medicine, interrogates the entire coding sequence of 315 cancer-related genes plus select introns from 28 genes often rearranged or altered in solid tumor cancers. Based on current scientific and clinical literature, these genes are known to be somatically altered in solid cancers.

These genes, the company says, are sequenced at great depth to identify the relevant, actionable somatic alterations, including single base pair changes, insertions, deletions, copy number alterations, and selected fusions. The resulting fully informative genomic profile complements traditional cancer treatment decision tools and often expands treatment options by matching each patient with targeted therapies and clinical trials relevant to the molecular changes in their tumor.
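To illustrate the four alteration classes named above, here is a minimal Python sketch of how called variants might be bucketed into those categories. All field names, thresholds, and example variants are hypothetical, for illustration only; this is not Foundation Medicine’s actual pipeline.

```python
# Hypothetical sketch: bucketing somatic alterations from an NGS panel
# into the alteration classes described above. Field names ("type",
# "ref", "alt") and the example profile are illustrative assumptions.

def classify_alteration(variant):
    """Return one of: 'SNV', 'insertion', 'deletion', 'CNA', 'fusion'."""
    if variant.get("type") == "fusion":
        return "fusion"
    if variant.get("type") == "copy_number":
        return "CNA"  # copy number alteration (amplification or loss)
    ref, alt = variant["ref"], variant["alt"]
    if len(ref) == 1 and len(alt) == 1:
        return "SNV"  # single base pair change
    return "insertion" if len(alt) > len(ref) else "deletion"

# A toy genomic profile: one variant of each class.
profile = [
    {"gene": "KRAS",  "type": "small_variant", "ref": "G",   "alt": "A"},
    {"gene": "EGFR",  "type": "small_variant", "ref": "GGA", "alt": "G"},
    {"gene": "ERBB2", "type": "copy_number",   "copies": 12},
    {"gene": "NTRK1", "type": "fusion",        "partner": "LMNA"},
]

for v in profile:
    print(v["gene"], classify_alteration(v))
```

In a real assay, classification is of course done against aligned reads and curated knowledge bases; the point here is only that a single deep-sequencing run yields all of these alteration classes at once, which is what distinguishes comprehensive profiling from a one-marker CDx.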

As Foundation Medicine’s NGS analyses are increasingly applied, recent clinical reports describe instances in which comprehensive genomic profiling with the FoundationOne NGS-based assay results in diagnostic reclassification that can lead to targeted drug therapy with a resulting dramatic clinical response. In several reported instances, NGS found, among the spectrum of aberrations that occur in tumors, changes unlikely to have been discovered by other means, and clearly outside the range of a conventional CDx that matches one drug to a specific genetic change.

TRK Fusion Cancer

In July 2015, the University of Colorado Cancer Center and Loxo Oncology published a research brief in the online edition of Cancer Discovery describing the first patient with a tropomyosin receptor kinase (TRK) fusion cancer enrolled in a LOXO-101 Phase I trial. LOXO-101 is an orally administered inhibitor of TRK kinase and is highly selective for the TRK family of receptors.

While the authors note that TRK fusions occur rarely, they arise across a diverse spectrum of tumor histologies. The research brief described a patient with advanced soft tissue sarcoma widely metastatic to the lungs. The patient’s physician submitted a tumor specimen to Foundation Medicine for comprehensive genomic profiling with FoundationOne Heme, where her cancer was shown to harbor a TRK gene fusion.

Following multiple unsuccessful courses of treatment, the patient was enrolled in the Phase I trial of LOXO-101 in March 2015. After four months of treatment, CT scans demonstrated almost complete disappearance of the largest tumors.

The FDA’s Elizabeth Mansfield, Ph.D., director, personalized medicine staff, Office of In Vitro Diagnostics and Radiological Health, said in a recent article, “FDA Perspective on Companion Diagnostics: An Evolving Paradigm,” that “even as it seems that many questions about co-development have been resolved, the rapid accumulation of new knowledge about tumor biology and the rapid evolution of diagnostic technology are challenging FDA to continually redefine its thinking on companion diagnostics. It seems almost inevitable that a consolidation of diagnostic testing should take place, to enable a single test or a few tests to garner all the necessary information for therapeutic decision making.”

Whether this means CDx testing will begin to incorporate NGS sequencing remains to be seen.

Read Full Post »

Oracle In the Medical Devices Industry

Reporter: Aviva Lev-Ari, PhD, RN

Medical Devices

20 of the Top 20 Medical Device Companies Run Oracle Applications

Use Oracle’s powerful combination of technology and comprehensive, preintegrated business applications to be first-to-market and address the challenges of regulatory pressures and reimbursement caps.


Oracle In the Medical Devices Industry

  • Manage a single view of the product record throughout its lifecycle—from concept to design, source, build, sell, service, and disposal
  • Make large volumes of clinical data well organized, easily accessible, and thoroughly documented
  • Use powerful study layout and design features, full edit check facilities, complete tracking, analysis, and reporting capabilities, remote data collection, and site-based entry
  • Model any kind of clinical study and automatically store components for reuse
  • Employ prebuilt tools that enable electronic signatures and automate regulatory recordkeeping
  • Conduct safety and compliance monitoring by establishing flexible, global workflows enabling CAPA and product complaint resolution
  • Quickly usher a medical device from research and development to testing and product launch using tools that support segmentation, call execution and reporting, guided selling, territory and objectives management, and cross-functional business processes

SOURCE

http://www.oracle.com/us/industries/life-sciences/medical/overview/index.html

More about Oracle Life Sciences

Oracle delivers key functionality built specifically for pharmaceutical, biotechnology, and medical device enterprises, so you can maximize innovation and discovery, marketplace agility, and ROI.

Why Oracle Life Sciences solutions?

Oracle’s powerful combination of technology and comprehensive, preintegrated business applications gets you to market first while addressing the challenges of regulatory pressures and reimbursement caps.


Read Full Post »

Antimalarial flow synthesis closer to commercialisation.

Read Full Post »

