
Archive for the ‘Bio Instrumentation in Experimental Life Sciences Research’ Category

Lifelong Contraceptive Device for Men: Mechanical Switch to Control Fertility at Will

Reporter and Curator: Dr. Sudipta Saha, Ph.D.

There aren’t many options for long-term birth control for men. The most common kinds of male contraception include

  • condoms,
  • withdrawal / pulling out,
  • outercourse, and
  • vasectomy.

However, apart from vasectomy, none of these methods is fully reliable, comfortable, and user-friendly. Another solution may be

  • RISUG (Reversible Inhibition of Sperm Under Guidance, or Vasalgel)

which is said to remain effective for about ten years. No birth control pill for men is available to date.

VIEW VIDEO

http://www.mdtmag.com/blog/2016/01/implanted-sperm-switch-turns-mens-fertility-and?et_cid=5050638&et_rid=461755519&type=cta

Recently, a German inventor, Clemens Bimek, developed a novel, reversible, hormone-free, uncomplicated, lifelong contraceptive device for controlling male fertility. His invention, named the Bimek SLV, is essentially a valve that stops the flow of sperm through the vas deferens with the literal flip of a mechanical switch inside the scrotum, rendering its user temporarily sterile. Toggled through the skin of the scrotum, the device stays closed for three months to prevent accidental switching; moreover, the switch cannot open on its own. The tiny valves are less than an inch long and weigh less than a tenth of an ounce. They are surgically implanted on the vas deferens, the ducts that carry sperm from the testicles, in a simple half-hour operation.

The valves are made of PEEK-OPTIMA, a medical-grade polymer that has long been employed as an implant material. The device was patented back in 2000 and is scheduled to undergo clinical trials at the beginning of this year. The inventor claims that the Bimek SLV's efficacy is similar to that of vasectomy: it does not impair the ability to gain and maintain an erection, and ejaculation remains normal, merely devoid of sperm cells. The valve's design lets sperm exit through the side of the vas deferens when the valve is closed, so semen flow is never blocked; leaked sperm cells are broken down by the immune system. After the switch is flipped to stop sperm flow, it takes about three months or 30 ejaculations for the remaining sperm to clear. After switching sperm flow back on, the inventor suggests consulting a urologist to ensure that all blocked sperm cells have cleared from the device. The recovery time after reopening the valve is only one day, according to Bimek SLV; however, men are encouraged to wait one week before resuming sexual activities.

Before the patented technology can be brought to market, it must undergo a rigorous series of clinical trials. Bimek and his business partners are currently looking for men interested in testing the device. If the clinical trials are successful, this will be the first invention of its kind to give men direct, reversible control over their fertility, and it may well be preferred over vasectomy.

 

References:

 

https://www.bimek.com/this-is-how-the-bimek-slv-works/

 

http://www.mdtmag.com/blog/2016/01/implanted-sperm-switch-turns-mens-fertility-and?et_cid=5050638&et_rid=461755519&type=cta

 

http://www.telegraph.co.uk/news/worldnews/europe/germany/12083673/German-carpenter-invents-on-off-contraception-switch-for-sperm.html

 

http://www.discovery.com/dscovrd/tech/you-can-now-turn-off-your-sperm-flow-with-the-flip-of-a-switch/

 

Read Full Post »

Periodic table of protein complexes, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)

Periodic table of protein complexes

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Periodic Table of Protein Complexes

http://www.technologynetworks.com/Proteomics/news.aspx?ID=186248

New tool helps to visualise, understand and predict how proteins combine to drive biological processes

A new ‘periodic table’ of protein complexes has been developed that provides a unified way to classify and visualise protein complexes, providing a valuable tool for biotechnology and the engineering of novel complexes.

This study also provides insights into evolutionary distribution of different types of existing protein complexes.

The Periodic Table of Protein Complexes offers a new way of looking at the enormous variety of structures that proteins can build in nature, which ones might be discovered next, and predicting how entirely novel structures could be engineered. Created by an interdisciplinary team led by researchers at the Wellcome Genome Campus and the University of Cambridge, the Table provides a valuable tool for research into evolution and protein engineering.

Almost every biological process depends on proteins interacting and assembling into complexes in a specific way, and many diseases are associated with problems in complex assembly. The principles underpinning this organisation are not yet fully understood, but by defining the fundamental steps in the evolution of protein complexes, the new ‘periodic table’ presents a systematic, ordered view on protein assembly, providing a visual tool for understanding biological function.

“Evolution has given rise to a huge variety of protein complexes, and it can seem a bit chaotic. But if you break down the steps proteins take to become complexes, there are some basic rules that can explain almost all of the assemblies people have observed so far.”

 

Dr Joe Marsh, formerly of the Wellcome Genome Campus and now of the MRC Human Genetics Unit at the University of Edinburgh.

Different ballroom dances can be seen as an endless combination of a small number of basic steps. Similarly, the ‘dance’ of protein complex assembly can be seen as endless variations on dimerization (one doubles, and becomes two), cyclisation (one forms a ring of three or more) and subunit addition (two different proteins bind to each other). Because these happen in a fairly predictable way, it’s not as hard as you might think to predict how a novel protein would form.
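The combinatorial power of these three steps can be made concrete with a toy sketch (an illustrative simplification, not the authors' actual model, which also tracks symmetry and topology): represent a complex as a map from subunit type to copy number, and apply each step as a transformation.

```python
# Toy model of the three basic assembly steps (illustrative only).

def dimerize(complex_):
    """Dimerization: the whole assembly doubles."""
    return {name: 2 * n for name, n in complex_.items()}

def cyclize(complex_, ring_size):
    """Cyclisation: the assembly repeats ring_size (>= 3) times around a ring."""
    if ring_size < 3:
        raise ValueError("a ring needs at least three repeats")
    return {name: ring_size * n for name, n in complex_.items()}

def add_subunit(complex_, new_type):
    """Subunit addition: a different protein binds the existing assembly."""
    updated = dict(complex_)
    updated[new_type] = updated.get(new_type, 0) + 1
    return updated

# A few iterations of the simple steps quickly generate variety:
a2 = dimerize({"A": 1})         # homodimer A2
a2b = add_subunit(a2, "B")      # heterotrimer A2B
a4b2 = dimerize(a2b)            # A4B2, a dimer of trimers
```

Chaining just these moves already reproduces the stoichiometries of many observed complexes, which is the sense in which the "dance" has only a few basic steps.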

“We’re bringing a lot of order into the messy world of protein complexes. Proteins can keep going through several iterations of these simple steps, adding more and more levels of complexity and resulting in a huge variety of structures. What we’ve made is a classification based on these underlying principles that helps people get a handle on the complexity.”

Dr Sebastian Ahnert of the Cavendish Laboratory at the University of Cambridge

The exceptions to the rule are interesting in their own right and are the subject of ongoing studies.

“By analysing the tens of thousands of protein complexes for which three-dimensional structures have already been experimentally determined, we could see repeating patterns in the assembly transitions that occur – and with new data from mass spectrometry we could start to see the bigger picture.”

Dr Joe Marsh

“The core work for this study is in theoretical physics and computational biology, but it couldn’t have been done without the mass spectrometry work by our colleagues at Oxford University. This is yet another excellent example of how extremely valuable interdisciplinary research can be.”

Dr Sarah Teichmann, Research Group Leader at the Wellcome Trust Sanger Institute and the European Bioinformatics Institute (EMBL-EBI)

Read Full Post »

Reporter: Danut Dragoi, PhD

Ordinarily, only magnetic materials can be magnetized in a magnetic field; cultured cells cannot. However, the company n3D has found a way to do so. On its site, the company states that biomedical research is gravitating toward 3D cell culture models and tissue printing. n3D has four major lines of development:

  • magnetization of cells
  • NanoShuttle, a biocompatible nanoparticle assembly
  • magnetic 3D bioprinting of spheroids
  • 3D cell culture for designing and implementing assay development and lab protocols

They sell kits and services for 3D cell culture. The core technology is the magnetization of cells, which can then be directed using magnetic forces. In this manner, the cells can be levitated or bioprinted. These cultures are faster to assemble than other systems that require a substrate and, because they can be handled with magnets, easier to manipulate without losing samples.

One of n3D's trade secrets is the NanoShuttle, a biocompatible nanoparticle assembly of gold, iron oxide, and poly-L-lysine. The iron oxide, Fe3O4 (magnetite), is the only component that magnetizes; the assembly attaches to cell membranes electrostatically, in a manner that is non-specific to cell type. No effect has been seen on proliferation, viability, metabolism, inflammation, or oxidative stress. Within 7–8 days, the NanoShuttle releases from the cell into the extracellular space of the culture.

One of the limiting factors in 3D cell culture can be throughput and efficiency. n3D makes kits for magnetic 3D bioprinting that print spheroids in 384-well plates rapidly (15 min – 6 h) and reproducibly with as few as 50 cells, saving time and cost compared with other spheroid systems. This system is also well suited to high-content screening, with the iPod-based BiO Assay for toxicity, and fluorescence and other cell-based assays for function. Some results with HepG2 hepatocytes are shown on the same site.

The design and implementation of assay development and lab protocols is a clear advantage in 3D cell culture. For those worried about the time and cost of implementation, n3D offers assay-development services, drawing on deep expertise in 3D cell culture to design and implement assays and protocols in the lab. They can work with any cell type to design assays to exact specifications and throughput, and compare these assays with existing 2D assays. The goal is to make the transition from 2D to 3D painless.

VIDEO

Hubert Tseng of the Nano3D company explains in the video how the magnetization of cells works.

One important tool in a bio lab is n3D's MagPen™, used to transfer, collect, organize, or layer magnetic 3D cultures without disrupting the tissue architecture. The MagPen™ uses a simple “pick-up-and-drop” method to reliably move cultures between microscope slides, petri dishes, and well plates. The instruction manual and further details on using the MagPen are available on the company's site.

Source
http://www.n3dbio.com/ , accessed 01/02/2016

Read Full Post »

Roche is developing a high-throughput low cost sequencer for NGS, How NGS Will Revolutionize Reproductive Diagnostics: November Meeting, Boston MA, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)

Roche is developing a high-throughput low cost sequencer for NGS

Reporter: Stephen J. Williams, PhD

 

Reported from Diagnostic World News

Long-Read Sequencing in the Age of Genomic Medicine

 

 

By Aaron Krol

December 16, 2015 | This September, Pacific Biosciences announced the creation of the Sequel, a DNA sequencer half the cost and seven times as powerful as its previous RS II instrument. PacBio, with its unique long-read sequencing technology, had already secured a place in high-end research labs, producing finished, highly accurate genomes and helping to explore the genetic “dark matter” that other next-generation sequencing (NGS) instruments miss. Now, in partnership with Roche Diagnostics, PacBio is repositioning itself as a company that can serve hospitals as well.

“Pseudogenes, large structural variants, validation, repeat disorders, polymorphic regions of the genome―all those are categories where you practically need PacBio,” says Bobby Sebra, Director of Technology Development at the Icahn School of Medicine at Mount Sinai. “Those are gaps in the system right now for short-read NGS.”

Mount Sinai’s genetic testing lab owns three RS II sequencers, running almost around the clock, and was the first lab to announce it had bought a Sequel just weeks after the new instruments were launched. (It arrived earlier this month and has been successfully tested.) Sebra’s group uses these sequencers to read parts of the genome that, thanks to their structural complexity, can only be assembled from long, continuous DNA reads.

There are a surprising number of these blind spots in the human genome. “HLA is a huge one,” Sebra says, referring to a highly variable region of the genome involved in the immune system. “It impacts everything from immune response, to pharmacogenomics, to transplant medicine. It’s a pretty important and really hard-to-genotype locus.”

Nonetheless, few clinical organizations are studying PacBio or other long-read technologies. PacBio’s instruments, even the Sequel, come with a relatively high price tag, and research on their value in treating patients is still tentative. Mount Sinai’s confidence in the technology is surely at least partly due to the influence of Sebra―an employee of PacBio for five years before coming to New York―and Genetics Department Chair Eric Schadt, at one time PacBio’s Chief Scientific Officer.

Even here, the sequencers typically can’t be used to help treat patients, as the instruments are sold for research use only. Mount Sinai is still working on a limited number of tests to submit as diagnostics to New York State regulators.

Physician Use

Roche Diagnostics, which invested $75 million in the development of the Sequel, wants to change that. The company is planning to release its own, modified version of the instrument in the second half of 2016, specifically for diagnostic use. Roche will initially promote the device for clinical studies, and eventually seek FDA clearance to sell it for routine diagnosis of patients.

In an email to Diagnostics World, Paul Schaffer, Lifecycle Leader for Roche’s sequencing platforms division, wrote that the new device will feature an integrated software pipeline to interpret test results, in support of assays that Roche will design and validate for clinical indications. The instrument will also have at least minor hardware modifications, like near field communication designed to track Roche-branded reagents used during sequencing.

This new version of the Sequel will probably not be the first instrument clinical labs turn to when they decide to start running NGS. Short-read sequencers are sure to outcompete the Roche machine on price, and can offer a pretty useful range of assays, from co-diagnostics in cancer to carrier testing for rare genetic diseases. But Roche can clear away some of the biggest barriers to entry for hospitals that want to pursue long-read sequencing.

Today, institutions like Mount Sinai that use PacBio typically have to write a lot of their own software to interpret the data that comes off the machines. Off-the-shelf analysis, with readable diagnostic reports for doctors, will make it easier for hospitals with less research focus to get on board. To this end, Roche acquired Bina, an NGS analysis company that handles structural variants and other PacBio specialties, in late 2014.

The next question will be whether Roche can design a suite of tests that clinical labs will want to run. Long-read sequencing is beloved by researchers because it can capture nearly complete genomes, finding the correct order and orientation of DNA reads. “The long-read technologies like PacBio’s are going to be, in the future, the showcase that ties it all together,” Sebra says. “You need those long reads as scaffolds to bring it together.”

But that envisions a future in which doctors will want to sequence their patients’ entire genomes. When it comes to specific medical tests, targeting just a small part of the genome connected to disease, Roche will have to content itself with some niche applications where PacBio stands out.

Early Applications

“At this time we are not releasing details regarding the specific assays under development,” Schaffer told Diagnostics World in his email. “However, virology and genetics are a key focus, as they align with other high-priority Roche Diagnostics products.”

Genetic disease is the obvious place to go with any sequencing technology. Rare hereditary disorders are much easier to understand on a genetic level than conditions like diabetes or heart disease; typically, the pathology can be traced back to a single mutation, making it easy to interpret test results.

Some of these mutations are simply intractable for short-read sequencers. A whole class of diseases, the PolyQ disorders and other repeat disorders, develop when a patient has too many copies of a single, repetitive sequence in a gene region. The gene Huntingtin, for example, contains a long stretch of the DNA code CAG; people born with 40 or more CAG repeats in a row will develop Huntington’s disease as they reach early adulthood.

These disorders would be a prime target for Roche’s sequencer. The Sequel’s long reads, spanning thousands of DNA letters at a stretch, can capture the entire repeat region of Huntingtin in a single read, unlike short-read sequencers, which would tend to produce a garbled mess of CAG reads impossible to count or put in order.
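To see why a spanning read changes the problem, consider a minimal repeat counter (a hypothetical illustration; real clinical pipelines must also handle sequencing errors and interrupted repeats). Once a single read covers the whole tract plus flanking sequence, counting repeats is trivial, whereas a short read consisting entirely of CAG cannot be placed or tallied.

```python
import re

def longest_cag_run(read: str) -> int:
    """Return the longest run of consecutive CAG codons in a single read."""
    runs = re.findall(r"(?:CAG)+", read)
    return max((len(r) // 3 for r in runs), default=0)

# A long read spanning a hypothetical repeat tract plus unique flanks:
long_read = "TTGCC" + "CAG" * 42 + "GGTAA"
count = longest_cag_run(long_read)  # 42: above the ~40-repeat disease threshold
```

A 150-base short read from the middle of this tract would be pure CAG with no anchoring flanks, which is exactly the "impossible to count or put in order" situation described above.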

Nonetheless, the length of reads is not the only obstacle to understanding these very obstinate diseases. “The entire category of PolyQ disorders, and Fragile X and Huntington’s, is really important,” says Sebra. “But to be frank, they’re the most challenging even with PacBio.” He suggests that, even without venturing into the darkest realms of the genome, a long-read sequencer might actually be useful for diagnosing many of the same genetic diseases routinely covered by other instruments.

That’s because, even when the gene region involved in a disease is well known, there’s rarely only one way for it to go awry. “An example of that is Gaucher’s disease, in a gene called GBA,” Sebra says. “In that gene, there are hundreds of known mutations, some of which you can absolutely genotype using short reads. But others, you would need to phase the entire block to really understand.” Long-read sequencing, which is better at distinguishing maternal from paternal DNA and highlighting complex rearrangements within a gene, can offer a more thorough look at diseases with many genetic permutations, especially when tracking inheritance through a family.

“You can think of long-read sequencing as a really nice way to supplement some of the inherited panels or carrier screening panels,” Sebra says. “You can also use PacBio to verify variants that are called with short-read sequencing.”

Virology is, perhaps, a more surprising focus for Roche. Diagnosing a viral (or bacterial, or fungal) infection with NGS only requires finding a DNA read unique to a particular species or strain, something short-read sequencers are perfectly capable of.

But Mount Sinai, which has used PacBio in pathogen surveillance projects, has seen advantages to getting the full, completely assembled genomes of the organisms it’s tracking. With bacteria, for instance, key genes that confer resistance to antibiotics might be found either in the native genome, or inside plasmids, small packets of DNA that different species of bacteria freely pass between each other. If your sequencer can assemble these plasmids in one piece, it’s easier to tell when there’s a risk of antibiotic resistance spreading through the hospital, jumping from one infectious species to another.

Viruses don’t share their genetic material so freely, but a similar logic can still apply to viral infections, even in a single person. “A virus is really a mixture of different quasi-species,” says Sebra, so a patient with HIV or influenza likely has a whole constellation of subtly different viruses circulating in their body. A test that assembles whole viral genomes—which, given their tiny size, PacBio can often do in a single read—could give physicians a more comprehensive view of what they’re dealing with, and highlight any quasi-species that affect the course of treatment or how the virus is likely to spread.

The Broader View

These applications are well suited to the diagnostic instrument Roche is building. A test panel for rare genetic diseases can offer clear-cut answers, pointing physicians to any specific variants linked to a disorder, and offering follow-up information on the evidence that backs up that call.

That kind of report fits well into the workflows of smaller hospital labs, and is relatively painless to submit to the FDA for approval. It doesn’t require geneticists to puzzle over ambiguous results. As Schaffer says of his company’s overall NGS efforts, “In the past two years, Roche has been actively engaged in more than 25 partnerships, collaborations and acquisitions with the goal of enabling us to achieve our vision of sample in to results out.”

But some of the biggest ways medicine could benefit from long-read sequencing will continue to require the personal touch of labs like Mount Sinai’s.

Take cancer, for example, a field in which complex gene fusions and genetic rearrangements have been studied for decades. Tumors contain multitudes of cells with unique patchworks of mutations, and while long-read sequencing can pick up structural variants that may play a role in prognosis and treatment, many of these variants are rarely seen, little documented, and hard to boil down into a physician-friendly answer.

An ideal way to unravel a unique cancer case would be to sequence the RNA molecules produced in the tumor, creating an atlas of the “transcriptome” that shows which genes are hyperactive, which are being silenced, and which have been fused together. “When you run something like IsoSeq on PacBio and you can see truly the whole transcriptome, you’re going to figure out all possible fusions, all possible splicing events, and the true atlas of reads,” says Sebra. “Cancer is so diverse that it’s important to do that on an individual level.”

Occasionally, looking at the whole transcriptome, and seeing how a mutation in one gene affects an entire network of related genes, can reveal an unexpected treatment option―repurposing a drug usually reserved for other cancer types. But that takes a level of attention and expertise that is hard to condense into a mass-market assay.

And, Sebra suggests, there’s another reason for medical centers not to lean too heavily on off-the-shelf tests from vendors like Roche.

Devoted as he is to his onetime employer, Sebra is also a fan of other technologies now emerging to capture some of the same long-range, structural information on the genome. “You’ve now got 10X Genomics, BioNano, and Oxford Nanopore,” he says. “Often, any two or even three of those technologies, when you merge them together, can get you a much more comprehensive story, sometimes faster and sometimes cheaper.” At Mount Sinai, for example, combining BioNano and PacBio data has produced a whole human genome much more comprehensive than either platform can achieve on its own.

The same is almost certainly true of complex cases like cancer. Yet, while companies like Roche might succeed in bringing NGS diagnostics to a much larger number of patients, they have few incentives to make their assays work with competing technologies the way a research-heavy institute like Mount Sinai does.

“It actually drives the commercialization of software packages against the ability to integrate the data,” Sebra says.

Still, he’s hopeful that the Sequel can lead the industry to pay more attention to long-read sequencing in the clinic. “The RS II does a great job of long-read sequencing, but the throughput for the Sequel is so much higher that you can start to achieve large genomes faster,” he says. “It makes it more accessible for people who don’t own the RS II to get going.” And while the need for highly specialized genetics labs won’t be falling off anytime soon, most patients don’t have the luxury of being treated in a hospital with the resources of Mount Sinai. NGS companies increasingly see physicians as some of their most important customers, and as our doctors start checking into the health of our genomes, it would be a shame if ubiquitous short-read sequencing left them with blind spots.

Source: http://diagnosticsworldnews.com/2015/12/16/long-read-sequencing-age-genomic-medicine.aspx

 

 

Read Full Post »

Microtubule-Associated Protein Assembled on Polymerized Microtubules

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Magic-Angle-Spinning NMR Enables First-Ever Determination of Atomic-Resolution Structure of a Microtubule-Associated Protein Assembled on Polymerized Microtubules

http://www.bioquicknews.com/node/3201

 

https://upload.wikimedia.org/wikipedia/commons/0/06/Microtubule_structure.png

 

A latticework of tiny tubes called microtubules gives your cells their shape and also acts like a railroad track that essential proteins travel on. But if there is a glitch in the connection between train and track, diseases can occur.

In the November 24, 2015 issue of PNAS, Tatyana Polenova, Ph.D., Professor of Chemistry and Biochemistry, and her team at the University of Delaware (UD), together with John C. Williams, Ph.D., Associate Professor at the Beckman Research Institute of City of Hope in Duarte, California, reveal for the first time — atom by atom — the structure of a protein bound to a microtubule.

The protein of focus, CAP-Gly, short for “cytoskeleton-associated protein-glycine-rich domains,” is a component of dynactin, which binds the motor protein dynein to move cargoes of essential proteins along the microtubule tracks. Mutations in CAP-Gly have been linked to neurological diseases and disorders such as Perry syndrome and distal spinal bulbar muscular dystrophy.

The research team used magic-angle-spinning nuclear magnetic resonance (NMR) spectroscopy in the Department of Chemistry and Biochemistry at UD to unveil the structure of the CAP-Gly protein assembled on polymerized microtubules. The CAP-Gly protein has 1,329 atoms, and each tubulin dimer, a building block of microtubules, has nearly 14,000 atoms.

“This is the first time anyone has been able to get an atomic-resolution structure of any microtubule-associated protein assembled on polymerized microtubules,” Dr. Polenova says. “With magic-angle-spinning NMR, we can look into the structure of this and other assemblies of microtubules and their associated proteins and gain critical insights into their function and dynamics, as well as begin to gather clues as to how mutations cause disease.”

In magic-angle-spinning NMR, a sample is placed in the NMR’s small, tube-like rotor, which is then spun inside the NMR magnet at an angle of 54.74 degrees — called the “magic angle” because it suppresses magnetic interactions between atoms. The result is a high-resolution protein fingerprint: a graph of hundreds of peaks representing the frequencies of two or more interacting atoms. These data are then used to calculate the 3-D structures.
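The quoted 54.74 degrees is not arbitrary: it is the angle at which the orientation-dependent factor (3 cos²θ − 1) of the anisotropic NMR interactions averages to zero. A quick sanity check in plain Python:

```python
import math

# The "magic angle" is the zero of 3*cos(theta)**2 - 1,
# i.e. cos(theta) = 1/sqrt(3).
theta_deg = math.degrees(math.acos(1 / math.sqrt(3)))
print(round(theta_deg, 2))  # 54.74 degrees, as quoted in the article

# Equivalently, the second Legendre polynomial P2(cos theta) vanishes here,
# which is what averages away the anisotropic (orientation-dependent) couplings.
p2 = (3 * math.cos(math.radians(theta_deg)) ** 2 - 1) / 2
```

Spinning the rotor rapidly about an axis tilted at this angle makes the time-averaged anisotropic terms scale by P2(cos θ) = 0, which is why solid samples yield the sharp, solution-like peaks described above.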

The 3-D structures of CAP-Gly, which show the spatial arrangement of atoms in the protein molecule, are different between the free state of the protein and its bound state to the microtubule. These structures reveal how the protein interacts with microtubules, predominantly through its loop regions, which adopt specific conformations upon binding.

However, static structures of CAP-Gly do not tell the whole story about the protein.

“Just as we are always moving our arms and legs about, proteins are very dynamic. They do not stand still,” Dr. Polenova says.

“These motions are essential to their biological function, and NMR spectroscopy is the only technique that can record such movements, with atomic resolution, on a variety of time scales, from picoseconds to arbitrarily long time scales — seconds, days, weeks — to help us understand the protein’s function.”

“We know from our prior studies that CAP-Gly is dynamic on timescales from nano- to milliseconds, and this mobility is essential for the protein’s ability to interact with microtubules and with multiple other binding partners.”

The research, which has been ongoing since 2008 when the first data sets were collected, required the development of new protocols for preparing the samples, new NMR experiments to gather various information on structure and dynamics, and new protocols for data analysis.

In the future, Dr. Polenova and her team envision using NMR in combination with cryo-electron microscopy, in which samples are studied at extremely low temperatures, typically below -200 degrees Fahrenheit, to look at even more complex systems in a highly preserved form.

Dr. Polenova’s research team at UD included Dr. Si Yan, who received her doctorate from the University in 2014, current doctoral student Changmiao Guo, NMR spectroscopist Guangjin Hou, and postdoctoral researchers Dr. Huilan Zhang and Dr. Xingyu Lu. Dr. Williams, at Beckman Research Institute, was also a co-author of the study.

 

Atomic-resolution structure of the CAP-Gly domain of dynactin on polymeric microtubules determined by magic angle spinning NMR spectroscopy

 

Fig. 1.

http://www.pnas.org/content/112/47/14611/F1.small.gif

 

Significance

Microtubules and their associated proteins are central to most cellular functions. They have been extensively studied at multiple levels of resolution; however, significant knowledge gaps remain. Structures of microtubule-associated proteins bound to microtubules are not known at atomic resolution. We used magic angle spinning NMR to solve a structure of dynactin’s cytoskeleton-associated protein glycine-rich (CAP-Gly) domain bound to microtubules and to determine the intermolecular interface, the first example, to our knowledge, of the atomic-resolution structure of a microtubule-associated protein on polymeric microtubules. The results reveal remarkable structural plasticity of CAP-Gly, which enables CAP-Gly’s binding to microtubules and other binding partners. This approach offers atomic-resolution information of microtubule-binding proteins on microtubules and opens up the possibility to study critical parameters such as protonation states, strain, and dynamics on multiple time scales.

 

Microtubules and their associated proteins perform a broad array of essential physiological functions, including mitosis, polarization and differentiation, cell migration, and vesicle and organelle transport. As such, they have been extensively studied at multiple levels of resolution (e.g., from structural biology to cell biology). Despite these efforts, there remain significant gaps in our knowledge concerning how microtubule-binding proteins bind to microtubules, how dynamics connect different conformational states, and how these interactions and dynamics affect cellular processes. Structures of microtubule-associated proteins assembled on polymeric microtubules are not known at atomic resolution. Here, we report a structure of the cytoskeleton-associated protein glycine-rich (CAP-Gly) domain of dynactin motor on polymeric microtubules, solved by magic angle spinning NMR spectroscopy. We present the intermolecular interface of CAP-Gly with microtubules, derived by recording direct dipolar contacts between CAP-Gly and tubulin using double rotational echo double resonance (dREDOR)-filtered experiments. Our results indicate that the structure adopted by CAP-Gly varies, particularly around its loop regions, permitting its interaction with multiple binding partners and with the microtubules. To our knowledge, this study reports the first atomic-resolution structure of a microtubule-associated protein on polymeric microtubules. Our approach lays the foundation for atomic-resolution structural analysis of other microtubule-associated motors.

Read Full Post »

Laboratory Automation Today, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)

Laboratory Automation Today

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

 

Clinical Laboratories Considering Total Laboratory Automation Are Increasingly Using Lean Methods to Develop Proposals, Select a Vendor, and Implement the Solution

November 23, 2015
by Robert Michel  – Dark Daily

Top-performing medical laboratories are using Lean to help craft RFPs, evaluate TLA options, then implement the automated systems to achieve optimal quality and productivity

In recent years, there’s been a big change in how clinical laboratories purchase total laboratory automation (TLA) solutions, and then integrate this automation into their lab operations. Using a strategy that is somewhat off the radar, top-performing medical laboratories will purchase and install TLA only after applying the principles of Lean to the physical layout and overall workflow within their labs.

This development demonstrates the growing acceptance of Lean, Six Sigma, and continuous process improvement methods at hospital-based laboratories and independent clinical laboratories.

As lab budgets get squeezed down each year and specimen volume increases, pathologists and clinical lab managers face the twin challenges of reducing costs while increasing the quality of their lab testing services.

Why Top-Performing Medical Laboratories Use Both Lean and TLA

For these reasons, Lean methods are now integral to the use of laboratory automation among top-performing clinical laboratories. These labs use a new cycle for procurement and implementation of lab automation. Steps in this cycle involve Lean methods in the creation of the RFP (request for proposal), in the pre-purchase assessment of proposals, and in implementation, followed by the use of continuous improvement designed to extract maximum quality and optimal productivity from the laboratory automation installation.

When labs are incorporating Lean methods and a culture of quality management in their operations, they change the traditional steps they followed when preparing to replace an aging, outmoded system with a new one capable of revising almost all aspects of lab operations.

Forward-looking pathologists and clinical lab directors view the installation of a new automation system as an opportunity to revise not only the automated processes but also as many other lab operations as possible.

In fact, they view the challenge of implementing a new system as a chance to address many of the most pressing problems in laboratory management today, including:

  • Reimbursement levels that have been declining for years and are expected to continue to decline;
  • A shrinking lab workforce;
  • An aging population with more chronic diseases; and
  • The need to test more specimens while simultaneously delivering greater value.

Manual Processing in Clinical Laboratories Is Subject to High Error Rates

“Among the benefits of both Lean and TLA is that labs can maximize efficiency and reduce errors simultaneously,” stated Joe Ross, Senior Marketing Manager, North America Automation and Clinical Informatics, for Beckman Coulter in Brea, Calif. “Lab managers recognized that any area of the clinical laboratory that involves manual processing is subject to high error rates. This is particularly true of the pre-analytical and post-analytical phases of the lab testing workflow that humans handle. It is these areas that generate the most benefits when a lab blends Lean with new lab automation.”


“In an environment saturated with total lab automation solutions to help improve quality, labs constrained by size and budget need to look for solutions to help reduce variation in manual processes—including both physical and decision-making processes. Utilizing Lean processing initiatives, which are designed to eliminate waste and improve efficiency in various processes and industries, is the first step toward identifying key areas for improvement,” stated Joe Ross (above), Senior Marketing Manager, Automation and Informatics, Beckman Coulter. (Photo and caption copyright: Clinical Lab Products Magazine.)

Another area that benefits from both Lean and TLA is turnaround time (TAT). In general, Lean experts say, Lean labs can and should have an average TAT of less than 30 minutes. In addition, the combination of Lean and automation systems can help labs reduce variation in TAT.

One of the biggest challenges medical laboratories face is handling stat testing smoothly and efficiently. Lean management and best-in-class automation systems can process stat tests in less than 30 minutes—measured from the time the lab receives the sample to the time the results are sent to the ordering physician. The best-in-class systems beat this 30-minute TAT goal even during periods of peak processing workload. Most important, rather than merely hitting a 30-minute mean, the top performers have driven variation below 10 minutes.

Using Lean and Lab Automation in Microbiology at DynaLIFEDx

One lab that recently combined Lean and total laboratory automation was DynaLIFEDx Diagnostic Laboratory Services in Edmonton, Alberta, Canada. One of the largest labs in North America, DynaLIFEDx processes 900,000 microbiology specimens annually for more than 120 hospitals and health systems in Alberta, Saskatchewan, and the Northwest Territories.

When its microbiology lab combined Lean with TLA in September 2013, the lab recorded:

• Improved turnaround time;
• Reduced errors;
• Standardized specimen handling and processing;
• Streamlined operations; and,
• Enhanced antibiotic stewardship.

In addition, combining Lean and TLA helped the DynaLIFEDx microbiology lab lift productivity so much that it could handle a 15% increase in specimen volume over 18 months while also reducing staff by six full-time equivalent positions.

Lean and Automation Cut Microbiology Test TAT to Just 1 to 1.5 Days

“After the lab implemented its TLA system, the microbiology staff saw the time to report results drop from 1 to 5 days to 1.5 to 2 days,” stated Norma Page, the lab’s Vice President of Clinical Operations during a presentation at the Executive War College in New Orleans last May. “When the staff compared the number of labeling errors in one month (March 2012 versus March 2014), the number dropped from 106 out of 70,523 specimens processed (for a rate of 0.150%) to 13 errors among 77,951 specimens processed for an error rate of 0.017%.”
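The error-rate arithmetic in Page's comparison checks out; a quick sketch using the figures quoted above:

```python
# Labeling-error rates quoted for DynaLIFEDx (March 2012 vs. March 2014).
errors_before, specimens_before = 106, 70_523
errors_after, specimens_after = 13, 77_951

rate_before = 100 * errors_before / specimens_before
rate_after = 100 * errors_after / specimens_after

print(f"before: {rate_before:.3f}%")  # ~0.150%
print(f"after:  {rate_after:.3f}%")   # ~0.017%
print(f"relative reduction: {1 - rate_after / rate_before:.0%}")
```

In other words, the combination of Lean and TLA cut the labeling-error rate by roughly a factor of nine even as specimen volume grew.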


Norma Page (above) is a medical laboratory and finance professional with hands-on experience in all aspects of laboratory medicine, including testing services, patient care, systems and support infrastructure, quality management, laboratory integration, business acquisitions, and strategic, business and financial analysis. She is a registered medical laboratory technologist and holds a Master of Business Administration degree from Heriot-Watt University in Edinburgh, Scotland. (Photo and caption copyright: Dark Intelligence Group.)

Our sister publication, The Dark Report, published a seminal study that confirmed the performance advantages that Lean labs using lab automation have over non-Lean labs using automation. The study was done by Thomas Joseph, CEO of Visiun, Inc., of Ann Arbor, Mich.

Working from a database that included 100 labs, 14 of which were incorporating Lean methods, Joseph determined that Lean labs consistently outperformed non-Lean labs in the important measures of average test TAT, staff productivity, and reduction of outlier test reports, despite the fact that all labs were generally using comparable automated systems for chemistry, immunoassay, and hematology.


Thomas Joseph (above), CEO of Visiun, Inc., is a seasoned consulting professional with 20 years of consulting experience and areas of expertise that include financial management, operations assessment, and improvements using Lean strategies, and research and development. Joseph’s research in the area of performance metrics has led to the development of the most comprehensive database of performance metrics in the laboratory industry.

Lean Labs with Automation Sustain TAT Even with Large Test Volume

“With Lean labs, we saw that the relationship between test volume and TAT is almost flat, meaning Lean labs are managing results regardless of volume, because larger volumes have almost no effect on TAT,” observed Joseph. “What’s more, the larger Lean labs are doing just as well. The largest lab we studied did about 1.6 million annual tests and could do a routine CBC in about nine minutes. The smaller Lean labs, with annual volume of about 400,000 tests, did a routine CBC in about eight minutes. Work processes in Lean labs allow them to handle increased workload without suffering declines in TAT the way conventional labs do.” (See The Dark Report, Volume XV, No. 1, January 21, 2008.)

For all these reasons, in vitro diagnostic manufacturers have recognized the power of combining Lean with lab automation and the latest best-in-class total automation systems are designed to accommodate Lean labs. “For these systems, manufacturers incorporate Six Sigma principles into the actual automation workflow by working to eliminate bottlenecks on the automation line,” noted Ross.

Automation Solutions Should Be Designed to Eliminate Bottlenecks in Clinical Labs

“If you look at lab automation systems that don’t have a philosophy to eliminate bottlenecks, then specimens will get held up at the centrifuge or at various analyzers,” he explained. “Then, lab test turnaround times have wide variation, which physicians don’t like because it creates unpredictability in when the lab reports results to them.

“Conversely, the modules in best-in-class automation systems are designed to move at the same speed,” added Ross. “That includes the centrifuges, analyzers, decappers, recappers, and all essential components. When you do that, you start to get very consistent turnaround times because—if you have 100 samples and each sample gets loaded every three seconds—you will get results every three seconds. Therefore, your variation in turnaround is very, very minimal. When your lab does that, physicians ordering tests will see the consistency and thus the lab will see an overall improvement in its relationships with physicians,” stated Ross.

White Paper on Combining Lean and Total Laboratory Automation

To help clinical laboratories understand all the issues when implementing Lean and a total laboratory automation system, The Dark Report and Dark Daily have produced a white paper on this topic. Titled, “Buyer’s Guide for Clinical Laboratory Automation Achieving High Production Lab Automation and Lean Workflow: What Lab Managers Should Know Before Issuing the RFP,” the report can be downloaded here.

In the report, readers will find a thorough discussion of the issues related to combining Lean and TLA, along with an examination of the questions lab directors and pathologists will want to answer before they make a decision to purchase and implement a new TLA system.

—Joseph Burns
Related Information:

Buyer’s Guide for Clinical Laboratory Automation Achieving High Production Lab Automation and Lean Workflow: What Lab Managers Should Know Before Issuing the RFP

More Clinical Pathology Laboratories Are Buying Total Laboratory Automation

Innovative Labs Combining Lean and Automation in Clever Ways

The Economic Realities of Lab Automation

Implementing a Laboratory Automation System: Experience of a Large Clinical Laboratory


Read Full Post »

Microscopy and Spatially Resolved Chemical Analysis

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

 

Combining the Power of Mass Spectrometry, Microscopy

Published: Monday, Nov 9, 2015    http://www.technologynetworks.com/news.aspx?ID=184983

A tool that provides world-class microscopy and spatially resolved chemical analysis shows considerable promise for advancing a number of areas of study, including chemical science, pharmaceutical development, and disease progression.

The hybrid optical microscope/mass spectrometry-based imaging system developed at the Department of Energy’s Oak Ridge National Laboratory operates under ambient conditions and requires no pretreatment of samples to analyze chemical compounds with sub-micron resolution. One micron is equal to about 1/100th the width of a human hair. Results of the work were recently published by postdoctoral associate Jack Cahill, Gary Van Berkel, and Vilmos Kertesz of ORNL’s Chemical Sciences Division.

“Knowing the chemical basis of material interactions that take place at interfaces is vital for designing and advancing new functional materials that are important for DOE missions such as organic photovoltaics for solar energy,” Van Berkel said. “In addition, the new tool can be used to better understand the chemical basis of important biological processes such as drug transport, disease progression and response for treatment.”


The hybrid instrument transfers tiny amounts of a material such as human tissue or an organic polymer from a sample by a laser ablation process in which material is captured and transported via liquid stream to the ionization source of the mass spectrometer. In just seconds, a computer screen displays the results.

Researchers noted that the resolution of less than one micron is essential to accurately differentiate and distinguish between polymers and sub-components of similar-sized cells.

“Today’s mass spectrometry imaging techniques are not yet up to the task of reliably acquiring molecular information on a wide range of compound types,” Cahill said. “Examples include synthetic polymers used in various functional materials like light harvesting and emitting devices or biopolymers like cellulose in plants or proteins in animal tissue.”

This technology, however, provides the long-sought detailed chemical analysis through a simple interface between a hybrid optical microscope and an electrospray ionization system for mass spectrometry.

 


 

Read Full Post »

Signaling Stiffness Changes

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Sound Waves Levitate Cells to Detect Disease-Signaling Stiffness Changes

Wed, 11/04/2015  Acoustical Society of America (ASA)

http://www.mdtmag.com/news/2015/11/sound-waves-levitate-cells-detect-disease-signaling-stiffness-changes

 

Utah Valley University physicists are literally applying rocket science to the field of medical diagnostics. With a few key changes, the researchers used a noninvasive ultrasonic technique originally developed to detect microscopic flaws in solid fuel rockets, such as space shuttle boosters, to successfully detect cell stiffness changes associated with certain cancers and other diseases.

Brian Patchett, a research assistant and instructor within the Department of Physics at Utah Valley University, will describe the group’s method, which uses sound waves to manipulate and probe cells, during the Acoustical Society of America’s Fall 2015 Meeting, held Nov. 2-6, in Jacksonville, Fla.

The method combines a low-frequency ultrasonic wave to levitate the cells and confine them to a single layer within a fluid and a high-frequency ultrasonic wave to measure the cell’s stiffness.

 

http://www.mdtmag.com/sites/mdtmag.com/files/styles/content_body_image/public/embedded_image/2015/11/levitating-cells-sound.jpg

Basic setup of the group’s acoustic levitation device during normal use, with an ultrahigh-frequency pulser taking readings of the monolayer. (Credit: Brian Patchett/Utah Valley University)

 

“An acoustic wave is a pressure wave so it travels as a wave of high and low pressure. By trapping a sound wave between a transducer — such as a speaker — and a reflective surface, we can create a ‘standing wave’ in the space between,” explained Patchett. “This standing wave has stationary layers of high and low pressure, a.k.a. ‘anti-nodes,’ and areas, ‘the nodes’ where the pressure remains the same.”
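As a rough illustration of the physics Patchett describes (this is not the group's code, and the transducer frequency, sound speed, and cavity length below are assumed values, not from the article), the pressure nodes of a standing wave between a transducer and reflector sit every half wavelength:

```python
def node_positions(frequency_hz, sound_speed_m_s, cavity_length_m):
    """Pressure-node positions of a 1-D standing wave in a resonant cavity.

    Nodes (where pressure stays constant) fall every half wavelength from
    the reflector; levitated cells collect near these stationary planes.
    """
    wavelength = sound_speed_m_s / frequency_hz
    half = wavelength / 2
    n_nodes = int(cavity_length_m // half) + 1
    return [m * half for m in range(n_nodes)]

# Illustrative numbers: a 1 MHz transducer in water (c ~ 1480 m/s),
# with a 5 mm gap between transducer and reflector.
nodes = node_positions(1e6, 1480.0, 0.005)
print([f"{x * 1e3:.2f} mm" for x in nodes])
```

Lowering the levitation frequency spreads the nodes farther apart, which is why a low-frequency wave is used to trap the cells while a separate high-frequency wave probes their stiffness.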

This standing wave allowed the group to acoustically levitate the cells and isolate them in a manner similar to their natural state — as they would be within human tissue or the bloodstream. Previous work in this realm relied on “growing the cell cultures in a Petri dish, which tends to deform the structure, as well as create all sorts of interference,” Patchett said.

The significance of the group’s work is that it focuses on an unexplored method of measuring the properties of cells and how they change during the process of cancer and disease development. “The stiffness of the cell is the primary change detected with our high-frequency ultrasound; it reveals detailed information about the internal structure of the cell and how it changes in certain diseases,” Patchett said.

The group’s method can also help distinguish between different types of cancer — such as aggressive breast cancer vs. less aggressive forms. “By isolating the cells in a monolayer of fluid via acoustic levitation, we’re providing a better method for the detection of cell stiffness,” Patchett said. “This method can be used to explore the aspect of cells that changes during Alzheimer’s disease, the metastasis of cancer, or during the onset of autoimmune responses to better understand these conditions and provide insight into possible treatment methods.”

 

http://www.mdtmag.com/sites/mdtmag.com/files/styles/content_body_image/public/featured_image/2015/11/levitating-cells-sound2.jpg

Photo of the layers created by sine waves. (Credit: Brian Patchett/Utah Valley University)

 

One of the group’s key findings is that “by manipulating the shape of the wave that we use for levitation in specific ways, we’re able to create more precise, sharply defined layers,” Patchett said. “And, borrowing from previous cell culture work done by our group, our high-frequency ultrasound method detects changes within the stiffness of cells with high accuracy. By isolating the cells in a levitated monolayer, we hope to see these changes more clearly so that we can gain a better understanding of what’s happening within the cell and why.”

What kinds of applications might this method find? “It’s a really fantastic research method for exploring autoimmune disorders,” pointed out Timothy Doyle, lead scientist on the project and an assistant professor of physics at Utah Valley University.

As far as other applications, the group’s method may find use in clinics, hospitals, and surgical centers as a way to immediately detect and characterize cancer or other diseases.

“Our method identifies aggressive types of breast cancer, for example, while in the operating room,” Patchett noted. “Faster than current pathology methods, it will enable doctors to ensure speedier assessments and more effective treatment plans for patients — personalized to their specific needs, which, in turn, will end up being more cost effective in the long term.”

In the near future, the group plans to apply their method to a wide range of biological materials, including white blood cells undergoing activation, which is part of the immune response to an illness.

“We’re collaborating with the Huntsman Cancer Institute — part of the University of Utah healthcare system — to explore various types of breast tissues under levitation to refine our pathology detection methods,” Patchett said. “Our goal is to provide potentially life-saving, personalized medical treatments based on our ability to quickly and effectively detect cancers and diseases in patients.”

 

 

 

Read Full Post »

Laser Technology

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Laser Focus World   www.laserfocusworld.com

Ultrafast lasers simplify fabrication of 3D hydrogel tissue scaffolds

Multimode holographic waveguides tackle in vivo biological imaging

Mid-infrared Lasers: CMOS silicon-on-sapphire process produces broad mid-IR supercontinuum

Looking Back/Looking Forward: Positioning equipment—the challenge of building a solid foundation for optics
Stability and precision have been crucial for optics since the 19th century.
Jeff Hecht

Monolithic DFB QCL array aims at handheld IR spectral analysis
Many QCLs combined on a single chip demonstrate fully electronic wavelength tuning for stand-off IR spectroscopy of explosives and other materials.
Mark F. Witinski, Romain Blanchard, Christian Pfluegl, Laurent Diehl, Biao Li, Benjamin Pancy, Daryoosh Vakhshoori, and Federico Capasso

Quantum dots and silicon photonics combine in broadband tunable laser
A new wavelength-tunable laser diode combines quantum-dot technology and silicon photonics with large optical gains around the 1310 nm telecom window.
Tomohiro Kita and Naokatsu Yamamoto

Computer modeling boosts laser device development
A full quantitative understanding of laser devices is boosted by computer modeling, which is not only essential for efficient development processes, but also for identifying the causes of unexpected behavior.
Rüdiger Paschotta

 

 

Monolithic DFB QCL array aims at handheld IR spectral analysis
MARK F. WITINSKI, ROMAIN BLANCHARD, CHRISTIAN PFLUEGL, LAURENT DIEHL, BIAO LI, BENJAMIN PANCY, DARYOOSH VAKHSHOORI, and FEDERICO CAPASSO

Advances in infrared (IR) laser sources, optics, and detectors promise major new advances in areas of chemical analysis such as trace-gas monitoring, IR microscopy, industrial safety, and security.

One key type of photonic device that has yet to reach its full potential is a truly portable, noncontact (standoff), chemically versatile analyzer for fast Fourier-transform infrared (FTIR)-quality spectral examination of nearly any condensed-phase material. The unique challenges of standoff IR spectroscopy actually extend beyond advances in IR hardware, requiring the proper combination of several areas of expertise: cutting-edge optical design and laser fabrication, integrated laser electronics, thermally efficient hermetic packaging, statistical signal processing methods, and deep chemical knowledge.

At the core of the approach we have taken at Pendar Technologies is the monolithic distributed feedback (DFB) quantum-cascade laser (QCL) array. Invented in Federico Capasso’s group at Harvard University (Cambridge, MA) and licensed exclusively to Pendar, the continuously wavelength-tunable QCL array source is a highly stable broadband source that can be used for illumination in reflectance spectroscopy. Each element of the array is individually addressable and emits at a different wavelength by design.

The advantages of these QCL arrays over external-cavity (EC) QCLs stem from (1) the monolithic structure of QCL arrays and (2) their fully electronic wavelength tuning—that is, no moving gratings—allowing for much higher-speed acquisition through improved amplitude and wavelength stability. When integrated into a system, the result is robust, stable, and field-deployable.

One of the key advances that has enabled this technology to be fielded is the high-yield fabrication of each laser ridge in the QCL array from a single wafer such that every channel simultaneously meets the specified wavelength, power, and single-mode suppression ratio. Each of these parameters is critical to both efficient beam combining and to obtaining high-quality molecular spectroscopy once integrated.

With these hurdles largely overcome, the payoff in terms of spectrometer performance lies largely in a demonstrated shot-to-shot amplitude stability in pulsed mode of <0.1%—a factor of 50 more stable than is typical for EC QCLs, even when used in the lab. Most importantly, the DFB QCL noise is random, and averages toward an Allan variance limit quickly such that detector-noise-limited, high-quality spectra can be obtained for trace levels (for example, 1–50 µg/cm2) of typical powders in just 100 ms.

More DFB array advantages

While the stability advantage of DFBs vs. EC configurations has been well established, there are a few less-obvious aspects of DFB arrays that make them more suitable for real-world spectroscopy tools and, in particular, portable spectroscopy tools. For one, the laser array as a whole can maintain a 100% duty cycle while each laser in the array operates at only a 100/n (%) duty cycle, where n is the number of lasers in the array. Put another way, a laser array consisting only of pulsed QCLs can operate as a truly continuous-wave (CW) system, allowing for a high measurement duty cycle while possibly reducing the cost of fabrication.
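The duty-cycle bookkeeping is simple to verify; a minimal sketch:

```python
def per_laser_duty_cycle(n_lasers, aggregate_pct=100.0):
    """Duty cycle each laser must carry so the array as a whole is always on."""
    return aggregate_pct / n_lasers

# A 32-element DFB QCL array: each pulsed laser needs only ~3% duty cycle
# for the array to behave as a continuous-wave source.
print(f"{per_laser_duty_cycle(32):.3f}% per laser")  # 3.125%
```

This is why the aggregate thermal load can be kept low enough that, as noted below, the packaged prototypes need no active cooling.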

In a related way, when the array generates light at a 100% aggregate duty cycle (by using, for instance, 32 lasers at roughly 3% duty cycle each), the thermal heat-sinking requirements of the source are dramatically reduced. Indeed, our packaged prototypes do not even require active cooling to keep the system cool enough to run. A thermoelectric cooler is built into the package only to stabilize the temperature, which in turn stabilizes the 32 wavelengths (see Fig. 1).

FIGURE 1. A 200 cm-1 prototype QCL array with 32 QCLs is shown prior to beam combining and packaging (a), and experimental spectra from 32 adjacent QCLs are seen (b). (Courtesy of Pendar Technologies)

Finally, the arbitrary programmability of the QCL array opens up many new possibilities for experimental optimization. Certain lasers can be skipped, multiple lasers can fire at once, repetition rates and pulse durations can be set for each element, and so on. These advantages are only truly realized when the QCL array is instrumented into a full system.

Looking holistically at how best to integrate this new capability into a full system, it is critical to draft the link equations that govern the use of electrons to produce photons, the collection of photons scattered back, and finally the conversion from raw spectral information to chemical identification. In the case of mid-IR material identification, it becomes clear that three aspects are particularly consequential: (1) How broad a wavelength range is needed for the tool to be of maximum specificity without producing redundant or useless chemical information (that is, how many laser channels should be used, how should they be spaced with respect to one another, and over what total wavelength regime should they be spaced); (2) the mechanical and electro-optical design of the instrument; and (3) how to get the highest performance regressions against reference spectra while maintaining the high-speed identification that the QCL array actually enables.

With regard to the wavelength regions of interest (see Fig. 2), most of the spectral richness of an IR spectrum is centered in two bands, generally referred to as the functional group region (about 3.3–5.5 µm) and the fingerprint region (about 7–11 µm). The first is typically dominated by the stretch modes of certain common bond groups, while the latter includes bending modes of some functional groups as well as lower frequency modes that are characteristic of the macromolecule “backbone”—for instance, the torsional modes of a toluene ring found in many highly energetic materials. With support from the Department of Homeland Security (DHS)’s Widely Tunable Infrared Source (WTIRS) program and from the Army Research Lab, Pendar is developing a compact array module that fully covers 7–11 µm (900–1430 cm-1).
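The quoted wavelength and wavenumber ranges are two views of the same band, related by wavenumber (cm-1) = 10,000 / wavelength (µm):

```python
def um_to_wavenumber(wavelength_um):
    """Convert a wavelength in micrometers to a wavenumber in cm^-1."""
    return 10_000.0 / wavelength_um

# The fingerprint region: 7-11 um maps to roughly 1430-910 cm^-1.
print(round(um_to_wavenumber(7.0)))   # 1429
print(round(um_to_wavenumber(11.0)))  # 909
```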

FIGURE 2. An assemblage of IR spectra of many common explosives shows that each has at least one unique absorption feature in the wavelength ranges selected. The blue shaded box indicates strong water interference in the troposphere. The figure intentionally spans beyond 1800 cm-1 so as to illustrate that no new information is gained for this chemical class by shifting the longwave-IR (LWIR) source further to the blue until the midwave-IR (MWIR) is reached.

 

System architecture drivers

To maximize signal-to-noise ratio (SNR) while minimizing the required acquisition time, the system architecture is driven by the following first-order considerations:

1. Increasing the laser power, enabled by relaxed thermal constraints as the heat load is distributed over several modules (arrays) and laser waveguides.

2. Maximization of the measurement duty cycle, enabled by the fast, purely electronic control of the array, allowing close to zero-delay switching between lasers—that is, a laser is on at any time. This is also enabled by the distributed heat load among the laser units.

3. Improved source stability, wavelength accuracy, pulse-to-pulse amplitude, and frequency repeatability—all of which are needed to ensure that the source noise is not the limiting form of noise (compared to detector or speckle noise). Other researchers have studied the source-noise problem of commercial EC QCLs as well and concluded that the order-of-magnitude advantage in minimum detectable absorbance (MDA) offered by a DFB QCL carries through the full experiment.

Finally, once the spectra are digitized, the system must use complex chemometrics algorithms to ensure confident identification of threats in the presence of chemical clutter, deliberate interferents, and unknown backgrounds, without the intervention of an expert user. Our approach to real-time chemometrics is centered on the fact that for chemically cluttered situations, spectral libraries alone—no matter how large—cannot constitute the sole basis for chemometric analysis. Microphysics modeling and experimentation are also required, particularly in regard to crystal size distribution, clutter interactions, and chemical photolysis/reactions.

The key advance lies in the incorporation of chemical and physical understanding of the targets and their co-indicators. We are currently developing a four-tiered approach to the spectroscopic algorithms challenge:

1. Physics-based models. Reliable chemical detection from standoff measurements will involve transformation of the chemical signatures in the reference spectral library to reflect the physical and environmental conditions of the experiment. A physics-based model will thus be included in the detection algorithm to help us model the variability in a reference spectrum as a function of effects such as vapor pressure, deliquescence, photochemical lifetime, reactive lifetime, decomposition products, and so on to facilitate better comparison with the measured spectrum.

2. Situational effects. Effects of different substrates and their properties on the chemical signatures, and the angular dependence of spectra that are not clearly linked to equations of physics and chemistry, will be experimentally evaluated and included in the detection algorithm. In particular, experimentally measuring such variability will help us algorithmically model the variability of chemical signatures from some “gold standard” reference signature, which—in addition to the physical model—will enable better detection strategies.

3. Feature-based classification. Relevant feature vectors will be extracted from the reference library spectra and combined with knowledge of the chemistry to form a hierarchical decision tree that provides different levels of classification based on customer requirements. For instance, if a customer is only interested in whether a given chemical is an explosive, we might save computational cost by avoiding searching through the leaves of the decision tree to identify the exact chemical.

4. Real-time atmospheric measurements. Once validated, the model will be suitable for field implementation by the inclusion of an integrated sensor suite that simultaneously records atmospheric pressure, temperature, relative humidity, solar flux, wind magnitude, and water-vapor mixing ratio. With these design drivers considered, Pendar recently completed the build of a handheld demonstration system.
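The feature-based classification tier (3) can be sketched as a toy hierarchical classifier: match the measured spectrum against a labeled library, and descend to the exact compound only when the customer asks for it. The library entries, spectra, and class labels below are invented placeholders, not real spectroscopic data.

```python
import numpy as np

# Toy spectral library: name -> (class label, spectrum).
# All values are illustrative stand-ins, not real spectra.
library = {
    "TNT":      ("explosive", np.array([0.9, 0.1, 0.4, 0.0])),
    "RDX":      ("explosive", np.array([0.8, 0.3, 0.1, 0.2])),
    "sucrose":  ("benign",    np.array([0.1, 0.9, 0.2, 0.5])),
    "caffeine": ("benign",    np.array([0.2, 0.2, 0.9, 0.1])),
}

def classify(measured, level="class"):
    """Hierarchical decision: stop at the class level if that is all the
    customer needs; descend to the exact compound only on request."""
    scores = {name: float(np.corrcoef(measured, spec)[0, 1])
              for name, (_, spec) in library.items()}
    best = max(scores, key=scores.get)
    if level == "class":
        return library[best][0]   # cheap answer: threat class only
    return best                   # full answer: exact compound

measured = np.array([0.85, 0.15, 0.35, 0.05])
print(classify(measured), classify(measured, "compound"))
```

In a real system the decision tree would be built from chemistry knowledge rather than raw correlation, but the computational saving is the same: the class-level answer never searches the leaves.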

Figure 3 shows the experimentally obtained spectra for two nonhazardous chemical targets as a function of standoff distance. The yellow line in each panel shows the library FTIR (“true”) spectrum. Agreements of r2 > 0.9 were typical. With the prototype system as an extrapolation point, continued, focused advances in the technology are now underway to open myriad frontiers in molecular spectroscopy.
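The r2 agreement metric can be reproduced in a few lines: normalize both curve areas to a common value, as the Figure 3 caption describes, then compute the coefficient of determination. The Gaussian "spectra" here are synthetic stand-ins for the measured and library data.

```python
import numpy as np

def r2_against_library(measured, reference):
    """Area-normalize both spectra, then score their agreement with the
    coefficient of determination r^2 (1.0 = perfect match)."""
    m = measured / measured.sum()
    r = reference / reference.sum()
    ss_res = np.sum((m - r) ** 2)
    ss_tot = np.sum((m - m.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Synthetic example: a library band and a scaled, slightly offset measurement.
x = np.linspace(0.0, 10.0, 101)
reference = np.exp(-(x - 5.0) ** 2)
measured = 2.0 * reference + 0.01
score = r2_against_library(measured, reference)
```

Because both curves are area-normalized first, the overall intensity scale drops out, and only shape differences lower the score.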

 

FIGURE 3. Standoff spectra of acetaminophen and ibuprofen for three target distances. The black line shows the FTIR spectrum of the same samples measured using a diffuse reflectance accessory. The only data processing applied is the normalization of the curve areas to a common value.

 

ACKNOWLEDGEMENT Pendar Technologies was formed in August 2015 through a merger between Pendar Medical (Cambridge, MA), a portable spectroscopy company founded by Daryoosh Vakhshoori (who was previously at Ahura Scientific and CoreTek), and QCL sensing startup Eos Photonics (Cambridge, MA), a Harvard spinoff founded by professor Federico Capasso and his postdocs.

 

Quantum dots and silicon photonics combine in broadband tunable laser
TOMOHIRO KITA and NAOKATSU YAMAMOTO

A new wavelength-tunable laser diode combines quantum-dot (QD) technology and silicon photonics with large optical gains around the 1310 nm telecom window and is amenable to integration of other passive and active components towards a truly integrated photonic platform.

A new heterogeneous wavelength-tunable laser diode, configured using quantum dot (QD) and silicon photonics technology, leverages large optical gains in the 1000–1300 nm wavelength region using a scalable platform for highly integrated photonics devices. A cooperative research effort between Tohoku University (Sendai, Japan) and the National Institute of Information and Communications Technology (NICT; Tokyo, Japan) has resulted in the demonstration of broadband tuning of 44 nm around a 1230 nm center wavelength with an ultrasmall device footprint, with many more configurations with various performance metrics possible.

Recently developed high-capacity optical transmission systems use wavelength-division multiplexing (WDM) systems with dense frequency channels. Because the frequency channels in the conventional band (C-band) at 1530–1565 nm are overcrowded, the frequency utilization efficiency of such WDM systems becomes saturated. However, extensive and unexploited frequency resources are buried in the near-infrared (NIR) wavelength regions such as the thousand (T) and original (O) bands between 1000 and 1260 nm and 1260 and 1350 nm, respectively.

Quantum dot-based optical gain media have various attractive characteristics, including ultrabroad optical gain bandwidths, high-temperature device stability, and small linewidth enhancement factors, as well as silicon photonic wire waveguides based on silicon-on-insulator (SOI) structures that are easily amenable to constructing highly integrated photonics devices.1-4

The photonic devices used for short-range data transmission are required to have a small footprint and low power consumption. Therefore, compact, low-power wavelength-tunable laser diodes are key devices for use in higher-capacity data transmission systems designed to use these undeveloped frequency bands, and our heterogeneous wavelength-tunable laser diode, consisting of a QD optical gain medium and a silicon photonics external cavity, is a promising candidate.5

Quantum dot optical amplifier

Ultrabroadband optical gain media spanning the T- and O-band are effectively fabricated by using QD growth techniques on large-diameter gallium-arsenide (GaAs) substrates. Our sandwiched sub-nano-separator (SSNS) growth technique is a simple and efficient method for obtaining high-quality QDs (see Fig. 1).

 

FIGURE 1. A cross-section (a) shows a quantum dot (QD) device grown using the SSNS technique, resulting in a high-density, high-quality QD structure (b) that is used to create a typical SOA (c) using QD optical gain.

 

In the SSNS method, three monolayers (each around 0.85 nm thick) of GaAs thin film are grown in an indium gallium arsenide (InGaAs) quantum well (QW) under the QDs. We had previously observed many large, coalescent dots that could induce crystal defects in QD devices grown using a conventional technique without SSNS. Now, we can obtain high-density (8.2 × 10^10 cm^-2), high-quality QD structures, since the SSNS technique successfully suppresses the formation of coalescent dots.

A ridge-type semiconductor waveguide was fabricated for single-mode transmission. The semiconductor optical amplifier (SOA) has an anti-reflection (AR)-coated facet to connect to a silicon photonics chip with low reflection, and a cleaved facet used as a reflecting mirror in the laser cavity.

To fabricate the SOA, the SSNS growth technique was combined with molecular beam epitaxy. Quantum dots of indium arsenide (InAs), 20–30 nm in diameter, were grown within an InGaAs QW. Seven of these QD layers were stacked to achieve broadband optical gain. This QD-SOA is then used as the optical gain medium for the heterogeneous laser, and can be complemented by other communication technology devices such as a high-speed modulator, a two-mode laser, and a photoreceiver.6, 7

Silicon photonics ring resonator filter

With the QD-SOA complete, a wavelength filter is fabricated using silicon photonics techniques. It includes a spot-size converter, with a silicon oxide (SiOx) core and a tapered Si waveguide, that connects the QD-SOA to the Si photonic wire waveguide while minimizing optical reflections and coupling losses (see Fig. 2).

 

FIGURE 2. A microscope image (a) shows a silicon-photonics-based wavelength-tunable filter. In a transmittance analysis (b), the red and blue dotted lines indicate the transmittance of a small ring resonator with free spectral range FSR1 and a large ring resonator with FSR2, respectively, and the solid line indicates the product of each transmittance. The tuning wavelength range is determined from the FSR difference of the two rings. A smaller difference in the FSR provides a wider wavelength tuning range, even when the transmittance difference between the main and side peaks is small.

 

The wavelength-tunable filter consists of two ring resonators of different size. The Vernier effect of these two ring resonators allows only light of a specific wavelength to reflect back to the QD-SOA. Furthermore, tantalum micro-heaters formed above the resonators provide a means whereby the laser wavelength can be tuned through the thermo-optic effect.

Essentially, the wavelength tuning operation of the double-ring-resonator wavelength filter is achieved through the Vernier effect, wherein each ring resonator acts as a wavelength filter with a constant wavelength interval called the free spectral range (FSR), which is inversely proportional to the circumference of the ring. The tuning wavelength range is determined by the difference between the two rings' FSRs, FSR1 and FSR2.

A smaller difference in the FSR provides a wider wavelength tuning range, even when the transmittance difference between the main and side peaks is small. On the other hand, a sufficiently large transmittance difference is required to achieve stable single-mode lasing and is obtained using large FSR ring resonators.

Silicon photonics allows us to fabricate an ultrasmall ring resonator with a large FSR because of the strong light confinement in the waveguide. Each ring resonator consists of four circle quadrants and four straight segments, and the quadrant radius was chosen to be 10 µm to avoid bending losses. The FSRs of the ring resonators and the coupling efficiency between the bus waveguide and each ring resonator are optimized to obtain a wide wavelength tuning range and a sufficient transmittance difference.

The FSRs and the coupling efficiencies of the double ring resonators are designed to obtain a 50 nm wavelength tuning range and 1 dB transmittance difference. We have since fabricated various wavelength-tunable laser diodes, including a broadband tunable laser diode, a narrow spectral-linewidth tunable laser diode, and a high-power integrated tunable laser diode by using a silicon photonics wavelength filter and a commercially available C-band SOA.8, 9
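The Vernier design rules above can be checked numerically: a ring's FSR is λ²/(n_g·L), and the two-ring tuning range extends to FSR1·FSR2/|FSR1 − FSR2|. The group index and ring circumferences below are assumed round numbers chosen to land near the 50 nm design target, not the published design values.

```python
# Illustrative Vernier-filter design check (assumed parameters).
wavelength = 1230e-9     # center wavelength [m]
n_g = 4.2                # assumed group index of the Si wire waveguide
L1, L2 = 70e-6, 77e-6    # assumed ring circumferences [m]

fsr1 = wavelength**2 / (n_g * L1)   # free spectral range, small ring
fsr2 = wavelength**2 / (n_g * L2)   # free spectral range, large ring

# Vernier effect: the tuning range grows as the two FSRs approach each other.
tuning_range = fsr1 * fsr2 / abs(fsr1 - fsr2)
print(f"FSR1 = {fsr1*1e9:.2f} nm, FSR2 = {fsr2*1e9:.2f} nm, "
      f"range = {tuning_range*1e9:.1f} nm")
# FSR1 = 5.15 nm, FSR2 = 4.68 nm, range = 51.5 nm
```

A 10% difference in circumference thus multiplies the single-ring FSR by roughly a factor of ten, which is exactly the trade-off against transmittance contrast described above.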

The tunable laser diode

Using stepper motor controllers, the QD-SOA—kept at approximately 25°C using a thermoelectric cooler—and the silicon photonics wavelength filter are butt-jointed (see Fig. 3). The lasing wavelength is controlled by the temperature of a micro-heater placed on the ring resonators. With physical footprints of 600 µm × 1 mm and 1 × 2 mm for the wavelength filter and the QD-SOA, respectively, the total device size of the tunable laser diode is just 1 × 3 mm.

 

FIGURE 3. A schematic shows how the heterogeneous wavelength-tunable laser diode is constructed.

 

Measured using a lensed fiber, the laser output from the cleaved facet of the QD-SOA shows single-mode lasing characteristics with a laser oscillation threshold current of 230 mA. Maximum fiber-coupled output power is 0.4 mW when the QD-SOA injection current is 500 mA. As the ring resonator temperature is increased by a heater with 2.1 mW/nm power consumption, the superimposed lasing spectra show a 44 nm wavelength tuning range with more than a 37 dB side-mode-suppression ratio between the ring resonator’s modes. The 44 nm wavelength tuning range of our heterogeneous QD/Si photonics wavelength-tunable laser is, to our knowledge, the broadest achieved to date. The 44 nm tuning range around 1230 nm corresponds to 8.8 THz in the frequency domain, which is far larger than the 4.4 THz frequency that is available within the C-band.
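The quoted frequency span follows from the first-order conversion Δν ≈ c·Δλ/λ², which gives about 8.7 THz for 44 nm centered at 1230 nm, consistent with the ~8.8 THz figure (the exact value depends on whether the band edges or the center wavelength are used):

```python
# Convert the 44 nm tuning range at a 1230 nm center wavelength
# into a frequency span: d(nu) ~ c * d(lambda) / lambda^2.
c = 299_792_458.0    # speed of light [m/s]
center = 1230e-9     # center wavelength [m]
span = 44e-9         # wavelength tuning range [m]

freq_span = c * span / center**2
print(f"{freq_span/1e12:.1f} THz")   # 8.7 THz
```

The same formula applied to the 35 nm width of the C-band at ~1550 nm yields the ~4.4 THz comparison value cited in the article.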

Our heterogeneous laser is suitable for use as a light source on a silicon photonics platform that includes other optical components such as high-speed modulators and germanium (Ge)-based detectors. In addition to application as a single-chip broadband optical transceiver for telecommunications, the laser could also be applied to biomedical imaging applications such as optical coherence tomography (OCT), considering the low absorption of NIR light at 1310 nm in the presence of water.

ACKNOWLEDGEMENTS This research was partially supported by the Strategic Information and Communications R&D Promotion Program (SCOPE), of Japan’s Ministry of Internal Affairs and Communications and a Grant-in-Aid for Scientific Research of the Japan Society for the Promotion of Science.

REFERENCES

1. Y. Arakawa and H. Sakaki, Appl. Phys. Lett., 40, 11, 939–941 (1982).

2. D. L. Huffaker et al., Appl. Phys. Lett., 73, 18, 2564–2566 (1998).

3. R. A. Soref, Proc. IEEE, 81, 12, 1687–1706 (1993).

4. B. Jalali and S. Fathpour, J. Lightwave Technol., 24, 12, 4600–4615 (2006).

5. T. Kita et al., Appl. Phys. Express, 8, 6, 062701 (2015).

6. N. Yamamoto et al., Jpn. J. Appl. Phys., 51, 2S, 02BG08 (2012).

7. N. Yamamoto et al., Proc. OFC, Los Angeles, CA, paper W2A.24 (Mar. 2015).

8. T. Kita et al., Appl. Phys. Lett., 106, 11, 111104 (2015).

9. N. Kobayashi et al., J. Lightwave Technol., 33, 6, 1241–1246 (2015).

 

Computer modeling boosts laser device development
RÜDIGER PASCHOTTA

Computer modeling supports a full quantitative understanding of laser devices, which is essential not only for efficient development processes, but also for identifying the causes of unexpected behavior.

Computer modeling can give valuable insight into the function of laser devices. It can even reveal internal details that could not be observed in experiments, and thus allows one to develop a comprehensive understanding from which laser development can profit enormously. For example, the performance potential of certain technologies can be fully exploited, and time-consuming, expensive iterations in the development process can be avoided. Some typical examples illustrate the benefits of computer modeling for laser device development.

Example 1: Q-switched lasers

FIGURE 1. Evolution of the transverse beam profile (shown with a color scale) and the optical power (black circles, in arbitrary units) in an actively Q-switched laser is simulated with RP Fiber Power software using numerical beam propagation. The color scale is normalized for each round trip according to the time-dependent optical power so that the variation of the beam diameter can be seen.
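For intuition (not a substitute for the numerical beam propagation in Figure 1), the giant-pulse buildup in an actively Q-switched laser can be sketched with a two-variable point model: intracavity power and round-trip gain, coupled through gain saturation. All parameter values below are invented for illustration.

```python
# Minimal point model of an actively Q-switched laser: intracavity power P
# and round-trip gain g after the Q-switch opens. Illustrative numbers only;
# this omits the transverse beam dynamics simulated in Figure 1.
T_rt = 10e-9     # cavity round-trip time [s]
loss = 0.10      # round-trip loss with the Q-switch open
g = 0.30         # stored round-trip gain at switch opening
E_sat = 100e-6   # saturation energy of the gain medium [J]

P = 1e-6         # seed power from spontaneous emission [W]
dt = T_rt / 20.0
peak = 0.0
for _ in range(100_000):            # ~50 us of evolution
    dP = P * (g - loss) / T_rt      # net round-trip amplification
    dg = -P * g / E_sat             # gain saturation by the pulse
    P += dP * dt
    g += dg * dt
    peak = max(peak, P)
# A single giant pulse builds up and depletes the gain; the peak power
# is of order E_sat * (g0 - loss) / T_rt, i.e., a few hundred watts here.
```

The pulse builds up exponentially while g exceeds the loss, peaks when the saturated gain crosses the loss level, and then decays, which is the qualitative behavior visible in the power trace of Figure 1.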

Example 2: Mode-locked lasers

Example 3: Ultrashort-pulse fiber amplifiers

FIGURE 2. The evolution of pulse energy and forward ASE powers in a four-stage fiber amplifier system with various types of ASE suppression between the stages, calculated with a comprehensive computer model.

FIGURE 3. Form-based software can be used to model laser devices such as a fiber amplifier. It is essential that such forms be made or modified by the user or by technical support, so that they can be tailored to specific applications.


Documentation and support

For any modeling task, documentation of methods and results is essential. The documentation must not only explain details of the user interface, but must inform the user what kind of physical model was used, what simplifying assumptions were made, and what limitations need to be considered. Unfortunately, software documentation is often neglected. In case of doubt, competent technical support should be available—not only for helping with the handling of the software, but also offering detailed technical and scientific advice. For example, a beginner may find it difficult to decide which kind of model should be implemented for a certain purpose and which possibly disturbing effects need to be considered. Such support should come from a competent expert in the field rather than just a programmer.
Rüdiger Paschotta is founder and executive of RP Photonics Consulting, Bad Dürrheim, Germany; e-mail: paschotta@rp-photonics.com; www.rp-photonics.com

 

 

 

Read Full Post »

Capsule Robot

Curator: Larry H. Bernstein, MD, FCAP

 

 

Motivation

Medical capsule robots are cm-size mechatronic devices designed to perform medical tasks by entering the human body through natural orifices. Wireless capsules embedding a miniature camera have been available since 2000 for the diagnosis of small intestine diseases. Due to the complexity of operating inside the human body, capsule robots to date have been designed in an ad hoc fashion, relying on profound expertise acquired through many years of experience.

Our Research

We are trying to systematize miniaturized wireless medical device design by creating a cyber-physical design environment that will lower the barriers to design space exploration, thus accelerating progress to prototyping.

A systematic approach to the design of pill-size medical devices is possible by outlining the crosscutting constraints that these systems must address. The main ones are (1) size – ideally, a capsule device should be small enough to swallow or to enter natural orifices without requiring a dedicated incision; (2) power consumption – given the limited space available onboard, energy is limited; (3) communication bandwidth – wireless signals must be transmitted through the human body with a sufficient data rate; (4) fail-safe operation – since the device is deep inside the human body, the user has no access to it; and (5) effective interaction with the target site, according to the specific functions the device is required to fulfill. Given these common constraints, it is possible to identify a general system architecture for a pill-size medical device consisting of the following general modules: (1) a central processing unit (CPU) that can be programmed by the user to accomplish a specific task; (2) a communication submodule that links the device with user intent; (3) a source of energy that powers the system; and (4) sensors and (5) actuators, both of which interact with the surrounding environment to accomplish one or more specific tasks. It is also desirable for the designers to have a model of the environment, in order to predict the effectiveness of the specific design in accomplishing the desired task.
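As a toy illustration of this modular architecture, the five submodule kinds and two of the crosscutting constraints (size and power) can be encoded as a simple design screen. All module names, lengths, and power figures below are hypothetical placeholders, not values from the Vanderbilt library.

```python
from dataclasses import dataclass

# Hypothetical module catalog mirroring the five-part architecture:
# CPU, communication, power, sensing, actuation. Numbers are illustrative.
@dataclass
class Module:
    name: str
    kind: str           # "cpu" | "comm" | "power" | "sensor" | "actuator"
    length_mm: float
    draw_mw: float      # average power drawn (negative = supplied)

def check_design(modules, max_length_mm=26.0):
    """Screen a capsule design against two crosscutting constraints:
    total size (swallowable) and power budget (supply must cover draw),
    plus the minimum set of required submodules."""
    length = sum(m.length_mm for m in modules)
    budget = -sum(m.draw_mw for m in modules)   # > 0 means surplus supply
    kinds = {m.kind for m in modules}
    required = {"cpu", "comm", "power"}
    return (length <= max_length_mm and budget >= 0.0
            and required <= kinds)

design = [
    Module("MCU", "cpu", 5.0, 3.0),
    Module("radio", "comm", 6.0, 10.0),
    Module("battery", "power", 8.0, -20.0),
    Module("camera", "sensor", 5.0, 5.0),
]
print(check_design(design))   # True: fits, powered, minimally complete
```

A design environment of the kind described above would perform far richer checks (bandwidth through tissue, fail-safe behavior, environment models), but the composable-module idea is the same.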

Starting from this systematic approach, we are creating a web-based cyber-physical design framework that will offer different options for the basic submodules of the capsule robot that can be integrated to obtain a simulation of the expected performance. The main goal of the design environment is to lower the barriers to design space exploration, accelerating progress to prototyping, and increasing the probability of success for each prototype.

 

 

 

You May Be Swallowing a Capsule Robot in the Near Future

Through a website and a paper revealed at a pair of Institute of Electrical and Electronics Engineers (IEEE) conferences, Assistant Professor of Mechanical Engineering Pietro Valdastri, Associate Professor of Computer Engineering Akos Ledeczi and their team made the capsule hardware and software open-source.

The paper, titled “Systematic Design of Medical Capsule Robots,” ran in a special issue of IEEE Design & Test magazine dedicated to cyber-physical systems for medical applications. Within a few years, Vanderbilt’s capsule robots, made small enough to be swallowed, could be used for preventative screenings and to diagnose and treat a number of internal diseases.

“We’ve done custom capsule design – one for the colon, one for the stomach, another one with a surgical clip to stop bleeding – but we saw we were basically reusing the same components,” said Valdastri, director of Vanderbilt’s Science and Technology of Robotics in Medicine (STORM) Lab. “Like it is with Lego bricks, you can reassemble them for different functions. We wanted to provide the people working in this field with their own Lego bricks for their own capsules.”

Now research groups with hypotheses about how to use the capsules won’t have to redesign boards and interfaces from scratch, which means they can get to the prototyping stage faster.

Medical capsule robots differ from the PillCam, put on the market in 2001, because they can be manipulated to perform internal tasks rather than just passing through the body and recording video.

The paper explains the hardware modules available, which handle computation, wireless communication, power, sensing and actuation. Each is designed to interface easily with new modules contributed from other research groups.

On the software side, Vanderbilt engineers used TinyOS — a free, open-source, flexible operating system — to develop reusable components.

Ledeczi, senior research scientist at Vanderbilt’s Institute for Software Integrated Systems, said a medical capsule robot is the ideal example of a cyber-physical system.

It must work inside the challenging physical environment of the human body, sense its environment and move through it effectively, and then complete tasks such as releasing a drug, taking a tissue sample, or deploying a clamp. Finally, it must constantly communicate with a base station throughout the entire process.

“Our focus is the design environment, not the software per se, with the goal of easing the learning curve for new researchers and engineers who start in this field,” Ledeczi said. “Designing a capsule from scratch requires deep hardware, software and domain expertise.”

By providing a hardware and software component library and the tools to make their composition easy, Vanderbilt opens up the field of medical capsule robots to engineers and scientists who have great ideas but aren’t hardware or software experts, and makes development costs far more affordable, he said.

Already, the Royal Infirmary of Edinburgh in Scotland, the Chinese University of Hong Kong, Chonnam National University in South Korea, and a number of other institutions have demonstrated an interest in using the technology.

The team presented its work at the IEEE/RSJ International Conference on Intelligent Robots and Systems in Hamburg, Germany, in September and the IEEE International Conference on Robotics and Automation in Seattle, Washington, in May.

 

 

Read Full Post »
