Posts Tagged ‘nanotechnology’

Dialysis alternative

Curators: Larry Bernstein, MD and Jennifer Schwartz, Ursuline College and Cleveland Clinic



VU Inside: Dr. William Fissell’s Artificial Kidney

by Amy Wolf | Feb. 12, 2016

Vanderbilt is using a microchip to build a first-ever artificial kidney

Vanderbilt University Medical Center nephrologist and Associate Professor of Medicine Dr. William H. Fissell IV is making major progress on a first-of-its-kind device to free kidney patients from dialysis. He is building an implantable artificial kidney with microchip filters and living kidney cells that will be powered by a patient’s own heart. “The bio-hybrid device will mimic a kidney to remove enough waste products, salt and water to keep a patient off dialysis,” said Fissell.

Fissell says the goal is to make it small enough, roughly the size of a soda can, to be implanted inside a patient’s body.


The key to the device is a microchip.

“It’s called silicon nanotechnology. It uses the same processes that were developed by the microelectronics industry for computers,” said Fissell.

The chips are affordable, precise and make ideal filters. Fissell and his team are designing each pore in the filter one by one based on what they want that pore to do. Each device will hold roughly fifteen microchips layered on top of each other.

But the microchips have another essential role beyond filtering.

“They’re also the scaffold in which living kidney cells will rest,” said Fissell.



An example of the microchip filter being used inside Fissell’s artificial kidney. (Vanderbilt University)


Living kidney cells

Fissell and his team use live kidney cells that will grow on and around the microchip filters. The goal is for these cells to mimic the natural actions of the kidney.

“We can leverage Mother Nature’s 60 million years of research and development and use kidney cells that fortunately for us grow well in the lab dish, and grow them into a bioreactor of living cells that will be the only ‘Santa Claus’ membrane in the world: the only membrane that will know which chemicals have been naughty and which have been nice. Then they can reabsorb the nutrients your body needs and discard the wastes your body desperately wants to get rid of,” said Fissell.

Avoiding organ rejection

Because this bio-hybrid device sits out of reach from the body’s immune response, it is protected from rejection.

“The issue is not one of immune compliance, of matching, like it is with an organ transplant,” said Fissell.

How it works

The device operates naturally with a patient’s blood flow.

“Our challenge is to take blood in a blood vessel and push it through the device. We must transform that unsteady pulsating blood flow in the arteries and move it through an artificial device without clotting or damage.”

Fluid dynamics

And that’s where Vanderbilt biomedical engineer Amanda Buck comes in. Buck is using fluid dynamics to see if there are certain regions in the device that might cause clotting.

“It’s fun to go in and work in a field that I love, fluid mechanics, and get to see it help somebody,” said Buck.

She uses computer models to refine the shape of the channels for the smoothest blood flow. The team then rapidly prototypes each new design with 3-D printing and tests it, iterating until blood flows through the device as smoothly as possible.
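As a rough illustration of the kind of screening such flow models perform, here is a minimal sketch (not from the Vanderbilt project) that estimates the Reynolds number and wall shear stress for steady flow in a circular channel, using typical literature values for whole blood; low wall shear is one commonly cited indicator of clot-prone regions:

```python
import math

def reynolds_number(velocity_m_s, diameter_m, density=1060.0, viscosity=3.5e-3):
    """Reynolds number for flow in a circular channel.
    Density (kg/m^3) and viscosity (Pa*s) are typical values for whole blood."""
    return density * velocity_m_s * diameter_m / viscosity

def wall_shear_stress(flow_rate_m3_s, radius_m, viscosity=3.5e-3):
    """Wall shear stress for steady Poiseuille flow in a circular tube:
    tau = 4 * mu * Q / (pi * r^3). Low-shear regions are associated with clot risk."""
    return 4.0 * viscosity * flow_rate_m3_s / (math.pi * radius_m ** 3)

# Hypothetical example: a 1 mm diameter channel at 10 cm/s mean velocity
d = 1e-3
v = 0.1
q = v * math.pi * (d / 2) ** 2          # volumetric flow rate
print(f"Re  = {reynolds_number(v, d):.1f}")   # far below ~2000, so laminar
print(f"tau = {wall_shear_stress(q, d / 2):.2f} Pa")
```

The channel dimensions and velocity here are illustrative assumptions, not the device's actual specifications.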



Vanderbilt biomedical engineer Amanda Buck is using fluid dynamics to see if there are certain regions in the device that might cause clotting.


Future human trials

Fissell says he has a long list of dialysis patients eager to join a future human trial. Pilot studies of the silicon filters could start in patients by the end of 2017.

“My patients are absolutely my heroes,” said Fissell. “They come back again and again and they accept a crushing burden of illness because they want to live. And they’re willing to put all of that at risk for the sake of another patient.”

Federal investment

The National Institutes of Health awarded a four-year, $6 million grant to Fissell and his research partner Shuvo Roy from the University of California at San Francisco. The two investigators are longtime collaborators on this research. In 2003, the kidney project attracted its first NIH funding, and in 2012 the Food and Drug Administration selected the project for a fast-track approval program. The work is supported by NIH grant 1U01EB021214-01.

The National Kidney Foundation reports that in 2012, Federal Medicare dollars paid more than $87 billion caring for kidney disease patients (not including prescription medications).

Desperate need

Transplant of a human kidney is the best treatment for kidney failure, but donor kidneys are in short supply. According to the U.S. Organ Procurement and Transplantation Network, although more than 100,000 patients in the United States are on the waiting list for a kidney transplant, last year only 17,108 received one.

In all, the National Kidney Foundation says more than 460,000 Americans have end-stage renal disease and every day, 13 people die waiting for a kidney.

Read more about the NIH grant here.

By Amy Wolf

Media Inquiries:
Craig Boerner, (615) 322-4747
Amy Wolf, (615) 322-NEWS



Advanced Nanospectroscopy

Larry H. Bernstein, MD, FCAP, Curator



Graphene Enables Nanoelectromechanical Systems Integration

BARCELONA, Spain, Jan. 21, 2016 — Combining nanoelectromechanical systems (NEMS) with on-chip optics holds promise as a method to actively control light at the nanoscale, and now a hybrid system has overcome the challenges of integrating such nanoscale devices with optical fields, thanks to the material graphene.

Researchers from the Institute of Photonic Sciences (ICFO) have demonstrated an on-chip graphene NEMS suspended a few tens of nanometers above nitrogen-vacancy centres (NVCs), which are stable single-photon emitters embedded in nanodiamonds. The work confirms that graphene is an ideal platform for both nanophotonics and nanomechanics, the researchers said.

Due to its electromechanical properties, graphene NEMS can be actuated and deflected electrostatically over a few tens of nanometers with modest voltages applied to a gate electrode, the researchers found. The graphene motion can thus be used to modulate the light emission by the NVC, while the emitted field can be used as a universal probe of the graphene position. The optomechanical coupling between the graphene displacement and the NVC emission is based on near-field, dipole-dipole interaction.
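The modulation mechanism can be sketched numerically. Assuming the non-radiative transfer rate into graphene scales as d⁻⁴ with distance (the near-field scaling expected for a 2D absorber), relative NVC emission versus graphene position looks like this; the crossover distance d0 is a hypothetical illustrative parameter, not a value from the paper:

```python
def nvc_intensity(d_nm, d0_nm=30.0):
    """Relative NVC photon emission versus graphene distance d.
    Assumes non-radiative energy transfer into graphene scales as d^-4;
    d0 is the (hypothetical) distance at which transfer equals radiative decay.
    I/I0 = Gamma_r / (Gamma_r + Gamma_nr) = 1 / (1 + (d0/d)**4)."""
    return 1.0 / (1.0 + (d0_nm / d_nm) ** 4)

# Electrostatically deflecting the membrane from 60 nm down to 30 nm
# strongly dims the emitter, which is what makes the emission a
# sensitive probe of graphene position.
for d in (60.0, 45.0, 30.0):
    print(f"d = {d:.0f} nm -> relative emission {nvc_intensity(d):.2f}")
```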


False color scanning electronic micrograph of a hybrid graphene-nitrogen-vacancy nearfield nano-optomechanical system. Courtesy of ICFO.

The researchers observed that the coupling strength increased strongly for shorter distances and was enhanced because of graphene’s 2D character and linear dispersion. These achievements hold promise for selective control of emitter arrays on-chip, optical spectroscopy of individual nano-objects, and integrated optomechanical information processing. The ICFO team also said the hybrid device could advance quantum optomechanics.

The research was published in Nature Communications (doi: 10.1038/ncomms10218).

Electromechanical control of nitrogen-vacancy defect emission using graphene NEMS

Antoine Reserbat-Plantey, Kevin G. Schädler, Louis Gaudreau, Gabriele Navickaite, Johannes Güttinger, et al.

Nature Communications 2016; 7(10218)

Despite recent progress in nano-optomechanics, active control of optical fields at the nanoscale has not been achieved with an on-chip nano-electromechanical system (NEMS) thus far. Here we present a new type of hybrid system, consisting of an on-chip graphene NEMS suspended a few tens of nanometres above nitrogen-vacancy centres (NVCs), which are stable single-photon emitters embedded in nanodiamonds. Electromechanical control of the photons emitted by the NVC is provided by electrostatic tuning of the graphene NEMS position, which is transduced to a modulation of NVC emission intensity. The optomechanical coupling between the graphene displacement and the NVC emission is based on near-field dipole–dipole interaction. This class of optomechanical coupling increases strongly for smaller distances, making it suitable for nanoscale devices. These achievements hold promise for selective control of emitter arrays on-chip, optical spectroscopy of individual nano-objects, integrated optomechanical information processing and open new avenues towards quantum optomechanics.


Graphene is ideal substrate for brain electrodes, researchers find

February 1, 2016

This illustration portrays neurons interfaced with a sheet of graphene molecules in the background (credit: Graphene Flagship)

An international study headed by the European Graphene Flagship research consortium has found that graphene is a promising material for use in electrodes that interface with neurons, based on its excellent conductivity, flexibility for molding into complex shapes, biocompatibility, and stability within the body.

The graphene-based substrates they studied* promise to overcome problems with “glial scar” tissue formation (caused by electrode-based brain trauma and long-term inflammation). To avoid that, current electrodes based on tungsten or silicon use a protective coating on electrodes, which reduces charge transfer. Current electrodes are also rigid (resulting in tissue detachment and preventing neurons from moving) and generate electrical noise, with partial or complete loss of signal over time, the researchers note in a paper published recently in the journal ACS Nano.

Electrodes are used as neural biosensors and for prosthetic applications — such as deep-brain intracranial electrodes used to control motor disorders (mainly epilepsy or Parkinson’s) and for brain-computer interfaces (BCIs), used to recover sensory functions or control robotic arms for paralyzed patients. These applications require an interface with long-term, minimal interference.

Interfacing graphene to neurons directly

Scanning electron microscope image of rat hippocampal neurons grown in the lab on a graphene-based substrate, showing normal morphology characterized by well-defined round neural soma, extended neurite arborization (branching), and cell density similar to control substrates (credit: A. Fabbro et al./ACS Nano)

“For the first time, we interfaced graphene to neurons directly, without any peptide-coating,” explained lead neuroscientist Prof. Laura Ballerini of the International School for Advanced Studies (SISSA/ISAS) and the University of Trieste.

Using electron microscopy and immunofluorescence, the researchers found that the neurons remained healthy and transmitted normal electric impulses; importantly, no adverse glial reaction, which leads to damaging scar tissue, was seen.

As a next step, Ballerini says the team plans to investigate how different forms of graphene, from multiple layers to monolayers, affect neurons, and “whether tuning the graphene material properties might alter the synapses and neuronal excitability in new and unique ways.”

Prof. Andrea C. Ferrari, Director of the Cambridge Graphene Centre and Chair of the Graphene Flagship Executive Board, said the Flagship will “support biomedical research and development based on graphene technology with a new work package and a significant cash investment from 2016.”

The interdisciplinary collaboration also included the University Castilla-La Mancha and the Cambridge Graphene Centre.

* The study used two methods of creating graphene-based substrates (GBSs).  Liquid phase exfoliation (LPE) — peeling off graphene from graphite — can be performed without the potentially hazardous chemical treatments involved in graphene oxide production, is scalable, and operates at room temperature, with high yield. LPE dispersions can also be easily deposited on target substrates by drop-casting, filtration, or printing. Ball milling (BM), with the help of melamine (which forms large hydrogen-bond domains, unlike LPE), can be performed in a solid environment. “Our data indicate that both GBSs are promising for next-generation bioelectronic systems, to be used as brain interfaces,” the paper concludes.





Nanoscale photodynamic therapy

Larry H. Bernstein, MD, FCAP, Curator


Researchers from the Centre for Nanoscale BioPhotonics (CNBP), an Australian Research Centre of Excellence, have shown that nanoparticles used in combination with X-rays, are a viable method for killing cancer cells deep within the living body.

The research, published in the journal Scientific Reports is based on the successful quantification of singlet oxygen produced during photodynamic therapy for cancer. Singlet oxygen molecules (a highly reactive form of oxygen) are able to kill or inhibit growth of cancer cells in the body due to their toxicity.

Co-lead author on the paper, Ewa Goldys, Deputy Director of the CNBP and Professor at Macquarie University explained, “Photodynamic therapy is where light sensitive compounds are placed near diseased cells, then activated by light, producing short lived molecular by-products that can destroy or damage the cells being targeted.”

“In this case, X-rays (a form of light) were used to stimulate cerium fluoride (CeF3) nanoparticles which had been placed near a group of cells. Singlet oxygen was produced as a by-product of the X-ray and CeF3 interaction, which was then successfully measured.”

Goldys believes the research is significant, as this is the first time that anyone has been able to quantify accurately, the number of singlet oxygen molecules produced in this type of procedure.

“Singlet oxygen molecules are a far more reactive form of oxygen but they can only kill cancer cells if generated in sufficient quantity”, said Goldys.

“In our testing we established that therapeutic radiation dose X-rays, produce enough singlet oxygen molecules to be effective in photodynamic therapy.”

According to Goldys, photodynamic therapy has traditionally utilised near-infrared or visible light which has been unable to penetrate far into the body, limiting its use to cancer treatment, on or near the surface of the skin.

“We’re looking to target cancer cells deeper in the body, hence the use of X-rays, which can penetrate into deeper levels of tissue and are already used in medical diagnostics and therapy.”

Concluded Goldys, “What we’ve shown through our measurements is the applicability of the photodynamic therapy approach to effectively treat tumours within.”

“The beauty of this type of treatment is that it uses different biological pathways to kill cells as compared to chemotherapy, radiotherapy and other current cancer practices.

“Deep tissue photodynamic therapy will potentially provide new treatment options for the cancer patients of the future.”

Next steps with this research will see different nanoparticles tested and measured for effectiveness in singlet oxygen production.




Counting Cancer-busting Oxygen Molecules

Macquarie University




Confluence of Chemistry, Physics, and Biology

Curator: Larry H. Bernstein, MD, FCAP


  1. How Nanotechnology Works by Kevin Bonsor and Jonathan Strickland



There’s an unprecedented multidisciplinary convergence of scientists dedicated to the study of a world so small, we can’t see it — even with a light microscope. That world is the field of nanotechnology, the realm of atoms and nanostructures. Nanotechnology is so new, no one is really sure what will come of it. Even so, predictions range from the ability to reproduce things like diamonds and food to the world being devoured by self-replicating nanorobots.

In order to understand the unusual world of nanotechnology, we need to get an idea of the units of measure involved. A centimeter is one-hundredth of a meter, a millimeter is one-thousandth of a meter, and a micrometer is one-millionth of a meter, but all of these are still huge compared to the nanoscale. A nanometer (nm) is one-billionth of a meter, smaller than the wavelength of visible light and a hundred-thousandth the width of a human hair.

As small as a nanometer is, it’s still large compared to the atomic scale. An atom has a diameter of about 0.1 nm. An atom’s nucleus is much smaller — about 0.00001 nm. Atoms are the building blocks for all matter in our universe. You and everything around you are made of atoms. Nature has perfected the science of manufacturing matter molecularly. For instance, our bodies are assembled in a specific manner from millions of living cells. Cells are nature’s nanomachines. At the atomic scale, elements are at their most basic level. On the nanoscale, we can potentially put these atoms together to make almost anything.
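The length scales above are easy to sanity-check with a few lines of arithmetic; the values below come from the paragraphs above (the visible-light figure is an assumed round number for the lower edge of the spectrum):

```python
NM_PER_M = 1e9

# Sizes from the article, expressed in nanometers
visible_light_nm = 400        # assumed lower edge of the visible spectrum
human_hair_nm = 100_000       # a nanometer is a hundred-thousandth of a hair's width
atom_nm = 0.1                 # typical atomic diameter
nucleus_nm = 0.00001          # atomic nucleus

print(1 / NM_PER_M)                   # one nanometer in meters: 1e-09
print(human_hair_nm / atom_nm)        # ~a million atoms span a hair's width
print(atom_nm / nucleus_nm)           # an atom is ~10,000x wider than its nucleus
```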

In a lecture called “Small Wonders: The World of Nanoscience,” Nobel Prize winner Dr. Horst Störmer said that the nanoscale is more interesting than the atomic scale because the nanoscale is the first point where we can assemble something — it’s not until we start putting atoms together that we can make anything useful.

In this article, we’ll learn about what nanotechnology means today and what the future of nanotechnology may hold. We’ll also look at the potential risks that come with working at the nanoscale.

In the next section, we’ll learn more about our world on the nanoscale.

The World of Nanotechnology

Experts sometimes disagree about what constitutes the nanoscale, but in general, you can think of nanotechnology as dealing with anything measuring between 1 and 100 nm. Larger than that is the microscale, and smaller than that is the atomic scale.

Nanotechnology is rapidly becoming an interdisciplinary field. Biologists, chemists, physicists and engineers are all involved in the study of substances at the nanoscale. Dr. Störmer hopes that the different disciplines develop a common language and communicate with one another. Only then, he says, can we effectively teach nanoscience, since you can’t understand the world of nanotechnology without a solid background in multiple sciences.

One of the exciting and challenging aspects of the nanoscale is the role that quantum mechanics plays in it. The rules of quantum mechanics are very different from classical physics, which means that the behavior of substances at the nanoscale can sometimes contradict common sense by behaving erratically. You can’t walk up to a wall and immediately teleport to the other side of it, but at the nanoscale an electron can — it’s called electron tunneling. Substances that are insulators, meaning they can’t carry an electric charge, in bulk form might become semiconductors when reduced to the nanoscale. Melting points can change due to an increase in surface area. Much of nanoscience requires that you forget what you know and start learning all over again.
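The melting-point remark comes down to geometry: as a particle shrinks, its surface-area-to-volume ratio grows as 1/r, so a far larger fraction of its atoms sit at the surface, where they are more loosely bound. A quick sketch:

```python
import math

def surface_to_volume(radius_nm):
    """Surface-area-to-volume ratio of a sphere, in 1/nm.
    Algebraically this reduces to 3 / r."""
    area = 4 * math.pi * radius_nm ** 2
    volume = (4 / 3) * math.pi * radius_nm ** 3
    return area / volume

# Shrinking a particle's radius from 1 micrometer to 1 nm multiplies the
# surface-to-volume ratio (and roughly the surface-atom fraction) by 1000x.
for r in (1000, 100, 10, 1):
    print(f"r = {r:>4} nm -> S/V = {surface_to_volume(r):.3f} per nm")
```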

So what does this all mean? Right now, it means that scientists are experimenting with substances at the nanoscale to learn about their properties and how we might be able to take advantage of them in various applications. Engineers are trying to use nano-size wires to create smaller, more powerful microprocessors. Doctors are searching for ways to use nanoparticles in medical applications. Still, we’ve got a long way to go before nanotechnology dominates the technology and medical markets.

In the next section, we’ll look at two important nanotechnology structures: nanowires and carbon nanotubes.


At the nanoscale, objects are so small that we can’t see them — even with a light microscope. Nanoscientists have to use tools like scanning tunneling microscopes or atomic force microscopes to observe anything at the nanoscale. Scanning tunneling microscopes use a weak electric current to probe the scanned material. Atomic force microscopes scan surfaces with an incredibly fine tip. Both microscopes send data to a computer, which can assemble the information and project it graphically onto a monitor.


Nanowires and Carbon Nanotubes

Currently, scientists find two nano-size structures of particular interest: nanowires and carbon nanotubes. Nanowires are wires with a very small diameter, sometimes as small as 1 nanometer. Scientists hope to use them to build tiny transistors for computer chips and other electronic devices. In the last couple of years, carbon nanotubes have overshadowed nanowires. We’re still learning about these structures, but what we’ve learned so far is very exciting.

A carbon nanotube is a nano-size cylinder of carbon atoms. Imagine a sheet of carbon atoms, which would look like a sheet of hexagons. If you roll that sheet into a tube, you’d have a carbon nanotube. Carbon nanotube properties depend on how you roll the sheet. In other words, even though all carbon nanotubes are made of carbon, they can be very different from one another based on how you align the individual atoms.
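The dependence on how the sheet is rolled can be made concrete. The standard chirality formulas below (with the graphene lattice constant a ≈ 0.246 nm) give a tube's diameter from its roll-up indices (n, m), along with the usual rule of thumb for whether the tube conducts like a metal; this is textbook nanotube geometry, not material from the article:

```python
import math

A_CC = 0.246  # graphene lattice constant in nm (standard literature value)

def nanotube_diameter(n, m):
    """Diameter (nm) of an (n, m) nanotube: d = a * sqrt(n^2 + n*m + m^2) / pi."""
    return A_CC * math.sqrt(n * n + n * m + m * m) / math.pi

def is_metallic(n, m):
    """Rule of thumb: a nanotube is metallic when (n - m) is a multiple of 3."""
    return (n - m) % 3 == 0

for n, m in [(10, 10), (9, 0), (10, 0)]:
    kind = "metallic" if is_metallic(n, m) else "semiconducting"
    print(f"({n},{m}): d = {nanotube_diameter(n, m):.2f} nm, {kind}")
```

Note how (9, 0) and (10, 0) differ by a single index yet fall on opposite sides of the metal/semiconductor divide, which is exactly the sensitivity to atomic alignment the paragraph above describes.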

With the right arrangement of atoms, you can create a carbon nanotube that’s hundreds of times stronger than steel, but six times lighter.

Engineers plan to make building material out of carbon nanotubes, particularly for things like cars and airplanes. Lighter vehicles would mean better fuel efficiency, and the added strength translates to increased passenger safety.

Carbon nanotubes can also be effective semiconductors with the right arrangement of atoms. Scientists are still working on finding ways to make carbon nanotubes a realistic option for transistors in microprocessors and other electronics.

In the next section, we’ll look at products that are taking advantage of nanotechnology.


What’s the difference between graphite and diamonds? Both materials are made of carbon, yet they have vastly different properties. Graphite is soft; diamonds are hard. Graphite conducts electricity, but diamonds are insulators and can’t conduct electricity. Graphite is opaque; diamonds are usually transparent. Graphite and diamonds have these properties because of the way the carbon atoms bond together at the nanoscale.

Products with Nanotechnology

You might be surprised to find out how many products on the market are already benefiting from nanotechnology.

Bridgestone engineers developed this Quick Response Liquid Powder Display, a flexible digital screen, using nanotechnology.

Yoshikazu Tsuno/AFP/Getty Images

  • Sunscreen – Many sunscreens contain nanoparticles of zinc oxide or titanium oxide. Older sunscreen formulas use larger particles, which is what gives most sunscreens their whitish color. Smaller particles are less visible, meaning that when you rub the sunscreen into your skin, it doesn’t give you a whitish tinge.
  • Self-cleaning glass – A company called Pilkington offers a product they call Activ Glass, which uses nanoparticles to make the glass photocatalytic and hydrophilic. The photocatalytic effect means that when UV radiation from light hits the glass, nanoparticles become energized and begin to break down and loosen organic molecules on the glass (in other words, dirt). Hydrophilic means that when water makes contact with the glass, it spreads across the glass evenly, which helps wash the glass clean.
  • Clothing – Scientists are using nanoparticles to enhance your clothing. By coating fabrics with a thin layer of zinc oxide nanoparticles, manufacturers can create clothes that give better protection from UV radiation. Some clothes have nanoparticles in the form of little hairs or whiskers that help repel water and other materials, making the clothing stain-resistant.
  • Scratch-resistant coatings – Engineers discovered that adding aluminum silicate nanoparticles to scratch-resistant polymer coatings made the coatings more effective, increasing resistance to chipping and scratching. Scratch-resistant coatings are common on everything from cars to eyeglass lenses.
  • Antimicrobial bandages – Scientist Robert Burrell created a process to manufacture antibacterial bandages using nanoparticles of silver. Silver ions block microbes’ cellular respiration. In other words, silver smothers harmful cells, killing them.

New products incorporating nanotechnology are coming out every day. Wrinkle-resistant fabrics, deep-penetrating cosmetics, liquid crystal displays (LCD) and other conveniences using nanotechnology are on the market. Before long, we’ll see dozens of other products that take advantage of nanotechnology, ranging from Intel microprocessors to bio-nanobatteries: capacitors only a few nanometers thick. While this is exciting, it’s only the tip of the iceberg as far as how nanotechnology may impact us in the future.

In the next section, we’ll look at some of the incredible things that nanotechnology may hold for us.­


Nanotechnology is making a big impact on the tennis world. In 2002, the tennis racket company Babolat introduced the VS Nanotube Power racket. They made the racket out of carbon nanotube-infused graphite, meaning the racket was very light, yet many times stronger than steel. Meanwhile, tennis ball manufacturer Wilson introduced the Double Core tennis ball. These balls have a coating of clay nanoparticles on the inner core. The clay acts as a sealant, making it very difficult for air to escape the ball.


The Future of Nanotechnology

In the world of “Star Trek,” machines called replicators can produce practically any physical object, from weapons to a steaming cup of Earl Grey tea. Long considered to be exclusively the product of science fiction, today some people believe replicators are a very real possibility. They call it molecular manufacturing, and if it ever does become a reality, it could drastically change the world.

Atoms and molecules stick together because they have complementary shapes that lock together, or charges that attract. Just like with magnets, a positively charged atom will stick to a negatively charged atom. As millions of these atoms are pieced together by nanomachines, a specific product will begin to take shape. The goal of molecular manufacturing is to manipulate atoms individually and place them in a pattern to produce a desired structure.

The first step would be to develop nanoscopic machines, called assemblers, that scientists can program to manipulate atoms and molecules at will. Rice University Professor Richard Smalley points out that it would take a single nanoscopic machine millions of years to assemble a meaningful amount of material. In order for molecular manufacturing to be practical, you would need trillions of assemblers working together simultaneously. Eric Drexler believes that assemblers could first replicate themselves, building other assemblers. Each generation would build another, resulting in exponential growth until there are enough assemblers to produce objects.
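The arithmetic behind the exponential-growth argument is simple: if each assembler builds one copy of itself per generation, the population doubles every generation, so even a trillion assemblers takes only about 40 doublings:

```python
import math

def generations_needed(target_count, start=1):
    """Generations of doubling (each assembler builds one copy per generation)
    needed for the population to reach target_count."""
    return math.ceil(math.log2(target_count / start))

# A single assembler reaches a trillion copies in ~40 generations.
print(generations_needed(1e12))
```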

Assemblers might have moving parts like the nanogears in this concept drawing.

Trillions of assemblers and replicators could fill an area smaller than a cubic millimeter, and could still be too small for us to see with the naked eye. Assemblers and replicators could work together to automatically construct products, and could eventually replace all traditional labor methods. This could vastly decrease manufacturing costs, thereby making consumer goods plentiful, cheaper and stronger. Eventually, we could be able to replicate anything, including diamonds, water and food. Famine could be eradicated by machines that fabricate foods to feed the hungry.

Nanotechnology may have its biggest impact on the medical industry. Patients will drink fluids containing nanorobots programmed to attack and reconstruct the molecular structure of cancer cells and viruses. There’s even speculation that nanorobots could slow or reverse the aging process, and life expectancy could increase significantly. Nanorobots could also be programmed to perform delicate surgeries — such nanosurgeons could work at a level a thousand times more precise than the sharpest scalpel.

By working on such a small scale, a nanorobot could operate without leaving the scars that conventional surgery does. Additionally, nanorobots could change your physical appearance. They could be programmed to perform cosmetic surgery, rearranging your atoms to change your ears, nose, eye color or any other physical feature you wish to alter.

Nanotechnology has the potential to have a positive effect on the environment. For instance, scientists could program airborne nanorobots to rebuild the thinning ozone layer. Nanorobots could remove contaminants from water sources and clean up oil spills. Manufacturing materials using the bottom-up method of nanotechnology also creates less pollution than conventional manufacturing processes. Our dependence on non-renewable resources would diminish with nanotechnology. Cutting down trees, mining coal or drilling for oil may no longer be necessary — nanomachines could produce those resources.

Many nanotechnology experts feel that these applications are well outside the realm of possibility, at least for the foreseeable future. They caution that the more exotic applications are only theoretical. Some worry that nanotechnology will end up like virtual reality — in other words, the hype surrounding nanotechnology will continue to build until the limitations of the field become public knowledge, and then interest (and funding) will quickly dissipate.

In the next section, we’ll look at some of the challenges and risks of nanotechnology.


In 1959, physicist and future Nobel Prize winner Richard Feynman gave a lecture to the American Physical Society called “There’s Plenty of Room at the Bottom.” The focus of his speech was the field of miniaturization and how he believed man would create increasingly small, powerful devices.

In 1986, K. Eric Drexler wrote “Engines of Creation” and introduced the term nanotechnology. Scientific research really expanded over the last decade. Inventors and corporations aren’t far behind — today, more than 13,000 patents registered with the U.S. Patent Office have the word “nano” in them.

Nanotechnology Challenges, Risks and Ethics

The most immediate challenge in nanotechnology is that we need to learn more about materials and their properties at the nanoscale. Universities and corporations across the world are rigorously studying how atoms fit together to form larger structures. We’re still learning about how quantum mechanics impact substances at the nanoscale.

Because elements at the nanoscale behave differently than they do in their bulk form, there’s a concern that some nanoparticles could be toxic. Some doctors worry that the nanoparticles are so small, that they could easily cross the blood-brain barrier, a membrane that protects the brain from harmful chemicals in the bloodstream. If we plan on using nanoparticles to coat everything from our clothing to our highways, we need to be sure that they won’t poison us.

Closely related to the knowledge barrier is the technical barrier. In order for the incredible predictions regarding nanotechnology to come true, we have to find ways to mass produce nano-size products like transistors and nanowires. While we can use nanoparticles to build things like tennis rackets and make wrinkle-free fabrics, we can’t make really complex microprocessor chips with nanowires yet.

There are some hefty social concerns about nanotechnology, too. Nanotechnology may allow us to create more powerful weapons, both lethal and non-lethal. Some organizations are concerned that we’ll only get around to examining the ethical implications of nanotechnology in weaponry after these devices are built, and they urge scientists and politicians to examine carefully all the possibilities of nanotechnology before designing increasingly powerful weapons.

If nanotechnology in medicine makes it possible for us to enhance ourselves physically, is that ethical? In theory, medical nanotechnology could make us smarter, stronger and give us other abilities ranging from rapid healing to night vision. Should we pursue such goals? Could we continue to call ourselves human, or would we become transhuman — the next step on man’s evolutionary path? Since almost every technology starts off as very expensive, would this mean we’d create two races of people — a wealthy race of modified humans and a poorer population of unaltered people? We don’t have answers to these questions, but several organizations are urging nanoscientists to consider these implications now, before it becomes too late.

Not all questions involve altering the human body — some deal with the world of finance and economics. If molecular manufacturing becomes a reality, how will that impact the world’s economy? Assuming we can build anything we need with the click of a button, what happens to all the manufacturing jobs? If you can create anything using a replicator, what happens to currency? Would we move to a completely electronic economy? Would we even need money?

Whether we’ll actually need to answer all of these questions is a matter of debate. Many experts think that concerns like grey goo and transhumans are at best premature, and probably unnecessary. Even so, nanotechnology will definitely continue to impact us as we learn more about the enormous potential of the nanoscale.


Eric Drexler, the man who introduced the word nanotechnology, presented a frightening apocalyptic vision — self-replicating nanorobots malfunctioning, duplicating themselves a trillion times over, rapidly consuming the entire world as they pull carbon from the environment to build more of themselves. It’s called the “grey goo” scenario, where a synthetic nano-size device replaces all organic material. Another scenario involves nanodevices made of organic material wiping out the Earth — the “green goo” scenario.

The Technion’s Russell Berrie Nanotechnology Institute is a world leader in nanotechnology research, having made seminal discoveries in the field.

Breakthroughs in Nanotechnology

  • Prof. Ester Segal and a team of Israeli and American researchers find that silicon nanomaterials used for the localized delivery of chemotherapy drugs behave differently in cancerous tumors than they do in healthy tissues. The findings could help scientists better design such materials to facilitate the controlled and targeted release of the chemotherapy drugs to tumors.
  • Associate Professor Alex Leshansky of the Faculty of Chemical Engineering is part of an international team that has created a tiny screw-shaped propeller that can move in a gel-like fluid, mimicking the environment inside a living organism. The breakthrough brings closer the day when robots only nanometers – billionths of a meter – in length can maneuver and deliver treatment inside the human body, and possibly inside human cells.
  • Prof. Amit Meller and a team of researchers at the Technion and Boston University have discovered a simple way to control the passage of DNA molecules through nanopore sensors. The breakthrough could lead to low-cost, ultra-fast DNA sequencing that would revolutionize healthcare and biomedical research, and spark major advances in drug development, preventive medicine, and personalized medicine.
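
The propeller result reflects the physics of swimming at the nanoscale: at these sizes the Reynolds number is far below 1, so viscous forces dominate and corkscrew-like propulsion works where inertial strokes would not. A back-of-envelope check, with illustrative speed and size values that are not taken from the Technion work:

```python
# Reynolds number for a nanoscale swimmer in a watery fluid.
# The speed and size values below are illustrative placeholders.

def reynolds(density, speed, length, viscosity):
    """Re = rho * v * L / mu (dimensionless)."""
    return density * speed * length / viscosity

# A ~100 nm propeller moving at ~10 um/s through a water-like fluid
# (density 1000 kg/m^3, viscosity 1e-3 Pa*s):
re = reynolds(density=1000.0, speed=10e-6, length=100e-9, viscosity=1e-3)
print(f"Re ~ {re:.1e}")  # far below 1: viscous forces utterly dominate
```

At Re on the order of 10⁻⁶, reciprocal swimming strokes produce no net motion, which is why helical (screw-shaped) propulsion is the design of choice.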

– Israeli Prime Minister Benjamin Netanyahu presents U.S. President Barack Obama with nano-sized inscribed replicas of the Declarations of Independence of the United States and the State of Israel. The replicas were created by scientists at the Technion’s Russell Berrie Nanotechnology Institute (RBNI). (03/13)

– Prof. Nir Tessler has found a way to generate an electrical field inside solar cells that use inorganic nanocrystals or “quantum dots,” making them more suitable for building an energy-efficient nanocrystal solar cell. (11/11)

– Researchers led by Prof. Wayne Kaplan discover the nature of nanometer-thick layers between different materials and find that they have both solid and liquid properties. The results could enable scientists to improve the resilience of the bond between ceramic materials and metals, two types of materials that “do not like” to come into contact. Applications include cutting tools for metal-working; composites for brake pads; the joins between metal conducting wires and chips in computers; and the application of protective ceramic coatings on jet engine blades. (05/11)

– Israeli President Shimon Peres presents Pope Benedict XVI with a “Nano-Bible” smaller than a pinhead. Created by researchers at the Technion-Israel Institute of Technology, the complete punctuated and vowelized version of the Old Testament takes up just 0.5 square millimeters. The idea to write the Bible on such a tiny surface was conceived by Professor Uri Sivan, the first head of the university’s Russell Berrie Nanotechnology Institute (RBNI). (05/09)

Nanotechnology and medicine

Expert Opinion on Biological Therapy, 2003; 3(4): 655-663
Dwaine F Emerich & Christopher G Thanos

Nanotechnology, or systems/device manufacture at the molecular level, is a multidisciplinary scientific field undergoing explosive development. The genesis of nanotechnology can be traced to the promise of revolutionary advances across medicine, communications, genomics and robotics. On the surface, miniaturisation provides cost effective and more rapidly functioning mechanical, chemical and biological components. Less obvious though is the fact that nanometre sized objects also possess remarkable self-ordering and assembly behaviours under the control of forces quite different from macro objects. These unique behaviours are what make nanotechnology possible, and by increasing our understanding of these processes, new approaches to enhancing the quality of human life will surely be developed. A complete list of the potential applications of nanotechnology is too vast and diverse to discuss in detail, but without doubt one of the greatest values of nanotechnology will be in the development of new and effective medical treatments (i.e., nanomedicine). This review focuses on the potential of nanotechnology in medicine, including the development of nanoparticles for diagnostic and screening purposes, artificial receptors, DNA sequencing using nanopores, manufacture of unique drug delivery systems, gene therapy applications and the enablement of tissue engineering.

Nanotechnology in Medicine – Nanomedicine

The use of nanotechnology in medicine offers some exciting possibilities. Some techniques are only imagined, while others are at various stages of testing, or actually being used today.

Nanotechnology in medicine involves applications of nanoparticles currently under development, as well as longer range research that involves the use of manufactured nano-robots to make repairs at the cellular level (sometimes referred to as nanomedicine).

Whatever you call it, the use of nanotechnology in the field of medicine could revolutionize the way we detect and treat damage to the human body and disease in the future, and many techniques only imagined a few years ago are making remarkable progress towards becoming realities.

Nanotechnology in Medicine Application: Drug Delivery

One application of nanotechnology in medicine currently being developed involves employing nanoparticles to deliver drugs, heat, light or other substances to specific types of cells (such as cancer cells). Particles are engineered so that they are attracted to diseased cells, which allows direct treatment of those cells. This technique reduces damage to healthy cells in the body and allows for earlier detection of disease.

For example, nanoparticles that deliver chemotherapy drugs directly to cancer cells are under development. Targeted delivery of chemotherapy drugs is in clinical testing, with final approval for use in cancer patients pending. One company, CytImmune, has published the results of a Phase 1 clinical trial of its first targeted chemotherapy drug, and another, BIND Biosciences, has published preliminary Phase 1 results for its first targeted chemotherapy drug and is proceeding with a Phase 2 clinical trial.

Researchers at the University of Illinois have demonstrated that gelatin nanoparticles can be used to deliver drugs to damaged brain tissue.

Researchers at MIT are using nanoparticles to deliver vaccines. The nanoparticles protect the vaccine, allowing it time to trigger a stronger immune response.

Researchers are developing a method to release insulin that uses a sponge-like matrix containing insulin as well as nanocapsules filled with an enzyme. When the glucose level rises, the nanocapsules release hydrogen ions, which bind to the fibers making up the matrix. The hydrogen ions make the fibers positively charged so that they repel each other, creating openings in the matrix through which insulin is released.
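
As a rough caricature of this behavior (not the researchers’ actual model — the threshold and rate constant below are invented for the sketch), the glucose-triggered release amounts to a simple feedback rule: below a glucose threshold the matrix stays closed, and above it insulin escapes at a rate proportional to the excess.

```python
# Toy model of a glucose-responsive insulin-release matrix.
# All numbers here are illustrative, not from the cited work.

def release_rate(glucose_mgdl, threshold=140.0, k=0.05):
    """Insulin release rate (units/min), proportional to glucose above threshold.

    Below the threshold the nanocapsules release no hydrogen ions,
    the matrix stays closed, and no insulin escapes.
    """
    excess = glucose_mgdl - threshold
    return k * excess if excess > 0 else 0.0

def simulate(glucose_series, dt_min=1.0):
    """Total insulin released over a series of glucose readings."""
    return sum(release_rate(g) * dt_min for g in glucose_series)

print(release_rate(100.0))             # normal glucose: matrix closed, no release
print(release_rate(240.0))             # hyperglycemia: ~5 units/min released
print(simulate([100, 180, 240, 160]))  # total release over a rising-then-falling episode
```

The appeal of the real system is exactly this self-regulation: release tracks glucose with no external sensor or pump in the loop.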

Researchers are developing a nanoparticle that can be taken orally and pass through the lining of the intestines into the bloodstream. This should allow drugs that must now be delivered by injection to be taken in pill form.

Researchers are also developing a nanoparticle to defeat viruses. The nanoparticle does not actually destroy virus particles, but delivers an enzyme that prevents their reproduction in the patient’s bloodstream.

Read more about nanomedicine in drug delivery

Nanotechnology in Medicine Application: Therapy Techniques

Researchers have developed “nanosponges” that absorb toxins and remove them from the bloodstream. The nanosponges are polymer nanoparticles coated with a red blood cell membrane. The red blood cell membrane allows the nanosponges to travel freely in the bloodstream and attract the toxins.

Researchers have demonstrated a method to generate sound waves that are powerful but tightly focused, which may eventually be used for noninvasive surgery. They use a lens coated with carbon nanotubes to convert light from a laser into focused sound waves. The intent is to develop a method that could blast tumors or other diseased areas without damaging healthy tissue.

Researchers are investigating the use of bismuth nanoparticles to concentrate radiation used in radiation therapy to treat cancer tumors. Initial results indicate that the bismuth nanoparticles would increase the radiation dose to the tumor by 90 percent.

Nanoparticles composed of polyethylene glycol-hydrophilic carbon clusters (PEG-HCC) have been shown to absorb free radicals at a much higher rate than the proteins our body uses for this function. This ability to absorb free radicals may reduce the harm caused by the release of free radicals after a brain injury.

Targeted heat therapy is being developed to destroy breast cancer tumors. In this method antibodies that are strongly attracted to proteins produced in one type of breast cancer cell are attached to nanotubes, causing the nanotubes to accumulate at the tumor. Infrared light from a laser is absorbed by the nanotubes and produces heat that incinerates the tumor.

Read more about nanomedicine therapy techniques

Nanotechnology in Medicine Application: Diagnostic Techniques

Researchers at MIT have developed a sensor using carbon nanotubes embedded in a gel that can be injected under the skin to monitor the level of nitric oxide in the bloodstream. The level of nitric oxide is important because it indicates inflammation, allowing easy monitoring of inflammatory diseases. In tests with laboratory mice, the sensor remained functional for over a year.

Researchers at the University of Michigan are developing a sensor that can detect very low levels of cancer cells, as low as 3 to 5 cancer cells per milliliter of blood. They grow sheets of graphene oxide on which they attach molecules containing an antibody that binds to cancer cells. They then tag the captured cells with fluorescent molecules to make them stand out under a microscope.

Researchers have demonstrated a way to use nanoparticles for early diagnosis of infectious disease. The nanoparticles attach to molecules in the bloodstream that indicate the start of an infection. When the sample is scanned for Raman scattering, the nanoparticles enhance the Raman signal, allowing detection of these marker molecules, and thus of an infectious disease, at a very early stage.

A test for early detection of kidney damage is being developed. The method uses gold nanorods functionalized to attach to the type of protein generated by damaged kidneys. As the protein accumulates on a nanorod, the nanorod’s color shifts. The test is designed to be done quickly and inexpensively for early detection of a problem.

Read more about nanomedicine diagnostic techniques

Nanotechnology in Medicine Application: Anti-Microbial Techniques

One of the earliest nanomedicine applications was the use of nanocrystalline silver as an antimicrobial agent for the treatment of wounds, as discussed on the Nucryst Pharmaceuticals Corporation website.

A nanoparticle cream has been shown to fight staph infections. The nanoparticles contain nitric oxide gas, which is known to kill bacteria. Studies on mice have shown that using the nanoparticle cream to release nitric oxide gas at the site of staph abscesses significantly reduced the infection.

A burn dressing coated with nanocapsules containing antibiotics is also under development. If an infection starts, harmful bacteria in the wound cause the nanocapsules to break open, releasing the antibiotics. This allows much quicker treatment of an infection and reduces the number of times a dressing has to be changed.

A welcome idea in the early study stages is the elimination of bacterial infections in a patient within minutes, instead of delivering treatment with antibiotics over a period of weeks. You can read about design analysis for the antimicrobial nanorobot used in such treatments in the following article: Microbivores: Artificial Mechanical Phagocytes using Digest and Discharge Protocol.

Nanotechnology in Medicine Application: Cell Repair

Nanorobots could actually be programmed to repair specific diseased cells, functioning in a similar way to antibodies in our natural healing processes.  Read about design analysis for one such cell repair nanorobot in this article: The Ideal Gene Delivery Vector: Chromallocytes, Cell Repair Nanorobots for Chromosome Repair Therapy

Nanotechnology in Medicine: Company Directory

  • CytImmune – Gold nanoparticles for targeted delivery of drugs to tumors
  • NanoBio – Nanoemulsions for nasal delivery to fight viruses (such as the flu and colds) or through the skin to fight bacteria

More nanomedicine companies

Nanotechnology in Medicine: Resources

National Cancer Institute Alliance for Nanotechnology in Cancer: This alliance includes a Nanotechnology Characterization Lab as well as eight Centers of Cancer Nanotechnology Excellence.

Alliance for NanoHealth: This alliance includes eight research institutions performing collaborative research.

European Nanomedicine platform

The National Institutes of Health (NIH) is funding research at eight Nanomedicine Development Centers.


Compiled by Earl Boysen of Hawk’s Perch Technical Writing, LLC and

Future impact of nanotechnology on medicine and dentistry

Mallanagouda Patil,1 Dhoom Singh Mehta,2 and Sowjanya Guvva3

J Indian Soc Periodontol. 2008 May-Aug; 12(2): 34–40.

doi: 10.4103/0972-124X.44088; PMCID: PMC2813556

The human characteristics of curiosity, wonder, and ingenuity are as old as mankind. People around the world have been harnessing their curiosity into inquiry and the process of scientific methodology. Recent years have witnessed an unprecedented growth in research in the area of nanoscience. There is increasing optimism that nanotechnology applied to medicine and dentistry will bring significant advances in the diagnosis, treatment, and prevention of disease. Growing interest in the future medical applications of nanotechnology is leading to the emergence of a new field called nanomedicine. Nanomedicine needs to overcome the challenges for its application, to improve the understanding of pathophysiologic basis of disease, bring more sophisticated diagnostic opportunities, and yield more effective therapies and preventive properties. When doctors gain access to medical robots, they will be able to quickly cure most known diseases that hobble and kill people today, to rapidly repair most physical injuries our bodies can suffer, and to vastly extend the human health span. Molecular technology is destined to become the core technology underlying all of 21st century medicine and dentistry. In this article, we have made an attempt to have an early glimpse on future impact of nanotechnology in medicine and dentistry.

Keywords: Nanodentistry, nanomedicine, nanoscience, nanotechnology


The world began without man, and it will complete itself without him. …Claude Lévi-Strauss. Winfred Phillips, DSc, said, “You have to be able to fabricate things, you have to be able to analyze things, you have to be able to handle things smaller than ever imagined in ways not done before”.[1] Many researchers believe that scientific devices dwarfed by dust mites may one day be capable of grand biomedical miracles.

The vision of nanotechnology was introduced in 1959 by the late Nobel physicist Richard P. Feynman, whose dinner talk “There’s Plenty of Room at the Bottom”[2] proposed employing machine tools to make smaller machine tools, these in turn to be used to make still smaller machine tools, and so on all the way down to the atomic level, noting that this is “a development which I think cannot be avoided”. He suggested that nanomachines, nanorobots, and nanodevices could ultimately be used to develop a wide range of atomically precise microscopic instrumentation and manufacturing tools, which could be applied to produce vast quantities of ultrasmall computers and various nanoscale and microscale robots.

Feynman’s idea remained largely undiscussed until the mid-1980s, when the MIT-educated engineer K Eric Drexler published “Engines of Creation”, a book popularizing the potential of molecular nanotechnology.[3]

Nano comes from the Greek word for dwarf. Nanotechnology is usually defined as the research and development of materials, devices, and systems exhibiting physical, chemical, and biological properties that are different from those found at larger scales (matter at scales comparable to molecules and viruses).[4]

Old rules don’t apply: small things behave differently. Researchers in nanoland are also making really, really small things with astonishing properties, like the carbon nanotube. Chris Papadopoulos, a nanotechnology researcher, says, “The carbon nanotube is the poster boy for nanotechnology”. A carbon nanotube is a very thin sheet of graphite rolled into a tube; its strength can be harnessed by embedding nanotubes in structural materials, and among other applications, nanotubes may be part of future improvements to high-performance aircraft.

In nanoland, tiny differences in size can add up to huge differences in function. Ted Sargent, author of The Dance of Molecules, says matter is tunable at the nanoscale. For example, change the length of a guitar string and you change the sound it makes; change the size of the semiconductor crystals called quantum dots, and you change the rainbow of colors obtainable from a single material. Sargent made a three-nanometer dot that glows blue, a four-nanometer dot that glows red, and a five-nanometer dot that emits infrared radiation, felt as heat.
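
The size-color relationship can be seen in a back-of-envelope particle-in-a-box estimate: the quantum confinement energy scales as 1/d², so as a dot grows, the emitted photon energy falls and the wavelength lengthens. The sketch below uses the infinite-well formula with an illustrative effective mass and bulk band gap, not the parameters of any specific quantum-dot material.

```python
# Particle-in-a-box estimate of quantum-dot emission wavelength.
# The effective mass and bulk band gap are illustrative placeholders.

H = 6.626e-34      # Planck constant, J*s
ME = 9.109e-31     # electron rest mass, kg
C = 2.998e8        # speed of light, m/s
EV = 1.602e-19     # joules per electron-volt

def emission_wavelength_nm(d_nm, eg_bulk_ev=0.8, m_eff=0.1 * ME):
    """Emitted wavelength for a cubic dot of edge d_nm (nanometers).

    Photon energy = bulk band gap + ground-state confinement energy,
    where E_conf = 3 h^2 / (8 m* d^2) for a 3D infinite well.
    """
    d = d_nm * 1e-9
    e_conf = 3 * H**2 / (8 * m_eff * d**2)
    e_photon = eg_bulk_ev * EV + e_conf
    return H * C / e_photon * 1e9

for d in (3, 4, 5):
    print(f"{d} nm dot -> {emission_wavelength_nm(d):.0f} nm emission")
```

With these placeholder parameters the 3 nm dot lands in the visible range and the 5 nm dot in the infrared; the exact colors depend on the material, but the ordering — smaller dot, bluer light — follows directly from the 1/d² scaling.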

Nanotechnology will affect everything, says William Atkinson, author of Nanocosm: Nanotechnology and the Big Changes Coming from the Inconceivably Small. “It’ll be like a blizzard: snowflakes whose weight you can’t detect can bring a city to a standstill. Nanotechnology is going to be like that.”

The unique quantum phenomena that occur at the nanoscale draw researchers from many different disciplines to the field, including medicine, chemistry, physics, engineering, and, more recently, dentistry.

Scientists in the fields of regenerative medicine and tissue engineering are continually looking for new ways to apply the principles of cell transplantation, materials science, and bioengineering to construct biological substitutes that will restore and maintain normal function in diseased and injured tissue. Developing more refined means of delivering medications at therapeutic levels to specific sites is an important clinical issue for applications of such technology in medicine and dentistry.[5]


The field of “Nanomedicine” is the science and technology of diagnosing, treating, and preventing disease and traumatic injury, of relieving pain, and of preserving and improving human health, using nanoscale structured materials, biotechnology, genetic engineering, and eventually complex machine systems and nanorobots.[5] It has been perceived as embracing five main subdisciplines that overlap in many ways through common technical issues [Figure 1].

Figure 1

Dimensions in Nanomedicine


Nanodiagnostics

It is the use of nanodevices for early identification of disease, or of predisposition to it, at the cellular and molecular level. In in vitro diagnostics, nanomedicine could increase the efficiency and reliability of diagnostics using human fluid or tissue samples, employing selective nanodevices to make multiple analyses at the subcellular scale. In in vivo diagnostics, nanomedicine could develop devices able to work inside the human body to identify the early presence of a disease and to identify and quantify toxic molecules and tumor cells.

Regenerative medicine

It is an emerging multidisciplinary field that looks to the repair, improvement, and maintenance of cells, tissues, and organs by applying cell therapy and tissue engineering methods. With the help of nanotechnology, it is possible to interact with cell components, to manipulate cell proliferation and differentiation, and to direct the production and organization of extracellular matrices.

Present day nanomedicine exploits carefully structured nanoparticles such as dendrimers, carbon fullerenes (buckyballs), and nanoshells to target specific tissues and organs. These nanoparticles may serve as diagnostic and therapeutic antiviral, antitumor, or anticancer agents. Years ahead, complex nanodevices and even nanorobots will be fabricated, first of biological materials but later using more durable materials such as diamond to achieve the most powerful results.[6]

The human body is composed of molecules; hence the availability of molecular nanotechnology will permit dramatic progress in addressing medical problems, using molecular knowledge to maintain and improve human health at the molecular scale.

Applications in medicine

Within 10–20 years it should become possible to construct machines on the micrometer scale made up of parts on the nanometer scale. Subassemblies of such devices may include such useful robotic components as 100 nm manipulator arms, 10 nm sorting rotors for molecule-by-molecule reagent purification, and smooth, superhard surfaces made of atomically flawless diamond.

Nanocomputers would assume the important task of activating, controlling, and deactivating such nanomechanical devices. Nanocomputers would store and execute mission plans, receive and process external signals and stimuli, communicate with other nanocomputers or external control and monitoring devices, and possess contextual knowledge to ensure safe functioning of the nanomechanical devices. Such technology has enormous medical and dental implications.

Programmable nanorobotic devices would allow physicians to perform precise interventions at the cellular and molecular level. Medical nanorobots have been proposed for gerontological[7] applications, in pharmaceutical research,[8] in clinical diagnosis, and in dentistry,[9] and also for mechanically reversing atherosclerosis, improving respiratory capacity, enabling near-instantaneous homeostasis, supplementing the immune system, rewriting or replacing DNA sequences in cells, repairing brain damage, and resolving gross cellular insults, whether caused by irreversible processes or by cryogenic storage of biological tissues.

Feynman offered the first known proposal for a nanorobotic surgical procedure to cure heart disease:[2] “A friend of mine (Albert R. Hibbs) suggests a very interesting possibility for relatively small machines. He says that, although it is a very wild idea, it would be interesting in surgery if you could swallow the surgeon. You put the mechanical surgeon inside the blood vessel and it goes into the heart and looks around. It finds out which valve is the faulty one and takes a little knife and slices it out. Other small machines might be permanently incorporated in the body to assist some inadequately functioning organ.”[2]

Many disease-causing culprits, such as bacteria and viruses, are nanosize, so it only makes sense that nanotechnology would offer us ways of fighting back. The ancient Greeks used silver to promote healing and prevent infection, but the treatment took a backseat when antibiotics came on the scene. Nucryst Pharmaceuticals (Canada) revived and improved this old cure by coating a burn and wound bandage with nanosize silver particles, which are more reactive than the bulk form of the metal. They penetrate the skin and work steadily; as a result, burn victims can have their dressings changed just once a week.

Genomics and proteomics research is already rapidly elucidating the molecular basis of many diseases. This has brought new opportunities to develop powerful diagnostic tools able to identify genetic predisposition to disease. In the future, point-of-care diagnostics will routinely be used to identify patients requiring preventive medication, to select the most appropriate medication for individual patients, and to monitor response to treatment. Nanotechnology has a vital role to play in realizing cost-effective diagnostic tools.

Chris Backhouse is developing a lab-on-a-chip to give doctors immediate results from medical tests for cancer and viruses; it gets its information by analyzing the genetic material in individual cells. Advances in gene sequencing mean this can now be done quickly with tiny samples of body fluids or tissues such as blood, bone marrow, or tumors. The device can also detect the BK virus, a sign of trouble in patients who have had kidney transplants. Ultimately, Pilarski thinks, chip technology will be able to detect what kind of flu a person has, or even whether they have SARS or HIV.

Nanotechnology has the potential to offer invaluable advances such as the use of nanocoatings to slow the release of asthma medication in the lungs, allowing people with asthma to experience longer periods of relief from symptoms after using inhalants. Essentially, nanotechnology makes drug particles in such a way that they do not dissolve too quickly.

Nanosensors developed for military use in recognizing airborne rogue agents and chemical weapons are being adapted to detect drugs and other substances in exhaled breath.[1] Many drugs can be detected in breath, but the amount detected depends on the dose taken and on how well the drug partitions between the blood and the breath. Detecting drug abuse (marijuana, alcohol concentration), testing athletes for banned substances, and monitoring individuals in drug-treatment programs are areas long overdue for breath-detection technologies, which may in the future totally replace urine testing.

Currently, most legal and illegal drug overdoses have no treatment that can effectively neutralize them. Using nanoparticles as absorbents of toxic drugs is another area of medical nanoscience that is rapidly gaining momentum. The goal is to design nanostructures that effectively bind molecular entities for which no effective treatments currently exist: nanosponges put into the bloodstream soak up toxic drug molecules, reducing the free amount in the blood and, in turn, resolving the toxicity.

French and Italian researchers have come up with a completely new approach to render anticancer and antiviral nucleoside analogues significantly more potent. By linking the nucleoside analogues to squalene, a biochemical precursor to the whole family of steroids, the researchers observed the self-organization of amphiphilic molecules in water. These nanoassemblies exhibited superior anticancer activity in vitro in human cancer cells.

Laurie B Gower, PhD, has been researching bone formation and structure at the nanoscale level. She is examining biomimetic methods of constructing a synthetic bone graft substitute with a nanostructured architecture that matches natural bone, so that it would be accepted by the body and guide cells toward mending damaged bones. Biomineralization refers to minerals that are formed biologically, which have very different properties than geological minerals or lab-formed crystals. The crystal properties found in bone are manipulated at the nanoscale and embedded within collagen fibers to create an interpenetrating organic–inorganic composite with unique mechanical properties. She foresees numerous implications of the material in the future of osteology.

Hicham Fenniri, a chemistry professor, is trying to make artificial joints act more like natural ones. Fenniri has made a nanotube coating for titanium hip or knee implants that closely mimics collagen; as a result, the coating attracts and attaches more bone-forming cells (osteoblasts), which help bone grow more quickly than around an uncoated implant.

There are ongoing attempts to build ‘medical microrobots’ for in vivo medical use.[10] In 2002, Ishiyama et al.,[11] at Tohoku University, developed tiny magnetically driven spinning screws intended to swim along veins and carry drugs to infected tissues, or even to burrow into tumors and kill them with heat. In 2005, Brad Nelson’s[12] team reported the fabrication of a microscopic robot small enough (approximately 200 µm) to be injected into the body through a syringe. They hope that this device or its descendants might someday be used to deliver drugs or perform minimally invasive eye surgery. Gordon’s[9,13] group at the University of Manitoba has also proposed magnetically controlled ‘cytobots’ and ‘karyobots’ for performing wireless intracellular and intranuclear surgery.

‘Respirocytes’, described in the first theoretical design study of a complete medical nanorobot ever published in a peer-reviewed journal, are hypothetical artificial mechanical red blood cells made of 18 billion precisely arranged structural atoms.[10,14] The respirocyte is a bloodborne, spherical, 1 µm diamondoid 1000-atmosphere pressure vessel with reversible molecule-selective surface pumps powered by endogenous serum glucose. This nanorobot would deliver 236 times more oxygen to body tissues per unit volume than natural red cells and would manage carbonic acidity, controlled by gas concentration sensors and an onboard nanocomputer.
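
The storage capacity can be sanity-checked with the ideal gas law. This is a crude approximation at 1000 atm, where real-gas effects are large, and it is an order-of-magnitude estimate rather than Freitas’s detailed analysis, but it shows the scale involved: a 1 µm sphere at 1000 atm and body temperature holds on the order of 10¹⁰ gas molecules.

```python
import math

# Order-of-magnitude estimate of O2 stored in a respirocyte,
# treating the gas as ideal (a rough approximation at 1000 atm).

R = 8.314          # gas constant, J/(mol*K)
N_A = 6.022e23     # Avogadro's number
ATM = 101325.0     # pascals per atmosphere

def molecules_stored(diameter_m=1e-6, pressure_atm=1000.0, temp_k=310.0):
    """Number of gas molecules in a sphere, via n = PV / (RT)."""
    volume = (4.0 / 3.0) * math.pi * (diameter_m / 2.0) ** 3
    moles = pressure_atm * ATM * volume / (R * temp_k)
    return moles * N_A

n = molecules_stored()
print(f"~{n:.1e} O2 molecules per respirocyte")  # on the order of 1e10
```

Tens of billions of oxygen molecules packed into a device the size of a bacterium is what makes the claimed per-volume advantage over red cells plausible: hemoglobin binds oxygen one site at a time, while the pressure vessel stores it in bulk and can release essentially all of it.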

Nanorobotic microbivores

Artificial phagocytes called microbivores could patrol the bloodstream, seeking out and digesting unwanted pathogens including bacteria, viruses, or fungi.[10,15] Microbivores would achieve complete clearance of even the most severe septicemic infections in hours or less. The nanorobots do not increase the risk of sepsis or septic shock because the pathogens are completely digested into harmless sugars, amino acids, and the like, which are the only effluents from the nanorobot.

Surgical nanorobotics

A surgical nanorobot, programmed or guided by a human surgeon, could act as a semiautonomous on-site surgeon inside the human body once introduced through the vascular system or a body cavity. Such a device could perform various functions, such as searching for pathology and then diagnosing and correcting lesions by nanomanipulation, coordinated by an onboard computer while maintaining contact with the supervising surgeon via coded ultrasound signals.[10]

The earliest forms of cellular nanosurgery are already being explored today. For example, a rapidly vibrating (100 Hz) micropipette with a tip diameter of <1 µm has been used to completely cut individual dendrites from single neurons without compromising cell viability.[16] Axotomy of roundworm neurons has been performed by femtosecond laser surgery, after which the axons functionally regenerated.[17] The femtosecond laser acts like a pair of nanoscissors, vaporizing tissue locally while leaving adjacent tissue unharmed; femtosecond laser surgery has even been performed on individual chromosomes.[18]


Nanogenerators could enable a new class of self-powered implantable medical devices, sensors, and portable electronics by converting the mechanical energy of body movement, muscle stretching, or water flow into electricity.

Nanogenerators produce electric current by bending and then releasing zinc oxide nanowires, which are both piezoelectric and semiconducting. Because the nanowires can be grown on polymer-based films, flexible polymer substrates could one day allow portable devices to be powered by the movement of their users.

“Our bodies are good at converting chemical energy from glucose into the mechanical energy of our muscles,” explained Wang (faculty at Peking University and the National Center for Nanoscience and Technology of China). “These nanogenerators can take mechanical energy and convert it to electrical energy for powering devices inside the body. This could open up tremendous possibilities for self-powered implantable medical devices.”


Nanodentistry will make possible the maintenance of comprehensive oral health by employing nanomaterials, biotechnology (including tissue engineering), and, ultimately, dental nanorobotics. Potential new treatment opportunities in dentistry may include local anesthesia, dentition renaturalization, a permanent cure for hypersensitivity, complete orthodontic realignment during a single office visit, covalently bonded diamondized enamel, and continuous oral health maintenance using mechanical dentifrobots.

Once the first micron-size dental nanorobots can be constructed, they might use specific motility mechanisms to crawl or swim through human tissue with navigational precision, acquire energy, sense and manipulate their surroundings, achieve safe cytopenetration, and use any of a multitude of techniques to monitor, interrupt, or alter nerve impulse traffic in individual nerve cells in real time.

These nanorobot functions may be controlled by an onboard nanocomputer that executes preprogrammed instructions in response to local sensor stimuli. Alternatively, the dentist may issue strategic instructions by transmitting orders directly to in vivo nanorobots via acoustic signals or other means.

Inducing anesthesia

To induce oral anesthesia, one of the most common procedures in dental practice, the dental professional will instill a colloidal suspension containing millions of active, analgesic, micron-sized dental nanorobot ‘particles’ on the patient’s gingiva. After contacting the surface of the crown or mucosa, the ambulating nanorobots reach the dentin by migrating into the gingival sulcus and passing painlessly through the lamina propria or the 1–3-micron-thick layer of loose tissue at the cementodentinal junction. On reaching the dentin, the nanorobots enter dentinal tubules (holes 1–4 microns in diameter) and proceed toward the pulp, guided by a combination of chemical gradients, temperature differentials, and even positional navigation, all under the control of the onboard nanocomputer as directed by the dentist.[9]

There are many pathways to choose from: near the cementoenamel junction (CEJ), midway between the junction and the pulp, and near the pulp. Tubule diameter increases toward the pulp, which may facilitate nanorobot movement, although circumpulpal tubule openings vary in number and size (tubule number densities of roughly 22,000/mm² at the DEJ, 37,000/mm² midway, and 48,000/mm² near the pulp). Tubule branching patterns, and the differences between primary dentin, irregular secondary dentin, and regular (sclerosing) secondary dentin in young and old teeth, may present a significant challenge to navigation.

The presence of natural cells that are constantly in motion around and inside the teeth, including human gingival and pulpal fibroblasts, cementoblasts at the CDJ, bacteria inside dentinal tubules, odontoblasts near the pulp–dentin border, and lymphocytes within the pulp and lamina propria, suggests that such a journey should be feasible for cell-sized nanorobots of comparable mobility.

Once installed in the pulp and having established control over nerve impulse traffic, the analgesic dental nanorobots may be commanded by the dentist to shut down all sensitivity in any particular tooth that requires treatment. When the dentist selects a tooth on the hand-held controller display, it immediately becomes numb. After the oral procedure is completed, the dentist orders the nanorobots to restore all sensation, relinquish control of nerve traffic, and egress, followed by aspiration. Nanorobotic analgesics offer greater patient comfort and reduced anxiety, no needles, greater selectivity and controllability of the analgesic effect, fast and completely reversible switchable action, and avoidance of most side effects and complications.

Tooth repair

Nanorobotic manufacture and installation of a biologically autologous whole replacement tooth, including both mineral and cellular components (‘complete dentition replacement therapy’), should become feasible within the time and economic constraints of a typical office visit through the use of an affordable desktop manufacturing facility that would fabricate the new tooth in the dentist’s office.

Chen et al[19] took advantage of these latest developments in nanotechnology to simulate the natural biomineralization process and create the hardest tissue in the human body, dental enamel, using highly organized microarchitectural units of nanorod-like calcium hydroxyapatite crystals arranged roughly parallel to each other.

Dentin hypersensitivity

Naturally hypersensitive teeth have an eightfold higher surface density of dentinal tubules, with diameters twice as large, than nonsensitive teeth. Reconstructive dental nanorobots, using native biological materials, could selectively and precisely occlude specific tubules within minutes, offering patients a quick and permanent cure.[9]

Tooth repositioning

Orthodontic nanorobots could directly manipulate the periodontal tissues, allowing rapid and painless tooth straightening, rotation, and vertical repositioning within minutes to hours.

Tooth renaturalization

This procedure may become popular, providing perfect treatment methods for esthetic dentistry. The trend may begin with patients who desire to have (1) their old dental amalgams excavated and their teeth remanufactured with native biological materials, and (2) full coronal renaturalization procedures, in which all fillings, crowns, and other 20th-century modifications to the visible dentition are removed and the affected teeth are remanufactured to become indistinguishable from the originals.

Dental durability and cosmetics

The durability and appearance of teeth may be improved by replacing the upper enamel layers with covalently bonded artificial materials such as sapphire or diamond,[20] which have 20–100 times the hardness and failure strength of natural enamel or contemporary ceramic veneers, along with good biocompatibility. Although pure sapphire and diamond are brittle and prone to fracture, they can be made more fracture-resistant as part of a nanostructured composite material, possibly one including embedded carbon nanotubes.

Nanorobotic dentifrice (dentifrobots), delivered by mouthwash or toothpaste, could patrol all supragingival and subgingival surfaces at least once a day, metabolizing trapped organic matter into harmless and odorless vapors and performing continuous calculus debridement.

Properly configured dentifrobots could identify and destroy pathogenic bacteria residing in the plaque and elsewhere, while allowing the roughly 500 species of harmless oral microflora to flourish in a healthy ecosystem. Dentifrobots would also provide a continuous barrier to halitosis, since bacterial putrefaction is the central metabolic process involved in oral malodor. With this kind of daily dental care available from an early age, conventional tooth decay and gingival diseases could disappear into the annals of medical history.

The potential benefits of nanotechnology lie in its ability to exploit the atomic or molecular properties of materials and to develop new materials with better properties. Nanoproducts can be made in two ways: by building up particles from atomic elements, or by using equipment to create mechanical nanoscale objects.

Nanotechnology has improved the properties of various kinds of fibers.[21] Polymer nanofibers, with diameters in the nanometer range, possess a larger surface area per unit mass and permit easier addition of surface functionalities compared with polymer microfibers.[21,22] Polymer nanofiber materials have been studied as drug delivery systems, scaffolds for tissue engineering, and filters. Carbon fibers with nanometer dimensions showed a selective increase in osteoblast adhesion, necessary for successful orthopedic/dental implant applications, due to their high degree of nanometer-scale surface roughness.[23]
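The surface-area advantage follows directly from geometry: for a cylindrical fiber, surface area per unit mass scales inversely with radius. A minimal sketch (the density value is an illustrative assumption for a generic polymer):

```python
# Specific surface area (SSA) of a long cylindrical fiber.
# Lateral area / mass = 2*pi*r*L / (rho * pi * r^2 * L) = 2 / (rho * r),
# so shrinking the radius 100x gives 100x more surface per unit mass.

def fiber_ssa(diameter_m: float, density_kg_m3: float = 1200.0) -> float:
    """Surface area per unit mass (m^2/kg), ignoring fiber ends."""
    radius = diameter_m / 2
    return 2.0 / (density_kg_m3 * radius)

nano = fiber_ssa(100e-9)    # 100 nm nanofiber
micro = fiber_ssa(10e-6)    # 10 um microfiber
print(f"nanofiber SSA:  {nano / 1000:.1f} m^2/g")
print(f"microfiber SSA: {micro / 1000:.3f} m^2/g")
print(f"ratio: {nano / micro:.0f}x")   # 100x more area per gram
```

The extra area is what makes easier surface functionalization and higher drug-loading capacity possible.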

Nonagglomerated discrete nanoparticles are homogeneously dispersed in resins or coatings to produce nanocomposites. The nanofillers used include an aluminosilicate powder with a mean particle size of about 80 nm and a 1:4 molar ratio of alumina to silica. Advantages include superior hardness, flexural strength, modulus of elasticity, translucency and esthetic appeal, excellent color density, high polish and polish retention, and excellent handling properties[24] (e.g., Filtek Supreme Universal Restorative).

A close examination of Heliomolar, a microfilled composite resin, suggests that a form of nanotechnology was in use years ago, yet never recognized as such.

Nanosolutions produce unique and dispersible nanoparticles that can be added to various solvents, paints, and polymers, in which they disperse homogeneously. Nanotechnology in bonding agents ensures homogeneity, so the operator can be confident that the adhesive is perfectly mixed every time.

Nanofillers have been integrated into vinyl siloxanes, producing a unique addition-siloxane impression material with better flow, improved hydrophilic properties (hence fewer voids at the margin and better model pouring), and enhanced detail precision.[25]


Nanotechnology is part of a predicted future in which dentistry and periodontal practice may become more high-tech and more effective, managing individual dental health at the microscopic level by enabling us to battle decay where it begins: with bacteria. Construction of comprehensive research facilities will be crucial to meet the rigorous requirements for developing these nanotechnologies.

Researchers are looking at ways to use microscopic entities to perform tasks that are now done by hand or with equipment. This concept is known as nanotechnology. Tiny machines, known as nanoassemblers, could be controlled by computer to perform specialized jobs. Nanoassemblers could be smaller than a cell nucleus, so they could fit into places that are hard to reach by hand or with other technology. Directed by computer, these tiny workers could destroy the bacteria in the mouth that cause dental caries, or even repair spots on the teeth where decay has set in.

Nanotechnology has tremendous potential, but social issues of public acceptance, ethics, regulation, and human safety must be addressed before molecular nanotechnology can realize its promise of providing high-quality dental care to the 80% of the world’s population that currently receives no significant dental care.

The role of the periodontist will continue to evolve along the lines of currently visible trends. For example, cases of simple self-care neglect will become fewer, while cases involving cosmetic procedures, acute trauma, or rare disease conditions will become relatively more commonplace.

Trends in oral health and disease may also change the focus of specific diagnostic and treatment modalities. Increasingly preventive approaches will reduce the need for cures, making prevention the viable approach for most patients.

Diagnosis and treatment will be customized to match the preferences and genetics of each patient. Treatment options will become more numerous and exciting. All this will demand, even more than today, the technical abilities and professional skills that are the hallmark of the contemporary dentist and periodontist. Developments are expected to accelerate significantly.

Nanometer-scale technologies such as nanotubes could be used to administer drugs more precisely, targeting specific cells in a patient suffering from cancer or other life-threatening conditions. The toxic drugs used to fight these illnesses would become much more direct in their action and consequently less harmful to the body.


The visions described in this article may sound unlikely, implausible, or even heretical. Yet the theoretical and applied research needed to turn them into reality is progressing rapidly. Nanotechnology will change dentistry, healthcare, and human life more profoundly than many developments of the past. As with all technologies, nanotechnology carries a significant potential for misuse and abuse on a scale and scope never seen before. However, it also has the potential to bring about significant benefits, such as improved health, better use of natural resources, and reduced environmental pollution. These truly are the days of miracle and wonder.

Current work is focused on recent developments, particularly nanoparticles and nanotubes for periodontal management. Materials developed from these, such as hollow nanospheres, core–shell structures, nanocomposites, nanoporous materials, and nanomembranes, will play a growing role in materials development for the dental industry.

Once nanomechanics are available, the ultimate dream of every healer, medicine man, and physician throughout recorded history will at last become a reality. Programmable and controllable microscale robots composed of nanoscale parts fabricated to nanometer precision will allow medical doctors to execute curative and reconstructive procedures in the human body at the cellular and molecular levels. Nanomedical physicians of the 21st century will still make good use of the body’s natural healing powers and homeostatic mechanisms, because, all else being equal, those interventions are best that intervene least.


Source of Support: Nil

Conflict of Interest: None declared.


  1. Castoro R. UF expects big things from the science of small: nanotechnology. Think Small. The POST, Feb 2005.
  2. Feynman RP. There’s plenty of room at the bottom. Eng Sci. 1960;23:22–36.
  3. Drexler KE. Engines of creation: The coming era of nanotechnology. New York: Anchor Press; 1986. pp. 99–129.
  4. Freitas RA Jr. Nanomedicine, Vol. 1: Basic capabilities. Georgetown, TX: Landes Bioscience; 1999. [Last accessed on 2000 Sep 26].
  5. European Science Foundation. Nanomedicine. Forward look on Nanomedicine. 2005
  6. Freitas RA Jr. Current status of nanomedicine and medical nanorobotics. J Comput Theor Nanosci. 2005;2:1–25.
  7. Fahy GM. Short-term and long term possibilities for interventive gerontology. Mt Sinai J Med.1991;58:328–40. [PubMed]
  8. Fahy GM. Molecular nanotechnology and its possible pharmaceutical implications. In: Bezold C, Halperin JA, Eng JL, editors. 2020 visions: Health care information standards and technologies. Rockville, MD: U.S. Pharmacopeial Convention; 1993. pp. 152–9.
  9. Freitas RA., Jr Nanodentistry. J Am Dent Assoc. 2000;131:1559–66. [PubMed]
  10. Freitas R., Jr Nanotechnology, nanomedicine and nanosurgery. Int J Surg. 2005;3:243–6. [PubMed]
  11. Ishiyama K, Sendoh M, Arai KI. Magnetic micromachines for medical applications. J Magn Magn Mater. 2002;242:1163–5.
  12. Nelson B, Rajamani R. Biomedical micro-robotic system. In: Eighth international conference on medical image computing and computer-assisted intervention (MICCAI); 2005 Oct 26–29; Palm Springs, CA. Available from: http://www.miccai2005.org
  13. Chrusch DD, Podaima BW, Gordon R. Cytobots: Intracellular robotic micromanipulators. In: Kinsner W, Sebak A, editors. Conference proceedings, 2002 IEEE Canadian conference on electrical and computer engineering; 2002 May 12-15; Winnipeg, Canada. Winnipeg: IEEE; 2002.
  14. Freitas RA., Jr Exploratory design in medical nanotechnology: A mechanical artificial red cell. Artif Cells Blood Substit Immobil Biotechnol. 1998;26:411–30. [PubMed]
  15. Freitas RA., Jr Microbivores: Artificial mechanical phagocytes using digest and discharge protocol. J Evol Technol. 2005;14:1–52.
  16. Kirson ED, Yaari Y. A novel technique for micro-dissection of neuronal processes. J Neurosci Methods.2000;98:119–22. [PubMed]
  17. Yanik MF, Cinar H, Cinar HN, Chisholm AD, Jin Y, Ben-Yakar A. Neurosurgery: functional regeneration after laser axotomy. Nature. 2004;432:822. [PubMed]
  18. Konig K, Riemann I, Fischer P, Halbhuber KJ. Intracellular nanosurgery with near infrared femtosecond laser pulses. Cell Mol Biol. 1999;45:195–201. [PubMed]
  19. Chen HF, Clarkson BH, Sun K, Mansfield JF. Self-assembly of synthetic hydroxyapatite nanorods into an enamel prism-like structure. J Colloid Interface Sci. 2005;188:97–103. [PubMed]
  20. Yunshin S, Park HN, Kim KH. Biologic evaluation of Chitosan Nanofiber Membrane for guided bone regeneration. J Periodontol. 2005;76:1778–84. [PubMed]
  21. Reifman EM. Diamond teeth. In: Crandall BC, editor. Nanotechnology: Molecular speculations on global abundance. Cambridge, Mass: MIT Press; 1996. pp. 81–6.
  22. Jayaraman K, Kotaki M, Zhang Y, Mo X, Ramakrishna S. Recent advances in polymer nanofibers. J Nanosci Nanotechnol. 2004;4:52–65. [PubMed]
  23. Katti DS, Robinson KW, Ko FK, Laurencin CT. Bioresorbable nanofiber-based systems for wound healing and drug delivery: Optimization of fabrication parameters. J Biomed Mater Res. 2004;70:282–96. [PubMed]
  24. Price RL, Ellison K, Haberstroh KM, Webster TJ. Nanometer surface roughness increases select osteoblast adhesion on carbon nanofiber compacts. J Biomed Mater Res. 2004;70:129–38. [PubMed]
  25. The A to Z of nanotechnology and nanomaterials. Institute of Nanotechnology / AZoM Co Ltd; 2003.

The MIT-Harvard Center for Cancer Nanotechnology Excellence is a collaborative effort among MIT, Harvard University, Harvard Medical School, Massachusetts General Hospital, and Brigham and Women’s Hospital. It is one of eight Centers of Cancer Nanotechnology Excellence awarded by The National Cancer Institute (NCI), part of the National Institutes of Health (NIH). It focuses on developing a diversified portfolio of nanoscale devices for targeted delivery of cancer therapies, diagnostics, non-invasive imaging, and molecular sensing. In addition to general oncology applications, the Consortium focuses on prostate, brain, lung, ovarian, and colon cancer.

Examples of projects that the Consortium is undertaking include the development of:

  • Targeted nanoparticles for treating prostate cancer
  • Polymer nanoparticles and quantum dots for siRNA delivery
  • Next-generation magnetic nanoparticles for multimodal, non-invasive tumor imaging
  • Implantable, biodegradable microelectromechanical systems (MEMS), also known as lab-on-a-chip devices, for in vivo molecular sensing of tumor-associated biomolecules
  • Low-toxicity nanocrystal quantum dots for biomedical sensing

In addition to drawing on the scientific and technological expertise of its investigators, the Consortium uses available facilities for toxicology testing and the extensive mouse models of cancer collection at the collaborating institutions.

  1. Nanotechnology and Cancer

     Nanotechnology is one of the most popular areas of scientific research, especially with regard to medical applications. We’ve already discussed some of the new detection methods that should bring about cheaper, faster, and less invasive cancer diagnoses. But once the diagnosis occurs, there’s still the prospect of surgery, chemotherapy, or radiation treatment to destroy the cancer. Unfortunately, these treatments can carry serious side effects: chemotherapy can cause a variety of ailments, including hair loss, digestive problems, nausea, lack of energy, and mouth ulcers.

     But nanotechnologists think they have an answer for treatment as well, and it comes in the form of targeted drug therapies. If scientists can load their cancer-detecting gold nanoparticles with anticancer drugs, they could attack the cancer exactly where it lives. Such a treatment means fewer side effects and less medication used. Nanoparticles also carry the potential for targeted and time-release drugs: a potent dose could be delivered to a specific area but engineered to release over a planned period to ensure maximum effectiveness and the patient’s safety.

     These treatments aim to take advantage of the power of nanotechnology and the voracious tendencies of cancer cells, which feast on everything in sight, including drug-laden nanoparticles. One experiment of this type used modified bacterial cells that were 20 percent the size of normal cells. These cells were equipped with antibodies that latched onto cancer cells before releasing the anticancer drugs they contained. Another used nanoparticles as a companion to other treatments: the particles were taken up by cancer cells, which were then heated with a magnetic field to weaken them, leaving them much more susceptible to chemotherapy.

     It may sound odd, but the dye in your blue jeans or your ballpoint pen has also been paired with gold nanoparticles to fight cancer. This dye, known as phthalocyanine, reacts with light. The nanoparticles take the dye directly to cancer cells, while normal cells reject the dye. Once the particles are inside, scientists “activate” them with light to destroy the cancer. Similar therapies have existed to treat skin cancers with light-activated dye, but scientists are now working to use nanoparticles and dye to treat tumors deep in the body.

     From manufacturing to medicine to many types of scientific research, nanoparticles are now rather common, but some scientists have voiced concerns about their negative health effects. Nanoparticles’ small size allows them to infiltrate almost anywhere; that’s great for cancer treatment but potentially harmful to healthy cells and DNA. There are also questions about how to dispose of nanoparticles used in manufacturing or other processes. Special disposal techniques are needed to prevent harmful particles from ending up in the water supply or the general environment, where they’d be impossible to track.

     Gold nanoparticles are a popular choice for medical research, diagnostic testing, and cancer treatment, but there are numerous types of nanoparticles in use and in development. Bill Hammack, a professor of chemical engineering at the University of Illinois, warned that nanoparticles are “technologically sweet” [Source: Marketplace]; in other words, scientists are so wrapped up in what they can do that they’re not asking whether they should do it. The Food and Drug Administration has a task force on nanotechnology, but as of yet, the government has exerted little oversight or regulation.
  2. The U.S. Food and Drug Administration (FDA) regulates a wide range of products, including foods, cosmetics, drugs, devices, veterinary products, and tobacco products, some of which may utilize nanotechnology or contain nanomaterials. Nanotechnology allows scientists to create, explore, and manipulate materials measured in nanometers (billionths of a meter). Such materials can have chemical, physical, and biological properties that differ from those of their larger counterparts. Guidance documents issued:
    • On June 24, 2014, FDA issued three final guidance documents related to the use of nanotechnology in regulated products, including cosmetics and food substances.
    • On August 5, 2015, FDA issued one final guidance document related to the use of nanotechnology in food for animals.
      • FDA Guidance on Nanotechnology
        1. Nanotechnology Fact Sheet
        2. FDA issues three final guidances related to nanotechnology applications in regulated products, including cosmetics and food substances (June 2014)
        3. FDA issues final guidance on the use of nanotechnology in food for animals (August 2015)
        4. Nanotechnology in Therapeutics: A Focus on Nanoparticles as a Drug Delivery System

          Suwussa Bamrungsap; Zilong Zhao; Tao Chen; Lin Wang; Chunmei Li; Ting Fu; Weihong Tan
          Nanomedicine. 2012;7(8):1253–1271.

          Abstract

          Continuing improvement in the pharmacological and therapeutic properties of drugs is driving the revolution in novel drug delivery systems. In fact, a wide spectrum of therapeutic nanocarriers has been extensively investigated to address this emerging need. Accordingly, this article will review recent developments in the use of nanoparticles as drug delivery systems to treat a wide variety of diseases. Finally, we will introduce challenges and future nanotechnology strategies to overcome limitations in this field.

          Introduction

          Nanotechnology involves the engineering of functional systems at the molecular scale. Such systems are characterized by unique physical, optical and electronic features that are attractive for disciplines ranging from materials science to biomedicine. One of the most active research areas of nanotechnology is nanomedicine, which applies nanotechnology to highly specific medical interventions for the prevention, diagnosis and treatment of diseases.[1,2,401] The surge in nanomedicine research during the past few decades is now translating into considerable commercialization efforts around the globe, with many products on the market and a growing number in the pipeline. Currently, nanomedicine is dominated by drug delivery systems, accounting for more than 75% of total sales.[3]

          Nanomaterials fall into a size range similar to proteins and other macromolecular structures found inside living cells. As such, nanomaterials are poised to take advantage of existing cellular machinery to facilitate the delivery of drugs. Nanoparticles (NPs) containing encapsulated, dispersed, absorbed or conjugated drugs have unique characteristics that can lead to enhanced performance in a variety of dosage forms. When formulated correctly, drug particles are resistant to settling and can have higher saturation solubility, rapid dissolution and enhanced adhesion to biological surfaces, thereby providing rapid onset of therapeutic action and improved bioavailability. In addition, the vast majority of molecules in a nanostructure reside at the particle surface,[4] which maximizes the loading and delivery of cargos, such as therapeutic drugs, proteins and polynucleotides, to targeted cells and tissues. Highly efficient drug delivery, based on nanomaterials, could potentially reduce the drug dose needed to achieve therapeutic benefit, which, in turn, would lower the cost and/or reduce the side effects associated with particular drugs. Furthermore, NP size and surface characteristics can be easily manipulated to achieve both passive and active drug targeting. Site-specific targeting can be achieved by attaching targeting ligands, such as antibodies or aptamers, to the surface of particles, or by using guidance in the form of magnetic NPs. NPs can also control and sustain release of a drug during transport to, or at, the site of localization, altering drug distribution and subsequent clearance of the drug in order to improve therapeutic efficacy and reduce side effects.
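The claim that the vast majority of molecules in a nanostructure reside at the particle surface can be checked with simple geometry: the fraction of a sphere's volume lying within one molecular layer of its surface grows rapidly as the radius shrinks. A rough sketch (the 0.5 nm shell thickness is an illustrative assumption, not a figure from the text):

```python
# Fraction of a solid sphere's volume lying within a thin surface shell.
# For radius r and shell thickness t, the fraction is 1 - ((r - t)/r)^3.
# As r approaches molecular dimensions, nearly everything is "surface".

def surface_fraction(radius_nm: float, shell_nm: float = 0.5) -> float:
    """Volume fraction within shell_nm of the surface (0.5 nm ~ one molecular layer)."""
    return 1.0 - ((radius_nm - shell_nm) / radius_nm) ** 3

for r in (2.5, 5.0, 50.0, 500.0):
    print(f"r = {r:6.1f} nm -> {surface_fraction(r):.1%} of the volume near the surface")
# A 5 nm-radius particle has ~27% of its volume in the outer 0.5 nm,
# versus ~0.3% for a 500 nm particle.
```

This is why shrinking carrier size directly increases the loading surface available for drugs, proteins, and polynucleotides.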

          Nanotechnology could be strategically implemented in new developing drug delivery systems that can expand drug markets. Such a plan would be applied to drugs selected for full-scale development based on their safety and efficacy data, but which fail to reach clinical development because of poor biopharmacological properties, for example, poor solubility or poor permeability across the intestinal epithelium, situations that translate into poor bioavailability and undesirable pharmacokinetic properties.[5] The new drug delivery methods are expected to enable pharmaceutical companies to reformulate existing drugs on the market, thereby extending the lifetime of products and enhancing the performance of drugs by increasing effectiveness, safety and patient adherence, and ultimately reducing healthcare costs.[6–8]

          Commercialization of nanotechnology in pharmaceutical and medical science has made great progress. Taking the USA alone as an example, at least 15 new pharmaceuticals approved since 1990 have utilized nanotechnology in their design and drug delivery systems. In each case, both product development and safety data reviews were conducted on a case-by-case basis, using the best available methods and procedures, with an understanding that postmarketing vigilance for safety issues would be ongoing. Some representative examples of therapeutic nanocarriers on the market are briefly described in Table 1.

          In this review, we focus mainly on the application of nanotechnology to drug delivery and highlight several areas of opportunity where current and emerging nanotechnologies could enable novel classes of therapeutics. We look at challenges and general trends in pharmaceutical nanotechnology, and we also explore nanotechnology strategies to overcome limitations in drug delivery. However, this article can only serve to provide a glimpse into this rapidly evolving field, both now and what may be expected in the future.

          Nanocarriers & Their Applications

          Various nanoforms have been attempted as drug delivery systems, varying from biological substances, such as albumin, gelatin and phospholipids for liposomes, to chemical substances, such as various polymers and solid metal-containing NPs (Figure 1). Polymer–drug conjugates, which have high size variation, are normally not considered as NPs. However, since their size can still be controlled within 100 nm, they are also included in these nanodelivery systems. These nanodelivery systems can be designed to have drugs absorbed or conjugated onto the particle surface, encapsulated inside the polymer/lipid or dissolved within the particle matrix. As a consequence, drugs can be protected from a critical environment or their unfavorable biopharmaceutical properties can be masked and replaced with the properties of nanomaterials. In addition, nanocarriers can be accumulated preferentially at tumor, inflammatory and infectious sites by virtue of the enhanced permeability and retention (EPR) effect. The EPR effect involves site-specific characteristics, not associated with normal tissues or organs, thus resulting in increased selective targeting. Based on those properties, nanodrug delivery systems offer many advantages,[9–11] including:


          Figure 1.

          Some nanotechnology-based drug delivery platforms, including a nanocrystal, liposome, polymeric micelle, protein-based nanoparticle, dendrimer, carbon nanotube and polymer–drug conjugate.
          NP: Nanoparticle.

          • Improving the stability of hydrophobic drugs, rendering them suitable for administration;
          • Improving biodistribution and pharmacokinetics, resulting in improved efficacy;
          • Reducing adverse effects as a consequence of favored accumulation at target sites;
          • Decreasing toxicity by using biocompatible nanomaterials.

          Nanotechnology is expected to bring fundamental changes to drug production and delivery, affecting approximately half of worldwide drug production in the next decade and totaling approximately US$380 billion in revenue.[12] Next, several main nanocarriers are briefly discussed.


          Nanocrystals

          One of the most obvious and important nanotechnology tools for product development is the opportunity to convert existing drugs with poor water solubility and slow dissolution into readily water-soluble dispersions by converting them into nanosized drugs.[13,14] In other words, the drug itself may be formulated at the nanoscale such that it can function as its own ‘carrier’.[15] Many approaches have been studied, but the most practical strategy involves reducing the drug particle size to the nanometer range and stabilizing the drug NP surface with a layer of nonionic surfactants or polymeric macromolecules.[16] Reducing the particle size of the active pharmaceutical ingredient increases the drug’s surface area considerably, thereby improving its solubility and dissolution and consequently increasing both the maximum plasma concentration and the area under the curve. Once the drug is nanosized, it can be formulated into various dosage forms, such as oral, nasal and injectable formulations. These nanocrystal drugs may have advantages over association colloids (micelle solutions) because the level of surfactant per amount of drug can be greatly minimized, using only the amount necessary to stabilize the solid–fluid interface.[15]
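
The surface-area gain from nanosizing follows from simple geometry: for monodisperse spheres, specific surface area scales inversely with diameter (6/ρd). A back-of-envelope sketch; the particle density value is an illustrative assumption, not a figure from the text:

```python
def specific_surface_area(diameter_m, density_kg_m3=1200.0):
    """Specific surface area (m^2/kg) of monodisperse spheres: 6 / (rho * d)."""
    return 6.0 / (density_kg_m3 * diameter_m)

micron_particle = specific_surface_area(10e-6)   # conventional 10-um drug particle
nano_particle = specific_surface_area(100e-9)    # 100-nm drug nanocrystal
fold_increase = nano_particle / micron_particle  # 100-fold more dissolving surface
```

Since dissolution rate is proportional to exposed surface area (the Noyes–Whitney relation), this 100-fold area increase is what drives the faster dissolution and higher plasma exposure described above.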

          Furthermore, recent studies have shown that external agents, such as surfactants, can be eliminated from nanocrystal drug delivery. For example, a method was recently developed for the delivery of a hydrophobic photosensitizing anticancer drug in its pure form using nanocrystals.[17] Synthesized by the reprecipitation method, the resulting drug nanocrystals were stable in aqueous dispersion, without the need for any additional stabilizer, and were uniform in size distribution, with an average diameter of 110 nm. Such nanocrystals were efficiently taken up by tumor cells in vitro, and irradiation of these cells with visible light (665 nm) resulted in significant cell death. An in vivo study of the nanocrystal drug also showed significant efficacy compared with a conventional surfactant-based delivery system. These results illustrate the potential of pure drug nanocrystals for photodynamic therapy. As shown in Table 1, a number of well-known drugs have already been commercialized using the nanocrystal approach.

          Organic Nanoplatforms

          Liposomes Liposomes are self-assembled artificial vesicles developed from amphiphilic phospholipids. These vesicles consist of a spherical bilayer structure surrounding an aqueous core domain, and their size can vary from 50 nm to several micrometers. Liposomes have attractive biological properties, including general biocompatibility, biodegradability, isolation of drugs from the surrounding environment and the ability to entrap both hydrophilic and hydrophobic drugs. Through the addition of agents to the lipid membrane, or the alteration of the surface chemistry, liposome properties, such as size, surface charge and functionality, can be easily tuned.

          Liposomes are the most clinically established nanosystems for drug delivery. Their efficacy has been demonstrated in reducing systemic effects and toxicity, as well as in attenuating drug clearance.[18,19] Modified liposomes at the nanoscale have been shown to have excellent pharmacokinetic profiles for the delivery of DNA, antisense oligonucleotide, siRNA, proteins and chemotherapeutic agents.[20] Examples of marketed liposomal drugs with higher efficacy and lower toxicity than their nonliposomal analogues are listed in Table 1. Doxorubicin is an anticancer drug that is widely used for the treatment of various types of tumors. It is a highly toxic compound affecting not only tumor tissue, but also heart and kidney, a fact that limits its therapeutic applications. However, the development of doxorubicin enclosed in liposomes culminated in an approved nanomedical drug delivery system.[21,22] This novel liposomal formulation has resulted in reduced delivery of doxorubicin to the heart and renal system, while elevating the accumulation in tumor tissue[23,24] by the EPR effect. Furthermore, a number of liposomal drugs are currently being investigated, including anticancer agents, such as camptothecin[25] and paclitaxel (PTX),[26] as well as antibiotics, such as vancomycin[27] and amikacin.[28]

          Liposomes are also subject to some limitations, including low encapsulation efficiency, fast burst release of drugs, poor storage stability and a lack of tunable triggers for drug release.[29] Furthermore, since liposomes cannot usually permeate cells, drugs are released into the extracellular fluid.[30] As such, many efforts have focused on improving their stability and increasing circulation half-life for effective targeting or sustained drug action.[19,31] Surface modification is one method of conferring stability and structural integrity against a harsh bioenvironment after oral or parenteral administration.[32] Surface modification can be achieved by attaching polyethylene glycol (PEG) units, which form a protective layer over the liposome surface (known as stealth liposomes) to slow down liposome recognition, or by attaching other polymers, such as poly(methacrylic acid-co-cholesteryl methacrylate)[33] and poly(acrylic acid),[34] to improve the circulation time of liposomes in blood. To overcome the fast burst release of chemotherapeutic drugs from liposomes, drugs such as doxorubicin may be encapsulated in the liposomal aqueous phase by an ammonium sulphate gradient.[35] This strategy enables stable drug entrapment with negligible drug leakage during circulation, even after prolonged residence in the blood stream.[36] Further efforts to improve control over the rate of release and drug bioavailability have been made by designing liposomes whose release is environmentally triggered. Accordingly, drug release from liposomes incorporating responsive polymers or hydrogels can be triggered by a change in pH, temperature, radiofrequency or magnetic field.[37] Liposomes have also been conjugated with active-targeting ligands, such as antibodies[38–40] or folate, for target-specific drug delivery.[41]

          Polymeric NPs Polymeric NPs are colloidal particles with a size range of 10–1000 nm, and they can be spherical, branched or core–shell structures. They have been fabricated using biodegradable synthetic polymers, such as polylactide–polyglycolide copolymers, polyacrylates and polycaprolactones, or natural polymers, such as albumin, gelatin, alginate, collagen and chitosan.[42] Various methods, such as solvent evaporation, spontaneous emulsification, solvent diffusion, salting out/emulsification-diffusion, use of supercritical CO2 and polymerization, have been used to prepare the NPs.[43] Advances in polymer science and engineering have resulted in the development of smart polymers (stimuli-sensitive polymers), which can change their physicochemical properties in response to environmental signals. Physical (temperature, ultrasound, light, electricity and mechanical stress), chemical (pH and ionic strength) and biological signals (enzymes and biomolecules) have been used as triggering stimuli. Monomers sensitive to specific stimuli can be polymerized into homopolymers that respond to a single signal or into copolymers that respond to multiple stimuli. The versatility of polymer sources and their ease of combination make it possible to tune polymer sensitivity to a given stimulus within a narrow range, leading to more accurate and programmable drug delivery.

          Polymeric nanocarriers can be categorized based on three drug-incorporation mechanisms. The first includes polymeric carriers that use covalent chemistry for direct drug conjugation (e.g., linear polymers). The second group includes hydrophobic interactions between drugs and nanocarriers (e.g., polymeric micelles from amphiphilic block copolymers). Polymeric nanocarriers in the third group include hydrogels, which offer a water-filled depot for hydrophilic drug encapsulation.

          Polymer–Drug Conjugates (Prodrugs) Many polymer–drug conjugates have been developed since the first combination reported in the 1970s.[44,45] Conjugation of macromolecular polymers to drugs can significantly enhance the blood circulation time of the drugs. Especially, protein or peptide drugs, which can be readily digested inside the human body, can maintain their activity by conjugation of the water-soluble polymer PEG (PEGylation). For example, it was reported that PEGylated L-asparaginase increased its plasma half-life by up to 357 h.[46] Without PEG, the half-life of natural L-asparaginase is only 20 h. In addition to PEGylation of proteins, small molecular anticancer drugs can also be PEGylated to improve their pharmacokinetics for cancer therapy. For instance, PEG-camptothecin (PROTHECAN®) has entered clinical trials for cancer therapy.[47]
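
The practical impact of that half-life extension can be illustrated with standard first-order elimination kinetics (a textbook pharmacokinetic assumption used here for illustration, not a result from the cited study):

```python
def fraction_remaining(t_hours, half_life_hours):
    """Fraction of a circulating dose remaining after time t, first-order elimination."""
    return 0.5 ** (t_hours / half_life_hours)

# Half-lives quoted in the text:
native = fraction_remaining(24.0, 20.0)      # native L-asparaginase, t1/2 = 20 h
pegylated = fraction_remaining(24.0, 357.0)  # PEGylated form, t1/2 = 357 h
```

After a single day, roughly 43% of the native enzyme would remain in circulation versus about 95% of the PEGylated form, which is why PEGylation allows far less frequent dosing.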

          Increasing the otherwise poor solubility of some drugs is another important function of polymer–drug conjugation. Specifically, conjugating water-soluble polymers to functional groups that already exist in the drug structure can significantly enhance the water solubility of the drug. Recently, a new category of polymer–drug conjugates, called brush polymer–drug conjugates, was prepared by ring-opening metathesis copolymerization.[48] In this report, as PEG was employed as the brush polymer side chains, the conjugates exhibited significant water solubility. However, polymer–drug conjugates require chemical modification of the existing drugs; as a consequence, their production could cost more, and additional purification steps are needed. Moreover, polymers that are chemically conjugated with drugs are often considered new chemical entities owing to a pharmacokinetic profile distinct from that of the parent drugs. As such, additional US FDA approval is required, even if the parent drug has already been approved. Despite the variety of novel drug targets and sophisticated chemistries available, only four drugs (doxorubicin, camptothecin, PTX and platinate) and four polymers (N-[2-hydroxypropyl]methacrylamide [HPMA] copolymer, poly-L-glutamic acid [PGA], PEG and dextran) have been used to develop polymer–drug conjugates.[49–54] In addition to the commercially available polymer drugs listed in Table 1, PGA-PTX (Xyotax™, CT-2103; Cell Therapeutics Inc./Chugai Pharmaceutical Co. Ltd.),[55] PGA-camptothecin (CT-2106; Cell Therapeutics Inc.)[56] and HPMA–doxorubicin (PK1/FCE-28068; Pfizer Inc./Cancer Research Campaign)[57] are now in clinical trials. As an example, PK1 has been evaluated in clinical trials as an anticancer agent, and a Phase I evaluation has been completed in patients with several types of tumors resistant to prior therapy, such as chemotherapy or radiation.
          However, although the clinical results for HPMA–doxorubicin conjugates look promising, PEG-based conjugation remains the gold standard in the field of polymeric drug delivery. In addition, polymer–drug conjugates are still limited by their nonbiodegradability and by the fate of the polymers after in vivo administration.[58]

          Polymeric Micelles Polymeric micelles are formed when amphiphilic surfactants or polymeric molecules spontaneously associate in aqueous medium to form core–shell structures. The inner core of a micelle, which is hydrophobic, is surrounded by a shell of hydrophilic polymers, such as PEG.[59] The hydrophobic core serves as a reservoir for poorly water-soluble and amphiphilic drugs; at the same time, the hydrophilic shell stabilizes the core, prolongs circulation time in blood and increases accumulation in tumor tissues.[41] So far, a large variety of drug molecules have been incorporated into polymeric micelles, either by physical encapsulation[60,61] or covalent attachment.[62] Genexol-PM® (Samyang, Korea), a PEG-poly(D,L-lactide)-PTX formulation, employs Cremophor-free polymeric micelles loaded with PTX. It was found to have a three-times higher maximum tolerated dose in nude mice and two- to threefold higher levels of biodistribution, compared with those of pristine PTX, in various tissues, including tumors. A Phase I clinical trial was conducted in patients, and the results showed that Genexol-PM is superior to conventional PTX for the delivery of higher doses without additional toxicity.[63] Recently, a series of novel dual-targeting micellar delivery systems was developed based on the self-assembled hyaluronic acid-octadecyl (HA-C18) copolymer and folic acid-conjugated HA-C18 (FA-HA-C18). PTX was successfully encapsulated by HA-C18 and FA-HA-C18 polymeric micelles, with a high encapsulation efficiency of 97.3%.
          Since these copolymers are biodegradable, biocompatible and capable of cell-specific targeting, they are promising nanostructured carriers for hydrophobic anticancer drugs.[64] In addition, stimuli-responsive drug-loaded micelles[65–69] and multifunctional polymeric micelles containing imaging as well as therapeutic agents[70–72] are now under active investigation and may become the mainstream of polymeric drug development in the near future. Furthermore, computer simulation can guide the experimental preparation of drug-loaded polymeric micelles more efficiently by providing insight into the mechanism of mesoscopic structure formation and serving as a complement to experiments.[73]
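
The 97.3% encapsulation efficiency quoted above is conventionally defined as the mass of drug actually entrapped divided by the total drug added to the formulation. A minimal helper, with hypothetical masses chosen only to reproduce that figure:

```python
def encapsulation_efficiency_pct(entrapped_mg, total_added_mg):
    """EE (%) = mass of drug entrapped in carriers / total drug mass added * 100."""
    return 100.0 * entrapped_mg / total_added_mg

ee = encapsulation_efficiency_pct(9.73, 10.0)  # hypothetical masses giving 97.3%
```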

          Hydrogel NPs In recent years, hydrogel NPs have gained considerable attention as one of the most promising nanoparticulate drug delivery systems owing to their unique properties. Hydrogels are cross-linked networks of hydrophilic polymers that can absorb and retain more than 20% of their weight in water, while at the same time, maintaining the distinct 3D structure of the polymer network. Swelling properties, network structure, permeability or mechanical stability of hydrogels can be controlled by external stimuli or physiological parameters.[74–78] Hydrogels have been extensively studied for controlled release of therapeutics, stimuli-responsive release and applications in biological implants.[75,79–81] However, the hydration response to changes in stimuli in most hydrogel systems is too slow for therapeutic applications. To overcome this limitation, further development of hydrogel structures at the micro- and nano-scale is needed.[82] Recent reports showed some progress in micro- and nanogels of poly-N-isopropylacrylamide with ultrafast responses and attractive rheological properties.[83,84] Ding et al. demonstrated that cisplatin-loaded polyacrylic acid hydrogel NPs could be implanted and plastered on tumor tissue.[85] This hydrogel system exhibited superior efficacy in impeding tumor growth and prolonging lifespan in mice. The in vivo biodistribution assay also demonstrated that the hydrogel implant results in high concentration and retention of the drug. A multifunctional hybrid hydrogel was developed by combining the magnetic properties of NPs and the typical characteristics of the hydrogel. 
          These hybrid hydrogels could be used to load a large number of drugs and transport them to the target site by the application of an external magnetic field.[86] To improve the specificity of hydrogel drug delivery systems, core–shell nanogels were developed, which utilize aptamers as the recognition element and near-infrared light as a triggering stimulus for drug delivery. In this system, gold (Au)–silver nanorods, which possess intense absorption bands in the near-infrared range, were coated with DNA cross-linked polymeric shells, so that drugs can be rapidly and controllably released upon near-infrared irradiation.[87] As the fate of hydrogel NPs after in vivo administration may be a concern for clinical applications, biodegradable hydrogel NPs with diameters of approximately 200 nm have been synthesized via inverse miniemulsion reversible addition–fragmentation chain-transfer polymerization of 2-(dimethylamino)ethyl methacrylate. A disulfide cross-linker was used to cross-link the NPs, so that the polymer network can be degraded into its constituent primary chains on exposure to a reductive environment. These biodegradable hydrogel NPs are currently being investigated for the encapsulation and controlled release of siRNA.[88] Although no hydrogel NP-based drug is yet commercially available, hydrogel NPs hold considerable promise for future drug delivery systems owing to their high biocompatibility and effective drug-loading properties.

          Protein-based NPs Hydrophobic drugs, such as taxanes, are highly active and widely used in a variety of solid tumor therapies. Both PTX and docetaxel, the commercially available taxanes for clinical treatment, are hydrophobic. Because of their solubility problems, they have been formulated as suspensions with nonionic surfactants, such as Cremophor EL® (BASF Corp.) for PTX and Tween-80 (ICI Americas, Inc.) for docetaxel. However, these surfactants are associated with hypersensitivity reactions and toxic side effects on tissues. To decrease toxicity, an albumin-bound PTX formulation (Abraxane®) was developed, yielding NPs approximately 130 nm in size, and it has been approved by the FDA for breast cancer treatment.[89–91] In addition to reduced toxicity, albumin–PTX has been found to bind the albumin receptor (gp60) on endothelial cells, with further extravascular transport,[92–94] resulting in an increase in drug concentration at tumor sites without hypersensitivity reactions. The albumin–PTX complex is approved in 38 countries for the treatment of metastatic breast cancer. Furthermore, Abraxane® is currently in various stages of investigation for the treatment of other cancers, such as non-small-cell lung cancer, malignant melanoma, and pancreatic and gastric cancer.

          Dendrimers Dendrimers are synthetic, branched macromolecules that form a tree-like structure. Unlike most linear polymers, the chemical composition and molecular weight of dendrimers can be precisely controlled; hence, it is relatively easy to predict their biocompatibility and pharmacokinetics.[95] Dendrimers are very uniform, with extremely low polydispersities, and they are commonly created with dimensions grown incrementally, in approximately nanometer steps, from 1 to over 10 nm. Their globular structures and the presence of internal cavities enable drugs to be encapsulated within the macromolecule interior and released in a controlled manner from the inner core.[96] Although the small size (up to 10 nm) of dendrimers limits extensive drug incorporation, their dendritic nature and branching allow drug loading onto the outside surface of the structure[97] via covalent binding or electrostatic interactions. Dendrimers can be synthesized by either divergent or convergent approaches. In the divergent approach, dendrimers are synthesized from the core and further built up in layers called generations. However, this method provides a low yield, because the reactions must be conducted on a single molecule possessing a large number of equivalent reaction sites.[98] In addition, a large amount of reagent is required for the later stages of synthesis, complicating purification. In the convergent method, synthesis begins at the periphery of the dendrimer molecule and stops at the core. In this approach, each synthesized generation can be subsequently purified.[98]
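
The generation-by-generation growth described above follows simple exponential branching: an ideal, defect-free dendrimer with core multiplicity Nc and branch multiplicity Nb carries Nc × Nb^G terminal groups at generation G. A sketch using PAMAM-like parameters (Nc = 4, Nb = 2; illustrative values, not specific to any cited system):

```python
def terminal_groups(core_multiplicity, branch_multiplicity, generation):
    """Surface (terminal) group count of an ideal dendrimer at a given generation."""
    return core_multiplicity * branch_multiplicity ** generation

g0 = terminal_groups(4, 2, 0)  # generation 0: just the core's 4 arms
g4 = terminal_groups(4, 2, 4)  # generation 4: 4 * 2^4 = 64 surface groups
```

This doubling per generation is why surface loading capacity rises steeply with generation, and also why the divergent route demands ever larger reagent excesses at later generations.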

          Drug molecules associated with dendrimers can be utilized for cancer treatment,[99] for the enhancement of drug solubility and permeability (dendrimer–drug conjugates)[100] and for intracellular delivery.[101] Some drugs can be physically encapsulated inside the dendrimer network or form linkages (either covalent or noncovalent) on the dendrimer surface.[102] Furthermore, functionalization of the dendrimer surface with specific ligands can enhance targeting. For example, Myc et al. reported a polyamidoamine dendrimer conjugate containing folic acid (FA) as the targeting agent and methotrexate as the therapeutic agent.[103] Cytotoxicity and specificity were tested with both FA receptor-expressing and nonexpressing cells. Both in vitro and in vivo results showed that the dendrimer conjugate was preferentially cytotoxic to the target cells. A polyamidoamine dendrimer conjugated with an anti-prostate-specific membrane antigen antibody has also been demonstrated.[104] The antibody–dendrimer conjugate specifically bound to prostate-specific membrane antigen-positive, but not -negative, cell lines. However, dendrimer toxicity and immunogenicity are the main concerns when dendrimers are applied for drug delivery. Since clinical experience with dendrimers has so far been limited, it is hard to tell whether dendrimers are intrinsically ‘safe’ or ‘toxic’.

          Inorganic Platforms

          Au NPs Noble metal NPs, such as Au NPs, have emerged as a promising scaffold for drug and gene delivery in that they provide a useful complement to more traditional delivery vehicles. The combination of inertness and low toxicity,[105] easy synthesis, very large surface area, well-established surface functionalization (generally through thiol linkages) and tunable stability provides Au NPs with unique attributes that enable new delivery strategies. Moreover, excess loading of pharmaceuticals on NPs creates ‘drug reservoirs’ for controlled and sustained release, thereby maintaining the drug level within the therapeutic window. An Au NP with a 2-nm core diameter could, in principle, be conjugated with up to 100 drug molecules through the available ligands (n = 108) in the monolayer.[106] Zubarev et al. have recently succeeded in coupling 70 molecules of PTX, a chemotherapeutic drug, to an Au NP with a 2-nm core diameter.[107] Efficient release of these therapeutic agents can be triggered by internal (e.g., glutathione[108] or pH[109]) or external (e.g., light[110,111]) stimuli. In addition to serving as carriers for drug delivery, Au NPs can also be imaged using contrast imaging techniques. Once the Au NPs are targeted to the diseased site, such as a tumor, hyperthermia treatment can be used for tumor destruction. For example, a recent study demonstrated that PEGylated Au NPs were employed for highly efficient drug delivery and in vivo photodynamic therapy of cancer.[112] Compared with conventional photodynamic therapy drug delivery in vivo, PEGylated Au NPs accelerated the delivery of silicon phthalocyanine 4 by approximately two orders of magnitude, without side effects in treated mice. The key issue that needs to be addressed with Au NPs is the engineering of the particle surface for optimized properties, such as bioavailability and nonimmunogenicity.
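
The ligand count cited for a 2-nm core can be sanity-checked with sphere geometry: dividing the core surface area (4πr²) by the number of monolayer ligands gives the average footprint per ligand. This is rough arithmetic only; real thiol packing also depends on ligand bulk and on the larger radius of the monolayer's outer surface:

```python
import math

def footprint_per_ligand_nm2(core_diameter_nm, n_ligands):
    """Average core-surface area (nm^2) available to each monolayer ligand."""
    r = core_diameter_nm / 2.0
    return 4.0 * math.pi * r ** 2 / n_ligands

footprint = footprint_per_ligand_nm2(2.0, 108)  # ~0.12 nm^2 per ligand at the core
```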

          Superparamagnetic NPs Magnetic NPs have been proposed as drug carriers, with a push towards clinical trials.[113] The superparamagnetic properties of iron oxide particles can be used to guide drug-loaded microcapsules into place by external magnetic fields. Another advantage of using magnetic NPs is the ability to heat the particles after internalization, which is known as the hyperthermia effect. For example, Brazel et al. developed a grafted thermosensitive polymeric system by embedding FePt NPs in poly(N-isopropylacrylamide)-based hydrogels, which can be triggered to release the loaded drug by a temperature increase induced by magnetic heating.[114] The grafted hydrogel system has also been shown to exhibit a desirable positive thermal response, with an increased drug diffusion coefficient at temperatures higher than physiological temperature.[115]

          Besides being utilized for targeting and raising temperature, magnetic NPs can also affect the permeability of microcapsules by applying external oscillating magnetic fields and releasing encapsulated materials.[116] For example, ferromagnetic Au-coated cobalt NPs (3 nm in diameter) were incorporated into the polymer walls of microcapsules. Subsequently, application of external alternating magnetic fields of 100–300 Hz and 1200 Oe strength disturbed the capsule wall structures and dramatically increased their permeability to macromolecules. This work supports the hypothesis that magnetic NPs embedded in polyelectrolyte capsules can be used for the controlled release of substances by applying an external magnetic field.

          The main benefits of superparamagnetic NPs over classical cancer therapies are minimal invasiveness, accessibility of hidden tumors and minimal side effects. Conventional heating of a tissue by, for example, microwaves or laser light results in the destruction of healthy tissue surrounding the tumor. However, targeted paramagnetic particles provide a powerful strategy for localized heating of cancerous cells.

          Ceramic NPs Ceramic NPs are particles fabricated from inorganic compounds with porous characteristics, such as silica, alumina and titania.[117–119] Among these, silica NPs have attracted much research attention as a result of their biocompatibility and ease of synthesis and surface modification.[120–122,301] Furthermore, the well-established silane chemistry facilitates the cross-linking of drugs to silica particles.[123,124] For example, recent breakthroughs in mesoporous silica NPs (MSNs) have brought new possibilities to this burgeoning area of research. MSNs contain hundreds of empty channels (mesopores) arranged in a 2D network with a honeycomb-like porous structure. In contrast to the low biocompatibility of other amorphous silica materials, recent studies have shown that MSNs exhibit superior biocompatibility at concentrations adequate for pharmacological applications.[125,126] Once the vehicle is localized in the cytoplasm, it is desirable to have effective control over the release of drug molecules in order to reach pharmacologically effective levels. The ability to selectively functionalize the external particle surface and/or the interior nanochannel surfaces of MSNs is advantageous in achieving this goal.[127,128] Different functional groups can be added by using this methodology, including, for example, functionalization with stimuli-responsive tethers that can be further attached to NPs (Au and iron oxide). These NPs can work as gatekeepers and be removed by either intracellular or external triggers, such as changes in pH, a reducing environment, enzymatic activity, light, an electromagnetic field or ultrasound.[128] The surface of MSNs can also be engineered with cell-specific moieties, such as organic molecules, peptides, aptamers and antibodies, to achieve cell-type or tissue specificity. Moreover, optical and magnetic contrast agents can be introduced to develop multipurpose drug delivery systems.

          These strategies demonstrated that the application of target-specific MSN vehicles in vitro is promising; however, the application in vivo has not yet been reported. These particles are not biodegradable; consequently, there is a concern that they may accumulate in the human body and cause harmful effects.[117] For further in vivo applications, the biocompatibility, biodistribution, retention, degradation and clearance of MSNs must be systematically investigated.

          Carbon-based Nanomaterials Carbon-based nanomaterials have attracted particular interest because they can be surface functionalized for the grafting of nucleic acids, peptides and proteins. Carbon nanotubes (CNTs), fullerene, and nanodiamonds[129] have been extensively studied for drug delivery applications.[130] The size, geometry and surface characteristics of single-wall nanotubes (SWNTs), multiwall nanotubes and C60 fullerenes make them appealing for drug carrier usage. For example, PTX-conjugated SWNTs have shown promise for in vivo cancer treatment. SWNT delivery of PTX affords markedly improved treatment efficacy over clinical Taxol (Bristol-Myers Squibb Co.), as evidenced by its ability to slow down tumor growth at a low PTX dose.[131]

          However, the primary drawback of carbon-based nanomaterials appears to be their toxicity. Experiments have shown that CNTs can lead to cell proliferation inhibition and apoptosis. Although they are less toxic than carbon fibers and NPs, the toxicity of CNTs increases significantly when carbonyl, carboxyl and/or hydroxyl functional groups are present on their surface.[132] Because of the reported toxicity of CNTs,[133–137] studies involving their application for drug delivery are still being conducted.[138–140] In order to promote the application of CNTs for drug delivery, researchers have functionalized their surface, rendering them benign.[136] Unfortunately, concerns that functionalized CNTs may revert to a toxic state if the functional groups detach have limited the pursuit of these modified CNTs for biomedical applications.

          The toxicity of other forms of nanocarbons has also been reported.[132,140,141] One study of human lung tumor cells showed that carbon NPs are even more toxic than multiwall nanotubes and carbon nanofibers.[132] Given the mounting evidence demonstrating the toxicity of carbon NPs, the enthusiasm to develop carbon NPs for drug delivery has decreased significantly in recent years.

          Integrated Nanocomposite Particles

          A variety of nanoplatforms have been developed for a wide spectrum of applications, and each of these platforms has unique advantages and limitations. By combining the specific functions of each material, new hybrid nanocomposite materials can be fabricated. For instance, liposomes and polymeric NPs are the two most widely studied drug delivery platforms, and attempts have been made to combine the advantages of both systems. A recent study reported the use of nanocells consisting of nuclear poly(lactic-co-glycolic acid) NPs within an extranuclear PEGylated phospholipid envelope for temporal targeting of tumor cells and neovasculature.[142] Moreover, liposomes are routinely coated with a hydrophilic polymer, such as PEG or poly(ethylene oxide), to improve their circulation time in vivo, which is another example of a liposome–polymer composite.[143] Similarly, liposomal locked-in dendrimers, which combine liposomes and dendrimers in one formulation, have resulted in higher drug loading and slower drug release from the composite, as compared with pure liposomes.[144] In another example, a LipoMag formulation, which consists of an oleic acid-coated magnetic nanocrystal core and a cationic lipid shell, was magnetically guided to deliver and silence genes in cells and tumors in mice.[145]

          Targeting Strategies

          Two basic requirements should be met in the design of nanocarriers to achieve effective drug delivery (Figure 2). First, drugs should be able to reach the desired tumor sites after administration with minimal loss of volume and activity in the blood circulation. Second, drugs should kill only tumor cells, without harmful effects on healthy tissue.[146] These requirements may be met using two strategies: passive and active targeting of drugs.[147]


          Figure 2.

          Passive and active targeting.
          By the enhanced permeability and retention effect, nanoparticles (NPs) can be passively extravasated through leaky vascularization, allowing their accumulation at the tumor region (A). In this case, drugs may be released in the extracellular matrix and then diffuse through the tissue. Active targeting (B) can enhance the therapeutic efficacy of drugs by the increased accumulation and cellular uptake of NPs through receptor-mediated endocytosis. NPs can be engineered to incorporate ligands that bind to endothelial cell surface receptors. In this case, the enhanced permeability and retention effect does not pertain, and the presence of leaky vasculature is not required.

          Passive Targeting

          Passive targeting takes advantage of the unique pathophysiological characteristics of tumor vessels, enabling nanodrugs to accumulate in tumor tissues. Typically, tumor vessels are highly disorganized and dilated with a high number of pores, resulting in enlarged gap junctions between endothelial cells and compromised lymphatic drainage. This ‘leaky’ vascularization gives rise to the enhanced permeability and retention (EPR) effect, which allows migration of macromolecules up to 400 nm in diameter into the surrounding tumor region.[147–149] One of the earliest nanoscale technologies for passive targeting of drugs was based on the use of liposomes. More advanced liposomes are coated with a synthetic polymer that protects the agents from immune destruction.[150]

          Beyond the EPR effect, the microenvironment surrounding tumor tissue differs from that of healthy tissue, a physiological difference that also supports passive targeting. Because of their high metabolic rate, fast-growing tumor cells require more oxygen and nutrients. Consequently, glycolysis is stimulated to obtain extra energy, resulting in an acidic environment.[151] Taking advantage of this, pH-sensitive liposomes have been designed to be stable at physiological pH 7.4 but to degrade and release drug molecules at acidic pH.[152]
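As a toy illustration of the pH trigger described above (the pKa and compartment pH values are assumptions for illustration, not data from the cited studies), the Henderson-Hasselbalch equation shows how a titratable lipid flips from mostly unprotonated at blood pH to mostly protonated in acidic compartments:

```python
# Toy calculation (illustrative only): a titratable lipid with an assumed pKa
# of 6.5 stays mostly unprotonated (liposome stable) at blood pH 7.4 but
# becomes substantially protonated (liposome destabilized, drug released) in
# the more acidic tumor and endosomal compartments.

def protonated_fraction(ph: float, pka: float = 6.5) -> float:
    """Fraction of a titratable group in its protonated form at a given pH."""
    return 1.0 / (1.0 + 10 ** (ph - pka))

for label, ph in [("blood, pH 7.4", 7.4),
                  ("tumor interstitium, pH 6.5", 6.5),
                  ("endosome, pH 5.5", 5.5)]:
    print(f"{label}: {protonated_fraction(ph):.0%} protonated")
```

The steep change over roughly one pH unit is what makes a well-chosen pKa act as a switch rather than a dimmer.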

          Although passive targeting approaches form the basis of clinical therapy, they suffer from several limitations. Ubiquitously targeting cells within a tumor is not always feasible because some drugs cannot diffuse efficiently, and the random nature of the approach makes it difficult to control the process. The passive strategy is further limited because certain tumors do not exhibit an EPR effect, and the permeability of vessels may not be the same throughout a single tumor.[153]

          Active Targeting

          One way to overcome the limitations of passive targeting is to attach affinity ligands (antibodies,[154] peptides,[155] aptamers[156] or small molecules[157] that bind only to specific receptors on the cell surface) to the surface of the nanocarriers by a variety of conjugation chemistries. Nanocarriers will recognize and bind to target cells through ligand–receptor interactions based on the expression of receptors or epitopes on the cell surface. To achieve high specificity, these receptors should be highly expressed on tumor cells, but not on normal cells. Furthermore, the receptors should be expressed homogeneously and should not be shed into the blood circulation. Internalization of targeting conjugates can also occur by receptor-mediated endocytosis after binding to target cells, facilitating drug release inside the cells. In this mechanism, targeting conjugates first bind their receptors, after which the plasma membrane encloses the ligand–receptor complex to form an endosome. The newly formed endosome is trafficked to specific organelles, where drugs can be released by acidic pH or enzymes. Although the active targeting strategy looks intriguing, nanodrugs currently approved for clinical use are relatively simple and generally lack active targeting or triggered drug release components. Moreover, nanodrugs currently under clinical development lack specific targeting. To fully explore the application of targeted drug delivery, we need to investigate whether specific diseases are the correct application for targeting, whether the properties of the therapeutic drugs, as well as their site and mode of action, are suited for targeting, and whether the delivery vehicles are optimal for product development.[158]
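One way to see why receptor overexpression translates into selectivity is a toy avidity model (all numbers below are assumptions for illustration, not from the cited work): if stable attachment of a multivalent nanocarrier requires at least two simultaneous ligand-receptor bonds, selectivity can grow faster than the receptor-density ratio itself:

```python
from math import comb

# Toy avidity model (illustrative assumptions): a single ligand-receptor bond
# is too weak to hold a nanocarrier, so stable attachment needs at least k
# simultaneous bonds. Each of n ligands binds independently with probability
# p, taken proportional to receptor density on the cell surface.

def stable_attachment(p: float, n: int = 50, k: int = 2) -> float:
    """P(at least k of n independent ligand-receptor bonds form)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_tumor, p_normal = 0.02, 0.002   # assumed 10x receptor overexpression
tumor = stable_attachment(p_tumor)
normal = stable_attachment(p_normal)
print(f"tumor: {tumor:.3f}  normal: {normal:.4f}  selectivity: {tumor/normal:.0f}x")
```

With these assumed numbers a 10x difference in receptor density yields far more than a 10x difference in attachment probability, which is the qualitative point behind requiring high, tumor-restricted receptor expression.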

          Key Factors Impacting Drug Delivery

          In order to achieve effective drug delivery, nanocarriers must remain in circulation long enough that drugs are not eliminated before reaching their target. Based on previous investigations, size, shape and surface characteristics are key factors that impact the efficiency of drug delivery systems.


          Nanotechnology is an emerging field with the potential to revolutionize drug delivery. Advances in this area have allowed some nanomedicines in the market to achieve desirable pharmacokinetic properties, reduce toxicity and improve patient compliance, as well as clinical outcomes. Integration of nanoparticulate drug delivery technologies in preformulation work not only accelerates the development of new therapeutic moieties, but also helps in the reduction of attrition of new molecular entities caused by undesirable biopharmaceutical and pharmacokinetic properties.

          Optimizing the integration of nanomaterials into drug delivery systems will require standardized metrics for their classification, as well as protocols for their handling. This will, in turn, result in a better understanding of the interactions of nanomaterials with biological systems, which will facilitate better engineering of their properties specific to biomedical applications. The development of such drug carriers will require a greater understanding of both the surface chemistry of nanomaterials and the interaction chemistry of these nanomaterials with biological systems. This can only be achieved through collaborative efforts among scientists in different disciplines. Those who work in this emerging field should have up-to-date information on related toxicology issues, potential health and safety risks and the regulatory environment that will impact patient use. Understanding both the benefits and the risks of these new nanotechnology applications will be essential to good decision-making for drug developers, regulators and ultimately the consumers and patients who will be the beneficiaries of new drug delivery technologies.

        5. Nanoparticles wrapped inside human platelet membranes serve as new vehicles for targeted drug delivery.

Nanoparticles disguised as human platelets could greatly enhance the healing power of drug treatments for cardiovascular disease and systemic bacterial infections. These platelet-mimicking nanoparticles, developed by engineers at the University of California, San Diego, are capable of delivering drugs to targeted sites in the body — particularly injured blood vessels, as well as organs infected by harmful bacteria. The engineers demonstrated that by delivering drugs only to the areas where they were needed, these platelet copycats greatly increased the therapeutic effects of treatments administered to diseased rats and mice.

“This work addresses a major challenge in the field of nanomedicine: targeted drug delivery with nanoparticles,” said Liangfang Zhang, a nanoengineering professor at UC San Diego and the senior author of the study. “Because of their targeting ability, platelet-mimicking nanoparticles can directly provide a much higher dose of medication specifically to diseased areas without saturating the entire body with drugs.”


The study is an excellent example of using engineering principles and technology to achieve “precision medicine,” said Shu Chien, a professor of bioengineering and medicine, director of the Institute of Engineering in Medicine at UC San Diego, and a corresponding author on the study. “While this proof of principle study demonstrates specific delivery of therapeutic agents to treat cardiovascular disease and bacterial infections, it also has broad implications for targeted therapy for other diseases such as cancer and neurological disorders,” said Chien.

The ins and outs of the platelet copycats

On the outside, platelet-mimicking nanoparticles are cloaked with human platelet membranes, which enable the nanoparticles to circulate throughout the bloodstream without being attacked by the immune system. The platelet membrane coating has another beneficial feature: it preferentially binds to damaged blood vessels and certain pathogens such as MRSA bacteria, allowing the nanoparticles to deliver and release their drug payloads specifically to these sites in the body.

Enclosed within the platelet membranes are nanoparticle cores made of a biodegradable polymer that can be safely metabolized by the body. The nanoparticles can be packed with many small drug molecules that diffuse out of the polymer core and through the platelet membrane onto their targets.

To make the platelet-membrane-coated nanoparticles, engineers first separated platelets from whole blood samples using a centrifuge. The platelets were then processed to isolate the platelet membranes from the platelet cells. Next, the platelet membranes were broken up into much smaller pieces and fused to the surface of nanoparticle cores. The resulting platelet-membrane-coated nanoparticles are approximately 100 nanometers in diameter, which is one thousand times thinner than an average sheet of paper.

This cloaking technology is based on the strategy that Zhang’s research group had developed to cloak nanoparticles in red blood cell membranes. The researchers previously demonstrated that nanoparticles disguised as red blood cells are capable of removing dangerous pore-forming toxins produced by MRSA, poisonous snake bites and bee stings from the bloodstream.

By using the body’s own platelet membranes, the researchers were able to produce platelet mimics that contain the complete set of surface receptors, antigens and proteins naturally present on platelet membranes. This is unlike other efforts, which synthesize platelet mimics that replicate one or two surface proteins of the platelet membrane.

“Our technique takes advantage of the unique natural properties of human platelet membranes, which have a natural preference to bind to certain tissues and organisms in the body,” said Zhang. This targeting ability, which red blood cell membranes do not have, makes platelet membranes extremely useful for targeted drug delivery, researchers said.

Platelet copycats at work

In one part of this study, researchers packed platelet-mimicking nanoparticles with docetaxel, a drug used to prevent scar tissue formation in the lining of damaged blood vessels, and administered them to rats afflicted with injured arteries. Researchers observed that the docetaxel-containing nanoparticles selectively collected onto the damaged sites of arteries and healed them.

When packed with a small dose of antibiotics, platelet-mimicking nanoparticles can also greatly reduce bacterial infections that have entered the bloodstream and spread to various organs in the body. Researchers injected nanoparticles containing just one-sixth the clinical dose of the antibiotic vancomycin into a group of mice systemically infected with MRSA bacteria. The organs of these mice ended up with bacterial counts up to one thousand times lower than those of mice treated with the clinical dose of vancomycin alone.

“Our platelet-mimicking nanoparticles can increase the therapeutic efficacy of antibiotics because they can focus treatment on the bacteria locally without spreading drugs to healthy tissues and organs throughout the rest of the body,” said Zhang. “We hope to develop platelet-mimicking nanoparticles into new treatments for systemic bacterial infections and cardiovascular disease.”

6.  Sponge-like nanoporous gold could be key to new devices to detect disease-causing agents in humans and plants, according to UC Davis researchers.

A group from the UC Davis Department of Electrical and Computer Engineering has demonstrated that it could detect nucleic acids using nanoporous gold, a novel sensor coating material, in mixtures of other biomolecules that would gum up most detectors. This method enables sensitive detection of DNA in complex biological samples, such as serum from whole blood.

“Nanoporous gold can be imagined as a porous metal sponge with pore sizes that are a thousand times smaller than the diameter of a human hair,” said Erkin Şeker, assistant professor of electrical and computer engineering at UC Davis and the senior author on the papers. “What happens is the debris in biological samples, such as proteins, is too large to go through those pores, but the fiber-like nucleic acids that we want to detect can actually fit through them. It’s almost like a natural sieve.”
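A toy sieve model of Şeker's "natural sieve" picture can make the size argument concrete (all sizes below are rough, assumed order-of-magnitude values, not measurements from the study):

```python
# Toy size-exclusion model (illustrative assumptions only): whether a molecule
# threads through a nanopore depends on its smallest cross-section, which is
# why a long but thin nucleic acid can pass a pore that blocks bulky protein
# aggregates and cell debris.

HAIR_DIAMETER_NM = 80_000                      # ~80 micrometers, typical hair
PORE_DIAMETER_NM = HAIR_DIAMETER_NM / 1000     # "a thousand times smaller"

species_min_dimension_nm = {                   # assumed smallest cross-sections
    "double-stranded DNA (width)": 2,
    "protein aggregate": 300,
    "cell debris": 1_000,
}

passes_pore = {name: size < PORE_DIAMETER_NM
               for name, size in species_min_dimension_nm.items()}
for name, ok in passes_pore.items():
    print(f"{name}: {'passes' if ok else 'blocked'}")
```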


Rapid and sensitive detection of nucleic acids plays a crucial role in early identification of pathogenic microbes and disease biomarkers. Current sensor approaches usually require nucleic acid purification that relies on multiple steps and specialized laboratory equipment, which limit the sensors’ use in the field. The researchers’ method reduces the need for purification.

“So now we hope to have largely eliminated the need for extensive sample clean-up, which makes the process conducive to use in the field,” Şeker said.

The result is a faster and more efficient process that can be applied in many settings.

The researchers hope the technology can be translated into the development of miniature point-of-care diagnostic platforms for agricultural and clinical applications.

“The applications of the sensor are quite broad ranging from detection of plant pathogens to disease biomarkers,” said Şeker.

For example, in agriculture, scientists could detect whether a certain pathogen exists on a plant without seeing any symptoms. And in sepsis cases in humans, doctors might determine bacterial contamination much more quickly than at present, preventing any unnecessary treatments.

7.  Pushing the limits of lensless imaging

The Optical Society


WASHINGTON — Using ultrafast beams of extreme ultraviolet light streaming at 100,000 pulses per second, researchers from the Friedrich Schiller University Jena, Germany, have pushed the boundaries of a well-established imaging technique. Not only did they make the highest resolution images ever achieved with this method at a given wavelength, they also created images fast enough to be used in real time. Their new approach could be used to study everything from semiconductor chips to cancer cells.

The team will present their work at the Frontiers in Optics, The Optical Society’s annual meeting and conference in San Jose, California, USA, on October 22, 2015.

The researchers wanted to improve on a lensless imaging technique called coherent diffraction imaging, which has been around since the 1980s. To take a picture with this method, scientists fire an X-ray or extreme ultraviolet laser at a target. The light scatters off, and some of those photons interfere with one another and find their way onto a detector, creating a diffraction pattern. By analyzing that pattern, a computer then reconstructs the path those photons must have taken, which generates an image of the target material—all without the lens that’s required in conventional microscopy.
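A minimal sketch of the reconstruction idea is iterative phase retrieval on simulated data (the Jena team's actual algorithms and parameters are not described in this article; this is the classic error-reduction scheme, where only the diffraction magnitudes are "measured" and the computer bounces between detector and object constraints):

```python
import numpy as np

# Error-reduction phase retrieval (Gerchberg-Saxton/Fienup style), a
# simplified stand-in for the "computer emulates the lens" step: the detector
# records only Fourier magnitudes; the object is recovered by alternately
# imposing those magnitudes and a known object-domain support.

rng = np.random.default_rng(0)
n = 32
support = np.zeros((n, n), dtype=bool)
support[12:20, 10:22] = True
obj = rng.random((n, n)) * support             # unknown sample inside support

measured_mag = np.abs(np.fft.fft2(obj))        # detector records magnitudes only

est = rng.random((n, n)) * support             # random initial guess
for _ in range(500):
    F = np.fft.fft2(est)
    F = measured_mag * np.exp(1j * np.angle(F))   # impose measured magnitudes
    est = np.real(np.fft.ifft2(F))
    est[~support] = 0                             # impose object-domain support
    est[est < 0] = 0                              # and positivity

err = np.linalg.norm(est - obj) / np.linalg.norm(obj)
print(f"relative reconstruction error: {err:.3f}")
```

Real experiments add complications this sketch ignores (noise, unknown support, missing center pixels), which is why practical codes use more robust variants such as hybrid input-output.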

“The computer does the imaging part—forget about the lens,” explained Michael Zürch, Friedrich Schiller University Jena, Germany and lead researcher. “The computer emulates the lens.”

Without a lens, the quality of the images primarily depends on the radiation source. Traditionally, researchers use big, powerful X-ray beams like the one at the SLAC National Accelerator Laboratory in Menlo Park, CA, USA. Over the last 10 years, researchers have developed smaller, cheaper machines that pump out coherent, laser-like beams in the laboratory setting. While those machines are convenient from a cost perspective, they have their drawbacks.

The table-top machines are unable to produce as many photons as the big, expensive ones, which limits their resolution. To achieve higher resolutions, the detector must be placed close to the target material—similar to placing a specimen close to a microscope to boost the magnification. Given the geometry of such short distances, hardly any photons will bounce off the target at large enough angles to reach the detector. Without enough photons, the image quality is reduced.

Zürch and a team of researchers from Jena University used a special, custom-built ultrafast laser that fires extreme ultraviolet photons a hundred times faster than conventional table-top machines. With more photons, at a wavelength of 33 nanometers, the researchers were able to make an image with a resolution of 26 nanometers — almost the theoretical limit. “Nobody has achieved such a high resolution with respect to the wavelength in the extreme ultraviolet before,” Zürch said.
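As a back-of-the-envelope check, assuming the Abbe diffraction limit (resolution ≈ λ / 2·NA; the article does not state which criterion was used), the reported numbers imply how much of the scattered light the detector must capture:

```python
import math

# Rough check (assumed Abbe criterion, res = wavelength / (2 * NA)): what
# numerical aperture, and hence what collection half-angle, is implied by a
# 26 nm resolution at a 33 nm wavelength?

wavelength_nm = 33.0
resolution_nm = 26.0

na = wavelength_nm / (2 * resolution_nm)        # implied numerical aperture
half_angle_deg = math.degrees(math.asin(na))    # scattering half-angle needed

print(f"implied NA: {na:.2f}")
print(f"detector must collect photons out to ~{half_angle_deg:.0f} degrees off-axis")
```

An NA well above 0.6 illustrates why the detector has to sit so close to the target, and why resolution "almost at the theoretical limit" requires catching photons scattered at very steep angles.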

The ultrafast laser also overcame another drawback of conventional table-top light sources: long exposure times. If researchers have to wait for images, they can’t get real-time feedback on the systems they study. Thanks to the new high-speed light source, Zürch and his colleagues have reduced the exposure time to only about a second — fast enough for real-time imaging. When taking snapshots every second, the researchers reached a resolution below 80 nanometers.

The prospect of high-resolution and real-time imaging using such a relatively small setup could lead to all kinds of applications, Zürch said. Engineers can use this to hunt for tiny defects in semiconductor chips. Biologists can zoom in on the organelles that make up a cell. Eventually, he said, the researchers might be able to cut down on the exposure times even more and reach even higher resolution levels.

About FiO/LS

Frontiers in Optics (FiO) 2015 is The Optical Society’s (OSA) 99th Annual Meeting and is being held together with Laser Science, the 31st annual meeting of the American Physical Society (APS) Division of Laser Science (DLS). The two meetings unite the OSA and APS communities for five days of quality, cutting-edge presentations, in-demand invited speakers and a variety of special events spanning a broad range of topics in optics and photonics—the science of light—across the disciplines of physics, biology and chemistry. The exhibit floor will feature leading optics companies, technology products and programs.

About The Optical Society

Founded in 1916, The Optical Society (OSA) is a leading professional organization for scientists, engineers, students and entrepreneurs who fuel discoveries, shape real-life applications and accelerate achievements in the science of light. Through world-renowned publications, meetings and membership initiatives, OSA provides quality research, inspired interactions and dedicated resources for its extensive global network of optics and photonics experts. OSA is a founding partner of the National Photonics Initiative and the 2015 International Year of Light.

SOURCE: The Optical Society

8.  Physicists determine three-dimensional positions of individual atoms for the first time

Katherine Kornei, UCLA
The scientists were able to plot the exact coordinates of nine layers of atoms with a precision of 19 trillionths of a meter. (Courtesy of Mary Scott and Jianwei (John) Miao/UCLA)

Atoms are the building blocks of all matter on Earth, and the patterns in which they are arranged dictate how strong, conductive or flexible a material will be. Now, scientists at UCLA have used a powerful microscope to image the three-dimensional positions of individual atoms to a precision of 19 trillionths of a meter, which is several times smaller than a hydrogen atom.

Their observations make it possible, for the first time, to infer the macroscopic properties of materials based on their structural arrangements of atoms, which will guide how scientists and engineers build aircraft components, for example. The research, led by Jianwei (John) Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, is published September 21 in the online edition of the journal Nature Materials.

For more than 100 years, researchers have inferred how atoms are arranged in three-dimensional space using a technique called X-ray crystallography, which involves measuring how light waves scatter off of a crystal. However, X-ray crystallography only yields information about the average positions of many billions of atoms in the crystal, and not about individual atoms’ precise coordinates.

“It’s like taking an average of people on Earth,” Miao said. “Most people have a head, two eyes, a nose and two ears. But an image of the average person will still look different from you and me.”

Because X-ray crystallography doesn’t reveal the structure of a material on a per-atom basis, the technique can’t identify tiny imperfections in materials, such as the absence of a single atom. These imperfections, known as point defects, can weaken materials, which can be dangerous when the materials are components of machines like jet engines.

“Point defects are very important to modern science and technology,” Miao said.

Miao and his team used a technique known as scanning transmission electron microscopy, in which a beam of electrons smaller than the size of a hydrogen atom is scanned over a sample while the microscope measures how many electrons interact with the atoms at each scan position. The method reveals the atomic structure of materials because different arrangements of atoms cause electrons to interact in different ways.

However, scanning transmission electron microscopes only produce two-dimensional images. So, creating a 3-D picture requires scientists to scan the sample once, tilt it by a few degrees and re-scan it—repeating the process until the desired spatial resolution is achieved—before combining the data from each scan using a computer algorithm. The downside of this technique is that the repeated electron beam radiation can progressively damage the sample.
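A deliberately simplified, hypothetical illustration of this tilt-and-combine idea uses unfiltered back-projection on a toy 2-D sample (the actual study used far more sophisticated tomographic reconstruction; all details below are assumptions for illustration):

```python
import numpy as np

# Toy 2-D tilt-series reconstruction: two point-like "atoms" are projected
# onto a 1-D detector at many tilt angles, then each projection is smeared
# back along its rays and the smears are summed (unfiltered back-projection).

n = 64
truth = np.zeros((n, n))
truth[40, 25] = 1.0                 # first "atom"
truth[20, 44] = 1.0                 # second "atom"

yy, xx = np.mgrid[0:n, 0:n]
yc, xc = yy - n / 2, xx - n / 2

angles = np.deg2rad(np.arange(0, 180, 3))      # tilt in 3-degree steps
recon = np.zeros((n, n))
for theta in angles:
    # detector coordinate of every pixel at this tilt angle
    t = (xc * np.cos(theta) + yc * np.sin(theta) + n / 2).astype(int).clip(0, n - 1)
    proj = np.bincount(t.ravel(), weights=truth.ravel(), minlength=n)  # 1-D projection
    recon += proj[t]                                                   # back-project

row, col = divmod(int(np.argmax(recon)), n)
print(f"brightest reconstructed pixel: ({row}, {col})")
```

Each tilt adds an independent view, which is why the researchers needed 62 tilts: too few angles leaves streaking artifacts, while every extra pass risks more beam damage.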

Using a scanning transmission electron microscope at the Lawrence Berkeley National Laboratory’s Molecular Foundry, Miao and his colleagues analyzed a small piece of tungsten, an element used in incandescent light bulbs. As the sample was tilted 62 times, the researchers were able to slowly assemble a 3-D model of 3,769 atoms in the tip of the tungsten sample.

The experiment was time consuming because the researchers had to wait several minutes after each tilt for the setup to stabilize.

“Our measurements are so precise, and any vibrations—like a person walking by—can affect what we measure,” said Peter Ercius, a staff scientist at Lawrence Berkeley National Laboratory and an author of the paper.

The researchers compared the images from the first and last scans to verify that the tungsten had not been damaged by the radiation, thanks to the electron beam energy being kept below the radiation damage threshold of tungsten.

Miao and his team showed that the atoms in the tip of the tungsten sample were arranged in nine layers, the sixth of which contained a point defect. The researchers believe the defect was either a hole in an otherwise filled layer of atoms or one or more interloping atoms of a lighter element such as carbon.

Regardless of the nature of the point defect, the researchers’ ability to detect its presence is significant, demonstrating for the first time that the coordinates of individual atoms and point defects can be recorded in three dimensions.

“We made a big breakthrough,” Miao said.

Miao and his team plan to build on their results by studying how atoms are arranged in materials that possess magnetism or energy storage functions, which will help inform our understanding of the properties of these important materials at the most fundamental scale.

“I think this work will create a paradigm shift in how materials are characterized in the 21st century,” he said. “Point defects strongly influence a material’s properties and are discussed in many physics and materials science textbooks. Our results are the first experimental determination of a point defect inside a material in three dimensions.”

The study’s co-authors include Rui Xu, Chien-Chun Chen, Li Wu, Mary Scott, Matthias Bartels, Yongsoo Yang and Michael Sawaya, all of UCLA; as well as Colin Ophus of Lawrence Berkeley National Laboratory; Wolfgang Theis of the University of Birmingham; Hadi Ramezani-Dakhel and Hendrik Heinz of the University of Akron; and Laurence Marks of Northwestern University.

This work was primarily supported by the U.S. Department of Energy’s Office of Basic Energy Sciences (grant DE-FG02-13ER46943 and contract DE-AC02-05CH11231).

9.  An SDSU chemist has developed a technique to identify potential cancer drugs that are less likely to produce side effects.

A class of therapeutic drugs known as protein kinase inhibitors has in the past decade become a powerful weapon in the fight against various life-threatening diseases, including certain types of leukemia, lung cancer, kidney cancer and squamous cell cancer of the head and neck. One problem with these drugs, however, is that they often inhibit many different targets, which can lead to side effects and complications in therapeutic use. A recent study by San Diego State University chemist Jeffrey Gustafson has identified a new technique for improving the selectivity of these drugs and possibly decreasing unwanted side effects in the future.

Why are protein kinase–inhibiting drugs so unpredictable? The answer lies in their molecular makeup.

Many of these drug candidates possess examples of a phenomenon known as atropisomerism. To understand what this is, it’s helpful to understand a bit of the chemistry at work. Molecules can come in different forms that have exactly the same chemical formula and even the same bonds, just arranged differently. The different arrangements are mirror images of each other, with a left-handed and a right-handed arrangement. The molecules’ “handedness” is referred to as chirality. Atropisomerism is a form of chirality that arises when the spatial arrangement has a rotatable bond called an axis of chirality. Picture two non-identical paper snowflakes tethered together by a rigid stick.

Some axes of chirality are rigid, while others can freely spin about their axis. In the latter case, this means that at any given time, you could have one of two different “versions” of the same molecule.

Watershed treatment

As the name suggests, kinase inhibitors interrupt the function of kinases—a particular type of enzyme—and effectively shut down the activity of proteins that contribute to cancer.

“Kinase inhibition has been a watershed for cancer treatment,” said Gustafson, who attended SDSU as an undergraduate before earning his Ph.D. in organic chemistry from Yale University, then working there as a National Institutes of Health postdoctoral fellow in chemical biology.

“However, it’s really hard to inhibit a single kinase,” he explained. “The majority of compounds identified inhibit not just one but many kinases, and that can lead to a number of side effects.”

Many kinase inhibitors possess axes of chirality that are freely spinning. The problem is that because you can’t control which “arrangement” of the molecule is present at a given time, the unwanted version could have unintended consequences.

In practice, this means that when medicinal chemists discover a promising kinase inhibitor that exists as two interchanging arrangements, they actually have two different inhibitors. Each one can have quite different biological effects, and it’s difficult to know which version of the molecule actually targets the right protein.

“I think this has really been under-recognized in the field,” Gustafson said. “The field needs strategies to weed out these side effects.”

Applying the brakes

So that’s what Gustafson did in a recently published study. He and his colleagues synthesized atropisomeric compounds that target a particular family of kinases known as tyrosine kinases. To some of these compounds, the researchers added a single chlorine atom, which effectively served as a brake to keep the atropisomer from spinning around, locking the molecule into either a right-handed or a left-handed version.

When the researchers screened both the modified and unmodified versions against their target kinases, they found major differences in which kinases the different versions inhibited. The unmodified compound was like a shotgun blast, inhibiting a broad range of kinases. But the locked-in right-handed and left-handed versions were choosier.

“Just by locking them into one or another atropisomeric configuration, not only were they more selective, but they inhibited different kinases,” Gustafson explained.

If drug makers incorporated this technique into their early drug discovery process, he said, it would help identify which version of an atropisomeric compound actually targets the kinase they want to target, cutting the potential for side effects and helping to usher drugs past strict regulatory hurdles and into the hands of waiting patients.

11.  ‘Nanocubes’ Make PSA Test Over 100 Times More Sensitive

A new catalyst that improves the sensitivity of the standard PSA test more than 100-fold, pictured above, is made of palladium nanocubes coated with iridium. (Credit: Xiaohu Xia, Michigan Technological University)

Say you’ve been diagnosed with prostate cancer, the second-leading cause of cancer death in men. You opt for surgery to remove your prostate. Three months later, a prostate-specific antigen (PSA) test shows no prostate cells in your body. Everyone rejoices.

Until 18 months later, when another PSA test reveals that prostate cells have reappeared. What happened?

The first PSA test yielded what’s known as a false negative result. It did not detect the handful of cells that remained after surgery and later multiplied. Now a chemist at Michigan Technological University has made a discovery that could, among other things, slash the number of false negatives in PSA tests.

Xiaohu Xia and his team, including researchers from Louisiana State University and the University of Texas at Dallas, have developed a new catalyst that could make lab tests like the PSA much more sensitive. And it may even speed up reactions that neutralize toxic industrial chemicals before they enter lakes and streams.

A paper on the research, “Pd-Ir Core-Shell Nanocubes: A Type of Highly Efficient and Versatile Peroxidase Mimic,” was published online Sept. 3 in ACS Nano. In addition to Xia, the coauthors are graduate students Jingtuo Zhang, Jiabin Liu and Haihang Ye and undergraduate Erin McKenzie of Michigan Tech; Moon J. Kim and Ning Lu of the University of Texas at Dallas; and Ye Xu and Kushal Ghale of Louisiana State University. The LSU team conducted theoretical calculations, and the UT Dallas team contributed high-resolution electron microscopy images.

Their new catalyst mimics the action of similar biochemicals found in nature, called peroxidases. “In animals and plants, these peroxidases are important; for example, they get rid of hydrogen peroxide, which is harmful to the organism,” said Xia, an assistant professor of chemistry at Michigan Tech. In medicine, peroxidases have become powerful tools for accelerating chemical reactions in diagnostic tests; a peroxidase found in the horseradish root is commonly used in the standard PSA test.

However, these natural peroxidases have drawbacks. They can be difficult to extract and purify. “And, they are made of protein, which isn’t very stable,” Xia explained. “At high temperatures, they cook, like meat.”

“Moreover, their efficiency is just fair,” he added. “We wanted to develop a mimic peroxidase that was substantially more efficient than the natural peroxidase, which would lead to a more-sensitive PSA test.”

Their new catalyst, made from nanoscale cubes of palladium coated with a few layers of iridium atoms, does just that. PSA tests Xia’s team conducted using the palladium-iridium catalyst were 110 times more sensitive than tests completed with the conventional peroxidase.

“After surgery, it’s vital to detect a tiny amount of prostate antigen, because otherwise you can get a false negative and perhaps delay treatment for cancer,” said Xia. “Our ultimate goal is to further refine our system for use in clinical diagnostic laboratories.”

Xia hopes that his mimic peroxidase will someday save lives through earlier detection of cancer and other maladies. He also plans to explore other applications, including how it compares with horseradish peroxidase in other catalytic reactions: breaking down toxic industrial-waste products like phenols into harmless substances.

Finally, the team wants to better understand why its palladium-iridium catalyst works so well. “We know the iridium coating is the key,” Xia said. “We think it makes the surface sticky, so the chemical reagents bind to it better.”

12.  Using Proteomics To Understand How Genetic Mutations Rewire Cancer Cells

SAN JOSE, Calif.–(BUSINESS WIRE)–Thermo Fisher Scientific and the Biotech Research and Innovation Centre (BRIC) at the University of Copenhagen (UCPH) have shared results from two important scientific papers that advance understanding of how gene mutations drive cancer progression. The two landmark studies, published this week in the journal Cell, are some of the early results of the strategic collaboration between Thermo Fisher Scientific and the Linding Lab at BRIC, UCPH.

Using advanced Thermo Scientific Orbitrap Fusion mass spectrometry and next-generation sequencing technologies, researchers from the Universities of Copenhagen, Yale, Zurich, Rome and Tottori describe how specific cancer mutations target and damage the protein signaling networks within human cells on a global scale.

By developing advanced algorithms to integrate data from quantitative mass-spectrometry and next generation sequencing of tumor samples, the researchers have been able to uncover cancer-related changes to phosphorylation signaling networks. This new breakthrough allows researchers to identify the effects of mutations on the function of protein pathways in cancer for individual patients, even if those mutations are very rare.

Lead BRIC researcher Dr. Rune Linding said: “The identification of distinct changes within our tissues that could have the potential to help predict and treat cancer is a major step forward and we are confident that it can aid in the development of novel therapies and screening techniques.”

Since the human genome was decoded more than a decade ago, large-scale cancer genome studies have successfully identified gene mutations in individual patients and tumors. However, to develop improved cancer therapies, researchers need to explain and relate these genomic data to proteins, the targets of most pharmaceutical drugs. Creating this linkage provides powerful new insights into cancer biology and potential therapeutic approaches.

“The studies highlight the importance of integrating proteomics with genomics in future cancer studies and underscore the value of the broad technological expertise within Thermo Fisher,” said Ken Miller, vice president of research product marketing, life sciences mass spectrometry at Thermo Fisher. “It is becoming increasingly apparent that the genetic basis for each patient’s cancer is subtly, but importantly, different. This realization will inevitably lead to a need for tools to acquire and assess patient-specific information to develop highly personalized therapies with the potential for much greater efficacy. It is hoped that the novel approaches described in these studies, together with best-in-class enabling technologies such as the Orbitrap and Ion Torrent systems, will continue to improve our knowledge of cancer biology.”

The Biotech Research & Innovation Centre (BRIC) was established in 2003 by the Danish Ministry of Science, Technology and Innovation to form an elite centre in biomedical research.

The two studies were published online in advance and appear in print in the September 24 issue of Cell, a premier journal in the life and biological sciences. More information about the studies and links to media content are available online. The work was supported by the European Research Council (ERC), the Lundbeck Foundation, and the Human Frontier Science Program.

13.  Multi-Ancestry GWAS Uncovers a Dozen New Loci Linked to Blood Pressure

Sep 21, 2015

NEW YORK (GenomeWeb) – In Nature Genetics, an international team described a dozen new loci influencing blood pressure patterns across individuals from multiple populations — a set that overlaps with variants implicated in epigenetic features of blood and other tissues.

Through a multi-stage genome-wide association study that relied on genotyping information for as many as 320,251 individuals of East Asian, South Asian, and European descent, the researchers focused in on SNPs at 12 blood pressure-associated sites in the genome, including loci previously linked to cardiac or metabolic functions.

In particular, the team saw blood pressure-linked variants in and around genes contributing to vascular smooth muscle and renal function. And a large proportion of the associated SNPs — or variants in linkage disequilibrium with them — turned up at sites already implicated in control of DNA methylation.

“We note an effect of genome-wide-associated sentinel SNPs on DNA methylation for traits in addition to blood pressure, suggesting that DNA methylation might have a wider role in linking common genetic variation to multiple phenotypes,” the study’s authors wrote.

More than a billion people around the world are affected by high blood pressure, the team explained, a condition that elevates the risk of heart disease, heart attack, stroke, and chronic kidney disease.

Because it occurs at especially high rates in East Asian and South Asian populations, the investigators reasoned that it might be possible to find both ancestry-specific and trans-ancestral genetic associations with high blood pressure.

The team started by analyzing imputed and directly genotyped SNPs in 31,516 individuals of East Asian ancestry, 35,352 individuals with European ancestry, and 33,126 individuals of South Asian descent, searching for variants associated with systolic blood pressure, diastolic blood pressure, pulse pressure, mean arterial pressure, and hypertension.

Through analyses on each population individually and in a meta-analysis of individuals from all three populations, the researchers initially identified 630 loci with suspected ties to at least one of the five blood pressure traits considered.

They then compared the top SNP at each site against data on as many as 87,205 individuals tested for various blood pressure traits for the International Consortium on Blood Pressure GWAS, narrowing in on 19 loci with potential ties to blood pressure that were not described in the past.

The team confirmed blood pressure associations for SNPs at 12 of the new loci through testing on another 48,268 East Asians, 68,456 Europeans, and 16,328 South Asians.

The analysis also verified almost two dozen loci linked to blood pressure in the past and pointed to 17 sites in the genome with weaker ties to the traits of interest.

Variants at the 12 new loci seemed to have similar effects on the five traits in question, regardless of the population considered, while variants that first appeared to show population-specific effects in East Asians and Europeans did not pan out in replication testing.

By folding in linkage disequilibrium patterns for SNPs at the new blood pressure-associated sites, the researchers got a look at genes that fall near these linked SNPs — a collection that includes genes such as PDE3A, KCNK3, and PRDM6.

They also used these linkage patterns to look for overlap with DNA methylation-related SNPs, demonstrating that 28 of 35 SNPs at these loci seem to be linked to altered DNA methylation levels and related expression shifts in samples from thousands of Europeans or East Asians.

And the team saw similar effects in hundreds of cord blood samples subjected to methylation profiling, suggesting the effect is not simply a consequence of high blood pressure itself.

“The presence of these associations at an early stage of life, before substantial environmental exposure, lends support to the view that the sequence variants have a direct effect on DNA methylation and argues against reverse causation,” the study authors wrote.

14.  Elabela, A New Human Embryonic Stem Cell Growth Factor

September 20, 2015 by mburatov

When embryonic stem cell lines are made, they are traditionally grown on a layer of “feeder cells” that secrete growth factors that keep the embryonic stem cells (ESCs) from differentiating and drive them to grow. These feeder cells are usually irradiated mouse fibroblasts that coat the culture dish, but do not divide. Mouse ESCs can be grown without feeder cells if the growth factor LIF is provided in the medium. LIF, however, is not the growth factor required by human ESCs, and therefore, designing culture media for human ESCs to help them grow without feeder cells has proven more difficult.

Having said that, several laboratories have designed media that can be used to derive human embryonic stem cells without feeder cells. Such a procedure is very important if the cells are to be used for therapeutic purposes, since animal cells can harbor difficult-to-detect viruses and can transfer unusual cell-surface sugars to human ESCs in culture. These unusual sugars can elicit a strong immune response, and for this reason ESCs must be cultivated or derived under feeder-free conditions. However, to design good feeder-free culture media, we must know more about the growth factors required by ESCs.

To that end, Bruno Reversade from The Institute of Molecular and Cell Biology in Singapore and others have identified a new growth factor that human ESCs secrete themselves. This protein, ELABELA (ELA), was first identified as a signal for heart development. However, Reversade’s laboratory has discovered that ELA is also abundantly secreted by human ESCs and is required for human ESCs to maintain their ability to self-renew.

Reversade and others deleted the ELA gene with the CRISPR/Cas9 system, and they also knocked down the gene’s expression in other cells with small interfering RNAs. Alternatively, they incubated human ESCs with antibodies against ELA, which neutralized ELA and prevented it from binding to the cell surface. However ELA was inhibited, the results were the same: reduced ESC growth, increased cell death, and loss of pluripotency.

How does ELA signal to cells to grow? Global signaling studies of growing human ESCs showed that ELA activates the PI3K/AKT/mTORC1 signaling pathway, which has been shown in other work to be required for cell survival. By activating this pathway, ELA drives human ESCs through cell-cycle progression, activates protein synthesis, and inhibits stress-induced apoptosis.

Interestingly, INSULIN and ELA have partially overlapping functions in human ESC culture medium, but only ELA seems to prime human ESCs toward the endoderm lineage. In the heart, ELA binds to the Apelin receptor APLNR. This receptor, however, is not expressed in human ESCs, which suggests that another receptor, whose identity remains unknown at the moment, binds ELA in human ESCs.

Thus ELA, acting through an as-yet-unidentified cell-surface receptor, is an endogenous secreted growth factor that human ESCs use to maintain self-renewal.

This paper was published in the journal Cell Stem Cell.

15.  Multiwavelength TIRF Microscopy Enables Insight into Actin Filaments

Researchers at the University of California, San Francisco (UCSF) are combining multiple laser excitation wavelengths in total internal reflection fluorescence (TIRF) microscopy to investigate the binding dynamics of individual actin filaments.


TIRF microscopy provides a unique method of imaging isolated molecules and complexes in vitro. Additionally, the use of sensitive, low-noise cameras enables researchers to study this behavior in real time. A new plug-and-play method of combining several fiber-delivered, digitally modulated lasers into a single instrument, such as a TIRF microscope, now enables multiple labeled proteins to be imaged pseudosimultaneously at high frame rates. This article explores how multiwavelength excitation is being combined with TIRF microscopy in the laboratory of Dr. Dyche Mullins, a professor at UCSF, and how it’s being used to gain new insights into complex biochemical interactions that control the stability and function of actin filaments.

TIRF microscopy in single-filament studies

The Mullins Lab, located at UCSF’s Mission Bay campus, is widely recognized as a leading authority on the study of actin filaments. The protein filaments are fundamental to many processes in virtually every eukaryotic cell — they act as structural elements that enable movement of internal cargoes, amoeboid cell migration, cell division, etc. With these filaments playing so many different roles, it is not surprising that their combination of growth, branching, aggregation and movement involves many subtle control options, which are mediated by a range of different proteins. Sam Lord, the Mullins Lab’s microscope specialist, said, “One area of our research is studying how various proteins bind to actin filaments to enable aggregation, branching and other actions, and more specifically, how yet another set of proteins modulates these binding processes. Obviously, we do bulk studies in a cuvette that reveal overall kinetic data about these binding processes, but we also want to image these processes in real time to study the structural biochemistry.” In order to do so, the lab uses TIRF microscopy to observe single actin filaments.

This process involves excitation light that is introduced into the sample region through either a glass slide or a cover slip. The microscope’s optics are configured so that the light hits the glass/sample interface beyond the critical angle, meaning that all of the light will undergo total internal reflection (TIR). However, even with TIR, some of the light’s electric field, called the evanescent wave, penetrates into the sample by an incredibly short distance — typically around 100 nm — beyond the interface. This means that TIRF microscopy can be used to selectively excite fluorescence in molecules and complexes that are adhered to the interface. However, because the light does not penetrate into the bulk (i.e., background) sample region, this methodology will not excite fluorescence from the huge backdrop of molecules freely floating within this medium.
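The optics described above follow directly from Snell’s law: total internal reflection requires sin θ > n2/n1, and the 1/e decay length of the evanescent field is d = λ/(4π·√(n1²sin²θ − n2²)). A minimal Python sketch of these two relations (the refractive indices, wavelength, and incidence angle below are illustrative values, not the configuration of any instrument described here):

```python
import math

def critical_angle_deg(n1, n2):
    """Critical angle for total internal reflection at an n1 -> n2 interface (n1 > n2)."""
    return math.degrees(math.asin(n2 / n1))

def penetration_depth_nm(wavelength_nm, n1, n2, theta_deg):
    """1/e decay length of the evanescent field beyond the interface."""
    s = n1 * math.sin(math.radians(theta_deg))
    if s <= n2:
        raise ValueError("incidence angle must exceed the critical angle")
    return wavelength_nm / (4 * math.pi * math.sqrt(s**2 - n2**2))

# Glass coverslip against an aqueous sample, 488-nm excitation
n_glass, n_water = 1.515, 1.33
theta_c = critical_angle_deg(n_glass, n_water)          # about 61 degrees
d = penetration_depth_nm(488, n_glass, n_water, 65.0)   # roughly 114 nm
```

For a glass/water interface the critical angle comes out near 61°, and at 65° incidence the 488-nm excitation decays within roughly 114 nm, consistent with the ~100-nm figure quoted above.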

TIRF microscopy is thus a 3D-resolved imaging technique. Its X-Y resolution is limited only by diffraction and/or the camera resolution, but the Z-axis sampling depth is much smaller than the diffraction limit. If there is sufficient signal for fast frame acquisition speeds, the important fourth dimension — time — enables dynamic processes, such as actin filament-protein binding, to be observed on a single filament or on a network of filaments, in real time.

In principle, both laser and nonlaser light sources may be used for fluorescence excitation in such TIRF-based applications. However, for experiments with naturally low signal levels, such as single-molecule monitoring, a laser beam’s extreme brightness is a critical advantage. In particular, a laser’s unique spatial brightness means that it is relatively simple to collimate and subsequently focus the beam into the sample with a narrow range of incidence angles, avoiding excitation of the bulk sample.

Through-objective TIRF microscopy

All TIRF microscope setups are based on one of two basic approaches: through-objective lens geometry or the prism-based method. In the former approach, light is directed in an off-axis geometry through an oil-immersion microscope objective so that the angle of incidence at the coverslip/sample interface is greater than the critical angle, as is shown schematically in Figure 1.

Figure 1.
 In TIRF microscopy, excitation light incident beyond the critical angle is completely reflected. The evanescent field at the refractive interface penetrates into the sample by about 100 nm, causing selective excitation of molecules and complexes adhered to this interface. TIRF microscopes are available with a choice of either through-objective excitation or prism excitation options.
In the prism-based method, the orientation of the sample is reversed with respect to the imaging objective. A light beam is introduced to the sample through a prism attached to the cover slip; the geometry of the prism ensures that the incidence angle at the sample is greater than the critical angle.

Depending on the type of experiment being performed, there are both advantages and disadvantages to each of the above methods. For example, the prism method limits physical access to the sample. As Lord explained, the Mullins Lab uses a Nikon microscope in the through-objective configuration with a very high numerical aperture (NA = 1.49) for several reasons. “For single-molecule studies, fluorescence signal strength is always a major challenge, particularly since we are following processes that need fast frame rates. So, we need a high-NA objective with a small working distance to maximize light collection efficiency. These objectives require a coverglass of precise thickness, with the sample near the top of the coverslip, to minimize aberrations.” Lord also stated that caution must be taken not to introduce scattering and other losses from viewing fluorescence through the bulk of the sample.

Multiple, simultaneous laser wavelengths

As has been noted, the mechanisms controlling the binding of regulatory proteins to actin filaments are quite complex. To better understand these processes, the Mullins Lab increasingly has been using sophisticated, multiwavelength TIRF-based experiments. In order to image multiple fluorophores, Lord explained, “We can use either multiple sequenced lasers or a scope equipped with multiple cameras — we have setups for both arrangements.” He continued, “Multiple excitation wavelengths that sequence at high rates enable us to selectively image multiple, differently labeled targets using a microscope equipped with a single high-sensitivity camera, and ensures near-perfect image registration.”

When using multiple lasers, the two technical challenges are to perfectly coalign the lasers into the microscope objective and then to be able to switch between different wavelengths. In order to follow fast binding processes in real time, researchers typically must switch wavelengths between alternate camera frames to build up pseudosimultaneous (i.e., interleaved) videos at two or sometimes three laser wavelengths. This switching must be performed with no undesirable dead time (i.e., shifts in the beam path) and without using mechanical shutters or a complex and costly approach, such as an acousto-optic tunable filter.

Lord notes, “As recently as five years ago, we simply didn’t have low-cost options to conduct single molecule studies using multiple laser wavelengths pseudosimultaneously at the requisite frame rates (30 fps) in order to follow critical binding processes.” He added, “Digitally controllable diode or solid-state lasers, hardware sequencing electronics and quad-band optical filters make it possible to achieve nearly simultaneous multicolor imaging with a single camera.”

In 2014, the lab acquired several digitally modulatable smart lasers to enable multiwavelength TIRF microscopy. These lasers included Coherent’s fiber-pigtailed OBIS FP modules that operate at 488, 561 and 640 nm. The lab also acquired the OBIS Galaxy, which enables simple plug-and-play combining of up to eight fiber-coupled lasers into one, single-mode output fiber. As was detailed by Coherent’s Dan Callen and Matthias Schulze in BioPhotonics’ November 2014 issue (“Laser Combiner Enables Scanning Fluorescence Endoscopy”), this passive module enables lasers to be added or removed (i.e., hot-swapped) from any fiber-coupled instrument or setup in a few minutes or less via standard fiber connectors, such as FC/UPC and FC/APC connectors.

Figure 2. The OBIS Galaxy (shown with the top cover removed) allows plug-and-play combining of up to eight separate fiber coupled lasers into a single output fiber. Courtesy of Sam Lord.
The timing hardware setup at the Mullins Lab is very simple in design because these smart lasers support direct digital modulation. In each experiment, the frame rate is set by the microscope’s high-sensitivity camera, an Andor DU897. The camera’s TTL output trigger pulses are processed in either a programmable Arduino board or an ESio controller, which then directs TTL pulses to fire one of the three lasers without any hardware or software delays. Alternating wavelengths typically are used in most experiments, although any sequence of wavelength frames can easily be programmed using the Arduino and Micro-Manager software.1
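The interleaving scheme described above amounts to simple bookkeeping: each camera trigger pulse fires the next laser in a repeating sequence, and the recorded frames are later demultiplexed into one video per wavelength, each running at a fraction of the camera frame rate. A hedged Python sketch of that logic (the real system implements it in Arduino/ESio firmware; the function names and frame counts here are illustrative):

```python
from itertools import cycle

def assign_channels(n_frames, wavelengths=(488, 561, 640)):
    """Map each camera trigger pulse (frame) to the laser fired on that pulse."""
    seq = cycle(wavelengths)
    return [next(seq) for _ in range(n_frames)]

def demultiplex(frames, channel_map):
    """Split an interleaved frame stream into per-wavelength videos."""
    videos = {}
    for frame, wavelength in zip(frames, channel_map):
        videos.setdefault(wavelength, []).append(frame)
    return videos

channel_map = assign_channels(6)            # [488, 561, 640, 488, 561, 640]
videos = demultiplex(list(range(6)), channel_map)
# Each channel's video contains every third camera frame, so a camera running
# at 90 fps yields three pseudosimultaneous 30-fps movies.
```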

According to Lord, the flexibility of this arrangement supports future experimental setups that have even greater levels of complexity. In particular, he added, “We may well add a 405-nm laser option in the near future. If/when this arrives, we can simply plug it in and we are ready to go.”

Investigating modulation of actin binding processes

In the team’s work on the binding of actin filaments, this flexible TIRF setup enables the Mullins Lab to conduct experiments with several different approaches. For example, in typical two-wavelength experiments, the actin filament is labeled with one fluorophore, and the protein of interest is labeled with another fluorophore. The protein fluorophore only appears in the TIRF-produced images if/when it binds to the actin sitting on the cover slip. One use of the third wavelength is to image a second protein, which is labeled with a different fluorophore. The image sequences then may reveal, for example, whether the proteins are interspersed at different sites on the filament, or whether the second protein promotes filament growth or branching from a new site. Or, it may reveal that the second protein competitively displaces the first.

In a recently published study,2 Mullins Lab researchers used their multilaser TIRF setup to investigate the details of control mechanisms associated with the binding of tropomyosins to actin filaments. Tropomyosins are coiled-coil proteins whose known functions are to bind actin filaments and thereby regulate multiple cytoskeletal functions — including actin network dynamics near the leading edge of motile cells.

Mullins explained, “The binding of tropomyosins to actin filaments is known to be fundamentally important in actin dynamics. But, we do not yet fully understand how this binding is regulated, especially near the leading edge of migrating cells. Why, for example, are filaments in the lamellum coated with tropomyosin while filaments in the adjacent lamellipod are not?” (Lamellum and lamellipod are distinct, actin-based substructures involved in cell migration.) He went on to state that, prior to his team’s latest studies, previous research demonstrated that tropomyosins inhibit actin nucleation by the Arp2/3 protein complex and that this, in turn, prevented filament severing by the protein cofilin.3,4 “So, we have recently used TIRF and other methods to investigate if and how the Arp2/3 complex and cofilin in turn modulate the binding of tropomyosins to actin filaments,” he said.

Figure 3.
 TIRF images showing Tm1A binding preferentially to the pointed end of single actin filaments. The red signal is from Cy5 labeled Tm1A fluorescence excited at 640 nm, and the green signal is due to Alexa 488 labeled actin excited at 488 nm. Courtesy of J.Y. Hsiao, L.M. Goins, N.A. Petek, R.D. Mullins.
The team members studied these interactions in the specific case of nonmuscle Drosophila tropomyosin protein, Tm1A. They also compared some of these interactions in Tm1A to the same interactions in rabbit skeletal muscle tropomyosin, as other researchers previously have found that mammalian skeletal muscle tropomyosin is the least-effective Arp2/3 inhibitor.2

Figure 3 shows data from dual-wavelength TIRF excitation applied to single filaments. The data show that Tm1A preferentially binds near the pointed end of actin filaments. By comparing similar data collected under different experimental conditions, the researchers showed that pointed-end binding depends on the nucleotide state of the actin and on the Tm1A concentration.

Although a complete evaluation of all of the research’s results, conclusions and wider implications falls outside the scope of this article, Mullins does summarize some of the key points. “Binding of cytoskeletal tropomyosin to actin filaments turns out to be more complicated than previously appreciated. Both nucleation and spreading of tropomyosin are strongly influenced by the conformation of the actin filament and the presence of other regulatory proteins.” Mullins added that, based on TIRF-produced images and other collected data, “We have been able to propose a model where the cooperation of the severing activity of cofilin and tropomyosin binding helps establish the border between the lamellipod and lamellum.” The role of cofilin in the model referenced by Mullins is shown in Figure 4.

Figure 4.
 These images summarize the role of cofilin in the model proposed by Hsiao et al. [ref]. The branched actin network on the left shows the situation in the absence of cofilin, where tropomyosin binding is blocked by Arp2/3 branches. The branched actin network on the right illustrates that in the presence of cofilin, new pointed ends are created, which allows tropomyosin to bind. Once tropomyosin is bound, it protects the actin filaments from further cofilin severing, possibly resulting in the transition from the lamellipod to the lamellum. Courtesy of J.Y. Hsiao, L.M. Goins, N.A. Petek, R.D. Mullins.
In summation, TIRF microscopy is a well-established technique for imaging single molecular structures and protein complexes. This method also enables their respective dynamics to be observed in real time. By providing a method to rapidly switch between two or more excitation wavelengths, the latest lasers and laser-combining technologies are now enabling researchers to perform TIRF microscopy experiments with a greater number of separate labels. This capability is delivering unique insights into important and multifaceted processes in the study of cell biology.

Meet the author

Dan Callen is a product manager at Coherent Inc. in Santa Clara, Calif.


1. A.D. Edelstein et al. (2014). Advanced methods of microscope control using μManager software. J Biol Methods, Vol. 1, No. 2, e10.

2. J.Y. Hsiao et al. (2015). Arp2/3 complex and cofilin modulate binding of tropomyosin to branched actin networks. Curr Biol, pp. 1-10.

3. L. Blanchoin et al. (2001). Inhibition of the Arp2/3 complex-nucleated actin polymerization and branch formation by tropomyosin. Curr Biol, Vol. 11, No. 16, pp. 1300-1304.

4. J.H. Iwasa and R.D. Mullins (2007). Spatial and temporal relationships between actin-filament nucleation, capping and disassembly. Curr Biol, Vol. 17, No. 5, pp. 395-406.

18.  Using QCLs for MIR-Based Spectral Imaging — Applications in Tissue Pathology

A quantum cascade laser (QCL) microscope allows for fast data acquisition, real-time chemical imaging and the ability to collect only spectral frequencies of interest. Due to their high-quality, highly tunable illumination characteristics and excellent signal-to-noise performance, QCLs are paving the way for the next generation of mid-infrared (MIR) imaging methodologies.


H. Sreedhar*1, V. Varma*2, A. Graham3, Z. Richards1, F. Gambacorata4, A. Bhatt1,
P. Nguyen1, K. Meinke1, L. Nonn1, G. Guzman1, E. Fotheringham5, M. Weida5,
D. Arnone5, B. Mohar5, J. Rowlette5
Real-time, MIR chemical imaging microscopes could soon become powerful frontline screening tools for practicing pathologists. The ability to see differences in the biochemical makeup across a tissue sample greatly enhances a practitioner’s ability to detect early stages of disease or disease variants. Today, this is accomplished much as it was 100 years ago — through the use of specially formulated stains and dyes in combination with white light microscopy. A new MIR, QCL-based microscope from Daylight Solutions enables real-time, nondestructive biochemical imaging of tissues without the need to perturb the sample with chemical or heat treatments, thus preserving the sample for follow-on fluorescence tagging, histochemical staining or other “omics” testing within the workflow.
MIR chemical imaging is a well-established absorbance spectroscopy technique; it senses the relative amount of light that molecules absorb due to their unique vibrational resonances falling within the MIR portion of the electromagnetic spectrum (i.e., wavelengths from approximately 2 to 15 µm). This absorption can be detected with a variety of MIR detector types and can provide detailed information about the sample’s chemical composition.
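Like all absorbance spectroscopy, the measurement behind each MIR image pixel follows the Beer-Lambert law, A = log10(I0/I) = εcl, which links the transmitted intensity at a given wavelength to the concentration of the absorbing species. A minimal sketch of that relation (the molar absorptivity and path length below are placeholder numbers for illustration, not values for any real tissue sample):

```python
import math

def absorbance(incident, transmitted):
    """Beer-Lambert absorbance A = log10(I0 / I)."""
    return math.log10(incident / transmitted)

def concentration(A, epsilon, path_length_cm):
    """Concentration (mol/L) given molar absorptivity epsilon (L/mol/cm)."""
    return A / (epsilon * path_length_cm)

A = absorbance(100.0, 10.0)     # 90% of the light absorbed gives A = 1.0
c = concentration(A, epsilon=2000.0, path_length_cm=0.0005)  # 5-um section
```

Repeating this per-pixel calculation at each tunable QCL wavelength is what turns a stack of transmission images into a chemical map of the tissue.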

The most common instrument for this type of measurement is known as a Fourier transform infrared (FTIR) spectrometer. FTIR systems use a broadband MIR light source, known as a globar, to illuminate a sample; the absorption spectrum is generated by the use of interferometry. Throughout the past decade, FTIR systems have incorporated linear arrays and 2D focal plane arrays (FPAs) in a microscope configuration to enable a technique known as chemical imaging.

19.  Inner Ear Undertakers

Support cells in the inner ear respond differently to two drugs that kill hair cells.

By Kerry Grens | September 1, 2015

The paper
E.L. Monzack et al., “Live imaging the phagocytic activity of inner ear supporting cells in response to hair cell death,” Cell Death Differ, doi:10.1038/cdd.2015.48, 2015.

Killer drugs
A number of commonly used medications can cause hearing loss by killing off cochlear hair cells, which translate sound waves into neural activity. To understand how these cells die, Lisa Cunningham and Elyssa Monzack of the National Institute on Deafness and Other Communication Disorders and colleagues turned to the utricle, a vestibular inner-ear structure involved in balance. Its hair cells are very similar to those of the cochlea, which are notoriously resistant to culturing when mature.

Body bags
The team developed a method to watch hair cells of whole mouse utricles die in real time after exposure to the chemotherapy drug cisplatin or the antibiotic neomycin. In response to the latter, supporting cells, glia-like neighbors of hair cells, appeared to form a phagosome around the corpses and engulf them. “You can see two, three, sometimes four supporting cells advancing simultaneously on that hair cell corpse,” says Cunningham—which suggests that the dying cell is giving off a specific and local signal.

Spilled guts
In contrast, cisplatin-induced hair cell death provoked hardly any phagocytic reaction from supporting cells, about half of which themselves succumbed. Cunningham says this could have clinical implications if dead hair cells then spill their cytoplasmic contents into the tissue, which can result in an immune response that can cause even further damage.

Distress call
Mark Warchol of Washington University in St. Louis says it will be important to identify the signal supporting cells are responding to after neomycin treatment. “There’s some molecular signal by which the hair cell causes [supporting cells] to execute this process. And with cisplatin, they’re just not capable of doing it.”



20.  Inner Ear Cartography

Scientists map the position of cells within the organ of Corti.

By Ruth Williams | September 1, 2015

Age-related hearing loss caused by damage to the sensory hair cells within the cochlea is extremely common, but studying the inner ear is tough. “It’s in the densest bone in the body, so you don’t have access,” says John Brigande of Oregon Health and Science University in Portland. Even if you can extract cells, he says, “there are so darn few of them.”

Despite these technical difficulties, researchers have gleaned gene-expression information about different cell types within the organ of Corti—home to the sensory cells within the cochlea. But “it’s not only important to know what a cell expresses,” says Robert Durruthy-Durruthy, a postdoc in the Stanford University lab of Stefan Heller. “It’s also important to know where it can be found within a tissue.”

To this end, Durruthy-Durruthy, Heller, and postdoc Jörg Waldhaus have derived a 2-D map of organ of Corti cells from neonatal mice. First, the team sorted all cell types across the medial-to-lateral axis (or width) of the organ based on marker gene expression. The approximately 900 sorted cells, representing nine cell types, were then each quantitatively analyzed for the expression of 192 selected genes. Computational analysis of these expression data then enabled reconstruction of the cells’ positions along the organ’s apical-to-basal (length) and medial-to-lateral axes. In principle, the technique, which harnesses gene-expression information to determine cells’ spatial organization, could be applied to generate 2-D maps of any complex tissue, says Durruthy-Durruthy.
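
The reconstruction step can be illustrated with a toy computation. The study's actual pipeline (single-cell expression profiling of 192 genes followed by computational mapping) is more elaborate; the sketch below, with invented gradients and noise levels, only shows how a dominant expression gradient lets a principal-component projection recover cells' hidden spatial order:

```python
import numpy as np

# Toy illustration (ours, not the Stanford pipeline): cells whose hidden
# position along an axis drives smooth expression gradients can be reordered
# from expression alone. Cell (900) and gene (192) counts follow the article;
# the gradient shapes and noise level are invented.
rng = np.random.default_rng(0)
n_cells, n_genes = 900, 192

true_position = np.linspace(0.0, 1.0, n_cells)          # hidden spatial axis
slopes = rng.normal(size=n_genes)                       # per-gene gradient
expression = np.outer(true_position, slopes)
expression += 0.1 * rng.normal(size=expression.shape)   # measurement noise

shuffle = rng.permutation(n_cells)                      # cells arrive unordered
shuffled_expr = expression[shuffle]

# First principal component via SVD of the centered matrix.
centered = shuffled_expr - shuffled_expr.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = centered @ vt[0]

# PC1 tracks the spatial axis (up to a sign flip), so sorting cells by it
# recovers their order along the tissue.
recovered = np.abs(np.corrcoef(pc1, true_position[shuffle])[0, 1])
print(round(recovered, 2))
```

Sorting cells by `pc1` reproduces their positional order up to an overall flip, which is the essence of inferring spatial organization from expression alone.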

Within the mammalian cochlea, apical cells retain regenerative capacity for a few weeks after birth, but basal cells do not. “Spatial mapping allows us to get at the differences [between these cells],” says Brigande, and that could ultimately highlight possible ways to reinstate regeneration in the adult ear. (Cell Reports, 11:1385-99, 2015)

  FROM ORGAN TO SINGLE CELLS: To build a map of cells within the organ of Corti—where sound is translated to neural activity—scientists divide the cochlea in two. Each half of the organ of Corti is then broken up into its constituent cells, which comprise nine cell types (represented by the nine colors) spanning the organ’s medial-to-lateral axis.

21.  Resveratrol Stabilizes Amyloid in Alzheimer’s

Pauline Anderson

September 17, 2015

High doses of purified resveratrol, a polyphenol found in some foods, appear to stabilize levels of amyloid beta (Aβ) in cerebrospinal fluid (CSF) and in plasma in patients with mild to moderate Alzheimer disease (AD), and are safe and well tolerated, a new phase 2 study has shown.

Although it is too soon to start recommending resveratrol supplements to patients, the research indicates that this compound is safe and promising, said lead author R. Scott Turner, MD, PhD, professor of neurology and director of the Memory Disorders Program at Georgetown University Medical Center, Washington, DC, one of 21 medical centers across the United States that participated in the study.

“It seems to have some interesting effects, enough to justify further research into this strategy,” he told Medscape Medical News.

The study was published online September 11 as an Open Access article in Neurology.

Natural Compound

Resveratrol is a naturally occurring compound found in red grapes, red wine, dark chocolate, and some other foods, and is widely available as a supplement.

It is believed that resveratrol promotes resilience to stress, as its levels increase in plants exposed to severe cold or to fungus, said Dr Turner. Animal research suggests that resveratrol may affect sirtuins, proteins that are activated by calorie restriction, a form of mild stress.

The study included 119 patients randomly assigned to either high doses of pure synthetic pharmaceutical grade resveratrol that is not available commercially (n = 64) or placebo (n = 55). The resveratrol used in the study was introduced at a dose of 500 mg a day and was increased every 3 months, so that by the end of the 1-year study, subjects were taking 2000 mg a day.
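
For reference, the escalation schedule described above (500 mg/day at the start, stepping up by 500 mg every 13 weeks, ending at 2000 mg/day) can be written as a small lookup; the helper name is ours:

```python
def resveratrol_dose_mg(week):
    """Daily dose (mg) under the trial's escalation schedule: start at
    500 mg/day, step up by 500 mg every 13 weeks, capping at 2000 mg/day
    (1000 mg twice daily) for the final quarter of the 52-week study."""
    step = min(week // 13, 3)      # 0..3 completed escalation steps
    return 500 * (1 + step)

# Dose over the year: 500 -> 1000 -> 1500 -> 2000 mg/day
print([resveratrol_dose_mg(w) for w in (0, 13, 26, 39, 52)])
```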

Results showed that at 1 year, the treated group’s levels of Aβ40 in CSF declined only slightly, from 6574 to 6513 ng/mL, whereas in the placebo group these levels fell from 6560 to 5622 ng/mL, a statistically significant difference at week 52 (P = .002).

This difference was also found in secondary analyses of study completers, in the mild dementia subgroup, and in APOE4 carriers and noncarriers.

The treated group’s Aβ40 levels in plasma declined from 163 to 153 ng/mL, whereas in the placebo group these levels fell from 165 to 132 ng/mL, a statistically significant difference (P = .024).

“We can’t prove efficacy from this trial, but we’re looking for some movement in biomarkers, and we actually found that,” which is promising, said Dr Turner. “The major movement we found was in amyloid proteins in blood and CSF that were stabilized by resveratrol treatment compared to placebo, where it trended downhill, which is what happens with Alzheimer’s disease.”

This downhill trend could signal more amyloid being deposited into the brain. In contrast, the finding that resveratrol seemed to stabilize Aβ40 in the CSF and plasma suggests the drug was able to penetrate the blood–brain barrier.

Unfortunately, said Dr Turner, the study could not fund amyloid positron emission tomography scans, which might have shed more light on the Aβ status of subjects.

Although there were no significant effects of the treatment on other amyloid biomarkers, including CSF and plasma Aβ42, trends were similar to the findings with Aβ40.

There was no difference in CSF tau. There was a trend toward an increase in CSF phospho-tau 181 with treatment (P = .08) and in secondary analysis of mild dementia (P = .047).

As for brain volume determined through magnetic resonance imaging (MRI), results showed that volumes declined more in the treatment group (going from 866 to 839 mL) than in the placebo group (going from 850 to 840 mL). This result was “mysterious” and “unexpected,” said Dr Turner.

However, he noted that the same effect has been reported in other AD trials, including those investigating immunotherapy. “The working hypothesis is that by treating AD, we are also decreasing the amount of inflammation and swelling in the brain.”

The study showed no significant effects on the mini mental state exam or on other clinical scales, but the researchers note that the phase 2 trial was not powered to detect differences in clinical outcomes.

However, they did find that the activities of daily living scale declined less in the resveratrol group than in the placebo group. “That’s also promising, because even with this phase 2, we are seeing what we think might be a clinical benefit,” said Dr Turner.

A total of 657 adverse events were reported (355 in the treatment and 302 in the placebo groups), most of which were mild. The most common adverse events were gastrointestinal-related and included nausea and diarrhea.

Weight Loss

The placebo group gained about 1 pound of body weight, whereas the treated group lost almost 2 pounds. At the end of the study, the mean body mass index in the placebo group was 26.1, and in the treated group, it was 25.4.

“The weight loss is concerning, because Alzheimer disease itself causes weight loss, and we don’t want people to continue to lose weight,” said Dr Turner.

It is not clear whether the weight loss was a result of the adverse effects of diarrhea, nausea, and so on, or because of some metabolic effect.

Interestingly, six of the seven new neoplasms seen in study participants occurred in those taking placebo. This is of great interest to cancer researchers, said Dr Turner, adding that resveratrol and similar compounds are being tested in many age-related disorders, including diabetes and neurodegenerative disorders, as well as AD and cancer.

None of the 36 serious adverse events (19 on the drug and 17 on placebo), including three deaths, were deemed to be related to treatment.

Commenting on this study for Medscape Medical News, James Hendrix, PhD, director, Global Science Initiatives, Alzheimer’s Association, said that although the finding that resveratrol might stabilize Aβ40 is encouraging, the study needs to be followed up with a larger and longer phase 3 trial.

“The main focus of this study, and the main question it addressed, was whether a dose at such a high level is safe, and with the exception of some [gastrointestinal] discomfort for some people, it appears to be mostly safe.”

Dr Hendrix noted that the high dose used in the study is equivalent to the resveratrol content of about 1000 bottles of red wine.

He pointed out that the study was relatively small, with 56 subjects completing the study in the treatment group, and only 48 in the placebo group.

The research was supported by a grant from the National Institute on Aging. Dr Turner reports no personal financial interests related to the study. Dr Hendrix is an employee of the Alzheimer’s Association, which has funded resveratrol grants in the past, but did not fund this study.

Neurology. Published online September 11, 2015. Full text

A randomized, double-blind, placebo-controlled trial of resveratrol for Alzheimer disease. ABSTRACT. Objective: A randomized, placebo-controlled, double-blind, multicenter 52-week phase 2 trial of resveratrol in individuals with mild to moderate Alzheimer disease (AD) examined its safety and tolerability and effects on biomarker (plasma Aβ40 and Aβ42; CSF Aβ40, Aβ42, tau, and phospho-tau 181) and volumetric MRI outcomes (primary outcomes) and clinical outcomes (secondary outcomes). Methods: Participants (n = 119) were randomized to placebo or resveratrol 500 mg orally once daily (with dose escalation by 500-mg increments every 13 weeks, ending with 1,000 mg twice daily). Brain MRI and CSF collection were performed at baseline and after completion of treatment. Detailed pharmacokinetics were performed on a subset (n = 15) at baseline and weeks 13, 26, 39, and 52. Results: Resveratrol and its major metabolites were measurable in plasma and CSF. The most common adverse events were nausea, diarrhea, and weight loss. CSF Aβ40 and plasma Aβ40 levels declined more in the placebo group than the resveratrol-treated group, resulting in a significant difference at week 52. Brain volume loss was increased by resveratrol treatment compared to placebo. Conclusions: Resveratrol was safe and well-tolerated. Resveratrol and its major metabolites penetrated the blood–brain barrier to have CNS effects. Further studies are required to interpret the biomarker changes associated with resveratrol treatment. Classification of evidence: This study provides Class II evidence that for patients with AD, resveratrol is safe, well-tolerated, and alters some AD biomarker trajectories. The study is rated Class II because more than 2 primary outcomes were designated. Neurology® 2015;85:1–9

Caloric restriction prevents aging-dependent phenotypes1 and activates sirtuins (including SIRT1), a highly conserved family of deacetylases that are regulated by NAD+/NADH and thus link energy metabolism to gene expression.2 SIRT1 substrates include FOXO and PGC-1α.3 A screen of SIRT1 activators identified resveratrol (trans-3,4′,5-trihydroxystilbene) as a potent compound.4 Similar to caloric restriction,5,6 resveratrol decreases aging-dependent cognitive decline and pathology in Alzheimer disease (AD) animal models.7,8

Resveratrol is under investigation to prevent age-related disorders including cancer, diabetes mellitus, and neurodegeneration.4,9–12 Due to its low bioavailability but high bioactivity,13,14 we increased the dose to the maximal amount considered safe and well-tolerated for this study.15 We conducted a randomized, placebo-controlled, double-blind, multicenter 52-week phase 2 trial of resveratrol in individuals with mild to moderate AD. The primary objectives were to (1) assess the safety and tolerability of resveratrol; (2) assess its effect on plasma and CSF Aβ42 and Aβ40, CSF tau and phospho-tau 181, and volumetric MRI; and (3) examine pharmacokinetics. The secondary objectives were to (1) explore the effects of resveratrol on cognitive, functional, and behavioral outcomes; (2) examine the influence of APOE genotype; and (3) determine whether resveratrol affects insulin and glucose metabolism. We hypothesized that resveratrol would alter AD biomarker trajectories.

RESULTS A total of 179 participants were screened, of whom 60 were not randomized (50 screen-failed and 10 withdrew consent). Participants (119) were randomized as shown (figure 1). A total of 104 completed the study (12.6% dropout), and 77 completed 2 CSF collections (34% dropout). Eighteen participants discontinued treatment early and 15 discontinued the study. The population was English-speaking, 57% female, and 91% Caucasian.

Safety and tolerability. No differences between the resveratrol and placebo-treated groups were found on vital signs, physical examinations, or neurologic examinations. Routine laboratory tests were normal. A total of 657 AEs (490 mild, 139 moderate, 28 severe) were reported (355 on drug, 302 on placebo) (table 2). A total of 113 of 119 (95%) participants reported at least 1 AE. The most common AEs were nausea and diarrhea (in 42% of individuals with drug vs 33% with placebo, p = 0.35). Few participants reported nausea and diarrhea—the most likely drug-related AEs—that led to treatment discontinuation, a treatment plateau at a lower dosage, or study discontinuation (figure 1). The placebo group gained 0.54 ± 3.2 kg body weight, while the treated group lost 0.92 ± 4.9 kg (mean ± SD, p = 0.038), resulting in a difference in body mass index (BMI). The treated group’s BMI was 25.4 ± 4.0 vs the placebo group’s 26.1 ± 4.1 at week 52 (mean ± SD, p = 0.047). Thirty-six serious AEs (SAEs) were reported (19 on drug, 17 on placebo), including 27 hospitalizations (14 on drug, 13 on placebo) and 3 deaths (1 on drug, 2 on placebo)—none study drug-related. There were no differences in participants who experienced at least one SAE (20.3% on drug, 18.2% on placebo), at least one hospitalization (18.8% drug, 16.4% placebo), or died (1.6% drug, 3.6% placebo). Seven new neoplasms were reported (1 on drug, 6 on placebo, p < 0.048) (table 2). Retrospective review of the brain MRIs of a placebo-enrolled participant with malignant glioma, which resulted in death, revealed that the tumor was present at screening. The other two participant deaths were due to lung melanoma (placebo group) and drowning (drug group).

AD duration (from year of symptom onset), y, mean (SD): resveratrol 3.9 (2.3) vs placebo 5.5 (2.6); p < 0.001

Outcomes. At week 52, the treated group’s CSF Aβ40 declined from 6,574 ± 2,346 to 6,513 ± 2,279 ng/mL, and from 6,560 ± 2,190 to 5,622 ± 1,736 ng/mL with placebo, resulting in a difference at week 52 (mean ± SD, p = 0.002) (figure 2A). This difference was also found in secondary analyses of study completers (p = 0.002), in the mild dementia subgroup (p = 0.01), and in APOE4 carriers (p = 0.05) and noncarriers (p = 0.01) (table e-2). During the study, the treated group’s plasma Aβ40 (figure 2B) declined from 163 ± 58 to 153 ± 54 ng/mL, and from 165 ± 55 to 132 ± 54 ng/mL with placebo (mean ± SD, p = 0.024). Secondary analyses by APOE4 genotype revealed an effect of treatment on plasma Aβ40 in APOE4 carriers (p = 0.04) but not noncarriers (table e-2). There were no effects on CSF Aβ42 or plasma Aβ42 (figure 2, C and D), although trends were similar to Aβ40. There was no difference in CSF tau, and a trend toward an increase in CSF phospho-tau 181 with treatment (p = 0.08) and in a secondary analysis of mild dementia (p = 0.047) (data not shown). Volumetric MRIs revealed that brain volume (excluding CSF, brainstem, and cerebellum) declined more in the treatment group (p = 0.025), with an increase in ventricular volume (p = 0.05) at week 52 (figure 3, A and B). In the treatment group, brain volume decreased from 866 ± 84 to 839 ± 85 mL and ventricular volume increased from 55 ± 24 to 81 ± 24 mL (mean ± SD). With placebo, brain volume decreased from 850 ± 99 to 840 ± 93 mL and ventricular volume increased from 56 ± 19 to 76 ± 25 mL (mean ± SD). Secondary analyses revealed that brain volume declined with treatment in APOE4 carriers (p = 0.02) but not noncarriers (table e-2). Similar results were found with ventricular volume, which increased with treatment in APOE4 carriers (p = 0.05) but not noncarriers. This phase 2 trial (underpowered to detect differences in clinical outcomes) found no significant effects on CDR-SOB, ADAS-cog, MMSE, or NPI.
The drug-treated group’s ADCS-ADL declined from 63.7 ± 10.8 to 57.4 ± 12.3, and from 60.5 ± 10.7 to 51.3 ± 14.5 in the placebo group (mean ± SD, p = 0.03), indicating less decline with treatment. No drug effects were found on plasma glucose or insulin metabolism (data not shown). We also analyzed (post hoc) the subset of individuals with CSF Aβ42 < 600 ng/mL at baseline as a proxy of AD amyloid pathology. At week 52, differences between treatment groups persisted for CSF Aβ40 (p = 0.001, total n = 70) and plasma Aβ40 (p = 0.02, n = 83). In this analysis, we also found a treatment effect on CSF Aβ42 (p = 0.02, n = 70) but lost significance in brain volume loss (p = 0.06, n = 83) and ADCS-ADL (p = 0.055, n = 88).

DISCUSSION. High-dose oral resveratrol is safe and well-tolerated. The most common AEs were nausea and diarrhea, but rates were similar to placebo. Weight and fat loss with resveratrol are reported in some preclinical studies,4 but human studies are scarce and of shorter duration. A decrease in body fat and a trend toward weight loss were reported in a 26-week trial with 200 mg/day resveratrol in healthy older participants.33 Weight and fat loss may be related to enhanced mitochondrial biogenesis mediated by SIRT1 activation of PGC-1α.4,10,11 Aβ levels declined as dementia advanced. The altered CSF Aβ40 trajectory suggests that the drug penetrated the blood–brain barrier to have central effects. At week 52, the mean CSF levels of resveratrol, 3G-RES, 4G-RES, and S-RES were 3.3%, 0.4%, 0.4%, and 0.3%, respectively, of plasma levels at the same study visit. At the highest dosage, low µM levels of resveratrol and its metabolites were measured in plasma, with corresponding low nM levels found in CSF. Resveratrol has many targets, with some engaged at µM concentrations.4 These findings suggest that a central molecular target may be engaged at nM concentrations. In addition to anti-inflammatory, antioxidant, and anti-Aβ-aggregation activities, putative targets include sirtuin activation with enhanced α-cleavage of amyloid precursor protein34 and promotion of autophagy.35 Further studies of banked CSF, plasma, pellets, DNA, and blood mononuclear cells from participants will examine mechanisms.

Resveratrol treatment increased brain volume loss. This finding persisted when participants with weight loss (table 2) were excluded (data not shown). The etiology and interpretation of the brain volume loss observed here and in other studies are unclear, but it was not associated with cognitive or functional decline. In the first human active Aβ immunization trial, antibody responders had greater brain volume loss, and greater volumetric changes were associated with higher antibody titers.36 In the phase 2 bapineuzumab trial, treatment resulted in greater ventricular enlargement, but only in APOE4 carriers.37 In the phase 3 bapineuzumab APOE4 carrier trial and the high-dose noncarrier study, treatment resulted in a trend toward greater brain atrophy.38 Since this phase 2 study lacks consistent changes in clinical outcomes, interpretation of the effects on the trajectories of plasma and CSF Aβ40 and of brain and ventricular volume remains uncertain.

Resveratrol altered levels of CSF Aβ40 (A) and plasma Aβ40 (B) (ng/mL, mean ± SE). Similar but nonsignificant trends were found for CSF Aβ42 (C) and plasma Aβ42 (D) (ng/mL, mean ± SE). Note difference in scales. Sample sizes are indicated.

Resveratrol increased brain volume loss (A, C) (mL, mean ± SE) with a corresponding increase in ventricular volume (B, D) (mL, mean ± SE). Sample sizes are indicated.

This phase 2 study has limitations. It was designed to determine the safety and tolerability of resveratrol and to examine pharmacokinetics. Although some biomarker trajectories were altered, we found no effects of drug treatment on plasma Ab42, CSF Ab42, CSF tau, CSF phospho-tau 181, hippocampal volume, entorhinal cortex thickness, MMSE, CDR, ADAS-cog, NPI, or glucose or insulin metabolism. The altered biomarker trajectories must be interpreted with caution. Although they suggest CNS effects, they do not indicate benefit.

22.  Miniature VHS Solenoid Valves Play Significant Role in the Viability of 3D Bio-Printing of Human Cells  

The rapid development of viable inkjet technology for highly specialised applications, such as printing human cells, continues to generate significant interest. If successful, the realisation of this technology for specialised biological applications, generally known as ‘biofabrication’, has the potential to replace the long-established (and often controversial) process of using animals for testing new drugs. However, there are many challenges to overcome to enable the successful production of a valve-based cell printer for the formation of human embryonic stem cell spheroid aggregates. For example, printing techniques need to be developed which are both controllable and gentle enough to preserve the viability and function of human cells and tissue.

One particular cell printing project at an advanced stage, and one which has benefitted from Lee Products’ miniature VHS solenoid valves and nozzles, is the result of pioneering activities at Edinburgh’s Heriot-Watt University. Dr Will Shu at the University’s Biomedical Microengineering Group and his colleagues, including Alan Faulkner-Jones, a bioengineering PhD student, have successfully developed a bio-printer which has been demonstrated at the 3D Print Show in London. Also involved in the development of the bio-printer are specialists at Roslin Cellab in Midlothian, a leading stem cell technology company.

The valve-based bio-printer has been validated to print highly viable cells in programmable patterns from two different bio-inks, with independent control of the volume of each droplet (with a lower limit of 2 nL, or fewer than five cells per droplet). Human embryonic stem cells (hESCs) were used to make spheroids by overprinting two opposing gradients of bio-ink: one of hESCs in medium and the other of medium alone.
The resulting array of uniformly sized droplets with a gradient of cell concentrations was inverted to allow cells to aggregate and form spheroids via gravity.
The resulting aggregates have controllable and repeatable sizes, and consequently they can be made to order for specific applications. Droplets containing between 5 and 140 dissociated cells yielded spheroids of 0.25–0.6 mm diameter. The success of the bio-printer demonstrates that a valve-based printing process is gentle enough to maintain stem cell viability, accurate enough to produce spheroids of uniform size, and that printed cells maintain their pluripotency.
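
A minimal sketch of the opposing-gradient idea, assuming a fixed per-droplet volume (the 2 nL valve lower limit mentioned above) and an arbitrary array length; the helper is our own illustration, not the group's software:

```python
# Illustrative sketch of overprinting two opposing bio-ink gradients: across
# an array of print positions, the cell-ink volume ramps up while the
# cell-free-medium volume ramps down, so every droplet has the same total
# volume but a different cell concentration.
TOTAL_NL = 2.0        # per-droplet volume; 2 nL is the valve's lower limit
N_POSITIONS = 10      # length of the droplet array (arbitrary for the sketch)

def droplet_recipe(i, n=N_POSITIONS, total=TOTAL_NL):
    """Volumes (cell ink, plain medium) in nL for position i of the array."""
    frac = i / (n - 1)                  # 0.0 at one end, 1.0 at the other
    return (total * frac, total * (1 - frac))

recipes = [droplet_recipe(i) for i in range(N_POSITIONS)]
# Every droplet sums to the same total volume despite the opposing gradients.
print(all(abs(sum(r) - TOTAL_NL) < 1e-9 for r in recipes))
```

Each position thus receives the same total volume from the two dispensing systems, one ramping up and the other ramping down.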
Looking closer at the design of the bio-printer platform reveals two dispensing systems, each comprising a Lee VHS nanolitre solenoid dispensing valve with a Teflon-coated 101.6 µm internal diameter Lee Minstac nozzle, controlled by an Arduino microcontroller. Each dispensing system is attached to a static pressure reservoir for the bio-ink solution to be dispensed via flexible tubing. The dispensing systems and bio-ink reservoirs are mounted within a custom-built enclosure on the tool head of a micrometer-resolution 3-axis 3D printing platform (High-Z S-400, CNC-Step) and controlled by a customised CNC controller (based on the G540, Geckodrive).

A relatively larger nozzle diameter (compared to the size of the cells that are printed) was selected to reduce the amount of shear stress that could be experienced by the cells during the dispensing process. The bio-ink reservoirs were kept as close as possible to the valves in order to minimise the amount of time it would take to charge the system with bio-ink and to purge it at the end of the experiment. A USB microscope is also included to enable visual inspection of the target substrate during the printing process. Due to the type of deposition system used, a direct line of sight view through the nozzle is not possible and therefore the USB microscope is mounted at an offset angle from the cell deposition system assemblies.
Commenting on the development of the bio-printer and the vital role played by Lee Product’s VHS solenoid valves, Dr Will Shu at Heriot-Watt University said: “Printing living cells is extremely challenging and to the best of our knowledge, this is the first time that these cells have been 3D printed. The technique will allow us to create more accurate human tissue models which are essential to in-vitro drug development and toxicity testing and since the majority of drug discovery is targeting human disease, it makes sense to use human tissues.
“The development of the bio-printer has taken many years of effort and we are very pleased with the performance of Lee’s VHS solenoid valves; they are a vital component within the bio-printer printhead and we recommend them to our colleagues working on similar projects.”
Dr Shu added, “We also acknowledge the support and interaction from our contacts at Lee Products which has helped us to overcome the challenges of this project.”
This highly specialised application is an excellent example of the performance of Lee’s range of VHS Micro-Dispense Solenoid Valves which provide precise, repeatable, non-contact dispensing of fluids in the nanolitre to microlitre range. The valves feature a number of port configurations to facilitate quick and convenient connections to Lee’s 062 MINSTAC fittings and press-on tubing. The 062 MINSTAC outlet port can be used with Lee 062 MINSTAC tubing or atomising nozzles. Custom configurations and voltages are also available to suit specific applications.


23.  ‘Dimmer Switch’ for Insulin Secretion Identified in Type 2 Diabetes

Posted on September 23, 2015 by Healthinnovations

Ten million Canadians are living with diabetes or pre-diabetes. The Canadian Diabetes Association reports that more than 20 Canadians are newly diagnosed with the disease every hour of every day. It is also the seventh leading cause of death in Canada, with associated health-care costs estimated at nearly $9 billion a year. Type 2 diabetes accounts for 90 per cent of all cases, increasing the risk of blindness, nerve damage, stroke, heart disease and several other serious health conditions.

Insulin secretion from β cells of the pancreatic islets of Langerhans is impaired in type 2 diabetes (T2D). Evidence suggests that metabolic amplification of insulin secretion occurs distally in the secretory pathway, possibly at the calcium-dependent exocytotic site. The regulation and amplification of insulin secretion is therefore an important target for researchers around the world.

Now, researchers from the University of Alberta have identified a new molecular pathway that manages the amount of insulin produced by the pancreatic cells, essentially a ‘dimmer’ switch that adjusts how much or how little insulin is secreted when blood sugar increases.  The team state that the dimmer appears to be lost in Type 2 diabetes, however, it can be restored and ‘turned back on’, reviving proper control of insulin secretion from islet cells of people with Type 2 diabetes.  The opensource study is published in the Journal of Clinical Investigation.

Previous studies show that the canonical mechanism of glucose-stimulated insulin secretion (involving increases in metabolism-derived ATP, inhibition of KATP channels, and activation of voltage-dependent calcium channels, VDCCs) was first introduced more than 30 years ago and remains a cornerstone mechanism for the triggering of insulin secretion. The KATP channel mechanism does not, however, define the entire secretory response: multiple metabolic coupling intermediates have been proposed as factors that amplify the secretory response to the Ca2+-triggered exocytotic signal, with the net export of mitochondrial substrates being of particular interest.

The current study examined pancreatic islet cells from 99 human organ donors. Results show that the glucose-dependent amplification of exocytosis in human β cells, which is disrupted in type 2 diabetes, requires mitochondrial export of isocitrate, which generates cytosolic NADPH and GSH. These then act through SENP1 to amplify the exocytosis of insulin, thereby controlling glucose homeostasis. The lab then validated these findings in a transgenic animal model.

The researchers state that the discovery is a potential game-changer in Type 2 diabetes research, leading to a new way of thinking about the disease and its future treatment. They go on to add that understanding the islet cells in the pancreas that make insulin, how they work, and how they can fail could lead to new ways to treat the disease, delaying or even preventing diabetes.

The team surmise that although the ability to restore the dimmer switch in islet cells has been proven on a molecular level, finding a way to translate those findings into clinical use could yet take decades. Despite this, the group conclude that the findings show an important new way forward.

Source: University of Alberta

Pancreatic islet–specific knockout of Senp1 blunts insulin secretion due to an impaired amplification of exocytosis. Proposed pathway linking mitochondrial export of (iso)citrate, glutathione biosynthesis (blue), and glutathione reduction (orange) pathways to the amplification of insulin exocytosis (yellow). Isocitrate-to-SENP1 signaling amplifies insulin secretion and rescues dysfunctional β cells. MacDonald et al 2015.

24.  Nanotechnology

Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is about 1 to 100 nanometers.

Physicist Richard Feynman, the father of nanotechnology.

Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering.

The ideas and concepts behind nanoscience and nanotechnology started with a talk entitled “There’s Plenty of Room at the Bottom” by physicist Richard Feynman at an American Physical Society meeting at the California Institute of Technology (CalTech) on December 29, 1959, long before the term nanotechnology was used. In his talk, Feynman described a process in which scientists would be able to manipulate and control individual atoms and molecules. Over a decade later, in his explorations of ultraprecision machining, Professor Norio Taniguchi coined the term nanotechnology. It wasn’t until 1981, with the development of the scanning tunneling microscope that could “see” individual atoms, that modern nanotechnology began.

Medieval stained glass windows are an example of how nanotechnology was used in the pre-modern era. (Courtesy: NanoBioNet)

It’s hard to imagine just how small nanotechnology is. One nanometer is a billionth of a meter, or 10⁻⁹ meters. Here are a few illustrative examples:

  • There are 25,400,000 nanometers in an inch
  • A sheet of newspaper is about 100,000 nanometers thick
  • On a comparative scale, if a marble were a nanometer, then one meter would be the size of the Earth
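
These scale comparisons can be checked with a few lines of arithmetic (the metre–inch and nano-prefix constants are standard definitions; the marble size is an assumed ~1.3 cm):

```python
NM_PER_M = 1e9        # "nano-" prefix: 10**9 nanometers per meter
M_PER_INCH = 0.0254   # exact definition of the inch in meters

nm_per_inch = M_PER_INCH * NM_PER_M
print(round(nm_per_inch))   # 25400000 nm per inch, matching the first bullet

# Marble-to-Earth analogy: going from 1 nm up to 1 m is a 10**9 magnification.
# A ~1.3 cm marble magnified the same way is ~13,000 km across, roughly
# Earth's diameter (~12,742 km).
marble_m = 0.013
print(round(marble_m * NM_PER_M / 1000))   # 13000 (km)
```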

Nanoscience and nanotechnology involve the ability to see and to control individual atoms and molecules. Everything on Earth is made up of atoms—the food we eat, the clothes we wear, the buildings and houses we live in, and our own bodies.

But something as small as an atom is impossible to see with the naked eye. In fact, it’s impossible to see with the microscopes typically used in high school science classes. The microscopes needed to see things at the nanoscale were invented relatively recently—about 30 years ago.

Once scientists had the right tools, such as the scanning tunneling microscope (STM) and the atomic force microscope (AFM), the age of nanotechnology was born.

Although modern nanoscience and nanotechnology are quite new, nanoscale materials have been used for centuries. Gold and silver particles of varying sizes created the colors in the stained glass windows of medieval churches hundreds of years ago. The artists simply didn’t know that the process they used to create these beautiful works of art was changing the composition of the materials they were working with.

Today’s scientists and engineers are finding a wide variety of ways to deliberately make materials at the nanoscale to take advantage of properties that are enhanced relative to their larger-scale counterparts, such as higher strength, lighter weight, increased control of the light spectrum, and greater chemical reactivity.

Education and workforce development are critical to the advancement of nanotechnology and are encompassed within one of the four goals of the National Nanotechnology Initiative (NNI): “Develop and sustain educational resources, a skilled workforce, and a dynamic infrastructure and toolset to advance nanotechnology.” As new knowledge is created through exploratory research and development, it is a challenge to translate this understanding into the educational system and to the broader public. Over the first fifteen years of the NNI, several activities have made significant contributions in this area:

  • public outreach and informal education by the NSF Nanoscale Informal Science Education Network (NISE Net) through programs such as NanoDays
  • technician and workforce training through programs such as the NSF Advanced Technological Education Centers, including the Nanotechnology Applications and Career Knowledge (NACK) Network
  • countless university courses and degree programs
  • the emerging incorporation of nanoscience into K-12 science education standards in states such as Virginia

To build upon this strong foundation, several announcements were made last week at the White House Forum on Small Business Challenges to Commercializing Nanotechnology, including the establishment of a Nano and Emerging Technologies Student Leaders conference, a webinar series focused on providing information for teachers, and a web portal of nanoscale science and engineering educational resources.

25.  Antimicrobial film for future implants

(Nanowerk News) The implantation of medical devices is not without risks. Bacterial or fungal infections can occur, and the body’s strong immune response may lead to the rejection of the implant. Researchers at Unit 1121 “Biomaterials and Bio-engineering” (Inserm/Strasbourg University) have succeeded in creating a biofilm with antimicrobial, antifungal and anti-inflammatory properties. It may be used to cover titanium implants (orthopaedic prostheses, pacemakers…) to prevent or control post-operative infections. Other frequently used medical devices that cause numerous infectious problems, such as catheters, may also benefit.
These results are published in the journal Advanced Healthcare Materials (“Harnessing the Multifunctionality in Nature: A Bioactive Agent Release System with Self-Antimicrobial and Immunomodulatory Properties”).

26.  Characterizing the forces that hold everything together: UMass Amherst physicists offer new open source calculations for molecular interactions

UMass Amherst physicists, with others, provide a new software tool and database to help materials designers with the difficult calculations needed to predict the magnitude of van der Waals interactions between anisotropic or directionally dependent bodies such as those illustrated, with long-range torques. Though small, these forces are dominant on the nanoscale.

CREDIT: UMass Amherst

As electronic, medical and molecular-level biological devices grow smaller and smaller, approaching the nanometer scale, the chemical engineers and materials scientists devising them often struggle to predict the magnitude of molecular interactions on that scale and whether new combinations of materials will assemble and function as designed.


Amherst, MA | Posted on September 23rd, 2015

This is because the physics of interactions at these scales is difficult, say physicists at the University of Massachusetts Amherst, who, with colleagues elsewhere, this week unveiled a project known as Gecko Hamaker: a new computational and modeling software tool plus an open science database to aid those who design nano-scale materials.

In the cover story of today’s issue of Langmuir, Adrian Parsegian, Gluckstern Chair in physics, physics doctoral student Jaime Hopkins and adjunct professor Rudolf Podgornik on the UMass Amherst team report calculations of van der Waals interactions between DNA, carbon nanotubes, proteins and various inorganic materials, with colleagues at Case Western Reserve University and the University of Missouri who make up the Gecko Hamaker project team.

To oversimplify, van der Waals forces are the intermolecular attractions between atoms, molecules, and surfaces that control interactions at the molecular level. The Gecko Hamaker project makes available to its online users a large variety of calculations for nanometer-level interactions that help to predict molecular organization and evaluate whether new combinations of materials will actually stick together and work.
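Gecko Hamaker itself performs full Lifshitz-theory calculations with anisotropy and torques; as a much simpler illustration of the kind of quantity involved, the textbook nonretarded Hamaker approximation for two flat surfaces can be sketched in a few lines (the Hamaker constant below is a typical order-of-magnitude value, not a number from the paper):

```python
import math

def vdw_energy_per_area(hamaker_A, D):
    """Nonretarded van der Waals interaction energy per unit area (J/m^2)
    between two parallel flat surfaces a distance D (m) apart:
        W(D) = -A / (12 * pi * D^2)
    where A is the Hamaker constant (J). Negative W means attraction."""
    return -hamaker_A / (12 * math.pi * D**2)

A = 1e-19  # illustrative Hamaker constant, ~1e-19 J for condensed media
for D_nm in (1, 5, 10):
    W = vdw_energy_per_area(A, D_nm * 1e-9)
    print(f"D = {D_nm:2d} nm -> W = {W:.3e} J/m^2")
```

The 1/D² falloff is why these forces are negligible at everyday distances yet dominant at the nanoscale; the orientation-dependent torques that are the paper’s focus require far more detailed calculations than this sketch.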

In this work supported by the U.S. Department of Energy, Parsegian and colleagues say their open-science software opens a whole range of insights into nano-scale interactions that materials scientists haven’t been able to access before.

Parsegian explains, “Van der Waals forces are small, but dominant on the nanoscale. We have created a bridge between deep physics and the world of new materials. All miniaturization, all micro- and nano-designs are governed by these forces and interactions, as is behavior of biological macromolecules such as proteins and lipid membranes. These relationships define the stability of materials.”

He adds, “People can try putting all kinds of new materials together. This new database and our calculations are going to be important to many different kinds of scientists interested in colloids, biomolecular engineering, those assembling molecular aggregates and working with virus-like nanoparticles, and to people working with membrane stability and stacking. It will be helpful in a broad range of other applications.”

Podgornik adds, “They need to know whether different molecules will stick together or not. It’s a complicated problem, so they try various tricks and different approaches.” One important contribution of Gecko Hamaker is that it includes experimental observations seemingly unrelated to the problem of interactions that help to evaluate the magnitude of van der Waals forces.

Podgornik explains, “Our work is fundamentally different from other approaches, as we don’t talk only about forces but also about torques. Our methodology allows us to address orientation, which is more difficult than simply describing van der Waals forces, because you have to add a lot more details to the calculations. It takes much more effort on the fundamental level to add in the orientational degrees of freedom.”

He points out that their methods also allow Gecko Hamaker to address non-isotropic (non-spherical) and other complex molecular shapes. “Many molecules don’t look like spheres, they look like rods. Certainly in that case, knowing only the forces isn’t enough. You must calculate how torque works on orientation. We bring the deeper theory and microscopic understanding to the problem. Van der Waals interactions are known in simple cases, but we’ve taken on the most difficult ones.”

Hopkins, the doctoral student, notes that as an open-science product, Gecko Hamaker’s calculations and data are transparent to users, and user feedback improves its quality and ease of use, while also verifying the reproducibility of the science.



Janet Lathrop

27.  Researchers have succeeded in creating a biofilm with antimicrobial, antifungal and anti-inflammatory properties. (Image: Inserm / E.Falett)
Implantable medical devices (prostheses, pacemakers) are an ideal interface for micro-organisms, which can easily colonize their surface. Bacterial infection may thus occur and lead to an inflammatory reaction, which may cause the implant to be rejected. These infections are mainly caused by bacteria originating in the body, such as Staphylococcus aureus and Pseudomonas aeruginosa. They may also be fungal or caused by yeasts. The challenge presented by implanting medical devices in the body is preventing the occurrence of these infections, which trigger an immune response that compromises the success of the implant. Antibiotics are currently used during surgery or to coat certain implants. However, the emergence of multi-resistant bacteria now restricts their effectiveness.
A biofilm invisible to the naked eye…
It is within this context that researchers at the “Bioengineering and Biomaterials” Unit 1121 (Inserm/Strasbourg University), together with four other laboratories, have developed a biofilm with antimicrobial and anti-inflammatory properties. The researchers used a combination of two substances, polyarginine (PAR) and hyaluronic acid (HA), to create a multilayer film invisible to the naked eye (between 400 and 600 nm thick). Because arginine is metabolised by immune cells to fight pathogens, it was used to communicate with the immune system and obtain the desired anti-inflammatory effect. Hyaluronic acid, a natural component of the body, was chosen for its biocompatibility and its inhibiting effect on bacterial growth.
…with embedded antimicrobial peptides,
The film is also unique in that it embeds natural antimicrobial peptides, in particular catestatin, to prevent possible infection around the implant. This is an alternative to the antibiotics currently used. As well as having a significant antimicrobial role, these peptides are not toxic to the body into which they are secreted. They are capable of killing bacteria by creating holes in the bacterial cell wall, leaving the bacteria no means of counter-attack.
…on a thin silver coating,
In this study, the researchers show that poly(arginine), combined with hyaluronic acid, possesses antimicrobial activity against Staphylococcus aureus (S. aureus) for over 24 hours. “In order to prolong this activity, we have placed a silver coating as a precursor before applying the film. Silver is an anti-infectious material currently used on catheters and dressings. This strategy allows us to extend antimicrobial activity in the long term,” explains Philippe Lavalle, Research Director at Inserm.
…effectively reducing inflammation, preventing and controlling infection
The results from numerous tests performed on this new film show that it reduces inflammation and prevents the most common bacterial and fungal infections.
On the one hand, the researchers demonstrate, through contact with human blood, that the presence of the film on the implant suppresses the activation of inflammatory markers normally produced by immune cells in response to the implant. Moreover, “the film inhibits the growth and long-term proliferation of staphylococcal bacteria (Staphylococcus aureus), yeast strains (Candida albicans) or fungi (Aspergillus fumigatus) that frequently cause implant-related infection,” emphasises Philippe Lavalle.
Researchers conclude that this film may be used in vivo on implants or medical devices within a few years to control the complex microenvironment surrounding implants and to protect the body from infection.
Source: INSERM (Institut national de la santé et de la recherche médicale)

28.  Quantum dots light up under strain

Semiconductor nanocrystals, or quantum dots, are tiny, nanometer-sized particles with the ability to absorb light and re-emit it with well-defined colors. With low-cost fabrication, long-term stability and a wide palette of colors, they have become building blocks of display technology, improving the image quality of TV sets, tablets, and mobile phones. Exciting quantum dot applications are also emerging in the fields of green energy, optical sensing, and bio-imaging.

Prospects have become even more appealing with a paper entitled “Band structure engineering via piezoelectric fields in strained anisotropic CdSe/CdS nanocrystals,” published in the journal Nature Communications last July. An international team formed by scientists at the Italian Institute of Technology (Italy), the University Jaume I (Spain), the IBM research lab in Zurich (Switzerland) and the University of Milano-Bicocca (Italy) demonstrated a radically new approach to manipulating the light emission of quantum dots.

The traditional operating principle of quantum dots is based on the so-called quantum confinement effect, whereby particle size determines the color of the emitted light. The new strategy relies on a completely different physical mechanism: a strain-induced electric field inside the quantum dots. It is created by growing a thick shell around the dots, which compresses the inner core and creates an intense internal electric field. This field then becomes the dominant factor in determining the emission properties.
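The size dependence of the conventional quantum-confinement mechanism can be illustrated with a back-of-the-envelope particle-in-a-sphere estimate (a simplified Brus-type model that omits the Coulomb term; the reduced-mass value is an illustrative assumption, not a figure from the paper):

```python
import math

HBAR = 1.054571817e-34    # reduced Planck constant, J*s
EV = 1.602176634e-19      # joules per electron-volt
M_E = 9.1093837015e-31    # electron rest mass, kg

def confinement_shift_eV(radius_nm, reduced_mass=0.1 * M_E):
    """Lowest-level confinement energy shift for a carrier in a sphere of
    the given radius: E = hbar^2 * pi^2 / (2 * m * R^2). The default
    reduced mass (~0.1 m_e) is merely a plausible semiconductor value."""
    R = radius_nm * 1e-9
    return (HBAR * math.pi) ** 2 / (2 * reduced_mass * R**2) / EV

# Smaller dots -> larger shift -> bluer emission.
for r in (1.5, 2.5, 4.0):
    print(f"radius {r:.1f} nm -> confinement shift {confinement_shift_eV(r):.2f} eV")
```

The 1/R² scaling is why dot size sets emission color in conventional quantum dots; the strain engineering described in the paper adds an independent handle via the internal piezoelectric field.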

The result is a new generation of quantum dots whose properties go beyond those enabled by quantum confinement alone. This broadens the application scope not only of the well-known CdSe/CdS material set but also of other materials. “Our findings add an important new degree of freedom to the development of quantum dot-based technological devices,” the researchers say. “For example, the elapsed time between light absorption and emission can be extended to be more than 100 times longer than in conventional quantum dots, which opens the way towards optical memories and new smart-pixel devices. The new material could also lead to optical sensors that are highly sensitive to the electrical field in the environment on the nanometer scale.”


More information: “Band structure engineering via piezoelectric fields in strained anisotropic CdSe/CdS nanocrystals” Nat Commun. 2015 Jul 29; 6:7905. DOI: 10.1038/ncomms8905

Journal reference: Nature Communications


29. Turing Reaction-diffusion Model Confirmed

In 1952, the legendary British mathematician and cryptographer Alan Turing proposed a model in which complex patterns form through the chemical interaction of two diffusing reagents. Russian scientists have now shown that the corneal surface nanopatterns of 23 insect orders fit this model completely.

Their work is published in the Proceedings of the National Academy of Sciences.

The work was done by a team at the Institute of Protein Research of the Russian Academy of Sciences (Pushchino, Russia) and the Department of Entomology at the Faculty of Biology of the Lomonosov Moscow State University. It was supervised by Professor Vladimir Katanaev, who also leads a lab at the University of Lausanne, Switzerland. Artem Blagodatskiy and Mikhail Kryuchkov chose and prepared the insect corneal samples and analyzed the data. Yulia Lopatina from the Lomonosov Moscow State University served as expert entomologist, while Anton Sergeev performed the atomic force microscopy.

The initial goal of the study was to characterize the antireflective three-dimensional nanopatterns covering insect eye cornea, with respect to the taxonomy of studied insects and to get insight into their possible evolution path.

The result was surprising, as pattern morphology did not correlate with the insects’ position on the evolutionary tree. Instead, the scientists characterized four main morphological corneal nanopatterns, as well as transitional forms between them, present throughout the insect class. Another finding was that all the observed pattern forms directly matched the array of patterns predicted by Turing’s famous reaction-diffusion model of 1952, which the scientists confirmed not by mere observation but by mathematical modeling as well.
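Turing’s two-reagent mechanism is easy to demonstrate numerically. The sketch below runs a minimal one-dimensional reaction-diffusion system with Gray-Scott kinetics, a standard toy model of Turing-type patterning; the grid size and rate constants are illustrative choices, not parameters from the PNAS study:

```python
# Minimal 1-D Gray-Scott reaction-diffusion sketch (pure Python).
# Reagent u feeds in at rate F; v consumes u (u + 2v -> 3v) and decays
# at rate F + k. Because u and v diffuse at different speeds, a small
# perturbation can self-organize into stable spatial structure.

N = 64                   # grid points, periodic boundary, dx = 1
Du, Dv = 0.16, 0.08      # diffusion coefficients (u diffuses faster)
F, k = 0.035, 0.065      # feed and kill rates (illustrative values)

u = [1.0] * N
v = [0.0] * N
for i in range(N // 2 - 3, N // 2 + 3):   # seed a small patch of v
    u[i], v[i] = 0.5, 0.25

def laplacian(a, i):
    """Discrete Laplacian with periodic boundaries."""
    return a[(i - 1) % N] - 2 * a[i] + a[(i + 1) % N]

for _ in range(2000):    # explicit Euler steps, dt = 1
    un, vn = u[:], v[:]
    for i in range(N):
        reaction = u[i] * v[i] * v[i]
        un[i] = u[i] + Du * laplacian(u, i) - reaction + F * (1 - u[i])
        vn[i] = v[i] + Dv * laplacian(v, i) + reaction - (F + k) * v[i]
    u, v = un, vn

print("v range after 2000 steps:", round(min(v), 4), "to", round(max(v), 4))
```

Different choices of F and k yield spots, stripes, or labyrinths, which is qualitatively how one chemical system can account for the variety of corneal nanopatterns the study describes.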

The analysis of corneal surface nanopatterns in 23 insect orders has been performed by means of atomic force microscopy with resolution up to single nanometers.

“This method allowed us to drastically expand the previously available data, acquired through scanning electron microscopy; it also made it possible to characterize surface patterns directly, rather than through analysis of metal replicas. When possible, we always examined corneae belonging to distinct families of one order to get insight into intra-order pattern diversity,” Blagodatskiy said.

The main implication of the work is an understanding of the mechanisms underlying the formation of biological three-dimensional nanopatterns, demonstrating the first example of the Turing reaction-diffusion model acting in the bio-nanoworld.

Interestingly, the Turing nanopatterning mechanism is common not only to the insect class but also to spiders, scorpions and centipedes: in other words, it is universal for arthropods. Because of the antireflective properties of insect corneal nanocoatings, the revealed mechanisms pave the way for the design of artificial antireflective nanosurfaces.

“A promising future development of the project is planned to be a genetic analysis of corneal nanopattern formation on the platform of the well-studied Drosophila melanogaster (fruit fly) model. Wild-type fruit flies possess a nipple-array type of nanocoating on their eyes,” Blagodatskiy summarized.

Different combinations of overexpressed and underexpressed proteins known to be responsible for corneal development in Drosophila may alter the nipple pattern to another pattern type and thus shed light on the chemical nature of the compounds forming the Turing-type structures on insect eyes. Revealing the proteins and/or other agents responsible for nanopattern formation will be a direct clue to the artificial design of nanocoatings with desired properties. Another direction of project development will be the comparison

Citation: Artem Blagodatski, Anton Sergeev, Mikhail Kryuchkov, Yuliya Lopatina, Vladimir L. Katanaev. Diverse set of Turing nanopatterns coat corneae across insect lineages. Proceedings of the National Academy of Sciences, 2015; 112 (34): 10750. DOI: 10.1073/pnas.1505748112

30.  Germ-free mice gain weight when transplanted with gut microbes from obese humans, in a diet-dependent manner.

By Ed Yong | September 5, 2013

Escherichia coli (Wikipedia)

Physical traits like obesity and leanness can be “transmitted” to mice by inoculating the rodents with human gut microbes. A team of scientists led by Jeffrey Gordon from the Washington University School of Medicine in St. Louis found that germ-free mice put on weight when they were transplanted with gut microbes from an obese person, but not those from a lean person.

The team also showed that a “lean” microbial community could infiltrate and displace an “obese” one, preventing mice from gaining weight so long as they were on a healthy diet. The results were published today (September 5) in Science.

Gordon emphasized that there are many causes of obesity beyond microbes. Still, he said that studies like these “provide a proof-of-principle for ameliorating diseases.” By understanding how microbes and food interact to influence human health, researchers may be able to design effective probiotics that can prevent obesity by manipulating the microbiome.

The human gut is home to tens of trillions of microbes, which play crucial roles in breaking down food and influencing health. Gordon’s group and others have now shown that obese and lean people differ in their microbial communities. Just last week, the MetaHIT consortium showed that a quarter of Danish people studied had a very low number of bacterial genes in their gut—an impoverished state that correlated with higher risks of both obesity and metabolic diseases.

However, descriptive studies like these cannot tell scientists whether such microbial differences are the cause of obesity or a consequence of it. “A lot of correlations are being made between microbe community configurations and disease states, but we don’t know if these are casual or causal,” said Gordon. By using germ-free mice as living laboratories, Gordon and his colleagues aim to start moving “beyond careful description to direct tests of function,” he added.

“It’s extremely exciting and powerful to go from descriptive studies in humans to mechanistic studies in mice,” said Oluf Pedersen, an endocrinologist who was involved in the MetaHIT studies. “That’s beautifully illustrated in this paper.”

Gordon lab graduate student Vanessa Ridaura inoculated the germ-free mice with gut microbes from four pairs of female twins, each in which one person was obese and the other had a healthy weight. Mice that received the obese humans’ microbes gained more body fat, put on more weight, and showed stronger molecular signs of metabolic problems.

Once the transplanted microbes had taken hold in their guts, but before their bodies had started to change, Ridaura housed the two groups of mice together. Mice regularly eat one another’s feces, so these cage-mates inadvertently introduced their neighbors’ microbes to their own gut communities. Gordon called this the “Battle of the Microbiota.”

These co-housing experiments prevented the mice with “obese” microbes from putting on weight or developing metabolic problems, while those with the “lean” microbes remained at a healthy weight.

Gordon explains that the obese microbe communities, being less diverse than the lean ones, leave many “job openings” within the gut—niches that can be filled by the diverse lean microbes when they invade. “And obviously, those job openings aren’t there in the richer, lean gut community,” he said. “That’s why the invasion is one-directional.”

“But if invasion is so robust, why then isn’t there an epidemic of leanness?” asked Gordon. “The answer appears to be, in part, diet.”

In her initial experiments, Ridaura fed the mice standard chow, which is high in fiber and plant matter. She also blended up two new recipes, designed to reflect extremes of saturated fat versus fruit and vegetable consumption associated with Western diets.

If the mice were fed food low in fat and high in fruit and vegetables, Ridaura found the same results as before—the lean microbes could cancel out the effect of the obese ones. But when the mice were fed food low in fruit and vegetables and high in saturated fat, those with obese gut microbes still gained weight, no matter who their neighbors were.

This may be because the best colonizers among the lean communities were the Bacteroidetes—a group of bacteria that are excellent at breaking down the complex carbohydrates found in plant foods. When the mice ate plant-rich diets, the Bacteroidetes could fulfill a metabolic role that was vacant in the obese gut communities. When the mice ate unhealthy, plant-poor diets, “these vacancies weren’t there and the organisms couldn’t establish themselves,” said Gordon.

“We’re now trying to identify particular sets of organisms that can do what the complete community does,” Gordon added. The ultimate goal is to create a set of specific bacteria that could be safely administered as a probiotic that, along with a defined diet, could help these beneficial microbes to establish themselves and might effectively prevent weight gain.

“This study is an inspiration for us at MetaHIT,” said Pedersen. “It would be very interesting to take stools or cultures from extreme cases within our samples—people who have very rich or very poor gut microbiomes—and inoculate them into germ-free mice. . . . Now that we have a proof-of-concept, it’s obvious for us to follow up our findings through these studies.”

V.K. Ridaura et al., “Gut microbiota from twins discordant for obesity modulate metabolism in mice,” Science, doi: 10.1126/science.1241214, 2013.

31. Gut Microbes Treat Illness

Oral administration of a cocktail of bacteria derived from the human gut reduces colitis and allergy-invoked diarrhea in mice.

By Chris Palmer | July 10, 2013

Micrograph of germ-free mouse colon colonized with 17 strains of human-derived Clostridia. (Kenya Honda)

An astounding array of microorganisms colonizes the human gut; our large intestines alone are home to 10¹⁴ bacteria from more than 1,000 species. Though scientists have long attempted to manipulate these microbial populations to affect health, probiotics have failed to reliably treat disease. However, a new study published today in Nature reports that a blend of specially selected strains of Clostridium bacteria derived from humans can significantly reduce symptoms of certain immune disorders in mice.

“[This work] shows that microbes can influence the balance and architecture of the immune system of their host,” said Sarkis Mazmanian, an immunologist at the California Institute of Technology who did not participate in the research. “I think it has tremendous potential for ameliorating human disease.”

Mammalian gut microbiota—the community of microorganisms that inhabit the gastrointestinal tract—have a long, intimate, and mostly symbiotic history with their hosts. The ubiquitous bugs are integral to some of the most basic of physiological functions, including metabolism and immune system development and function. However, specific gut microbes have also been linked to autoimmune disorders, obesity, inflammatory bowel disease, and possibly even neurological disorders. “It’s clear that gut microbes can affect many, many aspects of our physiology,” said Mazmanian.

Senior author Kenya Honda and his team previously reported that colonization of germ-free mice—mice that lack a microbiota—with a cocktail of a few dozen strains of Clostridium bacteria derived from wild-type mice promoted the activity of regulatory T cells (Treg) in the colon. Treg cells produce important anti-inflammatory immune molecules, including interleukin-10 and inducible T-cell co-stimulator, to prevent an overreaction of the immune system, and disruption of Treg cells is known to play a role in autoimmune disorders such as colitis, Crohn’s disease, food allergies, and type II diabetes. Indeed, mice treated with the Clostridium cocktail appeared more resistant to allergies and intestinal inflammation.

The Clostridia are a diverse group that includes the species responsible for the well-known tetanus and botulism toxins. “Clostridia are very diverse bacteria, and include some pathogens,” said Alexander Rudensky, an immunologist at the Memorial Sloan-Kettering Cancer Center in New York and a cofounder of Vedanta Biosciences, which he launched with the paper’s authors in 2010. “So their role [in disease] may be surprising to immunologists and the public, but not to microbiologists.”

To extend the clinical relevance of the previous results, Honda’s group repeated their experiment using Clostridium derived from a sample of human feces. As in the previous study, germ-free mice treated with specially selected strains of human-derived Clostridia displayed a significant increase in Treg cells. The treated mice also displayed reduced symptoms of colitis and allergy-induced diarrhea.

“This is a terrific advance to their previous studies where they showed that mouse microbiota can induce regulatory T cells,” said Mazmanian. “In this paper they’ve extended that to bacteria that come from humans, which they have tested in mice.”

The researchers used RNA sequencing of gut tissue samples of mice treated with human microbes to identify 17 specific non-virulent strains of Clostridium responsible for the increased production of Treg cells. They then sequenced the metagenomes of human ulcerative colitis patient guts, and found that they tended to carry lower levels of the 17 strains, with 5 out of the 17 showing a statistically significant reduction. “This work lays out the first instance of a rationally designed drug candidate isolated from human microbiota, which can be given to animals to treat autoimmune disease,” said study coauthor Bernat Olle, the chief operating officer of Vedanta Biosciences, which is developing therapies based on the new research.

Investigations into the mechanisms underlying Treg-cell induction pointed to short-chain fatty acids and bacterial antigens that are cooperatively produced by the 17 strains of Clostridium. The short-chain fatty acids and antigens in turn activate a transforming growth factor beta (TGF-β) response that drives Treg-cell differentiation and expansion.

“It’s very valuable to see studies like this one, where detailed analysis of microbial compositions is linked to biology,” said Rudensky.

Atarashi et al., “Treg induction by a rationally selected mixture of Clostridia strains from the human microbiota,” Nature, doi:10.1038/nature12331, 2013.


32.  Foxp3 targets revealed

The first comprehensive — but preliminary — list of Foxp3 targets in mice could provide clues to how the protein helps regulate the immune system

By Chandra Shekhar | January 22, 2007

The first comprehensive catalogue of mouse genes targeted by the transcription factor Foxp3 appears in two papers published in this week’s Nature. The lists from the two studies don’t always match, but the combined findings represent a key step in understanding how the protein helps regulatory T-cells maintain immune system tolerance and prevent autoimmune diseases.

“The papers provide the first look at relating the transcriptional DNA-binding activity of Foxp3 with specific target genes,” said Fred Ramsdell of ZymoGenetics in Seattle, who was not involved in either study. “This is something the field has been looking to do for the past five years.”

Expressed primarily in regulatory T-cells, Foxp3 is essential to both their development and their normal function. Loss-of-function Foxp3 mutations in mice and humans result in fatal autoimmune diseases.

A research team led by Alexander Rudensky of the University of Washington in Seattle, with Ye Zheng as first author, used ex vivo T-cells from mice in which Foxp3 was knocked out or tagged with GFP. Using a chromatin immunoprecipitation (ChIP) protocol, the team located nearly 1,300 Foxp3 binding sites in the mouse genome, from which it identified 702 Foxp3-bound genes. “Unlike other transcription factors, Foxp3 binds to only a few sites in the genome,” observed Rudensky. “But its binding results in very efficient changes in gene expression.”

Another study, led by Richard Young of the Whitehead Institute in Cambridge, Mass., and Harald von Boehmer of the Dana-Farber Cancer Institute in Boston, also used ChIP to identify Foxp3 binding sites. Out of more than 1,500 binding sites, they identified 1,119 genes bound by Foxp3. Instead of ex vivo T-cells, however, the researchers used T-cell hybridomas transfected with Foxp3. This made it easier to observe the effects of T-cell receptor stimulation, explained the study’s first author, Alexander Marson. “Foxp3 exerts a much stronger influence on its target genes in stimulated cells than in unstimulated cells,” he noted.

Ethan Shevach of the National Institutes of Health in Bethesda, Md., who was not involved in either study, said he preferred the use of normal T-cells, as in the Zheng et al. study, to hybridomas. “There is no evidence that the cell [Marson et al.] transfect with Foxp3 is a regulatory T-cell,” Shevach said.

Some of the direct targets of Foxp3 identified in the two studies, such as members of the irf family, are transcription factors in their own right, indicating a second layer of regulation mediated by Foxp3. The target lists also include a number of genes for cell-surface molecules, such as CD28, and for signal transduction, such as Cdc42. “Some of these targets are red herrings,” cautioned Shevach. “Foxp3 may bind to them, but they may have nothing to do with regulatory cell function.”

The results from the two studies differ significantly. For instance, Zheng et al. noted that ctla4, an important T-cell inhibitor, was bound and strongly upregulated by Foxp3, but Marson et al. did not observe this. Conversely, while both studies found that Foxp3 bound to the receptor for IL2, a key player in immune response, only Marson et al. found IL2 itself to be a target. Further, while Zheng et al. determined that Foxp3 activated more genes than it suppressed, Marson et al. came to the opposite conclusion.

“What I found most striking was the amount of non-overlap between the two datasets,” said Steve Ziegler of the Benaroya Research Institute in Seattle. “This may reflect the fact that they used two different systems for their ChIP-on-chip analysis.”

Despite the discrepancies, experts said the studies would be a major help in research into immune tolerance. “Foxp3 is located in the nucleus and is hard to get at,” said Ziegler. “Downstream targets of it may be more accessible and give us more tractable surrogate markers of regulatory T-cells.”

Chandra Shekhar

Links within this article:
Y. Zheng et al., “Genome-wide analysis of Foxp3 target genes in developing and mature regulatory T cells,” Nature, January 2007.
A. Marson et al., “Foxp3 occupancy and regulation of key target genes during T-cell stimulation,” Nature, January 2007.
T.P. Toma, “Self-tolerance gene?” The Scientist, January 9, 2003.
M. Greener, “Hot on tolerance’s trail: The hunt for human Foxp3,” The Scientist, May 23, 2005.
F. Ramsdell, “Foxp3 and natural regulatory T cells: Key to a cell lineage?” Immunity, August 2003.
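A quick way to see why the “non-overlap” Ziegler describes matters: agreement between two target-gene lists can be quantified as a set intersection, or as a Jaccard index. The gene names below are illustrative placeholders, not the actual target lists from the two studies.

```python
# Sketch: quantifying overlap between two ChIP target-gene lists.
# Gene names here are placeholders, not the studies' real Foxp3 targets.
zheng = {"ctla4", "il2ra", "irf4", "cd28", "cdc42"}
marson = {"il2", "il2ra", "irf4", "cd28", "ptpn22"}

shared = zheng & marson          # genes both studies call bound
union = zheng | marson           # genes either study calls bound
jaccard = len(shared) / len(union)

print(f"shared targets: {sorted(shared)}")
print(f"Jaccard overlap: {jaccard:.2f}")
```

A low Jaccard value, as both datasets reportedly show relative to each other, signals either biological differences between the systems (ex vivo cells vs. hybridomas) or technical differences in the ChIP-on-chip platforms.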

32.  Lasker Winners Announced

This year’s prizes honor pioneering work on the unfolded protein response, deep-brain stimulation, and the discovery of cancer-related genes.

By Tracy Vence | September 8, 2014

Kazutoshi Mori (left), Peter Walter (right). ALBERT AND MARY LASKER FOUNDATION

Kazutoshi Mori of Kyoto University in Japan and Peter Walter of the University of California, San Francisco, have won the 2014 Lasker Award for basic medical research. Mori and Walter are being honored by the Albert and Mary Lasker Foundation for their work related to the unfolded protein response—a cellular stress response that has been implicated in several protein-folding diseases.

In its announcement, the foundation said that “Mori and Walter’s work has led to a better understanding of inherited diseases such as cystic fibrosis, retinitis pigmentosa, and certain elevated cholesterol conditions in which unfolded proteins overwhelm the unfolded protein response.”

Three years ago, the Lasker Foundation honored Franz-Ulrich Hartl and Arthur Horwich for their protein-folding work with its 2011 basic research award.

Meanwhile, Alim Louis Benabid of Joseph Fourier University in Grenoble, France, and Mahlon DeLong of the Emory University School of Medicine in Atlanta, Georgia, have won this year’s Lasker-DeBakey Clinical Medical Research Award for their work on deep-brain stimulation, which has been used to help restore motor function in patients with advanced Parkinson’s disease.

And the University of Washington’s Mary-Claire King has won the 2014 Lasker-Koshland Special Achievement Award in Medical Science for “bold, imaginative, and diverse contributions to medical science and human rights” related to her work to reunite missing persons or their remains with their families, as well as her discovery of the cancer-related BRCA1 gene locus. In a commentary published in JAMA today (September 8), King and her colleagues advocated for population-based screening for cancer-related genetic variants. “Population-wide screening will require significant efforts to educate the public and to develop new counseling strategies, but this investment will both save women’s lives and provide a model for other public health programs in genomic medicine,” they wrote.

This year’s recipients will receive a $250,000 honorarium per category. The awards will be presented on Friday, September 19, in New York City.

33.  Protein Binding

By Carolyn Bertozzi | October 28, 1996

Edited by: Thomas W. Durso
S.D. Rosen, C.R. Bertozzi, “The selectins and their ligands,” Current Opinion in Cell Biology, 6:663-73, 1994. (Cited in more than 60 publications through April 1996) Comments by Steven D. Rosen, University of California, San Francisco

The selectins are a trio of related proteins involved in leukocyte-endothelium interactions, affecting the ability of leukocytes (that is, white blood cells) to interact with blood vessel walls.

THREEPEAT: The selectins are a threesome of related proteins, says UC-San Francisco’s Steven Rosen.

“One of the novel aspects of the selectins is that they function as carbohydrate-binding receptor molecules; that is, they recognize specific carbohydrate structures as their ligands, or counter-receptors,” Rosen says. “This means that in principle, it’s possible to interrupt the function of selectins by determining what carbohydrates they bind to and providing mimics for those carbohydrates in the form of soluble small molecules, thereby arriving at a new class or classes of anti-inflammatory substances.”

The paper summarizes the three selectins and their physiological functions in leukocyte-endothelium interactions, and describes how they function. First identified at the molecular level in 1989 (L.M. Stoolman, Cell, 56:907-10, 1989), selectins are the topic of this review paper by Steven D. Rosen, a professor in the department of anatomy and program in immunology at the University of California, San Francisco, and Carolyn R. Bertozzi, a former postdoc in Rosen’s lab and now an assistant professor of chemistry at the University of California, Berkeley.

Rosen explains that with leukocytes moving from the blood into tissues, the leukocyte-endothelium interaction is critical to inflammatory reactions.

ONE PLACE: UC-Berkeley’s Carolyn Bertozzi, Rosen’s former postdoc, was coauthor of the review paper.

“Leukocytes in tissue sites are protecting the individual from bacterial invasions and foreign substances that the individual wants to eliminate, but leukocytes can have an arsenal of destructive capabilities which can be turned on the individual’s own tissues. So inflammatory reactions have a down side. There are a lot of inflammatory diseases, such as rheumatoid arthritis, multiple sclerosis, lupus, and other autoimmune diseases.”

“In many cases, inflammatory reactions lead to pathological problems,” he points out. “It’s a defense mechanism the body has, but leukocytes being in tissue sites can cause problems as well as be of value to the individual.”

He concludes: “The interest in the selectins was: Here’s a family of proteins that has involvement in leukocyte-endothelium interactions, therefore here’s a potential set of targets to prevent leukocyte entry into tissues and prevent inflammatory problems.”

Asked for his opinion on why this paper has been cited so much, Rosen replies: “There’s a huge amount of interest in the selectins, because there’s basic cell biology and biochemistry that everybody’s interested in here. . . . There’s a real convergence of the basic science with direct clinical applications. What you do in the lab can have immediate ramifications on the design of anti-inflammatory compounds. There’s tremendous biotech and pharmaceutical company interest in the selectins and their ligands.

“This has been a tremendously hot topic since 1989, and it will be for years to come. Our article put everything down in one place, from the basic cell biology to the clinical connections, and updated the carbohydrate information and ligand identification information in a very accessible way.”

In addition to reviewing the selectins, Rosen states, “the paper deals with what is known about the carbohydrates that the selectins recognize, and what is known about the macromolecules-the ligands-that carry these carbohydrates. What might make the carbohydrates that one selectin recognizes different from the carbohydrates that another selectin molecule might recognize-that is, what is the selectivity of carbohydrate binding among the three selectins?”

The paper also lists the animal models of inflammatory diseases in which selectins have been shown to play an important role, “where antagonism of the selectin leads to beneficial effects, in terms of decreasing damage,” Rosen notes.

Since the publication of this paper, he and Bertozzi have written a second review, updating ligand characterizations (S.D. Rosen, C.R. Bertozzi, Current Biology, 6:261-4, 1996).

“It has a lot more on carbohydrate specificity, and it’s got some new information on how one of the selectins recognizes its ligands,” Rosen notes. “Sulfation is important. At the time of the first review, sulfation was known to be important for the binding of one selectin to its ligands. . . . This review points to the importance of sulfation for the ligand of another selectin. The nature of the sulfation modifications of the ligands are very different for the two selectins.”

Additional LPBI articles:

MIT’s Promise for the MI Patient: A new cardiac patch uses Gold Nanowires to enhance Electrical Signaling between heart cells

Curator: Aviva Lev-Ari, PhD, RN

Nanotechnology and Heart Disease

Author and Curator:  Tilda Barliya PhD

AAAS February 14-18, 2013, Boston: Symposia – The Science of Uncertainty in Genomic Medicine

Reporter: Aviva Lev-Ari, PhD, RN

Robert S. Langer, Massachusetts Institute of Technology

Challenges and Opportunities at the Confluence of Biotechnology and Nanomaterials

Introduction to Tissue Engineering; Nanotechnology applications

Author, editor; Tilda Barliya PhD

Building a Drug-Delivery System(DDS): choice of polymers and drugs

Author: Tilda Barliya PhD

Read Full Post »

Augmentation of the ONTOLOGY of the 3D Printing Research

Curator: Larry H. Bernstein, MD, FCAP

Encore: Research Allows for 3D Printed Augmentation of Everyday Objects


If you have a 3D printer then you are likely overwhelmed by the sheer number of possible objects you can find online to print out. At the same time, the technology is somewhat limited, unless you have professional CAD skills or are incredibly creative. What, for instance, would you do if you wanted to take an everyday item such as a hot glue gun and print an attached stand for it, or turn your child’s favorite action figure into a magnet?

Four researchers at Carnegie Mellon University, Xiang ‘Anthony’ Chen, Stelian Coros, Jennifer Mankoff and Scott E. Hudson, believe that they can change this via a new WebGL-based tool under development called Encore.

Funded by the National Science Foundation under grant NSF IIS 1217929, Encore is a multifaceted tool which enables three different techniques to augment already existing objects. The researchers call these three techniques Print-Over, Print-to-Affix and Print-Through, all of which allow for the adherence of newly 3D printed attachments to other objects. Before we get into what each of these three techniques involves, one first should understand the computational pipeline involved in designing these attachments.

First, a user is required to design a basic attachment, or perhaps use an existing design they’d like to iterate upon. Once they have a basic model of the desired attachment, the Encore system geometrically analyzes the model along with a 3D scan of the target object the attachment will adhere to, in order to determine its printability while also deciding whether it will be durable enough and usable once attached. Next comes the interactive exploration phase, where the tool visualizes and explores the areas where the attachment can be affixed to the target object, adjusting the design of the attachment, if required, to better fit it. Once this is all settled, Encore generates a model that includes not only the attachment itself but also any connecting structures, and even supports to hold the target item or attachment in place. Below you will find the three techniques that the Encore tool can use to attach 3D printed items to a target object:
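Before turning to the three techniques, the analyze/explore/generate pipeline just described can be sketched in schematic form. All function names, fields, and return values below are hypothetical stand-ins for illustration, not Encore’s actual API (the real tool is WebGL-based).

```python
# Hypothetical sketch of Encore's three pipeline stages as described above.
# Names and return values are illustrative, not the tool's real interface.

def analyze(attachment_model, target_scan):
    """Geometric analysis: is the attachment printable, durable enough,
    and usable once adhered to the scanned target object?"""
    return {"printable": True, "durable": True, "usable": True}

def explore_placements(attachment_model, target_scan):
    """Interactive exploration: candidate regions where the attachment
    can be affixed, with any shape adjustments needed to fit."""
    return [{"region": "top", "adjusted_model": attachment_model}]

def generate(placement):
    """Output generation: final model including the attachment plus any
    connecting structures and supports that hold parts in place."""
    return {
        "attachment": placement["adjusted_model"],
        "connectors": ["strap"],
        "supports": ["target_holder"],
    }

report = analyze("glue_gun_stand", "glue_gun_scan")
if all(report.values()):
    placements = explore_placements("glue_gun_stand", "glue_gun_scan")
    printable = generate(placements[0])
    print(printable["supports"])
```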

Print-Over

This technique, in my opinion, is one of the coolest, as it allows for printing an attachment directly onto a given object. Once Encore establishes the required parameters and sizes up the model to fit the target object, it will automatically tell the printer to print supports that hold the target item in place. Once the target item is in place, Encore will then tell the printer to begin printing the attachment in a particular spot on the target item, based on its geometric analysis of that object.

“It is also important to ensure that the existing object will not impede the motion of the print head while the attachment is being printed,” warn the researchers.

An example used by the researchers for this technique was a magnet holder which they directly attached to a teddy bear figurine. They also printed an LED light onto a 9V battery by first placing a small amount of glue on the battery prior to beginning to print on top.

Print-to-Affix

This approach is similar to the Print-Over technique, except that instead of printing an attachment directly onto the target object, Encore analyzes the geometry of the target object prior to printing in order to create an attachment that will fit it perfectly via glue, straps (zip-ties) or even snaps. Once the attachment is printed, the user can then use hot glue or another adhesive or attachment mechanism to affix the printed object onto the target.

Print-Through

This technique is perfect for items which you don’t want to physically attach, but instead can connect to an object: for instance, a tag on the loop of a pair of scissors, or a charm on a bracelet. This process requires that the printer be paused while a user manually places the target object within the print field.

“Print-through has aesthetic qualities that distinguish it from print-to-affix and print-over – it typically creates a loose but permanent connection between two objects,” explained the researchers.

While the new Encore tool is still under development as researchers improve upon its analytical capabilities, it certainly seems to show promise to those of us wishing to do more than just fabricate new items. In fact, the researchers were able to show that via all three techniques they could save a substantial amount of time and material over printing an object in one single piece. As an example, they used Slic3r to estimate the print time and total material required for printing a typical Utah teapot with a torus-shaped handle, as well as just printing the handle onto an already fabricated Utah teapot. Their estimate showed that the time of fabricating the item could be cut by more than 80% and material use reduced by as much as 85% by using their techniques and the Encore tool.
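The reported savings can be sanity-checked with simple arithmetic. The absolute time and material figures below are invented for illustration; only the ratios, chosen to match the reported ~80% and 85% reductions, matter.

```python
# Back-of-envelope check of the reported savings: printing only the handle
# onto an existing Utah teapot vs. printing teapot + handle in one piece.
# Absolute numbers are invented; the percentage logic is what matters.
full_print_minutes, full_print_grams = 300.0, 120.0   # whole teapot + handle
handle_only_minutes, handle_only_grams = 55.0, 18.0   # handle alone

time_saving = 1 - handle_only_minutes / full_print_minutes
material_saving = 1 - handle_only_grams / full_print_grams

print(f"time saved: {time_saving:.0%}, material saved: {material_saving:.0%}")
```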

There are many variables going into the tool’s decision making algorithms, such as determining where to place attachments for the best balance when holding an object, what placement of an attachment will result in the best adhesion, etc. More research is still required as the team continues to develop the tool, as well as new techniques to attach multiple parts to one object, but it certainly seems like something which could have a sizable impact on the industry in general. Let us know your thoughts on the Encore tool in the Augmenting 3D Prints forum thread on

The Limits of 3D Printing

Matthias Holweg

Harvard Business Review Feb 2015

Contrary to what some say, 3D printing is not going to revolutionize the manufacturing sector, rendering traditional factories obsolete. The simple fact of the matter is the economics of 3D printing now and for the foreseeable future make it an unfeasible way to produce the vast majority of parts manufactured today. So instead of looking at it as a substitute for existing manufacturing, we should look to new areas where it can exploit its unique capabilities to complement traditional manufacturing processes.

Additive manufacturing, or “3D printing” as it is commonly known, has understandably captured the popular imagination: New materials that can be “printed” are announced virtually every day, and the most recent generation of printers can even print several materials at the same time, opening up new opportunities. Exciting applications have already been demonstrated across all sectors — from aerospace and medical applications to biotechnology and food production.

Some predict that a day is coming when we’ll be able to make any part at the push of a button at a local printer, which might even render the global supply lines that dominate today’s world of manufacturing a thing of the past. Unfortunately, this vision does not stack up to economic reality. Early findings from a research project being conducted by the Additive Manufacturing and 3D Printing Research Group at the University of Nottingham and Saïd Business School at the University of Oxford show that there are both significant scale and learning effects inherent in the 3D printing process. (The project, in which I am a principal investigator, is focusing on industrial selective laser sintering (or more accurately, melting) processes and not fused deposition modelling or stereolithography processes that are more suited to rapid prototyping and home applications.)

Furthermore, pre- and post-printing costs amount to a significant proportion of the total cost per printed part. So even when the cost of printers and materials comes down, the labor-cost penalty will remain.

3D printing simply works best in areas where customization is key — from printing hearing aids and dental implants to printing a miniature of the happy couple for their wedding cake. Using a combination of 3D scanning and printing, implants can be customized to specific anatomic circumstances in a way that was simply not feasible beforehand. However, we also know that 99% of all manufactured parts are standard and do not require customization. In these cases, 3D printing has to compete with scale-driven manufacturing processes and rather efficient logistics operations. A good example is the wrench  that NASA printed on the International Space Station last year. The cost of shipping it to the space station would have been at least $400 (assuming the unpackaged weight of 18 grams per wrench and using the most recent cost data given by NASA for transporting goods into lower-earth orbit); in comparison, shipping it from China to the United States would only cost $0.002 per unit. Thus, while it makes a lot of sense to print the wrench on the space station, printing it for local consumption in the United States wouldn’t.
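The wrench comparison can be reconstructed roughly as follows. The launch cost per kilogram is an assumed figure chosen to reproduce the article’s “$400 per 18-gram wrench” estimate, not an official NASA number; the sea-freight figure is taken from the article.

```python
# Rough reconstruction of the wrench shipping comparison.
# launch_cost_per_kg is an assumption tuned to match the article's $400
# estimate for an 18 g wrench; it is not an official NASA figure.
wrench_kg = 0.018
launch_cost_per_kg = 22_000.0    # assumed $/kg to low-earth orbit
sea_freight_per_unit = 0.002     # $/wrench, China to US (from the article)

orbit_cost = wrench_kg * launch_cost_per_kg
print(f"to the ISS: ${orbit_cost:.0f}; by sea: ${sea_freight_per_unit}")
print(f"ratio: {orbit_cost / sea_freight_per_unit:,.0f}x")
```

The five-orders-of-magnitude gap between the two shipping costs is why printing on the space station makes sense while printing for local consumption on Earth does not.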

The simple fact is that when customization isn’t important, 3D printing is not competitive. For one, printing costs per part are highly sensitive to the utilization of the “build room,” the three-dimensional area inside the 3D printer where the laser fuses the metal or plastic powder. Therefore, contract manufacturers that perform 3D printing, such as Shapeways, generally wait to fill a batch that uses the entire build room. Printing just one part raises unit cost considerably, so economies of scale do matter. Interestingly, the economic case for the most-cited standard part in 3D volume production today, the GE fuel nozzle for the CFM LEAP engine, is that it is lighter and more fuel efficient, not that it has a lower manufacturing cost per se.
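The sensitivity to build-room utilization follows from simple fixed-cost sharing: a print run has a large fixed cost (machine time, setup) that is divided across however many parts fill the build room. Every figure below is invented for illustration.

```python
# Illustrative unit economics of build-room utilization in selective
# laser sintering: the fixed cost of a build is shared across the parts
# that fill it. All cost figures are hypothetical.
fixed_cost_per_run = 500.0     # machine time + setup per build, $
material_cost_per_part = 2.0   # powder actually fused per part, $
capacity = 100                 # parts that fit in a full build room

def unit_cost(parts_in_run):
    return fixed_cost_per_run / parts_in_run + material_cost_per_part

print(f"full build room: ${unit_cost(capacity):.2f}/part")
print(f"single part:     ${unit_cost(1):.2f}/part")
```

Under these assumed numbers a full build room yields $7 per part while a single-part run costs $502, which is exactly why contract printers batch jobs.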

A second point often overlooked is the labor cost that remains. Counter to common perception, 3D printing does not happen “at the touch of a button”; it involves considerable pre- and post-processing, which incurs non-trivial labor costs. The starting point for any 3D printing process is a 3D file that can be “printed.” Just having an electronic CAD drawing is not sufficient; currently, there is no way to automatically convert a CAD drawing into a printable 3D file.

Creating printable files involves two steps: creating a three-dimensional volume model that can be printed, and “slicing” that volume model in the best possible way to avoid material wastage and prevent printing errors. Both steps require tacit knowledge. Following the printing, the parts produced have to be recovered, cleaned, washed (or sanded and polished, in the case of metal prints), and inspected. This, in turn, means that using 3D printing for aftermarket services — an application where it makes a lot of sense — requires a significant upfront investment in generating the printable files of the spare parts that would likely be needed. This investment would have to outweigh the cost of keeping a lifetime supply of spare parts in inventory, which is a tough call for the small bolts, brackets, and connectors that make up the bulk of aftermarket demand.
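The inventory-versus-digitization trade-off can be framed as a break-even comparison. Every number below is hypothetical; the point is the structure of the decision, not the specific figures.

```python
# Hypothetical break-even: digitize spare parts upfront for on-demand
# printing, or hold a lifetime supply in inventory? All numbers invented.
file_prep_cost_per_part = 150.0    # labor to create one printable 3D file
n_part_types = 2_000               # distinct small spares in the catalog

inventory_value = 50_000.0         # lifetime stock of those spares, $
annual_holding_rate = 0.25         # storage, obsolescence, cost of capital
support_years = 20

digitize = file_prep_cost_per_part * n_part_types
hold = inventory_value * annual_holding_rate * support_years

print(f"digitize upfront: ${digitize:,.0f}")
print(f"hold inventory:   ${hold:,.0f}")
print("3D printing wins" if digitize < hold else "inventory wins")
```

With these assumed numbers the inventory option comes out cheaper, matching the article’s point that the call is a tough one for cheap, numerous parts; a catalog of fewer, more expensive parts would tip the comparison the other way.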

So while I, like many others, have fallen in love with the notion of the “ultimate lean supply chain” of having 3D printers at every other corner table to print single parts just in time where they are needed, I am afraid that this vision does not stack up against reality. 3D printing technology undoubtedly has great potential. However, it is unlikely to replace traditional manufacturing. Instead, we should see it as a complement, a new tool in the box, and exploit its unique capabilities — both in making existing products better as well as being able to manufacture entirely new ones that we previously could not make.

Medical implants and printable body parts to drive 3D printer growth

With 3D bio-printing in the pipeline, dental and medical applications could be worth $6bn by 2025

False teeth, hip joints and replacement knees – and potentially printable skin and organs – will drive growth in the burgeoning market for 3D printers over the next decade, according to new research.

A report suggests that dentistry and medicine will increasingly harness one of the 21st century’s most exciting technological breakthroughs.

The technology is better known to British households for its ability to replace broken crockery or produce awkward figurine “selfies.”

But a report by Cambridge-based market research firm IDTechEx says ceramic jaw or teeth implants and metal hip replacements will become increasingly common 3D fare.

The parts are created by nozzles laying down fine sedimentary layers of material that build a product indistinguishable from an item that has rolled off a factory conveyor belt.

The dental and medical market for 3D printers is expected to expand by 365% to $867m (£523m) by 2025, according to IDTechEx analysts, even before bio-printing technology is taken into account. If bio-printing becomes suitable for commercial use – which scientists hope will allow the printing of pieces of skin, liver or kidney using live cells – analysts estimate the medical market could reach a value of $6bn or more within 10 years.
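The growth figures imply a current market size and a compound annual growth rate that can be back-calculated. The 10-year horizon below is an assumption based on the report’s 2025 target and its mid-decade publication context.

```python
# Back-calculating what "expand by 365% to $867m by 2025" implies.
# The 10-year horizon is an assumption, not stated in the report.
final_market_m = 867.0
growth = 3.65                      # "expand by 365%"
years = 10

base_market_m = final_market_m / (1 + growth)
cagr = (1 + growth) ** (1 / years) - 1

print(f"implied current market: ${base_market_m:.0f}m")
print(f"implied CAGR: {cagr:.1%}")
```

The arithmetic implies a current dental and medical 3D-printing market of roughly $186m and a compound growth rate in the mid-teens per year under the assumed horizon.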

While printing of complete organs for transplants may be decades away, the use of pieces of tissue for laboratory toxicology tests for cosmetics or drugs could be ready within five years, helping the medical market for 3D printers overtake all other sectors.

Dr Jon Harrop, a director of IDTechEx, said: “Bio printing is a bit unsure as it doesn’t exist commercially at the moment but all the medical professionals we interviewed thought it was highly likely to be commercial within 10 years.”

In the US, dental labs have invested in technology that can scan a patient’s teeth so new teeth can be produced by pressing the print button.

Harrop said there are a number of stumbling blocks in the way of the commercial application of bio printing, but even in the past year, scientists have been able to extend the life of a piece of skin tissue created in the lab from just a few hours to 40 days, taking it closer to the three months required for toxicology tests.

At present, 3D printers are most widely used in the automotive industry where they help produce prototypes for new cars or car parts. The next biggest market is aerospace, where manufacturers are using the technology to make lighter versions of complex parts for aeroplanes.

Already, 3D printers have been used by the medical industry to create a jaw, a pelvis and several customised hip replacements from metal. This year, surgeons in Newcastle upon Tyne created a titanium pelvis for a man who lost half his original one to a rare bone cancer, while in May doctors in Southampton completed Britain’s first hip replacement made using a 3D printer. Professor Richard Oreffo at the University of Southampton, who helped develop the hip replacement technique, said at the time: “The 3D printing of the implant in titanium, from CT scans of the patient and stem cell graft, is cutting edge and offers the possibility of improved outcomes for patients.”

Dentists have been using 3D printers to create exact replicas of jaws or teeth in order to aid complex procedures for a few years, but increasingly they are creating implants made of durable plastic or medical ceramics.

The Future of 3D Printing

By Stephen F. DeAngelis

In his most recent State of the Union address, President Barack Obama stated, “Last year, we created our first manufacturing innovation institute in Youngstown, Ohio. A once-shuttered warehouse is now a state-of-the art lab where new workers are mastering the 3D printing that has the potential to revolutionize the way we make almost everything. There’s no reason this can’t happen in other towns. So tonight, I’m announcing the launch of three more of these manufacturing hubs, where businesses will partner with the Department of Defense and Energy to turn regions left behind by globalization into global centers of high-tech jobs. And I ask this Congress to help create a network of 15 of these hubs and guarantee that the next revolution in manufacturing is made right here in America.” [“Remarks by the President in the State of the Union Address,” White House, 12 February 2013]

Official White House Photo by Chuck Kennedy (not shown)

Clearly, the President believes that 3D printing marks a new era in manufacturing that could change the entire business landscape. Kinaxis analyst Andrew Bell agrees that “change is inevitable.” “The question that we need to answer,” he writes, “is how will it change?” [“What Could 3D Printing Mean for the Supply Chain?” The 21st Century Supply Chain, 9 January 2013] Bell offers a few thoughts on the subject; he writes:

“One thing is for sure, the supply chain isn’t going away. As usual, it will likely just get more complicated. Here are some of the areas that I propose will influence the supply chain as 3D printing becomes more and more mainstream, and I’m sure there are many more.

  • Local Manufacturing – More things will be made closer to their final destination. This will have a definite impact on the logistics industry, and will change the way businesses try to schedule their operations.
  • Customizability – It will be easier, faster, and more efficient for companies to provide made-to-order products to their end users.
  • Distribution of raw materials – There will need to be a dramatic shift in the way raw materials are distributed since these printers will require raw materials in order to produce the final product.
  • New replacement parts model – Businesses will be able to provide replacement parts as required instead of trying to predict the need and manufacture the stock well in advance (as they do today).
  • Blurred boundaries within businesses – A closer integration of the various departments of an organization will be mandatory. A siloed manufacturing department will no longer allow for a competitive business.

“My predictions may be right, or they may be wrong, but one thing I think all will agree on is that 3D printing will make the supply chain more complex and more difficult to manage.”

Jim Stockton notes that 3D printing (or additive manufacturing) has been around since the 1980s. It has only broken into the mainstream because of recent breakthroughs that now have everyone talking. [“Top Innovations in the World of 3D Printing,” BestDesignTuts, 12 February 2013] He writes:

“Some products produced with this technology that might already be a part of your daily life include shoes, jewellery, clothing accessories, and educational products. The field of engineering is also benefitting from the development of 3D printing, in terms of boosting geographic information systems, aerospace engineering, engineering projects, and construction processes. Furthermore, humans are already benefitting in the field of healthcare with dental and medical instruments and products being formulated with 3D printing. The automotive industry is able to make more reliable products at a quicker speed, and the industrial design and architecture fields are able to develop new products that were inconceivable only a decade ago.”

What is really exciting people, however, is not so much what is currently being manufactured but the potential of what could be manufactured using 3D printers. Stockton writes:

“Engineers and scientists around the world are looking forward to the near future, in which printers will be able to create equipment, tools and devices via open-source models. This kind of advancement will drastically change the ways in which research and practical medicine are performed. Chemists are even attempting to build chemical compounds using 3D printers, and scientists have already started to replicate fossils and other ancient materials to better understand their compositions and functions. Architects wonder if in the future, buildings themselves can be printed from the ground up. The future of 3D printing is very bright – both in small-scale products for individual users and for large-scale projects, such as printing meters of building materials in the course of an hour and making intricate parts of automobiles and planes.”

Vivek Srinivasan and Jarrod Bassan offer ten trends that will likely define the direction that will be taken by additive manufacturing in the years ahead. [“Manufacturing The Future: 10 Trends To Come In 3D Printing,” Forbes, 7 December 2012] They are:

  1. 3D printing becomes industrial strength. Once reserved for prototypes and toys, 3D printing will become industrial strength. You will take a flight on an airliner that includes 3D-printed components, making it lighter and more fuel efficient. In fact, some aircraft already contain 3D-printed components. The technology will also start to be adopted for the direct manufacture of specialist components in industries like defense and automotive. Overall, the number of 3D-printed parts in planes, cars and even appliances will increase without your noticing.
  2. 3D printing starts saving lives. 3D-printed medical implants will improve the quality of life of someone close to you. Because 3D printing allows products to be custom-matched to an exact body shape, it is being used today for making better titanium bone implants, prosthetic limbs and orthodontic devices. Experiments in printing soft tissue are underway, and may soon allow printed veins and arteries to be used in operations. Today’s research into medical applications of 3D printing covers nano-medicine, pharmaceuticals and even printing of organs. Taken to the extreme, 3D printing could one day enable custom medicines and reduce if not eliminate the organ donor shortage.
  3. Customization becomes the norm. You will buy a product, customized to your exact specifications, which is 3D-printed and delivered to your doorstep. Innovative companies will use 3D printing technologies to give themselves a competitive advantage by offering customization at the same price as their competitor’s standard products. At first this may range from novelty items like custom smartphone cases or ergonomic improvements to standard tools, but it will rapidly expand to new markets. The leaders will adjust their sales, distribution and marketing channels to take advantage of their capability to provide customization direct to the customer. Customization will also play a big role in healthcare devices such as 3D-printed hearing aids and artificial limbs.
  4. Product innovation is faster. Everything from new car models to better home appliances will be designed more rapidly, bringing innovation to you faster. Because rapid prototyping using 3D printers reduces the time to turn a concept into a production-ready design, it allows designers to focus on the function of products. Although the use of 3D printing for rapid prototyping is not new, the rapidly decreasing cost, improved design software and increasing range of printable materials means designers will have more access to printers, allowing them to innovate faster by 3D printing an object early in the design phase, modifying it, re-printing it, and so on. The result will be better products, designed faster.
  5. New companies develop innovative business models built on 3D printing. You will invest in a 3D printing company’s IPO. Start-up companies will flourish as a generation of innovators, hackers and “makers” take advantage of the capabilities of 3D printing to create new products or deliver services to the burgeoning 3D printer market. Some enterprises will fail, and there may be a boom-bust cycle, but 3D printing will spawn new and creative business models.
  6. 3D print shops open at the mall. 3D print shops will begin to appear, at first servicing local markets with high-quality 3D printing services. Initially designed to service rapid-prototyping and other niche capabilities, these shops will branch into the consumer marketplace. As retailers begin to “ship the design, not the product,” the local 3D print shop will one day be where you pick up your customized, locally manufactured products, just like you pick up your printed photos from the local Walmart today.
  7. Heated debates on who owns the rights emerge. As manufacturers and designers start to grapple with the prospect of their copyrighted designs being replicated easily on 3D printers, there will be high-profile test cases over the intellectual property of physical object designs. Just like file-sharing sites shook the music industry because they made it easy to copy and share music, the ability to easily copy, share, modify and print 3D objects will ignite a new wave of intellectual property issues.
  8. New products with magical properties will tantalize us. New products – that can only be created on 3D printers – will combine new materials, nano scale and printed electronics to exhibit features that seem magical compared to today’s manufactured products. These printed products will be desirable and have distinct competitive advantage. The secret sauce is that 3D printing can control material as it is printed, right down to the molecules and atoms. As today’s research is perfected into tomorrow’s commercially available printers, expect exciting and desirable new products with amazing capabilities. The question is: What are these products and who will be selling them?
  9. New machines grace the factory floor. Expect to see 3D printing machines appearing in factories. Already some niche components are produced more economically on 3D printers, but this is only on a small scale. Many manufacturers will begin experimenting with 3D printing for applications outside of prototyping. As the capabilities of 3D printers develop and manufacturers gain experience in integrating them into production lines and supply chains, expect hybrid manufacturing processes that incorporate some 3D-printed components. This will be further fueled by consumers desiring products that require 3D printers for their manufacture.
  10. “Look what I made!” Your children will bring home 3D printed projects from school. Digital literacy – including Web and app development, electronics, collaboration and 3D design – will be supported by 3D printers in schools. A number of middle schools and high schools already have 3D printers. As 3D printing costs continue to fall, more schools will sign on. Digital literacy will be about things as well as bits.

If, as President Obama believes, a manufacturing revolution, led by 3D printing, is coming, it behooves business leaders in every field to ask themselves how it could affect their business model and assumptions about the future.

Research Leads to the 3D Printing of Pure Graphene Nanostructures


Researchers in Korea have successfully 3D printed graphene nano-structures without the use of any other material. With the entire printed structure being composed of graphene, the strength, as well as full conductivity of the material can be taken advantage of.

3D printing graphene

There is no question that graphene has enormous potential, from solar cell technology to electronics to medicine. A key factor in developing practical and commercial applications of the one-atom-thick carbon sheets is aligning the material in the desired form, depending on the application.

Now 3D printing of graphene is nearing a feasible stage, and companies such as Graphene 3D Lab are at the forefront of the technology.

However, there is a difference between 3D printing pure graphene and 3D printing graphene/thermoplastic composites, as Graphene 3D has been doing.

Printing with composite materials, using a typical FDM/FFF or powder-based laser sintering process, will keep some of graphene’s superior properties intact, but most will be lost. The plastic will eventually break down, leaving any prints weak and not much different from a typical object you’d print with a MakerBot Replicator.

Now researchers led by Professor Seung Kwon Seol from the Korea Electrotechnology Research Institute (KERI) have published a paper in Advanced Materials in which they describe a new process for directly 3D printing pure graphene.

Their technique means that graphene nano-structures can be fabricated without the use of any other material. With the entire printed structure composed of graphene, both the strength and the full conductivity of the material can be realized.


“We developed a nanoscale 3D printing approach that exploits a size-controllable liquid meniscus to fabricate 3D reduced graphene oxide (rGO) nanowires,” Seol told Nanowerk. “Different from typical 3D printing approaches which use filaments or powders as printing materials, our method uses the stretched liquid meniscus of ink. This enables us to realize finer printed structures than a nozzle aperture, resulting in the manufacturing of nanostructures.”
“So far, to the best of our knowledge, nobody has reported 3D printed nanostructures composed entirely of graphene,” says Seol. “Several results reported the 3D printing (millimeter- or centimeter-scale) of graphene or carbon nanotube/plastic composite materials by using a conventional 3D printer. In such composite system, the graphene (or CNT) plays an important role for improving the properties of plastic materials currently used in 3D printers. However, the plastic materials used for producing the composite structures deteriorate the intrinsic properties of graphene (or CNT).”

“We are convinced that this approach will present a new paradigm for implementing 3D patterns in printed electronics,” says Seol.

For their technique, the team grew graphene oxide (GO) wires at room temperature using the meniscus formed at the tip of a micropipette filled with a colloidal dispersion of GO sheets, then reduced it by thermal or chemical treatment (with hydrazine).

The deposition of GO was obtained by pulling the micropipette as the solvent rapidly evaporated, thus enabling the growth of GO wires. The researchers were able to accurately control the radius of the rGO wires by tuning the pulling rate of the pipette; they managed to reach a minimum value of ca. 150 nm.
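The inverse dependence of wire radius on pulling rate can be illustrated with a toy volume-conservation sketch (an illustrative model, not the paper's own analysis; all numbers are hypothetical): if solid material is deposited at the meniscus at a roughly constant volumetric rate, the wire cross-section must shrink as the pipette is pulled faster.

```python
import math

def wire_radius_um(deposition_rate_um3_s, pull_rate_um_s):
    """Toy volume-conservation estimate: the solid deposited per unit time
    must fill the cylinder swept out by pulling the pipette,
    pi * r^2 * v = Q  =>  r = sqrt(Q / (pi * v))."""
    return math.sqrt(deposition_rate_um3_s / (math.pi * pull_rate_um_s))

# Hypothetical deposition rate: doubling the pulling rate thins the
# wire by a factor of sqrt(2), in line with the reported rate control.
r_slow = wire_radius_um(0.01, 1.0)
r_fast = wire_radius_um(0.01, 2.0)
print(r_slow, r_fast)
```

Under this crude model the radius scales as v^(-1/2), so finer wires simply require faster pulling, down to whatever limit solvent evaporation and ink stability impose.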

Using this technique, they were able to produce arrays of different freestanding rGO architectures, grown directly at chosen sites and in different directions: straight wires, bridges, suspended junctions, and woven structures.

Seol points out that this 3D nanoprinting approach can be used for manufacturing 2D patterns and 3D geometry in diverse devices such as printed circuit boards, transistors, light emitting devices, solar cells, sensors and so on.

A lot of work remains to reduce the 3D printable size to below 10 nm and increase the production yield.

Nanoparticle Solar Cells May Drive Down Price of Solar Cells

University of Alberta researchers have found that abundant materials in the Earth’s crust can be used to make inexpensive and easily manufactured nanoparticle-based solar cells.

The research, which was supported by the Natural Sciences and Engineering Research Council of Canada, is published in the latest issue of ACS Nano.

The discovery, several years in the making, is an important step forward in making solar power more accessible to parts of the world that are off the traditional electricity grid or face high power costs, such as the Canadian North, said researcher Jillian Buriak, a chemistry professor and senior research officer of the National Institute for Nanotechnology based on the U of A campus.

Buriak and her team have designed nanoparticles that absorb light and conduct electricity from two very common elements: phosphorus and zinc. Both elements are more plentiful than scarce materials such as cadmium and are free from the manufacturing restrictions imposed on lead-based nanoparticles.

“Half the world already lives off the grid, and with demand for electrical power expected to double by the year 2050, it is important that renewable energy sources like solar power are made more affordable by lowering the costs of manufacturing,” Buriak said.

“My goal is that a store like Ikea could sell rolls of these things with simple instructions and baggies of screws and do-dads and you could install them yourself,” said Buriak.

Her team’s research supports a promising approach of making solar cells cheaply using mass manufacturing methods like roll-to-roll printing (as with newspaper presses) or spray-coating (similar to automotive painting). “Nanoparticle-based ‘inks’ could be used to literally paint or print solar cells of precise compositions,” Buriak said.

Buriak collaborated with U of A post-doctoral fellows Erik Luber of the U of A Faculty of Engineering and Hosnay Mobarok of the Faculty of Science to create the nanoparticles. The team was able to develop a synthetic method to make zinc phosphide nanoparticles, and demonstrated that the particles can be dissolved to form an ink and processed to make thin films that are responsive to light.

Buriak and her team are now experimenting with the nanoparticles, spray-coating them onto large solar cells to test their efficiency. The team has applied for a provisional patent and has secured funding to enable the next step to scale up for manufacturing.

Graphene-Based Solar Cells Get Major Boost

graphene on silicon

Graphene has extreme conductivity and is completely transparent while being inexpensive and nontoxic. This makes it a perfect candidate material for transparent contact layers for use in solar cells to conduct electricity without reducing the amount of incoming light – at least in theory. Whether or not this holds true in a real-world setting is questionable, as there is no such thing as “ideal” graphene – a free-floating, flat honeycomb structure consisting of a single layer of carbon atoms: interactions with adjacent layers can change graphene’s properties dramatically.

The research recently appeared in the journal Applied Physics Letters.

“We examined how graphene’s conductive properties change if it is incorporated into a stack of layers similar to a silicon based thin film solar cell and were surprised to find that these properties actually change very little,” Marc Gluba explains.

To this end, they grew graphene on a thin copper sheet, next transferred it to a glass substrate, and finally coated it with a thin film of silicon. They examined two different versions that are commonly used in conventional silicon thin-film technologies: one sample contained an amorphous silicon layer, in which the silicon atoms are in a disordered state similar to a hardened molten glass; the other sample contained polycrystalline silicon to help them observe the effects of a standard crystallization process on graphene’s properties.
Even though the morphology of the top layer changed completely as a result of being heated to a temperature of several hundred degrees Celsius, the graphene is still detectable. “That’s something we didn’t expect to find, but our results demonstrate that graphene remains graphene even if it is coated with silicon,” says Norbert Nickel.

Their measurements of carrier mobility using the Hall-effect showed that the mobility of charge carriers within the embedded graphene layer is roughly 30 times greater than that of conventional zinc oxide based contact layers.
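The mobility figure above comes from a Hall-effect measurement; for a thin film it reduces to a one-line formula, sketched below with purely illustrative numbers (neither the resistances nor the carrier densities are taken from the paper).

```python
ELEMENTARY_CHARGE = 1.602176634e-19  # coulombs

def hall_mobility_cm2_Vs(sheet_resistance_ohm_sq, sheet_density_cm2):
    """Carrier mobility of a thin film from Hall data:
    mu = 1 / (q * n_s * R_s), in cm^2/(V*s) when n_s is per cm^2."""
    return 1.0 / (ELEMENTARY_CHARGE * sheet_density_cm2 * sheet_resistance_ohm_sq)

# Illustrative values only (not the paper's data): at equal carrier density,
# a 30x higher sheet resistance corresponds to a 30x lower mobility.
mu_graphene = hall_mobility_cm2_Vs(600.0, 1.0e13)
mu_zno_contact = hall_mobility_cm2_Vs(600.0 * 30.0, 1.0e13)
print(mu_graphene / mu_zno_contact)  # ~30
```

The same carrier density is deliberately assumed for both layers here, isolating sheet resistance as the variable; in a real comparison both quantities would come from the Hall measurement itself.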

Says Gluba: “Admittedly, it’s been a real challenge connecting this thin contact layer, which is but one atomic layer thick, to external contacts. We’re still having to work on that.” Adds Nickel: “Our thin film technology colleagues are already pricking up their ears and wanting to incorporate it.” The researchers obtained their measurements on one square centimeter samples, although in practice it is feasible to coat much larger areas than that with graphene.

SOURCE  Helmholtz Zentrum Berlin

Fabricated data bodies: Reflections on 3D printed digital body objects in medical and health domains

Deborah Lupton

Social Theory & Health 13, 99-115 (May 2015) |

The advent of 3D printing technologies has generated new ways of representing and conceptualizing health and illness, medical practice and the body. There are many social, cultural and political implications of 3D printing, but a critical sociology of 3D printing is only beginning to emerge. In this article I seek to contribute to this nascent literature by addressing some of the ways in which 3D printing technologies are being used to convert digital data collected on human bodies and fabricate them into tangible forms that can be touched and held. I focus in particular on the use of 3D printing to manufacture non-organic replicas of individuals’ bodies, body parts or bodily functions and activities. The article is also a reflection on a specific set of digital data practices and the meaning of such data to individuals. In analyzing these new forms of human bodies, I draw on sociomaterialist perspectives as well as the recent work of scholars who have sought to theorize selfhood, embodiment, place and space in digital society and the nature of people’s interactions with digital data. I argue that these objects incite intriguing ways of thinking about the ways in which digital data on embodiment, health and illnesses are interpreted and used across a range of contexts. The article ends with some speculations about where these technologies may be headed and outlines future research directions.

Osteoconduction and osteoinduction of low-temperature 3D printed bioceramic implants.

Habibovic P, Gbureck U, Doillon CJ, Bassett DC, van Blitterswijk CA, Barralet JE

Europe Pubmed Central

Rapid prototyping is a valuable implant production tool that enables the investigation of individual geometric parameters, such as shape, porosity, pore size and permeability, on the biological performance of synthetic bone graft substitutes. In the present study, we have employed low-temperature direct 3D printing to produce brushite and monetite implants with different geometries. Blocks predominantly consisting of brushite with channels either open or closed to the exterior were implanted on the decorticated lumbar transverse processes of goats for 12 weeks. In addition, similar blocks with closed channel geometry, consisting of either brushite or monetite were implanted intramuscularly. The design of the channels allowed investigation of the effect of macropore geometry (open and closed pores) and osteoinduction on bone formation orthotopically. Intramuscular implantation resulted in bone formation within the channels of both monetite and brushite, indicating osteoinductivity of these resorbable materials. Inside the blocks mounted on the transverse processes, initial channel shape did not seem to significantly influence the final amount of formed bone and osteoinduction was suggested to contribute to bone formation.

Read Full Post »

Nanotechnology Used for Prevention of Bone Infection

Larry H Bernstein, MD, FCAP, Reporter


Functionalised nanoscale coatings using layer-by-layer assembly for imparting antibacterial properties to polylactide-co-glycolide surfaces

Piergiorgio Gentile, Maria E. Frongia, Mar Cardellach, Cheryl A. Mille

In order to achieve high local biological activity and reduce the risk of side effects of antibiotics in the treatment of periodontal and bone infections, a localised and temporally controlled delivery system is desirable. The aim of this research was to develop a functionalised and resorbable surface to contact soft tissues to improve the antibacterial behaviour during the first week after its implantation in the treatment of periodontal and bone infections. Solvent-cast poly(d,l-lactide-co-glycolide acid) (PLGA) films were aminolysed and then modified by the Layer-by-Layer technique to obtain a nano-layered coating using poly(sodium 4-styrenesulfonate) (PSS) and poly(allylamine hydrochloride) (PAH) as polyelectrolytes. The water-soluble antibiotic, metronidazole (MET), was incorporated from the ninth layer. Infrared spectroscopy showed that the PSS and PAH absorption bands increased with the layer number. The contact angle values had a regular alternate behaviour from the ninth layer. X-ray Photoelectron Spectroscopy evidenced two distinct peaks, N1s and S2p, indicating PAH and PSS had been introduced. Atomic Force Microscopy showed the presence of polyelectrolytes on the surface with a measured roughness of about 10 nm after 20 layers’ deposition. The drug release was monitored by Ultraviolet–visible spectroscopy showing 80% loaded-drug delivery in 14 days. Finally, the biocompatibility was evaluated in vitro with L929 mouse fibroblasts and the antibacterial properties were demonstrated successfully against the keystone periodontal bacteria Porphyromonas gingivalis, which has an influence on implant failure, without compromising in vitro biocompatibility. In this study, PLGA was successfully modified to obtain a localised and temporally controlled drug delivery system, demonstrating the potential value of LbL as a coating technology for the manufacture of medical devices with advanced functional properties.
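The UV-vis release monitoring described in the abstract rests on the Beer-Lambert law, which converts an absorbance reading into a drug concentration and hence a cumulative release fraction. A minimal sketch, where only metronidazole's approximate molar mass is a real constant; the absorptivity, release-medium volume, loading and reading below are illustrative assumptions, not the study's data:

```python
def beer_lambert_conc_mol_L(absorbance, epsilon_L_mol_cm, path_cm=1.0):
    """Beer-Lambert law: A = epsilon * c * l  =>  c = A / (epsilon * l)."""
    return absorbance / (epsilon_L_mol_cm * path_cm)

def percent_released(absorbance, epsilon_L_mol_cm, volume_L,
                     molar_mass_g_mol, loaded_mg, path_cm=1.0):
    """Turn one UV-vis reading into cumulative % of the loaded drug released."""
    conc = beer_lambert_conc_mol_L(absorbance, epsilon_L_mol_cm, path_cm)
    released_mg = conc * volume_L * molar_mass_g_mol * 1000.0
    return 100.0 * released_mg / loaded_mg

# Metronidazole's molar mass is ~171.16 g/mol; every other number here
# (absorptivity, release-medium volume, loading, reading) is hypothetical.
pct = percent_released(absorbance=0.55, epsilon_L_mol_cm=3000.0,
                       volume_L=0.01, molar_mass_g_mol=171.16, loaded_mg=0.4)
print(pct)  # roughly 78% of the loaded dose
```

Repeating this conversion for each sampling time point yields the cumulative release curve the study reports over 14 days.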
functionalized coating at nanoscale dimension



Read Full Post »

Imaging-guided cancer treatment

Writer & reporter: Dror Nir, PhD

It is estimated that the medical imaging market will exceed $30 billion in 2014 (FierceMedicalImaging). To put this amount in perspective: the global pharmaceutical market size for the same year is expected to be ~$1 trillion (IMS), while global health care spending as a percentage of Gross Domestic Product (GDP) will average 10.5% globally in 2014 (Deloitte); it will reach ~$3 trillion in the USA.

Recent technology advances, mainly miniaturization and improvements in electronic processing components, are driving increased introduction of innovative medical-imaging devices into critical nodes of the management pathways of major diseases. Consequently, in contrast to its very small contribution to global health costs, medical imaging bears outstanding potential to reduce the future growth in spending on major segments of this market, mainly: drug development and regulation (e.g. companion diagnostics and imaging surrogate markers); disease management (e.g. non-invasive diagnosis, guided treatment and non-invasive follow-up); and monitoring of the aging population (e.g. imaging-based domestic sensors).
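The relative scale of these figures is easy to check against the numbers quoted above:

```python
imaging_market = 30e9    # medical imaging market estimate for 2014, USD
pharma_market = 1e12     # global pharmaceutical market estimate, USD
us_health_spend = 3e12   # US health care spending estimate, USD

# Imaging is ~3% of the pharma market and ~1% of US health spending,
# which is why its contribution to overall health costs is called very small.
print(imaging_market / pharma_market)    # 0.03
print(imaging_market / us_health_spend)  # 0.01
```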

In The Role of Medical Imaging in Personalized Medicine I discussed at length the role medical imaging assumes in drug development. Integrating imaging into drug development processes, specifically at the early stages of drug discovery, as well as for monitoring drug delivery and the response of targeted processes to the therapy, is a growing trend. A nice (and short) review highlighting the processes, opportunities and challenges of medical imaging in new drug development is: Medical imaging in new drug clinical development.

The following is dedicated to the role of imaging in guiding treatment.

Precise treatment is a major pillar of modern medicine. An important aspect of enabling accurate administration of treatment is complementing the accurate identification of the organ location that needs to be treated with a system and methods that ensure application of treatment only, or mainly, to that location. Imaging is, of course, a major component of such composite systems. Amongst the available solutions, functional-imaging modalities are gaining traction. Specifically, molecular imaging (e.g. PET, MRS) allows the visual representation, characterization, and quantification of biological processes at the cellular and subcellular levels within intact living organisms. In oncology, it can be used to depict the abnormal molecules as well as the aberrant interactions of altered molecules on which cancers depend. Being able to detect such fundamental fingerprints of cancer is key to improved matching between drug-based treatment and disease. Moreover, imaging-based quantified monitoring of changes in tumor metabolism and its microenvironment could provide a real-time, non-invasive tool to predict the evolution and progression of primary tumors, as well as the development of tumor metastases.

A recent review-paper: Image-guided interventional therapy for cancer with radiotherapeutic nanoparticles nicely illustrates the role of imaging in treatment guidance through a comprehensive discussion of; Image-guided radiotherapeutic using intravenous nanoparticles for the delivery of localized radiation to solid cancer tumors.

 Graphical abstract


One of the major limitations of current cancer therapy is the inability to deliver tumoricidal agents throughout the entire tumor mass using traditional intravenous administration. Nanoparticles carrying beta-emitting therapeutic radionuclides [DN: radioactive isotopes that emit electrons as part of the decay process; a list of β-emitting radionuclides used in radiotherapeutic nanoparticle preparation is given in Table 1 of this paper] that are delivered using advanced image-guidance have significant potential to improve solid tumor therapy. The use of image-guidance in combination with nanoparticle carriers can improve the delivery of localized radiation to tumors. Nanoparticles labeled with certain beta-emitting radionuclides are intrinsically theranostic agents that can provide information regarding distribution and regional dosimetry within the tumor and the body. Image-guided thermal therapy results in increased uptake of intravenous nanoparticles within tumors, improving therapy. In addition, nanoparticles are ideal carriers for direct intratumoral infusion of beta-emitting radionuclides by convection enhanced delivery, permitting the delivery of localized therapeutic radiation without the requirement of the radionuclide exiting from the nanoparticle. With this approach, very high doses of radiation can be delivered to solid tumors while sparing normal organs. Recent technological developments in image-guidance, convection enhanced delivery and newly developed nanoparticles carrying beta-emitting radionuclides will be reviewed. Examples will be shown describing how this new approach has promise for the treatment of brain, head and neck, and other types of solid tumors.
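The appeal of letting the radionuclide decay in place can be quantified with the standard exponential-decay law: integrating the activity A(t) = A0·e^(−λt) over all time gives A0/λ total decays, each depositing the mean beta energy on average. A sketch using commonly quoted approximate values for Y-90, a widely used therapeutic beta emitter (whether it appears in this paper's Table 1 is not confirmed here):

```python
import math

MEV_TO_J = 1.602176634e-13  # joules per MeV

def total_beta_energy_J(initial_activity_Bq, half_life_s, mean_energy_MeV):
    """Energy released if a sealed beta source decays completely in place:
    integrating A(t) = A0*exp(-lambda*t) over all time gives A0/lambda
    total decays, each depositing the mean beta energy on average."""
    lam = math.log(2) / half_life_s
    return (initial_activity_Bq / lam) * mean_energy_MeV * MEV_TO_J

# Approximate, commonly quoted values for Y-90:
# half-life ~64.1 h, mean beta energy ~0.93 MeV.
energy = total_beta_energy_J(37e6, 64.1 * 3600, 0.93)  # 37 MBq = 1 mCi
print(energy)  # on the order of 2 J deposited locally
```

Because that energy is released within the short range of the beta particle, nearly all of it stays in the tumor volume, which is the dosimetric argument for this approach.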

The challenges this review discusses

  • intravenously administered drugs are inhibited in their intratumoral penetration by high interstitial pressures which prevent diffusion of drugs from the blood circulation into the tumor tissue [1–5].
  • relatively rapid clearance of intravenously administered drugs from the blood circulation by kidneys and liver.
  • drugs that do reach the solid tumor by diffusion are inhomogeneously distributed at the micro-scale – this cannot be overcome by simply administering larger systemic doses, as toxicity to normal organs is generally the dose-limiting factor.
  • even nanoparticulate drugs have poor penetration from the vascular compartment into the tumor, and the nanoparticles that do penetrate are most often heterogeneously distributed.

How imaging could mitigate the above mentioned challenges

  • The inclusion of an imaging probe during drug development can aid in determining the clearance kinetics and tissue distribution of the drug non-invasively. Such probe can also be used to determine the likelihood of the drug reaching the tumor and to what extent.

Note: Drugs that have increased accumulation within the targeted site are likely to be more effective as compared with others. In that respect, Nanoparticle-based drugs have an additional advantage over free drugs with their potential to be multifunctional carriers capable of carrying both therapeutic and diagnostic imaging probes (theranostic) in the same nanocarrier. These multifunctional nanoparticles can serve as theranostic agents and facilitate personalized treatment planning.
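The clearance kinetics mentioned above are usually summarized with a one-compartment, first-order model; a minimal sketch with hypothetical imaging-derived readings (the numbers are purely illustrative):

```python
import math

def clearance_rate_per_h(c0, c_t, t_h):
    """First-order (one-compartment) clearance: C(t) = C0*exp(-k*t),
    so k = ln(C0 / C(t)) / t."""
    return math.log(c0 / c_t) / t_h

def half_life_h(k):
    """Blood half-life corresponding to clearance rate k."""
    return math.log(2) / k

# Hypothetical imaging-derived blood-pool signal (arbitrary units):
# the probe signal falls from 100 to 25 over 4 hours.
k = clearance_rate_per_h(100.0, 25.0, 4.0)
print(half_life_h(k))  # 2.0 h: two half-lives elapsed in the 4 h window
```

Fitting k from serial, non-invasive image readings like this is what lets an imaging probe report whether a candidate drug stays in circulation long enough to reach the tumor.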

  • Imaging can also be used for localization of the tumor to improve the placement of a catheter or external device within tumors to cause cell death through thermal ablation or oxidative stress secondary to reactive oxygen species.

See the example of Vintfolide in The Role of Medical Imaging in Personalized Medicine


Note: Image guided thermal ablation methods include radiofrequency (RF) ablation, microwave ablation or high intensity focused ultrasound (HIFU). Photodynamic therapy methods using external light devices to activate photosensitizing agents can also be used to treat superficial tumors or deeper tumors when used with endoscopic catheters.

  • Quality control during and post treatment

For example: the use of high intensity focused ultrasound (HIFU) combined with nanoparticle therapeutics. HIFU is applied to improve drug delivery and to trigger drug release from nanoparticles. Gas bubbles play the role of the drug’s nanocarrier; these are used both to increase drug transport into the cell and as ultrasound-imaging contrast material. The ultrasound is also used in the processes of drug release and ablation.


Additional example: multifunctional nanoparticles for tracking CED (convection enhanced delivery) distribution within tumors. A nanoparticle could serve as a carrier not only for the therapeutic radionuclides but simultaneously also for a therapeutic drug and four different types of imaging contrast agents, including an MRI contrast agent, PET and SPECT nuclear diagnostic imaging agents, and optical contrast agents. The ability to perform multiple types of imaging on the same nanoparticles will allow studies investigating the distribution and retention of nanoparticles, initially in vivo using non-invasive imaging and later at the histological level using optical imaging.



Image-guided radiotherapeutic nanoparticles have significant potential for solid tumor cancer therapy. The current success of this therapy in animals is most likely due to the improved accumulation, retention and dispersion of nanoparticles within solid tumor following image-guided therapies as well as the micro-field of the β-particle which reduces the requirement of perfectly homogeneous tumor coverage. It is also possible that the intratumoral distribution of nanoparticles may benefit from their uptake by intratumoral macrophages although more research is required to determine the importance of this aspect of intratumoral radionuclide nanoparticle therapy. This new approach to cancer therapy is a fertile ground for many new technological developments as well as for new understandings in the basic biology of cancer therapy. The clinical success of this approach will depend on progress in many areas of interdisciplinary research including imaging technology, nanoparticle technology, computer and robot assisted image-guided application of therapies, radiation physics and oncology. Close collaboration of a wide variety of scientists and physicians including chemists, nanotechnologists, drug delivery experts, radiation physicists, robotics and software experts, toxicologists, surgeons, imaging physicians, and oncologists will best facilitate the implementation of this novel approach to the treatment of cancer in the clinical environment. Image-guided nanoparticle therapies including those with β-emission radionuclide nanoparticles have excellent promise to significantly impact clinical cancer therapy and advance the field of drug delivery.

Read Full Post »
