Posts Tagged ‘T2DM’

Novartis – Type 2 Diabetes Mellitus

Larry H. Bernstein, MD, FCAP, Curator



LIK 066, NOVARTIS, for the treatment of type 2 diabetes




C23H28O7 · 2 C6H11NO, 642.7795

(1S)-1,5-Anhydro-1-[3-(2,3-dihydro-1,4-benzodioxin-6-ylmethyl)-4-ethylphenyl]-D-glucitol bis[1-[(2S)-pyrrolidin-2-yl]ethanone]

(2S,3R,4R,5S,6R)-2-[3-(2,3-Dihydro-benzo[1,4]dioxin-6-ylmethyl)-4- ethyl-phenyl]-6-hydroxymethyl-tetrahydro-pyran-3,4,5-triol

Sodium glucose transporter-2 inhibitor

SGLT 1/2 inhibitor


Novartis AG, innovator

LIK-066 is in phase II clinical studies at Novartis for the treatment of type 2 diabetes.

In June 2014, the EMA’s PDCO adopted a positive opinion on a pediatric investigation plan (PIP) for LIK-066 for type 2 diabetes.


Diabetes mellitus is a metabolic disorder characterized by recurrent or persistent hyperglycemia (high blood glucose) and other signs, as distinct from a single disease or condition. Glucose level abnormalities can result in serious long-term complications, which include cardiovascular disease, chronic renal failure, retinal damage, nerve damage (of several kinds), microvascular damage and obesity.

Type 1 diabetes, also known as insulin-dependent diabetes mellitus (IDDM), is characterized by loss of the insulin-producing β-cells of the islets of Langerhans of the pancreas, leading to a deficiency of insulin. Type 2 diabetes, previously known as adult-onset diabetes, maturity-onset diabetes, or non-insulin-dependent diabetes mellitus (NIDDM), is due to a combination of increased hepatic glucose output, defective insulin secretion, and insulin resistance or reduced insulin sensitivity (defective responsiveness of tissues to insulin). Chronic hyperglycemia can also trigger the onset or progression of glucose toxicity, characterized by decreases in β-cell insulin secretion and in insulin sensitivity; as a result, diabetes mellitus is self-exacerbating [Diabetes Care, 1990, 13, 610].

Chronic elevation of blood glucose also leads to damage of blood vessels. In diabetes, the resultant problems are grouped under “microvascular disease” (due to damage of small blood vessels) and “macrovascular disease” (due to damage of the arteries). Examples of microvascular disease include diabetic retinopathy, neuropathy and nephropathy, while examples of macrovascular disease include coronary artery disease, stroke, peripheral vascular disease, and diabetic myonecrosis.

Diabetic retinopathy, characterized by the growth of weakened blood vessels in the retina as well as macular edema (swelling of the macula), can lead to severe vision loss or blindness. Retinal damage (from microangiopathy) makes diabetes the most common cause of blindness among non-elderly adults in the US. Diabetic neuropathy is characterized by compromised nerve function in the lower extremities. When combined with damaged blood vessels, diabetic neuropathy can lead to diabetic foot. Other forms of diabetic neuropathy may present as mononeuritis or autonomic neuropathy. Diabetic nephropathy is characterized by damage to the kidney, which can lead to chronic renal failure, eventually requiring dialysis. Diabetes mellitus is the most common cause of adult kidney failure worldwide. A high glycemic diet (i.e., a diet that consists of meals that give high postprandial blood sugar) is known to be one of the causative factors contributing to the development of obesity.

Type 2 diabetes is characterized by insulin resistance and/or inadequate insulin secretion in response to elevated glucose levels. Therapies for type 2 diabetes are targeted towards increasing insulin sensitivity (such as TZDs), increasing hepatic glucose utilization (such as biguanides), directly modifying insulin levels (such as insulin, insulin analogs, and insulin secretagogues), increasing incretin hormone action (such as exenatide and sitagliptin), or inhibiting glucose absorption from the diet (such as alpha-glucosidase inhibitors) [Nature 2001, 414, 821-827].

Glucose is unable to diffuse across the cell membrane and requires transport proteins. The transport of glucose into epithelial cells is mediated by a secondary active cotransport system, the sodium-D-glucose co-transporter (SGLT), driven by a sodium gradient generated by the Na+/K+-ATPase. Glucose accumulated in the epithelial cell is further transported into the blood across the basolateral membrane by facilitated diffusion through GLUT transporters [Kidney International 2007, 72, S27-S35].

SGLT belongs to the sodium/glucose co-transporter family SLC5. Two isoforms, SGLT1 and SGLT2, have been identified as mediators of renal tubular glucose reabsorption in humans [Curr. Opinion in Investigational Drugs (2007): 8(4), 285-292 and references cited therein]. Although the two share 59% homology in their amino acid sequences, they differ in substrate affinity and function. SGLT1 transports glucose as well as galactose and is expressed both in the kidney and in the intestine, while SGLT2 is found exclusively in the S1 and S2 segments of the renal proximal tubule. As a consequence, glucose filtered in the glomerulus is reabsorbed into the renal proximal tubular epithelial cells mainly by SGLT2, a low-affinity/high-capacity system residing on the surface of the epithelial cells lining the S1 and S2 tubular segments. Much smaller amounts of glucose are recovered by SGLT1, a high-affinity/low-capacity system, in the more distal segment of the proximal tubule. In healthy humans, more than 99% of the plasma glucose filtered in the kidney glomerulus is reabsorbed, so less than 1% of the total filtered glucose is excreted in urine. An estimated 90% of total renal glucose reabsorption is facilitated by SGLT2; the remaining 10% is likely mediated by SGLT1 [J. Parenter. Enteral Nutr. 2004, 28, 364-371]. SGLT2 was cloned as a candidate sodium glucose co-transporter, and its tissue distribution, substrate specificity, and affinities are reportedly very similar to those of the low-affinity sodium glucose co-transporter in the renal proximal tubule. A drug acting by SGLT2 inhibition would therefore be a novel approach, complementary to existing classes of diabetes medication, that controls blood glucose while preserving insulin secretion.
In addition, SGLT2 inhibitors, which lead to loss of excess glucose (and thereby excess calories), may have additional potential for the treatment of obesity.
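The percentages in this paragraph lend themselves to a back-of-the-envelope calculation. The sketch below is illustrative only; the glomerular filtration rate and plasma glucose values are typical textbook assumptions, not figures taken from the cited references.

```python
# Back-of-the-envelope renal glucose handling, using typical textbook
# values (assumptions, not data from the cited references).
GFR_L_PER_DAY = 180           # glomerular filtration rate, ~180 L/day
PLASMA_GLUCOSE_G_PER_L = 1.0  # ~100 mg/dL = 1 g/L

filtered_load = GFR_L_PER_DAY * PLASMA_GLUCOSE_G_PER_L  # g glucose filtered/day

# Splits reported in the text: >99% of the filtered load is reabsorbed,
# ~90% of that via SGLT2 and ~10% via SGLT1.
reabsorbed = 0.99 * filtered_load
via_sglt2 = 0.90 * reabsorbed
via_sglt1 = 0.10 * reabsorbed
excreted = filtered_load - reabsorbed

print(f"Filtered: {filtered_load:.0f} g/day")
print(f"Via SGLT2: {via_sglt2:.0f} g/day")
print(f"Via SGLT1: {via_sglt1:.0f} g/day")
print(f"Excreted: {excreted:.1f} g/day (<1% of filtered load)")
```

At these assumed values the kidney filters roughly 180 g of glucose per day and returns nearly all of it to the blood, which is why even a partial block of SGLT2 can produce a meaningful urinary loss of glucose and calories.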

Small-molecule SGLT2 inhibitors have indeed been discovered, and their anti-diabetic therapeutic potential has been reported in the literature [T-1095 (Diabetes, 1999, 48, 1794-1800); dapagliflozin (Diabetes, 2008, 57, 1723-1729)].

PATENT     WO 2011048112



1H NMR (400 MHz, CD3OD): δ 1.07 (t, J = 7.6 Hz, 3H), 2.57 (q, J = 7.6 Hz, 2H), 3.34-3.50 (m, 4H), 3.68 (dd, J = 12.0, 5.6 Hz, 1H), 3.85-3.91 (m, 3H), 4.08 (d, J = 9.6 Hz, 1H), 4.17 (s, 4H), 6.53-6.58 (m, 2H), 6.68 (d, J = 8.4 Hz, 1H), 7.15-7.25 (m, 3H).

MS (ES) m/z 434.2 (M+18).



Pediatric investigation plan (PIP) decision: (S)-Pyrrolidine-2-carboxylic acid compound with (2S,3R,4R,5S,6R)-2-(3-((2,3-dihydrobenzo[b][1,4]dioxin-6-yl)methyl)-4-ethylphenyl)-6-(hydroxymethyl)tetrahydro-2H-pyran-3,4,5-triol (2:1) ( LIK066) (EMEA-001527-PIP01-13)
European Medicines Agency (EMA) Web Site 2014, July 24

Safety, tolerability, pharmacokinetics (PK) and pharmacodynamics (PD) assessment of LIK066 in healthy subjects and in patients with type 2 diabetes mellitus (T2DM) (NCT01407003)
ClinicalTrials.gov Web Site 2011, August 07

Read Full Post »

Excess Eating, Overweight, and Diabetic

Larry H Bernstein, MD, FCAP, Curator



You Did NOT Eat Your Way to Diabetes!



The myth that diabetes is caused by overeating also hurts the one out of five people who are not overweight when they develop Type 2 Diabetes. Because doctors think of diabetes only when they see a patient who fits the stereotype (the grossly obese, inactive patient), they often neglect to check people of normal weight for blood sugar disorders, even when those people show up with classic symptoms of high blood sugar such as recurrent urinary tract infections or neuropathy.

Where Did This Toxic Myth Come From?

The myth originated this way: people with Type 2 Diabetes are often overweight, and many overweight people have a syndrome called “insulin resistance,” in which their cells do not respond properly to insulin, so they require larger-than-normal amounts of insulin to lower their blood sugar. From these observations, the conclusion was drawn years ago that insulin resistance was the cause of Type 2 Diabetes.

It made sense. Something was burning out the beta cells in these people, and it seemed logical that the something must be the stress of pumping out huge amounts of insulin, day after day. This idea was so compelling that it was widely believed by medical professionals, though few realized it had never been subjected to careful investigation by large-scale research.

That is why any time there is an article in the news about Type 2 Diabetes you are likely to read something that says, “While Type 1 diabetes (sometimes called Juvenile Diabetes) is a condition where the body does not produce insulin, Type 2 Diabetes is the opposite: a condition where the body produces far too much insulin because of insulin resistance caused by obesity.”

When your doctor tells you the same thing, the conclusion is inescapable: your overeating caused you to put on excess fat and that your excess fat is what made you diabetic.

Blaming the Victim

This line of reasoning leads to subtle, often unexpressed, judgmental decisions on the part of your doctor, who is likely to believe that had you not been such a pig, you would not have given yourself this unnecessary disease.

And because of this unspoken bias, unless you are able to “please” your doctor by losing a great deal of weight after your diagnosis you may find yourself treated with a subtle but callous disregard because of the doctor’s feeling that you brought this condition down on yourself. This bias is similar to that held by doctors who face patients who smoke a pack a day and get lung cancer and still refuse to stop smoking.

You also see this bias frequently expressed in the media. Articles on the “obesity epidemic” blame overeating for a huge increase in the number of people with diabetes, including children and teenagers who are pictured greedily gorging on supersized fast foods while doing no exercise more strenuous than channel surfing. In a society where the concepts “thin” and “healthy” have taken on the overtones of moral virtue and where the only one of the seven deadly sins that still inspires horror and condemnation is gluttony, being fat is considered by many as sure proof of moral weakness. So it is not surprising that the subtext of media coverage of obesity and diabetes is that diabetes is nothing less than the just punishment you deserve for being such a glutton.

Except that it’s not true.

Obesity Has Risen Dramatically While Diabetes Rates Have Not

The rate of obesity has grown alarmingly over the past decades, especially in certain regions of the U.S. The NIH reports that “From 1960-62 to 2005-06, the prevalence of obesity increased from 13.4 to 35.1 percent in U.S. adults age 20 to 74.”

If obesity were causing diabetes, you’d expect to see a similar rise in the diabetes rate. But this has not happened. The CDC reports that “From 1980 through 2010, the crude prevalence of diagnosed diabetes increased …from 2.5% to 6.9%.” However, if you look at the graph that accompanies this statement, you see that the rate of diabetes diagnoses rose only gradually through this period, to about 3.5%, until it suddenly sped upward in the late 1990s. This sudden increase was largely due to the fact that in 1998 the American Diabetes Association changed the criteria by which diabetes was diagnosed, lowering the fasting blood sugar level used to diagnose diabetes from 140 mg/dl to 126 mg/dl. (Details HERE)

Analyzing these statistics, it becomes clear that though roughly 65 million more Americans became fat over this period, only 13 million more Americans became diabetic.
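The arithmetic behind this comparison can be sketched in a few lines. The ~300 million total-population figure below is an assumption chosen for round numbers (the NIH prevalence covers adults aged 20 to 74, so this is deliberately rough); the prevalence figures are the ones quoted above.

```python
# Rough population arithmetic behind the obesity-vs-diabetes comparison.
# US_POPULATION is an assumed round figure, not a census value; the
# prevalence numbers come from the NIH and CDC quotes in the text.
US_POPULATION = 300_000_000

obesity_rise = 0.351 - 0.134   # 13.4% -> 35.1% (NIH, 1960-62 to 2005-06)
diabetes_rise = 0.069 - 0.025  # 2.5% -> 6.9%  (CDC, 1980 to 2010)

more_obese = obesity_rise * US_POPULATION
more_diabetic = diabetes_rise * US_POPULATION

print(f"~{more_obese / 1e6:.0f} million more obese Americans")
print(f"~{more_diabetic / 1e6:.0f} million more diabetic Americans")
```

The 65 million vs. 13 million gap in the text falls straight out of the prevalence deltas: the obesity rise is about five times the diabetes rise.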

And to further complicate the matter, several factors beyond the rise in obesity and the ADA’s lowering of the diagnostic cutoff came into play during this period, each of which also raised the rate of diabetes diagnoses:

Diabetes becomes more common as people age because the pancreas, like other organs, becomes less efficient. In 1950 only 12% of the U.S. population was over 65. By 2010, 40% was, and of those, 19% were over 75. (Details HERE.)

At the same time, the period during which the rate of diabetes rose was also the period in which doctors began to heavily prescribe statins, a class of drugs we now know raises the risk of developing diabetes. (Details HERE.)

Why Obesity Doesn’t Cause Diabetes: The Genetic Basis of Diabetes

While people who have diabetes are often heavy, one out of five people diagnosed with diabetes is thin or of normal weight. And though heavy people with diabetes are indeed likely to be insulin resistant, the majority of people who are overweight will never develop diabetes, even though they are likely to be just as insulin resistant as those who do, or even more so.

What diabetes researchers in academic laboratories are finding about what really causes diabetes is quite different from what you read in the media. To get Type 2 Diabetes, you need to have some combination of a variety of already-identified genetic flaws that produce the syndrome we call Type 2 Diabetes. This means that unless you have inherited abnormal genes, or have had your genes damaged by exposure to pesticides, plastics, and other environmental toxins known to cause genetic damage, you can eat until you drop and never develop diabetes.

Now let’s look in more depth at what peer-reviewed research has found about the true causes of diabetes.

Twin Studies Back up a Genetic Cause for Diabetes

Studies of identical twins showed that twins have an 80% concordance for Type 2 Diabetes. In other words, if one twin has Type 2 Diabetes, the chance that the other will develop it too is 4 out of 5. While you might assume that this simply reflects the fact that twins are raised in the same home by mothers who feed them the same unhealthy diets, studies of non-identical twins found NO such correlation. The chances that one non-identical twin might have Type 2 Diabetes if the other had it were much lower, though these non-identical twins, born at the same time and raised by the same caregivers, were presumably also exposed to the same unhealthy diets.

This kind of finding begins to hint that there is more than just bad habits to blame for diabetes. A high concordance between identical twins which is not shared by non-identical twins is usually advanced as an argument for a genetic cause, though because one in five identical twins did not become diabetic, it is assumed that some additional factors beyond the inherited genome must come into play to cause the disease to appear. Often this factor is an exposure to an environmental toxin which knocks out some other, protective genetic factor.

The Genetic Basis of Type 2 Diabetes Mellitus: Impaired Insulin Secretion versus Impaired Insulin Sensitivity. John E. Gerich. Endocrine Reviews 19(4) 491-503, 1998.

The List of Genes Associated with Type 2 Keeps Growing

Here is a brief list of some of the abnormal genes that have been found to be associated with Type 2 Diabetes in people of European extraction: TCF7L2, HNF4-a, PTPN, SHIP2, ENPP1, PPARG, FTO, KCNJ11, NOTCh3, WFS1, CDKAL1, IGF2BP2, SLC30A8, JAZF1, and HHEX.

People from non-European ethnic groups have been found to have entirely different sets of diabetic genes than do Western Europeans, like the UCP2 polymorphism found in Pima Indians and the three Calpain-10 gene polymorphisms that have been found to be associated with diabetes in Mexicans. The presence of a variation in yet another gene, SLC16A11, was recently found to be associated with a 25% higher risk of a Mexican developing Type 2 diabetes.

The More Diabetes Genes You Have The Worse Your Beta Cells Perform

A study published in the journal Diabetologia in November 2008 examined how well the beta cells secreted insulin in 1,211 non-diabetic individuals. The researchers then screened these people for abnormalities in seven genes that had been found to be associated with Type 2 Diabetes.

They found that the abnormal genes had an additive effect: each additional abnormal gene in a person’s genome was associated with poorer beta cell function.

The impact of these genetic flaws becomes clear when we learn that, in these supposedly normal people, beta cell glucose sensitivity and mealtime insulin production were decreased by 39% in those who had abnormalities in five genes. That’s almost half. And if your beta cells are only putting out about half as much insulin as a normal person’s, it takes a lot less stress on those cells to push you into becoming diabetic.

Beta cell glucose sensitivity is decreased by 39% in non-diabetic individuals carrying multiple diabetes-risk alleles compared with those with no risk alleles L. Pascoe et al. Diabetologia, Volume 51, Number 11 / November, 2008.
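A simple linear model makes the additive pattern concrete. The per-gene decrement below is just the reported 39% spread evenly over five risk genes; this is an illustrative simplification, not the study’s fitted per-gene estimates.

```python
# Illustrative additive model of beta-cell glucose sensitivity.
# The per-gene decrement is derived by dividing the reported 39%
# reduction (carriers of risk variants in five genes) evenly; the
# actual study estimated per-gene effects separately.
TOTAL_REDUCTION_AT_FIVE = 39.0                     # percent, Pascoe et al.
PER_GENE_DECREMENT = TOTAL_REDUCTION_AT_FIVE / 5   # ~7.8% per risk gene

def glucose_sensitivity(n_risk_genes: int) -> float:
    """Percent of normal beta-cell glucose sensitivity under the
    assumed linear additive model."""
    return 100.0 - PER_GENE_DECREMENT * n_risk_genes

for n in range(6):
    print(f"{n} risk genes -> {glucose_sensitivity(n):.1f}% of normal")
```

Under this sketch a carrier of five risk genes starts adult life with beta cells working at about 61% of normal capacity, which is why comparatively little additional stress can tip such a person into diabetes.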

Gene Tests Predict Diabetes Independent of Conventional “Risk Factors”

A study of 16,061 Swedish and 2770 Finnish subjects found that

Variants in 11 genes (TCF7L2, PPARG, FTO, KCNJ11, NOTCh3, WFS1, CDKAL1, IGF2BP2, SLC30A8, JAZF1, and HHEX) were significantly associated with the risk of Type 2 Diabetes independently of clinical risk factors [i.e. family history, obesity etc.]; variants in 8 of these genes were associated with impaired beta-cell function.

Note that though the subjects here were being screened for Type 2 Diabetes, the defect found here was NOT insulin resistance, but rather deficient insulin secretion. This study also found that:

The discriminative power of genetic risk factors improved with an increasing duration of follow-up, whereas that of clinical risk factors decreased.

In short, the longer these people were studied, the more likely the people with these gene defects were to develop diabetes.

Clinical Risk Factors, DNA Variants, and the Development of Type 2 Diabetes Valeriya Lyssenko, M.D. et. al. New England Journal of Medicine, Volume 359:2220-2232, November 20, 2008,Number 21.

What A Common Diabetes Gene Does

A study published in July 2009 sheds light on exactly what an allele (gene variant) often found associated with diabetes does. The allele in question is a variant of the TCF7L2 transcription factor gene. The study involved 81 normal, healthy young Danish men whose genes were tested. They were then given a battery of tests to examine their glucose metabolism. The researchers found that:

Carriers of the T allele were characterised by reduced 24 h insulin concentrations … and reduced insulin secretion relative to glucose during a mixed meal test … but not during an IVGTT [intravenous glucose tolerance test].

This is an interesting finding, because what damages our bodies is the blood sugar we experience after eating “a mixed meal,” yet so much research uses the artificial glucose tolerance test (GTT) to assess blood sugar health. This result suggests that the GTT may be missing important signs of early blood sugar dysfunction and that the mixed meal test may be the better diagnostic tool. I have long believed this to be true, since so many people experience reactive lows when they take the GTT, which produces a seemingly “normal” reading even though they routinely experience highs after eating meals. These highs are what damage our organs.

Young men with the TCF7L2 allele also responded with weak insulin secretion in response to the incretin hormone GLP-1 and “Despite elevated hepatic [liver] glucose production, carriers of the T allele had significantly reduced 24 h glucagon concentrations … suggesting altered alpha cell function.”

Here again we see evidence that long before obesity develops, people with this common diabetes gene variant show highly abnormal blood sugar behavior. Abnormal production of glucose by the liver may also contribute to obesity, as metformin, a drug that blocks the liver’s production of glucose, blocks weight gain and often causes weight loss.

The T allele of rs7903146 TCF7L2 is associated with impaired insulinotropic action of incretin hormones, reduced 24 h profiles of plasma insulin and glucagon, and increased hepatic glucose production in young healthy men. K. Pilgaard et al. Diabetologia, Issue Volume 52, Number 7 / July, 2009. DOI 10.1007/s00125-009-1307-x

Genes Linked to African Heritage Linked to Poor Carbohydrate Metabolism

It has long been known that African-Americans have a much higher rate of diabetes and metabolic syndrome than the American population as a whole. This has been blamed on lifestyle, but a 2009 genetic study finds strong evidence that the problem is genetic.

The study reports,

Using genetic samples obtained from a cohort of subjects undergoing cardiac-related evaluation, a strict algorithm that filtered for genomic features at multiple levels identified 151 differentially-expressed genes between Americans of African ancestry and those of European ancestry. Many of the genes identified were associated with glucose and simple sugar metabolism, suggestive of a model whereby selective adaptation to the nutritional environment differs between populations of humans separated geographically over time.

In the full text discussion the authors state,

These results suggest that differences in glucose metabolism between Americans of African and European ancestry may reside at the transcriptional level. The down-regulation of these genes in the AA cohorts argues against these changes being a compensatory response to hyperglycemia and suggests instead a genetic adaptation to changes in the availability of dietary sugars that may no longer be appropriate to a Western Diet.

In conclusion, the authors note that the vegetarian diet of the Seventh-day Adventists, often touted as proof of the usefulness of the “Diet Pyramid,” doesn’t provide the touted health benefits to people of African-American heritage. Obviously, when this many carbohydrate-metabolizing genes aren’t working properly, the diet needed is a low-carbohydrate diet.

The study is available in full text here:

Stable Patterns of Gene Expression Regulating Carbohydrate Metabolism Determined by Geographic Ancestry. Jonathan C. Schisler et al. PLoS One 4(12): e8183. doi:10.1371/journal.pone.0008183

Gene that Disrupts Circadian Clock Associated with Type 2 Diabetes

It has been known for a while that people who suffer from sleep disturbances often show raised insulin resistance. In December 2008, researchers identified a gene, “rs1387153, near MTNR1B (which encodes the melatonin receptor 2 (MT2)), as a modulator of fasting plasma glucose.” They conclude,

Our data suggest a possible link between circadian rhythm regulation and glucose homeostasis through the melatonin signaling pathway.

Melatonin levels appear to control the body clock which, in turn, regulates the secretion of substances that modify blood pressure, hormone levels, insulin secretion and many other processes throughout the body.

A variant near MTNR1B is associated with increased fasting plasma glucose levels and type 2 diabetes risk. Nabila Bouatia-Naji et al. Nature Genetics Published online: 7 December 2008, doi:10.1038/ng.277

There’s an excellent explanation of what this study means, translated into layman’s terms, at Science Daily:

Body Clock Linked to Diabetes And High Blood Sugar In New Genome-wide Study


The Environmental Factors That Push Borderline Genes into Full-fledged Diabetes

We’ve seen so far that to get Type 2 Diabetes you seem to need to have some diabetes gene or genes, but that not everyone with these genes develops diabetes. There are what scientists call environmental factors that can push a borderline genetic case into full-fledged diabetes. Let’s look now at what the research has found about what some of these environmental factors might be.


Your Mother’s Diet During Pregnancy May Have Caused Your Diabetes

Many “environmental factors” that scientists explore occur in the environment of the womb. Diabetes is no different, and the conditions you experienced when you were a fetus can have life-long impact on your blood sugar control.

Researchers following the children of mothers who had lived through the Dutch famine during World War II found that those children were far more likely to develop diabetes in later life than a control group from the same population whose mothers had been adequately fed.

Glucose tolerance in adults after prenatal exposure to famine. Ravelli AC et al. Lancet. 1998 Jan 17;351(9097):173-7.

A study of a Chinese population found a link between low birth weight and the development of both diabetes and impaired glucose regulation (i.e. prediabetes) that was independent of “sex, age, central obesity, smoking status, alcohol consumption, dyslipidemia, family history of diabetes, and occupational status.” Low birth weight in this population may well be due to less than optimal maternal nutrition during pregnancy.

Evidence of a Relationship Between Infant Birth Weight and Later Diabetes and Impaired Glucose Regulation in a Chinese Population. Xinhua Xiao et al. Diabetes Care 31:483-487, 2008.

This may not seem all that relevant to Americans whose mothers have not been exposed to famine conditions. But to conclude this is to forget how many American teens and young women suffer from eating disorders and how prevalent crash dieting is in the group of women most likely to get pregnant.

It is also true that until the 1980s obstetricians routinely warned pregnant women against gaining what is now understood to be a healthy amount of weight. When pregnant women started to gain weight, doctors often put them on highly restrictive diets, which in many cases resulted in the birth of underweight babies.

Your Mother’s Gestational Diabetes May Have Caused Your Diabetes

Maternal starvation is not the only pre-birth factor associated with an increased risk of diabetes. Having a well-fed mother who suffered gestational diabetes also increases a child’s risk both of obesity and of developing diabetes.

High Prevalence of Type 2 Diabetes and Pre-Diabetes in Adult Offspring of Women With Gestational Diabetes Mellitus or Type 1 Diabetes: The role of intrauterine hyperglycemia. Tine D. Clausen, MD et al. Diabetes Care 31:340-346, 2008

Pesticides and PCBs in Blood Stream Correlate with Incidence of Diabetes

A study conducted among members of New York State’s Mohawk tribe found that the odds of being diagnosed with diabetes in this population were almost 4 times higher in members who had high concentrations of PCBs in their blood serum. The odds were even higher for those with high concentrations of pesticides in their blood.

Diabetes in Relation to Serum Levels of Polychlorinated Biphenyls and Chlorinated Pesticides in Adult Native Americans. Neculai Codru, Maria J. Schymura, Serban Negoita, Robert Rej, and David O. Carpenter. Environ Health Perspect. 2007 October; 115(10): 1442-1447. Published online 2007 July 17. doi: 10.1289/ehp.10315.

It is very important to note that there is no reason to believe this phenomenon is limited to people of Native American heritage. Upstate NY has a well-known and very serious PCB problem–remember Love Canal? And the entire population of the U.S. has been overexposed to powerful pesticides for a generation.

More evidence that obesity may be caused by exposure to toxic pollutants that damage genes comes from a study published in January 2009. This study tracked the exposure of a group of pregnant Belgian women to several common pollutants: hexachlorobenzene, dichlorodiphenyldichloroethylene (DDE), dioxin-like compounds, and polychlorinated biphenyls (PCBs). It found a correlation between exposure to PCBs and DDE and obesity by age 3, especially in children of mothers who smoked.

Intrauterine Exposure to Environmental Pollutants and Body Mass Index during the First 3 Years of Life Stijn L. Verhulst et al., Environmental Health Perspectives. Volume 117, Number 1, January 2009

These studies, which garnered no press attention at all, probably have more to tell us about the reason for the so-called “diabetes epidemic” than any other published over the last decade.

BPA and Plasticizers from Packaging Are Strongly Linked to Obesity and Insulin Resistance

BPA, a chemical used in the linings of most metal cans, has long been suspected of causing obesity. Now we know why. A study published in 2008 reported that BPA suppresses a key hormone, adiponectin, which is responsible for regulating insulin sensitivity in the body, and puts people at a substantially higher risk for metabolic syndrome.

Science Daily: Toxic Plastics: Bisphenol A Linked To Metabolic Syndrome In Human Tissue

The impact of BPA on children is dramatic. Analysis of 7 years of NHANES epidemiological data found that having a high urine level of BPA doubles a child’s risk of being obese.

Bisphenol A and Chronic Disease Risk Factors in US Children. Eng, Donna et al. Pediatrics. Published online August 19, 2013. doi: 10.1542/peds.2013-0106

You and your children are getting far more BPA from canned foods than health authorities assumed. A research report published in 2011 found that the level of BPA actually measured in people’s bodies after they consumed canned soup was extremely high: people who ate a serving of canned soup every day for five days had BPA levels of 20.8 micrograms per liter of urine, whereas people who instead ate fresh soup had levels of 1.1 micrograms per liter.

Canned Soup Consumption and Urinary Bisphenol A: A Randomized Crossover Trial. Carwile, JL et al. JAMA. November 23/30, 2011, Vol 306, No. 20
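A quick calculation puts the reported urine levels in perspective; the two concentrations are the ones quoted from the trial above.

```python
# Fold difference in urinary BPA between the canned-soup and fresh-soup
# groups, using the levels quoted from Carwile et al. (2011).
canned = 20.8  # micrograms BPA per liter of urine
fresh = 1.1

fold = canned / fresh
print(f"Canned-soup eaters showed roughly {fold:.0f}x higher urinary BPA")
```

A difference of nearly twentyfold after only five days of a single daily serving suggests canned food is a major, underestimated route of BPA exposure.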

Nevertheless, the FDA caved in to industry pressure in 2012 and refused to regulate BPA, claiming, as usual, that more study was needed. (FDA: BPA)

BPA is not the only plastics-associated toxic chemical that may be promoting insulin resistance. Phthalates are compounds added to plastic to make it flexible. They rub off on our food and are found in our blood and urine. A study of 387 Hispanic and Black New York City children between six and eight years old measured the phthalates in their urine and found that the more phthalates in a child’s urine, the fatter the child was a year later.

Associations between phthalate metabolite urinary concentrations and body size measures in New York City children. Susan L. Teitelbaum et al. Environ Res. 2012 Jan;112:186-93.

This finding was echoed by another study:

Urinary phthalates and increased insulin resistance in adolescents Trasande L, et al. Pediatrics 2013; DOI: 10.1542/peds.2012-4022.

And phthalates are everywhere. A study of 1,016 Swedes aged 70 years and older found that four phthalate metabolites were detected in the blood serum of almost all the participants. High levels of three of these were associated with the prevalence of diabetes. The researchers explain that one metabolite was mainly related to poor insulin secretion, whereas two others were related to insulin resistance. The researchers didn’t check to see whether this relationship held for prediabetes.

Circulating Levels of Phthalate Metabolites Are Associated With Prevalent Diabetes in the Elderly. Lind, MP et al. Diabetes. Published online before print April 12, 2012. doi: 10.2337/dc11-2396

Chances are very good that these same omnipresent phthalates are also causing insulin resistance and damaging insulin secretion in people whose ages fall between those of the two groups studied here.

Use of Herbicide Atrazine Maps to Obesity, Causes Insulin Resistance

A study published in April of 2009 mentions that “There is an apparent overlap between areas in the USA where the herbicide, atrazine (ATZ), is heavily used and obesity-prevalence maps of people with a BMI over 30.”

It found that when rats were given low doses of this herbicide in their water, "Chronic administration of ATZ decreased basal metabolic rate, and increased body weight, intra-abdominal fat and insulin resistance without changing food intake or physical activity level." In short, the animals got fat even without changing their food intake. When the animals were fed a high-fat, high-carb diet, the weight gain was even greater.

Insulin resistance increased too, which, if the same happens in people, means that those with a genetically borderline capacity to secrete insulin are more likely to become diabetic when exposed to this chemical through food or drinking water.

Chronic Exposure to the Herbicide, Atrazine, Causes Mitochondrial Dysfunction and Insulin Resistance PLoS ONE Published 13 Apr 2009

2,4-D, a Common Herbicide, Blocks Secretion of GLP-1, a Blood-Sugar-Lowering Gut Peptide

In 2007 scientists at New York’s Mount Sinai Hospital discovered that the intestine has receptors for sugar identical to those found on the tongue and that these receptors regulate secretion of glucagon-like peptide-1 (GLP-1). GLP-1 is the peptide that is mimicked by the diabetes drug Byetta and which is kept elevated by Januvia and Onglyza. You can read about that finding in this Science Daily report:

Science Daily: Your Gut Has Taste Receptors

In November 2009, these same scientists reported that a very common herbicide, 2,4-D, blocked this taste receptor, effectively turning off its ability to stimulate the production of GLP-1. The fibrate drugs used to lower cholesterol were also found to block the receptor.

Science Daily: Common Herbicides and Fibrates Block Nutrient-Sensing Receptor Found in Gut and Pancreas

What was even more of concern was the discovery that the ability of these compounds to block this gut receptor “did not generalize across species to the rodent form of the receptor.” The lead researcher was quoted as saying,

…most safety tests were done using animals, which have T1R3 receptors that are insensitive to these compounds,

This takes on additional meaning when you realize that most compounds released into the environment are tested only on animals, not humans. It may help explain why so many supposedly “safe” chemicals are damaging human glucose metabolisms.

Trace Amounts of Arsenic in Urine Correlate with Dramatic Rise in Diabetes

A study published in JAMA in August 2008 examined 788 adults who had participated in the 2003-2004 National Health and Nutrition Examination Survey (NHANES) and found that those with the most arsenic in their urine were nearly four times more likely to have diabetes than those with the least.

The study is reported here:

Arsenic Exposure and Prevalence of Type 2 Diabetes in US Adults. Ana Navas-Acien et al. JAMA. 2008;300(7):814-822.

The New York Times report about this study (no longer online) added this illuminating bit of information to the story:

Arsenic can get into drinking water naturally when minerals dissolve. It is also an industrial pollutant from coal burning and copper smelting. Utilities use filtration systems to get it out of drinking water.

Seafood also contains nontoxic organic arsenic. The researchers adjusted their analysis for signs of seafood intake and found that people with Type 2 Diabetes had 26 percent higher inorganic arsenic levels than people without Type 2 Diabetes.

How arsenic could contribute to diabetes is unknown, but prior studies have found impaired insulin secretion in pancreas cells treated with an arsenic compound.

Prescription Drugs, Especially SSRI Antidepressants Cause Obesity and Possibly Diabetes

Another important environmental factor is this: Type 2 Diabetes can be caused by some commonly prescribed drugs. Beta blockers and atypical antipsychotics like Zyprexa have been shown to cause diabetes in people who would not otherwise get it. This is discussed here.

There is some research that suggests that SSRI antidepressants may also promote diabetes. It is well known that antidepressants cause weight gain.

Spin doctors in the employ of the companies that sell these high-profit antidepressants have long tried to attribute the obesity seen in depressed patients to the depression itself, rather than to the drugs used to treat it.

However, a study published in June 2009 used data from the Canadian National Population Health Survey (NPHS), a longitudinal study of a representative cohort of household residents in Canada, and tracked the incidence of obesity over ten years.

The study found that "MDE [Major Depressive Episode] does not appear to increase the risk of obesity. …Pharmacologic treatment with antidepressants may be associated with an increased risk of obesity" [emphasis mine]. The study concluded,

Unexpectedly, significant effects were seen for serotonin-reuptake-inhibiting antidepressants [Prozac, Celexa, Luvox, Paxil, Zoloft] and venlafaxine [Effexor], but neither for tricyclic antidepressants nor antipsychotic medications.

Scott B. Patten et al. Psychother Psychosom 2009;78:182-186 (DOI: 10.1159/000209349)

Here is an article posted by the Mayo Clinic that includes the statement that "weight gain is a reported side effect of nearly all antidepressant medications currently available."

Antidepressants and weight gain – Mayoclinic.com

Here is a report about a paper presented at the 2006 ADA Conference that analyzed the Antidepressant-Diabetes connection in a major Diabetes prevention study:

Medscape: Antidepressant use associated with increased type 2 diabetes risk.

Treatment for Cancer, Especially Radiation, Greatly Increases Diabetes Risk Independent of Obesity or Exercise Level

A study published in August 2009 analyzed data for 8,599 survivors in the Childhood Cancer Survivor Study. It found that, after adjusting for body mass and exercise levels, survivors of childhood cancer were 1.8 times more likely than their siblings to report that they had diabetes.

Even more significantly, those who had had full body radiation were 7.2 times more likely to have diabetes.

This raises the question of whether exposure to radiation in other contexts also causes Type 2 diabetes.

Diabetes Mellitus in Long-term Survivors of Childhood Cancer: Increased Risk Associated With Radiation Therapy: A Report for the Childhood Cancer Survivor Study. Meacham LR, et al. Arch Intern Med. Vol. 169, No. 15, Aug 10/24, 2009.

More Insight into the Effect of Genetic Flaws

Now that we have a better idea of some of the underlying physiological causes of diabetes, let's look more closely at the physiological processes that take place as these genetic flaws push the body towards diabetes.

Insulin Resistance Develops in Thin Children of People with Type 2 Diabetes

Lab research has come up with some other intriguing findings that challenge the idea that obesity causes insulin resistance which causes diabetes. Instead, it looks like the opposite happens: Insulin resistance precedes the development of obesity.

One of these studies took two groups of thin subjects with normal blood sugar who were evenly matched for height and weight. The two groups differed only in that one group had close relatives who had developed Type 2 Diabetes, and hence, if there were a genetic component to the disorder, they were more likely to have it. The other group had no relatives with Type 2 Diabetes. The researchers then examined the subjects' glucose and insulin levels during a glucose tolerance test and calculated their insulin resistance. They found that the thin relatives of the people with Type 2 Diabetes already had much more insulin resistance than did the thin people with no relatives with diabetes.

Insulin resistance in the first-degree relatives of persons with Type 2 Diabetes. Straczkowski M et al. Med Sci Monit. 2003 May;9(5):CR186-90.
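"Calculating insulin resistance" in such studies means computing a numeric index from glucose and insulin measurements. The exact index this paper used is not stated here, so as an illustration only, here is the widely used fasting-based HOMA-IR formula (an assumption; the study derived its measure from glucose tolerance test data, which may differ):

```python
# HOMA-IR: a common insulin-resistance index computed from fasting values.
# HOMA-IR = (fasting glucose [mg/dl] * fasting insulin [uU/ml]) / 405
# Higher values indicate greater insulin resistance.

def homa_ir(fasting_glucose_mg_dl: float, fasting_insulin_uU_ml: float) -> float:
    return fasting_glucose_mg_dl * fasting_insulin_uU_ml / 405

# Illustrative (hypothetical) profiles:
print(round(homa_ir(90, 8), 2))   # insulin-sensitive profile
print(round(homa_ir(95, 20), 2))  # higher fasting insulin, higher index
```

Note how a modest rise in fasting insulin at near-normal glucose, exactly the pattern seen in these thin relatives, drives the index up well before blood sugar itself looks abnormal.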

This result was echoed by a second study published in November of 2009.

That study compared detailed measurements of insulin secretion and resistance in 187 offspring of people diagnosed with Type 2 diabetes against 509 controls. Subjects were matched with controls for age, gender and BMI. It concluded:

The first-degree offspring of type 2 diabetic patients show insulin resistance and beta cell dysfunction in response to oral glucose challenge. Beta cell impairment exists in insulin-sensitive offspring of patients with type 2 diabetes, suggesting beta cell dysfunction to be a major defect determining diabetes development in diabetic offspring.

Beta cell (dys)function in non-diabetic offspring of diabetic patients M. Stadler et al. Diabetologia Volume 52, Number 11 / November, 2009, pp 2435-2444. doi 10.1007/s00125-009-1520-7

Mitochondrial Dysfunction is Found in Lean Relatives of People with Type 2 Diabetes

One reason insulin resistance might precede obesity was suggested by a landmark 2004 study, which looked at the cells of the "healthy, young, lean" but insulin-resistant relatives of people with Type 2 Diabetes and found a defect in their mitochondria, the "power plants" of the cell where glucose is burned. While the mitochondria of people with no diabetic relatives burned glucose well, the mitochondria of people with an inherited genetic predisposition to diabetes could not burn glucose as efficiently; instead, the glucose they could not burn was stored in the cells as fat.

Impaired mitochondrial activity in the insulin-resistant offspring of patients with type 2 diabetes. Petersen KF, et al. N Engl J Med. 2004 Feb 12;350(7):639-41.

More Evidence that Abnormal Insulin Resistance Precedes Weight Gain and Probably Causes It

A study done by the same researchers at Yale University School of Medicine who discovered the mitochondrial problem we just discussed was published in Proceedings of the National Academy of Science (PNAS) in July 2007. It reports on a study that compared energy usage by lean people who were insulin resistant and lean people who were insulin sensitive.

The role of skeletal muscle insulin resistance in the pathogenesis of the metabolic syndrome Petersen,KF et al. PNAS July 31, 2007 vol. 104 no. 31 12587-12594.

Using new imaging technologies, the researchers found that lean but insulin-resistant subjects converted glucose from high-carbohydrate meals into triglycerides, i.e., fat. Lean insulin-sensitive subjects, in contrast, stored the same glucose in the form of muscle and liver glycogen.

The researchers conclude that:

the insulin resistance, in these young, lean, insulin resistant individuals, was independent of abdominal obesity and circulating plasma adipocytokines, suggesting that these abnormalities develop later in the development of the metabolic syndrome.

In short, obesity looked to be a result, not a cause of the metabolic flaw that led these people to store carbohydrate they ate in the form of fat rather than burn it for energy.

The researchers suggested controlling insulin resistance with exercise. It would also be a good idea for people who are insulin resistant, or have a family history of Type 2 Diabetes to cut back on their carb intake, knowing that the glucose from the carbs they eat is more likely to turn into fat.

Beta Cells Fail to Reproduce in People with Diabetes

A study of pancreas autopsies that compared the pancreases of thin and fat people with diabetes with those of thin and fat normal people found that fat, insulin-resistant people who did not develop diabetes apparently were able to grow new beta-cells to produce the extra insulin they needed. In contrast, the beta cells of people who developed diabetes were unable to reproduce. This failure was independent of their weight.

Beta-Cell Deficit and Increased Beta-Cell Apoptosis in Humans With Type 2 Diabetes. Alexandra E. Butler, et al. Diabetes 52:102-110, 2003

Once Blood Sugars Rise They Impair a Muscle Gene that Regulates Insulin Sensitivity

Another piece of the puzzle falls into place thanks to a research study published on Feb 8, 2008.

Downregulation of Diacylglycerol Kinase Delta Contributes to Hyperglycemia-Induced Insulin Resistance. Alexander V. Chibalin et. al. Cell, Volume 132, Issue 3, 375-386, 8 February 2008.

As reported in Diabetes in Control (which had access to the full text of the study)

The research team identified a "fat-burning" gene whose products are required to maintain the cells' insulin sensitivity. They also discovered that this gene is reduced in muscle tissue from people with high blood sugar and type 2 diabetes. In the absence of the enzyme made by this gene, muscles have reduced insulin sensitivity and impaired fat-burning ability, which leads to an increased risk of developing obesity.

“The expression of this gene is reduced when blood sugar rises, but activity can be restored if blood sugar is controlled by pharmacological treatment or exercise”, says Professor Juleen Zierath. “Our results underscore the importance of tight regulation of blood sugar for people with diabetes.”

In short, once your blood sugar rises past a certain point, you become much more insulin resistant. This, in turn, pushes up your blood sugar more.

A New Model For How Diabetes Develops

These research findings open up a new way of understanding the relationship between obesity and diabetes.

Perhaps people with the genetic condition underlying Type 2 Diabetes inherit a defect in the beta cells that makes those cells unable to reproduce normally to replace cells damaged by the normal wear and tear of life. Or perhaps exposure to an environmental toxin damages the related genes.

Perhaps, too, a defect in the way that their cells burn glucose inclines them to turn excess blood sugar into fat rather than burning it off as a person with normal mitochondria might do.

Put these facts together and you suddenly get a fatal combination that is almost guaranteed to make a person fat.

Studies have shown that blood sugars only slightly over 100 mg/dl are high enough to render beta cells dysfunctional.

Beta-cell dysfunction and glucose intolerance: results from the San Antonio metabolism (SAM) study. Gastaldelli A, et al. Diabetologia. 2004 Jan;47(1):31-9. Epub 2003 Dec 10.

In a normal person who had the ability to grow new beta cells, any damaged beta cells would be replaced by new ones, which would keep the blood sugar at levels low enough to avoid further damage. But the beta cells of a person with a genetic heritage of diabetes are unable to reproduce. So once blood sugars started to rise, more beta cells would succumb to the resulting glucose toxicity, and that would, in turn, raise blood sugar higher.

As the concentration of glucose in their blood rose, these people would not be able to do what a normal person does with excess blood sugar, which is to burn it for energy. Instead, their defective mitochondria would cause the excess glucose to be stored as fat. As this fat accumulates in the muscles, it causes the insulin resistance so often observed in people with diabetes, long before the individual begins to gain visible weight. This insulin resistance puts a further strain on the remaining beta cells by making the person's cells less sensitive to insulin. And since the pancreas of a person with an inherited tendency to diabetes cannot grow the extra beta cells that a normal person could grow when cells become insulin resistant, blood sugars escalate ever higher, further damaging the insulin-producing cells and ending in the inevitable decline into diabetes.

Low Fat Diets Promote the Deterioration that Leads to Diabetes in People with the Genetic Predisposition

In the past two decades, when people who were headed towards diabetes began to gain weight, they were advised to eat a low-fat diet. Unfortunately, a low-fat diet is also a high-carbohydrate diet, one that exacerbates blood sugar problems by raising blood sugars dangerously high, destroying more insulin-producing beta cells, and catalyzing the storage of more fat in the muscles of people with dysfunctional mitochondria. Though they may have stuck to low-fat diets for weeks or even months, these people were tormented by relentless hunger, and when they finally went off their ineffective diets, they got fatter. Unfortunately, when they reported these experiences to their doctors, they were almost universally accused of lying about their eating habits.

It has only been documented in medical research during the past two years that many patients who found it impossible to lose weight on the low-fat, high-carbohydrate diet can lose weight, often dramatically, on a low-carbohydrate diet while improving rather than harming their blood lipids.

Very low-carbohydrate and low-fat diets affect fasting lipids and postprandial lipemia differently in overweight men. Sharman MJ, et al. J Nutr. 2004 Apr;134(4):880-5.

An isoenergetic very low carbohydrate diet improves serum HDL cholesterol and triacylglycerol concentrations, the total cholesterol to HDL cholesterol ratio and postprandial lipemic responses compared with a low fat diet in normal weight, normolipidemic women. Volek JS, et al. J Nutr. 2003 Sep;133(9):2756-61.

The low carb diet does two things. By limiting carbohydrate, it limits the concentration of blood glucose which often is enough to bring moderately elevated blood sugars down to normal or near normal levels. This means that there will be little excess glucose left to be converted to fat and stored.

It also gets around the mitochondrial defect in processing glucose by keeping blood sugars low so that the body switches into a mode where it burns ketones rather than glucose for muscle fuel.

Relentless Hunger Results from Roller Coaster Blood Sugars

There is one last reason why you may believe that obesity caused your diabetes, when, in fact, it was undiagnosed diabetes that caused your obesity.

Long before a person develops diabetes, they go through a phase where they have what doctors call "impaired glucose tolerance." This means that after they eat a meal containing carbohydrates, their blood sugar rockets up and may stay high for an hour or two before dropping back to a normal level.

What most people don't know is that when blood sugar moves swiftly up or down, most people will experience intense hunger. The reasons for this are not completely clear. But what is certain is that this intense hunger caused by blood sugar swings can develop years before a person's blood sugar reaches the level where they'll be diagnosed as diabetic.

This relentless hunger, in fact, is often the very first diabetic symptom a person will experience, though most doctors do not recognize this hunger as a symptom. Instead, if you complain of experiencing intense hunger doctors may suggest you need an antidepressant or blame your weight gain, if you are female, on menopausal changes.

This relentless hunger caused by impaired glucose tolerance almost always leads to significant weight gain and an increase in insulin resistance. However, because it can take ten years between the time your blood sugar begins to rise steeply after meals and the time when your fasting blood sugar is abnormal enough for you to be diagnosed with diabetes, most people are, indeed, very fat at the time of diagnosis.

With better diagnosis of diabetes (discussed here) we would be able to catch early diabetes before people gained the enormous amounts of weight now believed to cause the syndrome. But at least now people with diabetic relatives who are at risk for developing diabetes can go a long way towards preventing the development of obesity by controlling their carbohydrate intake long before they begin to put on weight.

You CAN Undo the Damage

No matter what your genetic heritage or the environmental insults your genes have survived, you can take steps right now to lower your blood sugar, eliminate the secondary insulin resistance caused by high blood sugars, and start the process that leads back to health. The pages linked here will show you how.

How To Get Your Blood Sugar Under Control

What Can You Eat When You Are Cutting The Carbs?

What is a Normal Blood Sugar

Research Connecting Blood Sugar Level with Organ Damage

The 5% Club: They Normalized Their Blood Sugar and So Can You

Read Full Post »

Obesity Issues

Larry H. Bernstein, MD, FCAP, Curator



The Changing Face of Obesity

Science tells us obesity is a chronic disease. Why does the outmoded and injurious notion that it is a problem of willpower persist?

By Joseph Proietto | November 1, 2015   http://www.the-scientist.com//?articles.view/articleNo/44288/title/The-Changing-Face-of-Obesity/

In Dante Alighieri's Divine Comedy, the narrator meets a man named Ciacco who had been sent to Hell for the "Damning sin of Gluttony." According to Catholic theology, in order to end up in Hell one must willfully commit a serious sin. So Dante believed that fat people chose to be fat. This antiquated view of the cause of obesity is still widespread, even among medical professionals. The consequences of this misconception are significant, because it forms the basis for the discrimination suffered by the obese; for the wasting of scarce resources in attempts to change lifestyle habits by public education; and for the limited availability of subsidized obesity treatments.


While obesity is often labeled a lifestyle disease, poor lifestyle choices alone account for only a 6 to 8 kg weight gain. The body has a powerful negative feedback system to prevent excessive weight gain. The strongest inhibitor of hunger, the hormone leptin, is made by fat cells. A period of increased energy intake will result in fat deposition, which will increase leptin production. Leptin suppresses hunger and increases energy expenditure. This slows down weight gain. To become obese, it may be necessary to harbor a genetic difference that makes the individual resistant to the action of leptin.

Evidence from twin and adoption studies suggests that obesity has a genetic basis, and over the past two decades a number of genes associated with obesity have been described. The most common genetic defect in European populations leading to severe obesity is due to mutations in the gene coding for the melanocortin 4 receptor (MC4R). Still, this defect can explain severe obesity in only approximately 6 percent to 7 percent of cases (J Clin Invest, 106:271-79, 2000). Other genes have been discovered that can cause milder increases in weight; for example, variants of just one gene (FTO) can explain up to 3 kg of weight variation between individuals (Science, 316:889-94, 2007).

Genes do not directly cause weight gain. Rather, genes influence the desire for food and the feeling of satiety. In an environment with either poor access to food or access to only low-calorie food, obesity may not develop even in persons with a genetic predisposition. When there is an abundance of food and a sedentary lifestyle, however, an obesity-prone person will experience greater hunger and reduced satiety, increasing caloric intake and weight gain.

Since the 1980s, there has been a rapid rise in the prevalence of obesity worldwide, a trend that likely results from a variety of complex causes. There is increasing evidence, for example, that the development of obesity on individual or familial levels may be influenced by environmental experiences that occur in early life. If a mother is malnourished during early pregnancy, this results in epigenetic changes to genes involved in the set points for hunger and satiety in the developing child. These changes may then become fixed, resulting in a tendency towards obesity in the offspring.

The biological basis of obesity is further highlighted by the vigorous defense of weight following weight loss. There are at least 10 circulating hormones that modulate hunger. Of these, only one has been confirmed as a hunger-inducing hormone (ghrelin), and it is made and released by the stomach. In contrast, nine hormones suppress hunger, including CCK, PYY, GLP-1, oxyntomodulin, and uroguanylin from the small bowel; leptin from fat cells; and insulin, amylin, and pancreatic polypeptide from the pancreas.


After weight loss, regardless of the diet employed, there are changes in circulating hormones involved in the regulation of body weight. Ghrelin levels tend to increase and levels of multiple appetite-suppressing hormones decrease. There is also a subjective increase in appetite. Researchers have shown that even after three years, these hormonal changes persist (NEJM, 365:1597-604, 2011; Lancet Diabetes and Endocrinology, 2:954-62, 2014). This explains why there is a high rate of weight regain after diet-induced weight loss.

Given that the physiological responses to weight loss predispose people to regain that weight, obesity must be considered a chronic disease. Data show that those who successfully maintain their weight after weight loss do so by remaining vigilant and constantly applying techniques to oppose weight regain. These techniques may involve strict diet and exercise practices and/or pharmacotherapy.

It is imperative for society to move away from a view that obesity is simply a lifestyle issue and to accept that it is a chronic disease. Such a change would not only relieve the stigma of obesity but would also empower politicians, scientists and clinicians to tackle the problem more effectively.

Joseph Proietto was the inaugural Sir Edward Dunlop Medical Research Foundation Professor of Medicine in the Department of Medicine, Austin Health at the University of Melbourne in Australia. He is a researcher and clinician investigating and treating obesity and type 2 diabetes.



A Weighty Anomaly

Why do some obese people actually experience health benefits?

By Jyoti Madhusoodanan | November 1, 2015     http://www.the-scientist.com//?articles.view/articleNo/44304/title/A-Weighty-Anomaly/


THE ENDOCRINE THEORY: Some researchers have posited that fat cells may secrete molecules that affect glucose homeostasis in muscle or liver tissue. COURTESY OF MITCHELL LAZAR

In the early 19th century, Belgian mathematician Adolphe Quetelet was obsessed with a shape: the bell curve. While helping with a population census, Quetelet proposed that the spread of human traits such as height and weight followed this trend, also known as a Gaussian or normal distribution. On a quest to define a "normal man," he showed that human height and weight data fell along his beloved bell curves, and in 1823 devised the "Quetelet Index," more familiar to us today as the BMI, or body mass index, a ratio of weight to the square of height.

Nearly two centuries later, clinicians, researchers, and fitness instructors continue to rely on this metric to pigeonhole people into categories: underweight, healthy, overweight, or obese. But Quetelet never intended the metric to serve as a way to define obesity. And now, a growing body of evidence suggests these categories fail to accurately reflect the health risks—or benefits—of being overweight.

Although there is considerable debate surrounding the prevalence of metabolically healthy obesity, when obesity is defined in terms of BMI (a BMI of 30 or higher), estimates suggest that about 10 percent of adults in the U.S. are obese yet metabolically healthy, while as many as 80 percent of those with a normal BMI may be metabolically unhealthy, with signs of insulin resistance and poor circulating lipid levels, even if they suffer no obvious ill effects. “If all we know about a person is that they have a certain body weight at a certain height, that’s not enough information to know their health risks from obesity,” says health-science researcher Paul McAuley of Winston-Salem State University. “We need better indicators of metabolic health.”
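The BMI categories referred to throughout this piece can be made concrete with a short sketch. The cut-points below are the standard adult conventions (including the BMI of 30 threshold for obesity mentioned above); as the text argues, they say little about individual metabolic health:

```python
# Quetelet's index (BMI = weight in kilograms divided by the square of
# height in meters) and the conventional adult categories built on it.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    elif value < 25:
        return "healthy"
    elif value < 30:
        return "overweight"
    return "obese"

b = bmi(95, 1.75)  # a hypothetical person: 95 kg, 1.75 m tall
print(round(b, 1), bmi_category(b))
```

A 95 kg, 1.75 m person lands just past the obesity cut-point, yet, as the studies discussed here show, nothing in that single number reveals whether they are metabolically healthy or not.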

The dangers of being overweight, such as a higher risk of heart disease, type 2 diabetes, and other complications, are well known. But some obese individuals—dubbed the “fat fit”—appear to fare better on many measures of health when they’re heavier. Studies have found lower mortality rates, better response to hemodialysis in chronic kidney disease, and lower incidence of dementia in such people. Mortality, it’s been found, correlates with obesity in a U-shaped curve (J Sports Sci, 29:773-82, 2011). So does extra heft help or hurt?

To answer that question, researchers are trying to elucidate the metabolic reasons for this obesity paradox.

In a recent study, Harvard University epidemiologist Goodarz Danaei and his colleagues analyzed data from nine studies involving a total of more than 58,000 participants to tease apart how obesity and other well-known metabolic risk factors influence the risk of coronary heart disease. Controlling these other risk factors, such as hypertension or high cholesterol, with medication is simpler than curbing obesity itself, Danaei explains. “If you control a person’s obesity you get rid of some health risks, but if you control hypertension or diabetes, that also reduces health risks, and you can do the latter much more easily right now.”

Danaei’s team assessed BMI and metabolic markers such as systolic blood pressure, total serum cholesterol, and fasting blood glucose. The three metabolic markers only explained half of the increased risk of heart disease across all study participants. In obese individuals, the other half appeared to be mediated by fat itself, perhaps via inflammatory markers or other indirect mechanisms (Epidemiology, 26:153-62, 2015). While Danaei’s study was aimed at understanding how obesity hurts health, the results also uncovered unknown mechanisms by which excess adipose tissue might exert its effects. This particular study revealed obesity’s negative effects, but might these unknown mechanisms hold clues that explain the obesity paradox?

Other researchers have suggested additional possibilities—for example, that inflammatory markers such as TNF-α help combat conditions such as chronic kidney disease, or that obesity makes a body more capable of making changes to, and tolerating changes in, blood flow depending on systemic needs (Am J Clin Nutr, 81:543-54, 2005).

According to endocrinologist Mitchell Lazar at the University of Pennsylvania, the key to explaining the obesity paradox may be two nonexclusive ways fat tissue is hypothesized to function. One mechanism, termed the endocrine theory, suggests that fat cells secrete, or don’t secrete enough of, certain molecules that influence glucose homeostasis in other tissues, such as muscle or liver. The first such hormone to be discovered was leptin; later studies reported several other adipocyte-secreted factors, including adiponectin, resistin, and various cytokines.

The other hypothesis, dubbed the spillover theory, suggests that storing lipids in fat cells has some pluses. Adipose tissue might sequester fat-soluble endotoxins, and produce lipoproteins that can bind to and clear harmful lipids from circulation. When fat cells fill up, however, these endotoxins are stashed in the liver, pancreas, or other organs—and that’s when trouble begins. In “fat fit” people, problems typically linked to obesity such as high cholesterol or diabetes may be avoided simply because their adipocytes mop up more endotoxins.

“In this model, one could imagine that if you could store even more fat in fat cells, you could be even more obese, but you might be protected from problems [associated with] obesity because you’re protecting the other tissues from filling up with lipids that cause problems,” says Lazar. “This may be the most popular current model to explain the fat fit.”

Although obesity greatly increases the risk of type 2 diabetes—up to 93-fold in postmenopausal women, for example—not all obese people suffer from the condition. Similarly, a certain subset of individuals with “normal” BMIs is at greater risk of developing insulin resistance and type 2 diabetes than others with BMIs in the same range. Precisely what distinguishes these two cohorts is still unclear. “Just as important as explaining why some obese people don’t get diabetes is to explain why other subgroups—normal-weight people or those with lipodystrophy—sometimes get it,” Lazar says. “If there are multiple subtypes of obesity and diabetes, can we figure out genetic aspects or biomarkers that cause one of these phenotypes and not the other?”

To Lazar, McAuley, and other researchers, it’s increasingly evident that BMI may not be that metric. Finding better ways to assess a healthy weight, however, has proven challenging. Researchers have tested measures such as A Body Shape Index (ABSI) and the waist-hip ratio, which attempt to gauge visceral fat, considered to be more metabolically harmful than fat in other body locations. However, these metrics have yet to be implemented widely in clinics, and few are as simple to understand as BMI (Science, 341:856-58, 2013).
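For readers curious how these measures are actually computed, here is a minimal Python sketch, assuming metric units and the published ABSI definition (waist circumference divided by BMI^(2/3) × height^(1/2)); the individual in the example is hypothetical:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight divided by height squared (kg/m^2)."""
    return weight_kg / height_m ** 2

def waist_hip_ratio(waist_m, hip_m):
    """Waist circumference divided by hip circumference (dimensionless)."""
    return waist_m / hip_m

def absi(waist_m, weight_kg, height_m):
    """A Body Shape Index: waist / (BMI^(2/3) * height^(1/2)).

    Unlike BMI, ABSI rises with waist size at a fixed weight and height,
    so it is more sensitive to centrally stored (visceral) fat.
    """
    return waist_m / (bmi(weight_kg, height_m) ** (2 / 3) * height_m ** 0.5)

# Hypothetical individual: 95 kg, 1.75 m tall, 1.00 m waist, 1.05 m hips
print(round(bmi(95, 1.75), 1))             # BMI in the obese range (>= 30)
print(round(waist_hip_ratio(1.00, 1.05), 2))
print(round(absi(1.00, 95, 1.75), 4))
```

Note that two people with identical BMIs can have quite different waist-hip ratios and ABSI values, which is precisely why these alternative metrics have been proposed.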

Independent of metrics, however, the health message regarding weight is still unanimous: exercise and healthy dietary choices benefit everyone. “At a certain point, despite all the so-called fit-fat people, the demographics say that there’s a huge risk of diabetes and heart disease at very high BMI,” notes Lazar. “We can’t assume we’ll be one of the lucky ones who will have a BMI in the obese category but will still be protected from heart disease.”





Science 23 Aug 2013; 341(6148): 856-858.   DOI: 10.1126/science.1241244
Obesity paradoxes.
In this review, we examine the original obesity paradox phenomenon (i.e. in cardiovascular disease populations, obese patients survive better), as well as three other related paradoxes (pre-obesity, “fat but fit” theory, and “healthy” obesity). An obesity paradox has been reported in a range of cardiovascular and non-cardiovascular conditions. Pre-obesity (defined as a body mass index of 25.0-29.9 kg · m⁻²) presents another paradox. Whereas “overweight” implies increased risk, it is in fact associated with decreased mortality risk compared with normal weight. Another paradox concerns the observation that when fitness is taken into account, the mortality risk associated with obesity is offset. The final paradox under consideration is the presence of a sizeable subset of obese individuals who are otherwise healthy. Consequently, a large segment of the overweight and obese population is not at increased risk for premature death. It appears therefore that low cardiorespiratory fitness and inactivity are a greater health threat than obesity, suggesting that more emphasis should be placed on increasing leisure time physical activity and cardiorespiratory fitness as the main strategy for reducing mortality risk in the broad population of overweight and obese adults.
Obesity, insulin resistance, and cardiovascular disease.
Recent Prog Horm Res. 2004;59:207-23.
The ability of insulin to stimulate glucose disposal varies more than six-fold in apparently healthy individuals. The one third of the population that is most insulin resistant is at greatly increased risk to develop cardiovascular disease (CVD), type 2 diabetes, hypertension, stroke, nonalcoholic fatty liver disease, polycystic ovary disease, and certain forms of cancer. Between 25% and 35% of the variability in insulin action is related to being overweight. The importance of the adverse effects of excess adiposity is apparent in light of the evidence that more than half of the adult population in the United States is classified as being overweight/obese, as defined by a body mass index greater than 25.0 kg/m². The current epidemic of overweight/obesity is most likely related to a combination of increased caloric intake and decreased energy expenditure. In either instance, the fact that CVD risk is increased as individuals gain weight emphasizes the gravity of the health care dilemma posed by the explosive increase in the prevalence of overweight/obesity in the population at large. Given the enormity of the problem, it is necessary to differentiate between the CVD risk related to obesity per se, as distinct from the fact that the prevalence of insulin resistance and compensatory hyperinsulinemia are increased in overweight/obese individuals. Although the majority of individuals in the general population that can be considered insulin resistant are also overweight/obese, not all overweight/obese persons are insulin resistant. Furthermore, the cluster of abnormalities associated with insulin resistance – namely, glucose intolerance, hyperinsulinemia, dyslipidemia, and elevated plasma C-reactive protein concentrations – is limited to the subset of overweight/obese individuals that are also insulin resistant.
Of greater clinical relevance is the fact that significant improvement in these metabolic abnormalities following weight loss is seen only in the subset of overweight/obese individuals that are also insulin resistant. In view of the large number of overweight/obese subjects at potential risk to be insulin resistant/hyperinsulinemic (and at increased CVD risk), and the difficulty in achieving weight loss, it seems essential to identify those overweight/obese individuals who are also insulin resistant and will benefit the most from weight loss, then target this population for the most-intensive efforts to bring about weight loss.
Long-Term Persistence of Hormonal Adaptations to Weight Loss

Priya Sumithran, Luke A. Prendergast, Elizabeth Delbridge, Katrina Purcell, Arthur Shulkes, Adamandia Kriketos, and Joseph Proietto

N Engl J Med 2011; 365:1597-1604   October 27, 2011   DOI: 10.1056/NEJMoa1105816

After weight loss, changes in the circulating levels of several peripheral hormones involved in the homeostatic regulation of body weight occur. Whether these changes are transient or persist over time may be important for an understanding of the reasons behind the high rate of weight regain after diet-induced weight loss.

Weight loss (mean [±SE], 13.5±0.5 kg) led to significant reductions in levels of leptin, peptide YY, cholecystokinin, insulin (P<0.001 for all comparisons), and amylin (P=0.002) and to increases in levels of ghrelin (P<0.001), gastric inhibitory polypeptide (P=0.004), and pancreatic polypeptide (P=0.008). There was also a significant increase in subjective appetite (P<0.001). One year after the initial weight loss, there were still significant differences from baseline in the mean levels of leptin (P<0.001), peptide YY (P<0.001), cholecystokinin (P=0.04), insulin (P=0.01), ghrelin (P<0.001), gastric inhibitory polypeptide (P<0.001), and pancreatic polypeptide (P=0.002), as well as hunger (P<0.001).

What’s new in endocrinology and diabetes mellitus

Large genome-wide association studies have demonstrated that variants in the FTO gene have the strongest association with obesity risk in the general population, but the mechanism of the association has been unclear. However, a noncoding causal variant in FTO has now been identified that changes the function of adipocytes from energy utilization (beige fat) to energy storage (white fat) with a fivefold decrease in mitochondrial thermogenesis [17]. When the effect of the variant was blocked in genetically engineered mice, thermogenesis increased and weight gain did not occur, despite eating a high-fat diet. Blocking the gene’s effect in human adipocytes also increased energy utilization. This observation has important implications for potential new anti-obesity drugs. (See “Pathogenesis of obesity”, section on ‘FTO variants’.)

Liraglutide for the treatment of obesity (July 2015)

Along with diet, exercise, and behavior modification, drug therapy may be a helpful component of treatment for select patients who are overweight or obese. Liraglutide is a glucagon-like peptide-1 (GLP-1) receptor agonist, used for the treatment of type 2 diabetes, and can promote weight loss in patients with diabetes, as well as those without diabetes.

In a randomized trial in nondiabetic patients who had a body mass index (BMI) of ≥30 kg/m² or ≥27 kg/m² with dyslipidemia and/or hypertension, liraglutide 3 mg once daily, compared with placebo, resulted in greater mean weight loss (-8.0 versus -2.6 kg with placebo) [18]. In addition, cardiometabolic risk factors, glycated hemoglobin (A1C), and quality of life improved modestly. Gastrointestinal side effects transiently affected at least 40 percent of the liraglutide group and were the most common reason for withdrawal (6.4 percent). Liraglutide is an option for select overweight or obese patients, although gastrointestinal side effects (nausea, vomiting) and the need for a daily injection may limit the use of this drug. (See “Obesity in adults: Drug therapy”, section on ‘Liraglutide’.)

In a trial designed specifically to evaluate the effect of liraglutide on weight loss in overweight or obese patients with type 2 diabetes (mean weight 106 kg), liraglutide, compared with placebo, resulted in greater mean weight loss (-6.4 kg and -5.0 kg for liraglutide 3 mg and 1.8 mg, respectively, versus -2.2 kg for placebo) [19]. Treatment with liraglutide was associated with better glycemic control, a reduction in the use of oral hypoglycemic agents, and a reduction in systolic blood pressure. Although liraglutide is not considered as initial therapy for the majority of patients with type 2 diabetes, it is an option for select overweight or obese patients with type 2 diabetes who fail initial therapy with lifestyle intervention and metformin.  (See “Glucagon-like peptide-1 receptor agonists for the treatment of type 2 diabetes mellitus”, section on ‘Weight loss’.)

The Skinny on Fat Cells

Bruce Spiegelman has spent his career at the forefront of adipocyte differentiation and metabolism.

By Anna Azvolinsky | November 1, 2015


Bruce Spiegelman
Stanley J. Korsmeyer Professor of Cell Biology and Medicine, Harvard Medical School
Director, Center for Energy Metabolism and Chronic Disease, Dana-Farber Cancer Institute, Boston

“It’s hard to know whether you have the right stuff to be a scientist, but I had a passion for the research,” says Bruce Spiegelman, professor of cell biology at Harvard Medical School and the Dana-Farber Cancer Institute. After receiving his PhD in biochemistry from Princeton University in 1978, Spiegelman sent an application to do postdoctoral research to just one lab. “I wasn’t thinking I should apply to five different labs. I just marched forward more or less in a straight line,” he says. Spiegelman did know that he had no financial backup and depended on research fellowships throughout the early phase of his science career. “I thought it was fantastic, and still think so, that a PhD in science is supported by the government. I certainly appreciated that, because many of my friends in the humanities had to support themselves by cobbling together fellowships and teaching every semester, whereas we didn’t face similar challenges in the sciences.”

Since his graduate student days, Spiegelman has realized his potential, pioneering the study of adipose tissue biology and metabolism. He was introduced to the field in Howard Green’s laboratory, then at MIT, where Spiegelman began his one and only postdoc in 1978. Green had recently developed a system for culturing adipose cells and asked Spiegelman if he wanted to study fat cell differentiation. “I knew nothing about adipose tissue, but I was really interested in any model of how one cell switches to another. Whether skin or fat didn’t matter too much to me, because I was not coming at this from the perspective of physiology but from the perspective of how do these switches work at a molecular level?”

Spiegelman has stuck with studying the biology and differentiation of fat cells for more than 30 years. While looking for the master transcriptional regulator of fat development—which his laboratory found in 1994—Spiegelman’s group also discovered one of the first examples of a nuclear oncogene that functions as a transcription factor, and, more recently, the team found that brown fat and white fat come from completely different origins and that brown and beige fat are distinct cell types. Spiegelman was also the first to provide evidence for the connection between inflammation, insulin resistance, and fat tissue.

Here, Spiegelman talks about his strong affinity for the East Coast, his laboratory’s search for molecules that can crank up brown fat production and activity, and the culture of his laboratory’s weekly meeting.

Spiegelman Sets Out

First publication. Spiegelman grew up in Massapequa, New York, a town on Long Island. “Birds, insects, fish, and animals were fascinating to me. As a kid, I imagined I would be a wildlife ranger,” he says. Spiegelman and his brother were the first in their family to attend college; Spiegelman entered the College of William and Mary in 1970 thinking he would major in psychology. But before taking his first psychology course, he had to take a biology course, really loved it, and switched his major. For his senior thesis, he chose one of the few labs that did biochemistry-related research. He studied cultures of the filamentous fungus Aspergillus ornatus in which he induced the upregulation of a metabolic enzyme. Spiegelman applied a calculus transformation that related the age of the culture to the age of individual cells, something that had not been previously done. The work earned him his first first-author publication in 1975. “It was not a great breakthrough, but I think it showed that I was maybe applying myself more than the typical undergraduate.”

Full steam ahead. “My interest in laboratory research was intense. Even though it was not particularly inspired work, the first-author publication in a college where not many of the professors published a lot gave me a lot of confidence. It was probably out of proportion to the quality of the actual work.” That confidence and Spiegelman’s interest in the chemistry of living things led him to pursue a PhD in biochemistry at Princeton University. “Very early on, I felt that I couldn’t understand biology if it didn’t go to the molecular level. To me, just describing how an animal lived without understanding how it worked was very unsatisfying. I think it was one of the best decisions that I made in my life, to do a PhD in biochemistry,” he says, “because if you really want to understand living systems, you are very limited in how you can understand them without having a strong background in biochemistry because these are, essentially, chemical systems.”

Embracing molecular biology. Spiegelman initially joined Arthur Pardee’s laboratory, but switched when Pardee left Princeton for Harvard University in 1975. Because he was already collaborating with Marc Kirschner, a cell biologist and biochemist who studies the regulation of the cell cycle and how the cytoskeleton works, it was an easy transition to transfer to the new laboratory. In Kirschner’s group, Spiegelman became the cell biologist among many protein biochemists working on microtubule assembly in vitro. Rather than understanding how the proteins fit together to form the filamentous structures, Spiegelman wanted to understand what controlled their assembly inside cells. Working in mammalian cells, Spiegelman published three consecutive Cell papers on how microtubule assembly occurs in vivo. The first paper, from 1977, demonstrated that a nucleotide functions to stabilize the tubulin molecule rather than to regulate tubulin assembly in vivo.

Spiegelman Simmers

A new tool. For his next move, Spiegelman wanted to marry his background in biochemistry and molecular biology with a good cellular model system. He became interested in differentiation at the end of his PhD, while studying how the cytoskeleton is reorganized during neural differentiation, and settled on Green’s MIT laboratory for his postdoc. Green had developed a way to study both skin and fat cell differentiation. Again, Spiegelman was the odd man out, working on the molecular biology of fat cell differentiation while most of the graduate students and postdocs focused on the cellular biology of skin cell differentiation. While there, Spiegelman learned how to clone cDNA—a new method that some researchers thought was just another new fad, he says. “I thought it was pretty obvious that this was a tool that would be a game changer. I could see how I could clone some of the cDNAs and genes that were regulated in the fat cell lineage and then try to understand the regulation of these genes.”

Setting the stage. Spiegelman demonstrated that cAMP regulates the synthesis of certain enzymes in fat cells during differentiation. But while this was the most influential paper from his postdoc, says Spiegelman, it was his demonstration of cloning mRNAs from adipocytes, published in 1983, that set the stage for cloning fat-selective genes. The work, mostly done when Spiegelman was already a new faculty member at the Dana-Farber Cancer Institute, stemmed from his learning molecular cloning in Phillip Sharp’s lab at MIT and Bryan Roberts’s lab at Harvard. “This was the raw material from which we eventually cloned PPARγ and showed it to be the master regulator of fat [cell] development.”

Roots. Spiegelman became an assistant professor at the Harvard Medical School in 1982, when he was not yet 30. Although he had entertained the idea of moving to the West Coast with his wife, whom he had met at Princeton where she obtained a PhD in French literature, Spiegelman says he is really an East Coaster at heart. “My wife and I came to love Boston and were very comfortable there. Our families were both in New York, which was close, but not too close, and we really enjoyed the culture and pace of Boston; it was more ‘us.’ We really liked to visit California but didn’t particularly want to move there. We’re both real Northeastern people.”

Relating to Sisyphus. The transition from doing a postdoc to setting up his own laboratory was “very exciting and terribly stressful,” says Spiegelman. “When I think back, I always tried to be professional with my laboratory, but I was so stressed at suddenly being on my own with no management training.” The people resources he had encountered in his graduate and postdoctoral training labs were also not there yet, and he says his first publication as a principal investigator was like pushing a rock up a hill. But eventually, Spiegelman’s lab built a reputation and reached a critical mass of talented people who advanced the science. Again in 1983, Spiegelman produced a publication showing that morphological manipulation can affect gene expression and adipose differentiation.

End goal. Spiegelman’s goal was to find a master molecule that orchestrates the conversion of adipocyte precursor cells into bona fide fat cells. Piece by piece, his lab identified the enhancers, promoters, and other regulatory elements involved in adipocyte differentiation. In 1994, graduate student Peter Tontonoz finally found that the PPARγ gene, inserted via a retroviral vector into fibroblasts, could induce the cells to become adipose cells. “It took 10 years,” Spiegelman says. Along the way, the laboratory found that c-fos, the product of a famous nuclear oncogene, bound to the promoters of fat-specific genes and worked as a transcription factor. “It was not really known how nuclear oncogenes worked. This was one of the first papers showing that these oncogenes bound to gene promoters and were transcription factors.”

A wider scope. In 1993, graduate student Gökhan Hotamisligil found that tumor necrosis factor-alpha (TNF-α) is induced in the fat tissue of rodent models of obesity and diabetes. The paper sparked the formation of the field of immunometabolism and resulted in the expansion of Spiegelman’s lab into the physiology arena, partly thanks to the guidance of C. Ronald Kahn and Jeff Flier, who both study metabolism and diabetes. But the work initially encountered pushback, says Spiegelman, partly because it was the merging of two fields.

Spiegelman Scales Up

Fat color palette. Brown fat tissue, abundant in infants but scarce in adults, is a metabolically active form of fat that is chock full of mitochondria and is found in pockets in the body distinct from white fat tissue. Pere Puigserver, then a postdoc in Spiegelman’s lab, found that the coactivator PGC-1, binding to PPARγ and other nuclear receptors, could stimulate mitochondrial biogenesis. The PGC-1 gene is turned on by stimuli such as exercise or a cold environment. Later, postdoc Patrick Seale, Spiegelman, and their colleagues showed brown fat cells derive from the same lineage that gives rise to skeletal muscle. “This was a big surprise, maybe the biggest surprise we ever uncovered in the lab,” says Spiegelman.

A paler shade of brown. More recently, in 2012, Spiegelman’s laboratory showed that within adult white adipose tissue, there are pockets of yet another type of fat tissue that he called beige fat. “I think the evidence is very good from rodents that if you activate brown and beige fat, you get metabolic benefit both in obesity and diabetes. So the question now is: Can that be done in humans in a way that’s beneficial and not toxic?” The lab is now looking to identify molecules that can either ramp up the activity of brown and beige fat or increase the production of both cell types as possible therapeutics for metabolic disorders or even cancer-associated cachexia. “Anyone who says that either approach will work better is being foolish. We just don’t know enough to go after just one or the other.”

On the irisin controversy. After reporting in 2012 that a muscle-related hormone called irisin could switch white fat to metabolically active brown fat, Spiegelman became embroiled in a media-covered debate about whether the molecule really exists; he was also the victim of a potential fraud plot. Most recently, Spiegelman provided thorough evidence that irisin does in fact exist. On the controversy, he says it’s a fine line between defending his scientific integrity and not adding more fuel to the fire or engaging with his harassers. “We have a long track record of doing credible and reproducible science and it was not that complicated to address the paper that claimed irisin was ‘a myth.’ That study used very outmoded scientific approaches.”

Raw talent. Many of Spiegelman’s trainees have gone on to become very successful scientists, including Tontonoz, Hotamisligil, Evan Rosen, and Randy Johnson. “It’s a quantum change in the experience of doing science when you get people who have their own visions. I would have thought that interacting with smart people would mainly help me get my scientific vision accomplished. And that was partly true, but also it changed my vision. When you have people challenging you on a day-to-day basis, you learn from them through the questions they ask and the way they challenge you in a constructive way. They made me a much better scientist.”

Rigorous mentorship.  “I feel very passionately that a major part of my job is to prepare the next generation of scientists. Everyone who comes through my lab will tell you that I take that very seriously. We make sure my students give a lot of talks and get critical assessments of their presentations to our lab group. I am very hands-on both scientifically and in developing the way students project their vision. I had a very good mentor, Marc Kirschner, and I’d like to think that I learned how to be a mentor from him. I want to make sure that when people walk out of my lab they are prepared to run independent research programs.”

Greatest Hits

  • Identified the master regulator of adipogenesis, the nuclear receptor PPARγ
  • Was the first to show that a nuclear oncogene, c-fos, codes for a transcription factor that binds to the promoters of genes
  • Demonstrated that adipose tissue synthesizes tumor necrosis factor-alpha (TNF-α), providing the first direct link between obesity, inflammation, insulin resistance, and fat tissue.
  • Showed that brown fat cells are not developmentally related to white fat
  • Identified beige fat as a distinct cell type, different from either white or brown fat


Fanning the Flames

Obesity triggers a fatty acid synthesis pathway, which in turn helps drive T cell differentiation and inflammation.

By Kate Yandell | November 1, 2015



The paper
Y. Endo et al., “Obesity drives Th17 cell differentiation by inducing the lipid metabolic kinase, ACC1,” Cell Reports, 12:1042-55, 2015.

Cell Rep. 2015 Aug 11;12(6):1042-55.   DOI: 10.1016/j.celrep.2015.07.014. Epub 2015 Jul 30.
Obesity Drives Th17 Cell Differentiation by Inducing the Lipid Metabolic Kinase, ACC1.
  • A high-fat diet augments Th17 cell development and the expression of Acaca
  • ACC1 controls Th17 cell development in vitro and Th17 cell pathogenicity in vivo
  • ACC1 modulates RORγt function in developing Th17 cells
  • Obesity in humans induces ACACA and IL-17A expression in CD4 T cells

Chronic inflammation due to obesity contributes to the development of metabolic diseases, autoimmune diseases, and cancer. Reciprocal interactions between metabolic systems and immune cells have pivotal roles in the pathogenesis of obesity-associated diseases, although the mechanisms regulating obesity-associated inflammatory diseases are still unclear. In the present study, we performed transcriptional profiling of memory phenotype CD4 T cells in high-fat-fed mice and identified acetyl-CoA carboxylase 1 (ACC1, the gene product of Acaca) as an essential regulator of Th17 cell differentiation in vitro and of the pathogenicity of Th17 cells in vivo. ACC1 modulates the DNA binding of RORγt to target genes in differentiating Th17 cells. In addition, we found a strong correlation between IL-17A-producing CD45RO(+) CD4 T cells and the expression of ACACA in obese subjects. Thus, ACC1 confers the appropriate function of RORγt through fatty acid synthesis and regulates the obesity-related pathology of Th17 cells.






FEEDING INFLAMMATION: When mice eat a diet high in fat, their CD4 T cells show increased expression of the fatty acid biosynthesis gene Acaca, which encodes the enzyme ACC1 (1). Products of the ACC1 fatty acid synthesis pathway encourage the transcription factor RORγt to bind near the gene encoding the cytokine IL-17A (2). There, RORγt recruits an enzyme called p300 to modify the genome epigenetically and turn on IL-17A. The memory T cells then differentiate into inflammatory T helper 17 cells.

Obesity often comes with a side of chronic inflammation, causing inflammatory chemicals and immune cells to flood adipose tissue, the hypothalamus, the liver, and other areas of the body. Inflammation is a big part of what makes obesity such an unhealthy condition, contributing to Type 2 diabetes, heart disease, cancers, autoimmune disorders, and possibly even neurodegenerative diseases.

To better understand the relationship between obesity and inflammation, Toshinori Nakayama, Yusuke Endo, and their colleagues at Chiba University in Japan started with what often leads to obesity: a high-fat diet. They fed mice rich meals for a couple of months and looked at how gene expression in the animals’ T cells compared to gene expression in the T cells of mice fed a normal diet. Most notably, they found increased expression of Acaca, a gene that codes for a fatty acid synthesis enzyme called acetyl-CoA carboxylase 1 (ACC1). They went on to show that the resulting increase in fatty acid levels pushed CD4 T cells to differentiate into inflammatory T helper 17 (Th17) cells.

Th17 cells help fight off invading fungi and some bacteria. But these immune cells can also spin out of control in autoimmune diseases such as multiple sclerosis. Nakayama’s team showed that either blocking ACC1 activity with a drug called TOFA or deleting a key portion of Acaca in mouse CD4 T cells reduced the generation of pathologic Th17 cells. Overexpressing Acaca increased Th17-cell generation.

The researchers also demonstrated that mice fed a high-fat diet had elevated susceptibility to a multiple sclerosis–like disease, and that TOFA reduced the symptoms.

“This is a very intriguing finding, suggesting not only that obesity can directly induce Th17 differentiation but also indicating that pharmacologic targeting of fatty acid synthesis may help to interfere with obesity-associated inflammation,” Tim Sparwasser of the Twincore Center for Experimental and Clinical Infection Research in Hannover, Germany, says in an email. Sparwasser and his colleagues had previously shown that ACC1 is required for the differentiation of Th17 cells in mice and humans.

Nakayama explains that CD4 T cells must undergo profound metabolic changes as they mature and differentiate. “The intracellular metabolites, including fatty acids, are essential for cell proliferation and cell growth,” he says in an email. When fatty acid levels in T cells increase, the cells are activated and begin to proliferate.

“It’s a nice illustration of how, really, immune response is so highly connected to the metabolic state of the cell,” says Gökhan S. Hotamisligil of Harvard University’s T.H. Chan School of Public Health, who was not involved in the study. “The immune system launches its responses commensurate with the sources of nutrients and energy from the environment,” he adds in an email.

There are still missing pieces in the path from high-fat diet to increased Acaca expression to ACC1’s influence on T-cell differentiation. It also remains to be seen how this plays out in obese humans, although Nakayama and colleagues did show that inhibiting ACC1 reduced pathologic Th17 generation in human immune cell cultures, and that the T cells of obese humans contain elevated levels of ACC1 and show signs of increased differentiation into Th17 cells.


The prevalence of obesity has been increasing worldwide, and obesity is now a major public health problem in most developed countries (Gregor and Hotamisligil, 2011, Ng et al., 2014). Obesity-induced inflammation contributes to the development of various chronic diseases, such as autoimmune diseases, metabolic diseases, and cancer (Kanneganti and Dixit, 2012, Kim et al., 2014, Osborn and Olefsky, 2012, Winer et al., 2009a). A number of studies have pointed out the importance of reciprocal interactions between metabolic systems and immune cells in the pathogenesis of obesity-associated diseases (Kaminski and Randall, 2010, Kanneganti and Dixit, 2012, Kim et al., 2014, Mauer et al., 2014, Stienstra et al., 2012, Winer et al., 2011).

Elucidating the molecular mechanisms by which naive CD4 T cells differentiate into effector T cells is crucial for understanding helper T (Th) cell-mediated immune pathogenicity. After antigen stimulation, naive CD4 T cells differentiate into at least four distinct Th cell subsets: Th1, Th2, Th17, and inducible regulatory T (iTreg) cells (O’Shea and Paul, 2010, Reiner, 2007). Several specific master transcription factors that regulate Th1/Th2/Th17/iTreg cell differentiation have been identified, including T-bet for Th1 (Szabo et al., 2000), GATA3 (Yamashita et al., 2004, Zheng and Flavell, 1997) for Th2, retinoic-acid-receptor-related orphan receptor γt (RORγt) for Th17 (Ivanov et al., 2006), and forkhead box protein 3 (Foxp3) for iTreg (Sakaguchi et al., 2008). The appropriate expression and function of these transcription factors is essential for proper immune regulation by each Th cell subset.

Among these Th cell subsets, Th17 cells contribute to the host defense against fungi and extracellular bacteria (Milner et al., 2008). However, the pathogenicity of IL-17-producing T cells has been recognized in various autoimmune diseases, including multiple sclerosis, psoriasis, inflammatory bowel diseases, and steroid-resistant asthma (Bettelli et al., 2006, Coccia et al., 2012, Ivanov et al., 2006, Leonardi et al., 2012, McGeachy and Cua, 2008, Nylander and Hafler, 2012, Stockinger et al., 2007, Sundrud et al., 2009).

An HFD Promotes Th17 Cell Differentiation and Affects the Expression of Fatty Acid Enzymes in Memory CD4 T Cells In Vivo

Inhibition of ACC1 Function Results in Decreased Th17 Cell Differentiation and Ameliorates the Development of Autoimmune Disease

ACC1 Controls the Differentiation of Th17 Cells Both In Vitro and In Vivo

ACC1 Controls the Function, but Not Expression, of RORγt in Differentiating Th17 Cells

Extrinsic Fatty Acid Supplementation Restored Acaca−/− Th17 Cell Differentiation through the Functional Improvement of RORγt

Obese Subjects Show Upregulation of ACACA and Increased Th17 Cells in CD45RO+ Memory CD4 T Cells

We herein identified a critical role that ACC1 plays in Th17 cell differentiation and the pathogenicity of Th17 cells through the control of RORγt function under obese conditions. High-fat-induced obesity augments Th17 cell differentiation and the expression of enzymes involved in fatty acid metabolism, including ACC1. Pharmacological inhibition or genetic deletion of ACC1 resulted in impaired Th17 cell differentiation in both mice and humans. In contrast, overexpression of Acaca induced Th17 cells in vivo, leaving the expression of Ifng and Il4 largely unchanged. ACC1 modulated the binding of RORγt to the Il17a gene and the subsequent p300 recruitment in differentiating Th17 cells. Memory CD4 T cells from peripheral blood mononuclear cells (PBMCs) of obese subjects showed increased IL-17A production and ACACA expression. Furthermore, a strong correlation was detected between the proportion of IL-17A-producing cells and the expression level of ACACA in memory CD4 T cells in obese subjects. Thus, our findings provide evidence of a mechanism wherein obesity can exacerbate IL-17-mediated pathology via the induction of ACC1.

Read Full Post »

ZYDPLA 1, New Antidiabetic in Gliptin Class

Larry H. Bernstein, MD, FCAP, Curator



ZYDPLA 1 From ZYDUS CADILA, a new NCE in Gliptin class of antidiabetic agents.







ZYDPLA1 is a novel compound in the Gliptin class of antidiabetic agents. It works by blocking the enzyme Dipeptidyl Peptidase-4 (DPP-4), which inactivates the Incretin hormone GLP-1. By increasing the GLP-1 levels, ZYDPLA1 glucose-dependently increases insulin secretion and lowers glucagon secretion.


Zydus announces data presentations on ZYDPLA1, “A once-weekly small molecule DPP-IV inhibitor for treating diabetes,” at the ENDO conference in Chicago, Illinois, USA.

Ahmedabad, India, June 9, 2014: The Zydus group will be presenting data on its molecule ZYDPLA1, a novel compound in the Gliptin class of anti-diabetic agents, during the joint meeting of the International Society of Endocrinology and the Endocrine Society, ICE/ENDO 2014, to be held from June 21-24, 2014 in Chicago, Illinois.

ZYDPLA1, currently in Phase I clinical evaluation in USA, is an orally active, small molecule NCE, discovered and developed by the Zydus Research Centre. ZYDPLA1 works by blocking the enzyme Dipeptidyl Peptidase-4 (DPP-4), which inactivates the Incretin hormone GLP-1. By increasing the GLP- 1 levels, ZYDPLA1 glucose-dependently increases insulin secretion. This results in an overall improvement in the glucose homoeostasis, including reduction in HbA1c and blood sugar levels.

The Chairman & Managing Director of Zydus, Mr. Pankaj R. Patel said, “Currently, all available DPP-4 inhibitors are dosed once-daily. ZYDPLA1 with a once-a-week dosing regimen would provide diabetic patients with a more convenient treatment alternative. ZYDPLA1 will offer sustained action, which will result in an improved efficacy profile.”

The abstract of Poster Number: LB-PP02-4 can also be viewed on the ENDO web program at https://endo.confex.com/endo/2014endo/webprogram/authora.html. The Poster Preview is scheduled on Sunday, June 22, 2014 at McCormick Place West.

The number of diabetics in the world is estimated to be over 360 million. In 2025, nearly half of the world’s diabetic population will be from India, China, Brazil, Russia and Turkey. Sales of DPP-IV inhibitors are expected to peak at almost $14 billion by 2022. Research in the field of anti-diabetic therapy seeks to address the problems of hypoglycemia, GI side effects, lactic acidosis, weight gain, CV risks, edema and potential immunogenicity, which pose a major challenge in the treatment of diabetes.

About Zydus

Headquartered in Ahmedabad, India, Zydus Cadila is an innovative, global pharmaceutical company that discovers, manufactures and markets a broad range of healthcare therapies. The group employs over 16,000 people worldwide, including over 1,100 scientists engaged in R&D, and is dedicated to creating healthier communities globally. As a leading healthcare provider, it aims to become a global research-based pharmaceutical company by 2020. The group has a strong research pipeline of NCEs, biologics and vaccines which are in various stages of clinical trials, including late stage.

About Zydus Research Centre

The Zydus Research Centre has over 20 discovery programmes in the areas of cardio-metabolic disorders, pain, inflammation and oncology. Zydus has in-house capabilities to conduct discovery research from concept to IND-enabling pre-clinical development and human proof-of-concept clinical trials. The Zydus Research group identified and developed Lipaglyn™ (Saroglitazar), which has now become India’s first NCE to reach the market. Lipaglyn™ is a breakthrough therapy in the treatment of diabetic dyslipidemia and hypertriglyceridemia. The company recently announced the commencement of Phase III trials of Lipaglyn™ (Saroglitazar) in patients suffering from lipodystrophy.



Rajendra Kharul, Mukul R. Jain, Pankaj R. Patel. Substituted benzamide derivatives as glucokinase (GK) activators.

Zydus announces US FDA approval for initiating Phase I clinical trials of ‘ZYDPLA1’, a novel next-generation orally active, small molecule DPP-4 inhibitor to treat Type 2 Diabetes.

Ahmedabad, October 23, 2013
• Zydus strengthens its cardiometabolic pipeline with the addition of ZYDPLA1
• Novel next generation New Chemical Entity (NCE) would offer a once-a-week oral treatment option, a significant benefit to Type 2 diabetic patients

Close on the heels of launching Lipaglyn, the breakthrough therapy to treat diabetic dyslipidemia and India’s first NCE to reach the market, the Zydus group announced the Phase I clinical trial approval from the USFDA for ZYDPLA1, a next-generation, long-acting DPP-4 inhibitor.
ZYDPLA1 is an orally active, small molecule NCE, discovered and developed by the Zydus Research Centre, the NCE research wing of Zydus. ZYDPLA1 is a novel compound in the Gliptin class of antidiabetic agents. It works by blocking the enzyme Dipeptidyl Peptidase-4 (DPP-4), which inactivates the Incretin hormone GLP-1. By increasing the GLP-1 levels, ZYDPLA1 glucose-dependently increases insulin secretion and lowers glucagon secretion. This results in an overall improvement in the glucose homoeostasis, including reduction in HbA1c and blood sugar levels.
Currently, all available DPP-4 inhibitors are dosed once daily. ZYDPLA1, with a once-a-week dosing regimen, would provide diabetic patients with a more convenient treatment alternative. ZYDPLA1 will offer sustained action, which will result in an improved efficacy profile.
Speaking on the new development, Mr. Pankaj R. Patel, Chairman and Managing Director, Zydus Group, said, “After a promising start with Lipaglyn, we take another big leap forward in the area of diabetic research and long term management of Type 2 diabetes. The IND approval by USFDA is another major regulatory milestone for us. We believe that ZYDPLA1 holds promise and would take us closer to our mission of reducing the burden of chronic diseases and addressing unmet medical needs in the treatment of diabetes.”
The number of diabetics in the world is estimated to be over 360 million. In 2025, nearly half of the world’s diabetic population will be from India, China, Brazil, Russia and Turkey. Sales of DPP-IV inhibitors are expected to peak at almost $14 billion by 2022. Research in the field of anti-diabetic therapy seeks to address the problems of hypoglycemia, GI side effects, lactic acidosis, weight gain, CV risks, edema and potential immunogenicity, which pose a major challenge in the treatment of diabetes.
Zydus is the only Indian pharma company to launch its own patented NCE – Lipaglyn™, the world’s first drug to be approved for the treatment of diabetic dyslipidemia. It aims to be a leading global healthcare provider with a robust product pipeline, achieve sales of over $3 billion by 2015 and be a research-based pharmaceutical company by 2020.
The Zydus Research Centre has over 20 discovery programmes ongoing, with several candidates in the pre-clinical development stage focused on metabolic, cardiovascular, pain, inflammation and oncology therapeutic areas. With over 400 research professionals spearheading its research programme, Zydus has in-house capabilities to conduct discovery research from concept to IND-enabling pre-clinical development and human proof-of-concept clinical trials. ZYDPLA1 is the latest addition to the group’s strong research pipeline of 6 NCEs which are in various stages of clinical trials. For more information, please visit: http://www.zyduscadila.com



Read Full Post »

Phase I/II Hepato-specific Glucokinase Activator

Larry H. Bernstein, MD, FCAP, Curator


Advinus Therapeutics announced that it has successfully completed a 14-day POC study in 60 Type II diabetic patients on its lead molecule, GKM-001, a glucokinase activator. The results of the trial show effective glucose lowering across all doses tested without any incidence of hypoglycemia or any other clinically relevant adverse events.

GKM-001 is differentiated from most other GK molecules that are in development, or have been discontinued, due to its novel liver selective mechanism of action.

GKM-001 belongs to a novel class of molecules for treatment of type II diabetes. It is an activator of Glucokinase (GK), a glucose-sensing enzyme found mainly in the liver and pancreas. Being liver selective, GKM-001 mostly activates GK in the liver and not in pancreas, which is its key differentiation from most competitor molecules that activate GK in pancreas as well.

GKM 001 in pipeline for Diabetes by Advinus


GKM 001

Advinus Therapeutics Private L,

A glucokinase activator for the treatment of type II diabetes, currently in Phase I. Advinus is actively exploring partnership options to expedite further development and worldwide marketing of GKM-001.

Company Advinus Therapeutics Ltd.
Description Activator of glucokinase (GCK; GK)
Molecular Target Glucokinase (GCK) (GK)
Mechanism of Action Glucokinase activator
Therapeutic Modality Small molecule
Latest Stage of Development Phase I/II
Standard Indication Diabetes
Indication Details Treat Type II diabetes



Example C1: (-)-{5-Chloro-2-[2-(4-cyclopropanesulfonylphenyl)-2-(2,4-difluorophenoxy)acetylamino]thiazol-4-yl}-acetic acid, ethyl ester
1H NMR (400 MHz, CDCl3): δ 1.06-1.08 (m, 2H), 1.30 (t, J=7.2 Hz, 3H), 1.33-1.38 (m, 2H), 2.42-2.50 (m, 1H), 3.73 (d, J=2 Hz, 2H), 4.22 (q, J=7.2 Hz, 2H), 5.75 (s, 1H), 6.76-6.77 (m, 1H), 6.83-6.86 (m, 1H), 6.90-6.98 (m, 1H), 7.73 (d, J=8.4 Hz, 2H), 7.96 (d, J=8.4 Hz, 2H), 9.96 (bs, 1H). MS (EI) m/z: 571.1 and 573.1 (M+1; for 35Cl and 37Cl respectively).

Examples C2 and C3 were prepared in an analogous manner to example C1 from the appropriate chiral intermediate:


Example D1: (+)-{5-Chloro-2-[2-(4-cyclopropanesulfonylphenyl)-2-(2,4-difluorophenoxy)acetylamino]thiazol-4-yl}acetic acid, ethyl ester

Advinus’ GK-activator Achieves Early POC for Diabetes

November 29 2011

Partnership Dialog Actively Underway

Advinus Therapeutics, a research-based pharmaceutical company founded by globally experienced industry executives and promoted by the TATA Group, announced that it has successfully completed a 14-day POC study in 60 Type II diabetic patients on its lead molecule, GKM-001, a glucokinase activator. The results of the trial show effective glucose lowering across all doses tested without any incidence of hypoglycemia or any other clinically relevant adverse events.

The clinical trials on GKM-001 validate the company’s pre-clinical hypothesis that a liver selective Glucokinase activator would not cause hypoglycemia (very low blood sugar), while showing robust efficacy.

“GKM-001 is differentiated from most other GK molecules that are in development, or have been discontinued, due to its novel liver selective mechanism of action. GKM-001 has a prolonged pharmacological effect and a half-life that should support a once a day dosing as both mono and combination therapy.” said Dr. Rashmi Barbhaiya, MD & CEO, Advinus Therapeutics. He added that Advinus is actively exploring partnership options to expedite further development and global marketing of GKM-001.

GKM-001 belongs to a novel class of molecules for the treatment of type II diabetes. It is an activator of Glucokinase (GK), a glucose-sensing enzyme found mainly in the liver and pancreas. Being liver selective, GKM-001 mostly activates GK in the liver and not in the pancreas, which is its key differentiation from most competitor molecules, which activate GK in the pancreas as well. The resulting increase in insulin secretion creates a potential for hypoglycemia, a risk GKM-001 is designed to avoid. Advinus has the composition-of-matter patent on GKM-001 for all major markets globally. Both the Single Ascending Dose data, in healthy subjects and type II diabetics, and the Multiple Ascending Dose study in type II diabetics have shown that the molecule lowers glucose effectively in a dose-dependent manner and has an excellent safety and tolerability profile over a 40-fold dose range. The pharmacokinetic properties of the molecule support once-a-day dosing. GKM-001 has the potential to be a “first-in-class” drug to address this large, growing and yet poorly addressed market.

Advinus also has identified a clinical candidate as a back-up to GKM-001, which is structurally different. In its portfolio, the company has a growing pipeline for COPD, sickle cell disease, inflammatory bowel disease, type 2 diabetes, acute and chronic pain and rheumatoid arthritis in various stages of late discovery and pre-clinical development.

Advinus Therapeutics team discovers novel molecule for treatment of diabetes

  • The first glucokinase modulator discovered and developed in India 
  • A new concept for the management of diabetes for patients, globally 
  • 100 per cent ‘made in India’ molecule for the treatment of diabetes 
  • IND approved by DCGI; Phase I clinical trial shows excellent safety and tolerance profiles with efficacy

Bangalore: Advinus Therapeutics (Advinus), the research-based pharmaceutical company founded by leading global pharmaceutical executives and promoted by the Tata group, today announced the discovery of a novel molecule for the treatment of type II diabetes, GKM-001. The molecule is an activator of glucokinase, an enzyme that regulates glucose balance and insulin secretion in the body.

GKM-001 is a completely indigenously developed molecule and the initial clinical trials have shown excellent results for both safety and efficacy.

“Considering past failures of other companies on this target, our discovery programme primarily focused on identifying a molecule that would be efficacious without causing hypoglycaemia; a side effect associated with most compounds developed for this target.

“Recently completed Phase I data indicate that Advinus’ GKM–001 is a liver selective molecule that has overcome the biggest clinical challenge of hypoglycaemia. GKM-001 is differentiated from most other GK molecules in development due to this novel mechanism of action,” said Dr Rashmi Barbhaiya, MD and CEO, Advinus Therapeutics.

He further added, “We are very proud that GKM-001 is 100 per cent Indian. Advinus’s discovery team in Pune discovered the molecule, and the entire preclinical development was carried out at our centre in Bangalore. The Investigational New Drug (IND) application was filed with the DCGI for approval to initiate clinical trials in India within 34 months of initiation of the discovery programme. Subsequent to the approval of the IND, we have completed the Phase I Single Ascending Dose study in India within two months.”

GKM-001 is a novel molecule for the treatment of type II diabetes. It is the first glucokinase modulator discovered and developed in India and has the potential to be first-in-class or best-in-class. The success in discovering GKM-001 is attributed to the science-driven efforts in Advinus laboratories and ‘breaking the conventional mold’ in the selection of a drug candidate. Advinus has a ‘composition of matter’ patent on the molecule for all major markets globally. Glucokinase as a target is considered novel, as there is currently no product in the market or in late clinical trials. The strategy for early clinical development revolved around assessing safety (particularly hypoglycaemia) and early assessment of therapeutic activity (glucose lowering and other biomarkers) in type II diabetics. The Phase I data, in both healthy subjects and type II diabetics, show excellent safety and tolerability over a 40-fold dose range and desirable pharmacokinetic properties consistent with ‘once a day’ dosing. The next wave of planned clinical studies continues this strategy of early testing in type II diabetics.

Right behind the lead candidate GKM-001, Advinus has a rich pipeline of back up compounds on the same target. These include several structurally different compounds with diverse potency, unique pharmacology and tissue selectivity. Having discovered the molecule with early indication of wide safety margins, desired efficacy and pharmacokinetic profiles, the company now seeks to out-licence GKM-001 and its discovery portfolio.

Kasim A. Mookhtiar, , Debnath Bhuniya, Siddhartha De, Anita Chugh, Jayasagar
Gundu, Venkata Palle, Dhananjay Umrani, Nimish Vachharajani, Vikram
Ramanathan and Rashmi H. Barbhaiya
Advinus Therapeutics Ltd, Hinjewadi, Pune – 411057, and Peenya Industrial Area,
Bangalore – 560058, India


WO 2008104994

WO 2008149382

WO 2009047798

WO2008104994A2, 25 Feb 2008, 4 Sep 2008, Advinus Therapeutics Private L: 2,2,2-tri-substituted acetamide derivatives as glucokinase activators, their process and pharmaceutical application



Read Full Post »

Brain and Cognition

Larry H. Bernstein, MD, FCAP, Curator


Brain activity may be as unique as fingerprints

Tue, 10/13/2015 – Bill Hathaway, Yale Univ.



A person’s brain activity appears to be as unique as his or her fingerprints, a new Yale Univ.-led imaging study shows. These brain “connectivity profiles” alone allow researchers to identify individuals from the fMRI images of brain activity of more than 100 people, according to the study published in Nature Neuroscience.

“In most past studies, fMRI data have been used to draw contrasts between, say, patients and healthy controls,” said Emily Finn, a PhD student in neuroscience and co-first author of the paper. “We have learned a lot from these sorts of studies, but they tend to obscure individual differences which may be important.”

Finn and co-first author Xilin Shen, under the direction of R. Todd Constable, professor of diagnostic radiology and neurosurgery at Yale, compiled fMRI data from 126 subjects who underwent six scan sessions over two days. Subjects performed different cognitive tasks during four of the sessions. In the other two, they simply rested. Researchers looked at activity in 268 brain regions: specifically, coordinated activity between pairs of regions. Highly coordinated activity implies two regions are functionally connected. Using the strength of these connections across the whole brain, the researchers were able to identify individuals from fMRI data alone, whether the subject was at rest or engaged in a task. They were also able to predict how subjects would perform on tasks.
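The identification procedure described above (pairwise correlations among brain regions, then matching a new scan against stored whole-brain connection strengths) can be sketched in a few lines of NumPy. This is a minimal illustration, not the study's actual code: the function names are invented, and the similarity measure, Pearson correlation between vectorized connectivity profiles, is an assumption based on the description in the text.

```python
import numpy as np

def connectivity_profile(ts):
    """Vectorize the upper triangle of the region-by-region correlation matrix.

    ts: array of shape (n_timepoints, n_regions), one column per brain region.
    """
    corr = np.corrcoef(ts, rowvar=False)   # (n_regions, n_regions) matrix
    iu = np.triu_indices_from(corr, k=1)   # each region pair counted once
    return corr[iu]

def identify(target_profile, database_profiles):
    """Return the index of the stored profile most similar to the target,
    using Pearson correlation between vectorized connectomes."""
    sims = [np.corrcoef(target_profile, p)[0, 1] for p in database_profiles]
    return int(np.argmax(sims))
```

Given one stored profile per subject from one session and a profile from a different session, `identify` picks the subject whose stored profile matches best, which is the essence of treating connectivity as a fingerprint.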

Finn said she hopes that this ability might one day help clinicians predict or even treat neuropsychiatric diseases based on individual brain connectivity profiles.

Brain Activity Identifies Individuals

By Kerry Grens

Neural connectome patterns differ enough between people to use them as a fingerprint.

New Alzheimer’s Gene Identified

Megan Brooks


Researchers have identified a new gene involved in the immune system that increases the risk for Alzheimer’s disease (AD), providing a potential new target for prevention and treatment.

They found that older adults at risk for AD and those with the disease who carry a specific variant in the interleukin-1 receptor accessory protein (IL1RAP) had higher rates of amyloid plaque accumulation in the brain over 2 years. The effect of the variant was stronger than the well-known AD risk allele APOE ε4.

“These findings suggest that targeting the IL1RAP immune pathway may be a viable approach for promoting the clearance of amyloid deposits and fighting an important cause of progression in Alzheimer’s disease,” Andrew J. Saykin, PsyD, director of the Indiana Alzheimer Disease Center, Indianapolis, and the national Alzheimer’s Disease Neuroimaging Initiative Genetics Core, said in a statement.

The study was published in the October 1 issue of Brain.

Novel Association

The researchers conducted a genome-wide association study of longitudinal changes in brain amyloid burden measured by florbetapir positron emission tomography (PET) in nearly 500 individuals. They assessed the levels of brain amyloid deposits at an initial visit and again 2 years later.

Study participants came from the Alzheimer’s Disease Neuroimaging Initiative, the Indiana Memory and Aging Study, the Religious Orders Study, and the Rush Memory and Aging Project, all longitudinal studies of older adults representing clinical stages along the continuum from normal aging to AD.

As expected, APOE ε4 was associated with higher rates of amyloid plaque buildup. However, they also identified a novel association between a single nucleotide polymorphism in IL1RAP (rs12053868-G) and higher rates of amyloid accumulation, independent of APOE ε4.
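The kind of per-variant test behind such a finding can be sketched as a linear regression of the longitudinal change in amyloid burden on minor-allele count, with APOE ε4 carrier status as a covariate so the SNP effect is estimated independently of it. This is a hypothetical simplification of a real GWAS pipeline (which would also adjust for age, sex, ancestry, and multiple testing); the function name and inputs are illustrative, not taken from the study.

```python
import numpy as np

def snp_association(delta_amyloid, allele_count, apoe4_carrier):
    """Estimate the per-allele effect of one variant on amyloid accumulation.

    delta_amyloid: change in amyloid burden per subject over follow-up
    allele_count:  minor-allele count per subject (0, 1, or 2)
    apoe4_carrier: 1 if the subject carries APOE e4, else 0 (covariate)
    """
    X = np.column_stack([
        np.ones(len(allele_count)),        # intercept
        np.asarray(allele_count, float),   # additive genetic effect
        np.asarray(apoe4_carrier, float),  # adjust for APOE e4 status
    ])
    beta, *_ = np.linalg.lstsq(X, np.asarray(delta_amyloid, float), rcond=None)
    return beta[1]  # SNP effect, independent of APOE e4
```

Repeating this test across every genotyped variant and ranking by significance is what makes the analysis "genome-wide".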

Carriers of the IL1RAP rs12053868-G variant showed accelerated cognitive decline and were more likely to progress from mild cognitive impairment to AD. They also showed greater longitudinal atrophy of the temporal cortex, which is involved in memory, and had a lower level of microglial activity as measured by PET scans, the researchers report.

“This was an intriguing finding because IL1RAP is known to play a central role in the activity of microglia, the immune system cells that act as the brain’s ‘garbage disposal system’ and the focus of heavy investigation in a variety of neurodegenerative diseases,” Vijay K. Ramanan, MD, PhD, postdoctoral researcher at the Indiana University School of Medicine, Indianapolis, who worked on the study, said in the statement.

“These results suggest a crucial role of activated microglia in limiting amyloid accumulation and nominate the IL-1/IL1RAP pathway as a potential target for modulating this process,” the investigators write.

The study was supported by the National Institute on Aging and a consortium of private partners through the Foundation for the National Institutes of Health. Several authors disclosed relationships with pharmaceutical companies. A complete list can be found with the original article.

Brain. 2015;138:3076-3088. Abstract

Cognitive Impairments in Elderly Diabetic Patients: Understanding the Risks for Better Management

Medscape Medical News from Medscape Diabetes & Endocrinology


Lyse Bordier, MD


Editor’s Note: The following is an edited, translated transcript of a presentation by Professor Lyse Bordier, a diabetologist at Military Hospital Bégin, Saint-Mandé, France, summarizing her lecture at the European Association for the Study of Diabetes (EASD) 2015 Annual Meeting in Stockholm, Sweden.

Hello. I am Professor Lyse Bordier. I work at the Bégin Military Hospital, in Saint-Mandé, France, and I had the pleasure of participating in a symposium organized by the EASD 2015 conference in Stockholm on elderly patients, specifically on cognitive impairments.

A Public Health Problem

Dementia and cognitive impairments are a major problem; Alzheimer disease accounts for 70% of all cases of dementia. The other main causes are vascular dementias and mixed dementias. They are a real public health problem; it is estimated that, in the United States, 5.2 million people have this condition, and worldwide, every 7 seconds, a new case of dementia is diagnosed.[1,2] In France, for example, it was estimated in 2010 that 750,000-850,000 people had dementia and that this figure will increase by a factor of 2.4 by the year 2050.

Diabetes is an important contributor to the development of cognitive impairments, all the way up to dementia. In Europe, it is estimated that nearly 25% of people over age 85 years have dementia. Its prevalence and incidence are higher in women than in men.[2] We know that the complications of diabetes have changed over the years and that acute metabolic complications are, in the end, much less important. With the improvement in life expectancy in our diabetic patients, who are now better treated thanks to better therapeutic management, new complications have arisen, such as renal failure, heart failure, and, of course, geriatric complications, which are, in large part, cognitive disorders.[3]

Prevalence Underestimated by Physicians

These cognitive impairments are common and largely underestimated. This was clearly shown in the GERODIAB study,[4] which included a cohort of 987 patients over the age of 70 years. At inclusion, the physicians reported that 11% of their patients had cognitive impairments and that 3% had dementia. In actual fact, 25% of the patients had impaired cognitive functions, with a Mini-Mental State Examination (MMSE) score under 25. The prevalence is therefore significantly underestimated by physicians.

Cognitive impairments are more prevalent and more severe in diabetics than in nondiabetics. It is estimated that the risks for cognitive impairments and for dementia are 20% to 70% and 60% higher, respectively, in the presence of diabetes.[5] Furthermore, the increase in the risk for Alzheimer dementia is considerable: 40% higher in diabetics. As expected (given the combination of the other cardiovascular risk factors), the increase in risk is even greater for vascular dementia, with an odds ratio of 2.38.[6]


What are the mechanisms behind the development of cognitive impairments and dementia? There are many, and they are often poorly understood. Hyperglycemia plays a very important role, as a direct result of oxidative stress and of advanced glycation end-products, but also as a result of micro- and macroangiopathy, hypertension, and dyslipidemia.[7,8] Other major factors, such as hypoglycemia,[9-12] play an extremely important role in the development of cognitive impairments. In addition, a great deal of literature has been published lately on the role of inflammation[13] and genetic factors. Another widely known aspect is insulin resistance, which increases the risk for dementia at a fairly early stage by 40%[14,15]; this occurs as early as the metabolic syndrome, even before the onset of type 2 diabetes.


Figure. Multiple and poorly understood mechanisms of cognitive impairments and dementia. HTA = arterial hypertension. Adapted from Buysschaert M, et al.[16]

What Are the Consequences of Cognitive Impairments?

Cognitive impairments lead to a number of complications, including a reduction in life expectancy. In the GERODIAB cohort, we found, after 2 years of follow-up, that the mortality rate was twice as high in the patients with an MMSE score <24 compared with those with an MMSE score >24. In this study, the patients with a lower MMSE score had less well-controlled diabetes, were usually treated with insulin, and had heart failure and cerebrovascular complications more often. Very surprisingly, hypoglycemia was not more prevalent in these patients, perhaps because, being less independent, they were better managed by care teams.[17]

Cognitive impairments lead to geriatric complications, such as malnutrition, falls, and a loss of autonomy. They also promote social and family isolation and iatrogenic accidents, as well as depression, which can both mask cognitive impairments and exacerbate an underlying dementia. Another important aspect is that cognitive impairments increase the risk for hypoglycemia. This has been shown very clearly in all of the studies. There is, in fact, a bidirectional link between dementia and hypoglycemia: Hypoglycemia doubles the risk for dementia, and dementia triples the risk for hypoglycemia.[18]

Screening and Management

What do we do when a patient presents with cognitive impairments? First, they should be identified so that they can be managed. We need to be vigilant for certain small signs: changes in the patient’s behavior (eg, a patient who forgets his appointments, whose personal hygiene has declined, who is less diligent in keeping his blood glucose diary, and, lastly, who has an unexplained diabetic imbalance). We should also know how to use simple tests, such as the MMSE, which provides an overall assessment of orientation in time and space, cognitive functions, language functions, and calculation, and how to assess the patient’s autonomy and loss of autonomy.[19] Next, we should, as per the recommendations of the American Diabetes Association[20] and the EASD, individualize the glycemic goals, taking into account, in the most fragile, elderly patients, cognitive status, the level of autonomy, depression, nutritional status (in particular, sarcopenia, which can coexist with obesity), and the risk for hypoglycemia.[21]

We should therefore avoid overtreating the most fragile patients (those at greatest risk for hypoglycemia), but neither should we undertreat patients who have a long life expectancy and who could develop micro- and macroangiopathic complications.

One last aspect, which is very important, is the family. Help needs to be provided to prevent the patient’s loss of autonomy.[21] Lastly, I think that cognitive decline should be added to the already long list of degenerative complications of diabetes.

PDGFR-β Plays a Key Role in the Ectopic Migration of Neuroblasts in Cerebral Stroke

Hikari Sato et al.

Neuroprotective agents and the induction of endogenous neurogenesis remain urgent, unresolved issues in the care of cerebral stroke. Platelet-derived growth factor receptor beta (PDGFR-β) is mainly expressed in neural stem/progenitor cells (NSPCs), neurons and vascular pericytes of the brain; however, its role in pathological neurogenesis remains elusive. This review examines the role of PDGFR-β in the migration and proliferation of NSPCs after stroke.

Read Full Post »

Confluence of Chemistry, Physics, and Biology

Curator: Larry H. Bernstein, MD, FCAP


  1. How Nanotechnology Works by Kevin Bonsor and Jonathan Strickland



There’s an unprecedented multidisciplinary convergence of scientists dedicated to the study of a world so small, we can’t see it — even with a light microscope. That world is the field of nanotechnology, the realm of atoms and nanostructures. Nanotechnology is so new, no one is really sure what will come of it. Even so, predictions range from the ability to reproduce things like diamonds and food to the world being devoured by self-replicating nanorobots.

In order to understand the unusual world of nanotechnology, we need to get an idea of the units of measure involved. A centimeter is one-hundredth of a meter, a millimeter is one-thousandth of a meter, and a micrometer is one-millionth of a meter, but all of these are still huge compared to the nanoscale. A nanometer (nm) is one-billionth of a meter, smaller than the wavelength of visible light and a hundred-thousandth the width of a human hair.

As small as a nanometer is, it’s still large compared to the atomic scale. An atom has a diameter of about 0.1 nm. An atom’s nucleus is much smaller — about 0.00001 nm. Atoms are the building blocks for all matter in our universe. You and everything around you are made of atoms. Nature has perfected the science of manufacturing matter molecularly. For instance, our bodies are assembled in a specific manner from millions of living cells. Cells are nature’s nanomachines. At the atomic scale, elements are at their most basic level. On the nanoscale, we can potentially put these atoms together to make almost anything.
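The scale comparisons above are easy to sanity-check with a few lines of arithmetic. A quick sketch (the 100 µm hair width is an assumed typical value, consistent with the article’s "hundred-thousandth" comparison):

```python
# Length scales from the article, expressed in meters.
nm = 1e-9
scales = {
    "centimeter": 1e-2,
    "millimeter": 1e-3,
    "micrometer": 1e-6,
    "nanometer": nm,
    "atom diameter": 0.1 * nm,
    "atomic nucleus": 0.00001 * nm,
}

hair_width = 100e-6  # assume ~100 micrometers for a typical human hair
print(f"nanometers per hair width: {hair_width / nm:,.0f}")  # 100,000
for name, size in scales.items():
    print(f"{name}: {size / nm:g} nm")
```

Running this confirms the article’s figures: a hair is about 100,000 nm wide, and an atomic nucleus is ten million times smaller than the atom around it.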

In a lecture called “Small Wonders: The World of Nanoscience,” Nobel Prize winner Dr. Horst Störmer said that the nanoscale is more interesting than the atomic scale because the nanoscale is the first point where we can assemble something — it’s not until we start putting atoms together that we can make anything useful.

In this article, we’ll learn about what nanotechnology means today and what the future of nanotechnology may hold. We’ll also look at the potential risks that come with working at the nanoscale.

In the next section, we’ll learn more about our world on the nanoscale.

The World of Nanotechnology

Experts sometimes disagree about what constitutes the nanoscale, but in general, you can think of nanotechnology as dealing with anything measuring between 1 and 100 nm. Larger than that is the microscale, and smaller than that is the atomic scale.
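That 1–100 nm convention can be written as a trivial classifier. A sketch, with the caveat that the boundaries themselves are conventions, not physics:

```python
def classify_scale(length_nm: float) -> str:
    """Rough classification following the common 1-100 nm convention."""
    if length_nm < 1:
        return "atomic scale"
    if length_nm <= 100:
        return "nanoscale"
    return "microscale or larger"

print(classify_scale(0.1))   # a typical atom -> atomic scale
print(classify_scale(50))    # a nanoparticle -> nanoscale
print(classify_scale(5000))  # a bacterium -> microscale or larger
```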

Nanotechnology is rapidly becoming an interdisciplinary field. Biologists, chemists, physicists and engineers are all involved in the study of substances at the nanoscale. Dr. Störmer hopes that the different disciplines develop a common language and communicate with one another [source: Störmer].

Only then, he says, can we effectively teach nanoscience, since you can’t understand the world of nanotechnology without a solid background in multiple sciences.

One of the exciting and challenging aspects of the nanoscale is the role that quantum mechanics plays in it. The rules of quantum mechanics are very different from classical physics, ­which means that the behavior of substances at the nanoscale can sometimes contradict common sense by behaving erratically. You can’t walk up to a wall and immediately teleport to the other side of it, but at the nanoscale an electron can — it’s called electron tunneling. Substances that are insulators, meaning they can’t carry an electric charge, in bulk form might become semiconductors when reduced to the nanoscale. Melting points can change due to an increase in surface area. Much of nanoscience requires that you forget what you know and start learning all over again.
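One reason bulk properties such as melting point shift at the nanoscale is the surface-area-to-volume ratio mentioned above. For a sphere the ratio is 3/r, so every tenfold reduction in radius puts ten times more of the material near its surface. A quick illustration:

```python
import math

def surface_to_volume(radius_nm: float) -> float:
    """Surface-area-to-volume ratio of a sphere, in 1/nm (simplifies to 3/r)."""
    area = 4 * math.pi * radius_nm**2
    volume = (4 / 3) * math.pi * radius_nm**3
    return area / volume

for r in (1000.0, 100.0, 10.0, 1.0):  # from a micron down to a nanometer
    print(f"r = {r:6.0f} nm -> S/V = {surface_to_volume(r):.3f} per nm")
# The ratio grows 1000-fold as the radius shrinks from 1000 nm to 1 nm.
```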

So what does this all mean? Right now, it means that scientists are experimenting with substances at the nanoscale to learn about their properties and how we might be able to take advantage of them in various applications. Engineers are trying to use nano-size wires to create smaller, more powerful microprocessors. Doctors are searching for ways to use nanoparticles in medical applications. Still, we’ve got a long way to go before nanotechnology dominates the technology and medical markets.

In the next section, we’ll look at two important nanotechnology structures: nanowires and carbon nanotubes.


At the nanoscale, objects are so small that we can’t see them — even with a light microscope. Nanoscientists have to use tools like scanning tunneling microscopes or atomic force microscopes to observe anything at the nanoscale. Scanning tunneling microscopes use a weak electric current to probe the scanned material. Atomic force microscopes scan surfaces with an incredibly fine tip. Both microscopes send data to a computer, which can assemble the information and project it graphically onto a monitor [source: Encyclopædia Britannica].



Nanowires and Carbon Nanotubes

Currently, scientists find two nano-size structures of particular interest: nanowires and carbon nanotubes. Nanowires are wires with a very small diameter, sometimes as small as 1 nanometer. Scientists hope to use them to build tiny transistors for computer chips and other electronic devices. In the last couple of years, carbon nanotubes have overshadowed nanowires. We’re still learning about these structures, but what we’ve learned so far is very exciting.

A carbon nanotube is a nano-size cylinder of carbon atoms. Imagine a sheet of carbon atoms, which would look like a sheet of hexagons. If you roll that sheet into a tube, you’d have a carbon nanotube. Carbon nanotube properties depend on how you roll the sheet. In other words, even though all carbon nanotubes are made of carbon, they can be very different from one another based on how you align the individual atoms.

With the right arrangement of atoms, you can create a carbon nanotube that’s hundreds of times stronger than steel, but six times lighter [source: The Ecologist].

Engineers plan to make building material out of carbon nanotubes, particularly for things like cars and airplanes. Lighter vehicles would mean better fuel efficiency, and the added strength translates to increased passenger safety.

Carbon nanotubes can also be effective semiconductors with the right arrangement of atoms. Scientists are still working on finding ways to make carbon nanotubes a realistic option for transistors in microprocessors and other electronics.
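"How you roll the sheet" is captured in standard nanotube notation by the chiral indices (n, m). A widely used rule of thumb says a tube is metallic when (n - m) is divisible by 3 and semiconducting otherwise, and its diameter follows from the graphene lattice constant (about 0.246 nm). A small sketch:

```python
import math

A = 0.246  # graphene lattice constant, in nm

def nanotube_diameter(n: int, m: int) -> float:
    """Diameter in nm of an (n, m) carbon nanotube."""
    return A * math.sqrt(n**2 + n * m + m**2) / math.pi

def is_metallic(n: int, m: int) -> bool:
    """Rule of thumb: metallic when (n - m) is divisible by 3."""
    return (n - m) % 3 == 0

for n, m in [(10, 10), (17, 0), (10, 0)]:
    kind = "metallic" if is_metallic(n, m) else "semiconducting"
    print(f"({n},{m}): d = {nanotube_diameter(n, m):.2f} nm, {kind}")
```

An armchair (10,10) tube comes out metallic at about 1.36 nm across, while a (10,0) zigzag tube of similar size is semiconducting; same carbon, different roll.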

In the next section, we’ll look at products that are taking advantage of nanotechnology.


What’s the difference between graphite and diamonds? Both materials are made of carbon, yet they have vastly different properties. Graphite is soft; diamonds are hard. Graphite conducts electricity; diamonds are insulators that can’t conduct electricity. Graphite is opaque; diamonds are usually transparent. Graphite and diamonds have these properties because of the way the carbon atoms bond together at the nanoscale.

Products with Nanotechnology

You might be surprised to find out how many products on the market are already benefiting from nanotechnology.

[Image: Bridgestone engineers developed this Quick Response Liquid Powder Display, a flexible digital screen, using nanotechnology. Photo: Yoshikazu Tsuno/AFP/Getty Images]

  • Sunscreen – Many sunscreens contain nanoparticles of zinc oxide or titanium oxide. Older sunscreen formulas use larger particles, which is what gives most sunscreens their whitish color. Smaller particles are less visible, meaning that when you rub the sunscreen into your skin, it doesn’t give you a whitish tinge.
  • Self-cleaning glass – A company called Pilkington offers a product they call Activ Glass, which uses nanoparticles to make the glass photocatalytic and hydrophilic. The photocatalytic effect means that when UV radiation from light hits the glass, nanoparticles become energized and begin to break down and loosen organic molecules on the glass (in other words, dirt). Hydrophilic means that when water makes contact with the glass, it spreads across the glass evenly, which helps wash the glass clean.
  • Clothing – Scientists are using nanoparticles to enhance your clothing. By coating fabrics with a thin layer of zinc oxide nanoparticles, manufacturers can create clothes that give better protection from UV radiation. Some clothes have nanoparticles in the form of little hairs or whiskers that help repel water and other materials, making the clothing stain-resistant.
  • Scratch-resistant coatings – Engineers discovered that adding aluminum silicate nanoparticles to scratch-resistant polymer coatings made the coatings more effective, increasing resistance to chipping and scratching. Scratch-resistant coatings are common on everything from cars to eyeglass lenses.
  • Antimicrobial bandages – Scientist Robert Burrell created a process to manufacture antibacterial bandages using nanoparticles of silver. Silver ions block microbes’ cellular respiration [source: Burnsurgery.org]. In other words, silver smothers harmful cells, killing them.

New products incorporating nanotechnology are coming out every day. Wrinkle-resistant fabrics, deep-penetrating cosmetics, liquid crystal displays (LCD) and other conveniences using nanotechnology are on the market. Before long, we’ll see dozens of other products that take advantage of nanotechnology, ranging from Intel microprocessors to bio-nanobatteries, capacitors only a few nanometers thick. While this is exciting, it’s only the tip of the iceberg as far as how nanotechnology may impact us in the future.

In the next section, we’ll look at some of the incredible things that nanotechnology may hold for us.­


Nanotechnology is making a big impact on the tennis world. In 2002, the tennis racket company Babolat introduced the VS Nanotube Power racket. They made the racket out of carbon nanotube-infused graphite, meaning the racket was very light, yet many times stronger than steel. Meanwhile, tennis ball manufacturer Wilson introduced the Double Core tennis ball. These balls have a coating of clay nanoparticles on the inner core. The clay acts as a sealant, making it very difficult for air to escape the ball.



The Future of Nanotechnology

In the world of “Star Trek,” machines called replicators can produce practically any physical object, from weapons to a steaming cup of Earl Grey tea. Long considered to be exclusively the product of science fiction, today some people believe replicators are a very real possibility. They call it molecular manufacturing, and if it ever does become a reality, it could drastically change the world.


Atoms and molecules stick together because they have complementary shapes that lock together, or charges that attract. Just like with magnets, a positively charged atom will stick to a negatively charged atom. As millions of these atoms are pieced together by nanomachines, a specific product will begin to take shape. The goal of molecular manufacturing is to manipulate atoms individually and place them in a pattern to produce a desired structure.
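The "charges that attract" picture can be made quantitative with Coulomb’s law, F = k·q1·q2/r². For two elementary charges separated by about one atomic diameter, the attraction works out to tens of nanonewtons, which is enormous for objects that light. An illustrative calculation:

```python
K = 8.988e9      # Coulomb constant, N*m^2/C^2
E = 1.602e-19    # elementary charge, C

def coulomb_force(q1: float, q2: float, r: float) -> float:
    """Magnitude of the electrostatic force (newtons) between two point charges."""
    return K * abs(q1 * q2) / r**2

r = 0.1e-9  # 0.1 nm, roughly one atomic diameter
f = coulomb_force(E, -E, r)
print(f"force between +e and -e at 0.1 nm: {f:.2e} N")  # ~2.31e-08 N
```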

The first step would be to develop nanoscopic machines, called assemblers, that scientists can program to manipulate atoms and molecules at will. Rice University Professor Richard Smalley points out that it would take a single nanoscopic machine millions of years to assemble a meaningful amount of material. In order for molecular manufacturing to be practical, you would need trillions of assemblers working together simultaneously. Eric Drexler believes that assemblers could first replicate themselves, building other assemblers. Each generation would build another, resulting in exponential growth until there are enough assemblers to produce objects [source: Ray Kurzweil].

[Image: concept drawing of nanogears; assemblers might have moving parts like these.]

Trillions of assemblers and replicators could fill a volume smaller than a cubic millimeter, yet still be too small for us to see with the naked eye. Assemblers and replicators could work together to automatically construct products, and could eventually replace all traditional labor methods. This could vastly decrease manufacturing costs, thereby making consumer goods plentiful, cheaper and stronger. Eventually, we could be able to replicate anything, including diamonds, water and food. Famine could be eradicated by machines that fabricate foods to feed the hungry.
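Both figures in this passage follow from simple arithmetic. Doubling from a single assembler passes a trillion after about 40 generations, and if each assembler fit inside a 100 nm cube (an assumed size, purely for illustration), a trillion of them would indeed occupy roughly a cubic millimeter:

```python
import math

target = 1e12  # a trillion assemblers
generations = math.ceil(math.log2(target))
print(f"doublings needed to exceed a trillion: {generations}")  # 40

# Assume (purely for illustration) each assembler fits in a 100 nm cube.
side_mm = 1e-4                   # 100 nm expressed in millimeters
total_mm3 = target * side_mm**3  # total volume in cubic millimeters
print(f"volume of a trillion assemblers: ~{total_mm3:.1f} mm^3")  # ~1.0
```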

Nanotechnology may have its biggest impact on the medical industry. Patients will drink fluids containing nanorobots programmed to attack and reconstruct the molecular structure of cancer cells and viruses. There’s even speculation that nanorobots could slow or reverse the aging process, and life expectancy could increase significantly. Nanorobots could also be programmed to perform delicate surgeries — such nanosurgeons could work at a level a thousand times more precise than the sharpest scalpel [source: International Journal of Surgery].

By working on such a small scale, a nanorobot could operate without leaving the scars that conventional surgery does. Additionally, nanorobots could change your physical appearance. They could be programmed to perform cosmetic surgery, rearranging your atoms to change your ears, nose, eye color or any other physical feature you wish to alter.

Nanotechnology has the potential to have a positive effect on the environment. For instance, scientists could program airborne nanorobots to rebuild the thinning ozone layer. Nanorobots could remove contaminants from water sources and clean up oil spills. Manufacturing materials using the bottom-up method of nanotechnology also creates less pollution than conventional manufacturing processes. Our dependence on non-renewable resources would diminish with nanotechnology. Cutting down trees, mining coal or drilling for oil may no longer be necessary — nanomachines could produce those resources.

Many nanotechnology experts feel that these applications are well outside the realm of possibility, at least for the foreseeable future. They caution that the more exotic applications are only theoretical. Some worry that nanotechnology will end up like virtual reality — in other words, the hype surrounding nanotechnology will continue to build until the limitations of the field become public knowledge, and then interest (and funding) will quickly dissipate.

In the next section, we’ll look at some of the challenges and risks of nanotechnology.


In 1959, physicist and future Nobel Prize winner Richard Feynman gave a lecture to the American Physical Society called “There’s Plenty of Room at the Bottom.” His speech focused on the field of miniaturization and how he believed man would create increasingly smaller, more powerful devices.

In 1986, K. Eric Drexler wrote “Engines of Creation” and introduced the term nanotechnology. Scientific research in the field has expanded greatly over the last decade. Inventors and corporations aren’t far behind — today, more than 13,000 patents registered with the U.S. Patent Office have the word “nano” in them [source: U.S. Patent and Trademark Office].

Nanotechnology Challenges, Risks and Ethics


The most immediate challenge in nanotechnology is that we need to learn more about materials and their properties at the nanoscale. Universities and corporations across the world are rigorously studying how atoms fit together to form larger structures. We’re still learning about how quantum mechanics affects substances at the nanoscale.

Because elements at the nanoscale behave differently than they do in their bulk form, there’s a concern that some nanoparticles could be toxic. Some doctors worry that nanoparticles are so small that they could easily cross the blood-brain barrier, a membrane that protects the brain from harmful chemicals in the bloodstream. If we plan on using nanoparticles to coat everything from our clothing to our highways, we need to be sure that they won’t poison us.

Closely related to the knowledge barrier is the technical barrier. In order for the incredible predictions regarding nanotechnology to come true, we have to find ways to mass produce nano-size products like transistors and nanowires. While we can use nanoparticles to build things like tennis rackets and make wrinkle-free fabrics, we can’t make really complex microprocessor chips with nanowires yet.

There are some hefty social concerns about nanotechnology, too. Nanotechnology may allow us to create more powerful weapons, both lethal and non-lethal. Some organizations are concerned that we’ll only get around to examining the ethical implications of nanotechnology in weaponry after these devices are built. They urge scientists and politicians to examine carefully all the possibilities of nanotechnology before designing increasingly powerful weapons.

If nanotechnology in medicine makes it possible for us to enhance ourselves physically, is that ethical? In theory, medical nanotechnology could make us smarter, stronger and give us other abilities ranging from rapid healing to night vision. Should we pursue such goals? Could we continue to call ourselves human, or would we become transhuman — the next step on man’s evolutionary path? Since almost every technology starts off as very expensive, would this mean we’d create two races of people — a wealthy race of modified humans and a poorer population of unaltered people? We don’t have answers to these questions, but several organizations are urging nanoscientists to consider these implications now, before it becomes too late.

Not all questions involve altering the human body — some deal with the world of finance and economics. If molecular manufacturing becomes a reality, how will that impact the world’s economy? Assuming we can build anything we need with the click of a button, what happens to all the manufacturing jobs? If you can create anything using a replicator, what happens to currency? Would we move to a completely electronic economy? Would we even need money?

Whether we’ll actually need to answer all of these questions is a matter of debate. Many experts think that concerns like grey goo and transhumans are at best premature, and probably unnecessary. Even so, nanotechnology will definitely continue to impact us as we learn more about the enormous potential of the nanoscale.


Eric Drexler, the man who introduced the word nanotechnology, presented a frightening apocalyptic vision — self-replicating nanorobots malfunctioning, duplicating themselves a trillion times over, rapidly consuming the entire world as they pull carbon from the environment to build more of themselves. It’s called the “grey goo” scenario, where a synthetic nano-size device replaces all organic material. Another scenario involves nanodevices made of organic material wiping out the Earth — the “green goo” scenario.

The Technion’s Russell Berrie Nanotechnology Institute is a world leader in nanotechnology research, having made seminal discoveries in the field.

Breakthroughs in Nanotechnology

  • Prof. Ester Segal and a team of Israeli and American researchers find that silicon nanomaterials used for the localized delivery of chemotherapy drugs behave differently in cancerous tumors than they do in healthy tissues. The findings could help scientists better design such materials to facilitate the controlled and targeted release of the chemotherapy drugs to tumors.
  • Associate Professor Alex Leshansky of the Faculty of Chemical Engineering is part of an international team that has created a tiny screw-shaped propeller that can move in a gel-like fluid, mimicking the environment inside a living organism. The breakthrough brings closer the day when robots only nanometers (billionths of a meter) in length can maneuver and deliver medicine inside the human body, and possibly inside human cells.
  • Prof. Amit Miller and a team of researchers at the Technion and Boston University have discovered a simple way to control the passage of DNA molecules through nanopore sensors. The breakthrough could lead to low-cost, ultra-fast DNA sequencing that would revolutionize healthcare and biomedical research, and spark major advances in drug development, preventative medicine and personalized medicine.

– Israeli Prime Minister Benjamin Netanyahu presents U.S. President Barack Obama with nano-sized inscribed replicas of the Declarations of Independence of the United States and the State of Israel. The replicas were created by scientists at the Technion’s Russell Berrie Nanotechnology Institute (RBNI). (03/13)

– Prof. Nir Tessler has found a way to generate an electrical field inside solar cells that use inorganic nanocrystals or “quantum dots,” making them more suitable for building an energy-efficient nanocrystal solar cell. (11/11)

– Researchers led by Prof. Wayne Kaplan discover the nature of nanometer-thick layers between different materials and find that they have both solid and liquid properties. The results could enable scientists to improve the resilience of the bond between ceramic materials and metals, two types of materials that “do not like” to come into contact. Applications include cutting tools for metal-working; composites for brake pads; the joins between metal conducting wires and chips in computers; and the application of protective ceramic coatings on jet engine blades. (05/11)

– Israeli President Shimon Peres presents Pope Benedict XVI with a “Nano-Bible” smaller than a pinhead. Created by researchers at the Technion-Israel Institute of Technology, the complete punctuated and vowelized version of the Old Testament takes up just 0.5 square millimeters. The idea to write the Bible on such a tiny surface was conceived by Professor Uri Sivan, the first head of the university’s Russell Berrie Nanotechnology Institute (RBNI). (05/09)

Nanotechnology and medicine

Expert Opinion on Biological Therapy 2003; Volume 3, Issue 4, 655-663
Dwaine F Emerich & Christopher G Thanos   http://dx.doi.org/10.1517/14712598.3.4.655

Nanotechnology, or systems/device manufacture at the molecular level, is a multidisciplinary scientific field undergoing explosive development. The genesis of nanotechnology can be traced to the promise of revolutionary advances across medicine, communications, genomics and robotics. On the surface, miniaturisation provides cost effective and more rapidly functioning mechanical, chemical and biological components. Less obvious though is the fact that nanometre sized objects also possess remarkable self-ordering and assembly behaviours under the control of forces quite different from macro objects. These unique behaviours are what make nanotechnology possible, and by increasing our understanding of these processes, new approaches to enhancing the quality of human life will surely be developed. A complete list of the potential applications of nanotechnology is too vast and diverse to discuss in detail, but without doubt one of the greatest values of nanotechnology will be in the development of new and effective medical treatments (i.e., nanomedicine). This review focuses on the potential of nanotechnology in medicine, including the development of nanoparticles for diagnostic and screening purposes, artificial receptors, DNA sequencing using nanopores, manufacture of unique drug delivery systems, gene therapy applications and the enablement of tissue engineering.

Nanotechnology in Medicine – Nanomedicine

The use of nanotechnology in medicine offers some exciting possibilities. Some techniques are only imagined, while others are at various stages of testing, or actually being used today.

Nanotechnology in medicine involves applications of nanoparticles currently under development, as well as longer range research that involves the use of manufactured nano-robots to make repairs at the cellular level (sometimes referred to as nanomedicine).

Whatever you call it, the use of nanotechnology in the field of medicine could revolutionize the way we detect and treat damage to the human body and disease in the future, and many techniques only imagined a few years ago are making remarkable progress towards becoming realities.

Nanotechnology in Medicine Application: Drug Delivery

One application of nanotechnology in medicine currently being developed involves employing nanoparticles to deliver drugs, heat, light or other substances to specific types of cells (such as cancer cells). Particles are engineered so that they are attracted to diseased cells, which allows direct treatment of those cells. This technique reduces damage to healthy cells in the body and allows for earlier detection of disease.

For example, nanoparticles that deliver chemotherapy drugs directly to cancer cells are under development. Targeted delivery of chemotherapy drugs is in clinical testing, with final approval for use in cancer patients still pending. One company, CytImmune, has published the results of a Phase 1 clinical trial of its first targeted chemotherapy drug, and another company, BIND Biosciences, has published preliminary results of a Phase 1 clinical trial for its first targeted chemotherapy drug and is proceeding with a Phase 2 clinical trial.

Researchers at the University of Illinois have demonstrated that gelatin nanoparticles can be used to deliver drugs to damaged brain tissue.

Researchers at MIT are using nanoparticles to deliver vaccines. The nanoparticles protect the vaccine, allowing it time to trigger a stronger immune response.

Researchers are developing a method to release insulin that uses a sponge-like matrix containing insulin as well as nanocapsules holding an enzyme. When the glucose level rises, the nanocapsules release hydrogen ions, which bind to the fibers making up the matrix. The hydrogen ions make the fibers positively charged, so they repel each other and create openings in the matrix through which insulin is released.
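The mechanism just described is a feedback loop: rising glucose triggers the enzyme, hydrogen ions charge the fibers, the matrix opens, and insulin escapes. A toy sketch of that logic (the trigger threshold is invented for illustration and is not from the study):

```python
def insulin_release(glucose_mg_dl: float) -> str:
    """Toy model of the glucose-responsive matrix described above.

    High glucose -> nanocapsule enzyme releases hydrogen ions -> matrix
    fibers become positively charged and repel -> openings form -> insulin
    is released. The threshold below is an assumption for illustration.
    """
    HIGH_GLUCOSE = 180  # mg/dL, an assumed trigger level
    if glucose_mg_dl > HIGH_GLUCOSE:
        return "matrix open: insulin released"
    return "matrix closed: insulin retained"

print(insulin_release(250))  # hyperglycemia -> release
print(insulin_release(100))  # normal glucose -> retain
```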

Researchers are developing a nanoparticle that can be taken orally and pass through the lining of the intestines into the bloodstream. This should allow drugs that must now be delivered by injection to be taken in pill form.

Researchers are also developing a nanoparticle to defeat viruses. The nanoparticle does not actually destroy virus particles, but delivers an enzyme that prevents their reproduction in the patient’s bloodstream.

Read more about nanomedicine in drug delivery

Nanotechnology in Medicine Application: Therapy Techniques

Researchers have developed “nanosponges” that absorb toxins and remove them from the bloodstream. The nanosponges are polymer nanoparticles coated with a red blood cell membrane. The red blood cell membrane allows the nanosponges to travel freely in the bloodstream and attract the toxins.

Researchers have demonstrated a method to generate powerful but tightly focused sound waves that may eventually be used for noninvasive surgery. They use a lens coated with carbon nanotubes to convert light from a laser into focused sound waves. The intent is to develop a method that could blast tumors or other diseased areas without damaging healthy tissue.

Researchers are investigating the use of bismuth nanoparticles to concentrate radiation used in radiation therapy to treat cancer tumors. Initial results indicate that the bismuth nanoparticles would increase the radiation dose to the tumor by 90 percent.

Nanoparticles composed of polyethylene glycol-hydrophilic carbon clusters (PEG-HCC) have been shown to absorb free radicals at a much higher rate than the proteins our body uses for this function. This ability to absorb free radicals may reduce the harm caused by their release after a brain injury.

Targeted heat therapy is being developed to destroy breast cancer tumors. In this method antibodies that are strongly attracted to proteins produced in one type of breast cancer cell are attached to nanotubes, causing the nanotubes to accumulate at the tumor. Infrared light from a laser is absorbed by the nanotubes and produces heat that incinerates the tumor.

Read more about nanomedicine therapy techniques

Nanotechnology in Medicine Application: Diagnostic Techniques

Researchers at MIT have developed a sensor using carbon nanotubes embedded in a gel that can be injected under the skin to monitor the level of nitric oxide in the bloodstream. The level of nitric oxide is important because it indicates inflammation, allowing easy monitoring of inflammatory diseases. In tests with laboratory mice, the sensor remained functional for over a year.

Researchers at the University of Michigan are developing a sensor that can detect very low levels of cancer cells, as few as 3 to 5 cancer cells in a one-milliliter blood sample. They grow sheets of graphene oxide, to which they attach molecules containing an antibody that binds to cancer cells. They then tag the captured cancer cells with fluorescent molecules to make them stand out under a microscope.

Researchers have demonstrated a way to use nanoparticles for early diagnosis of infectious disease. The nanoparticles attach to molecules in the blood stream indicating the start of an infection. When the sample is scanned for Raman scattering the nanoparticles enhance the Raman signal, allowing detection of the molecules indicating an infectious disease at a very early stage.

A test for early detection of kidney damage is being developed. The method uses gold nanorods functionalized to attach to the type of protein generated by damaged kidneys. As the protein accumulates on a nanorod, the nanorod’s color shifts. The test is designed to be done quickly and inexpensively for early detection of a problem.

Read more about nanomedicine diagnostic techniques

Nanotechnology in Medicine Application: Anti-Microbial Techniques

One of the earliest nanomedicine applications was the use of nanocrystalline silver as an antimicrobial agent for the treatment of wounds, as discussed on the Nucryst Pharmaceuticals Corporation website.

A nanoparticle cream has been shown to fight staph infections. The nanoparticles contain nitric oxide gas, which is known to kill bacteria. Studies on mice have shown that using the nanoparticle cream to release nitric oxide gas at the site of staph abscesses significantly reduced the infection.

A burn dressing coated with nanocapsules containing antibiotics has also been developed. If an infection starts, the harmful bacteria in the wound cause the nanocapsules to break open, releasing the antibiotics. This allows much quicker treatment of an infection and reduces the number of times a dressing has to be changed.

A welcome idea in the early study stages is the elimination of bacterial infections in a patient within minutes, instead of delivering treatment with antibiotics over a period of weeks. You can read about design analysis for the antimicrobial nanorobot used in such treatments in the following article: Microbivores: Artificial Mechanical Phagocytes using Digest and Discharge Protocol.

Nanotechnology in Medicine Application: Cell Repair

Nanorobots could actually be programmed to repair specific diseased cells, functioning in a similar way to antibodies in our natural healing processes.  Read about design analysis for one such cell repair nanorobot in this article: The Ideal Gene Delivery Vector: Chromallocytes, Cell Repair Nanorobots for Chromosome Repair Therapy

Nanotechnology in Medicine: Company Directory

Company – Product
CytImmune – Gold nanoparticles for targeted delivery of drugs to tumors
NanoBio – Nanoemulsions for nasal delivery to fight viruses (such as the flu and colds) or through the skin to fight bacteria

More nanomedicine companies

Nanotechnology in Medicine: Resources

National Cancer Institute Alliance for Nanotechnology in Cancer; this alliance includes a Nanotechnology Characterization Lab as well as eight Centers of Cancer Nanotechnology Excellence.

Alliance for NanoHealth; This alliance includes eight research institutions performing collaborative research.

European Nanomedicine platform

The National Institutes of Health (NIH) is funding research at eight Nanomedicine Development Centers.


Compiled by Earl Boysen of Hawk’s Perch Technical Writing, LLC and UnderstandingNano.com.

Future impact of nanotechnology on medicine and dentistry

Mallanagouda Patil,1 Dhoom Singh Mehta,2 and Sowjanya Guvva3

J Indian Soc Periodontol. 2008 May-Aug; 12(2): 34–40.

doi: 10.4103/0972-124X.44088; PMCID: PMC2813556

The human characteristics of curiosity, wonder, and ingenuity are as old as mankind. People around the world have been harnessing their curiosity into inquiry and the process of scientific methodology. Recent years have witnessed an unprecedented growth in research in the area of nanoscience. There is increasing optimism that nanotechnology applied to medicine and dentistry will bring significant advances in the diagnosis, treatment, and prevention of disease. Growing interest in the future medical applications of nanotechnology is leading to the emergence of a new field called nanomedicine. Nanomedicine needs to overcome the challenges to its application in order to improve the understanding of the pathophysiologic basis of disease, bring more sophisticated diagnostic opportunities, and yield more effective therapies and preventive measures. When doctors gain access to medical robots, they will be able to quickly cure most known diseases that hobble and kill people today, to rapidly repair most physical injuries our bodies can suffer, and to vastly extend the human health span. Molecular technology is destined to become the core technology underlying all of 21st century medicine and dentistry. In this article, we have made an attempt to offer an early glimpse of the future impact of nanotechnology on medicine and dentistry.

Keywords: Nanodentistry, nanomedicine, nanoscience, nanotechnology


The world began without man, and it will complete itself without him. …Claude Lévi-Strauss. Winfred Phillips, DSc, said, “You have to be able to fabricate things, you have to be able to analyze things, you have to be able to handle things smaller than ever imagined in ways not done before”.[1] Many researchers believe that scientific devices dwarfed by dust mites may one day be capable of grand biomedical miracles.

The vision of nanotechnology was introduced in 1959 by the late Nobel physicist Richard P Feynman, who in a dinner talk said, “There is plenty of room at the bottom”.[2] He proposed employing machine tools to make smaller machine tools, these in turn to be used to make still smaller machine tools, and so on all the way down to the atomic level, noting that this is “a development which I think cannot be avoided”. He suggested that nanomachines, nanorobots, and nanodevices ultimately could be used to develop a wide range of atomically precise microscopic instrumentation and manufacturing tools, which could be applied to produce vast quantities of ultrasmall computers and various nanoscale and microscale robots.

Feynman’s idea remained largely undiscussed until the mid-1980s, when the MIT-educated engineer K Eric Drexler published “Engines of Creation”, a book popularizing the potential of molecular nanotechnology.[3]

Nano comes from the Greek word for dwarf. Nanotechnology is usually defined as the research and development of materials, devices, and systems exhibiting physical, chemical, and biological properties that are different from those found on a larger scale (matter at or below the scale of molecules and viruses).[4]

Old rules don’t apply; small things behave differently. Researchers in nanoland are also making really, really small things with astonishing properties, like the carbon nanotube. Chris Papadopoulos, a nanotechnology researcher, says, “The carbon nanotube is the poster boy for nanotechnology”. It is a very thin sheet of graphite formed into a tube; its strength can be harnessed by embedding nanotubes in construction materials and, among other applications, nanotubes may be part of future improvements for high-performance aircraft.

In nanoland, tiny differences in size can add up to huge differences in function. Ted Sargent, author of The Dance of Molecules, says matter is tunable at the nanoscale. For example, change the length of a guitar string and you change the sound it makes; change the size of semiconductors called quantum dots, and you change their rainbow of colors from a single material. Sargent made a three-nanometer dot that glows blue, a four-nanometer dot that glows red, and a five-nanometer dot that emits infrared rays, or heat.
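The size-to-color tuning described above comes from quantum confinement: shrinking a dot widens its band gap roughly as 1/d², so smaller dots emit shorter (bluer) wavelengths. The sketch below is a minimal particle-in-a-box estimate; the bulk band gap and effective mass are illustrative, CdSe-like placeholder values, not the parameters of the actual dots described in the article.

```python
H = 6.626e-34    # Planck constant (J*s)
M_E = 9.109e-31  # electron rest mass (kg)
EV = 1.602e-19   # joules per electron-volt
C = 2.998e8      # speed of light (m/s)

def emission_wavelength_nm(diameter_nm, bulk_gap_ev=1.74, m_eff=0.1):
    """Rough emission wavelength for a spherical quantum dot.

    Adds a particle-in-a-box confinement term to the bulk band gap:
        E = E_bulk + h^2 / (8 * m_eff * m_e * d^2)
    (Coulomb and exciton corrections are omitted; parameters illustrative.)
    """
    d = diameter_nm * 1e-9
    energy_j = bulk_gap_ev * EV + H**2 / (8 * m_eff * M_E * d**2)
    return H * C / energy_j * 1e9  # convert photon wavelength to nm

# Smaller dots of the same material emit shorter (bluer) wavelengths:
for d in (3.0, 4.0, 5.0):
    print(f"{d:.0f} nm dot -> ~{emission_wavelength_nm(d):.0f} nm emission")
```

The exact wavelengths depend strongly on the material and on corrections omitted here; the point is the monotonic blue shift as the diameter shrinks.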

Nanotechnology will affect everything, says William Atkinson, author of Nanocosm: Nanotechnology and the Big Changes Coming from the Inconceivably Small. “It’ll be like a blizzard; snowflakes whose weight you can’t detect can bring a city to a standstill. Nanotechnology is going to be like that.”

The unique quantum phenomena that occur at the nanoscale draw researchers from many different disciplines to the field, including medicine, chemistry, physics, engineering, and dentistry.

Scientists in the field of regenerative medicine and tissue engineering are continually looking for new ways to apply the principles of cell transplantation, materials science, and bioengineering to construct biological substitutes that will restore and maintain normal function in diseased and injured tissue. Developing more refined means of delivering medications at therapeutic levels to specific sites is an important clinical issue for applications of such technology in medicine and dentistry.[5]


The field of “nanomedicine” is the science and technology of diagnosing, treating, and preventing disease and traumatic injury, of relieving pain, and of preserving and improving human health, using nanoscale structured materials, biotechnology, and genetic engineering, and eventually complex machine systems and nanorobots.[5] It has been perceived as embracing five main subdisciplines that in many ways overlap through common technical issues [Figure 1].

Figure 1

Dimensions in Nanomedicine


Diagnostics

This is the use of nanodevices for early identification of disease or predisposition at the cellular and molecular level. In in vitro diagnostics, nanomedicine could increase the efficiency and reliability of diagnostics using human fluid or tissue samples, employing selective nanodevices to make multiple analyses at the subcellular scale. In in vivo diagnostics, nanomedicine could develop devices able to work inside the human body to identify the early presence of a disease and to identify and quantify toxic molecules and tumor cells.

Regenerative medicine

This is an emerging multidisciplinary field that seeks the repair, improvement, and maintenance of cells, tissues, and organs by applying cell therapy and tissue engineering methods. With the help of nanotechnology it is possible to interact with cell components, to manipulate cell proliferation and differentiation, and to direct the production and organization of extracellular matrices.

Present day nanomedicine exploits carefully structured nanoparticles such as dendrimers, carbon fullerenes (buckyballs), and nanoshells to target specific tissues and organs. These nanoparticles may serve as diagnostic and therapeutic antiviral, antitumor, or anticancer agents. Years ahead, complex nanodevices and even nanorobots will be fabricated, first of biological materials but later using more durable materials such as diamond to achieve the most powerful results.[6]

The human body is composed of molecules; hence the availability of molecular nanotechnology will permit dramatic progress in addressing medical problems and will use molecular knowledge to maintain and improve human health at the molecular scale.

Applications in medicine

Within 10–20 years it should become possible to construct machines on the micrometer scale made up of parts on the nanometer scale. Subassemblies of such devices may include useful robotic components such as 100 nm manipulator arms, 10 nm sorting rotors for molecule-by-molecule reagent purification, and smooth super-hard surfaces made of atomically flawless diamond.

Nanocomputers would assume the important task of activating, controlling, and deactivating such nanomechanical devices. Nanocomputers would store and execute mission plans, receive and process external signals and stimuli, communicate with other nanocomputers or external control and monitoring devices, and possess contextual knowledge to ensure safe functioning of the nanomechanical devices. Such technology has enormous medical and dental implications.

Programmable nanorobotic devices would allow physicians to perform precise interventions at the cellular and molecular level. Medical nanorobots have been proposed for gerontological[7] applications, in pharmaceutical research,[8] clinical diagnosis, and dentistry,[9] and also for mechanically reversing atherosclerosis, improving respiratory capacity, enabling near-instantaneous homeostasis, supplementing the immune system, rewriting or replacing DNA sequences in cells, repairing brain damage, and resolving gross cellular insults whether caused by irreversible processes or by cryogenic storage of biological tissues.

Feynman offered the first known proposal for a nanorobotic surgical procedure to cure heart disease:[2] “A friend of mine (Albert R. Hibbs) suggests a very interesting possibility for relatively small machines. He says that, although it is a very wild idea, it would be interesting in surgery if you could swallow the surgeon. You put the mechanical surgeon inside the blood vessel and it goes into the heart and looks around. It finds out which valve is the faulty one and takes a little knife and slices it out. If we can manufacture an object that maneuvers at that level, other small machines might be permanently incorporated in the body to assist some inadequately functioning organ”.[2]

Many disease-causing culprits such as bacteria and viruses are nanosized, so it only makes sense that nanotechnology would offer us ways of fighting back. The ancient Greeks used silver to promote healing and prevent infection, but the treatment took a backseat when antibiotics came on the scene. Nucryst Pharmaceuticals (Canada) revived and improved the old cure by coating a burn and wound bandage with nanosize silver particles, which are more reactive than the bulk form of the metal. They penetrate the skin and work steadily; as a result, burn victims can have their dressings changed just once a week.

Genomics and proteomics research is already rapidly elucidating the molecular basis of many diseases. This has brought new opportunities to develop powerful diagnostic tools able to identify genetic predisposition to disease. In the future, point-of-care diagnosis will be routinely used to identify patients requiring preventive medication, to select the most appropriate medication for individual patients, and to monitor response to treatment. Nanotechnology has a vital role to play in realizing cost-effective diagnostic tools.

Chris Backous is developing a lab-on-a-chip to give doctors immediate results from medical tests for cancer and viruses; it gets its information by analyzing the genetic material in individual cells. Advances in gene sequencing mean this can now be done quickly with tiny samples of body fluids or tissues such as blood, bone marrow, or tumors. The device can also detect the BK virus, a sign of trouble in patients who have had kidney transplants. Ultimately, Pilarski thinks, chip technology will be able to detect what kind of flu a person has, or even whether they have SARS or HIV.

Nanotechnology has the potential to offer invaluable advances such as the use of nanocoatings to slow the release of asthma medication in the lungs, allowing people with asthma to experience longer periods of relief from symptoms after using inhalants. Essentially, nanotechnology makes the drug particles in such a way that they do not dissolve too quickly.

Nanosensors developed for military use in recognizing airborne rogue agents and chemical weapons could be adapted to detect drugs and other substances in exhaled breath.[1] Many drugs can be detected in breath, but the amount detected is related to the amount taken and to how well the drug partitions between the blood and the breath. Detecting drugs of abuse such as marijuana, measuring alcohol concentration, testing athletes for banned substances, and monitoring individuals in drug treatment programs are areas long overdue for breath-detection technologies, which may in the future totally replace urine testing.

Currently, most legal and illegal drug overdoses have no specific way to be effectively neutralized. Using nanoparticles as absorbents of toxic drugs is another area of medical nanoscience that is rapidly gaining momentum. The goal is to design nanostructures that effectively bind molecular entities for which there are currently no effective treatments. Nanosponges introduced into the bloodstream soak up toxic drug molecules, reducing the free amount in the blood and, in turn, resolving the toxicity that was present before the nanosponges were introduced.

French and Italian researchers have come up with a completely new approach to render anticancer and antiviral nucleoside analogues significantly more potent. By linking the nucleoside analogues to squalene, a biochemical precursor to the whole family of steroids, the researchers observed the self-organization of amphiphilic molecules in water. These nanoassemblies exhibited superior anticancer activity in vitro in human cancer cells.

Laurie B Gower, PhD, has been researching bone formation and structure at the nanoscale level. She is examining biomimetic methods of constructing a synthetic bone graft substitute with a nanostructured architecture that matches natural bone so that it would be accepted by the body and guide the cells toward the mending of damaged bones. Biomineralization refers to minerals that are formed biologically, which have very different properties than geological minerals or lab-formed crystals. The crystal properties found in bone are manipulated at the nanoscale and are embedded within collagen fibers to create an interpenetrating organic-inorganic composite with unique mechanical properties. She foresees numerous implications of the material in the future of osteology.

Hicham Fenniri, a chemistry professor, is trying to make artificial joints act more like natural ones. Fenniri has made a nanotube coating for titanium hips and knees that is a very good mimic of collagen; as a result, the coating attracts and attaches more bone cells (osteoblasts), which help bone grow more quickly than on an uncoated hip or knee.

There are ongoing attempts to build ‘medical microrobots’ for in vivo medical use.[10] In 2002, Ishiyama et al,[11] at Tohoku University, developed tiny magnetically driven spinning screws intended to swim along veins and carry drugs to infected tissues or even to burrow into tumors and kill them with heat. In 2005, Brad Nelson’s[12] team reported the fabrication of a microscopic robot small enough (approximately 200 µm) to be injected into the body through a syringe. They hope that this device or its descendants might someday be used to deliver drugs or perform minimally invasive eye surgery. Gordon’s[9,13] group at the University of Manitoba has also proposed magnetically controlled ‘cytobots’ and ‘karyobots’ for performing wireless intracellular and intranuclear surgery.

‘Respirocytes’, the first theoretical design study of a complete medical nanorobot ever published in a peer-reviewed journal, described a hypothetical artificial mechanical red blood cell or ‘respirocyte’ made of 18 billion precisely arranged structural atoms.[10,14] The respirocyte is a bloodborne spherical 1 µm diamondoid 1000-atmosphere pressure vessel with reversible molecule-selective surface pumps powered by endogenous serum glucose. This nanorobot would deliver 236 times more oxygen to body tissues per unit volume than natural red cells and would manage carbonic acidity, controlled by gas concentration sensors and an onboard nanocomputer.
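A back-of-the-envelope ideal-gas check makes the 236× figure plausible. The sketch below is a loose upper bound only: it ignores the vessel walls, pumps, and safety margins of the actual design, as well as non-ideal gas behavior at 1000 atm, and the red-cell figures are rough textbook values assumed here for comparison.

```python
import math

ATM = 101325.0   # Pa per atmosphere
R = 8.314        # gas constant, J/(mol*K)
N_A = 6.022e23   # Avogadro's number
T = 310.0        # body temperature, K

# Respirocyte: 1 micron diameter sphere holding O2 at 1000 atm (design figures)
radius_m = 0.5e-6
vol_m3 = (4.0 / 3.0) * math.pi * radius_m**3
o2_molecules = (1000 * ATM * vol_m3) / (R * T) * N_A  # ideal-gas molecule count

# Red blood cell: ~270 million hemoglobin, 4 O2 each, ~90 um^3 cell volume
# (rough textbook values, assumed for this comparison)
rbc_o2 = 270e6 * 4
rbc_vol_um3 = 90.0

respirocyte_o2_per_um3 = o2_molecules / (vol_m3 * 1e18)  # m^3 -> um^3
rbc_o2_per_um3 = rbc_o2 / rbc_vol_um3
print(f"respirocyte: ~{respirocyte_o2_per_um3:.2e} O2 molecules/um^3")
print(f"red cell:    ~{rbc_o2_per_um3:.2e} O2 molecules/um^3")
print(f"naive ratio: ~{respirocyte_o2_per_um3 / rbc_o2_per_um3:.0f}x")
```

The naive ratio comes out far above the published 236×, which is consistent with the real design reserving much of its volume and operating envelope for structure, machinery, and safety margin.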

Nanorobotic microbivores

Artificial phagocytes called microbivores could patrol the bloodstream, seeking out and digesting unwanted pathogens including bacteria, viruses, or fungi.[10,15] Microbivores would achieve complete clearance of even the most severe septicemic infections in hours or less. The nanorobots do not increase the risk of sepsis or septic shock because the pathogens are completely digested into harmless sugars, amino acids, and the like, which are the only effluents from the nanorobot.

Surgical nanorobotics

A surgical nanorobot, programmed or guided by a human surgeon, could act as a semiautonomous on-site surgeon inside the human body when introduced through the vascular system or body cavities. Such a device could perform various functions such as searching for pathology and then diagnosing and correcting lesions by nanomanipulation, coordinated by an onboard computer while maintaining contact with the supervising surgeon via coded ultrasound signals.[10]

The earliest forms of cellular nanosurgery are already being explored today. For example, a rapidly vibrating (100 Hz) micropipette with a <1 µm tip diameter has been used to completely cut dendrites from single neurons without compromising cell viability.[16] Axotomy of roundworm neurons was performed by femtosecond laser surgery, after which the axons functionally regenerated.[17] The femtolaser acts like a pair of nanoscissors by vaporizing tissue locally while leaving adjacent tissue unharmed. Femtolaser surgery has also been performed on individual chromosomes.[18]


Nanogenerators could enable a new class of self-powered implantable medical devices, sensors, and portable electronics by converting mechanical energy from body movement, muscle stretching, or water flow into electricity.

Nanogenerators produce electric current by bending and then releasing zinc oxide nanowires, which are both piezoelectric and semiconducting. The nanowires can be grown on polymer-based films, and the use of flexible polymer substrates could one day allow portable devices to be powered by the movement of their users.

“Our bodies are good at converting chemical energy from glucose into the mechanical energy of our muscles,” explained Wang (faculty at Peking University and the National Center for Nanoscience and Technology of China). “These nanogenerators can take mechanical energy and convert it to electrical energy for powering devices inside the body. This could open up tremendous possibilities for self-powered implantable medical devices.”


Nanodentistry will make possible the maintenance of comprehensive oral health by employing nanomaterials, biotechnology (including tissue engineering), and, ultimately, dental nanorobotics. New potential treatment opportunities in dentistry may include local anesthesia, dentition renaturalization, a permanent cure for hypersensitivity, complete orthodontic realignment during a single office visit, covalently bonded diamondized enamel, and continuous oral health maintenance using mechanical dentifrobots.

When the first micron-size dental nanorobots can be constructed, they might use specific motility mechanisms to crawl or swim through human tissue with navigational precision, acquire energy, sense and manipulate their surroundings, achieve safe cytopenetration, and use any of a multitude of techniques to monitor, interrupt, or alter nerve impulse traffic in individual nerve cells in real time.

These nanorobot functions may be controlled by an onboard nanocomputer that executes preprogrammed instructions in response to local sensor stimuli. Alternatively, the dentist may issue strategic instructions by transmitting orders directly to in vivo nanorobots via acoustic signals or other means.

Inducing anesthesia

To induce oral anesthesia, one of the most common procedures in dental practice, dental professionals will instill a colloidal suspension containing millions of active analgesic micron-sized dental nanorobot ‘particles’ on the patient’s gingivae. After contacting the surface of the crown or mucosa, the ambulating nanorobots reach the dentin by migrating into the gingival sulcus and passing painlessly through the lamina propria or the 1–3-micron-thick layer of loose tissue at the cementodentinal junction. On reaching the dentin, the nanorobots enter dentinal tubules, holes 1–4 microns in diameter, and proceed toward the pulp, guided by a combination of chemical gradients, temperature differentials, and even positional navigation, all under the control of the onboard nanocomputer as directed by the dentist.[9]

There are many pathways to choose from: near the cementoenamel junction (CEJ), midway between the junction and the pulp, and near the pulp. Tubule diameter increases toward the pulp, which may facilitate nanorobot movement, although circumpulpal tubule openings vary in number and size (tubule number density is roughly 22,000/mm² at the DEJ, 37,000/mm² midway, and 48,000/mm² near the pulp). Tubule branching patterns, between primary and irregular secondary dentin, and between regular secondary dentin in young and old (sclerosing) teeth, may present a significant challenge to navigation.
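Those densities can be turned into a rough estimate of how much of the dentin cross-section is open tubule at each depth, which bears on how crowded the nanorobots' available pathways are. The densities are from the text; the diameters are assumed values chosen from the 1–4 micron range quoted above, so the percentages are illustrative only.

```python
import math

def tubule_area_fraction(density_per_mm2, diameter_um):
    """Fraction of a dentin cross-section occupied by tubule openings."""
    tubules_per_um2 = density_per_mm2 / 1e6   # 1 mm^2 = 1e6 um^2
    opening_area_um2 = math.pi * (diameter_um / 2.0) ** 2
    return tubules_per_um2 * opening_area_um2

# Densities from the text; diameters assumed to grow toward the pulp
sites = [
    ("near DEJ",  22_000, 1.0),
    ("midway",    37_000, 2.0),
    ("near pulp", 48_000, 3.0),
]
for name, density, diam in sites:
    frac = tubule_area_fraction(density, diam)
    print(f"{name:9s}: ~{frac:.1%} of the surface is open tubule")
```

Even with these rough assumptions, the open-tubule fraction grows severalfold from the DEJ toward the pulp, supporting the claim that movement becomes easier nearer the pulp.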

The presence of natural cells that are constantly in motion around and inside the teeth, including human gingival and pulpal fibroblasts, cementoblasts of the CDJ, bacteria inside dentinal tubules, odontoblasts near the pulp-dentin border, and lymphocytes within the pulp or lamina propria, suggests that such a journey should be feasible for cell-sized nanorobots of similar mobility.

Once installed in the pulp and having established control over nerve impulse traffic, the analgesic dental nanorobots may be commanded by the dentist to shut down all sensitivity in any particular tooth that requires treatment. When the tooth is selected on the hand-held controller display, it immediately becomes numb. After the oral procedure is completed, the dentist orders the nanorobots to restore all sensation, relinquish control of nerve traffic, and egress, followed by aspiration. Nanorobotic analgesics offer greater patient comfort and reduced anxiety, no needles, greater selectivity and controllability of the analgesic effect, fast and completely reversible action, and avoidance of most side effects and complications.

Tooth repair

Nanorobotic manufacture and installation of a biologically autologous whole replacement tooth, one that includes both mineral and cellular components (that is, ‘complete dentition replacement therapy’), should become feasible within the time and economic constraints of a typical office visit through the use of an affordable desktop manufacturing facility, which would fabricate the new tooth in the dentist’s office.

Chen et al[19] took advantage of these latest developments in the area of nanotechnology to simulate the natural biomineralization process to create the hardest tissue in the human body, dental enamel, by using highly organized microarchitectural units of nanorod-like calcium hydroxyapatite crystals arranged roughly parallel to each other.

Dentin hypersensitivity

Naturally hypersensitive teeth have eight times the surface density of dentinal tubules, with diameters twice as large, compared with nonsensitive teeth. Reconstructive dental nanorobots, using native biological materials, could selectively and precisely occlude specific tubules within minutes, offering patients a quick and permanent cure.[9]

Tooth repositioning

Orthodontic nanorobots could directly manipulate the periodontal tissues, allowing rapid and painless tooth straightening, rotating, and vertical repositioning within minutes to hours.

Tooth renaturalization

This procedure may become popular, providing perfect treatment methods for esthetic dentistry. The trend may begin with patients who desire (1) to have their old dental amalgams excavated and their teeth remanufactured with native biological materials, and (2) full coronal renaturalization procedures, in which all fillings, crowns, and other 20th-century modifications to the visible dentition are removed and the affected teeth remanufactured to become indistinguishable from the original teeth.

Dental durability and cosmetics

The durability and appearance of teeth may be improved by replacing upper enamel layers with covalently bonded artificial materials such as sapphire or diamond,[20] which have 20–100 times the hardness and failure strength of natural enamel or contemporary ceramic veneers, along with good biocompatibility. Pure sapphire and diamond are brittle and prone to fracture, but they can be made more fracture-resistant as part of a nanostructured composite material that possibly includes embedded carbon nanotubes.

Nanorobotic dentifrice (dentifrobots) delivered by mouthwash or toothpaste could patrol all supragingival and subgingival surfaces at least once a day, metabolizing trapped organic matter into harmless and odorless vapors and performing continuous calculus debridement.

Properly configured dentifrobots could identify and destroy pathogenic bacteria residing in the plaque and elsewhere, while allowing the 500 or so species of harmless oral microflora to flourish in a healthy ecosystem. Dentifrobots would also provide a continuous barrier to halitosis, since bacterial putrefaction is the central metabolic process involved in oral malodor. With this kind of daily dental care available from an early age, conventional tooth decay and gingival diseases will disappear into the annals of medical history.

The potential benefits of nanotechnology lie in its ability to exploit the atomic or molecular properties of materials and to develop newer materials with better properties. Nanoproducts can be made by building up particles through the combination of atomic elements, or by using equipment to create mechanical nanoscale objects.

Nanotechnology has improved the properties of various kinds of fibers.[21] Polymer nanofibers with diameters in the nanometer range possess a larger surface area per unit mass and permit easier addition of surface functionalities compared with polymer microfibers.[21,22] Polymer nanofiber materials have been studied as drug delivery systems, scaffolds for tissue engineering, and filters. Carbon fibers with nanometer dimensions showed a selective increase in osteoblast adhesion, necessary for successful orthopedic/dental implant applications, due to a high degree of nanometer surface roughness.[23]

Nonagglomerated discrete nanoparticles are homogeneously dispersed in resins or coatings to produce nanocomposites. One nanofiller used is an aluminosilicate powder with a mean particle size of about 80 nm and a 1:4 molar ratio of alumina to silica. Advantages include superior hardness, flexural strength, modulus of elasticity, translucency and esthetic appeal, excellent color density, high polish and polish retention, and excellent handling properties[24] (e.g., Filtek Supreme Universal Restorative).

Heliomolar, a microfilled composite resin, suggests on close examination that a form of nanotechnology was in use years ago, yet never recognized as such.

Nanosolutions produce unique and dispersible nanoparticles that can be added to various solvents, paints, and polymers, in which they are dispersed homogeneously. Nanotechnology in bonding agents ensures homogeneity, so the operator can now be totally confident that the adhesive is perfectly mixed every time.

Nanofillers integrated into vinylsiloxanes produce a unique addition-siloxane impression material with better flow and improved hydrophilic properties, hence fewer voids at the margin, better model pouring, and enhanced detail precision.[25]


Nanotechnology is part of a predicted future in which dentistry and periodontal practice may become more high-tech and more effective, seeking to manage individual dental health at the microscopic level by enabling us to battle decay where it begins, with bacteria. Construction of comprehensive research facilities will be crucial to meet the rigorous requirements for the development of nanotechnologies.

Researchers are looking at ways to use microscopic entities to perform tasks that are now done by hand or with equipment; this concept is known as nanotechnology. Tiny machines, known as nanoassemblers, could be controlled by computer to perform specialized jobs. The nanoassemblers could be smaller than a cell nucleus, so that they could fit into places that are hard to reach by hand or with other technology. They could be used to destroy bacteria in the mouth that cause dental caries, or even to repair spots on the teeth where decay has set in, with a computer directing these tiny workers in their tasks.

Nanotechnology has tremendous potential, but social issues of public acceptance, ethics, regulation, and human safety must be addressed before molecular nanotechnology can be seen as offering the possibility of high-quality dental care to the 80% of the world’s population that currently receives no significant dental care.

The role of the periodontist will continue to evolve along the lines of currently visible trends. For example, cases of simple self-care neglect will become fewer, while cases involving cosmetic procedures, acute trauma, or rare disease conditions will become relatively more commonplace.

Trends in oral health and disease also may change the focus of specific diagnostic and treatment modalities. Increasingly, preventive approaches will reduce the need for cures, making prevention a viable approach for most patients.

Diagnosis and treatment will be customized to match the preferences and genetics of each patient. Treatment options will become more numerous and exciting. All this will demand, even more so than today, the best technical abilities and professional skills, which are the hallmark of the contemporary dentist and periodontist. Developments are expected to accelerate significantly.

Nanoparticle and nanotube technologies could be used to administer drugs more precisely. Such technology should be able to target specific cells in a patient suffering from cancer or other life-threatening conditions. Toxic drugs used to fight these illnesses would become much more direct and consequently less harmful to the body.


The visions described in this article may sound unlikely, implausible, or even heretical. Yet the theoretical and applied research to turn them into reality is progressing rapidly. Nanotechnology will change dentistry, healthcare, and human life more profoundly than many developments of the past. As with all technologies, nanotechnology carries a significant potential for misuse and abuse on a scale and scope never seen before. However, it also has the potential to bring about significant benefits, such as improved health, better use of natural resources, and reduced environmental pollution. These truly are the days of miracle and wonder.

Current work is focused on recent developments, particularly nanoparticles and nanotubes for periodontal management. Materials derived from them, such as hollow nanospheres, core–shell structures, nanocomposites, nanoporous materials, and nanomembranes, will play a growing role in materials development for the dental industry.

Once nanomechanics are available, the ultimate dream of every healer, medicine man and physician throughout recorded history will at last become a reality. Programmable and controllable microscale robots composed of nanoscale parts fabricated to nanometer precision will allow medical doctors to execute curative and reconstructive procedures in the human body at the cellular and molecular levels. Nanomedical physicians of the 21st century will still make good use of the body’s natural healing powers and homeostatic mechanisms, because, all else being equal, those interventions are best that intervene least.




  1. Castoro R. UF expects big things from the science of small, nanotechnology. Think Small. The POST, 02-2005.
  2. Feynman RP. There’s plenty of room at the bottom. Eng Sci. 1960;23:22–36.
  3. Drexler KE. Engines of Creation: The Coming Era of Nanotechnology. New York: Anchor Press; 1986. pp. 99–129.
  4. Freitas RA., Jr. Nanomedicine, Vol. 1: Basic Capabilities. Georgetown, TX: Landes Bioscience; 1999. Available from: http://www.nanomedicine.com [last accessed on 2000 Sep 26]
  5. European Science Foundation. Forward Look on Nanomedicine. 2005.
  6. Freitas RA., Jr. Current status of nanomedicine and medical nanorobotics. J Comput Theor Nanosci. 2005;2:1–25.
  7. Fahy GM. Short-term and long-term possibilities for interventive gerontology. Mt Sinai J Med. 1991;58:328–40. [PubMed]
  8. Fahy GM. Molecular nanotechnology and its possible pharmaceutical implications. In: Bezold C, Halperin JA, Eng JL, editors. 2020 visions: Health care information standards and technologies. Rockville, MD: U.S Pharmacopenial Convention; 1993. pp. 152–9.
  9. Freitas RA., Jr Nanodentistry. J Am Dent Assoc. 2000;131:1559–66. [PubMed]
  10. Freitas R., Jr Nanotechnology, nanomedicine and nanosurgery. Int J Surg. 2005;3:243–6. [PubMed]
  11. Ishiyama K, Sendoh M, Arai KI. Magnetic micromachines for medical applications. J Magn Magn Mater. 2002;242:1163–5.
  12. Nelson B, Rajamani R. Biomedical micro-robotic system. In: Eighth International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI); 2005 Oct 26–29; Palm Springs, CA. Available from: http://www.miccai2005.org
  13. Chrusch DD, Podaima BW, Gordon R. Cytobots: Intracellular robotic micromanipulators. In: Kinsner W, Sebak A, editors. Conference proceedings, 2002 IEEE Canadian conference on electrical and computer engineering; 2002 May 12-15; Winnipeg, Canada. Winnipeg: IEEE; 2002.
  14. Freitas RA., Jr Exploratory design in medical nanotechnology: A mechanical artificial red cell. Artif Cells Blood Substit Immobil Biotechnol. 1998;26:411–30. [PubMed]
  15. Freitas RA., Jr Microbivores: Artificial mechanical phagocytes using digest and discharge protocol. J Evol Technol. 2005;14:1–52.
  16. Kirson ED, Yaari Y. A novel technique for micro-dissection of neuronal processes. J Neurosci Methods. 2000;98:119–22. [PubMed]
  17. Yanik MF, Cinar H, Cinar HN, Chisholm AD, Jin Y, Ben-Yakar A. Neurosurgery: functional regeneration after laser axotomy. Nature. 2004;432:822. [PubMed]
  18. Konig K, Riemann I, Fischer P, Halbhuber KJ. Intracellular nanosurgery with near infrared femtosecond laser pulses. Cell Mol Biol. 1999;45:195–201. [PubMed]
  19. Chen HF, Clarkson BH, Sun K, Mansfield JF. Self-assembly of synthetic hydroxyapatite nanorods into enamel prism-like structures. J Colloid Interface Sci. 2005;188:97–103. [PubMed]
  20. Yunshin S, Park HN, Kim KH. Biologic evaluation of Chitosan Nanofiber Membrane for guided bone regeneration. J Periodontol. 2005;76:1778–84. [PubMed]
  21. Reifman EM. Diamond teeth. In: Crandall BC, editor. Nanotechnology: Molecular speculations on global abundance. Cambridge, Mass: MIT Press; 1996. pp. 81–6.
  22. Jayraman K, Kotaki M, Zhang Y, Mo X, Ramakrishna S. Recent advances in polymer nanofibers. J Nanosci Nanotechnol. 2004;4:52–65. [PubMed]
  23. Katti DS, Robinson KW, Ko FK, Laurencin CT. Bioresorbable nanofiber-based systems for wound healing and drug delivery: Optimisation of fabrication parameters. J Biomed Mater Res. 2004;70:282–96. [PubMed]
  24. Price RL, Ellison K, Haberstroh KM, Webster TJ. Nano-meter surface roughness increases select osteoblasts adhesion on carbon nanofiber compacts. J Biomed Mater Res. 2004;70:129–38. [PubMed]
  25. The A to Z of Nanotechnology and Nanomaterials. The Institute of Nanotechnology / AZoM Co Ltd; 2003.


The MIT-Harvard Center for Cancer Nanotechnology Excellence is a collaborative effort among MIT, Harvard University, Harvard Medical School, Massachusetts General Hospital, and Brigham and Women’s Hospital. It is one of eight Centers of Cancer Nanotechnology Excellence awarded by The National Cancer Institute (NCI), part of the National Institutes of Health (NIH). It focuses on developing a diversified portfolio of nanoscale devices for targeted delivery of cancer therapies, diagnostics, non-invasive imaging, and molecular sensing. In addition to general oncology applications, the Consortium focuses on prostate, brain, lung, ovarian, and colon cancer.

Examples of projects that the Consortium is undertaking include the development of:

  • Targeted nanoparticles for treating prostate cancer
  • Polymer nanoparticles and quantum dots for siRNA delivery
  • Next-generation magnetic nanoparticles for multimodal, non-invasive tumor imaging
  • Implantable, biodegradable microelectromechanical systems (MEMS), also known as lab-on-a-chip devices, for in vivo molecular sensing of tumor-associated biomolecules
  • Low-toxicity nanocrystal quantum dots for biomedical sensing

In addition to drawing on the scientific and technological expertise of its investigators, the Consortium uses available facilities for toxicology testing and the extensive collection of mouse models of cancer at the collaborating institutions.

  1.  Nanotechnology and Cancer

      Nanotechnology is one of the most popular areas of scientific research, especially with regard to medical applications. We’ve already discussed some of the new detection methods that should bring about cheaper, faster and less invasive cancer diagnoses. But once the diagnosis occurs, there’s still the prospect of surgery, chemotherapy or radiation treatment to destroy the cancer. Unfortunately, these treatments can carry serious side effects. Chemotherapy can cause a variety of ailments, including hair loss, digestive problems, nausea, lack of energy and mouth ulcers.

      But nanotechnologists think they have an answer for treatment as well, and it comes in the form of targeted drug therapies. If scientists can load their cancer-detecting gold nanoparticles with anticancer drugs, they could attack the cancer exactly where it lives. Such a treatment means fewer side effects and less medication used. Nanoparticles also carry the potential for targeted and time-release drugs. A potent dose of drugs could be delivered to a specific area but engineered to release over a planned period to ensure maximum effectiveness and the patient’s safety.

      These treatments aim to take advantage of the power of nanotechnology and the voracious tendencies of cancer cells, which feast on everything in sight, including drug-laden nanoparticles. One experiment of this type used modified bacterial cells that were 20 percent the size of normal cells. These cells were equipped with antibodies that latched onto cancer cells before releasing the anticancer drugs they contained. Another used nanoparticles as a companion to other treatments: the particles were taken up by cancer cells, which were then heated with a magnetic field to weaken them, leaving them much more susceptible to chemotherapy.

      It may sound odd, but the dye in your blue jeans or your ballpoint pen has also been paired with gold nanoparticles to fight cancer. This dye, known as phthalocyanine, reacts with light. The nanoparticles take the dye directly to cancer cells while normal cells reject the dye. Once the particles are inside, scientists “activate” them with light to destroy the cancer. Similar therapies have existed to treat skin cancers with light-activated dye, but scientists are now working to use nanoparticles and dye to treat tumors deep in the body.

      From manufacturing to medicine to many types of scientific research, nanoparticles are now rather common, but some scientists have voiced concerns about their negative health effects. Nanoparticles’ small size allows them to infiltrate almost anywhere. That’s great for cancer treatment but potentially harmful to healthy cells and DNA. There are also questions about how to dispose of nanoparticles used in manufacturing or other processes. Special disposal techniques are needed to prevent harmful particles from ending up in the water supply or in the general environment, where they’d be impossible to track.

      Gold nanoparticles are a popular choice for medical research, diagnostic testing and cancer treatment, but there are numerous types of nanoparticles in use and in development. Bill Hammack, a professor of chemical engineering at the University of Illinois, warned that nanoparticles are “technologically sweet” [Source: Marketplace]. In other words, scientists are so wrapped up in what they can do, they’re not asking whether they should do it. The Food and Drug Administration has a task force on nanotechnology, but as of yet, the government has exerted little oversight or regulation.
  2. The U.S. Food and Drug Administration (FDA) regulates a wide range of products, including foods, cosmetics, drugs, devices, veterinary products, and tobacco products, some of which may utilize nanotechnology or contain nanomaterials. Nanotechnology allows scientists to create, explore, and manipulate materials measured in nanometers (billionths of a meter). Such materials can have chemical, physical, and biological properties that differ from those of their larger counterparts.

     Guidance documents issued:
    • On June 24, 2014, FDA issued three final guidance documents related to the use of nanotechnology in regulated products, including cosmetics and food substances.
    • On August 5, 2015, FDA issued one final guidance document related to the use of nanotechnology in food for animals.
      • FDA Guidance on Nanotechnology
        1. Nanotechnology Fact Sheet
        2. FDA issues three final guidances related to nanotechnology applications in regulated products, including cosmetics and food substances (June 2014)
        3. FDA issues final guidance on the use of nanotechnology in food for animals (August 2015)
        4. Nanotechnology in Therapeutics: A Focus on Nanoparticles as a Drug Delivery System

          Suwussa Bamrungsap; Zilong Zhao; Tao Chen; Lin Wang; Chunmei Li; Ting Fu; Weihong Tan. Nanomedicine. 2012;7(8):1253–1271.

          Abstract

          Continuing improvement in the pharmacological and therapeutic properties of drugs is driving the revolution in novel drug delivery systems. In fact, a wide spectrum of therapeutic nanocarriers has been extensively investigated to address this emerging need. Accordingly, this article will review recent developments in the use of nanoparticles as drug delivery systems to treat a wide variety of diseases. Finally, we will introduce challenges and future nanotechnology strategies to overcome limitations in this field.

          Introduction

          Nanotechnology involves the engineering of functional systems at the molecular scale. Such systems are characterized by unique physical, optical and electronic features that are attractive for disciplines ranging from materials science to biomedicine. One of the most active research areas of nanotechnology is nanomedicine, which applies nanotechnology to highly specific medical interventions for the prevention, diagnosis and treatment of diseases.[1,2,401] The surge in nanomedicine research during the past few decades is now translating into considerable commercialization efforts around the globe, with many products on the market and a growing number in the pipeline. Currently, nanomedicine is dominated by drug delivery systems, accounting for more than 75% of total sales.[3]

          Nanomaterials fall into a size range similar to proteins and other macromolecular structures found inside living cells. As such, nanomaterials are poised to take advantage of existing cellular machinery to facilitate the delivery of drugs. Nanoparticles (NPs) containing encapsulated, dispersed, absorbed or conjugated drugs have unique characteristics that can lead to enhanced performance in a variety of dosage forms. When formulated correctly, drug particles are resistant to settling and can have higher saturation solubility, rapid dissolution and enhanced adhesion to biological surfaces, thereby providing rapid onset of therapeutic action and improved bioavailability. In addition, the vast majority of molecules in a nanostructure reside at the particle surface,[4] which maximizes the loading and delivery of cargos, such as therapeutic drugs, proteins and polynucleotides, to targeted cells and tissues. Highly efficient drug delivery, based on nanomaterials, could potentially reduce the drug dose needed to achieve therapeutic benefit, which, in turn, would lower the cost and/or reduce the side effects associated with particular drugs. Furthermore, NP size and surface characteristics can be easily manipulated to achieve both passive and active drug targeting. Site-specific targeting can be achieved by attaching targeting ligands, such as antibodies or aptamers, to the surface of particles, or by using guidance in the form of magnetic NPs. NPs can also control and sustain release of a drug during transport to, or at, the site of localization, altering drug distribution and subsequent clearance of the drug in order to improve therapeutic efficacy and reduce side effects.
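The claim above, that the vast majority of molecules in a nanostructure reside at the particle surface, can be sanity-checked with simple geometry. The sketch below is my own illustration, not from the review; the 0.5 nm shell thickness is an assumed molecular-layer size. It computes the fraction of a spherical particle's volume that lies within one molecular layer of its surface.

```python
# Geometric sanity check (illustration only): what fraction of a spherical
# particle's volume sits within one molecular layer of the surface?
# The 0.5 nm shell thickness is an assumed molecular-layer size.

def surface_fraction(radius_nm: float, shell_nm: float = 0.5) -> float:
    """Fraction of particle volume within a surface shell of thickness shell_nm."""
    if shell_nm >= radius_nm:
        return 1.0  # particle smaller than one shell: everything is "surface"
    return 1.0 - (1.0 - shell_nm / radius_nm) ** 3

for r in (2, 5, 10, 100, 1000):  # radii in nm
    print(f"r = {r:5d} nm -> {surface_fraction(r):6.1%} of the volume is near the surface")
```

For particles a few nanometers in radius, more than half the material sits in the outermost layer, while for micron-scale particles the surface fraction is negligible, which is the geometric basis of the loading advantage described above.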

          Nanotechnology could be strategically implemented in new developing drug delivery systems that can expand drug markets. Such a plan would be applied to drugs selected for full-scale development based on their safety and efficacy data, but which fail to reach clinical development because of poor biopharmacological properties, for example, poor solubility or poor permeability across the intestinal epithelium, situations that translate into poor bioavailability and undesirable pharmacokinetic properties.[5] The new drug delivery methods are expected to enable pharmaceutical companies to reformulate existing drugs on the market, thereby extending the lifetime of products and enhancing the performance of drugs by increasing effectiveness, safety and patient adherence, and ultimately reducing healthcare costs.[6–8]

          Commercialization of nanotechnology in pharmaceutical and medical science has made great progress. Taking the USA alone as an example, at least 15 new pharmaceuticals approved since 1990 have utilized nanotechnology in their design and drug delivery systems. In each case, both product development and safety data reviews were conducted on a case-by-case basis, using the best available methods and procedures, with an understanding that postmarketing vigilance for safety issues would be ongoing. Some representative examples of therapeutic nanocarriers on the market are briefly described in Table 1.

          In this review, we focus mainly on the application of nanotechnology to drug delivery and highlight several areas of opportunity where current and emerging nanotechnologies could enable novel classes of therapeutics. We look at challenges and general trends in pharmaceutical nanotechnology, and we also explore nanotechnology strategies to overcome limitations in drug delivery. However, this article can only serve to provide a glimpse into this rapidly evolving field, both now and what may be expected in the future.

          Nanocarriers & Their Applications

          Various nanoforms have been attempted as drug delivery systems, varying from biological substances, such as albumin, gelatin and phospholipids for liposomes, to chemical substances, such as various polymers and solid metal-containing NPs (Figure 1). Polymer–drug conjugates, which have high size variation, are normally not considered as NPs. However, since their size can still be controlled within 100 nm, they are also included in these nanodelivery systems. These nanodelivery systems can be designed to have drugs absorbed or conjugated onto the particle surface, encapsulated inside the polymer/lipid or dissolved within the particle matrix. As a consequence, drugs can be protected from a critical environment or their unfavorable biopharmaceutical properties can be masked and replaced with the properties of nanomaterials. In addition, nanocarriers can be accumulated preferentially at tumor, inflammatory and infectious sites by virtue of the enhanced permeability and retention (EPR) effect. The EPR effect involves site-specific characteristics, not associated with normal tissues or organs, thus resulting in increased selective targeting. Based on those properties, nanodrug delivery systems offer many advantages,[9–11] including:


          Figure 1.

          Some nanotechnology-based drug delivery platforms, including a nanocrystal, liposome, polymeric micelle, protein-based nanoparticle, dendrimer, carbon nanotube and polymer–drug conjugate.
          NP: Nanoparticle.

          • Improving the stability of hydrophobic drugs, rendering them suitable for administration;
          • Improving biodistribution and pharmacokinetics, resulting in improved efficacy;
          • Reducing adverse effects as a consequence of favored accumulation at target sites;
          • Decreasing toxicity by using biocompatible nanomaterials.

          By adopting nanotechnology, fundamental changes in drug production and delivery are expected to affect approximately half of the worldwide drug production in the next decade, totaling approximately US$380 billion in revenue.[12] Next, several main nanocarriers are briefly discussed.


          One of the most obvious and important nanotechnology tools for product development is the opportunity to convert existing drugs with poor water solubility and dissolution rate into readily water-soluble dispersions by converting them into nanosized drugs.[13,14] In other words, the drug itself may be formulated at a nanoscale such that it can function as its own ‘carrier’.[15] Many approaches have been studied, but the most practical strategy involves reducing the drug particle size to nanometer range and stabilizing the drug NP surface with a layer of nonionic surfactants or polymeric macromolecules.[16] By reducing the particle size of the active pharmaceutical ingredient, the drug’s surface area is increased considerably, thereby improving its solubility and dissolution and consequently increasing both the maximum plasma concentration and area under the curve. Once the drug is nanosized, it can be formulated into various dosage forms, such as oral, nasal and injectable. These nanocrystal drugs may have advantages over association colloids (micelle solutions) because the level of surfactant per amount of drug can be greatly minimized, using only the amount that is necessary to stabilize the solid–fluid interface.[15]
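The surface-area gain from nanosizing can be estimated with a back-of-the-envelope calculation. This is my own illustration, not data from the review: the drug density and particle radii below are assumed values, and the relation used is simply that, for monodisperse spheres, specific surface area equals 3/(density × radius), so area grows in direct proportion to the radius reduction.

```python
# Back-of-envelope sketch (illustrative assumptions, not data from the review):
# for monodisperse spheres, specific surface area = 3 / (density * radius),
# so the area available for dissolution grows as the particle radius shrinks.

def specific_surface_area(radius_m: float, density_kg_m3: float) -> float:
    """Surface area per unit mass (m^2/kg) of monodisperse spheres: 3 / (rho * r)."""
    return 3.0 / (density_kg_m3 * radius_m)

RHO = 1200.0  # assumed drug density in kg/m^3 (typical order of magnitude for organics)

micronized = specific_surface_area(2.5e-6, RHO)  # r = 2.5 um (a conventional powder)
nanosized = specific_surface_area(55e-9, RHO)    # r = 55 nm (a ~110 nm nanocrystal)

print(f"micronized: {micronized:9.1f} m^2/kg")
print(f"nanosized:  {nanosized:9.1f} m^2/kg")
print(f"gain:       {nanosized / micronized:.0f}x more dissolving surface")
```

Since dissolution rate is proportional to the wetted surface area (the Noyes–Whitney relation), this geometric gain is why nanosizing improves dissolution and, consequently, plasma exposure as described above.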

          Furthermore, recent studies have shown that external agents, such as surfactants, for nanocrystal drug delivery can be eliminated. For example, a method was recently developed for the delivery of a hydrophobic photosensitizing anticancer drug in its pure form using nanocrystals.[17] Synthesized by the reprecipitation method, the resulting drug nanocrystals were stable in aqueous dispersion, without the necessity of any additional stabilizer. These nanocrystals are uniform in size distribution with an average diameter of 110 nm. Such nanocrystals were efficiently taken up by tumor cells in vitro, and irradiation of such cells with visible light (665 nm) resulted in significant cell death. An in vivo study of the nanocrystal drug also showed significant efficacy compared with the conventional surfactant-based delivery system. These results illustrate the potential of pure drug nanocrystals for photodynamic therapy. As shown in Table 1, a number of well-known drugs have already been commercialized using the nanocrystal approach.

          Organic Nanoplatforms

          Liposomes

          Liposomes are self-assembled artificial vesicles developed from amphiphilic phospholipids. These vesicles consist of a spherical bilayer structure surrounding an aqueous core domain, and their size can vary from 50 nm to several micrometers. Liposomes have attractive biological properties, including general biocompatibility, biodegradability, isolation of drugs from the surrounding environment and the ability to entrap both hydrophilic and hydrophobic drugs. Through the addition of agents to the lipid membrane, or the alteration of the surface chemistry, liposome properties, such as size, surface charge and functionality, can be easily tuned.

          Liposomes are the most clinically established nanosystems for drug delivery. Their efficacy has been demonstrated in reducing systemic effects and toxicity, as well as in attenuating drug clearance.[18,19] Modified liposomes at the nanoscale have been shown to have excellent pharmacokinetic profiles for the delivery of DNA, antisense oligonucleotide, siRNA, proteins and chemotherapeutic agents.[20] Examples of marketed liposomal drugs with higher efficacy and lower toxicity than their nonliposomal analogues are listed in Table 1. Doxorubicin is an anticancer drug that is widely used for the treatment of various types of tumors. It is a highly toxic compound affecting not only tumor tissue, but also heart and kidney, a fact that limits its therapeutic applications. However, the development of doxorubicin enclosed in liposomes culminated in an approved nanomedical drug delivery system.[21,22] This novel liposomal formulation has resulted in reduced delivery of doxorubicin to the heart and renal system, while elevating the accumulation in tumor tissue[23,24] by the EPR effect. Furthermore, a number of liposomal drugs are currently being investigated, including anticancer agents, such as camptothecin[25] and paclitaxel (PTX),[26] as well as antibiotics, such as vancomycin[27] and amikacin.[28]

          Liposomes are also subject to some limitations, including low encapsulation efficiency, fast burst release of drugs, poor storage stability and lack of tunable triggers for drug release.[29] Furthermore, since liposomes cannot usually permeate cells, drugs are released into the extracellular fluid.[30] As such, many efforts have focused on improving their stability and increasing circulation half-life for effective targeting or sustained drug action.[19,31] Surface modification is one method of conferring stability and structural integrity against a harsh bioenvironment after oral or parenteral administration.[32] Surface modification can be achieved by attaching polyethylene glycol (PEG) units, which form a protective layer over the liposome surface (known as stealth liposomes) to slow down liposome recognition, or by attaching other polymers, such as poly(methacrylic acid-co-cholesteryl methacrylate)[33] and poly(acrylic acid),[34] to improve the circulation time of liposomes in blood. To overcome the fast burst release of the chemotherapeutic drugs from liposomes, drugs such as doxorubicin may be encapsulated in the liposomal aqueous phase by an ammonium sulphate gradient.[35] This strategy enables stable drug entrapment with negligible drug leakage during circulation, even after prolonged residence in the blood stream.[36] Further efforts to improve control over the rate of release and drug bioavailability have been made by designing liposomes whose release is environmentally triggered. Accordingly, the drug release from liposome-responsive polymers, or hydrogel, is triggered by a change in pH, temperature, radiofrequency or magnetic field.[37] Liposomes have also been conjugated with active-targeting ligands, such as antibodies[38–40] or folate, for target-specific drug delivery.[41]

          Polymeric NPs

          Polymeric NPs are colloidal particles with a size range of 10–1000 nm, and they can be spherical, branched or core–shell structures. They have been fabricated using biodegradable synthetic polymers, such as polylactide–polyglycolide copolymers, polyacrylates and polycaprolactones, or natural polymers, such as albumin, gelatin, alginate, collagen and chitosan.[42] Various methods, such as solvent evaporation, spontaneous emulsification, solvent diffusion, salting out/emulsification-diffusion, use of supercritical CO2 and polymerization, have been used to prepare the NPs.[43] Advances in polymer science and engineering have resulted in the development of smart polymers (stimuli-sensitive polymers), which can change their physicochemical properties in response to environmental signals. Physical (temperature, ultrasound, light, electricity and mechanical stress), chemical (pH and ionic strength) and biological signals (enzymes and biomolecules) have been used as triggering stimuli. Monomers sensitive to specific stimuli can be combined into a homopolymer responding to a single signal or into copolymers responding to multiple stimuli. The versatility of polymer sources and their easy combination make it possible to tune polymer sensitivity in response to a given stimulus within a narrow range, leading to more accurate and programmable drug delivery.

          Polymeric nanocarriers can be categorized based on three drug-incorporation mechanisms. The first includes polymeric carriers that use covalent chemistry for direct drug conjugation (e.g., linear polymers). The second group includes hydrophobic interactions between drugs and nanocarriers (e.g., polymeric micelles from amphiphilic block copolymers). Polymeric nanocarriers in the third group include hydrogels, which offer a water-filled depot for hydrophilic drug encapsulation.

          Polymer–Drug Conjugates (Prodrugs)

          Many polymer–drug conjugates have been developed since the first combination was reported in the 1970s.[44,45] Conjugation of macromolecular polymers to drugs can significantly enhance the blood circulation time of the drugs. In particular, protein or peptide drugs, which can be readily digested inside the human body, can maintain their activity through conjugation of the water-soluble polymer PEG (PEGylation). For example, it was reported that PEGylation extended the plasma half-life of L-asparaginase to as much as 357 h.[46] Without PEG, the half-life of natural L-asparaginase is only 20 h. In addition to PEGylation of proteins, small-molecule anticancer drugs can also be PEGylated to improve their pharmacokinetics for cancer therapy. For instance, PEG-camptothecin (PROTHECAN®) has entered clinical trials for cancer therapy.[47]
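To put the quoted half-lives in perspective (roughly 20 h for native L-asparaginase versus up to 357 h after PEGylation), the sketch below uses a single-compartment, first-order elimination model. That model is a simplifying assumption introduced here only for illustration, not a pharmacokinetic analysis from the review.

```python
import math

# Single-compartment, first-order elimination model (a simplifying assumption,
# used only to illustrate the half-lives quoted in the text above:
# ~20 h for native L-asparaginase vs up to ~357 h after PEGylation).

def fraction_remaining(t_hours: float, half_life_hours: float) -> float:
    """Fraction of the circulating dose remaining after t_hours."""
    return math.exp(-math.log(2.0) * t_hours / half_life_hours)

native = fraction_remaining(24.0, 20.0)      # native enzyme, one day after dosing
pegylated = fraction_remaining(24.0, 357.0)  # PEGylated enzyme, one day after dosing

print(f"after 24 h, native:    {native:.1%} of the dose remains")
print(f"after 24 h, PEGylated: {pegylated:.1%} of the dose remains")
```

Under these assumptions, less than half of a native dose survives the first day while a PEGylated dose is nearly intact, which is the practical meaning of the circulation-time gain described above.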

          Increasing the otherwise poor solubility of some drugs is another important function of polymer–drug conjugation. Specifically, conjugating water-soluble polymers to functional groups that already exist in the drug structure can significantly enhance the water solubility of the drug. Recently, a new category of polymer–drug conjugates called brush polymer–drug conjugates were prepared by ring-opening metathesis copolymerization.[48] In this report, as PEG was employed as the brush polymer side chains, the conjugates exhibited significant water solubility. However, polymer–drug conjugates require chemical modification of the existing drugs; as a consequence, their production could cost more, and additional purification steps are needed. Moreover, polymers that are chemically conjugated with drugs are often considered new chemical entities owing to a pharmacokinetic profile distinct from that of the parent drugs. As such, additional US FDA approval is required, even though the parent drug has already been approved. Despite the variety of novel drug targets and sophisticated chemistries available, only four drugs (doxorubicin, camptothecin, PTX and platinate) and four polymers (N-[2-hydroxylpropyl]methacrylamide [HPMA] copolymer, poly-L-glutamic acid [PGA], PEG and dextran) have been used to develop polymer–drug conjugates.[49–54] In addition to the commercially available polymer drugs listed in Table 1 , PGA-PTX (Xyotax™, CT-2103; Cell Therapeutics Inc./Chugai Pharmaceutical Co. Ltd.),[55] PGA-camptothecin (CT-2106; Cell Therapeutics Inc.)[56] and HPMA–doxorubicin (PK1/FCE-28068; Pfizer Inc./Cancer Research Campaign)[57] are now in clinical trials. As an example, PK1 has been evaluated in clinical trials as an anticancer agent, and a Phase I evaluation has been completed in patients with several types of tumors resistant to prior therapy, such as chemotherapy or radiation. 
However, although the clinical results for HPMA–doxorubicin conjugates look promising, PEG-based conjugation remains the gold standard in the field of polymeric drug delivery. In addition, polymer–drug conjugates are still limited by their nonbiodegradability and the fate of polymers after in vivo administration.[58]

          Polymeric Micelles

          Polymeric micelles are formed when amphiphilic surfactants or polymeric molecules spontaneously associate in aqueous medium to form core–shell structures. The inner core of a micelle, which is hydrophobic, is surrounded by a shell of hydrophilic polymers, such as PEG.[59] Their hydrophobic core serves as a reservoir for poorly water-soluble and amphiphilic drugs; at the same time, their hydrophilic shell stabilizes the core, prolongs circulation time in blood and increases accumulation in tumor tissues.[41] So far, a large variety of drug molecules have been incorporated into polymeric micelles, either by physical encapsulation[60,61] or covalent attachment.[62] Genexol-PM® (Samyang, Korea), PEG-poly(D,L-lactide)-PTX, employs cremophor-free polymeric micelles loaded with PTX drugs. It was found to have a three-times higher maximum tolerated dose in nude mice and two- to threefold higher levels of biodistribution, compared with those of pristine PTX, in various tissues, including tumors. A Phase I clinical trial was conducted in patients, and the results showed that Genexol-PM is superior to conventional PTX for the delivery of higher doses without additional toxicity.[63] Recently, a series of novel dual-targeting micellar delivery systems were developed based on the self-assembled hyaluronic acid-octadecyl (HA-C18) copolymer and folic acid-conjugated HA-C18 (FA-HA-C18). PTX was successfully encapsulated by HA-C18 and FA-HA-C18 polymeric micelles, with a high encapsulation efficiency of 97.3%.
Since these copolymers are biodegradable, biocompatible and cell-specifically targetable, they become promising nanostructure carriers for hydrophobic anticancer drugs.[64] In addition, stimuli-responsive drug-loaded micelles[65–69] and multifunctional polymeric micelles containing imaging as well as therapeutic agents[70–72] are now under active investigation with the potential to be the mainstream of the polymeric drug development in the near future. Furthermore, using computer simulation, the experimental preparation of drug-loaded polymeric micelles could be more efficiently guided, by providing insight into the mechanism of mesoscopic structures and serving as a complement to experiments.[73]

          Hydrogel NPs

          In recent years, hydrogel NPs have gained considerable attention as one of the most promising nanoparticulate drug delivery systems owing to their unique properties. Hydrogels are cross-linked networks of hydrophilic polymers that can absorb and retain more than 20% of their weight in water while maintaining the distinct 3D structure of the polymer network. The swelling behavior, network structure, permeability and mechanical stability of hydrogels can be controlled by external stimuli or physiological parameters.[74–78] Hydrogels have been extensively studied for controlled release of therapeutics, stimuli-responsive release and applications in biological implants.[75,79–81] However, the hydration response to changes in stimuli in most hydrogel systems is too slow for therapeutic applications. To overcome this limitation, further development of hydrogel structures at the micro- and nanoscale is needed.[82] Recent reports showed some progress in micro- and nanogels of poly(N-isopropylacrylamide) with ultrafast responses and attractive rheological properties.[83,84] Ding et al. demonstrated that cisplatin-loaded polyacrylic acid hydrogel NPs could be implanted and plastered onto tumor tissue.[85] This hydrogel system exhibited superior efficacy in impeding tumor growth and prolonging lifespan in mice. An in vivo biodistribution assay also demonstrated that the hydrogel implant results in high local concentration and retention of the drug. A multifunctional hybrid hydrogel was developed by combining the magnetic properties of NPs with the typical characteristics of a hydrogel.
These hybrid hydrogels can be loaded with large amounts of drug and transported to the target site by application of an external magnetic field.[86] To improve the specificity of hydrogel drug delivery systems, core–shell nanogels were developed that use aptamers as the recognition element and near-infrared light as a triggering stimulus for drug delivery. In this system, gold (Au)–silver nanorods, which possess intense absorption bands in the near-infrared range, were coated with DNA cross-linked polymeric shells, so that drugs can be rapidly and controllably released upon near-infrared irradiation.[87] As the fate of hydrogel NPs after in vivo administration may be a concern for clinical applications, biodegradable hydrogel NPs with diameters of approximately 200 nm have been synthesized via inverse miniemulsion reversible addition–fragmentation chain-transfer polymerization of 2-(dimethylamino)ethyl methacrylate. A disulfide cross-linker was used to cross-link the NPs, so that the polymer network can be degraded to its constituent primary chains by exposure to a reductive environment. These biodegradable hydrogel NPs are currently being investigated for encapsulation and controlled release of siRNA.[88] Although no hydrogel NP-based drugs are commercially available yet, their high biocompatibility and effective drug-loading properties make them strong candidates for future drug delivery systems.

          Protein-based NPs

          Hydrophobic drugs, such as taxanes, are highly active and widely used in a variety of solid tumor therapies. Both PTX and docetaxel, the taxanes commercially available for clinical treatment, are hydrophobic. Because of their solubility problems, they have been formulated as suspensions with nonionic surfactants, such as Cremophor EL® (BASF Corp.) for PTX and Tween-80 (ICI Americas, Inc.) for docetaxel. However, these surfactants are associated with hypersensitivity reactions and toxic side effects to tissues. To decrease toxicity, an albumin–PTX conjugate has been formulated, yielding NPs approximately 130 nm in size, which was approved by the FDA for breast cancer treatment.[89–91] In addition to reduced toxicity, albumin–PTX has been found to bind the albumin receptor (gp60) on endothelial cells, with further extravascular transport,[92–94] resulting in an increased drug concentration at tumor sites without hypersensitivity reactions. The albumin–PTX complex is approved in 38 countries for the treatment of metastatic breast cancer. Furthermore, this formulation, marketed as Abraxane®, is currently in various stages of investigation for the treatment of other cancers, such as non-small-cell lung cancer, malignant melanoma, and pancreatic and gastric cancer.

          Dendrimers

          Dendrimers are synthetic, branched macromolecules that form a tree-like structure. Unlike most linear polymers, the chemical composition and molecular weight of dendrimers can be precisely controlled; hence, it is relatively easy to predict their biocompatibility and pharmacokinetics.[95] Dendrimers are very uniform, with extremely low polydispersities, and they are commonly created with dimensions incrementally grown in approximately nanometer steps from 1 to over 10 nm. Their globular structures and the presence of internal cavities enable drugs to be encapsulated within the macromolecule interior and released in a controlled manner from the inner core.[96] Although the small size (up to 10 nm) of dendrimers limits extensive drug incorporation, their dendritic nature and branching allow drug loading onto the outside surface of the structure[97] via covalent binding or electrostatic interactions. Dendrimers can be synthesized by either divergent or convergent approaches. In the divergent approach, dendrimers are synthesized from the core outward, adding successive layers called generations. However, this method provides a low yield because the reactions must be conducted on a single molecule possessing a large number of equivalent reaction sites.[98] In addition, a large amount of reagents is required for the later stages of synthesis, which complicates purification. In the convergent method, synthesis begins at the periphery of the dendrimer molecule and stops at the core. In this approach, each synthesized generation can be purified as it is made.[98]
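The generation-by-generation growth described above implies that the number of surface groups increases geometrically: for an ideal dendrimer it is the core multiplicity times the branch multiplicity raised to the generation number. A small sketch, using the common PAMAM values (tetrafunctional core, each branch splitting in two per generation) as an example:

```python
def surface_groups(generation, core_multiplicity=4, branch_multiplicity=2):
    """Surface groups on an ideal dendrimer: Nc * Nb**G.

    Defaults correspond to a PAMAM-type dendrimer (tetrafunctional core,
    each branch point splitting in two at every generation).
    """
    return core_multiplicity * branch_multiplicity ** generation

for g in range(5):
    print(g, surface_groups(g))  # G0: 4, G1: 8, G2: 16, G3: 32, G4: 64
```

This geometric growth is also why divergent synthesis becomes difficult at high generations: the number of equivalent reaction sites that must all react cleanly doubles with each generation.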

          Drug molecules associated with dendrimers can be utilized for cancer treatment,[99] the enhancement of drug solubility and permeability (dendrimer–drug conjugates)[100] and intracellular delivery.[101] Some drugs can be physically encapsulated inside the dendrimer network or linked (either covalently or noncovalently) to the dendrimer surface.[102] Furthermore, functionalization of the dendrimer surface with specific ligands can enhance targeting. For example, Myc et al. reported a polyamidoamine dendrimer conjugate containing FA as the targeting agent and methotrexate as the therapeutic agent.[103] Cytotoxicity and specificity were tested with both FA receptor-expressing and nonexpressing cells. Both in vitro and in vivo results showed that the dendrimer conjugate was preferentially cytotoxic to the target cells. A polyamidoamine dendrimer conjugated with an anti-prostate-specific membrane antigen antibody has also been demonstrated.[104] The antibody–dendrimer conjugate specifically bound to prostate-specific membrane antigen-positive, but not -negative, cell lines. However, dendrimer toxicity and immunogenicity are the main concerns when they are applied for drug delivery. Since clinical experience with dendrimers has so far been limited, it is hard to tell whether dendrimers are intrinsically ‘safe’ or ‘toxic’.

          Inorganic Platforms

          Au NPs

          Noble metal NPs, such as Au NPs, have emerged as a promising scaffold for drug and gene delivery, providing a useful complement to more traditional delivery vehicles. The combination of inertness and low toxicity,[105] easy synthesis, very large surface area, well-established surface functionalization (generally through thiol linkages) and tunable stability gives Au NPs unique attributes that enable new delivery strategies. Moreover, loading an excess of pharmaceuticals onto NPs creates a ‘drug reservoir’ for controlled and sustained release, thereby maintaining the drug level within the therapeutic window. An Au NP with a 2-nm core diameter could, in principle, be conjugated with roughly 100 drug molecules through the available ligands (n = 108) in its monolayer.[106] Zubarev et al. recently succeeded in coupling 70 molecules of PTX, a chemotherapeutic drug, to an Au NP with a 2-nm core diameter.[107] Efficient release of these therapeutic agents can be triggered by internal (e.g., glutathione[108] or pH[109]) or external (e.g., light[110,111]) stimuli. In addition to serving as carriers for drug delivery, Au NPs can also be imaged using contrast imaging techniques. Once the Au NPs are targeted to the diseased site, such as a tumor, hyperthermia treatment can be used for tumor destruction. For example, a recent study demonstrated that PEGylated Au NPs can be employed for highly efficient drug delivery and in vivo photodynamic therapy of cancer.[112] Compared with conventional photodynamic therapy drug delivery in vivo, PEGylated Au NPs accelerated silicon phthalocyanine 4 administration by approximately two orders of magnitude without side effects in treated mice. The key issue that remains with Au NPs is engineering the particle surface for optimized properties, such as bioavailability and nonimmunogenicity.
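The ligand capacity quoted above can be rationalized from simple geometry: divide the core surface area by an effective per-ligand footprint. The footprint value below (~0.13 nm² per thiolate on a highly curved 2-nm core) is an assumption chosen for illustration; values measured on flat gold are larger:

```python
import math

def ligand_capacity(core_diameter_nm, footprint_nm2=0.13):
    """Rough ligand count: sphere surface area / per-ligand footprint.

    footprint_nm2 is an ASSUMED effective thiolate footprint on a highly
    curved 2-nm core, used only to illustrate the order of magnitude.
    """
    area = math.pi * core_diameter_nm ** 2  # sphere surface area, pi * d^2
    return area / footprint_nm2

print(round(ligand_capacity(2.0)))  # on the order of 100 ligands
```

The estimate lands near the ~100-ligand monolayer figure cited for a 2-nm core, which is the point: monolayer capacity scales with the square of the core diameter.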

          Superparamagnetic NPs

          Magnetic NPs have been proposed as drug carriers, with a push towards clinical trials.[113] The superparamagnetic properties of iron oxide particles can be used to guide drug-loaded microcapsules into place with external magnetic fields. Another advantage of using magnetic NPs is the ability to heat the particles after internalization, known as the hyperthermia effect. For example, Brazel et al. developed a grafted thermosensitive polymeric system by embedding FePt NPs in poly(N-isopropylacrylamide)-based hydrogels, which can be triggered to release the loaded drug by a temperature increase induced through magnetic heating.[114] The grafted hydrogel system also exhibits a desirable positive thermal response, with an increased drug diffusion coefficient at temperatures above physiological temperature.[115]

          Besides being utilized for targeting and raising temperature, magnetic NPs can also affect the permeability of microcapsules by applying external oscillating magnetic fields and releasing encapsulated materials.[116] For example, ferromagnetic Au-coated cobalt NPs (3 nm in diameter) were incorporated into the polymer walls of microcapsules. Subsequently, application of external alternating magnetic fields of 100–300 Hz and 1200 Oe strength disturbed the capsule wall structures and dramatically increased their permeability to macromolecules. This work supports the hypothesis that magnetic NPs embedded in polyelectrolyte capsules can be used for the controlled release of substances by applying an external magnetic field.
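For readers more used to SI units, the field strength quoted above (1200 Oe) can be converted to an equivalent flux density using the free-space relation that 1 Oe corresponds to 1 G = 1e-4 T:

```python
def oersted_to_tesla(h_oe):
    """Convert field strength in oersted to the equivalent free-space
    flux density in tesla (in vacuum, 1 Oe corresponds to 1 G = 1e-4 T)."""
    return h_oe * 1e-4

print(oersted_to_tesla(1200))  # -> 0.12 (tesla), the field applied to the capsules
```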

          The main benefits of superparamagnetic NPs over classical cancer therapies are minimal invasiveness, accessibility of hidden tumors and minimal side effects. Conventional heating of a tissue by, for example, microwaves or laser light results in the destruction of healthy tissue surrounding the tumor. However, targeted paramagnetic particles provide a powerful strategy for localized heating of cancerous cells.

          Ceramic NPs

          Ceramic NPs are particles fabricated from inorganic compounds with porous characteristics, such as silica, alumina and titania.[117–119] Among these, silica NPs have attracted much research attention as a result of their biocompatibility and ease of synthesis and surface modification.[120–122,301] Furthermore, well-established silane chemistry facilitates the cross-linking of drugs to silica particles.[123,124] For example, recent breakthroughs in mesoporous silica NPs (MSNs) have brought new possibilities to this burgeoning area of research. MSNs contain hundreds of empty channels (mesopores) arranged in a 2D network with a honeycomb-like porous structure. In contrast to the low biocompatibility of other amorphous silica materials, recent studies have shown that MSNs exhibit superior biocompatibility at concentrations adequate for pharmacological applications.[125,126] Once the vehicle is localized in the cytoplasm, it is desirable to have effective control over the release of drug molecules in order to reach pharmacologically effective levels. The ability to selectively functionalize the external particle surface and/or the interior nanochannel surface of MSNs is advantageous in achieving this goal.[127,128] Different functional groups can be added using this methodology, including, for example, stimuli-responsive tethers that can be further attached to NPs (Au and iron oxide). These NPs can work as gatekeepers and be removed by either intracellular or external triggers, such as changes in pH, a reducing environment, enzymatic activity, light, an electromagnetic field or ultrasound.[128] The surface of MSNs can be engineered with cell-specific moieties, such as organic molecules, peptides, aptamers and antibodies, to achieve cell-type or tissue specificity. Moreover, optical and magnetic contrast agents can be introduced to develop multipurpose drug delivery systems.

          These strategies demonstrated that the application of target-specific MSN vehicles in vitro is promising; however, the application in vivo has not yet been reported. These particles are not biodegradable; consequently, there is a concern that they may accumulate in the human body and cause harmful effects.[117] For further in vivo applications, the biocompatibility, biodistribution, retention, degradation and clearance of MSNs must be systematically investigated.

          Carbon-based Nanomaterials

          Carbon-based nanomaterials have attracted particular interest because they can be surface functionalized for the grafting of nucleic acids, peptides and proteins. Carbon nanotubes (CNTs), fullerenes and nanodiamonds[129] have been extensively studied for drug delivery applications.[130] The size, geometry and surface characteristics of single-wall nanotubes (SWNTs), multiwall nanotubes and C60 fullerenes make them appealing as drug carriers. For example, PTX-conjugated SWNTs have shown promise for in vivo cancer treatment. SWNT delivery of PTX affords markedly improved treatment efficacy over clinical Taxol (Bristol-Myers Squibb Co.), as evidenced by its ability to slow tumor growth at a low PTX dose.[131]

          However, the primary drawback of carbon-based nanomaterials appears to be their toxicity. Experiments have shown that CNTs can inhibit cell proliferation and induce apoptosis. Although they are less toxic than carbon fibers and NPs, the toxicity of CNTs increases significantly when carbonyl, carboxyl and/or hydroxyl functional groups are present on their surface.[132] Because of the reported toxicity of CNTs,[133–137] studies involving their application for drug delivery are still being conducted.[138–140] To promote the application of CNTs for drug delivery, researchers have functionalized their surfaces, rendering them benign.[136] Unfortunately, concerns that functionalized CNTs may revert to a toxic state if the functional group detaches have limited the pursuit of these modified CNTs for biomedical applications.

          The toxicity of other forms of nanocarbons has also been reported.[132,140,141] One study of human lung tumor cells showed that carbon NPs are even more toxic than multiwall nanotubes and carbon nanofibers.[132] Given the mounting evidence demonstrating the toxicity of carbon NPs, the enthusiasm to develop carbon NPs for drug delivery has decreased significantly in recent years.

          Integrated Nanocomposite Particles

          A variety of nanoplatforms have been developed for a wide spectrum of applications, and each has unique advantages and limitations. By combining the specific function of each material, new hybrid nanocomposite materials can be fabricated. For instance, liposomes and polymeric NPs are the two most widely studied drug delivery platforms, and attempts have been made to combine the advantages of both systems. A recent study reported the use of nanocells consisting of nuclear poly(lactic-co-glycolic acid) NPs within an extranuclear PEGylated phospholipid envelope for temporal targeting of tumor cells and neovasculature.[142] Moreover, liposomes are routinely coated with a hydrophilic polymer, such as PEG or poly(ethylene oxide), to improve circulation time in vivo, another example of a liposome–polymer composite.[143] Similarly, liposomal locked-in dendrimers, which combine liposomes and dendrimers in one formulation, have achieved higher drug loading and slower drug release than pure liposomes.[144] In another example, the LipoMag formulation, which consists of an oleic acid-coated magnetic nanocrystal core and a cationic lipid shell, was magnetically guided to deliver and silence genes in cells and in tumors in mice.[145]

          Targeting Strategies

          Two basic requirements should be met in the design of nanocarriers to achieve effective drug delivery (Figure 2). First, drugs should be able to reach the desired tumor sites after administration with minimal loss of volume and activity in the blood circulation. Second, drugs should kill only tumor cells, without harmful effects on healthy tissue.[146] These requirements may be met using two strategies: passive and active targeting of drugs.[147]


          Figure 2.

          Passive and active targeting.
          By the enhanced permeability and retention effect, nanoparticles (NPs) can be passively extravasated through leaky vascularization, allowing their accumulation at the tumor region (A). In this case, drugs may be released in the extracellular matrix and then diffuse through the tissue. Active targeting (B) can enhance the therapeutic efficacy of drugs by the increased accumulation and cellular uptake of NPs through receptor-mediated endocytosis. NPs can be engineered to incorporate ligands that bind to endothelial cell surface receptors. In this case, the enhanced permeability and retention effect does not pertain, and the presence of leaky vasculature is not required.

          Passive Targeting

          Passive targeting takes advantage of the unique pathophysiological characteristics of tumor vessels, enabling nanodrugs to accumulate in tumor tissue. Typically, tumor vessels are highly disorganized and dilated, with a high number of pores, resulting in enlarged gap junctions between endothelial cells and compromised lymphatic drainage. This ‘leaky’ vascularization, which gives rise to the EPR effect, allows migration of macromolecules up to 400 nm in diameter into the surrounding tumor region.[147–149] One of the earliest nanoscale technologies for passive targeting of drugs was based on the use of liposomes. More advanced liposomes are coated with a synthetic polymer that protects the agents from immune destruction.[150]

          Beyond the EPR effect, the microenvironment surrounding tumor tissue differs from that of healthy cells, a physiological difference that also supports passive targeting. Because of their high metabolic rate, fast-growing tumor cells require more oxygen and nutrients; consequently, glycolysis is stimulated to obtain extra energy, resulting in an acidic environment.[151] Taking advantage of this, pH-sensitive liposomes have been designed to be stable at physiological pH 7.4, but to degrade and release drug molecules at acidic pH.[152]

          Although passive targeting approaches form the basis of clinical therapy, they suffer from several limitations. Ubiquitously targeting cells within a tumor is not always feasible because some drugs cannot diffuse efficiently, and the random nature of the approach makes it difficult to control the process. The passive strategy is further limited because certain tumors do not exhibit an EPR effect, and the permeability of vessels may not be the same throughout a single tumor.[153]

          Active Targeting

          One way to overcome the limitations of passive targeting is to attach affinity ligands (antibodies,[154] peptides,[155] aptamers[156] or small molecules[157] that bind only to specific receptors on the cell surface) to the surface of the nanocarriers by a variety of conjugation chemistries. Nanocarriers will recognize and bind to target cells through ligand–receptor interactions, provided the receptors or epitopes are expressed on the cell surface. To achieve high specificity, those receptors should be highly expressed on tumor cells, but not on normal cells. Furthermore, the receptors should be homogeneously expressed and should not be shed into the blood circulation. Internalization of targeting conjugates can also occur by receptor-mediated endocytosis after binding to target cells, facilitating drug release inside the cells. In this mechanism, targeting conjugates bind their receptors first, after which the plasma membrane encloses the ligand–receptor complex to form an endosome. The newly formed endosome is transferred to specific organelles, where drugs can be released by acidic pH or enzymes. Although the active targeting strategy looks intriguing, nanodrugs currently approved for clinical use are relatively simple and generally lack active targeting or triggered drug release components. Moreover, nanodrugs currently under clinical development also lack specific targeting. To fully explore the application of targeted drug delivery, we need to investigate whether specific diseases are the correct application for targeting, whether the properties of the therapeutic drugs, as well as their site and mode of action, are suited to targeting and whether the delivery vehicles are optimal for product development.[158]

          Key Factors Impacting Drug Delivery

          In order to achieve effective drug delivery, nanocarriers must have suitable circulation time to prevent the elimination of drugs before reaching their target. Based on previous investigations, size, shape and surface characteristics are key factors that impact the efficiency of drug delivery systems.


          Nanotechnology is an emerging field with the potential to revolutionize drug delivery. Advances in this area have allowed some nanomedicines in the market to achieve desirable pharmacokinetic properties, reduce toxicity and improve patient compliance, as well as clinical outcomes. Integration of nanoparticulate drug delivery technologies in preformulation work not only accelerates the development of new therapeutic moieties, but also helps in the reduction of attrition of new molecular entities caused by undesirable biopharmaceutical and pharmacokinetic properties.

          Optimizing the integration of nanomaterials into drug delivery systems will require standardized metrics for their classification, as well as protocols for their handling. This will, in turn, result in a better understanding of the interactions of nanomaterials with biological systems, which will facilitate better engineering of their properties specific to biomedical applications. The development of such drug carriers will require a greater understanding of both the surface chemistry of nanomaterials and the interaction chemistry of these nanomaterials with biological systems. This can only be achieved through collaborative efforts among scientists in different disciplines. Those who work in this emerging field should have up-to-date information on related toxicology issues, potential health and safety risks and the regulatory environment that will impact patient use. Understanding both the benefits and the risks of these new nanotechnology applications will be essential to good decision-making for drug developers, regulators and ultimately the consumers and patients who will be the beneficiaries of new drug delivery technologies.

        5. Nanoparticles wrapped inside human platelet membranes serve as new vehicles for targeted drug delivery.


Nanoparticles disguised as human platelets could greatly enhance the healing power of drug treatments for cardiovascular disease and systemic bacterial infections. These platelet-mimicking nanoparticles, developed by engineers at the University of California, San Diego, are capable of delivering drugs to targeted sites in the body — particularly injured blood vessels, as well as organs infected by harmful bacteria. Engineers demonstrated that by delivering the drugs just to the areas where the drugs were needed, these platelet copycats greatly increased the therapeutic effects of drugs that were administered to diseased rats and mice.

“This work addresses a major challenge in the field of nanomedicine: targeted drug delivery with nanoparticles,” said Liangfang Zhang, a nanoengineering professor at UC San Diego and the senior author of the study. “Because of their targeting ability, platelet-mimicking nanoparticles can directly provide a much higher dose of medication specifically to diseased areas without saturating the entire body with drugs.”


The study is an excellent example of using engineering principles and technology to achieve “precision medicine,” said Shu Chien, a professor of bioengineering and medicine, director of the Institute of Engineering in Medicine at UC San Diego, and a corresponding author on the study. “While this proof of principle study demonstrates specific delivery of therapeutic agents to treat cardiovascular disease and bacterial infections, it also has broad implications for targeted therapy for other diseases such as cancer and neurological disorders,” said Chien.

The ins and outs of the platelet copycats

On the outside, platelet-mimicking nanoparticles are cloaked with human platelet membranes, which enable the nanoparticles to circulate throughout the bloodstream without being attacked by the immune system. The platelet membrane coating has another beneficial feature: it preferentially binds to damaged blood vessels and certain pathogens such as MRSA bacteria, allowing the nanoparticles to deliver and release their drug payloads specifically to these sites in the body.

Enclosed within the platelet membranes are nanoparticle cores made of a biodegradable polymer that can be safely metabolized by the body. The nanoparticles can be packed with many small drug molecules that diffuse out of the polymer core and through the platelet membrane onto their targets.

To make the platelet-membrane-coated nanoparticles, engineers first separated platelets from whole blood samples using a centrifuge. The platelets were then processed to isolate the platelet membranes from the platelet cells. Next, the platelet membranes were broken up into much smaller pieces and fused to the surface of nanoparticle cores. The resulting platelet-membrane-coated nanoparticles are approximately 100 nanometers in diameter, which is one thousand times thinner than an average sheet of paper.
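The size comparison above checks out with simple arithmetic, taking a typical sheet of paper to be about 0.1 mm (100,000 nm) thick; that thickness is an assumed representative value, not a figure from the study:

```python
paper_thickness_nm = 100_000   # ~0.1 mm, a typical sheet of paper (assumed value)
particle_diameter_nm = 100     # platelet-membrane-coated nanoparticle

print(paper_thickness_nm / particle_diameter_nm)  # -> 1000.0, i.e. 1000x thinner
```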

This cloaking technology is based on the strategy that Zhang’s research group had developed to cloak nanoparticles in red blood cell membranes. The researchers previously demonstrated that nanoparticles disguised as red blood cells are capable of removing dangerous pore-forming toxins produced by MRSA, poisonous snake bites and bee stings from the bloodstream.

By using the body’s own platelet membranes, the researchers were able to produce platelet mimics that contain the complete set of surface receptors, antigens and proteins naturally present on platelet membranes. This is unlike other efforts, which synthesize platelet mimics that replicate one or two surface proteins of the platelet membrane.

“Our technique takes advantage of the unique natural properties of human platelet membranes, which have a natural preference to bind to certain tissues and organisms in the body,” said Zhang. This targeting ability, which red blood cell membranes do not have, makes platelet membranes extremely useful for targeted drug delivery, researchers said.

Platelet copycats at work

In one part of this study, researchers packed platelet-mimicking nanoparticles with docetaxel, a drug used to prevent scar tissue formation in the lining of damaged blood vessels, and administered them to rats afflicted with injured arteries. Researchers observed that the docetaxel-containing nanoparticles selectively collected onto the damaged sites of arteries and healed them.

When packed with a small dose of antibiotics, platelet-mimicking nanoparticles can also greatly reduce bacterial infections that have entered the bloodstream and spread to various organs in the body. Researchers injected nanoparticles containing just one-sixth the clinical dose of the antibiotic vancomycin into a group of mice systemically infected with MRSA bacteria. The organs of these mice ended up with bacterial counts up to one thousand times lower than those of mice treated with the clinical dose of vancomycin alone.

“Our platelet-mimicking nanoparticles can increase the therapeutic efficacy of antibiotics because they can focus treatment on the bacteria locally without spreading drugs to healthy tissues and organs throughout the rest of the body,” said Zhang. “We hope to develop platelet-mimicking nanoparticles into new treatments for systemic bacterial infections and cardiovascular disease.”

6.  Sponge-like nanoporous gold could be key to new devices to detect disease-causing agents in humans and plants, according to UC Davis researchers.


A group from the UC Davis Department of Electrical and Computer Engineering has demonstrated that nucleic acids can be detected using nanoporous gold, a novel sensor coating material, in mixtures of other biomolecules that would gum up most detectors. This method enables sensitive detection of DNA in complex biological samples, such as serum from whole blood.

“Nanoporous gold can be imagined as a porous metal sponge with pore sizes that are a thousand times smaller than the diameter of a human hair,” said Erkin Şeker, assistant professor of electrical and computer engineering at UC Davis and the senior author on the papers. “What happens is the debris in biological samples, such as proteins, is too large to go through those pores, but the fiber-like nucleic acids that we want to detect can actually fit through them. It’s almost like a natural sieve.”


Rapid and sensitive detection of nucleic acids plays a crucial role in early identification of pathogenic microbes and disease biomarkers. Current sensor approaches usually require nucleic acid purification that relies on multiple steps and specialized laboratory equipment, which limit the sensors’ use in the field. The researchers’ method reduces the need for purification.

“So now we hope to have largely eliminated the need for extensive sample clean-up, which makes the process conducive to use in the field,” Şeker said.

The result is a faster and more efficient process that can be applied in many settings.

The researchers hope the technology can be translated into the development of miniature point-of-care diagnostic platforms for agricultural and clinical applications.

“The applications of the sensor are quite broad ranging from detection of plant pathogens to disease biomarkers,” said Şeker.

For example, in agriculture, scientists could detect whether a certain pathogen exists on a plant without seeing any symptoms. And in sepsis cases in humans, doctors might determine bacterial contamination much more quickly than at present, preventing any unnecessary treatments.

7.  Pushing the limits of lensless imaging


The Optical Society


WASHINGTON — Using ultrafast beams of extreme ultraviolet light streaming at 100,000 pulses a second, researchers from the Friedrich Schiller University Jena, Germany, have pushed the boundaries of a well-established imaging technique. Not only did they make the highest-resolution images ever achieved with this method at a given wavelength, they also created images fast enough to be used in real time. Their new approach could be used to study everything from semiconductor chips to cancer cells.

The team will present their work at the Frontiers in Optics, The Optical Society’s annual meeting and conference in San Jose, California, USA, on October 22, 2015.

The researchers wanted to improve on a lensless imaging technique called coherent diffraction imaging, which has been around since the 1980s. To take a picture with this method, scientists fire an X-ray or extreme ultraviolet laser at a target. The light scatters off, and some of those photons interfere with one another and find their way onto a detector, creating a diffraction pattern. By analyzing that pattern, a computer then reconstructs the path those photons must have taken, which generates an image of the target material—all without the lens that’s required in conventional microscopy.

“The computer does the imaging part—forget about the lens,” explained Michael Zürch, Friedrich Schiller University Jena, Germany and lead researcher. “The computer emulates the lens.”
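The reconstruction the article alludes to is typically an iterative phase-retrieval algorithm: the detector records only the magnitudes of the diffracted field, and the computer recovers the missing phases by alternating between the measured magnitudes and known constraints on the object, such as its finite extent. The sketch below is a minimal illustration using the classic error-reduction scheme on a synthetic object; the object, known support, and iteration count are assumptions for demonstration, not the researchers' actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth object: non-negative values confined to a small support region
n = 32
obj = np.zeros((n, n))
obj[12:20, 10:22] = rng.random((8, 12))
support = obj > 0  # assume the object's support is known

# The "diffraction pattern": only Fourier magnitudes are measured, phases are lost
meas_mag = np.abs(np.fft.fft2(obj))

# Error-reduction phase retrieval: alternate between the two constraint sets
g = rng.random((n, n)) * support  # random initial guess
errs = []
for _ in range(200):
    G = np.fft.fft2(g)
    errs.append(np.linalg.norm(np.abs(G) - meas_mag))  # Fourier-domain error
    G = meas_mag * np.exp(1j * np.angle(G))            # impose measured magnitudes
    g = np.real(np.fft.ifft2(G))
    g = np.clip(g, 0, None) * support                  # impose support and positivity
```

After the loop, `g` approximates the object and the error metric has shrunk from its initial random-phase value, which is the sense in which "the computer emulates the lens."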

Without a lens, the quality of the images primarily depends on the radiation source. Traditionally, researchers use big, powerful X-ray beams like the one at the SLAC National Accelerator Laboratory in Menlo Park, CA, USA. Over the last 10 years, researchers have developed smaller, cheaper machines that pump out coherent, laser-like beams in the laboratory setting. While those machines are convenient from a cost perspective, they have drawbacks in the results they can deliver.

The table-top machines are unable to produce as many photons as the big, expensive ones, which limits their resolution. To achieve higher resolutions, the detector must be placed close to the target material—similar to placing a specimen close to a microscope to boost the magnification. Given the geometry of such short distances, hardly any photons will bounce off the target at large enough angles to reach the detector. Without enough photons, the image quality is reduced.
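The geometric trade-off described above can be made concrete with the half-pitch (Abbe-type) resolution limit, wavelength / (2 sin θmax), where θmax is the largest scattering angle the detector collects. A short sketch using the article's 33 nm wavelength; the 27 mm detector width is an assumed, illustrative value.

```python
import numpy as np

wavelength = 33e-9   # 33 nm extreme-ultraviolet source (from the article)
det_width = 27e-3    # hypothetical detector width; 27 mm is an assumed value

def half_pitch_resolution(z):
    """Diffraction-limited half-pitch resolution for a detector at distance z."""
    sin_theta = (det_width / 2) / np.hypot(det_width / 2, z)
    return wavelength / (2 * sin_theta)

for z in (10e-2, 2e-2, 5e-3):  # detector-to-sample distances
    print(f"z = {z*1e2:4.1f} cm -> resolution ~ {half_pitch_resolution(z)*1e9:5.1f} nm")
```

Moving the detector from 10 cm to 5 mm increases the collected angle and sharpens the achievable resolution severalfold, which is why photon-starved table-top sources force such short working distances.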

Zürch and a team of researchers from Jena University used a special, custom-built ultrafast laser that fires extreme ultraviolet photons a hundred times faster than conventional table-top machines. With more photons, at a wavelength of 33 nanometers, the researchers were able to make an image with a resolution of 26 nanometers — almost the theoretical limit. “Nobody has achieved such a high resolution with respect to the wavelength in the extreme ultraviolet before,” Zürch said.

The ultrafast laser also overcame another drawback of conventional table-top light sources: long exposure times. If researchers have to wait for images, they can’t get real-time feedback on the systems they study. Thanks to the new high-speed light source, Zürch and his colleagues have reduced the exposure time to only about a second — fast enough for real-time imaging. When taking snapshots every second, the researchers reached a resolution below 80 nanometers.

The prospect of high-resolution and real-time imaging using such a relatively small setup could lead to all kinds of applications, Zürch said. Engineers can use this to hunt for tiny defects in semiconductor chips. Biologists can zoom in on the organelles that make up a cell. Eventually, he said, the researchers might be able to cut down on the exposure times even more and reach even higher resolution levels.

About FiO/LS

Frontiers in Optics (FiO) 2015 is The Optical Society’s (OSA) 99th Annual Meeting and is being held together with Laser Science, the 31st annual meeting of the American Physical Society (APS) Division of Laser Science (DLS). The two meetings unite the OSA and APS communities for five days of quality, cutting-edge presentations, in-demand invited speakers and a variety of special events spanning a broad range of topics in optics and photonics—the science of light—across the disciplines of physics, biology and chemistry. The exhibit floor will feature leading optics companies, technology products and programs.

About The Optical Society

Founded in 1916, The Optical Society (OSA) is a leading professional organization for scientists, engineers, students and entrepreneurs who fuel discoveries, shape real-life applications and accelerate achievements in the science of light. Through world-renowned publications, meetings and membership initiatives, OSA provides quality research, inspired interactions and dedicated resources for its extensive global network of optics and photonics experts. OSA is a founding partner of the National Photonics Initiative and the 2015 International Year of Light.

SOURCE: The Optical Society

8.  Physicists determine three-dimensional positions of individual atoms for the first time

Katherine Kornei, UCLA
The scientists were able to plot the exact coordinates of nine layers of atoms with a precision of 19 trillionths of a meter. Courtesy of Mary Scott and Jianwei (John) Miao/UCLA

Atoms are the building blocks of all matter on Earth, and the patterns in which they are arranged dictate how strong, conductive or flexible a material will be. Now, scientists at UCLA have used a powerful microscope to image the three-dimensional positions of individual atoms to a precision of 19 trillionths of a meter, which is several times smaller than a hydrogen atom.

Their observations make it possible, for the first time, to infer the macroscopic properties of materials based on their structural arrangements of atoms, which will guide how scientists and engineers build aircraft components, for example. The research, led by Jianwei (John) Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, is published September 21 in the online edition of the journal Nature Materials.

For more than 100 years, researchers have inferred how atoms are arranged in three-dimensional space using a technique called X-ray crystallography, which involves measuring how light waves scatter off of a crystal. However, X-ray crystallography only yields information about the average positions of many billions of atoms in the crystal, and not about individual atoms’ precise coordinates.

“It’s like taking an average of people on Earth,” Miao said. “Most people have a head, two eyes, a nose and two ears. But an image of the average person will still look different from you and me.”

Because X-ray crystallography doesn’t reveal the structure of a material on a per-atom basis, the technique can’t identify tiny imperfections in materials, such as the absence of a single atom. These imperfections, known as point defects, can weaken materials, which can be dangerous when the materials are components of machines like jet engines.

“Point defects are very important to modern science and technology,” Miao said.

Miao and his team used a technique known as scanning transmission electron microscopy, in which a beam of electrons smaller than the size of a hydrogen atom is scanned over a sample while the microscope measures how many electrons interact with the atoms at each scan position. The method reveals the atomic structure of materials because different arrangements of atoms cause electrons to interact in different ways.

However, scanning transmission electron microscopes only produce two-dimensional images. So, creating a 3-D picture requires scientists to scan the sample once, tilt it by a few degrees and re-scan it—repeating the process until the desired spatial resolution is achieved—before combining the data from each scan using a computer algorithm. The downside of this technique is that the repeated electron beam radiation can progressively damage the sample.
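The tilt-and-combine procedure can be illustrated in two dimensions with the simplest reconstruction method, unfiltered backprojection: each projection is smeared back across the grid at its tilt angle, and the smears reinforce one another where the sample's density actually sits. The UCLA team used a far more sophisticated iterative algorithm; this toy sketch, with nearest-neighbor rotation and a single-point "atom," only shows how many tilts merge into one reconstruction.

```python
import numpy as np

def rotate_nn(img, theta):
    """Rotate a square image by theta (radians) with nearest-neighbor sampling."""
    n = img.shape[0]
    c = (n - 1) / 2
    ys, xs = np.mgrid[0:n, 0:n]
    ct, st = np.cos(theta), np.sin(theta)
    xi = np.rint(ct * (xs - c) + st * (ys - c) + c).astype(int)
    yi = np.rint(-st * (xs - c) + ct * (ys - c) + c).astype(int)
    ok = (xi >= 0) & (xi < n) & (yi >= 0) & (yi < n)
    out = np.zeros_like(img)
    out[ok] = img[yi[ok], xi[ok]]
    return out

def project(img, theta):
    """Parallel-beam projection: rotate the sample, then sum along one axis."""
    return rotate_nn(img, theta).sum(axis=0)

def backproject(projections, thetas, n):
    """Smear each projection back across the grid at its tilt angle."""
    recon = np.zeros((n, n))
    for p, th in zip(projections, thetas):
        recon += rotate_nn(np.tile(p, (n, 1)), -th)
    return recon / len(thetas)

# Toy sample: a single "atom" at the center of a 41x41 grid
n = 41
sample = np.zeros((n, n))
sample[20, 20] = 1.0

thetas = np.linspace(0, np.pi, 18, endpoint=False)  # 18 tilt angles
sinogram = [project(sample, th) for th in thetas]
recon = backproject(sinogram, thetas, n)
```

The reconstruction peaks exactly where the point sits, and adding more tilt angles suppresses the streak artifacts, which is the intuition behind the 62-tilt series used in the experiment.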

Using a scanning transmission electron microscope at the Lawrence Berkeley National Laboratory’s Molecular Foundry, Miao and his colleagues analyzed a small piece of tungsten, an element used in incandescent light bulbs. As the sample was tilted 62 times, the researchers were able to slowly assemble a 3-D model of 3,769 atoms in the tip of the tungsten sample.

The experiment was time consuming because the researchers had to wait several minutes after each tilt for the setup to stabilize.

“Our measurements are so precise that any vibrations—like a person walking by—can affect what we measure,” said Peter Ercius, a staff scientist at Lawrence Berkeley National Laboratory and an author of the paper.

The researchers compared the images from the first and last scans to verify that the tungsten had not been damaged by the radiation, thanks to the electron beam energy being kept below the radiation damage threshold of tungsten.

Miao and his team showed that the atoms in the tip of the tungsten sample were arranged in nine layers, the sixth of which contained a point defect. The researchers believe the defect was either a hole in an otherwise filled layer of atoms or one or more interloping atoms of a lighter element such as carbon.

Regardless of the nature of the point defect, the researchers’ ability to detect its presence is significant, demonstrating for the first time that the coordinates of individual atoms and point defects can be recorded in three dimensions.

“We made a big breakthrough,” Miao said.

Miao and his team plan to build on their results by studying how atoms are arranged in materials that possess magnetism or energy storage functions, which will help inform our understanding of the properties of these important materials at the most fundamental scale.

“I think this work will create a paradigm shift in how materials are characterized in the 21st century,” he said. “Point defects strongly influence a material’s properties and are discussed in many physics and materials science textbooks. Our results are the first experimental determination of a point defect inside a material in three dimensions.”

The study’s co-authors include Rui Xu, Chien-Chun Chen, Li Wu, Mary Scott, Matthias Bartels, Yongsoo Yang and Michael Sawaya, all of UCLA; as well as Colin Ophus of Lawrence Berkeley National Laboratory; Wolfgang Theis of the University of Birmingham; Hadi Ramezani-Dakhel and Hendrik Heinz of the University of Akron; and Laurence Marks of Northwestern University.

This work was primarily supported by the U.S. Department of Energy’s Office of Basic Energy Sciences (grant DE-FG02-13ER46943 and contract DE-AC02-05CH11231).

9.  An SDSU chemist has developed a technique to identify potential cancer drugs that are less likely to produce side effects.


A class of therapeutic drugs known as protein kinase inhibitors has in the past decade become a powerful weapon in the fight against various life-threatening diseases, including certain types of leukemia, lung cancer, kidney cancer and squamous cell cancer of the head and neck. One problem with these drugs, however, is that they often inhibit many different targets, which can lead to side effects and complications in therapeutic use. A recent study by San Diego State University chemist Jeffrey Gustafson has identified a new technique for improving the selectivity of these drugs and possibly decreasing unwanted side effects in the future.

Why are protein kinase–inhibiting drugs so unpredictable? The answer lies in their molecular makeup.

Many of these drug candidates exhibit a phenomenon known as atropisomerism. To understand what this is, it’s helpful to understand a bit of the chemistry at work. Molecules can come in different forms that have exactly the same chemical formula and even the same bonds, just arranged differently. The different arrangements are mirror images of each other, with a left-handed and a right-handed arrangement. The molecules’ “handedness” is referred to as chirality. Atropisomerism is a form of chirality that arises from rotation about a bond called an axis of chirality. Picture two non-identical paper snowflakes tethered together by a rigid stick.

Some axes of chirality are rigid, while others can freely spin about their axis. In the latter case, this means that at any given time, you could have one of two different “versions” of the same molecule.

Watershed treatment

As the name suggests, kinase inhibitors interrupt the function of kinases—a particular type of enzyme—and effectively shut down the activity of proteins that contribute to cancer.

“Kinase inhibition has been a watershed for cancer treatment,” said Gustafson, who attended SDSU as an undergraduate before earning his Ph.D. in organic chemistry from Yale University, then working there as a National Institutes of Health postdoctoral fellow in chemical biology.

“However, it’s really hard to inhibit a single kinase,” he explained. “The majority of compounds identified inhibit not just one but many kinases, and that can lead to a number of side effects.”

Many kinase inhibitors possess axes of chirality that are freely spinning. The problem is that because you can’t control which “arrangement” of the molecule is present at a given time, the unwanted version could have unintended consequences.

In practice, this means that when medicinal chemists discover a promising kinase inhibitor that exists as two interchanging arrangements, they actually have two different inhibitors. Each one can have quite different biological effects, and it’s difficult to know which version of the molecule actually targets the right protein.

“I think this has really been under-recognized in the field,” Gustafson said. “The field needs strategies to weed out these side effects.”

Applying the brakes

So that’s what Gustafson did in a recently published study. He and his colleagues synthesized atropisomeric compounds known to target a particular family of kinases known as tyrosine kinases. To some of these compounds, the researchers added a single chlorine atom which effectively served as a brake to keep the atropisomer from spinning around, locking the molecule into either a right-handed or a left-handed version.

When the researchers screened both the modified and unmodified versions against their target kinases, they found major differences in which kinases the different versions inhibited. The unmodified compound was like a shotgun blast, inhibiting a broad range of kinases. But the locked-in right-handed and left-handed versions were choosier.

“Just by locking them into one or another atropisomeric configuration, not only were they more selective, but they inhibited different kinases,” Gustafson explained.

If drug makers incorporated this technique into their early drug discovery process, he said, it would help identify which version of an atropisomeric compound actually targets the kinase they want to target, cutting the potential for side effects and helping to usher drugs past strict regulatory hurdles and into the hands of waiting patients.

11.  ‘Nanocubes’ Make PSA Test Over 100 Times More Sensitive


A new catalyst that improves the sensitivity of the standard PSA test more than 100-fold, pictured above, is made of palladium nanocubes coated with iridium. (Credit: Xiaohu Xia, Michigan Technological University)

Say you’ve been diagnosed with prostate cancer, the second-leading cause of cancer death in men. You opt for surgery to remove your prostate. Three months later, a prostate-specific antigen (PSA) test shows no prostate cells in your body. Everyone rejoices.

Until 18 months later, when another PSA test reveals that prostate cells have reappeared. What happened?

The first PSA test yielded what’s known as a false negative result. It did not detect the handful of cells that remained after surgery and later multiplied. Now a chemist at Michigan Technological University has made a discovery that could, among other things, slash the numbers of false negatives in PSA tests.

Xiaohu Xia and his team, including researchers from Louisiana State University and the University of Texas at Dallas, have developed a new catalyst that could make lab tests like the PSA much more sensitive. And it may even speed up reactions that neutralize toxic industrial chemicals before they enter lakes and streams.

A paper on the research, “Pd-Ir Core-Shell Nanocubes: A Type of Highly Efficient and Versatile Peroxidase Mimic,” was published online Sept. 3 in ACS Nano. In addition to Xia, the coauthors are graduate students Jingtuo Zhang, Jiabin Liu and Haihang Ye and undergraduate Erin McKenzie of Michigan Tech; Moon J. Kim and Ning Lu of the University of Texas at Dallas; and Ye Xu and Kushal Ghale of Louisiana State University. The LSU team conducted theoretical calculations, and the UT Dallas team contributed high-resolution electron microscopy images.

Their new catalyst mimics the action of similar biochemicals found in nature, called peroxidases. “In animals and plants, these peroxidases are important; for example, they get rid of hydrogen peroxide, which is harmful to the organism,” said Xia, an assistant professor of chemistry at Michigan Tech. In medicine, peroxidases have become powerful tools for accelerating chemical reactions in diagnostic tests; a peroxidase found in the horseradish root is commonly used in the standard PSA test.

However, these natural peroxidases have drawbacks. They can be difficult to extract and purify. “And, they are made of protein, which isn’t very stable,” Xia explained. “At high temperatures, they cook, like meat.”

“Moreover, their efficiency is just fair,” he added. “We wanted to develop a mimic peroxidase that was substantially more efficient than the natural peroxidase, which would lead to a more-sensitive PSA test.”

Their new catalyst, made from nanoscale cubes of palladium coated with a few layers of iridium atoms, does just that. PSA tests Xia’s team conducted using the palladium-iridium catalyst were 110 times more sensitive than tests completed with the conventional peroxidase.

“After surgery, it’s vital to detect a tiny amount of prostate antigen, because otherwise you can get a false negative and perhaps delay treatment for cancer,” said Xia. “Our ultimate goal is to further refine our system for use in clinical diagnostic laboratories.”

Xia hopes that his mimic peroxidase will someday save lives through earlier detection of cancer and other maladies. He also plans to explore other applications, including how it compares with horseradish peroxidase in other catalytic reactions: breaking down toxic industrial-waste products like phenols into harmless substances.

Finally, the team wants to better understand why its palladium-iridium catalyst works so well. “We know the iridium coating is the key,” Xia said. “We think it makes the surface sticky, so the chemical reagents bind to it better.”

12.  Using Proteomics To Understand How Genetic Mutations Rewire Cancer Cells


SAN JOSE, Calif.–(BUSINESS WIRE)–Thermo Fisher Scientific and the Biotech Research and Innovation Center (BRIC) at the University of Copenhagen (UCPH) have shared results from two important scientific papers that advance understanding of how gene mutations drive cancer progression. The two landmark studies, published this week in the journal Cell, are some of the early results of the strategic collaboration between Thermo Fisher Scientific and the Linding Lab at BRIC, UCPH.

Using advanced Thermo Scientific Orbitrap Fusion mass spectrometry and next-generation sequencing technologies, researchers from the Universities of Copenhagen, Yale, Zurich, Rome and Tottori describe how specific cancer mutations target and damage the protein signaling networks within human cells on a global scale.

By developing advanced algorithms to integrate data from quantitative mass-spectrometry and next generation sequencing of tumor samples, the researchers have been able to uncover cancer-related changes to phosphorylation signaling networks. This new breakthrough allows researchers to identify the effects of mutations on the function of protein pathways in cancer for individual patients, even if those mutations are very rare.

Lead BRIC researcher Dr. Rune Linding said: “The identification of distinct changes within our tissues that could have the potential to help predict and treat cancer is a major step forward and we are confident that it can aid in the development of novel therapies and screening techniques.”

Since the human genome was decoded more than a decade ago, large scale cancer genome studies have successfully identified gene mutations in individual patients and tumors. However, to develop improved cancer therapies, researchers need to explain and relate this genomic data to proteins, the targets of most pharmaceutical drugs. Creating this linkage provides powerful new insights into cancer biology and potential therapeutic approaches.

“The studies highlight the importance of integrating proteomics with genomics in future cancer studies and underscore the value of the broad technological expertise within Thermo Fisher,” said Ken Miller, vice president of research product marketing, life sciences mass spectrometry at Thermo Fisher. “It is becoming increasingly apparent that the genetic basis for each patient’s cancer is subtly, but importantly, different. This realization will inevitably lead to a need for tools to acquire and assess patient-specific information to develop highly personalized therapies with the potential for much greater efficacy. It is hoped that the novel approaches described in these studies, together with best-in-class enabling technologies such as the Orbitrap and Ion Torrent systems, will continue to improve our knowledge of cancer biology.”

The Biotech Research & Innovation Centre (BRIC) was established in 2003 by the Danish Ministry of Science, Technology and Innovation to form an elite centre in biomedical research.

The two studies will be available in advance online and printed in the September 24 issue of Cell, a premier journal in life and biological sciences. More information about the studies and links to media content can be found on http://www.lindinglab.science and http://www.bric.ku.dk. The work was supported by the European Research Council (ERC), the Lundbeck Foundation and Human Frontier Science Program.

13.  Multi-Ancestry GWAS Uncovers a Dozen New Loci Linked to Blood Pressure

Sep 21, 2015


NEW YORK (GenomeWeb) – In Nature Genetics, an international team described a dozen new loci influencing blood pressure patterns across individuals from multiple populations — a set that overlaps with variants implicated in epigenetic features of blood and other tissues.

Through a multi-stage genome-wide association study that relied on genotyping information for as many as 320,251 individuals of East Asian, South Asian, and European descent, the researchers focused in on SNPs at 12 blood pressure-associated sites in the genome, including loci previously linked to cardiac or metabolic functions.

In particular, the team saw blood pressure-linked variants in and around genes contributing to vascular smooth muscle and renal function. And a large proportion of the associated SNPs — or variants in linkage disequilibrium with them — turned up at sites already implicated in control of DNA methylation.

“We note an effect of genome-wide-associated sentinel SNPs on DNA methylation for traits in addition to blood pressure, suggesting that DNA methylation might have a wider role in linking common genetic variation to multiple phenotypes,” the study’s authors wrote.

More than a billion people around the world are affected by high blood pressure, the team explained, a condition that elevates the risk of heart disease, heart attack, stroke, and chronic kidney disease.

Because it occurs at especially high rates in East Asian and South Asian populations, the investigators reasoned that it might be possible to find both ancestry-specific and trans-ancestral genetic associations with high blood pressure.

The team started by analyzing imputed and directly genotyped SNPs in 31,516 individuals of East Asian ancestry, 35,352 individuals with European ancestry, and 33,126 individuals of South Asian descent, searching for variants associated with systolic blood pressure, diastolic blood pressure, pulse pressure, mean arterial pressure, and hypertension.

Through analyses on each population individually and in a meta-analysis of individuals from all three populations, the researchers initially identified 630 loci with suspected ties to at least one of the five blood pressure traits considered.
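A trans-ancestry meta-analysis of this kind typically combines each population's per-SNP effect estimate using inverse-variance weighting, so that more precisely estimated effects contribute more. The sketch below is a minimal fixed-effect version; the effect sizes and standard errors are made-up numbers for illustration, not values from the study.

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance-weighted fixed-effect meta-analysis of one SNP."""
    weights = [1.0 / se ** 2 for se in ses]           # precision of each estimate
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))                # pooled standard error
    z = beta / se
    p = math.erfc(abs(z) / math.sqrt(2))              # two-sided p-value
    return beta, se, p

# Hypothetical per-ancestry estimates for one blood-pressure SNP
betas = [0.10, 0.12, 0.08]   # e.g., East Asian, European, South Asian cohorts
ses   = [0.02, 0.03, 0.025]
beta, se, p = fixed_effect_meta(betas, ses)
```

The pooled standard error is always smaller than any single cohort's, which is why combining three ancestry groups lets the consortium detect loci that no individual cohort could.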

They then compared the top SNP at each site against data on as many as 87,205 individuals tested for various blood pressure traits for the International Consortium on Blood Pressure GWAS, narrowing in on 19 loci with potential ties to blood pressure that were not described in the past.

The team confirmed blood pressure associations for SNPs at 12 of the new loci through testing on another 48,268 East Asians, 68,456 Europeans, and 16,328 South Asians.

The analysis also verified almost two dozen loci linked to blood pressure in the past and pointed to 17 sites in the genome with weaker ties to the traits of interest.

Variants at the 12 new loci seemed to have similar effects on the five traits in question, regardless of the population considered, while variants that first appeared to show population-specific effects in East Asians and Europeans did not pan out in replication testing.

By folding in linkage disequilibrium patterns for SNPs at the new blood pressure-associated sites, the researchers got a look at genes that fall near these linked SNPs — a collection that includes genes such as PDE3A, KCNK3, and PRDM6.

They also used these linkage patterns to look for overlap with DNA methylation-related SNPs, demonstrating that 28 of 35 SNPs at these loci seem to be linked to altered DNA methylation levels and related expression shifts in samples from thousands of Europeans or East Asians.

And the team saw similar effects in hundreds of cord blood samples subjected to methylation profiling, suggesting the effect is not simply a consequence of high blood pressure itself.

“The presence of these associations at an early stage of life, before substantial environmental exposure, lends support to the view that the sequence variants have a direct effect on DNA methylation and argues against reverse causation,” the study authors wrote.

14.  Elabela, A New Human Embryonic Stem Cell Growth Factor

September 20, 2015 by mburatov

When embryonic stem cell lines are made, they are traditionally grown on a layer of “feeder cells” that secrete growth factors that keep the embryonic stem cells (ESCs) from differentiating and drive them to grow. These feeder cells are usually irradiated mouse fibroblasts that coat the culture dish, but do not divide. Mouse ESCs can be grown without feeder cells if the growth factor LIF is provided in the medium. LIF, however, is not the growth factor required by human ESCs, and therefore, designing culture media for human ESCs to help them grow without feeder cells has proven more difficult.

Having said that, several laboratories have designed media that can be used to derive human embryonic stem cells without feeder cells. Such a procedure is very important if such cells are to be used for therapeutic purposes, since animal cells can harbor difficult-to-detect viruses and unusual sugars on their cell surfaces that can also be transferred to human ESCs in culture. These unusual sugars can elicit a strong immune response against them, and for this reason, ESCs must be cultivated or derived under feeder-free conditions. However, to design good feeder-free culture media, we must know more about the growth factors required by ESCs.

To that end, Bruno Reversade from The Institute of Molecular and Cell Biology in Singapore and others have identified a new growth factor that human ESCs secrete themselves. This protein, ELABELA (ELA), was first identified as a signal for heart development. However, Reversade’s laboratory has discovered that ELA is also abundantly secreted by human ESCs and is required for human ESCs to maintain their ability to self-renew.

Reversade and others deleted the ELA gene with the CRISPR/Cas9 system, and they also knocked the expression of this gene down in other cells with small interfering RNAs. Alternatively, they also incubated human ESCs with antibodies against ELA, which neutralized ELA and prevented it from binding to the cell surface. However ELA was inhibited, the results were the same: reduced ESC growth, increased cell death, and loss of pluripotency.

How does ELA signal to cells to grow? Global signaling studies of growing human ESCs showed that ELA activates the PI3K/AKT/mTORC1 signaling pathway, which has been shown in other work to be required for cell survival. By activating this pathway, ELA drives human ESCs through cell-cycle progression, activates protein synthesis, and inhibits stress-induced apoptosis.

Interestingly, INSULIN and ELA have partially overlapping functions in human ESC culture medium, but only ELA seems to prime human ESCs toward the endoderm lineage. In the heart, ELA binds to the Apelin receptor APLNR. This receptor, however, is not expressed in human ESCs, which suggests that another receptor, whose identity remains unknown at the moment, binds ELA in human ESCs.

Thus ELA, an endogenous growth factor secreted by human ESCs, seems to act through an alternate, as-yet-unidentified cell-surface receptor.

This paper was published in the journal Cell Stem Cell.

15.  Multiwavelength TIRF Microscopy Enables Insight into Actin Filaments


Researchers at the University of California, San Francisco (UCSF) are combining multiple laser excitation wavelengths in total internal reflection fluorescence (TIRF) microscopy to investigate the binding dynamics of individual actin filaments.


TIRF microscopy provides a unique method of imaging isolated molecules and complexes in vitro. Additionally, the use of sensitive, low-noise cameras enables researchers to study this behavior in real time. A new plug-and-play method of combining several fiber-delivered, digitally modulated lasers into a single instrument, such as a TIRF microscope, now enables multiple labeled proteins to be imaged pseudosimultaneously at high frame rates. This article explores how multiwavelength excitation is being combined with TIRF microscopy in the laboratory of Dr. Dyche Mullins, a professor at UCSF, and how it’s being used to gain new insights into complex biochemical interactions that control the stability and function of actin filaments.

TIRF microscopy in single-filament studies

The Mullins Lab, located at UCSF’s Mission Bay campus, is widely recognized as a leading authority on the study of actin filaments. The protein filaments are fundamental to many processes in virtually every eukaryotic cell — they act as structural elements that enable movement of internal cargoes, amoeboid cell migration, cell division, etc. With these filaments playing so many different roles, it is not surprising that their combination of growth, branching, aggregation and movement involves many subtle control options, which are mediated by a range of different proteins. Sam Lord, the Mullins Lab’s microscope specialist, said, “One area of our research is studying how various proteins bind to actin filaments to enable aggregation, branching and other actions, and more specifically, how yet another set of proteins modulates these binding processes. Obviously, we do bulk studies in a cuvette that reveal overall kinetic data about these binding processes, but we also want to image these processes in real time to study the structural biochemistry.” In order to do so, the lab uses TIRF microscopy to observe single actin filaments.

This process involves excitation light that is introduced into the sample region through either a glass slide or a cover slip. The microscope’s optics are configured so that the light hits the glass/sample interface beyond the critical angle, meaning that all of the light will undergo total internal reflection (TIR). However, even with TIR, some of the light’s electric field, called the evanescent wave, penetrates into the sample by an incredibly short distance — typically around 100 nm — beyond the interface. This means that TIRF microscopy can be used to selectively excite fluorescence in molecules and complexes that are adhered to the interface. However, because the light does not penetrate into the bulk (i.e., background) sample region, this methodology will not excite fluorescence from the huge backdrop of molecules freely floating within this medium.
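The roughly 100-nm figure follows from the standard evanescent-field relation, in which the 1/e penetration depth is d = λ / (4π√(n₁²sin²θ − n₂²)). A minimal Python sketch; the refractive indices and the 3° over-critical incidence angle are illustrative assumptions, not values taken from the article:

```python
import math

def critical_angle(n1, n2):
    """Critical angle (radians) for total internal reflection from index n1 into n2 (n1 > n2)."""
    return math.asin(n2 / n1)

def penetration_depth(wavelength_nm, n1, n2, theta):
    """1/e penetration depth (nm) of the evanescent field beyond the interface."""
    return wavelength_nm / (4 * math.pi * math.sqrt((n1 * math.sin(theta)) ** 2 - n2 ** 2))

# Illustrative values: glass coverslip (n1 ~ 1.52) against aqueous buffer (n2 ~ 1.33)
n1, n2 = 1.52, 1.33
theta_c = critical_angle(n1, n2)

# Excite slightly beyond the critical angle with a 488-nm laser
d = penetration_depth(488, n1, n2, theta_c + math.radians(3))
print(f"critical angle: {math.degrees(theta_c):.1f} deg, penetration depth: {d:.0f} nm")
```

For these numbers the depth comes out on the order of 100 nm, consistent with the value quoted in the text; moving closer to the critical angle makes the field reach deeper into the sample.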

TIRF microscopy is thus a 3D-resolved imaging technique. Its X-Y resolution is limited only by diffraction and/or the camera resolution, but the Z-axis sampling depth is much smaller than the diffraction limit. If there is sufficient signal for fast frame acquisition speeds, the important fourth dimension — time — enables dynamic processes, such as actin filament-protein binding, to be observed on a single filament or on a network of filaments, in real time.

In principle, both laser and nonlaser light sources may be used for fluorescence excitation in such TIRF-based applications. However, for experiments with naturally low signal levels, such as single-molecule monitoring, a laser beam’s extreme brightness is a critical advantage. In particular, a laser’s unique spatial brightness means that it is relatively simple to collimate and subsequently focus the beam into the sample with a narrow range of incidence angles, avoiding excitation of the bulk sample.

Through-objective TIRF microscopy

All TIRF microscope setups are based on one of two basic approaches: through-objective lens geometry or the prism-based method. In the former approach, light is directed in an off-axis geometry through an oil-immersion microscope objective so that the angle of incidence at the coverslip/sample interface is greater than the critical angle, as is shown schematically in Figure 1.

Figure 1.
 In TIRF microscopy, excitation light incident beyond the critical angle is completely reflected. The evanescent field at the refractive interface nevertheless penetrates into the sample by about 100 nm, causing selective excitation of molecules and complexes adhered to this interface. TIRF microscopes are available with a choice of either through-objective excitation or prism excitation options.
In the prism-based method, the orientation of the sample is reversed with respect to the imaging objective. A light beam is introduced to the sample through a prism attached to the cover slip; the geometry of the prism ensures that the incidence angle at the sample is greater than the critical angle.

Depending on the type of experiment being performed, there are both advantages and disadvantages to each of the above methods. For example, the prism method limits physical access to the sample. As Lord explained, the Mullins Lab uses a Nikon microscope in the through-objective configuration with a very high numerical aperture (NA = 1.49) for several reasons. “For single-molecule studies, fluorescence signal strength is always a major challenge, particularly since we are following processes that need fast frame rates. So, we need a high-NA objective with a small working distance to maximize light collection efficiency. These objectives require a coverglass of precise thickness and the sample near the top of the coverslip to minimize aberrations.” Lord also stated that caution must be taken so the team does not introduce scattering and other losses due to viewing fluorescence through the bulk of the sample.

Multiple, simultaneous laser wavelengths

As has been noted, the mechanisms controlling the binding of regulatory proteins to actin filaments are quite complex. To better understand these processes, the Mullins Lab increasingly has been using sophisticated, multiwavelength TIRF-based experiments. In order to image multiple fluorophores, Lord explained, “We can use either multiple sequenced lasers or a scope equipped with multiple cameras — we have setups for both arrangements.” He continued, “Multiple excitation wavelengths that sequence at high rates enable us to selectively image multiple, differently labeled targets using a microscope equipped with a single high-sensitivity camera, and ensures near-perfect image registration.”

When using multiple lasers, the two technical challenges are to perfectly coalign the lasers into the microscope objective and then to switch between the different wavelengths. In order to follow fast binding processes in real time, researchers typically must switch wavelengths between alternate camera frames to build up pseudosimultaneous (i.e., interleaved) videos at two or sometimes three laser wavelengths. This switching must be performed with no dead time or shifts in the beam path, and without resorting to mechanical shutters or a complex and costly approach, such as an acousto-optic tunable filter.

Lord notes, “As recently as five years ago, we simply didn’t have low-cost options to conduct single molecule studies using multiple laser wavelengths pseudosimultaneously at the requisite frame rates (30 fps) in order to follow critical binding processes.” He added, “Digitally controllable diode or solid-state lasers, hardware sequencing electronics and quad-band optical filters make it possible to achieve nearly simultaneous multicolor imaging with a single camera.”

In 2014, the lab acquired several digitally modulatable smart lasers to enable multiwavelength TIRF microscopy. These lasers included Coherent’s fiber-pigtailed OBIS FP modules that operate at 488, 561 and 640 nm. The lab also acquired the OBIS Galaxy, which enables simple plug-and-play combining of up to eight fiber-coupled lasers into one, single-mode output fiber. As was detailed by Coherent’s Dan Callen and Matthias Schulze in BioPhotonics’ November 2014 issue (“Laser Combiner Enables Scanning Fluorescence Endoscopy,” www.photonics.com/A56915), this passive module enables lasers to be added or subtracted (i.e., hot-swapped) to any fiber-coupled instrument or setup in a few minutes or less via standard fiber connectors, such as FC/UFC and FC/APC connectors.

Figure 2. The OBIS Galaxy (shown with the top cover removed) allows plug-and-play combining of up to eight separate fiber coupled lasers into a single output fiber. Courtesy of Sam Lord.
The timing hardware setup at the Mullins Lab is very simple because these smart lasers support direct digital modulation. In each experiment, the frame rate is set by the microscope’s high-sensitivity camera, an Andor DU897. The camera’s TTL output trigger pulses are processed in either a programmable Arduino board or an ESio controller, which then directs TTL pulses to fire one of the three lasers without any hardware or software delays. Alternating wavelengths are used in most experiments, although any sequence of wavelength frames can easily be programmed using the Arduino and Micro-Manager software.1
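The frame-by-frame interleaving described above (camera trigger in, one laser fired per frame) can be sketched in Python. This simulates only the sequencing logic, not the lab's actual Arduino firmware:

```python
def laser_sequence(frame_triggers, wavelengths=(488, 561, 640)):
    """Map each camera TTL trigger to the laser wavelength that fires on that frame.

    Mimics the interleaving described in the text: wavelengths cycle frame by
    frame, so a single camera records pseudosimultaneous color channels.
    """
    return [wavelengths[i % len(wavelengths)] for i in range(frame_triggers)]

# Six camera frames with three-color interleaving: each laser fires every third frame
print(laser_sequence(6))  # -> [488, 561, 640, 488, 561, 640]

# Two-color experiments simply pass a shorter wavelength tuple
print(laser_sequence(4, (488, 640)))  # -> [488, 640, 488, 640]
```

Deinterleaving the recorded video by the same modulus then recovers one movie per fluorophore with near-perfect image registration, since every channel shares the same camera and optical path.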

According to Lord, the flexibility of this arrangement supports future experimental setups that have even greater levels of complexity. In particular, he added, “We may well add a 405-nm laser option in the near future. If/when this arrives, we can simply plug it in and we are ready to go.”

Investigating modulation of actin binding processes

In the team’s work on the binding of actin filaments, this flexible TIRF setup enables the Mullins Lab to conduct experiments with several different approaches. For example, in typical two-wavelength experiments, the actin filament is labeled with one fluorophore, and the protein of interest is labeled with another fluorophore. The protein fluorophore only appears in the TIRF-produced images if/when it binds to the actin sitting on the cover slip. One use of the third wavelength is to image a second protein, which is labeled with a different fluorophore. The image sequences then may reveal, for example, whether the proteins are interspersed at different sites on the filament, or whether the second protein promotes filament growth or branching from a new site. Or, it may reveal that the second protein competitively displaces the first.

In a recently published study,2 Mullins Lab researchers used their multilaser TIRF setup to investigate the details of control mechanisms associated with the binding of tropomyosins to actin filaments. Tropomyosins are coiled-coil proteins whose known functions are to bind actin filaments and thereby regulate multiple cytoskeletal functions — including actin network dynamics near the leading edge of motile cells.

Mullins explained, “The binding of tropomyosins to actin filaments is known to be fundamentally important in actin dynamics. But, we do not yet fully understand how this binding is regulated, especially near the leading edge of migrating cells. Why, for example, are filaments in the lamellum coated with tropomyosin while filaments in the adjacent lamellipod are not?” (Lamellum and lamellipod are distinct, actin-based substructures involved in cell migration.) He went on to state that, prior to his team’s latest studies, research had demonstrated that tropomyosins inhibit actin nucleation by the Arp2/3 protein complex and that tropomyosin binding, in turn, prevents filament severing by the protein cofilin.3,4 “So, we have recently used TIRF and other methods to investigate if and how the Arp2/3 complex and cofilin in turn modulate the binding of tropomyosins to actin filaments,” he said.

Figure 3.
 TIRF images showing Tm1A binding preferentially to the pointed end of single actin filaments. The red signal is from Cy5 labeled Tm1A fluorescence excited at 640 nm, and the green signal is due to Alexa 488 labeled actin excited at 488 nm. Courtesy of J.Y. Hsiao, L.M. Goins, N.A. Petek, R.D. Mullins.
The team members studied these interactions in the specific case of nonmuscle Drosophila tropomyosin protein, Tm1A. They also compared some of these interactions in Tm1A to the same interactions in rabbit skeletal muscle tropomyosin, as other researchers previously have found that mammalian skeletal muscle tropomyosin is the least-effective Arp2/3 inhibitor.2

Data from dual-wavelength TIRF excitation of single filaments are shown in Figure 3. These data show that Tm1A preferentially binds near the pointed end of actin filaments. By comparing similar data obtained under different experimental conditions, the researchers showed that pointed-end binding depends on the nucleotide state of the actin and on the Tm1A concentration.

Although a complete evaluation of all of the research’s results, conclusions and wider implications falls outside the scope of this article, Mullins does summarize some of the key points. “Binding of cytoskeletal tropomyosin to actin filaments turns out to be more complicated than previously appreciated. Both nucleation and spreading of tropomyosin are strongly influenced by the conformation of the actin filament and the presence of other regulatory proteins.” Mullins added that, based on TIRF-produced images and other collected data, “We have been able to propose a model where the cooperation of the severing activity of cofilin and tropomyosin binding helps establish the border between the lamellipod and lamellum.” The role of cofilin in the model referenced by Mullins is shown in Figure 4.

Figure 4.
 These images summarize the role of cofilin in the model proposed by Hsiao et al.2 The branched actin network on the left shows the situation in the absence of cofilin, where tropomyosin binding is blocked by Arp2/3 branches. The branched actin network on the right illustrates that in the presence of cofilin, new pointed ends are created, which allows tropomyosin to bind. Once tropomyosin is bound, it protects the actin filaments from further cofilin severing, possibly resulting in the transition from the lamellipod to the lamellum. Courtesy of J.Y. Hsiao, L.M. Goins, N.A. Petek, R.D. Mullins.
In summation, TIRF microscopy is a well-established technique for imaging single molecular structures and protein complexes. This method also enables their respective dynamics to be observed in real time. By providing a method to rapidly switch between two or more excitation wavelengths, the latest lasers and laser-combining technologies are now enabling researchers to perform TIRF microscopy experiments with a greater number of separate labels. This capability is delivering unique insights into important and multifaceted processes in the study of cell biology.

Meet the author

Dan Callen is a product manager at Coherent Inc. in Santa Clara, Calif.; email: daniel.callen@coherent.com.


1. A.D. Edelstein et al. (2014). Advanced methods of microscope control using μManager software. J Biol Methods, Vol. 1, No. 2, e10.

2. J.Y. Hsiao et al. (2015). Arp2/3 complex and cofilin modulate binding of tropomyosin to branched actin networks. Curr Biol, pp. 1-10.

3. L. Blanchoin et al. (2001). Inhibition of the Arp2/3 complex-nucleated actin polymerization and branch formation by tropomyosin. Curr Biol, Vol. 11, No. 16, pp. 1300-1304.

4. J.H. Iwasa and R.D. Mullins (2007). Spatial and temporal relationships between actin-filament nucleation, capping and disassembly. Curr Biol, Vol. 17, No. 5, pp. 395-406.

18.  Using QCLs for MIR-Based Spectral Imaging — Applications in Tissue Pathology


A quantum cascade laser (QCL) microscope allows for fast data acquisition, real-time chemical imaging and the ability to collect only spectral frequencies of interest. Due to their high-quality, highly tunable illumination characteristics and excellent signal-to-noise performance, QCLs are paving the way for the next generation of mid-infrared (MIR) imaging methodologies.


H. Sreedhar*1, V. Varma*2, A. Graham3, Z. Richards1, F. Gambacorata4, A. Bhatt1,
P. Nguyen1, K. Meinke1, L. Nonn1, G. Guzman1, E. Fotheringham5, M. Weida5,
D. Arnone5, B. Mohar5, J. Rowlette5
Real-time, MIR chemical imaging microscopes could soon become powerful frontline screening tools for practicing pathologists. The ability to see differences in the biochemical makeup across a tissue sample greatly enhances a practitioner’s ability to detect early stages of disease or disease variants. Today, this is accomplished much as it was 100 years ago — through the use of specially formulated stains and dyes in combination with white light microscopy. A new MIR, QCL-based microscope from Daylight Solutions enables real-time, nondestructive biochemical imaging of tissues without the need to perturb the sample with chemical or heat treatments, thus preserving the sample for follow-on fluorescence tagging, histochemical staining or other “omics” testing within the workflow.
MIR chemical imaging is a well-established absorbance spectroscopy technique; it senses the relative amount of light that molecules absorb due to their unique vibrational resonances falling within the MIR portion of the electromagnetic spectrum (i.e., wavelengths from approximately 2 to 15 µm). This absorption can be detected with a variety of MIR detector types and can provide detailed information about the sample’s chemical composition.
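The quantity sensed at each pixel is the absorbance, related to transmitted intensity by the usual Beer-Lambert expression A = −log₁₀(I/I₀), which scales with the local concentration of the absorbing species. A minimal Python illustration; the transmission value is an invented example:

```python
import math

def absorbance(I, I0):
    """Absorbance A = -log10(I / I0) from transmitted (I) and incident (I0) intensity."""
    return -math.log10(I / I0)

# Beer-Lambert: A = epsilon * c * l, so a pixel's absorbance at a chosen
# vibrational band reports the local concentration of the absorbing species.
# Illustrative number: 10% of the incident MIR light transmitted -> A = 1.0
print(absorbance(0.10, 1.0))  # -> 1.0
```

Scanning the QCL across only the spectral frequencies of diagnostic interest, rather than a full interferogram, is what lets such a microscope build chemical images in real time.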

The most common instrument for this type of measurement is known as a Fourier transform infrared (FTIR) spectrometer. FTIR systems use a broadband MIR light source, known as a globar, to illuminate a sample; the absorption spectrum is generated by the use of interferometry. Throughout the past decade, FTIR systems have incorporated linear arrays and 2D focal plane arrays (FPAs) in a microscope configuration to enable a technique known as chemical imaging.

19.  Inner Ear Undertakers

Support cells in the inner ear respond differently to two drugs that kill hair cells.

By Kerry Grens | September 1, 2015


The paper
E.L. Monzack et al., “Live imaging the phagocytic activity of inner ear supporting cells in response to hair cell death,” Cell Death Differ, doi:10.1038/cdd.2015.48, 2015.

Killer drugs
A number of commonly used medications can cause hearing loss by killing off cochlear hair cells, which translate sound waves into neural activity. To understand how they die, Lisa Cunningham and Elyssa Monzack of the National Institute on Deafness and Other Communication Disorders and colleagues turned to the utricle, a vestibular inner-ear structure involved with balance. Its hair cells are very similar to those in the cochlea, which are notoriously resistant to culturing when mature.

Body bags
The team developed a method to watch hair cells of whole mouse utricles die in real time after exposure to the chemotherapy drug cisplatin or the antibiotic neomycin. In response to the latter, supporting cells, glia-like neighbors of hair cells, appeared to form a phagosome around the corpses and engulf them. “You can see two, three, sometimes four supporting cells advancing simultaneously on that hair cell corpse,” says Cunningham—which suggests that the dying cell is giving off a specific and local signal.

Spilled guts
In contrast, cisplatin-induced hair cell death provoked hardly any phagocytic reaction from supporting cells, about half of which themselves succumbed. Cunningham says this could have clinical implications if dead hair cells then spill their cytoplasmic contents into the tissue, which can result in an immune response that can cause even further damage.

Distress call
Mark Warchol of Washington University in St. Louis says it will be important to identify the signal supporting cells are responding to after neomycin treatment. “There’s some molecular signal by which the hair cell causes [supporting cells] to execute this process. And with cisplatin, they’re just not capable of doing it.”



20.  Inner Ear Cartography

Scientists map the position of cells within the organ of Corti.

By Ruth Williams | September 1, 2015


Age-related hearing loss caused by damage to the sensory hair cells within the cochlea is extremely common, but studying the inner ear is tough. “It’s in the densest bone in the body, so you don’t have access,” says John Brigande of Oregon Health and Science University in Portland. Even if you can extract cells, he says, “there are so darn few of them.”

Despite these technical difficulties, researchers have gleaned gene-expression information about different cell types within the organ of Corti—home to the sensory cells within the cochlea. But “it’s not only important to know what a cell expresses,” says Robert Durruthy-Durruthy, a postdoc in the Stanford University lab of Stefan Heller. “It’s also important to know where it can be found within a tissue.”

To this end, Durruthy-Durruthy, Heller, and postdoc Jörg Waldhaus have derived a 2-D map of organ of Corti cells from neonatal mice. First, the team sorted all cell types across the medial-to-lateral axis (or width) of the organ based on marker gene expression. The approximately 900 sorted cells, representing nine cell types, were then each quantitatively analyzed for the expression of 192 selected genes. Computational analysis of these expression data then enabled reconstruction of the cells’ positions along the organ’s apical-to-basal (length) and medial-to-lateral axes. In principle, the technique, which harnesses gene-expression information to determine cells’ spatial organization, could be applied to generate 2-D maps of any complex tissue, says Durruthy-Durruthy.

Within the mammalian cochlea, apical cells retain regenerative capacity for a few weeks after birth, but basal cells do not. “Spatial mapping allows us to get at the differences [between these cells],” says Brigande, and that could ultimately highlight possible ways to reinstate regeneration in the adult ear. (Cell Reports, 11:1385-99, 2015)

  FROM ORGAN TO SINGLE CELLS: To build a map of cells within the organ of Corti—where sound is translated to neural activity—scientists divide the cochlea in two. Each half of the organ of Corti is then broken up into its constituent cells, which comprise nine cell types (represented by the nine colors) spanning the organ’s medial-to-lateral axis.


21.  Resveratrol Stabilizes Amyloid in Alzheimer’s

Pauline Anderson

September 17, 2015


High doses of purified resveratrol, a polyphenol found in some foods, appear to stabilize levels of amyloid beta (Aβ) in cerebrospinal fluid (CSF) and in plasma in patients with mild to moderate Alzheimer disease (AD) and are safe and well tolerated, a new phase 2 study has shown.

Although it is too soon to start recommending resveratrol supplements to patients, the research indicates that this compound is safe and promising, said lead author R. Scott Turner, MD, PhD, professor of neurology and director of the Memory Disorders Program at Georgetown University Medical Center, Washington, DC, one of 21 medical centers across the United States that participated in the study.

“It seems to have some interesting effects, enough to justify further research into this strategy,” he told Medscape Medical News.

The study was published online September 11 as an Open Access article in Neurology.

Natural Compound

Resveratrol is a naturally occurring compound found in red grapes, red wine, dark chocolate, and some other foods, and is widely available as a supplement.

It is believed that resveratrol promotes resilience to stress, as levels increase in plants exposed to severe cold or to fungus, said Dr Turner. Animal research suggests that resveratrol may affect sirtuins, which are proteins that are activated with calorie restriction, which is a form of mild stress.

The study included 119 patients randomly assigned to either high doses of pure synthetic pharmaceutical grade resveratrol that is not available commercially (n = 64) or placebo (n = 55). The resveratrol used in the study was introduced at a dose of 500 mg a day and was increased every 3 months, so that by the end of the 1-year study, subjects were taking 2000 mg a day.

Results showed that at 1 year, CSF Aβ40 levels in the treated group had declined from 6574 to 6513 ng/mL, whereas in the placebo group they fell from 6560 to 5622 ng/mL, a statistically significant between-group difference at week 52 (P = .002).

This difference was also found in secondary analyses of study completers, in the mild dementia subgroup, and in APOE4 carriers and noncarriers.

The treated group’s Aβ40 levels in plasma declined from 163 to 153 ng/mL, whereas in the placebo group these levels went from 165 to 132 ng/mL, also a statistically significant difference (P = .024).
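The relative stability in the treated group can be made explicit with a quick back-of-the-envelope calculation; the baseline and week-52 means are copied from the figures quoted above:

```python
def pct_change(start, end):
    """Percent change from baseline to week 52."""
    return 100 * (end - start) / start

# CSF Abeta40 (ng/mL): treated 6574 -> 6513, placebo 6560 -> 5622
# Plasma Abeta40 (ng/mL): treated 163 -> 153, placebo 165 -> 132
for label, start, end in [("CSF, resveratrol", 6574, 6513),
                          ("CSF, placebo", 6560, 5622),
                          ("plasma, resveratrol", 163, 153),
                          ("plasma, placebo", 165, 132)]:
    print(f"{label}: {pct_change(start, end):.1f}%")
```

The treated group's CSF Aβ40 fell by under 1% over the year versus roughly 14% with placebo, which is the "stabilization" the investigators describe (the arithmetic, of course, says nothing about statistical significance, which came from the trial's own analysis).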

“We can’t prove efficacy from this trial, but we’re looking for some movement in biomarkers, and we actually found that,” which is promising, said Dr Turner. “The major movement we found was in amyloid proteins in blood and CSF that were stabilized by resveratrol treatment compared to placebo, where it trended downhill, which is what happens with Alzheimer’s disease.”

This downhill trend could signal more amyloid being deposited into the brain. In contrast, that resveratrol seemed to stabilize Aβ40 in the CSF and plasma suggests the drug was able to penetrate the blood–brain barrier.

Unfortunately, said Dr Turner, the study could not fund amyloid positron emission tomography scans, which might have shed more light on the Aβ status of subjects.

Although there were no significant effects of the treatment on other amyloid biomarkers, including CSF and plasma Aβ42, trends were similar to the findings with Aβ40.

There was no difference in CSF tau. There was a trend toward an increase in CSF phospho-tau 181 with treatment (P = .08) and in secondary analysis of mild dementia (P = .047).

As for brain volume determined through magnetic resonance imaging (MRI), results showed that volumes declined more in the treatment group (going from 866 to 839 mL) than in the placebo group (going from 850 to 840 mL). This result was “mysterious” and “unexpected,” said Dr Turner.

However, he noted that the same effect has been reported in other AD trials, including those investigating immunotherapy. “The working hypothesis is that by treating AD, we are also decreasing the amount of inflammation and swelling in the brain.”

The study showed no significant effects on the mini mental state exam or on other clinical scales, but the researchers note that the phase 2 trial was not powered to detect differences in clinical outcomes.

However, they did find that scores on the activities of daily living scale declined less in the resveratrol group than in the placebo group. “That’s also promising, because even with this phase 2, we are seeing what we think might be a clinical benefit,” said Dr Turner.

A total of 657 adverse events were reported (355 in the treatment and 302 in the placebo groups), most of which were mild. The most common adverse events were gastrointestinal-related and included nausea and diarrhea.

Weight Loss

The placebo group gained about 1 pound of body weight, whereas the treated group lost almost 2 pounds. At the end of the study, the mean body mass index in the placebo group was 26.1, and in the treated group, it was 25.4.

“The weight loss is concerning, because Alzheimer disease itself causes weight loss, and we don’t want people to continue to lose weight,” said Dr Turner.

It is not clear whether the weight loss was a result of the adverse effects of diarrhea, nausea, and so on, or because of some metabolic effect.

Interestingly, six of the seven new neoplasms seen in study participants occurred in those taking placebo. This is of great interest to cancer researchers, said Dr Turner, adding that resveratrol and similar compounds are being tested in many age-related disorders, including diabetes and neurodegenerative disorders, as well as AD and cancer.

None of the 36 serious adverse events (19 on the drug and 17 on placebo), including three deaths, were deemed to be related to treatment.

Commenting on this study for Medscape Medical News, James Hendrix, PhD, director, Global Science Initiatives, Alzheimer’s Association, said that although the finding that resveratrol might stabilize Aβ40 is encouraging, the study needs to be followed up with a larger and longer phase 3 trial.

“The main focus of this study, and the main question it addressed, was whether a dose at such a high level is safe, and with the exception of some [gastrointestinal] discomfort for some people, it appears to be mostly safe.”

Dr Hendrix noted that the high dose used in the study is equivalent to the amount of resveratrol in roughly 1000 bottles of red wine.

He pointed out that the study was relatively small, with 56 subjects completing the study in the treatment group, and only 48 in the placebo group.

The research was supported by a grant from the National Institute on Aging. Dr Turner reports no personal financial interests related to the study. Dr Hendrix is an employee of the Alzheimer’s Association, which has funded resveratrol grants in the past, but did not fund this study.

Neurology. Published online September 11, 2015. Full text


A randomized, double-blind, placebo-controlled trial of resveratrol for Alzheimer disease

ABSTRACT

Objective: A randomized, placebo-controlled, double-blind, multicenter 52-week phase 2 trial of resveratrol in individuals with mild to moderate Alzheimer disease (AD) examined its safety and tolerability and effects on biomarker (plasma Aβ40 and Aβ42; CSF Aβ40, Aβ42, tau, and phospho-tau 181) and volumetric MRI outcomes (primary outcomes) and clinical outcomes (secondary outcomes).

Methods: Participants (n = 119) were randomized to placebo or resveratrol 500 mg orally once daily (with dose escalation by 500-mg increments every 13 weeks, ending with 1,000 mg twice daily). Brain MRI and CSF collection were performed at baseline and after completion of treatment. Detailed pharmacokinetics were performed on a subset (n = 15) at baseline and weeks 13, 26, 39, and 52.

Results: Resveratrol and its major metabolites were measurable in plasma and CSF. The most common adverse events were nausea, diarrhea, and weight loss. CSF Aβ40 and plasma Aβ40 levels declined more in the placebo group than in the resveratrol-treated group, resulting in a significant difference at week 52. Brain volume loss was increased by resveratrol treatment compared to placebo.

Conclusions: Resveratrol was safe and well-tolerated. Resveratrol and its major metabolites penetrated the blood–brain barrier to have CNS effects. Further studies are required to interpret the biomarker changes associated with resveratrol treatment.

Classification of evidence: This study provides Class II evidence that for patients with AD, resveratrol is safe, well-tolerated, and alters some AD biomarker trajectories. The study is rated Class II because more than 2 primary outcomes were designated. Neurology® 2015;85:1–9

Caloric restriction prevents aging-dependent phenotypes1 and activates sirtuins (including SIRT1), a highly conserved family of deacetylases that are regulated by NAD+/NADH and thus link energy metabolism to gene expression.2 SIRT1 substrates include FOXO and PGC-1α.3 A screen of SIRT1 activators identified resveratrol (trans-3,4′,5-trihydroxystilbene) as a potent compound.4 Similar to caloric restriction,5,6 resveratrol decreases aging-dependent cognitive decline and pathology in Alzheimer disease (AD) animal models.7,8

Resveratrol is under investigation to prevent age-related disorders including cancer, diabetes mellitus, and neurodegeneration.4,9–12 Due to its low bioavailability but high bioactivity,13,14 we increased the dose to the maximal amount considered safe and well-tolerated for this study.15 We conducted a randomized, placebo-controlled, double-blind, multicenter 52-week phase 2 trial of resveratrol in individuals with mild to moderate AD. The primary objectives were to (1) assess the safety and tolerability of resveratrol; (2) assess effects on plasma and CSF Aβ42 and Aβ40, CSF tau and phospho-tau 181, and volumetric MRI; and (3) examine pharmacokinetics. The secondary objectives were to (1) explore the effects of resveratrol on cognitive, functional, and behavioral outcomes; (2) examine the influence of APOE genotype; and (3) determine whether resveratrol affects insulin and glucose metabolism. We hypothesized that resveratrol would alter AD biomarker trajectories.

RESULTS A total of 179 participants were screened, of whom 60 were not randomized (50 screen-failed and 10 withdrew consent). Participants (119) were randomized as shown (figure 1). A total of 104 completed the study (12.6% dropout), and 77 completed 2 CSF collections (34% dropout). Eighteen participants discontinued treatment early and 15 discontinued the study. The population was English-speaking, 57% female, and 91% Caucasian.

Safety and tolerability. No differences between the resveratrol- and placebo-treated groups were found on vital signs, physical examinations, or neurologic examinations. Routine laboratory tests were normal. A total of 657 AEs (490 mild, 139 moderate, 28 severe) were reported (355 on drug, 302 on placebo) (table 2). A total of 113 of 119 (95%) participants reported at least 1 AE. The most common AEs were nausea and diarrhea (in 42% of individuals with drug vs 33% with placebo, p = 0.35). Few participants reported nausea and diarrhea—the most likely drug-related AEs—that led to treatment discontinuation, a treatment plateau at a lower dosage, or study discontinuation (figure 1). The placebo group gained 0.54 ± 3.2 kg body weight, while the treated group lost 0.92 ± 4.9 kg (mean ± SD, p = 0.038), resulting in a difference in body mass index (BMI): the treated group’s BMI was 25.4 ± 4.0 vs the placebo group’s 26.1 ± 4.1 at week 52 (mean ± SD, p = 0.047). Thirty-six serious AEs (SAEs) were reported (19 on drug, 17 on placebo), including 27 hospitalizations (14 on drug, 13 on placebo) and 3 deaths (1 on drug, 2 on placebo)—none study drug-related. There were no differences in the proportions of participants who experienced at least one SAE (20.3% on drug, 18.2% on placebo), at least one hospitalization (18.8% drug, 16.4% placebo), or died (1.6% drug, 3.6% placebo). Seven new neoplasms were reported (1 on drug, 6 on placebo, p < 0.048) (table 2). Retrospective review of the brain MRIs of a placebo-enrolled participant with malignant glioma, which resulted in death, revealed that the tumor was present at screening. The two other participant deaths were due to lung melanoma (placebo group) and drowning (drug group).

AD duration from year of symptom onset (mean ± SD): 3.9 ± 2.3 years with resveratrol vs 5.5 ± 2.6 years with placebo (p < 0.001).

Outcomes. At week 52, the treated group's CSF Aβ40 declined from 6,574 ± 2,346 to 6,513 ± 2,279 ng/mL, and from 6,560 ± 2,190 to 5,622 ± 1,736 ng/mL with placebo, resulting in a difference at week 52 (mean ± SD, p = 0.002) (figure 2A). This difference was also found in secondary analyses of study completers (p = 0.002), in the mild dementia subgroup (p = 0.01), and in APOE4 carriers (p = 0.05) and noncarriers (p = 0.01) (table e-2). During the study, the treated group's plasma Aβ40 (figure 2B) declined from 163 ± 58 to 153 ± 54 ng/mL, and from 165 ± 55 to 132 ± 54 ng/mL with placebo (mean ± SD, p = 0.024). Secondary analyses by APOE4 genotype revealed an effect of treatment on plasma Aβ40 in APOE4 carriers (p = 0.04) but not noncarriers (table e-2). There were no effects on CSF Aβ42 or plasma Aβ42 (figure 2, C and D), although trends were similar to Aβ40. There was no difference in CSF tau, and a trend toward an increase in CSF phospho-tau 181 with treatment (p = 0.08) and in a secondary analysis of mild dementia (p = 0.047) (data not shown). Volumetric MRIs revealed that brain volume (excluding CSF, brainstem, and cerebellum) declined more in the treatment group (p = 0.025), with an increase in ventricular volume (p = 0.05) at week 52 (figure 3, A and B). In the treatment group, brain volume decreased from 866 ± 84 to 839 ± 85 mL and ventricular volume increased from 55 ± 24 to 81 ± 24 mL (mean ± SD). With placebo, brain volume decreased from 850 ± 99 to 840 ± 93 mL and ventricular volume increased from 56 ± 19 to 76 ± 25 mL (mean ± SD). Secondary analyses revealed that brain volume declined with treatment in APOE4 carriers (p = 0.02) but not noncarriers (table e-2). Similar results were found with ventricular volume, which increased with treatment in APOE4 carriers (p = 0.05) but not noncarriers. This phase 2 trial (underpowered to detect differences in clinical outcomes) found no significant effects on CDR-SOB, ADAS-cog, MMSE, or NPI.
The drug-treated group's ADCS-ADL declined from 63.7 ± 10.8 to 57.4 ± 12.3, and from 60.5 ± 10.7 to 51.3 ± 14.5 in the placebo group (mean ± SD, p = 0.03), indicating less decline with treatment. No drug effects were found on plasma glucose or insulin metabolism (data not shown). We also analyzed (post hoc) the subset of individuals with CSF Aβ42 < 600 ng/mL at baseline as a proxy of AD amyloid pathology. At week 52, differences between treatment groups persisted for CSF Aβ40 (p = 0.001, total n = 70) and plasma Aβ40 (p = 0.02, n = 83). In this analysis, we also found a treatment effect on CSF Aβ42 (p = 0.02, n = 70) but lost significance in brain volume loss (p = 0.06, n = 83) and ADCS-ADL (p = 0.055, n = 88).

DISCUSSION High-dose oral resveratrol is safe and well-tolerated. The most common AEs were nausea and diarrhea, but rates were similar to placebo. Weight and fat loss with resveratrol are reported in some preclinical studies,4 but human studies are scarce and of shorter duration. A decrease in body fat and a trend toward weight loss were reported in a 26-week trial with 200 mg/day resveratrol in healthy older participants.33 Weight and fat loss may be related to enhanced mitochondrial biogenesis mediated by SIRT1 activation of PGC-1α.4,10,11 Aβ levels declined as dementia advanced. The altered CSF Aβ40 trajectory suggests that the drug penetrated the blood–brain barrier to have central effects. At week 52, the mean CSF levels of resveratrol, 3G-RES, 4G-RES, and S-RES were 3.3%, 0.4%, 0.4%, and 0.3%, respectively, of plasma levels at the same study visit. At the highest dosage, low µM levels of resveratrol and its metabolites were measured in plasma, with corresponding low nM levels found in CSF. Resveratrol has many targets, with some engaged at µM concentrations.4 These findings suggest that a central molecular target may be engaged at nM concentrations. In addition to anti-inflammatory, antioxidant, and anti-Aβ-aggregation activities, putative targets include sirtuin activation with enhanced α-cleavage of amyloid precursor protein34 and promotion of autophagy.35 Further studies of banked CSF, plasma, pellets, DNA, and blood mononuclear cells from participants will examine mechanisms.

Resveratrol treatment increased brain volume loss. This finding persisted when participants with weight loss (table 2) were excluded (data not shown). The etiology and interpretation of the brain volume loss observed here and in other studies are unclear, but the loss is not associated with cognitive or functional decline. In the first human active Aβ immunization trial, antibody responders had greater brain volume loss, and greater volumetric changes were associated with higher antibody titers.36 In the phase 2 bapineuzumab trial, treatment resulted in greater ventricular enlargement, but only in APOE4 carriers.37 In the phase 3 bapineuzumab APOE4 carrier trial and the high-dose noncarrier study, treatment resulted in a trend toward greater brain atrophy.38 Since this phase 2 study lacked consistent changes in clinical outcomes, interpretation of the effects on the trajectories of plasma and CSF Aβ40 and of brain and ventricular volume remains uncertain.

Resveratrol altered levels of CSF Aβ40 (A) and plasma Aβ40 (B) (ng/mL, mean ± SE). Similar but nonsignificant trends were found for CSF Aβ42 (C) and plasma Aβ42 (D) (ng/mL, mean ± SE). Note difference in scales. Sample sizes are indicated.

Resveratrol increased brain volume loss (A, C) (mL, mean ± SE) with a corresponding increase in ventricular volume (B, D) (mL, mean ± SE). Sample sizes are indicated.

This phase 2 study has limitations. It was designed to determine the safety and tolerability of resveratrol and to examine pharmacokinetics. Although some biomarker trajectories were altered, we found no effects of drug treatment on plasma Ab42, CSF Ab42, CSF tau, CSF phospho-tau 181, hippocampal volume, entorhinal cortex thickness, MMSE, CDR, ADAS-cog, NPI, or glucose or insulin metabolism. The altered biomarker trajectories must be interpreted with caution. Although they suggest CNS effects, they do not indicate benefit.

22.  Miniature VHS Solenoid Valves Play Significant Role in the Viability of 3D Bio-Printing of Human Cells  

The rapid development of viable inkjet technology for highly specialised applications, such as printing human cells, continues to generate significant interest. If successful, the realisation of this technology for specialised biological applications, generally known as ‘biofabrication’, has the potential to replace the long established (and often controversial) process of using animals for testing new drugs. However, there are many challenges to overcome to enable the successful production of a valve-based cell printer for the formation of human embryonic stem cell spheroid aggregates. For example, printing techniques need to be developed which are both controllable and less harmful to the process of preserving human cell tissue viability and functions.

One particular cell printing project at an advanced stage, which has benefitted from Lee Products' miniature VHS solenoid valves and nozzles, is the result of pioneering work at Edinburgh's Heriot-Watt University. Dr Will Shu at the University's Biomedical micro-engineering Group and his colleagues, including Alan Faulkner-Jones, a bioengineering PhD student, have successfully developed a bio-printer which has been demonstrated at the 3D Print show in London. Also involved in the development of the bio-printer are specialists at Roslin Cellab in Midlothian, a leading stem cell technology company.

The valve-based bio-printer has been validated to print highly viable cells in programmable patterns from two different bio-inks, with independent control of the volume of each droplet (with a lower limit of 2 nL, or fewer than five cells per droplet). Human ESCs (embryonic stem cells) were used to make spheroids by overprinting two opposing gradients of bio-ink: one of hESCs in medium and the other of medium alone. The resulting array of uniformly sized droplets with a gradient of cell concentrations was inverted to allow cells to aggregate and form spheroids via gravity.
The resulting aggregates have controllable and repeatable sizes, and consequently they can be made to order for specific applications. Droplets containing between 5 and 140 dissociated cells produced spheroids of 0.25–0.6 mm diameter. The success of the bio-printer demonstrates that a valve-based printing process is gentle enough to maintain stem cell viability, accurate enough to produce spheroids of uniform size, and that printed cells maintain their pluripotency.
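As a rough plausibility check on the "fewer than five cells per droplet" figure, the expected cell count is simply concentration × droplet volume. The suspension concentration below is an assumed illustrative value, not one reported by the project:

```python
# Hypothetical back-of-the-envelope check of the cells-per-droplet figure.
# The cell suspension concentration is an assumption for illustration.
CELLS_PER_ML = 2.5e6

def cells_per_droplet(droplet_volume_nl, cells_per_ml=CELLS_PER_ML):
    """Expected cell count in one droplet of the given volume (nL)."""
    volume_ml = droplet_volume_nl * 1e-6  # 1 nL = 1e-6 mL
    return cells_per_ml * volume_ml

# At the 2 nL lower limit this gives about 5 cells per droplet,
# consistent with the figure quoted above.
print(cells_per_droplet(2))
```

At this assumed concentration, halving the droplet volume would halve the expected cell count, which is how a printed concentration gradient translates into aggregates of different sizes.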
Looking closer at the design of the bio-printer platform reveals two dispensing systems, each comprising a Lee VHS nanolitre solenoid dispensing valve with a Teflon-coated 101.6 µm internal diameter Lee Minstac nozzle controlled by an Arduino microcontroller. Each dispensing system is attached to a static pressure reservoir for the bio-ink solution to be dispensed via flexible tubing. The dispensing systems and bio-ink reservoirs are mounted within a custom-built enclosure on the tool head of a micrometer-resolution 3-axis 3D printing platform (High-Z S-400, CNC Step) and controlled by a customized CNC controller (based on G540, Geokodrives).

A relatively larger nozzle diameter (compared to the size of the cells that are printed) was selected to reduce the amount of shear stress that could be experienced by the cells during the dispensing process. The bio-ink reservoirs were kept as close as possible to the valves in order to minimise the amount of time it would take to charge the system with bio-ink and to purge it at the end of the experiment. A USB microscope is also included to enable visual inspection of the target substrate during the printing process. Due to the type of deposition system used, a direct line of sight view through the nozzle is not possible and therefore the USB microscope is mounted at an offset angle from the cell deposition system assemblies.
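The shear argument can be made concrete with the standard laminar pipe-flow relation, in which wall shear rate scales as 4Q/(πR³), so a larger bore sharply reduces the shear cells experience. The droplet volume and ejection time below are illustrative assumptions, not figures from the article:

```python
import math

# Why nozzle bore matters: for laminar (Poiseuille) pipe flow the wall
# shear rate scales as 1/R^3 at fixed flow rate. Droplet volume and
# ejection time are assumptions for illustration only.
def wall_shear_rate(flow_m3_s, radius_m):
    """Wall shear rate (1/s) of laminar pipe flow: 4Q / (pi R^3)."""
    return 4 * flow_m3_s / (math.pi * radius_m**3)

Q = 2e-12 / 0.5e-3               # assume a 2 nL droplet ejected in ~0.5 ms
for d_um in (50.8, 101.6):       # a hypothetical half-size bore vs the 101.6 um bore
    rate = wall_shear_rate(Q, d_um * 1e-6 / 2)
    print(f"bore {d_um} um -> wall shear rate ~{rate:.0f} 1/s")
```

Halving the bore diameter raises the wall shear rate eightfold at the same flow rate, which is the quantitative reason for choosing a nozzle much wider than the cells.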
Commenting on the development of the bio-printer and the vital role played by Lee Product’s VHS solenoid valves, Dr Will Shu at Heriot-Watt University said: “Printing living cells is extremely challenging and to the best of our knowledge, this is the first time that these cells have been 3D printed. The technique will allow us to create more accurate human tissue models which are essential to in-vitro drug development and toxicity testing and since the majority of drug discovery is targeting human disease, it makes sense to use human tissues.
“The development of the bio-printer has taken many years of effort and we are very pleased with the performance of Lee’s VHS solenoid valves; they are a vital component within the bio-printer printhead and we recommend them to our colleagues working on similar projects.”
Dr Shu added, “We also acknowledge the support and interaction from our contacts at Lee Products which has helped us to overcome the challenges of this project.”
This highly specialised application is an excellent example of the performance of Lee’s range of VHS Micro-Dispense Solenoid Valves, which provide precise, repeatable, non-contact dispensing of fluids in the nanolitre to microlitre range. The valves feature a number of port configurations to facilitate quick and convenient connections to Lee’s 062 MINSTAC fittings and press-on tubing. The 062 MINSTAC outlet port can be used with Lee 062 MINSTAC tubing or atomising nozzles. Custom configurations and voltages are also available to suit specific applications.




Posted on September 23, 2015 by Healthinnovations

Ten million Canadians are living with diabetes or pre-diabetes. The Canadian Diabetes Association reports that more than 20 Canadians are newly diagnosed with the disease every hour of every day. It is also the seventh leading cause of death in Canada, with associated health-care costs estimated at nearly $9 billion a year. Type 2 diabetes accounts for 90 per cent of all cases, increasing the risk of blindness, nerve damage, stroke, heart disease and several other serious health conditions.

Insulin secretion from β cells of the pancreatic islets of Langerhans is impaired in type 2 diabetes (T2D). Evidence suggests that metabolic amplification of insulin secretion occurs distally in the secretory pathway, possibly at the calcium-dependent exocytotic site. The regulation and amplification of insulin secretion is therefore an important target for researchers around the world.

Now, researchers from the University of Alberta have identified a new molecular pathway that manages the amount of insulin produced by pancreatic cells, essentially a ‘dimmer’ switch that adjusts how much or how little insulin is secreted when blood sugar increases. The team state that the dimmer appears to be lost in type 2 diabetes; however, it can be restored and ‘turned back on’, reviving proper control of insulin secretion from islet cells of people with type 2 diabetes. The open-access study is published in the Journal of Clinical Investigation.

Previous studies show that the canonical mechanism of glucose-stimulated insulin secretion (involving increases in metabolism-derived ATP, inhibition of KATP channels, and activation of VDCCs) was first introduced more than 30 years ago and remains a cornerstone mechanism for the triggering of insulin secretion. The KATP channel mechanism does not define the entire secretory response, however: multiple metabolic coupling intermediates have been proposed as factors that amplify the secretory response to a Ca2+-dependent exocytotic signal, with the net export of mitochondrial substrates being of great interest.

The current study examined pancreatic islet cells from 99 human organ donors. Results show that the glucose-dependent amplification of exocytosis in human β cells, which is disrupted in type 2 diabetes, requires mitochondrial export of isocitrate, which generates cytosolic NADPH and GSH. These then act through SENP1 to amplify the exocytosis of insulin, thereby controlling glucose homeostasis. The lab then validated these findings in a transgenic animal model.

The researchers state that the discovery is a potential game-changer in type 2 diabetes research, leading to a new way of thinking about the disease and its future treatment. They go on to add that understanding the islet cells in the pancreas that make insulin, how they work, and how they can fail could lead to new ways to treat the disease, delaying or even preventing diabetes.

The team surmise that although the ability to restore and fix the dimmer switch in islet cells has been proven at the molecular level, finding a way to translate those findings into clinical use could yet take decades. Despite this, the group conclude that the findings show an important new way forward.

Source: University of Alberta


Pancreatic islet–specific knockout of Senp1 blunts insulin secretion due to an impaired amplification of exocytosis. Proposed pathway linking mitochondrial export of (iso)citrate, glutathione biosynthesis (blue), and glutathione reduction (orange) pathways to the amplification of insulin exocytosis (yellow). Isocitrate-to-SENP1 signaling amplifies insulin secretion and rescues dysfunctional β cells. MacDonald et al 2015.

24.  Nanotechnology

Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is about 1 to 100 nanometers.

Physicist Richard Feynman, the father of nanotechnology.

Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering.

The ideas and concepts behind nanoscience and nanotechnology started with a talk entitled “There’s Plenty of Room at the Bottom” by physicist Richard Feynman at an American Physical Society meeting at the California Institute of Technology (CalTech) on December 29, 1959, long before the term nanotechnology was used. In his talk, Feynman described a process in which scientists would be able to manipulate and control individual atoms and molecules. Over a decade later, in his explorations of ultraprecision machining, Professor Norio Taniguchi coined the term nanotechnology. It wasn’t until 1981, with the development of the scanning tunneling microscope that could “see” individual atoms, that modern nanotechnology began.

Medieval stained glass windows are an example of how nanotechnology was used in the pre-modern era. (Courtesy: NanoBioNet)

It’s hard to imagine just how small nanotechnology is. One nanometer is a billionth of a meter, or 10⁻⁹ of a meter. Here are a few illustrative examples:

  • There are 25,400,000 nanometers in an inch
  • A sheet of newspaper is about 100,000 nanometers thick
  • On a comparative scale, if a marble were a nanometer, then one meter would be the size of the Earth
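The bullet-point comparisons above can be verified with simple arithmetic (the marble is assumed to be about 1 cm across):

```python
# Sanity-checking the scale comparisons above with plain arithmetic.
NM_PER_M = 1e9

# 1 inch is defined as exactly 0.0254 m.
inch_in_nm = 0.0254 * NM_PER_M
print(f"{inch_in_nm:,.0f} nm per inch")          # 25,400,000 nm

# Scale a ~1 cm marble down to 1 nm: the same scale factor applied
# to 1 m gives roughly the diameter of the Earth (~12,700 km).
scale = 0.01 / 1e-9          # marble diameter / nanometer = 1e7
scaled = 1.0 * scale         # one meter under the same scale factor
print(f"1 m becomes {scaled:,.0f} m (~10,000 km, Earth-sized)")
```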

Nanoscience and nanotechnology involve the ability to see and to control individual atoms and molecules. Everything on Earth is made up of atoms—the food we eat, the clothes we wear, the buildings and houses we live in, and our own bodies.

But something as small as an atom is impossible to see with the naked eye. In fact, it’s impossible to see with the microscopes typically used in high school science classes. The microscopes needed to see things at the nanoscale were invented relatively recently—about 30 years ago.

Once scientists had the right tools, such as the scanning tunneling microscope (STM) and the atomic force microscope (AFM), the age of nanotechnology was born.

Although modern nanoscience and nanotechnology are quite new, nanoscale materials were used for centuries. Alternate-sized gold and silver particles created colors in the stained glass windows of medieval churches hundreds of years ago. The artists back then just didn’t know that the process they used to create these beautiful works of art actually led to changes in the composition of the materials they were working with.

Today’s scientists and engineers are finding a wide variety of ways to deliberately make materials at the nanoscale to take advantage of enhanced properties such as higher strength, lighter weight, increased control of the light spectrum, and greater chemical reactivity compared with their larger-scale counterparts.


Education and workforce development are critical to the advancement of nanotechnology and are encompassed within one of the four goals of the National Nanotechnology Initiative (NNI): “Develop and sustain educational resources, a skilled workforce, and a dynamic infrastructure and toolset to advance nanotechnology.” As new knowledge is created through exploratory research and development, it is a challenge to translate this understanding into the educational system and to the broader public. Over the past fifteen years of the NNI, several activities have made significant contributions in this area: public outreach and informal education by the NSF Nanoscale Informal Science Education Network (NISE Net) through programs such as NanoDays; technician and workforce training through programs such as the NSF Advanced Technological Education Centers, including the Nanotechnology Applications and Career Knowledge (NACK) Network; countless university courses and degree programs; and the emerging incorporation of nanoscience into the K-12 science education standards in states such as Virginia. To build upon this strong foundation, several announcements were made last week at the White House Forum on Small Business Challenges to Commercializing Nanotechnology, including the establishment of a Nano and Emerging Technologies Student Leaders conference, a webinar series focused on providing information for teachers, and a web portal of nanoscale science and engineering educational resources. (See more at: http://www.nano.gov/node/1415)

25.  Antimicrobial film for future implants


(Nanowerk News) The implantation of medical devices is not without risks. Bacterial or fungal infections can occur, and the body’s strong immune response may lead to rejection of the implant. Researchers at Unit 1121 “Biomaterials and Bio-engineering” (Inserm/Strasbourg University) have succeeded in creating a biofilm with antimicrobial, antifungal and anti-inflammatory properties. It may be used to cover titanium implants (orthopaedic prostheses, pacemakers…) to prevent or control post-operative infections. Other frequently used medical devices that cause numerous infectious problems, such as catheters, may also benefit.
These results are published in the journal Advanced Healthcare Materials (“Harnessing the Multifunctionality in Nature: A Bioactive Agent Release System with Self-Antimicrobial and Immunomodulatory Properties”).

26.  Characterizing the forces that hold everything together: UMass Amherst physicists offer new open source calculations for molecular interactions


UMass Amherst physicists, with others, provide a new software tool and database to help materials designers with the difficult calculations needed to predict the magnitude of van der Waals interactions between anisotropic or directionally dependent bodies such as those illustrated, with long-range torques. Though small, these forces are dominant on the nanoscale.

CREDIT: UMass Amherst

As electronic, medical and molecular-level biological devices grow smaller and smaller, approaching the nanometer scale, the chemical engineers and materials scientists devising them often struggle to predict the magnitude of molecular interactions on that scale and whether new combinations of materials will assemble and function as designed.


Amherst, MA | Posted on September 23rd, 2015

This is because the physics of interactions at these scales is difficult, say physicists at the University of Massachusetts Amherst, who with colleagues elsewhere this week unveil a project known as Gecko Hamaker, a new computational and modeling software tool plus an open science database to aid those who design nano-scale materials.

In the cover story in today’s issue of Langmuir, Adrian Parsegian, Gluckstern Chair in physics, physics doctoral student Jaime Hopkins and adjunct professor Rudolf Podgornik on the UMass Amherst team report calculations of van der Waals interactions between DNA, carbon nanotubes, proteins and various inorganic materials, with colleagues at Case Western Reserve University and the University of Missouri who make up the Gecko-Hamaker project team.

To oversimplify, van der Waals forces are the intermolecular attractions between atoms, molecules, and surfaces that control interactions at the molecular level. The Gecko Hamaker project makes available to its online users a large variety of calculations of nanometer-level interactions that help to predict molecular organization and evaluate whether new combinations of materials will actually stick together and work.
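For a feel of the quantities involved, the textbook nonretarded Hamaker-constant expressions below give van der Waals energies and forces for the simplest geometries. This is generic reference physics, not Gecko Hamaker's anisotropic formalism, and the Hamaker constant used is a typical assumed value:

```python
import math

# Textbook nonretarded Hamaker expressions for the simplest geometries
# (NOT the Gecko Hamaker anisotropic formalism described in the article).
def plate_plate_energy(A, D):
    """vdW interaction energy per unit area (J/m^2) between two
    half-spaces separated by D (m), with Hamaker constant A (J)."""
    return -A / (12 * math.pi * D**2)

def sphere_plate_force(A, R, D):
    """vdW force (N) on a sphere of radius R at separation D from a
    flat surface (Derjaguin approximation, valid for D << R)."""
    return -A * R / (6 * D**2)

A = 1e-19  # assumed, typical order of magnitude for condensed media (J)
print(plate_plate_energy(A, D=1e-9))          # a few mJ/m^2, attractive
print(sphere_plate_force(A, R=10e-9, D=1e-9)) # sub-nN, attractive
```

Even with this crude isotropic model, the strong 1/D² dependence shows why these forces dominate at nanometer separations while being negligible at everyday scales.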

In this work supported by the U.S. Department of Energy, Parsegian and colleagues say their open-science software opens a whole range of insights into nano-scale interactions that materials scientists haven’t been able to access before.

Parsegian explains, “Van der Waals forces are small, but dominant on the nanoscale. We have created a bridge between deep physics and the world of new materials. All miniaturization, all micro- and nano-designs are governed by these forces and interactions, as is behavior of biological macromolecules such as proteins and lipid membranes. These relationships define the stability of materials.”

He adds, “People can try putting all kinds of new materials together. This new database and our calculations are going to be important to many different kinds of scientists interested in colloids, biomolecular engineering, those assembling molecular aggregates and working with virus-like nanoparticles, and to people working with membrane stability and stacking. It will be helpful in a broad range of other applications.”

Podgornik adds, “They need to know whether different molecules will stick together or not. It’s a complicated problem, so they try various tricks and different approaches.” One important contribution of Gecko Hamaker is that it includes experimental observations seemingly unrelated to the problem of interactions that help to evaluate the magnitude of van der Waals forces.

Podgornik explains, “Our work is fundamentally different from other approaches, as we don’t talk only about forces but also about torques. Our methodology allows us to address orientation, which is more difficult than simply describing van der Waals forces, because you have to add a lot more details to the calculations. It takes much more effort on the fundamental level to add in the orientational degrees of freedom.”

He points out that their methods also allow Gecko Hamaker to address non-isotropic, or non-spherical and other complex molecular shapes. “Many molecules don’t look like spheres, they look like rods. Certainly in that case, knowing only the forces isn’t enough. You must calculate how torque works on orientation. We bring the deeper theory and microscopic understanding to the problem. Van der Waals interactions are known in simple cases, but we’ve taken on the most difficult ones.”

Hopkins, the doctoral student, notes that as an open-science product, Gecko Hamaker’s calculations and data are transparent to users, and user feedback improves its quality and ease of use, while also verifying the reproducibility of the science.



Janet Lathrop

27.  Researchers have succeeded in creating a biofilm with antimicrobial, antifungal and anti-inflammatory properties. (Image: Inserm / E.Falett)
Implantable medical devices (prostheses, pacemakers) are an ideal interface for micro-organisms, which can easily colonize their surface. As such, bacterial infection may occur and lead to an inflammatory reaction, which may cause the implant to be rejected. These infections are mainly caused by bacteria such as Staphylococcus aureus, originating in the body, and Pseudomonas aeruginosa. They may also be fungal or caused by yeasts. The challenge in implanting medical devices in the body is preventing these infections, which trigger an immune response that compromises the success of the implant. Antibiotics are currently used during surgery or to coat certain implants; however, the emergence of multi-resistant bacteria now restricts their effectiveness.
A biofilm invisible to the naked eye…
It is within this context that researchers at the “Bioengineering and Biomaterials” Unit 1121 (Inserm/Strasbourg University), with four laboratories, have developed a biofilm with antimicrobial and anti-inflammatory properties. The researchers used a combination of two substances, polyarginine (PAR) and hyaluronic acid (HA), to create a film invisible to the naked eye (between 400 and 600 nm thick) that is made of several layers. As arginine is metabolised by immune cells to fight pathogens, it has been used to communicate with the immune system to obtain the desired anti-inflammatory effect. Hyaluronic acid, a natural component of the body, was chosen for its biocompatibility and its inhibiting effect on bacterial growth.
…with embedded antimicrobial peptides,
The film is also unique due to the fact that it embeds natural antimicrobial peptides, in particular catestatin, to prevent possible infection around the implant. This is an alternative to the antibiotics that are currently used. As well as having a significant antimicrobial role, these peptides are not toxic to the body that they are secreted into. They are capable of killing bacteria by creating holes in their cellular wall and preventing any counter-attack on their side.
…on a thin silver coating,
In this study researchers show that poly(arginine), associated with hyaluronic acid, possesses microbial activity against Staphylococcus aureus (S. aureus) for over 24 hours. “In order to prolong this activity, we have placed a silver-coated precursor before applying the film. Silver is an anti-infectious material currently used on catheters and dressings. This strategy allows us to extend antimicrobial activity in the long term” explains Philippe Lavalle, Research Director at Inserm.
…effectively reducing inflammation, preventing and controlling infection
The results of the numerous tests performed on this new film show that it reduces inflammation and prevents the most common bacterial and fungal infections.
On the one hand, researchers demonstrate, through contact with human blood, that the presence of the film on the implant suppresses the activation of inflammatory markers normally produced by immune cells in response to the implant. Moreover, “the film inhibits the growth and long-term proliferation of staphylococcal bacteria (Staphylococcus aureus), yeast strains (Candida albicans) or fungi (Aspergillus fumigatus) that frequently cause implant-related infection,” emphasises Philippe Lavalle.
Researchers conclude that this film may be used in vivo on implants or medical devices within a few years to control the complex microenvironment surrounding implants and to protect the body from infection.
Source: INSERM (Institut national de la santé et de la recherche médicale)

28.  Quantum dots light up under strain


Semiconductor nanocrystals, or quantum dots, are tiny, nanometer-sized particles with the ability to absorb light and re-emit it with well-defined colors. With low-cost fabrication, long-term stability and a wide palette of colors, they have become a building block of display technology, improving the image quality of TV sets, tablets, and mobile phones. Exciting quantum dot applications are also emerging in the fields of green energy, optical sensing, and bio-imaging.

Prospects have become even more appealing after a paper entitled “Band structure engineering via piezoelectric fields in strained anisotropic CdSe/CdS nanocrystals” was published in the journal Nature Communications last July. An international team formed by scientists at the Italian Institute of Technology (Italy), the University Jaume I (Spain), the IBM research lab Zurich (Switzerland) and the University of Milano-Bicocca (Italy) demonstrated a radically new approach to manipulating the light emission of quantum dots.

The traditional operating principle of quantum dots is based on the so-called quantum confinement effect, whereby the particle size determines the color of the emitted light. The new strategy relies on a completely different physical mechanism: a strain-induced electric field inside the quantum dots. It is created by growing a thick shell around the dots, which compresses the inner core and creates an intense internal electric field. This field then becomes the dominant factor in determining the emission properties.
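The conventional size-to-color relationship mentioned above can be sketched with the Brus effective-mass approximation. The CdSe material parameters below are rough literature values (assumptions), and this simple model deliberately ignores the strain-induced piezoelectric field that is the paper's actual subject:

```python
import math

# Illustrative Brus effective-mass estimate of quantum confinement:
# emission energy rises as the dot shrinks. Material parameters are
# approximate literature values for CdSe (assumptions), and the model
# does NOT include the paper's strain-induced piezoelectric field.
HBAR = 1.0546e-34   # reduced Planck constant, J*s
M0   = 9.109e-31    # electron rest mass, kg
Q_E  = 1.602e-19    # elementary charge, C (also J per eV)
EPS0 = 8.854e-12    # vacuum permittivity, F/m

def brus_gap_ev(radius_m, eg_ev=1.74, me=0.13, mh=0.45, eps_r=10.6):
    """Approximate emission energy (eV) of a CdSe dot of given radius."""
    confinement = (HBAR**2 * math.pi**2) / (2 * radius_m**2 * M0) \
                  * (1 / me + 1 / mh)                 # kinetic term
    coulomb = 1.786 * Q_E**2 / (4 * math.pi * EPS0 * eps_r * radius_m)
    return eg_ev + (confinement - coulomb) / Q_E      # J -> eV

for r_nm in (1.5, 2.0, 3.0):
    print(f"R = {r_nm} nm -> ~{brus_gap_ev(r_nm * 1e-9):.2f} eV")
```

Smaller radii give larger gaps (bluer emission), which is exactly the lever the strained-dot approach in the paper replaces with an internal field.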

The result is a new generation of quantum dots whose properties are beyond those enabled by quantum confinement alone. This not only broadens the application scope of the well-known CdSe/CdS material set but also of other materials. “Our findings add an important new degree of freedom to the development of quantum dot-based technological devices,” the researchers say. “For example, the elapsed time between light absorption and emission can be extended to be more than 100 times longer compared to conventional quantum dots, which opens the way towards optical memories and smart pixel new devices. The new material could also lead to optical sensors that are highly sensitive to the electrical field in the environment on the nanometer scale.”

More information: “Band structure engineering via piezoelectric fields in strained anisotropic CdSe/CdS nanocrystals” Nat Commun. 2015 Jul 29; 6:7905. DOI: 10.1038/ncomms8905

Journal reference: Nature Communications

Read more at: http://phys.org/news/2015-09-quantum-dots-strain.html#jCp

29. Turing Reaction-diffusion Model Confirmed



In 1952, the legendary British mathematician and cryptographer Alan Turing proposed a model in which complex patterns form through the chemical interaction of two diffusing reagents. Russian scientists have now shown that the corneal surface nanopatterns of insects across 23 orders fit this model.

Their work is published in the Proceedings of the National Academy of Sciences.

The work was done by a team at the Institute of Protein Research of the Russian Academy of Sciences (Pushchino, Russia) and the Department of Entomology at the Faculty of Biology of the Lomonosov Moscow State University. It was supervised by Professor Vladimir Katanaev, who also leads a lab at the University of Lausanne, Switzerland. Artem Blagodatskiy and Mikhail Kryuchkov selected and prepared the insect corneal samples and analyzed the data, Yulia Lopatina of the Lomonosov Moscow State University served as the expert entomologist, and Anton Sergeev performed the atomic force microscopy.

The initial goal of the study was to characterize the antireflective three-dimensional nanopatterns covering the insect eye cornea with respect to the taxonomy of the studied insects, and to gain insight into their possible evolutionary path.

The result was surprising: pattern morphology did not correlate with an insect's position on the evolutionary tree. Instead, the scientists characterized four main morphological corneal nanopatterns, as well as transition forms between them, found throughout the insect class. Another finding was that all the observed forms of the patterns directly matched the array of patterns predicted by the famous Turing reaction-diffusion model published in 1952, a match the scientists confirmed not by mere observation but by mathematical modeling as well.
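Turing's mechanism is easy to demonstrate in a few lines of code. The sketch below uses the generic Gray-Scott reaction-diffusion system, not the specific equations fitted in the PNAS paper; the parameter values are one classic spot-forming choice, assumed here purely for illustration.

```python
# Minimal two-reagent reaction-diffusion (Turing-type) sketch: the generic
# Gray-Scott model. Two chemicals diffusing at different rates self-organize
# into spots/stripes from a nearly uniform start.
import numpy as np

N = 128
u = np.ones((N, N))    # substrate-like species
v = np.zeros((N, N))   # autocatalyst-like species
# seed a small perturbed square so patterns can nucleate
u[N//2-8:N//2+8, N//2-8:N//2+8] = 0.50
v[N//2-8:N//2+8, N//2-8:N//2+8] = 0.25

Du, Dv, F, k = 0.16, 0.08, 0.035, 0.060   # classic spot-forming parameters

def laplacian(a):
    """5-point Laplacian with periodic boundaries."""
    return (np.roll(a, 1, 0) + np.roll(a, -1, 0) +
            np.roll(a, 1, 1) + np.roll(a, -1, 1) - 4 * a)

for _ in range(5000):           # explicit Euler steps (dt = 1, unit grid)
    uvv = u * v * v
    u += Du * laplacian(u) - uvv + F * (1 - u)
    v += Dv * laplacian(v) + uvv - (F + k) * v

# v now holds a non-uniform spotted field; inspect it with e.g. imshow.
```

Because one species diffuses faster than the other, the uniform state is unstable and spots or stripes emerge spontaneously; it is this qualitative repertoire of patterns that the corneal nanocoatings were matched against.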

The analysis of corneal surface nanopatterns in 23 insect orders has been performed by means of atomic force microscopy with resolution up to single nanometers.

“This method allowed us to drastically expand the previously available data, acquired through scanning electron microscopy; it also made it possible to characterize surface patterns directly, not through analysis of metal replicas. When possible, we always examined corneae belonging to distinct families of one order to get insight into intra-order pattern diversity,” Blagodatskiy said.

The main implication of the work is an understanding of the mechanisms underlying the formation of biological three-dimensional nanopatterns, and the first demonstrated example of a Turing reaction-diffusion model acting in the bio-nanoworld.

Interestingly, the Turing nanopatterning mechanism is common not only to the insect class but also to spiders, scorpions, and centipedes; in other words, it is universal among arthropods. Given the antireflective properties of insect corneal nanocoatings, the revealed mechanisms pave the way for the design of artificial antireflective nanosurfaces.

“A promising future development of the project is planned to be a genetic analysis of corneal nanopattern formation on the platform of the well-studied Drosophila melanogaster (fruit fly) model. Wild-type fruit flies possess a nipple-array-type nanocoating on their eyes,” Blagodatskiy summarized.

Different combinations of overexpressed and underexpressed proteins known to be responsible for corneal development in Drosophila may shift the nipple pattern to another pattern type and thus shed light on the chemical nature of the compounds forming the Turing-type structures on insect eyes. Identifying the proteins and/or other agents responsible for nanopattern formation will be a direct clue to the artificial design of nanocoatings with desired properties. Another direction of project development will be the comparison

Citation: Artem Blagodatski, Anton Sergeev, Mikhail Kryuchkov, Yuliya Lopatina, Vladimir L. Katanaev. Diverse set of Turing nanopatterns coat corneae across insect lineages. Proceedings of the National Academy of Sciences, 2015; 112 (34): 10750. DOI: 10.1073/pnas.1505748112

30.  Germ-free mice gain weight when transplanted with gut microbes from obese humans, in a diet-dependent manner.

By Ed Yong | September 5, 2013

Escherichia coli. WIKIPEDIA

Physical traits like obesity and leanness can be “transmitted” to mice, by inoculating the rodents with human gut microbes. A team of scientists led by Jeffrey Gordon from the Washington University School of Medicine in St. Louis found that germ-free mice put on weight when they were transplanted with gut microbes from an obese person, but not those from a lean person.

The team also showed that a “lean” microbial community could infiltrate and displace an “obese” one, preventing mice from gaining weight so long as they were on a healthy diet. The results were published today (September 5) in Science.

Gordon emphasized that there are many causes of obesity beyond microbes. Still, he said that studies like these “provide a proof-of-principle for ameliorating diseases.” By understanding how microbes and food interact to influence human health, researchers may be able to design effective probiotics that can prevent obesity by manipulating the microbiome.

The human gut is home to tens of trillions of microbes, which play crucial roles in breaking down food and influencing health. Gordon’s group and others have now shown that obese and lean people differ in their microbial communities. Just last week, the MetaHIT consortium showed that a quarter of Danish people studied had a very low number of bacterial genes in their gut—an impoverished state that correlated with higher risks of both obesity and metabolic diseases.

However, descriptive studies like these cannot tell scientists whether such microbial differences are the cause of obesity or a consequence of it. “A lot of correlations are being made between microbe community configurations and disease states, but we don’t know if these are casual or causal,” said Gordon. By using germ-free mice as living laboratories, Gordon and his colleagues aim to start moving “beyond careful description to direct tests of function,” he added.

“It’s extremely exciting and powerful to go from descriptive studies in humans to mechanistic studies in mice,” said Oluf Pedersen, an endocrinologist who was involved in the MetaHIT studies. “That’s beautifully illustrated in this paper.”

Gordon lab graduate student Vanessa Ridaura inoculated the germ-free mice with gut microbes from four pairs of female twins, each in which one person was obese and the other had a healthy weight. Mice that received the obese humans’ microbes gained more body fat, put on more weight, and showed stronger molecular signs of metabolic problems.

Once the transplanted microbes had taken hold in their guts, but before their bodies had started to change, Ridaura housed the two groups of mice together. Mice regularly eat one another’s feces, so these cage-mates inadvertently introduced their neighbors’ microbes to their own gut communities. Gordon called this the “Battle of the Microbiota.”

These co-housing experiments prevented the mice with “obese” microbes from putting on weight or developing metabolic problems, while those with the “lean” microbes remained at a healthy weight.

Gordon explains that the obese microbe communities, being less diverse than the lean ones, leave many “job openings” within the gut—niches that can be filled by the diverse lean microbes when they invade. “And obviously, those job openings aren’t there in the richer, lean gut community,” he said. “That’s why the invasion is one-directional.”

“But if invasion is so robust, why then isn’t there an epidemic of leanness?” asked Gordon. “The answer appears to be, in part, diet.”

In her initial experiments, Ridaura fed the mice standard chow, which is high in fiber and plant matter. She also blended up two new recipes, designed to reflect extremes of saturated fat versus fruit and vegetable consumption associated with Western diets.

If the mice were fed food low in fat and high in fruit and vegetables, Ridaura found the same results as before—the lean microbes could cancel out the effect of the obese ones. But when the mice were fed food low in fruit and vegetables and high in saturated fat, those with obese gut microbes still gained weight, no matter who their neighbors were.

This may be because the best colonizers among the lean communities were the Bacteroidetes—a group of bacteria that are excellent at breaking down the complex carbohydrates found in plant foods. When the mice ate plant-rich diets, the Bacteroidetes could fulfill a metabolic role that was vacant in the obese gut communities. When the mice ate unhealthy, plant-poor diets, “these vacancies weren’t there and the organisms couldn’t establish themselves,” said Gordon.

“We’re now trying to identify particular sets of organisms that can do what the complete community does,” Gordon added. The ultimate goal is to create a set of specific bacteria that could be safely administered as a probiotic that, along with a defined diet, could help these beneficial microbes to establish themselves and might effectively prevent weight gain.

“This study is an inspiration for us at MetaHIT,” said Pedersen. “It would be very interesting to take stools or cultures from extreme cases within our samples—people who have very rich or very poor gut microbiomes—and inoculate them into germ-free mice. . . . Now that we have a proof-of-concept, it’s obvious for us to follow up our findings through these studies.”

V.K. Ridaura et al., “Gut microbiota from twins discordant for obesity modulate metabolism in mice,” Science, doi: 10.1126/science.1241214, 2013.

31. Gut Microbes Treat Illness

Oral administration of a cocktail of bacteria derived from the human gut reduces colitis and allergy-invoked diarrhea in mice.

By Chris Palmer | July 10, 2013

Micrograph of germ-free mouse colon colonized with 17 strains of human-derived Clostridia. Kenya Honda

An astounding array of microorganisms colonizes the human gut; our large intestines alone are home to 10¹⁴ bacteria from more than 1,000 species. Though scientists have long attempted to manipulate these microbial populations to affect health, probiotics have failed to reliably treat disease. However, a new study published today in Nature reports that a blend of specially selected strains of Clostridium bacteria derived from humans can significantly reduce symptoms of certain immune disorders in mice.

“[This work] shows that microbes can influence the balance and architecture of the immune system of their host,” said Sarkis Mazmanian, an immunologist at the California Institute of Technology who did not participate in the research. “I think it has tremendous potential for ameliorating human disease.”

Mammalian gut microbiota—the community of microorganisms that inhabit the gastrointestinal tract—have a long, intimate, and mostly symbiotic history with their hosts. The ubiquitous bugs are integral to some of the most basic of physiological functions, including metabolism and immune system development and function. However, specific gut microbes have also been linked to autoimmune disorders, obesity, inflammatory bowel disease, and possibly even neurological disorders. “It’s clear that gut microbes can affect many, many aspects of our physiology,” said Mazmanian.

Senior author Kenya Honda and his team previously reported that colonization of germ-free mice—mice that lack a microbiota—with a cocktail of a few dozen strains of Clostridium bacteria derived from wild-type mice promoted the activity of regulatory T cells (Treg) in the colon. Treg cells produce important anti-inflammatory immune molecules, including interleukin-10 and inducible T-cell co-stimulator, to prevent an overreaction of the immune system, and disruption of Treg cells is known to play a role in autoimmune disorders such as colitis, Crohn’s disease, food allergies, and type II diabetes. Indeed, mice treated with the Clostridium cocktail appeared more resistant to allergies and intestinal inflammation.

Clostridia include the species that produce the well-known tetanus and botulinum toxins. “Clostridia are very diverse bacteria, and include some pathogens,” said Alexander Rudensky, an immunologist at the Memorial Sloan-Kettering Cancer Center in New York and a cofounder of Vedanta Biosciences, which he launched with the paper's authors in 2010. “So, their role [in disease] may be surprising to immunologists and the public, but not to microbiologists.”

To extend the clinical relevance of the previous results, Honda’s group repeated their experiment using Clostridium derived from a sample of human feces. As in the previous study, germ-free mice treated with specially selected strains of human-derived Clostridia displayed a significant increase in Treg cells. The treated mice also displayed reduced symptoms of colitis and allergy-induced diarrhea.

“This is a terrific advance to their previous studies where they showed that mouse microbiota can induce regulatory T cells,” said Mazmanian. “In this paper they’ve extended that to bacteria that come from humans, which they have tested in mice.”

The researchers used RNA sequencing of gut tissue samples of mice treated with human microbes to identify 17 specific non-virulent strains of Clostridium responsible for the increased production of Treg cells. They then sequenced the metagenomes of human ulcerative colitis patient guts, and found that they tended to carry lower levels of the 17 strains, with 5 out of the 17 showing a statistically significant reduction. “This work lays out the first instance of a rationally designed drug candidate isolated from human microbiota, which can be given to animals to treat autoimmune disease,” said study coauthor Bernat Olle, the chief operating officer of Vedanta Biosciences, which is developing therapies based on the new research.

Investigations into the mechanisms underlying Treg-cell induction pointed to short-chain fatty acids and bacterial antigens that are cooperatively produced by the 17 strains of Clostridium. These short-chain fatty acids and antigens in turn activate a transforming growth factor beta (TGF-β) response that drives Treg-cell differentiation and expansion.

“It’s very valuable to see studies like this one, where detailed analysis of microbial compositions is linked to biology,” said Rudensky.

Atarashi et al., “Treg induction by a rationally selected mixture of Clostridia strains from the human microbiota,” Nature, doi:10.1038/nature12331, 2013.


32.  Foxp3 targets revealed

The first comprehensive — but preliminary — list of Foxp3 targets in mice could provide clues to how the protein helps regulate the immune system

By Chandra Shekhar | January 22, 2007

The first comprehensive catalogue of mouse genes targeted by the transcription factor Foxp3 appears in two papers published in this week’s Nature. The lists from the two studies don’t always match, but the combined findings represent a key step in understanding how the protein helps regulatory T cells maintain immune system tolerance and prevent autoimmune diseases. “The papers provide the first look at relating the transcriptional DNA-binding activity of Foxp3 with specific target genes,” said Fred Ramsdell of ZymoGenetics in Seattle, who was not involved in either study. “This is something the field has been looking to do for the past five years.”

Expressed primarily in regulatory T cells, Foxp3 is essential to both their development and normal function. Loss-of-function Foxp3 mutations in mice and humans result in fatal autoimmune diseases.

A research team led by Alexander Rudensky of the University of Washington in Seattle, with Ye Zheng as first author, used ex vivo T cells from mice with Foxp3 knocked out or tagged with GFP. Using a chromatin immunoprecipitation (ChIP) protocol, the team located nearly 1,300 Foxp3 binding sites on the mouse genome, from which it identified 702 Foxp3-bound genes. “Unlike other transcription factors, Foxp3 binds to only a few sites in the genome,” observed Rudensky. “But its binding results in very efficient changes in gene expression.”

Another study, led by Richard Young of the Whitehead Institute in Cambridge, Mass., and Harald von Boehmer of the Dana-Farber Cancer Institute in Boston, also used ChIP to identify Foxp3 binding sites. Out of more than 1,500 binding sites, they identified 1,119 genes bound by Foxp3. Instead of ex vivo T cells, however, the researchers used T-cell hybridomas transfected with Foxp3. This made it easier to observe the effects of T-cell receptor stimulation, explained the study’s first author, Alexander Marson. “Foxp3 exerts a much stronger influence on its target genes in stimulated cells than in unstimulated cells,” he noted.

Ethan Shevach of the National Institutes of Health in Bethesda, Md., who was not involved in either study, said he preferred the use of normal T cells, as in the Zheng et al. study, to hybridomas. “There is no evidence that the cell [Marson et al.] transfect with Foxp3 is a regulatory T cell,” Shevach said.

Some of the direct targets of Foxp3 identified in the two studies, such as members of the irf family, are transcription factors in their own right, indicating a second layer of regulation mediated by Foxp3. The target lists also include a number of genes for cell surface molecules, such as CD28, and signal transduction, such as Cdc42. “Some of these targets are red herrings,” cautioned Shevach. “Foxp3 may bind to them, but they may have nothing to do with regulatory cell function.”

The results from the two studies differ significantly. For instance, Zheng et al. noted that ctla4, an important T-cell inhibitor, was bound and strongly upregulated by Foxp3, but Marson et al. did not observe this. Conversely, while both studies found that Foxp3 bound to the receptor for IL2, a key player in immune response, only Marson et al. found IL2 itself to be a target. Further, while Zheng et al. determined that Foxp3 activated more genes than it suppressed, Marson et al. came to the opposite conclusion. “What I found most striking was the amount of non-overlap between the two datasets,” said Steve Ziegler of the Benaroya Research Institute in Seattle. “This may reflect the fact that they used two different systems for their chip-on-chip analysis.”

Despite the discrepancies, experts said the studies would be a major help in research into immune tolerance. “Foxp3 is located in the nucleus and is hard to get at,” said Ziegler. “Downstream targets of it may be more accessible and give us more tractable surrogate markers of regulatory T cells.”

Chandra Shekhar cshekhar@the-scientist.com

Links within this article:
Y. Zheng et al., “Genome-wide analysis of Foxp3 target genes in developing and mature regulatory T cells,” Nature, January 2007.
A. Marson et al., “Foxp3 occupancy and regulation of key target genes during T-cell stimulation,” Nature, January 2007. http://www.nature.com
T.P. Toma, “Self-tolerance gene?” The Scientist, January 9, 2003. http://www.the-scientist.com/article/display/20994
M. Greener, “Hot on tolerance’s trail: The hunt for human Foxp3,” The Scientist, May 23, 2005. http://www.the-scientist.com/article/display/15478
F. Ramsdell, “Foxp3 and natural regulatory T cells: Key to a cell lineage?” Immunity, August 2003. http://www.immunity.com/content/article/abstract?uid=PIIS1074761303002073
Alexander Rudensky: http://depts.washington.edu/immunweb/faculty/profiles/rudensky.html
Richard Young: http://jura.wi.mit.edu/young_public/index.html
Harald von Boehmer: http://www.dana-farber.org/res/physician/detail.asp?personID=232&RD=True&group=%28Researcher%29
Ethan Shevach: http://www3.niaid.nih.gov/labs/aboutlabs/li/cellularImmunologySection
Steve Ziegler: http://www.benaroyaresearch.org/investigators/ziegler_steven

33.  Lasker Winners Announced

This year’s prizes honor pioneering work on the unfolded protein response, deep-brain stimulation, and the discovery of cancer-related genes.

By Tracy Vence | September 8, 2014

Kazutoshi Mori (left), Peter Walter (right). ALBERT AND MARY LASKER FOUNDATION

Kazutoshi Mori of Kyoto University in Japan and Peter Walter of the University of California, San Francisco, have won the 2014 Lasker Award for basic medical research. Mori and Walter are being honored by the Albert and Mary Lasker Foundation for their work related to the unfolded protein response, a cellular stress response that has been implicated in several protein-folding diseases.

In its announcement, the foundation said that “Mori and Walter’s work has led to a better understanding of inherited diseases such as cystic fibrosis, retinitis pigmentosa, and certain elevated cholesterol conditions in which unfolded proteins overwhelm the unfolded protein response.”

Three years ago, the Lasker Foundation honored Franz-Ulrich Hartl and Arthur Horwich for their protein-folding work with its 2011 basic research award.

Meanwhile, Alim Louis Benabid of Joseph Fourier University in Grenoble, France, and Mahlon DeLong of the Emory University School of Medicine in Atlanta, Georgia, have won this year’s Lasker-DeBakey Clinical Medical Research Award for their deep-brain stimulation work, which has been used to help restore motor function in patients with advanced Parkinson’s disease.

And the University of Washington’s Mary-Claire King has won the 2014 Lasker-Koshland Special Achievement Award in Medical Science for “bold, imaginative, and diverse contributions to medical science and human rights” related to her work to reunite missing persons or their remains with their families, as well as her discovery of the cancer-related BRCA1 gene locus. In a commentary published in JAMA today (September 8), King and her colleagues advocated for population-based screening for cancer-related genetic variants. “Population-wide screening will require significant efforts to educate the public and to develop new counseling strategies, but this investment will both save women’s lives and provide a model for other public health programs in genomic medicine,” they wrote.

This year’s recipients will receive a $250,000 honorarium per category. The awards will be presented on Friday, September 19, in New York City.

34.  Protein Binding

By Carolyn Bertozzi | October 28, 1996

Edited by: Thomas W. Durso
S.D. Rosen, C.R. Bertozzi, “The selectins and their ligands,” Current Opinion in Cell Biology, 6:663-73, 1994. (Cited in more than 60 publications through April 1996) Comments by Steven D. Rosen, University of California, San Francisco

The selectins are a trio of related proteins involved in leukocyte-endothelium interactions, affecting the ability of leukocytes-that is, white blood cells-to interact with blood vessel walls.

THREEPEAT: The selectins are a threesome of related proteins, says UC-San Francisco’s Steven Rosen.

“One of the novel aspects of the selectins is that they function as carbohydrate-binding receptor molecules-that is, they recognize specific carbohydrate structures as their ligands, or counter-receptors,” Rosen says. “This means that in principle, it’s possible to interrupt the function of selectins by determining what carbohydrates they bind to and providing mimics for those carbohydrates in the form of soluble small molecules, thereby arriving at a new class or classes of anti-inflammatory substances.”

The paper summarizes the three selectins and their physiological functions in leukocyte-endothelium interactions, and describes how they function. First identified at the molecular level in 1989 (L.M. Stoolman, Cell, 56:907-10, 1989), selectins are the topic of this review paper by Steven D. Rosen, a professor in the department of anatomy and program in immunology at the University of California, San Francisco, and Carolyn R. Bertozzi, a former postdoc in Rosen’s lab and now an assistant professor of chemistry at the University of California, Berkeley.

Rosen explains that with leukocytes moving from the blood into tissues, the leukocyte-endothelium interaction is critical to inflammatory reactions.

ONE PLACE: UC-Berkeley’s Carolyn Bertozzi, Rosen’s former postdoc, was coauthor of the review paper.

“In many cases, inflammatory reactions lead to pathological problems,” he points out. “It’s a defense mechanism the body has, but leukocytes being in tissue sites can cause problems as well as be of value to the individual. Leukocytes in tissue sites are protecting the individual from bacterial invasions and foreign substances that the individual wants to eliminate, but leukocytes can have an arsenal of destructive capabilities which can be turned on the individual’s own tissues. So inflammatory reactions have a down side. There are a lot of inflammatory diseases, such as rheumatoid arthritis, multiple sclerosis, lupus, and other autoimmune diseases.”

He concludes: “The interest in the selectins was: Here’s a family of proteins that has involvement in leukocyte-endothelium interactions, therefore here’s a potential set of targets to prevent leukocyte entry into tissues and prevent inflammatory problems.”

Asked for his opinion on why this paper has been cited so much, Rosen replies: “There’s a huge amount of interest in the selectins, because there’s basic cell biology and biochemistry that everybody’s interested in here. . . . There’s a real convergence of the basic science with direct clinical applications. What you do in the lab can have immediate ramifications on the design of anti-inflammatory compounds. There’s tremendous biotech and pharmaceutical company interest in the selectins and their ligands.

“This has been a tremendously hot topic since 1989, and it will be for years to come. Our article put everything down in one place, from the basic cell biology to the clinical connections, and updated the carbohydrate information and ligand identification information in a very accessible way.”

In addition to reviewing the selectins, Rosen states, “the paper deals with what is known about the carbohydrates that the selectins recognize, and what is known about the macromolecules-the ligands-that carry these carbohydrates. What might make the carbohydrates that one selectin recognizes different from the carbohydrates that another selectin molecule might recognize-that is, what is the selectivity of carbohydrate binding among the three selectins?”

The paper also lists the animal models of inflammatory diseases in which selectins have been shown to play an important role, “where antagonism of the selectin leads to beneficial effects, in terms of decreasing damage,” Rosen notes.

Since the publication of this paper, he and Bertozzi have written a second review, updating ligand characterizations (S.D. Rosen, C.R. Bertozzi, Current Biology, 6:261-4, 1996).

“It has a lot more on carbohydrate specificity, and it’s got some new information on how one of the selectins recognizes its ligands,” Rosen notes. “Sulfation is important. At the time of the first review, sulfation was known to be important for the binding of one selectin to its ligands. . . . This review points to the importance of sulfation for the ligand of another selectin. The nature of the sulfation modifications of the ligands are very different for the two selectins.”

Additional LPBI articles:

MIT’s Promise for the MI Patient: A new cardiac patch uses Gold Nanowires to enhance Electrical Signaling between heart cells

Curator: Aviva Lev-Ari, PhD, RN


Nanotechnology and Heart Disease

Author and Curator:  Tilda Barliya PhD


AAAS February 14-18, 2013, Boston: Symposia – The Science of Uncertainty in Genomic Medicine

Reporter: Aviva Lev-Ari, PhD, RN


Robert S. Langer, Massachusetts Institute of Technology

Challenges and Opportunities at the Confluence of Biotechnology and Nanomaterials

Introduction to Tissue Engineering; Nanotechnology applications

Author, editor; Tilda Barliya PhD


Building a Drug-Delivery System(DDS): choice of polymers and drugs

Author: Tilda Barliya PhD


Read Full Post »

Rosa’s to like

Curator & Reporter: Larry H. Bernstein, MD, FCAP



Reality Check: Cancer Experts Discuss Hurdles Facing CAR-T Therapy



September 18th, 2015

There’s a lot of excitement these days about a type of cellular immunotherapy known as CAR-T, a method of modifying people’s immune cells to fight cancer. But you could also fill a book listing all the problems its makers will have to solve—how to test, manufacture, and even define the nature of these cancer-killing cells—before the CAR-T story is a successful one.

These hurdles, not the hype, were the subject of a panel of experts from industry, academia, and the FDA at the Inaugural International Cancer Immunotherapy Conference in New York Thursday afternoon. The panelists included University of Pennsylvania professor Carl June, whose work has led to programs now in clinical testing at Novartis; Adaptimmune executive vice president Gwendolyn Binder-Scholl; and GlaxoSmithKline’s head of immuno-oncology Cedrik Britten, among others.

CAR-T stands for chimeric antigen receptor T cell, which describes an engineered version of the immune system’s attack dogs. CAR-T cells are a patient’s own T cells altered outside the body to be cancer killers, then put back in to go after tumor cells.

CAR-T therapies from Novartis, Juno Therapeutics (NASDAQ: JUNO), and Kite Pharma (NASDAQ: KITE) have produced impressive results so far for certain blood cancers, leading to long-lasting remissions in some patients.

But the field is early in its development. Researchers are trying to figure out how to make these therapies useful for more common cancers, such as lung, breast, and ovarian, and how to mitigate the overactive immune responses they can cause. Biotechs and pharma companies developing autologous therapies—which modify the cells of each individual patient—are wrestling with how to manufacture and distribute them at scale.

But a different, larger question looms, and it gets to the heart of why autologous T cell therapy is truly a new medical frontier. The cells that are delivered back into the patient are not what ends up doing the bulk of the therapeutic work.

The panelists Thursday noted how T cell therapies could throw a wrench into a typical, and crucial, clinical strategy. Early in the clinical testing of a drug, companies usually run what are known as dose escalation studies. Different doses of products are tested, low to high, to establish a trend of responses, see what safety issues pop up, and pick the optimal dose to move forward.

But because CAR-T cell populations expand once they’re put back into a patient’s body, doses are harder to define. What’s more, cranking up the dose of such a powerful therapy could be dangerous. “Are classical trial designs applicable, or do they have to be changed?” asked GSK’s Britten. “You cannot have a simple dose escalation [study] with a drug that replicates.”

Adaptimmune’s Binder-Scholl called for more guidance from regulators to help figure out a more standardized scheme for dose escalation studies.

“I think the biology is going to make [that type of guidance] awfully challenging,” says Juno’s chief financial officer Steve Harr, who also wasn’t on the panel. “I would like to think over time we get into something a bit more predictable, and maybe we have some type of a standard, but we’re very early in this process.”


Medscape Cardiology Black on Cardiology

SPRINT Hypertension Trial: Preliminary Results Discussed


Henry R. Black, MD; William C. Cushman, MD    Disclosures | Sept 18, 2015

Stopped Early for Benefit

Henry R. Black, MD: Hi. I’m Dr Henry Black. I’m adjunct professor of medicine at the Langone New York University School of Medicine, and I’m here today with my long-term friend and colleague, Dr Bill Cushman. Bill, thank you very much for doing this.

William C. Cushman, MD: Delighted to be here.

Dr Black: What I want to talk about is the SPRINT study,[1] which you’ve been a primary participant in. The top-line results were just released. Tell us a little bit about SPRINT: who was in it, what the hypothesis was, and how it compares to the ACCORD study, which you also participated in.

Dr Cushman: Sure. I’m Dr Bill Cushman. I’m from Memphis, Tennessee. And I’m chief of the preventive medicine section at the VA and professor of preventive medicine at the University of Tennessee.

I was a network principal investigator in SPRINT, which meant that I oversaw about a quarter of the sites. SPRINT was a study sponsored by the National Institutes of Health (NIH), primarily the National Heart, Lung, and Blood Institute (NHLBI). But other institutes—the National Institute of Diabetes and Digestive and Kidney Diseases, the National Institute on Aging, the National Institute of Neurological Disorders and Stroke—were also involved.

SPRINT was a study of 9361 participants who were randomized to either a lower, more intensive goal of less than 120 mm Hg systolic blood pressure (SBP) or a goal of less than 140 mm Hg systolic. The latter was considered standard when we designed the study, and all guidelines recommended at least getting below 140 mm Hg.

We recruited a participant pool of high-risk hypertensive patients with SBPs of ≥130 mm Hg. They could be on medications (the majority were), but they didn’t have to be. Participants not only had to have elevated blood pressure, but they also had to be above age 50 and they had to have some other indices of risk: known cardiovascular disease, chronic kidney disease, or being above age 75, for example, or having a Framingham risk assessment for cardiovascular disease of ≥ 15% over 10 years.

They were randomly allocated to these two groups, with the intent of being followed for about 5 years. The primary outcome in SPRINT was a combined cardiovascular outcome that included myocardial infarction (MI), acute coronary syndrome other than MI, stroke, heart failure, or cardiovascular death.

Now, there are a lot of other outcomes in SPRINT, including whether this lower blood pressure goal would prevent dementia, changes on MRI, or chronic kidney disease. Those outcomes have not been announced yet, and we’re still collecting data on them.

The cardiovascular outcomes were viewed as so positive in terms of the benefit that the Data and Safety Monitoring Board recommended to Gary Gibbons, the director of the NHLBI, that the cardiovascular part of the trial—and the intensive intervention in particular—should be stopped and that the investigators and the participants should be unblinded. And that was done.

Dr Black: Were the antihypertensive regimens prescribed, or was it whatever the docs wanted to do?

Dr Cushman: Good point. We actually recommended using the major classes that were proven to be of benefit in cardiovascular outcome trials in hypertension: either thiazide-type diuretics, ACE inhibitors, angiotensin receptor blockers, or calcium blockers. It was primarily those four classes, and they could be combined in whatever way the investigators wanted. We did put a lot of emphasis on using thiazide-type diuretics because of the ALLHAT[2] results.

But the way they could be combined was really up to the investigators. Now, if the participants had known coronary disease or some other indication for a beta-blocker, that could certainly be used. And then other drugs could be added. We had a very large formulary representative of all the major classes of drugs— not only those classes, but also beta-blockers, alpha blockers, aldosterone inhibitors (spironolactone or amiloride, for example).

We had a lot of drugs available. They were predominantly purchased for the study, by NIH. There were only two drugs that were donated by the pharmaceutical companies. The study was entirely funded by NIH.


Dr Black: How is this different from ACCORD?[3]

Dr Cushman: In ACCORD, we had the same two SBP goals: less than 120 mm Hg compared with an SBP of less than 140 mm Hg. However, SPRINT is twice as large as ACCORD.

As you may remember, we did not show a significant benefit for the lower SBP goal for the overall cardiovascular outcome in the ACCORD trial. We did see a significant reduction in stroke of about 40%, but that was a secondary outcome. The primary outcome and mortality were not significantly reduced in ACCORD.

However, ACCORD was about half the size of SPRINT. And even though the ACCORD blood pressure study was done in patients with diabetes, on average, they were probably a little lower-risk than our SPRINT participants because of their somewhat younger age (average age, 62 years), the absence of real chronic kidney disease, and several other reasons.

Even though ACCORD didn’t show a statistically significant benefit, it did show a 12% reduction in the cardiovascular outcome with a confidence interval that could have included up to a 27% benefit.

In contrast, SPRINT was twice as large, with a higher-risk population with an older average age. We excluded people with diabetes because that was being looked at in ACCORD. And we excluded people who’d had a prior stroke because that was being looked at in the SPS3[4] post-stroke study in terms of blood pressure goals.

Despite that, we had a very high-risk population. And what we found was about a one-third reduction in primary cardiovascular events. That was significant.

We also saw, quite importantly, about a 25% reduction in all-cause mortality. That was surprising. The results are quite clear that there’s dramatic benefit in terms of both cardiovascular events and total mortality.

Dr Black: You probably can’t tell us this yet, but what was the blood pressure achieved in the less-than-140 group compared with the less-than-120 group?



Diabetes Drug Empagliflozin Cuts CV Deaths in Landmark EMPA-REG Trial


Lisa Nainggolan

STOCKHOLM ( updated with commentary ) — Patients with type 2 diabetes and established cardiovascular disease receiving the glucose-lowering agent empagliflozin (Jardiance, Boehringer Ingelheim/Lilly), a sodium glucose cotransporter-2 (SGLT-2) inhibitor, were less likely to die than those taking placebo in the large, much-anticipated EMPA-REG OUTCOME study, hailed here as a landmark trial.

The benefit on survival was seen regardless of the cause of death — empagliflozin prevented one in three cardiovascular deaths, with a significant 38% relative risk reduction in cardiovascular mortality, as well as a significant 32% relative reduction in all-cause mortality.
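To make the arithmetic behind these figures concrete, relative risk reduction (RRR) compares event rates between the two trial arms. The numbers below are hypothetical illustrations, not the trial's actual event counts:

```python
def relative_risk_reduction(events_treated, n_treated, events_control, n_control):
    """RRR = 1 - (event rate in treated arm / event rate in control arm)."""
    rate_treated = events_treated / n_treated
    rate_control = events_control / n_control
    return 1.0 - rate_treated / rate_control

# A 38% RRR means the treated arm's event rate is 62% of the control arm's,
# e.g. 62 vs. 100 events in equally sized (hypothetical) arms of 1000:
rrr = relative_risk_reduction(62, 1000, 100, 1000)
print(f"{rrr:.0%}")
```

Note that a relative reduction says nothing by itself about absolute benefit, which also depends on the baseline event rate in the population studied.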

CV death was one component of the primary composite outcome, which also included nonfatal myocardial infarction (MI) or nonfatal stroke. It was the CV mortality benefit, however, that primarily drove the reduction in this end point.

“Empagliflozin is reducing death, the ultimate outcome,” senior author of the study, Silvio Inzucchi, MD, of Yale Diabetes Center, New Haven, Connecticut, told Medscape Medical News. “This is a first in my lifetime — a diabetes drug trial that has shown improved outcomes in high-risk cardiovascular patients.”


Dr Inzucchi was given multiple rounds of applause as he presented the findings of EMPA-REG OUTCOME here at the European Association for the Study of Diabetes (EASD) 2015 Meeting. The study was simultaneously published in the New England Journal of Medicine by a team led by Bernard Zinman, MD, director of the Diabetes Centre, Mount Sinai Hospital, Toronto, Ontario.


Sept 17, 2015   http://dx.doi.org/10.1056/NEJMoa1504720

Type 2 diabetes is a major risk factor for cardiovascular disease,1,2 and the presence of both type 2 diabetes and cardiovascular disease increases the risk of death.3 Evidence that glucose lowering reduces the rates of cardiovascular events and death has not been convincingly shown,4-6 although a modest cardiovascular benefit may be observed after a prolonged follow-up period.7 Furthermore, there is concern that intensive glucose lowering or the use of specific glucose-lowering drugs may be associated with adverse cardiovascular outcomes.8 Therefore, it is necessary to establish the cardiovascular safety of glucose-lowering agents.9


Cohen’s Brain Bits: Let the Sunshine in?

http://www.medpagetoday.com/Blogs/CohensBrainBits/53630    Published: Sep 18, 2015

By Joshua Cohen MD, MPH

Vitamin D is actually not a vitamin at all — it is a group of fat-soluble steroid hormones responsible for a host of important functions in the body. Because it is found only at low levels in most foods other than fish and dairy, vitamin D is primarily synthesized from cholesterol in the skin upon exposure to UVB radiation.

While the discovery of vitamin D nearly a century ago stemmed from its role in calcium homeostasis and metabolism, an abundance of studies in the past decade have demonstrated the critical role vitamin D plays in neuronal development and protection. Indeed, in the past few years, researchers have uncovered an association between vitamin D deficiency and an array of important neurologic diseases.

A study in this week’s JAMA Neurology investigated the relationship between vitamin D levels, as measured in the blood as 25-hydroxyvitamin D, and the rate of cognitive decline in a population of 382 multi-ethnic older adults. Both vitamin D-insufficient (12-20 ng/mL) and vitamin D-deficient (<12 ng/mL) participants demonstrated accelerated cognitive decline in multiple functional domains, especially episodic memory and executive function, the domains most affected in patients with Alzheimer’s dementia.
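The study's cutoffs amount to a simple threshold classification. A minimal sketch follows; the "sufficient" label for levels at or above 20 ng/mL and the exact boundary handling are assumptions for illustration, not the paper's terminology:

```python
def vitamin_d_status(level_ng_ml: float) -> str:
    """Categorize serum 25-hydroxyvitamin D (ng/mL) using the cutoffs
    cited above: below 12 is deficient, 12-20 is insufficient."""
    if level_ng_ml < 12:
        return "deficient"
    if level_ng_ml < 20:          # upper bound treated as exclusive (assumption)
        return "insufficient"
    return "sufficient"           # label for levels >= 20 is an assumption

print(vitamin_d_status(10), vitamin_d_status(15), vitamin_d_status(30))
```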

Previous studies have emphasized the essential role of vitamin D in the brain and have raised concern about the effect of vitamin D deficiency on the brain. Vitamin D’s neuroprotective roles include stimulation of neurotrophin release, neuroimmunomodulation, and interaction with reactive nitrogen and oxygen species. Vitamin D appears to also play a role in neurodevelopment through its regulation of nerve growth factor synthesis. Imaging studies have found increases in white matter hyperintensities and enlarged ventricles in vitamin D deficient study participants.


Channel Molecular Noise to Keep Cells Healthy



Complex networks are noisy, whether they constitute food webs, power grids, or cells. And when networks buzz and crackle beyond normal bounds, bad things can happen: ecosystems can collapse, power grids can leave us in the dark, and cells can tumble into cancerous states.

All these networks are amenable to similar mathematical treatments, says a scientific team at Northwestern University. The team, led by physicist Adilson E. Motter, Ph.D., substantiated this claim by focusing on a particularly difficult biophysical problem: the rational control of cellular behavior. To date, attempts to exert such control have been frustrated by the high dimensionality and noise that are inherent properties of large intracellular networks.

Dr. Motter and his colleagues noted that the response of biological systems to noise has been studied extensively. Yet they also realized that little had been done to exploit noise, or to at least channel it. They hoped to find a way to do so and thereby demonstrate the possibility of preserving or inducing desirable cell states.

Using a newly developed computational algorithm, Dr. Motter and colleagues showed that molecular-level noise can be manipulated to control the networks that govern the workings of living cells—promoting cellular health and potentially alleviating diseases such as cancer. They presented their results September 16 in the journal Physical Review X, in an article entitled, “Control of Stochastic and Induced Switching in Biophysical Networks.”

“Here we present a scalable, quantitative method based on the Freidlin-Wentzell action to predict and control noise-induced switching between different states in genetic networks that, conveniently, can also control transitions between stable states in the absence of noise,” wrote the authors. “We apply this methodology to models of cell differentiation and show how predicted manipulations of tunable factors can induce lineage changes, and further utilize it to identify new candidate strategies for cancer therapy in a cell death pathway model.”

Essentially, by leveraging noise, the team found that the high-dimensional gene regulatory dynamics could be controlled instead by controlling a much smaller and simpler network, termed a “network of state transitions.” In this network, cells randomly transition from one phenotypic state to another—sometimes from states representing healthy cell phenotypes to unhealthy states where the conditions are potentially cancerous. The transition paths between these states can be predicted, as cells making the same transition will typically travel along similar paths in their gene expression.

The team began by using noise to define the most-likely transition pathway between different system states, and connecting these paths into the network of state transitions. By doing so, the researchers could then focus on just one path between any two states, distilling a multidimensional system to a series of one-dimensional interconnecting paths.

Then, using their computational approach, the team identified optimal modifications of experimentally adjustable parameters, such as protein activation rates, to encourage desired transitions between different states.
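The paper's Freidlin-Wentzell machinery is beyond the scope of a news digest, but the underlying phenomenon — noise-induced switching between stable states — can be sketched with a one-dimensional double-well toy model. This is a minimal illustration, not the authors' method; the potential, noise amplitudes, and crossing thresholds here are arbitrary choices:

```python
import math
import random

def simulate_switches(noise=0.35, dt=0.01, steps=200_000, seed=1):
    """Euler-Maruyama integration of dx = -V'(x) dt + noise * dW for the
    double-well potential V(x) = (x^2 - 1)^2 / 4, whose stable states sit
    at x = -1 and x = +1. Returns the number of noise-induced hops
    between the two wells over the simulated trajectory."""
    rng = random.Random(seed)
    x = -1.0                                # start in the left well
    well = -1
    switches = 0
    sqrt_dt = math.sqrt(dt)
    for _ in range(steps):
        drift = -x * (x * x - 1.0)          # -V'(x)
        x += drift * dt + noise * sqrt_dt * rng.gauss(0.0, 1.0)
        if well == -1 and x > 0.5:          # crossed into the right well
            well, switches = 1, switches + 1
        elif well == 1 and x < -0.5:        # crossed back into the left well
            well, switches = -1, switches + 1
    return switches

if __name__ == "__main__":
    # Stronger noise makes barrier crossings exponentially more frequent,
    # which is the lever such a control framework tunes.
    for sigma in (0.25, 0.35, 0.5):
        print(sigma, simulate_switches(noise=sigma))
```

Tuning parameters that reshape the effective barrier between states, rather than fighting the noise, is the intuition behind using fluctuations to steer transitions toward desired cell states.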


Mitochondrial Protein Finding May Allow Scientists to Control Apoptosis



A protein embedded in the surface of mitochondria opens the door to apoptosis, causing cells to experience severe power failures, according to new work by researchers at Temple University School of Medicine. The study, appearing in Molecular Cell, suggests that blocking the door with a small-molecule inhibitor could be key to the treatment of cardiovascular diseases such as heart attack and stroke, where extensive mitochondrial dysfunction and cell death hinder tissue recovery.

The study (“SPG7 Is an Essential and Conserved Component of the Mitochondrial Permeability Transition Pore”), led by Muniswamy Madesh, Ph.D., associate professor in the department of biochemistry, the Cardiovascular Research Center, and the Center for Translational Medicine at Temple University School of Medicine (TUSM), shows that the protein, spastic paraplegia 7 (SPG7), is the central component of the so-called permeability transition pore (PTP), a protein complex in the mitochondrial membrane that mediates necrotic cell death (death caused by cell injury).

The identification of SPG7 marks a major advance in scientists’ understanding of how the PTP affects necrosis. Although first described in 1976, the molecular parts of the pore have eluded discovery. “The only known molecular component of the PTP prior to our discovery of SPG7 was a protein called CypD, which is necessary for pore function,” Dr. Madesh explained.

To identify genes that modulate PTP opening induced by calcium overload or increased levels of reactive oxygen species (ROS), the two primary factors that cause mitochondrial dysfunction and cell death via pore opening, Dr. Madesh’s team devised an RNA interference-based screen in which the activity of each gene under investigation was knocked down, or silenced, to examine its effects on mitochondrial calcium levels.

The researchers began with a panel of 128 different genes but after initial screening narrowed the field to just 14 candidate PTP components. Subsequent experiments showed that the loss of only one of them, SPG7, prevented pore opening.

Much of what is known about the PTP comes from studies of mitochondria in disease. In pathological states, particularly those involving hypoxia, calcium and ROS accumulate within mitochondria, causing them to swell and prompting the PTP to open. Because pore opening disrupts the flow of electrons and protons across the mitochondrial membranes, which normally sustains energy production, it results in a catastrophic drop in cellular energy levels.

In the absence of disease, precisely how the PTP helps to mediate normal cellular physiology remains unclear. According to Dr. Madesh, “Under physiological conditions, SPG7 may function through transient pore openings to release toxic metabolites that have accumulated in mitochondria.” He plans to explore this possibility with knockout animal models.


“See-Through” Brain Developed by Japanese Researchers


Scientists at the RIKEN Brain Science Institute in Japan have developed a new method for creating transparent tissue that can be used to illuminate 3D brain anatomy at high resolutions. Published in Nature Neuroscience, the work showcases the novel technology and its practical importance in clinical science by showing how it has given new insights into Alzheimer’s disease plaques.

“The usefulness of optical clearing techniques can be measured by their ability to gather accurate 3D structural information that cannot be readily achieved through traditional 2D methods,” explains lead scientist Atsushi Miyawaki, M.D., Ph.D. “Here, we achieved this goal using a new procedure, and collected data that may resolve several current issues regarding the pathology of Alzheimer’s disease. While Superman’s x-ray vision is only the stuff of comics, our method, called ScaleS, is a real and practical way to see through brain and body tissue.”

In recent years, generating see-through tissue—a process called optical clearing—has become a goal for many researchers in life sciences because of its potential to reveal complex structural details of our bodies, organs, and cells—both healthy and diseased—when combined with advanced microscopy imaging techniques. Previous methods were limited because the transparency process itself can damage the structures under study.

The original recipe reported by the Miyawaki team in 2011, termed Scale, was an aqueous solution based on urea that suffered from this same problem. The research team spent five years improving the effectiveness of the original recipe to overcome this critical challenge, and the result is ScaleS, a new technique with many practical applications.

“The key ingredient of our new formula is sorbitol, a common sugar alcohol,” notes Dr. Miyawaki. “By combining sorbitol in the right proportion with urea, we could create transparent brains with minimal tissue damage that can handle both fluorescent and immunohistochemical labeling techniques, and the method is even effective in older animals.”

The team has devised several variations of the Scale technique that can be used together. By combining ScaleS with AbScale—a variation for immunolabeling—and ChemScale—a variation for fluorescent chemical compounds—they generated multicolor high-resolution 3D images of amyloid beta plaques in older mice from a genetic mouse model of Alzheimer’s disease developed at the RIKEN BSI by the Takaomi Saido team.

After showing how ScaleS treatment can preserve tissue, the researchers put the technique to practical use by visualizing in 3D the mysterious “diffuse” plaques seen in the postmortem brains of Alzheimer’s disease patients that are typically undetectable using 2D imaging. Contrary to current assumptions, the diffuse plaques proved not to be isolated, but showed extensive association with microglia—mobile cells that surround and protect neurons.


GEN Roundup on Cell-Based Assays for Biological Relevancy

Cell-Based Assay Platforms are Evolving to Meet Diverse Challenges

  • Cell-based assay platforms are evolving to meet diverse challenges—mimicking disease states, preserving signaling pathways, modeling drug responses, and recreating environments conducive to tissue development.

GEN recently interviewed a number of experts on cell-based assay technology to get a sense of the state of the art and to find out where this technology might be most valuable to life sciences research.

  • GEN: What are some of the main challenges that are faced when validating cell-based assays?
  • Dr. Kelly: Considerable challenges come from using systems involving a living organism in the validation of cell-based assays. The characteristics of such systems will likely affect the criteria for validation suitability. These criteria might be specific for primary cells, immortalized cell lines, cancerous cell lines, or cells generated de novo from multipotent stem cells.
  • Chemical reagents are generally well characterized by parameters such as molecular weight, solubility, etc., which are unlikely to change between assays.
  • However, characteristics of primary cells or established cell lines, such as viability, growth phase, proliferation rate, level of metabolism, and even cell size are much more vulnerable.
  • Mr. Trinquet: Beyond developing the right cell-based assay, the main challenge remains the relevancy of the cell model for the target being investigated. Generally, a single assay must also be compatible with a broad variety of cell technologies/models, from engineered cells to more complex models, such as 2D, 3D, microtissue, primary culture, and induced pluripotent stem cell models.
  • This certainly adds some difficulty, given that protein expression levels may differ from one model to another. Also, these assays must generally translate well all along the value chain, from high-throughput screening to late stages of lead op, so that end users do not have to switch between too many assay technologies.
  • Dr. Hsu: Cell-based assays provide more biologically relevant information than biochemical assays for high-throughput screening and ADME/Tox. One challenge in developing and validating cell-based assays is to generate cells that reliably express the drug target and give reproducible results with good Z′ over time.
  • We developed and launched the industry’s first cell-based assays and profiling services for G-protein-coupled receptors. The expression of G-protein-coupled receptors has been worked out, but ion channels are challenging. Another challenge is to make sure the assays and readouts are target specific and predictive, with a good dynamic range and signal-to-noise ratio to differentiate compounds with different potencies and efficacies.
  • Dr. Khimani: Cell-based assays provide a complex and physiologically relevant medium to evaluate the effect of novel therapeutic or modulatory candidates. However, unlike traditional assay formats, cell-based assays introduce a number of challenging factors that must be considered—such as cell type, expression level, stability, and passage viability—when optimizing the assay conditions.
  • In addition, with complex cell-based assay systems, data extraction and signal-to-noise optimization can be time-consuming bottlenecks. Other challenges, particularly with high-content screening, include separate investments in instrumentation, training, data analysis, and data management, all leading to a lower throughput.
  • Dr. Fan: Cell-based assays are model systems, and the most critical challenge facing such assays is how well they reflect real biology. Cell-based assays offer great advantages over biochemical assays because they are conducted in cellular contexts. That said, most of the current cell-based assays use a homogeneous population of cells grown from immortalized cell lines, many of which express target proteins or reporters in excessive, nonphysiological amounts via transient transfection or randomly integrated stable clones. These cell models are far from the actual cellular context in normal or diseased tissue such as a tumor.
  • In addition, phenotypical consequences of an analyte of interest to the cell could reflect a combination of effects that a single cell-based assay would not be able to fully address. These factors impact the validation or correlation of the results of a cell-based assay with a phenotypical consequence, an animal model study, or a clinically relevant finding.
  • Dr. Piper: The most formidable challenge in generating and validating cell-based assays is achieving predictability and translatability. Next-generation re-targeting systems (such as the Jump-In™ platform) have made over-expression of genes, even multigene cassettes, fast, reliable, and easy compared to traditional single-cell cloning.
  • While simple overexpression of a target may be sufficient to drive a primary screen and identify hits, it often lacks a sufficiently complex pathophysiological context to robustly convert hits to lead candidates that are meaningful in clinical trials. These systems have value at early stages, but they would benefit from improvements or secondary screens that can better translate to clinical results.
  • Dr. Payne: The choice of a cell system remains a challenge. Cell lines produce reproducible results, but do not accurately model living systems. Although primary cells are more physiologically relevant, they are inherently variable, making it harder to deliver a robust cell-based assay.
  • Choosing appropriate endpoints can be time consuming: measuring one parameter is not enough to accurately determine the functionality of a drug. The ability to analyze several markers in multiplex assays provides greater information on drug efficacy and toxicity, the latter being important for failing flawed drugs earlier. Finally, once validated offline, assays still require revalidation when transferred to an automated context.
  • GEN: What is more valuable to researchers with respect to cell-based assays: miniaturization or ultra-high throughput?
  • Dr. Kelly: A single cell contains the complete genome of the species and thousands of expressed genes, implying that one cell could provide the same information as millions. High-throughput efforts should be aimed at our ability to multiplex, multivisualize, and microarray the enormous amount of information that one cell can provide.
  • Mr. Trinquet: Miniaturization may be more important because the cells that are used are more complex and costly to produce massively. It becomes particularly important when several assays need to be run in parallel using the same sample, such as cell lysate after stimulation.


Dana-Farber Researchers Use Gene Editing to Short-Circuit Sickle Cell Disease

Sep 16, 2015

a GenomeWeb staff reporter

NEW YORK (GenomeWeb) – Scientists have developed a gene editing strategy that could help treat sickle cell disease by short-circuiting the mutated hemoglobin causing the disease.

“We’ve now targeted the modifier of the modifier of a disease-causing gene,” Stuart Orkin, chairman of pediatric oncology at Dana-Farber Cancer Institute and associate chief of hematology/oncology at Boston Children’s Hospital, said in a statement. “It’s a very different approach to treating disease.”

Using CRISPR/Cas9 gene editing tools to systematically excise stretches of a promoter region of the enhancer gene BCL11A — which selects the type of hemoglobin that blood cells create — the researchers found an edit that inactivated BCL11A in human blood stem cells. The cut leads cells to increase levels of fetal hemoglobin, resulting in a milder form of sickle cell disease.

The scientists, led by Orkin and Daniel Bauer of Dana-Farber and Boston Children’s, and Feng Zhang of the Broad Institute, published their study today in Nature.

The human genome codes for both a fetal version and an adult version of hemoglobin. A mutation in the adult version of the protein causes sickle cell disease. BCL11A became a target of sickle cell disease research after Orkin’s laboratory revealed its direct role in the transition from fetal to adult hemoglobin in a 2009 study published in Nature. In 2013, a study led by Orkin and Bauer found the promoter region which controls expression of BCL11A in red blood cells.


Musical Scales


The quest to document an ancient sea creature reveals a cyclical chorus of fish songs.

By Kerry Grens | September 1, 2015




Several years ago, ichthyologist Eric Parmentier met a French marine biologist and filmmaker, Laurent Ballesta, who was organizing an expedition to South Africa to produce a documentary film on the coelacanth. This ancient fish—one whose fossil record dates back at least 350 million years—has an almost mythical legacy. Although it was widely assumed to have gone extinct 65 million years ago, a live specimen was found in 1938, and scientists have identified two extant species of coelacanth. Both species move in a peculiar way, waggling four lobe-like fins in an alternating pattern, as we do our arms and legs. Their anatomy is also unusual: a tiny brain, a joint at the back of the head that allows the animal to open its jaws widely, and only rudimentary vertebrae. Ballesta’s trip inspired Parmentier, who studies fish acoustics, to collaborate with the team. “I hoped to be the first guy to record [sounds of] the coelacanth.”

In the spring of 2013, divers successfully planted a hydrophone inside the cave and also shot video footage of a coelacanth. (The resulting documentary by Ballesta is available on YouTube. Although it is in French, the footage obviates the need for fluency to enjoy the film.) Day and night, for weeks, the hydrophone dutifully recorded the sounds within the cave. When Parmentier retrieved the files and went to analyze the recordings, there was one big problem: they were filled with dozens of different fish calls. “Maybe the coelacanth is in these sound files, but it’s completely masked by the other sounds,” he says.

Nonetheless, the tape captured ceaseless, never-before-heard chatter among the aquatic organisms within the cave (PNAS, 112:6092-97, 2015). To make some sense of it, Parmentier’s team undertook the laborious task of characterizing the sounds recorded over 19 nonconsecutive days (to make this feasible, the group pared down its analysis to the first nine minutes of every hour). The researchers assigned more than 2,700 sounds to 17 groups, most of which sounded to Parmentier like fish (one group was clearly dolphin, based on its high frequency, he says). These included frog-like croaks, grunts that sounded like a creaking door, a moan, and one that sounded like a whistle blown under water. “It’s fair to say, based on the characteristics of the sounds they were hearing, they are probably fish sounds,” says Erica Staaterman, a postdoc at the Smithsonian who studies fish acoustic communication.


NIH Awards Beth Israel Team $3M to Continue Study of Heart Disease Biomarker  Sep 17, 2015

By a GenomeWeb staff reporter

NEW YORK (GenomeWeb) – The National Institutes of Health has awarded a Beth Israel Deaconess Medical Center (BIDMC) research team $3 million in funding to support the second phase of an effort to identify microRNAs that can be used to predict clinical outcomes of heart disease patients.

The grant, which was awarded under the NIH’s Extracellular RNA Communication program, follows a $4 million award the group received to kick off the project in 2013.

To date, the team has identified a number of miRNA biomarker candidates including miR-30d, which the researchers reported earlier this year as a predictor of beneficial cardiac remodeling in patients following a heart attack and a key player in preventing cell death.

With the latest grant, the investigators aim to validate miR-30d and other candidate miRNAs in several large patient cohorts.

microRNA-based tests



Unraveling determinants of transcription factor binding outside the core binding site

Michal Levo, Einat Zalckvar, Eilon Sharon, Ana Carolina Dantas Machado, Yael Kalma, Maya Lotam-Pompan, Adina Weinberger, Zohar Yakhini, Remo Rohs and Eran Segal. “Unraveling determinants of transcription factor binding outside the core binding site”. Genome Res. July 2015 25: 1018-1029.


Binding of transcription factors (TFs) to regulatory sequences is a pivotal step in the control of gene expression. Despite many advances in the characterization of sequence motifs recognized by TFs, our ability to quantitatively predict TF binding to different regulatory sequences is still limited. Here, we present a novel experimental assay termed BunDLE-seq that provides quantitative measurements of TF binding to thousands of fully designed sequences of 200 bp in length within a single experiment. Applying this binding assay to two yeast TFs, we demonstrate that sequences outside the core TF binding site profoundly affect TF binding. We show that TF-specific models based on the sequence or DNA shape of the regions flanking the core binding site are highly predictive of the measured differential TF binding. We further characterize the dependence of TF binding, accounting for measurements of single and co-occurring binding events, on the number and location of binding sites and on the TF concentration. Finally, by coupling our in vitro TF binding measurements, and another application of our method probing nucleosome formation, to in vivo expression measurements carried out with the same template sequences serving as promoters, we offer insights into mechanisms that may determine the different expression outcomes observed. Our assay thus paves the way to a more comprehensive understanding of TF binding to regulatory sequences and allows the characterization of TF binding determinants within and outside of core binding sites.
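As a loose illustration of the kind of modeling the abstract describes, one could regress a measured binding signal on simple sequence features of the flanks. This is not the authors' BunDLE-seq analysis; every sequence, binding value, and parameter below is invented for demonstration:

```python
# Illustrative sketch (not the authors' pipeline): model differential
# TF binding as a linear function of one-hot mononucleotide features
# in the regions flanking a fixed core site. All data are made up.
import numpy as np

def flank_features(seq):
    """One-hot encode a flanking sequence into a flat feature vector."""
    lut = {"A": 0, "C": 1, "G": 2, "T": 3}
    x = np.zeros((len(seq), 4))
    for i, base in enumerate(seq):
        x[i, lut[base]] = 1.0
    return x.ravel()

flanks = ["ACGTAC", "TTGCAA", "ACGTAA", "GGGCCC"]  # hypothetical flanks
binding = np.array([1.8, 0.4, 1.6, 0.9])           # hypothetical signal

X = np.stack([flank_features(s) for s in flanks])
# Ridge regression in closed form: w = (X^T X + lam*I)^-1 X^T y
lam = 0.1
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ binding)
predicted = X @ w
```

The paper's models additionally use DNA shape features of the flanks, which a sketch like this would swap in for the one-hot encoding.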


Defective Mitochondria Transform Normal Cells into Tumors


An international research team reports that defects in mitochondria play a key role in the transition from normal cells to cancerous ones. When the scientists disrupted a key component of the mitochondria, otherwise normal cells took on characteristics of cancerous tumor cells.

Their study (“Disruption of cytochrome c oxidase function induces the Warburg effect and metabolic reprogramming”) is published in Oncogene and was led by members of the lab of Narayan G. Avadhani, Ph.D., the Harriet Ellison Woodward Professor of Biochemistry in the department of biomedical sciences in the school of veterinary medicine at the University of Pennsylvania. Satish Srinivasan, Ph.D., a research investigator in Dr. Avadhani’s lab, was the lead author.

In 1924, German biologist Otto Heinrich Warburg observed that cancerous cells consumed glucose at a higher rate than normal cells and had defects in their grana, the organelles that are now known as mitochondria. He postulated that the mitochondrial defects led to problems in the process by which the cell produces energy, called oxidative phosphorylation, and that these defects contributed to the cells becoming cancerous.

“The first part of the Warburg hypothesis has held up solidly in that most proliferating tumors show high dependence on glucose as an energy source and they release large amounts of lactic acid,” said Dr. Avadhani. “But the second part, about the defective mitochondrial function causing cells to be tumorigenic, has been highly contentious.”

Read Full Post »

Overview of New Strategy for Treatment of T2DM: SGLT2 Inhibiting Oral Antidiabetic Agents


Author and Curator: Aviral Vatsa, PhD, MBBS

Type 2 diabetes mellitus (T2DM) is a chronic disease affecting populations across the globe in epidemic proportions 1. It is characterised by hyperglycemia, which, if not controlled adequately, eventually leads to microvascular and metabolic complications (Fig 1). Traditionally, T2DM management includes lifestyle modification, oral hypoglycemic agents and/or insulin. The present pharmacological approaches predominantly target glucose metabolism by compensating for reduced insulin secretion and/or insulin action. However, these approaches are often limited by inadequate glucose control and the possibility of severe adverse effects such as hypoglycemia, weight gain, nausea and, occasionally, lactic acidosis 2–4 (Fig 1). Hence the search for new drugs with a different mechanism of action and few side effects is key to providing better glycemic control in T2DM patients, and thereby a better prognosis with reduced morbidity and mortality.

Figure 1 (credit: aviral vatsa): Short overview of Type 2 diabetes mellitus (T2DM): complications, present therapeutic approaches and their limitations.

Along with the pancreas, the kidneys play a vital role in regulating plasma glucose levels. Under physiological conditions, the kidneys reabsorb about 99% of the glucose filtered through the renal glomeruli. The majority (80–90%) of this renal glucose reabsorption is mediated by the sodium glucose co-transporter 2 (SGLT2) 5,6. SGLT2 is a high-capacity, low-affinity transporter located mainly in the S1 segment of the proximal convoluted tubule 6. Inhibition of SGLT2 activity can thus induce glucosuria, which in turn can lower blood glucose levels without targeting the insulin-resistance and insulin-secretion pathways of glucose modulation (Fig 2).

Figure 2 (credit: aviral vatsa): Schematic overview of regulation of plasma glucose by sodium glucose co-transporter (SGLT).
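The renal numbers above can be sanity-checked with back-of-the-envelope arithmetic. The GFR and plasma glucose figures below are typical textbook reference values, not taken from the cited studies:

```python
# Back-of-the-envelope sketch of the renal glucose handling described
# above. Input values are typical textbook figures, chosen only for
# illustration.

GFR_L_PER_DAY = 180.0          # glomerular filtration rate, ~180 L/day
PLASMA_GLUCOSE_G_PER_L = 1.0   # ~5.5 mmol/L, i.e. ~1 g/L

filtered = GFR_L_PER_DAY * PLASMA_GLUCOSE_G_PER_L   # ~180 g glucose/day
reabsorbed = 0.99 * filtered                        # kidneys reclaim ~99%
via_sglt2 = 0.85 * reabsorbed                       # ~80-90% via SGLT2

print(f"filtered:   {filtered:.0f} g/day")
print(f"reabsorbed: {reabsorbed:.0f} g/day, "
      f"of which ~{via_sglt2:.0f} g/day via SGLT2")
```

Blocking even part of that SGLT2-mediated flux therefore diverts tens of grams of glucose per day into the urine, which is the basis of the glucosuric effect described above.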

Thus inhibition of SGLT2 provides a novel way to modulate blood glucose levels and consequently limit the long-term complications of hyperglycemia 7,8. Moreover, because SGLT2 is almost exclusively expressed in the kidneys, SGLT2 inhibitors selectively target renal glucose transport and spare the counter-regulatory hormones involved in glucose metabolism. This mode of glucose modulation is likely to avoid the severe side effects, e.g. hypoglycemia and weight gain, seen with present antidiabetic pharmacological agents.

Agents currently under development

The table below gives an overview of the SGLT2 inhibitors in development.

(Credit: Chao et al 2010)


In summary, increasing urinary glucose excretion represents a new approach to addressing the challenge of hyperglycaemia. SGLT2 inhibitors may have indications both in the prevention and treatment of T2DM, and perhaps T1DM, with a possible application in obesity. Further studies in large numbers of human subjects are necessary to delineate efficacy, safety and how to most effectively use these agents in the treatment of diabetes.


  1. Diabetes Atlas. International Diabetes Federation (2009), at <www.diabetesatlas.org>
  2. Intensive blood-glucose control with sulphonylureas or insulin compared with conventional treatment and risk of complications in patients with type 2 diabetes (UKPDS 33). UK Prospective Diabetes Study (UKPDS) Group. Lancet 352, 837–853 (1998).
  3. Buse, J. B. et al. Effects of exenatide (exendin-4) on glycemic control over 30 weeks in sulfonylurea-treated patients with type 2 diabetes. Diabetes Care 27, 2628–2635 (2004).
  4. Inzucchi, S. E. Oral antihyperglycemic therapy for type 2 diabetes: scientific review. JAMA 287, 360–372 (2002).
  5. Brown, G. K. Glucose transporters: Structure, function and consequences of deficiency. Journal of Inherited Metabolic Disease 23, 237–246 (2000).
  6. Wright, E. M. Renal Na+-glucose cotransporters. Am J Physiol Renal Physiol 280, F10–F18 (2001).
  7. Chao, E. C. & Henry, R. R. SGLT2 inhibition — a novel strategy for diabetes treatment. Nature Reviews Drug Discovery 9, 551–559 (2010).
  8. Ferrannini, E. & Solini, A. SGLT2 inhibition in diabetes mellitus: rationale and clinical prospects. Nature Reviews Endocrinology 8, 495–502 (2012).


Read Full Post »
