
Archive for December, 2015

Internet Publishing and Broadcasting and Web Search Portals: Google’s Profile by VB

Reporter: Aviva Lev-Ari, PhD, RN

Company Google News, Employees and Funding Information, Mountain View, CA

 

COMPANY

Google is a multinational corporation specializing in internet-related services and products. The company’s product portfolio includes Google Search, which provides users with access to information online; Knowledge Graph, which lets users search for things, people, or places and underpins systems that recognize speech and understand natural language; Google Now, which provides information to users when they need it; Product Listing Ads, which offer product image, price, and merchant information; AdWords, an auction-based advertising program; AdSense, which enables websites that are part of the Google Network to deliver ads; Google Display, a display advertising network; DoubleClick Ad Exchange, a marketplace for trading display ad space; and YouTube, which offers video, interactive, and other ad formats.

As of October 2, 2015, Google became Alphabet’s leading subsidiary, as well as the parent company for Alphabet’s Internet interests.

EMPLOYEES: >10K (101 featured)
FUNDING: Aug-2004 IPO, $1.67B raised at a $23B valuation
RELATED COMPANIES: 48

Offices

Partner News (1214)

Milestones (1444)

Links (672)

People (100)

Related Companies (48)

Press Releases (5)

Funding –  Investors (10)

Investments/Acquisitions (43)

SOURCE 

http://www.vbprofiles.com/companies/3e122f809e597c1003565d3f#milestones

Read Full Post »

Contribution to Inflammatory Bowel Disease (IBD) of bacterial overgrowth in gut on a chip

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Contributions of microbiome and mechanical deformation to intestinal bacterial overgrowth and inflammation in a human gut-on-a-chip
Gut-On-a-Chip Holds Clues for Treating Inflammatory Bowel Diseases
Greg Watry
Human intestinal epithelial cells cultured in the Wyss Institute’s human gut-on-a-chip form differentiated intestinal villi when cultured in the presence of lifelike fluid flow and rhythmic, peristalsis-like motions. Here the villi are visible using a traditional microscope (left) or a confocal microscope (right); when the same villi are stained with fluorescent antibodies, it clearly reveals the nuclei in the intestinal cells (blue) and their specialized apical membranes when they contact the intestinal lumen (green). Credit: Wyss Institute at Harvard University

Roughly the size of a computer memory stick and made of clear flexible polymer, the human gut-on-a-chip was created by Harvard Univ.’s Wyss Institute in 2012. Three years later, researchers are utilizing the technology in hopes of creating new therapies for inflammatory bowel diseases (IBD).

The Centers for Disease Control and Prevention estimates that between 1 and 1.3 million people suffer from IBD, including such diseases as ulcerative colitis and Crohn’s disease. With origins still mysterious, IBD is currently incurable.

“It has not been possible to study…human intestinal inflammatory diseases, because it is not possible to independently control these parameters in animal studies or in vitro models,” wrote the researchers in Proceedings of the National Academy of the Sciences. “In particular, given the recent recognition of the central role of the intestinal microbiome in human health and disease, including intestinal disorders, it is critical to incorporate commensal microbes into experimental models, however, this has not been possible using conventional culture systems.”

Additionally, static in vitro methods fail to replicate the pathophysiology of human IBD.

But the hollow-channeled microfluidic gut-on-a-chip successfully simulates the human intestine’s physical structure, microenvironment, peristalsis-like motion, and fluid flow.

“With our human gut-on-a-chip, we can not only culture the normal gut microbiome for extended times, but we can also analyze contributions of pathogens, immune cells, and vascular and lymphatic endothelium, as well as model specific diseases to understand the complex pathophysiological responses of the intestinal tract,” said Donald Ingber, founding director of the Wyss Institute.

The device was “used to co-culture multiple commensal microbes in contact with living human intestinal epithelial cells for more than a week in vitro and to analyze how gut microbiome, inflammatory cells, and peristalsis-associated mechanical deformations independently contribute to intestinal bacterial overgrowth and inflammation,” the researchers wrote.

Thus far, use of the device has yielded two interesting observations.

Four proteins, called cytokines, work together to trigger an inflammatory response that injures the bowel, the researchers found. Potentially, this new discovery could lead to the development of treatments that block the cytokine interaction.

The researchers also observed that “by ceasing peristalsis-like motions while maintaining luminal flow, lack of epithelial deformation was shown to trigger bacterial overgrowth similar to that observed in patients with ileus and inflammatory bowel disease.”

The researchers believe the micro-device may one day be applicable to precision medicine. Eventually, a custom treatment may arise from scientists using a patient’s gut microbiota and cells on a human gut-on-a-chip.

 

 

Contributions of microbiome and mechanical deformation to intestinal bacterial overgrowth and inflammation in a human gut-on-a-chip
Hyun Jung Kim, Hu Li, James J. Collins, and Donald E. Ingber
http://www.pnas.org/content/early/2015/12/09/1522193112.full.pdf

A human gut-on-a-chip microdevice was used to coculture multiple commensal microbes in contact with living human intestinal epithelial cells for more than a week in vitro and to analyze how gut microbiome, inflammatory cells, and peristalsis-associated mechanical deformations independently contribute to intestinal bacterial overgrowth and inflammation. This in vitro model replicated results from past animal and human studies, including demonstration that probiotic and antibiotic therapies can suppress villus injury induced by pathogenic bacteria. By ceasing peristalsis-like motions while maintaining luminal flow, lack of epithelial deformation was shown to trigger bacterial overgrowth similar to that observed in patients with ileus and inflammatory bowel disease. Analysis of intestinal inflammation on-chip revealed that immune cells and lipopolysaccharide endotoxin together stimulate epithelial cells to produce four proinflammatory cytokines (IL-8, IL-6, IL-1β, and TNF-α) that are necessary and sufficient to induce villus injury and compromise intestinal barrier function. Thus, this human gut-on-a-chip can be used to analyze contributions of microbiome to intestinal pathophysiology and dissect disease mechanisms in a controlled manner that is not possible using existing in vitro systems or animal models.

 

Significance

The main advance of this study is the development of a microengineered model of human intestinal inflammation and bacterial overgrowth that permits analysis of individual contributors to the pathophysiology of intestinal diseases, such as ileus and inflammatory bowel disease, over a period of weeks in vitro. By studying living human intestinal epithelium, with or without vascular and lymphatic endothelium, immune cells, and mechanical deformation, as well as living microbiome and pathogenic microbes, we identified previously unknown contributions of specific cytokines, mechanical motions, and microbiome to intestinal inflammation, bacterial overgrowth, and control of barrier function. We provide proof-of-principle to show that the microfluidic gut-on-a-chip device can be used to create human intestinal disease models and gain new insights into gut pathophysiology.

 

Various types of inflammatory bowel disease (IBD), such as Crohn’s disease and ulcerative colitis, involve chronic inflammation of human intestine with mucosal injury and villus destruction (1), which is believed to be caused by complex interactions between gut microbiome (including commensal and pathogenic microbes) (2), intestinal mucosa, and immune components (3). Suppression of peristalsis also has been strongly associated with intestinal pathology, inflammation (4, 5), and small intestinal bacterial overgrowth (5, 6) in patients with Crohn’s disease (7) and ileus (8). However, it has not been possible to study the relative contributions of these different potential contributing factors to human intestinal inflammatory diseases, because it is not possible to independently control these parameters in animal studies or in vitro models. In particular, given the recent recognition of the central role of the intestinal microbiome in human health and disease, including intestinal disorders (2), it is critical to incorporate commensal microbes into experimental models; however, this has not been possible using conventional culture systems. Most models of human intestinal inflammatory diseases rely either on culturing an intestinal epithelial cell monolayer in static Transwell culture (9) or maintaining intact explanted human intestinal mucosa ex vivo (10) and then adding live microbes and immune cells to the apical (luminal) or basolateral (mucosal) sides of the cultures, respectively. These static in vitro methods, however, do not effectively recapitulate the pathophysiology of human IBD. For example, intestinal epithelial cells cultured in Transwell plates completely fail to undergo villus differentiation, produce mucus, or form the various specialized cell types of normal intestine. 
Although higher levels of intestinal differentiation can be obtained using recently developed 3D organoid cultures (11), it is not possible to expose these cells to physiological peristalsis-like motions or living microbiome in long-term culture, because bacterial overgrowth occurs rapidly (within ∼1 d) compromising the epithelium (12). This is a major limitation because establishment of stable symbiosis between the epithelium and resident gut microbiome as observed in the normal intestine is crucial for studying inflammatory disease initiation and progression (13), and rhythmical mechanical deformations driven by peristalsis are required to both maintain normal epithelial differentiation (14) and restrain microbial overgrowth in the intestine in vivo (15).

Thus, we set out to develop an experimental model that would overcome these limitations. To do this, we adapted a recently described human gut-on-a-chip microfluidic device that enables human intestinal epithelial cells (Caco-2) to be cultured in the presence of physiologically relevant luminal flow and peristalsis-like mechanical deformations, which promotes formation of intestinal villi lined by all four epithelial cell lineages of the small intestine (absorptive, goblet, enteroendocrine, and Paneth) (12, 16). These villi also have enhanced barrier function, drug-metabolizing cytochrome P450 activity, and apical mucus secretion compared with the same cells grown in conventional Transwell cultures, which made it possible to coculture a probiotic gut microbe (Lactobacillus rhamnosus GG) in direct contact with the intestinal epithelium for more than 2 wk (12), in contrast to static Transwell cultures (17) or organoid cultures (11) that lose viability within hours under similar conditions. In the present study, we leveraged this human gut-on-a-chip to develop a disease model of small intestinal bacterial overgrowth (SIBO) and inflammation. We analyzed how probiotic and pathogenic bacteria, lipopolysaccharide (LPS), immune cells, inflammatory cytokines, vascular endothelial cells, and mechanical forces contribute individually, and in combination, to intestinal inflammation, villus injury, and compromise of epithelial barrier function. We also explored whether we could replicate the protective effects of clinical probiotic and antibiotic therapies on-chip to demonstrate its potential use as an in vitro tool for drug development, as well as for dissecting fundamental disease mechanisms.

 

Fig. 1. The human gut-on-a-chip microfluidic device and changes in phenotype resulting from different culture conditions on-chip, as measured using genome-wide gene profiling. (A) A photograph of the device. Blue and red dyes fill the upper and lower microchannels, respectively. (B) A schematic of a 3D cross-section of the device showing how repeated suction to side channels (gray arrows) exerts peristalsis-like cyclic mechanical strain and fluid flow (white arrows) generates a shear stress in the perpendicular direction. (C) A DIC micrograph showing intestinal basal crypt (red arrow) and villi (white arrow) formed by human Caco-2 intestinal epithelial cells grown for ∼100 h in the gut-on-a-chip under medium flow (30 μL/h) and cyclic mechanical stretching (10%, 0.15 Hz). (Scale bar, 50 μm.) (D) A confocal immunofluorescence image showing a horizontal cross-section of intestinal villi similar to those shown in Fig. 1C, stained for F-actin (green) that labels the apical brush border of these polarized intestinal epithelial cells (nuclei in blue). (Scale bar, 50 μm.) (E) Hierarchical clustering analysis of genome-wide transcriptome profiles (Top) of Caco-2 cells cultured in the static Transwell, the gut-on-a-chip (with fluid flow at 30 μL/h and mechanical deformations at 10%, 0.15 Hz) (Gut Chip), or the mechanically active gut-on-a-chip cocultured with the VSL#3 formulation containing eight probiotic gut microbes (Gut Chip + VSL#3) for 72 h compared with normal human small intestinal tissues (Duodenum, Jejunum, and Ileum; microarray data from the published GEO database). The dendrogram was generated based on the averages calculated across all replicates, and all branches in the cluster have the approximately unbiased (AU) P value equal to 100. The y axis next to the dendrogram represents the metric for Euclidean distance between samples. Corresponding pseudocolored GEDI maps analyzing profiles of 650 metagenes between samples described above (Bottom).

 

Fig. 2. Reconstitution of pathological intestinal injury induced by interplay between nonpathogenic or pathogenic enteroinvasive E. coli bacteria or LPS endotoxin with immune cells. (A) DIC images showing that the normal villus morphology of the intestinal epithelium cultured on-chip (Control) is lost within 24 h after EIEC (serotype O124:NM) are added to the apical channel of the chip (+EIEC; red arrows indicate bacterial colonies). (B) Effects of GFP-EC, LPS (15 μg/mL), EIEC, or no addition (Control) on intestinal barrier function (Left). Right shows the TEER profiles in the presence of human PBMCs (+PBMC). GFP-EC, LPS, and EIEC were added to the apical channel (intestinal lumen) at 4, 12, and 35 h, respectively, and PBMCs were subsequently introduced through the lower capillary channel at 44 h after the onset of experiment (0 h) (n = 4). (C) Morphological analysis of intestinal villus damage in response to addition of GFP-EC, LPS, and EIEC in the absence (−PBMC) or the presence of immune components (+PBMC). Schematics (experimental setup), phase contrast images (horizontal view, taken at 57 h after onset), and fluorescence confocal micrographs (vertical cross-sectional views at 83 h after onset) were sequentially displayed. F-actin and nuclei were coded with magenta and blue, respectively. (D) Quantification of intestinal injury evaluated by measuring changes in lesion area (Top; n = 30) and the height of the villi (Bottom; n = 50) in the absence (white) or the presence (gray) of PBMCs. Intestinal villi were grown in the gut-on-a-chip under trickling flow (30 μL/h) with cyclic deformations (10%, 0.15 Hz) during the preculture period for ∼100 h before stimulation (0 h, onset). Asterisks indicate statistical significance compared with the control at the same time point (*P < 0.001, **P < 0.05). (Scale bars, 50 μm.)

 

Recapitulating Organ-Level Intestinal Inflammatory Responses. During inflammation in the intestine, pathophysiological recruitment of circulating immune cells is regulated via activation of the underlying vascular endothelium. To analyze this organ-level inflammatory response in our in vitro model, a monolayer of human microvascular endothelial cells (Fig. 3 C and D and Fig. S6 A and C) or lymphatic endothelial cells (Fig. S6 B and C) was cultured on the opposite (abluminal) side of the porous ECM-coated membrane in the lower microchannel of the device to effectively create a vascular channel (Fig. 3C). To induce intestinal inflammatory responses, LPS (Fig. 3 C and D) or TNF-α (Fig. S6) was flowed through the upper epithelial channel for 24 h, and then PBMCs were added to the vascular channel for 1 h without flow (Fig. 3 C and D). Treatment with both LPS (or TNF-α) and PBMCs resulted in the activation of intercellular adhesion molecule-1 (ICAM-1) expression on the surface of the endothelium (Fig. 3 C and D, Left, and Fig. S6) and a significant increase (P < 0.001) in the number of PBMCs that adhered to the surface of the capillary endothelium compared with controls (Fig. 3D). These results are consistent with our qPCR results, which also showed up-regulation of genes involved in immune cell trafficking (Fig. S5). Neither addition of LPS nor PBMCs alone was sufficient to induce ICAM-1 expression in these cells (Fig. 3D), which parallels the effects of LPS and PBMCs on epithelial production of inflammatory cytokines (Fig. 3A) as well as on villus injury (Fig. 2 B and D).

Evaluating Antiinflammatory Probiotic and Antibiotic Therapeutics On-Chip. To investigate how the gut microbiome modulates these inflammatory reactions, we cocultured the human intestinal villi with the eight strains of probiotic bacteria in the VSL#3 formulation that significantly enhanced intestinal differentiation (Fig. 1E and Fig. S1B). To mimic the in vivo situation, we colonized our microengineered gut on a chip with the commensal microbes (VSL#3) first and then subsequently added immune cells (PBMCs), pathogenic bacteria (EIEC), or both in combination. The VSL#3 microbial cells inoculated into the germ-free lumen of the epithelial channel primarily grew as discrete microcolonies in the spaces between adjacent villi (Fig. 4A and Movie S3) for more than a week in culture (Fig. S7A), and no planktonic growth was detected. These microbes did not overgrow like the EIEC (Fig. 2A and Movie S2), although occasional microcolonies also appeared at different spatial locations in association with the tips of the villi (Fig. S7 B and C). The presence of these living components of the normal gut microbiome significantly enhanced (P < 0.001) intestinal barrier function, producing more than a 50% increase in TEER relative to control cultures (Fig. 4B) without altering villus morphology (Fig. 4C). This result is consistent with clinical studies suggesting that probiotics, including VSL#3, can significantly enhance intestinal barrier function in vivo (18).

To mimic the effects of antibiotic therapies that are sometimes used clinically in patients with intestinal inflammatory disease (29), we identified a dose and combination of antibiotics (100 units per mL penicillin and 100 μg/mL streptomycin) that produced effective killing of both EIEC and VSL#3 microbes in liquid cultures (Fig. S9) and then injected this drug mixture into the epithelial channel of gut-on-a-chip devices infected with EIEC. When we added PBMCs to these devices 1 h later, intestinal barrier function (Fig. 4B) and villus morphology (Fig. 4C) were largely protected from injury, and there was a significant reduction in lesion area (Fig. 4D). Thus, the gut-on-a-chip was able to mimic suppression of injury responses previously observed clinically using other antibiotics that produce similar bactericidal effects.

Analyzing Mechanical Contributions to Bacterial Overgrowth. Finally, we used the gut-on-a-chip to analyze whether physical changes in peristalsis or villus motility contribute to intestinal pathologies, such as the small intestinal bacterial overgrowth (SIBO) (5, 6) observed in patients with ileus (8) and IBD (7). When the GFP-EC bacteria were cultured on the villus epithelium under normal flow (30 μL/h), but in the absence of the physiological cyclic mechanical deformations, the number of colonized bacteria was significantly higher (P < 0.001) compared with gut chips that experienced mechanical deformations (Fig. 5A). Bacterial cell densities more than doubled within 21 h when cultured under conditions without cyclic stretching compared with gut chips that experienced physiological peristalsis-like mechanical motions, even though luminal flow was maintained constant (Fig. 5B). Thus, cessation of epithelial distortion appears to be sufficient to trigger bacterial overgrowth, and motility-induced luminal fluid flow is not the causative factor as assumed previously (7).

 

Discussion

One of the critical prerequisites for mimicking the living human intestine in vitro is to establish a stable ecosystem containing physiologically differentiated intestinal epithelium, gut bacteria, and immune cells that can be cultured for many days to weeks. Here we leveraged a mechanically active gut-on-a-chip microfluidic device to develop an in vitro model of human intestinal inflammation that permits stable long-term coculture of commensal microbes of the gut microbiome with intestinal epithelial cells. The synthetic model of the human living intestine we built recapitulated the minimal set of structures and functions necessary to mimic key features of human intestinal pathophysiology during chronic inflammation and bacterial overgrowth, including epithelial and vascular inflammatory processes and destruction of intestinal villi.

Read Full Post »

Investment Trends in Series A and B by Major and by Micro US Venture Capital Firms

Reporters: Gerard Loiseau, ESQ and Aviva Lev-Ari, PhD, RN

 

Introduction by Gerard Loiseau, ESQ

The Future of Funding for Early-Stage Start-Ups in the Healthcare Businesses

 

 

  • Is Smart Money Pulling Back From Early-Stage Start-ups, and if the answer is Yes, why?
  1. In an article originally posted on GigaOM on Monday, February 2, 2015,
  • Bryan Dow, Executive Director at Mooreland Partners, discussed how M&A and 3D printing collide.
  • As part of his exposé, he talks about “the financing: filling the gaps,” where he explains that “Strategic investors also currently consider 3D printing to be ‘non-investable’ due to the long lead time for business development. Size plays an important role, too. 3D printing companies just don’t have the revenue or EBITDA necessary to fit the threshold requirements.”
  • So how are early-stage start-ups funding their development in the 3D printing business? Mainly through crowdfunding, with platforms like Kickstarter and Indiegogo.
  2. In an interview at McKinsey & Co., Eric Schadt, M.D., founding director of the Icahn Institute for Genomics and Multiscale Biology at New York’s Mount Sinai Health System, says questions will become easier to answer as more data are pulled together.
  • How big data will cause an evolution in medicine
  • November 24, 2015 | By Susan D. Hall
  • He says: “Technology is revolutionizing our understanding and treatment of disease.”
  • Evolution? Revolution?
  • Big Data will help build predictive models by aggregating more and more information from all the components of a disease.
  • It is an evolution, but it will very quickly become a revolution.
  • Big data are already used by Google, Amazon, (—), and the same processes can be applied to medicine. (The first subsidiary of Alphabet is the Life Sciences Group@Google, now Verily.)
  • About wearables:
  • Mobile health apps represent the future, and wearable devices allow anyone to manage their own situation and/or to be managed in case of deviations from the “baseline”.
  • Big Data, patients, payers, and pharma:
  • It introduces a partnership with patients, who will get a dashboard about their health situation.
  • Payers will be able to control the slide into a disease state and so save money.
  • They will get better risk profiles.
  • Device makers can take advantage of this new business model.
  • Pharma will be able to better understand the causal players of disease.
  • Talent:
  • Experts will have to be recruited to fit this new “ecosystem”.
  • Translation will be a key item.
  • A major challenge is the transformation of the life sciences, driven by this quantitative, statistical, computational model.
  • Mathematics and computer science will be essential.
  3. Gina Hall, Contributor, Silicon Valley Business Journal, says: Startup founders may have to work a little harder for funding as top venture capital firms pull out of early-stage investing.
  • Gina Hall cites a CB Insights report: “The investment research firm said there were only 67 angel and Series A deals in the third quarter of 2015, the lowest quarterly count since the fourth quarter of 2010, which saw 64 deals.”
  • Bill Maris, president and chief executive of Google Ventures, and Andreessen Horowitz both said that they scaled down seed investments about two years ago.
  • They consider that the number of candidates is increasing in huge proportions.
  4. Tess Townsend, staff reporter, Inc.com (@Tess_Townsend), says: ‘Smart Money’ Is Getting Scarce for Startups.
  • Investment in tech is rising, but less and less money is coming from top venture capital firms. What do they know that everyone else is just figuring out?
  • “Filling the place of smart money venture capitalists are investors with less experience in the market, such as mutual funds better known for public-market investing,” states The Information.
  5. Digital Health: CB Insights, October 5, 2015
  • “Four of the last 5 quarters have seen more than $200M in digital health financings involving the top 20 smart money VCs.”
  • Even so, CB Insights comments on December 14, with data points on active corporates:
  • “The number of digital health investments in total was small, with only 28 deals involving pharma corps since 2013.”
  6. Healthcare Start-ups Boom: 2015 Could See More Than $12B Invested Into VC-Backed Companies: CB Insights, September 1, 2015
  • There have been more than 400 healthcare deals in the first half of 2015, and the year is on track for a five-year high in funding.

 

 

Investment by Blue Chip VCs
Year      | Series A (%) | Series B (%) | Private Tech Global Financing, first 9 months ($B) | Stage A Funding ($B) | # of Series A Rounds
2012      |      –       |      –       | 137.4                                              |         –            |         –
2013      |     7.5      |     11.4     | 165.2                                              |        2.0           |        243
2014      |     6.1      |     11.2     | 278.1                                              |        2.0           |        219
Est. 2015 |     5.3      |      9.0     | 241.7                                              |        1.3           |        115
SOURCE SignalFire, San Francisco
COMMENT: That growth is occurring in later-stage companies, series B and later; funding in the seed and series A rounds is down 20.5% and 28% in the first nine months compared with the same period last year, respectively.

DEFINITIONS by SignalFire: Series A: between $4 million and $12 million
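As a sanity check on the table, a simple percent-change helper reproduces year-over-year declines from the Series A columns. The 2014 and estimated 2015 figures are taken from the table above; note that SignalFire's quoted 20.5% and 28% declines are their own nine-month comparisons and cannot be rederived from the table alone.

```python
def pct_change(previous, current):
    """Year-over-year percent change; negative for a decline."""
    return (current - previous) / previous * 100

# Series A figures from the SignalFire table above
funding = {2014: 2.0, 2015: 1.3}   # Stage A funding, $B (2015 estimated)
rounds = {2014: 219, 2015: 115}    # number of Series A rounds

print(round(pct_change(funding[2014], funding[2015]), 1))  # -35.0
print(round(pct_change(rounds[2014], rounds[2015]), 1))    # -47.5
```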

 

Investment Firm Type by Name 

14 “micro” VCs and 15 “major” VCs

Major VC (15): General Catalyst Partners; Khosla Ventures; Kleiner Perkins Caufield & Byers; Lightspeed Venture Partners; New Enterprise Associates (NEA); Redpoint Ventures; Spark Capital LLC; Sequoia Capital; Greylock Partners; Founders Fund; Union Square Ventures; Andreessen Horowitz; Accel Partners; Benchmark; Bessemer Venture Partners

Micro VC (14): Harrison Metal; Lerer Hippeau Ventures; O’Reilly AlphaTech Ventures; Y Combinator (Accelerator); Forerunner Ventures; Red Swan Ventures; Lowercase Capital; Baseline Ventures; Crosscut Ventures; Data Collective; First Round Capital; Founder Collective; Floodgate; Felicis Ventures

SOURCE

https://www.theinformation.com/smart-money-pulls-backs-from-early-stage-startups?unlock=92821f&token=2053453a7297bdd5b8a501d6ea4fea0d4e1d2cbe

 

Conclusions by Gerard Loiseau, ESQ

  • Start-ups will have to adapt to these new challenges.
  • The biggest evolution is linked to Big Data, and so to New Entrants like Alphabet and Apple.
  • They are designing the future, and the Cloud is their battlefield, following the pioneer, Amazon.com.
  • People will wear sensors that identify variations in vital signs and indications of the state of their diseases. Transmission will run through watches and other mobile devices in the Internet of Things (IoT) to the cloud, and they will get a direct ping as their answer.
  • Labs and hospitals will manage their IT data through the Cloud.
  • The New Entrants will create THEIR own “medical centres”, which will be virtual.
  • They will provide the data, infrastructure, information, and training to all the constituencies, including patients’ access to EMRs.
  • Intelligent medical centres will operate on completely reshaped versions of existing workflows.
  • Start-ups will have to adapt to these new challenges by embracing the new reality.

The following article concerns trends in the early-round investment landscape in the United States, focusing on Seed, Series A, and Series B venture capital, and can be read at

Source: http://tomtunguz.com/us-it-fundraising-early-2016/?utm_content=bufferd960d&utm_medium=social&utm_source=linkedin.com&utm_campaign=buffer

by

What’s Really Happening In The US Venture Fundraising Market In Early 2016

The startup fundraising market in 2016 has been difficult to characterize. Punctuated by a concentrated decline in public tech stocks, the sentiment in Startupland has changed from resolute ebullience to a calmness approaching caution. Two months in, we can analyze January and February data. This post analyzes VC-led investment rounds in US-headquartered information technology companies, except for the $793M Series C in Magic Leap, which I excluded as an outlier.

VCs invested about $2B in January and February 2016. The January figure equals the previous year, but February dollars deployed halved compared to 2015, reverting to 2014 levels.

The count of investments fell in January by 44% to roughly 130 and remained there in February.

For January investment round volumes to fall and for total dollars invested to remain the same, VCs must have invested in a disproportionate share of large, later stage rounds. As the year continued, round volumes held steady, but total dollars invested halved indicating the typical investment fell substantially. And the median investment chart above supports this hypothesis.
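The back-of-the-envelope reasoning above can be sketched in Python. The 44% drop in round count and the ~$2B / ~130-round January figures come from the post; the prior-year round count is an inferred back-calculation, not a reported number.

```python
# If total dollars hold steady while round count falls 44%,
# the average round must grow by roughly 1 / (1 - 0.44) ≈ 1.8x.
dollars_m = 2000.0                 # ~$2B invested in January, in $M (from the post)
count_2016 = 130                   # ~130 rounds in January 2016 (from the post)
count_2015 = count_2016 / (1 - 0.44)  # back out the prior-year count from the 44% drop

avg_2015 = dollars_m / count_2015  # average January 2015 round, $M
avg_2016 = dollars_m / count_2016  # average January 2016 round, $M
print(round(avg_2016 / avg_2015, 2))  # 1.79
```

The dollar total cancels out of the ratio, so the conclusion depends only on the drop in round count: a disproportionate shift toward larger rounds is arithmetically forced.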

Let’s break this overall figure into Series A, B, C and Seed median investment sizes. Series As have fallen from their late 2015 highs by about 20%. The median Series B totals less than $15M, down 25% from the 2015 highs. Series Cs are stable at about $35M, and seed rounds continue their ascent with the typical Seed round at more than $2M.
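The median-by-series computation behind those figures can be illustrated with a short sketch. The round sizes below are invented for illustration (chosen to land near the medians the post cites), not real deal data.

```python
import statistics
from collections import defaultdict

# Hypothetical rounds as (series, size in $M) -- invented to illustrate
# the grouping, not drawn from any real dataset.
rounds = [
    ("Seed", 1.5), ("Seed", 2.2), ("Seed", 2.5),
    ("A", 7.0), ("A", 9.5), ("A", 12.0),
    ("B", 12.0), ("B", 14.5), ("B", 20.0),
    ("C", 30.0), ("C", 35.0), ("C", 40.0),
]

by_series = defaultdict(list)
for series, size in rounds:
    by_series[series].append(size)

medians = {s: statistics.median(v) for s, v in by_series.items()}
print(medians)  # {'Seed': 2.2, 'A': 9.5, 'B': 14.5, 'C': 35.0}
```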

The Seed round figure might be spurious. Without a fixed definition of a seed round, this number can move as the market includes a greater or lesser number of financings in this colloquial term.

We can segment the data by round size: rounds of less than $2.5M, $2.5M–$15M, $15M–$50M, and greater than $50M. Median seed rounds are then dominated by the +/- $150k initial investments of incubators and accelerators with large new portfolios, like Y Combinator and 500 Startups, whose demo days are fast approaching. The differing data points suggest to me that small Series As and Seeds are being further conflated; for entrepreneurs, it’s often better to characterize a $3M round as a seed rather than a Series A.
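A minimal sketch of that segmentation, using the bucket thresholds from the post; the round sizes are invented for illustration.

```python
from collections import Counter

def bucket(size_m):
    """Assign a round (size in $M) to the post's four size buckets."""
    if size_m < 2.5:
        return "<$2.5M"
    elif size_m < 15:
        return "$2.5M-$15M"
    elif size_m < 50:
        return "$15M-$50M"
    return "$50M+"

# Hypothetical round sizes in $M, including accelerator-sized ~$150k checks
sizes = [0.15, 0.15, 1.0, 3.0, 9.5, 14.0, 22.0, 35.0, 60.0]
counts = Counter(bucket(s) for s in sizes)
print(counts["<$2.5M"])  # 3 -- small accelerator checks dominate the bottom bucket
```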

No analysis is perfect. But this data does provide a lens into the state of the fundraising market. Series A and B sizes are down from their highs. Overall investment in February halved. But the data raises more questions.

What is really happening with seed round size? Was VC activity in February an aberration or representative of a deeper change in sentiment? In all likelihood, February was a bit of a wait-and-see month. Most importantly, the data supports the notion that investors are still looking to invest, and round sizes are relatively stable, with the exception of the Series B. The Series B will likely be the most challenging round to raise in the beginning of this year.

Read Full Post »

Visualizing metal-impregnated neurons with spectral confocal microscopy

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Novel Imaging Captures Beauty of Metal-Labeled Neurons in 3D

These are silver-impregnated dendrites from an insect motor neuron, captured through spectral confocal microscopy. (Credit: Grant M. Barthel, Karen A. Mesce, Karen J. Thompson)

Researchers have discovered a dazzling new method of visualizing neurons that promises to benefit neuroscientists and cell biologists alike: by using spectral confocal microscopy to image tissues impregnated with silver or gold.

Rather than relying on the amount of light reflecting off metal particles, this novel process, to be presented in the journal eLife, involves delivering light energy to silver or gold nanoparticles deposited on neurons and imaging the higher energy levels resulting from their vibrations, known as surface plasmons.

This technique is particularly effective as the light emitted from metal particles is resistant to fading, meaning that decades-old tissue samples achieved through other processes, such as the Golgi stain method from the late 1880s, can be imaged repeatedly.

The new process was achieved by using spectral detection on a Laser Scanning Confocal Microscope (LSCM), first made available in the late 1980s and, until now, used most extensively for fluorescent imaging.

Paired with such methods, silver- and gold-based cell labeling is poised to unlock new information in a myriad of archived specimens. Furthermore, silver-impregnated preparations should retain their high image quality for a century or more, allowing for archivability that could aid in clinical research and disease-related diagnostic techniques for cancer and neurological disorders.

“For the purposes of medical diagnostics, older and newer specimens could be compared with the knowledge that signal intensity would remain fairly uniform regardless of sample age or repeated light exposure,” says contributing author Karen Mesce from the University of Minnesota.

“With the prediction that superior resolution microscopic techniques will continue to evolve, older archived samples could be reimagined with newer technologies and with the confidence that the signal in question was preserved. The progression or stability of a cancer or other disease could therefore be charted with accuracy over long periods of time.”

To appreciate the enhanced image quality produced by the new technique, the team first examined a conventional brightfield image of a metal-labelled neuron within a grasshopper’s abdominal ganglion, a type of mini-brain which, even at that size, presented out-of-focus structures.

They then imaged the same ganglion with the spectral LSCM adjusted to the manufacturer’s traditional fluorescence settings, resulting only in strong natural fluorescence and a collective dark blur in place of the silver-labelled neurons.

However, after collecting the light energy emitted from vibrating surface plasmons in the spectral LSCM, the team obtained spectacular three-dimensional computer images of silver and gold-impregnated neurons. This holds enormous potential for stimulating a re-examination of archived preparations, including Golgi-stained and cobalt/silver-labelled nervous systems.

Additionally, by using a number of different metal-based cell-labeling techniques in combination with the new LSCM protocols, tissue and cell specimens can be generated and imaged with ease and in great three-dimensional detail. Changes in even small structural details of neurons can be identified, which are often important indicators of neurological disease, learning and memory, and brain development.
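As a rough illustration of how a spectrally resolved confocal stack can be turned into a projected view of a labeled neuron, here is a generic sketch. This is not the authors' pipeline: the array shape, the spectral band indices, and the use of a maximum-intensity projection are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
# stack[z, y, x, channel]: 8 optical sections, 64x64 pixels, 16 spectral bins
# (random values standing in for measured emission intensities)
stack = rng.random((8, 64, 64, 16))

# Suppose spectral bins 10-13 cover the plasmon emission band of interest;
# integrate over that band to isolate the metal-label signal from
# autofluorescence collected in other bins.
plasmon = stack[..., 10:14].sum(axis=-1)   # shape (8, 64, 64)

# A maximum-intensity projection along z collapses the optical sections
# into a single 2D view of the labeled structure.
mip = plasmon.max(axis=0)                  # shape (64, 64)
print(mip.shape)
```

The same band-selection step is what lets spectral detection separate a non-fading plasmonic signal from tissue autofluorescence before any 3D rendering is done.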

“Both new and archived preparations are essentially permanent and the information gathered from them increases the data available for characterizing neurons as individuals or as members of classes for comparative studies, adding to emerging neuronal banks,” says co-first author Karen Thompson from Agnes Scott College.

“Just as plasmon resonance can explain the continued intensity of the red (caused by silver nanoparticles) and yellow (gold nanoparticles) colors in centuries-old medieval stained glass and other works of art, metal-impregnated neurons are also likely never to fade, neither in the information they provide nor in their intrinsic beauty,” adds Mesce.

 

 

This chapter is partly modified from:

Towards a 3D View of Cellular Architecture: Correlative Light Microscopy and Electron Tomography.

Valentijn J.A.; van Driel L.F.; Jansen K.A.; Valentijn K.M.; Koster A.J.
Book chapter in: Reiner Salzer. Biomedical Imaging: Principles and Applications. Wiley, 2010.

The present thesis reports on newly developed tools and strategies for correlative light and electron microscopy, and their application to cell biology research aimed at furthering our understanding of structure–function mechanisms in a variety of current biological applications. Accordingly, this Introduction consists of two main parts: the first will discuss past and present strategies for correlative light and electron microscopy (CLEM) as an introduction to the subsequent chapters in this thesis, all of which describe new developments and applications in the field; the second will elaborate on the rationale behind the biological applications that were undertaken.

The term ‘correlative microscopy’ is employed in the biomedical literature to designate any combination of two or more microscopic techniques applied to the same region in a biological specimen. The purpose of correlative microscopy is to obtain complementary data, each imaging modality providing different information, on the specimen that is under investigation. Correlative light and electron microscopy (CLEM) is by far the most widespread form of correlative microscopy. CLEM makes use of the fact that imaging with photons on the one hand, and electrons on the other hand, each offers specific advantages over one another. For instance, the low-magnification range inherent to light microscopy (LM) is particularly well-suited for the rapid scanning of large and heterogeneous sample areas, while the high magnification and -resolution that can be achieved by electron microscopy (EM) allows for the subsequent zooming in on selected areas of interest to obtain ultrastructural detail. A further advantage of LM is that it can be used to study dynamic processes, up to the molecular level, in living cells and tissues. The recent surge in live cell imaging research has catalyzed a renewed interest in CLEM methodologies, as the interpretation of the dynamic processes observed by LM often requires high-resolution information from EM data. CLEM is also gaining in momentum in the field of cryo-electron microscopy where the low contrast conditions and low electron dose requirements put a constraint on the detection efficacy.

Current CLEM procedures face a number of challenges. Firstly, sample preparation methods for LM and EM can be quite divergent due to different requirements for preservation, embedding, sectioning, and counterstaining. Therefore, alternative sample preparation protocols need to be devised that are suitable for both LM and EM. Secondly, CLEM often requires the correlated localization of specific molecules in cells or tissues, for which specialized detection systems need to be developed. Standard detection methods are based on tagging of molecules either with fluorochromes for LM, or with gold particles for EM, whereas some CLEM applications require a tag to be visible in both modalities. Thirdly, the transition from imaging by LM to EM may involve handling and additional processing of samples, which can lead to changes in orientation and morphology of the sample. This in turn can hamper relocating, and correlating with, previously established areas of interest.

In the present post-genomics climate, EM is coming back with a vengeance. Despite the dip in EM-based research during the previous decade, the development of novel EM technologies moved forward at a steady pace, resulting in several breakthrough applications. Among them are electron tomography and cryo-electron tomography, which are techniques for high-resolution 3D visualization and which are gradually becoming mainstream tools in structural molecular biology. As will be discussed in more detail below, (cryo-)electron tomography is often hampered by the lack of landmarks in the 2D views used to select areas of interest.

The diversity of goals to be achieved by CLEM constrains the development of universally applicable protocols. For instance, correlating live cell imaging data of a fluorescent protein with an ultrastructural endpoint requires a different CLEM approach than if the main goal is to pinpoint a rarely occurring structure of interest for EM investigation. CLEM can also be used as an alternative for immuno-EM, or to locate structures for cryo-EM if high-resolution structural details are required. As a consequence of the diversity of applications, there are to date numerous methods to correlate LM and EM data, and more developments, improvements, and applications, are likely to follow. Depending on the purpose of the application, three groups of CLEM methodologies can be distinguished:

  • Those that combine live cell imaging with ultrastructural information (see figure 1),
  • Those that combine LM of fixed or immobilized samples with ultrastructural information (figure 2),
  • Those that combine LM and EM data from the same sections (figure 3)

 

 

Using Laser Scanning Confocal Microscopy as a Guide for Electron Microscopic Study: A Simple Method for Correlation of Light and Electron Microscopy

XUE J. SUN, LESLIE P. TOLBERT, and JOHN G. HILDEBRAND
J Histochem Cytochem 1995; 43(3):329-335

 

Anatomic study of synaptic connections in the nervous system is laborious and difficult, especially when neurons are large or have fine branches embedded among many other processes. Although electron microscopy provides a powerful tool for such study, the correlation of light microscopic appearance and electron microscopic detail is very time consuming. We report here a simple method combining laser scanning confocal microscopy and electron microscopy for study of the synaptic relationships of the neurons in the antennal lobe, the first central neuropil in the olfactory pathway, of the moth Manduca sexta. Neurons were labeled intracellularly with neurobiotin or biocytin, two widely used stains. The tissue was then sectioned on a vibratome and processed with both streptavidin-nanogold (for electron microscopic study) and streptavidin-Cy3 (for confocal microscopic study) and embedded in epon/araldite. Areas of interest in the labeled neuron were imaged in the epon/araldite-embedded tissue, which was then sectioned at the indicated depth for electron microscopic study. This method provides an easy, reliable way to correlate three-dimensional light microscopic information with electron microscopic detail, and can be very useful in studies of synaptic connections.

Introduction

Study of synaptic connections is fundamental to an understanding of nervous system function. Physiological recording in combination with cell staining with different dyes yields especially useful information about the functional properties and morphology of neurons in various nervous systems. Light microscopic (LM) analysis of neurons, however, often raises the question of whether synaptic connections occur at specialized regions of the neurites. Knowledge of whether synaptic connections between two neurons exist and where on the neuritic tree they occur provides valuable information about how signals are integrated by the neurons and thus about the role the neurons play in a given pathway. Electron microscopy (EM) provides a powerful tool for this type of study. Unfortunately, correlation of three-dimensional LM data with EM detail is often difficult, as neurons are often large (ranging up to hundreds of micrometers in length) with complex branching patterns. Moreover, EM data usually lose most large-scale three-dimensional information. Much work has been done to attempt to overcome these problems (3,4,8,10,14,19,21,25,26,31,34,35). To date, the most successful methods for correlating EM and three-dimensional data have been

(a) three-dimensional reconstruction of thin sections,

(b) semi-thin sectioning of tissue, serial reconstruction of the neuron at the light microscopic level, re-embedment, and finally thin-sectioning of selected semi-thin sections for electron microscopic study, and

(c) high-voltage EM observation of relatively thick sections and reconstruction of high-voltage EM data.

Although combination of these techniques or computer-assisted three-dimensional reconstruction and image processing techniques greatly facilitate the task, it is still very time-consuming to pursue this type of study.

Recently, laser scanning confocal microscopy (LSCM) has provided a convenient way to obtain three-dimensional morphology of neurons, as optical sections of relatively thick tissue can be obtained easily and rapidly (5-7,20,24,27,28,30). Equally importantly, information is gathered as digitized optical sections and is readily displayed as three-dimensional stacks. Combination of LSCM and EM appears to be a promising way to study synaptic connections.

 

Several different techniques have been used to bridge the gap between LM and EM (3,4,8,10,14,19,21,25,26,31,34,35). One common way is to section the block into semi-thin sections, make a three-dimensional reconstruction of the neuron at the LM level, and then thin-section areas of interest (4,8,10,19,21). However, three-dimensional reconstruction from sections is tedious, and valuable tissue may be lost during reembedding and resectioning. High-voltage EM is a reasonable alternative, as relatively thicker sections can be observed, thus reducing the number of sections needed to span a neuron. This method, however, requires access to a high-voltage EM. We have developed a technique using LSCM as a guide for EM study. A biotin-labeled neuron is rendered detectable by both fluorescence (for LSCM) and immunogold (for EM). After such double labeling, interesting areas of physiologically identified and intracellularly labeled neurons can be investigated at the EM level to see where synaptic connections are formed. Although with our method three-dimensional reconstruction might also be needed, the number of vibratome sections needed to span a neuron is small (two to four in our case with 80-µm sections). This greatly reduces the task.

The major challenge in correlation of LSCM and EM is the compatibility of the labeling methods. Visualization methods using diaminobenzidine (either in a peroxidase reaction or in a reaction catalyzed by photo-oxidation of fluorescent dyes) and Golgi staining have been used widely for both LM and EM observations (see, e.g., 8,26,35). The light- and electron-dense label, however, is not suitable for LSCM imaging, which is best performed on fluorescent labels. One possibility is to use fluorescent labels, perform LSCM, and, after LSCM imaging, convert the fluorescent dyes into electron-dense materials by using either photo-oxidation of fluorescent dyes [such as Lucifer Yellow (16) or DiI as suggested by Vischer and Durrenberger (32)] in the presence of diaminobenzidine or an antibody against the dye injected into the cell. It is possible to determine the position of an interesting area of a neuron by LSCM with this method, but relocating the same area after plastic embedding is much more difficult than with the presently reported method because of shrinkage of the tissue during tissue processing for EM and the inability to monitor the exact position by LSCM during sectioning. The reflection mode of LSCM has been applied successfully to detect electron-dense label in several systems (6,7,22,33). With this mode of imaging, Deitch et al. (6) have proposed another way of combining LSCM and EM. Reflection signals in the LSCM are often weak, however, and do not take full advantage of the excellent resolving capabilities of the confocal microscope.

……

The unending fascination with the Golgi method

Y Koyama*
Department of Anatomy and Neuroscience, Graduate School of Medicine, Osaka University, Suita, Osaka, Japan
http://www.oapublishinglondon.com/article/848

In the second half of the 19th century, Camillo Golgi provided a breakthrough staining technique for visualizing whole neurons, which are seen as black bodies due to intracellular staining with microcrystalline silver chromate. The high contrast, selective staining properties enabled identification of complete neuronal morphology. This staining technique, termed the Golgi method, was later improved by Ramón y Cajal and popularised through his tireless experiments. Morphological analysis, using the Golgi method, led to the discovery of neuronal microstructures such as dendritic spines and growth cones and helped give rise to the ‘neuron doctrine’. Many post-mortem human brains as well as brains of experimental animals have since been stained using this method. In combination with other morphological techniques (e.g. electron microscopy and immunohistochemistry), the Golgi method has allowed us to glean more information regarding the neuronal networks present in various brain regions. However, the Golgi method is a difficult first choice for morphological analysis since it is capricious, complicated and time-consuming and has poor reproducibility.

Recent increases in the number of in vivo animal experiments and of post-mortem brains collected following neurological disorders heighten the need for the Golgi method to be viewed as a crucial morphological tool for assessing abnormalities in single neurons, as well as in neuronal networks. Fortunately, over 100 years of neuroanatomical diligence has seen significant contributions to overcoming the shortfalls of this method. The advent of modified Golgi methods with potential use as routine techniques, together with the development of the kit-based Golgi–Cox method, has made the Golgi method more accessible to neuroanatomists.

This review surveys the technical fundamentals, history and evolution of Golgi methods and intends to spark an interest in the Golgi method within every neuroscientist, novice and old pro alike and to allow them to appreciate this useful technique.

Conclusion

Many neuroanatomists, including us, feel a strong attraction to the Golgi method as a powerful morphological tool. Our researchers have identified unwanted issues of the various Golgi methods and then have been working to remedy these problems. We encourage the reader to adopt staining using the Golgi method as its utility continues to evolve.

 

While studying the brain function, it is extremely important to investigate the precise shape and morphological changes of individual neurons. A neuron is a specialised cell that emits an electrical signal to allow information exchange, a characteristic not found in the cells of other organs. Both the short ‘dendrite’, which is intricately branched like a tree branch and the long ‘axon’ emanate from the nerve cell body. In addition, neuronal spines located on dendrites receive electric signals from other neurons and are involved in neuronal plasticity. Thus, together with neural cells such as neuroglia, neurons form complicated networks known as ‘neural circuits’.

To appreciate the complexity of such intricate neural networks, staining methods that allow the visualisation of neuronal cells in thinly sliced brain sections are used. In particular, Nissl stains and silver impregnation are commonly used. Nissl staining, which is based on a mechanism combining a basic aniline dye with Nissl granules, can stain both the cell nucleoli and rough endoplasmic reticula in neurons. In contrast, silver impregnation using neuronal argent affinity can stain an entire neuron, but not the myelin sheath. Neural circuitry refers to the combination of many interacting neural cells and is immensely complex morphologically, with many neurons intertwined with one another within a restricted space. Unfortunately, because the above techniques stain all neural cells with equal probability, it is difficult to identify and appreciate the morphology of a single cell amongst the mass of other stained cells.

In contrast, the Golgi method, focused in this review, has allowed for the visualisation of entire neurons and glia in high detail and with good contrast. Moreover, compared to Nissl staining and silver impregnation, the Golgi method has the beneficial feature of characteristically selective staining. Because neurons are stained only sparsely with the Golgi method, it is a powerful staining technique for providing a complete, detailed representation of a single neuron. The aim of this review was to discuss the history and evolution of the Golgi method.

………

Dendritic vulnerability in neurodegenerative disease: insights from analyses of cortical pyramidal neurons in transgenic mouse models

In neurodegenerative disorders, such as Alzheimer’s disease, neuronal dendrites and dendritic spines undergo significant pathological changes. Because of the determinant role of these highly dynamic structures in signaling by individual neurons and ultimately in the functionality of neuronal networks that mediate cognitive functions, a detailed understanding of these changes is of paramount importance. Mutant murine models, such as the Tg2576 APP mutant mouse and the rTg4510 tau mutant mouse have been developed to provide insight into pathogenesis involving the abnormal production and aggregation of amyloid and tau proteins, because of the key role that these proteins play in neurodegenerative disease. This review showcases the multidimensional approach taken by our collaborative group to increase understanding of pathological mechanisms in neurodegenerative disease using these mouse models. This approach includes analyses of empirical 3D morphological and electrophysiological data acquired from frontal cortical pyramidal neurons using confocal laser scanning microscopy and whole-cell patch-clamp recording techniques, combined with computational modeling methodologies. These collaborative studies are designed to shed insight on the repercussions of dystrophic changes in neocortical neurons, define the cellular phenotype of differential neuronal vulnerability in relevant models of neurodegenerative disease, and provide a basis upon which to develop meaningful therapeutic strategies aimed at preventing, reversing, or compensating for neurodegenerative changes in dementia.

Keywords: Alzheimer’s disease, Amyloid, Computational modeling, Dendritic spine, Tau, Whole-cell patch-clamp

Neocortical pyramidal neurons possess extensive apical and basilar dendritic trees, which integrate information from thousands of excitatory and inhibitory synaptic inputs. Dendrites respond to inputs with postsynaptic potentials, which are relayed to the soma where they are summed; if a threshold potential is exceeded, an action potential is generated. Voltage attenuation in space and time along dendrites is fundamental to summation and is influenced by a number of interacting morphological properties (such as diameter and length) and active properties (such as distribution of ion channels) of the dendritic shaft (Hausser et al. 2000; Kampa et al. 2007; Stuart and Spruston 2007; Henry et al. 2008). Further complexity arises from the presence of thousands of biophysically active dendritic spines, the principal receptive site for excitatory glutamatergic inputs to a neuron.

Dendrites and dendritic spines in particular are dynamic structures, which undergo significant changes across the life span. Under non-pathological conditions, the number of spines on pyramidal neuron dendrites increases substantially over the course of development, is reduced during maturation to adulthood, and remains relatively stable during adulthood (for review see Bhatt et al. 2009). Then, during normal aging, significant changes in spine number, distribution, and morphology occur (Duan et al. 2003). Spines also undergo significant structural modifications under conditions where synaptic strength is experimentally modified, usually with protocols designed to evoke longterm potentiation or long-term depression (for review see Alvarez and Sabatini 2007; Bhatt et al. 2009; Holtmaat and Svoboda 2009). In many neurodegenerative diseases, significant alterations in the dendritic arbor occur, together with substantial spine loss and alterations in spine morphology (reviewed in Halpain et al. 2005). Gaining an understanding of these sublethal changes to neurons, which detrimentally impact neuronal signaling, and ultimately cognitive function, is an important goal.

Transgenic mouse models have been useful for elucidating mechanisms of amyloid-and tau-induced pathology, although no single model fully recapitulates human disease (for review see Duff and Suleman 2004; Spires et al. 2005). Two of the most commonly employed mouse models of neurodegenerative disease are the Tg2576 amyloid precursor protein (APP) mutant mouse and the rTg4510 tau mutant mouse, which develop significant pathological aggregations of amyloid-beta (Aβ) and tau proteins, respectively. Tg2576 transgenic mice overexpress the Swedish double mutation of the human APP gene, which leads to progressive formation of soluble Aβ peptides and fibrillar Aβ deposits in the form of amyloid plaques. Increased Aβ levels in these mice are associated with progressive structural changes to neurons (although not with neuron death), and cognitive impairment (Hsiao et al. 1996). In the rTg4510 mouse model, expression of the P301L mutant human tau variant leads to progressive development of neurofibrillary tangles (NFTs), neuronal death, and memory impairment reminiscent of the pathology observed in human tauopathies (Santacruz et al. 2005).

In this review, we discuss changes in the structure and function of dendrites and spines of pyramidal neurons that are associated with pathological aggregation of Aβ and tau in these and other mouse models of neurodegenerative disease. We also discuss, as a point of comparison, the impact of normal brain aging on the morphofunctional properties of pyramidal neurons in aged macaque monkeys. This is not a comprehensive review of the literature on such changes in human neurodegenerative diseases or of the many studies of these changes in mouse models (for reviews see Duff and Suleman 2004; Spires and Hyman 2004;Lewis and Hutton 2005; Duyckaerts et al. 2008; Giannakopoulos et al. 2009). Rather, the focus here is specifically on our multidimensional collaborative efforts to understand the functional consequences of pathological changes in the structure of individual layer 3 frontal cortical pyramidal neurons in commonly employed mouse models of neurodegeneration. These studies focus on layer 3 pyramidal neurons in the neocortex because they are the principal neurons involved in corticocortical circuits that mediate many cognitive functions of the frontal cortex and may be selectively targeted in neurodegenerative disease (Morrison and Hof 1997, 2002; Hof and Morrison 2004).

Figure 1: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3045830/bin/nihms273899f1.jpg

Structural properties of mouse neocortical pyramidal neurons. a 40 × CLSM image of a layer 3 pyramidal cell from the frontal cortex of a wild-type mouse. The neuron was filled with biocytin during patch-clamp recording and subsequently labeled with Alexastreptavidin 488. b100 × CLSM image of the spiny dendrite shown within the red box in (a). c Electron micrograph showing the ultrastructure of a pyramidal neuron spine, with a prominent postsynaptic density and spine apparatus. Courtesy of Dr. Alan Peters. Scale bars a 40, b 5, c 0.5 μm

The structural properties of dendrites underlie their passive membrane (cable) properties, which are membrane capacitance Cm, specific membrane resistance Rm and axial resistivity Ra. These cable properties determine, at a fundamental level, the degree of summation of synaptic inputs and the spatial distribution of electrical signals. Because summation of synaptic inputs by dendrites is determined in large part by their structure, dendritic morphology plays a critical role in action potential generation (Mainen and Sejnowski 1996; Koch and Segev 2000; Euler and Denk 2001; Vetter et al. 2001; Krichmar et al. 2002; Ascoli 2003). Importantly, both branching topology and surface irregularities including dendritic varicosities (Surkis et al. 1998) contribute to the excitable properties of neurons. For example, recent simulation studies have demonstrated that marked differences in the efficacy of action potential back propagation in different neural types are attributable in large part to variations in dendritic morphology (for review see Stuart et al. 1997;Waters et al. 2005).

Integration of synaptic inputs by dendrites is mediated not only by basic passive cable properties, but also by active properties, which include the number and distribution of voltage-, ligand-, and second messenger-gated transmembrane ionic channels (reviewed in Migliore and Shepherd 2002; Magee and Johnston 2005; Johnston and Narayanan 2008; Nusser 2009). Dendrites possess a rich array of sodium, calcium, and potassium channels that are distributed either uniformly or non-uniformly across a given dendrite (for reviews, see Johnston et al. 1996; Migliore and Shepherd 2002; Magee and Johnston 2005; Johnston and Narayanan 2008; Nusser 2009). For example, layer 5 cortical pyramidal neuron dendrites possess a gradient of the hyperpolarization-activated mixed cationic HCN channels that increase in density from the soma to the apical tuft (Berger et al. 2001; Lorincz et al. 2002). The interaction of intrinsic ionic and synaptic conductances with passive properties determined by dendritic morphology can effectively alter the cable properties of the dendritic tree (Bernander et al. 1991; Segev and London 2000; Bekkers and Hausser 2007), adding a further layer of complexity to signaling by individual neurons. Computational modeling approaches have been extensively employed to shed light on dendritic structure–function relationships and on potential interactions between a multitude of dendritic active and passive properties. These approaches, as discussed at the end of this review, have been useful for gaining insight on dendrites that are too thin to be studied with electrophysiological approaches, and for providing empirical researchers with testable hypotheses relevant to the functional consequences of dendritic dystrophy in neurodegenerative diseases.
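The passive cable quantities named above can be made concrete with standard textbook formulas. The parameter values in this sketch are generic illustrations, not measurements from the neurons discussed in this review.

```python
import math

Rm = 20000.0   # specific membrane resistance, ohm*cm^2 (illustrative)
Ra = 150.0     # axial resistivity, ohm*cm (illustrative)
d  = 1.0e-4    # dendritic diameter, cm (1 um)

# Steady-state length constant of an infinite passive cable:
# lambda = sqrt(Rm * d / (4 * Ra))
lam_cm = math.sqrt(Rm * d / (4.0 * Ra))
lam_um = lam_cm * 1.0e4

def attenuation(x_um):
    """Steady-state voltage attenuation: V(x) = V0 * exp(-x / lambda)."""
    return math.exp(-x_um / lam_um)

print(f"length constant ~ {lam_um:.0f} um")
print(f"fraction of voltage remaining 100 um away: {attenuation(100):.2f}")
```

Because the length constant scales with the square root of diameter and Rm, even modest dystrophic changes in dendritic geometry or membrane leak shift how far synaptic potentials spread toward the soma.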

Dendritic spines

Dendrites of glutamatergic pyramidal neurons are studded by thousands of dendritic spines, which are the site of most of the glutamatergic synapses in the brain, and which confer further functional complexity to these processes. Spine density, shape, and distribution are all important contributors to neuronal excitability that are superimposed on the electrophysiological properties of dendritic shafts (Wilson 1988; Stratford et al. 1989; Baer and Rinzel 1991; Tsay and Yuste 2002). Spines are small appendages that extend approximately 0.5–3 µm from dendritic shafts and are usually <1 µm in diameter (Fig. 1b, c). Mouse layer 3 frontal cortical pyramidal neurons typically possess approximately 6,000 dendritic spines (Rocher et al. 2008). Spines can be broadly categorized as falling into one of several different morphological types, namely “thin”, “stubby”, “mushroom”, and “filopodia”, which are normally seen in large numbers only during development (for review see Yuste and Bonhoeffer 2004; Bourne and Harris 2007). Although most spines do fall into one of these broad categories, serial electron microscopy reveals that there is also a continuum between the different morphological subtypes (Bourne and Harris 2007). Spine morphology determines the strength, stability and function of excitatory synaptic connections that subserve the neuronal networks underlying cognitive function. Smaller spines, such as the thin and filopodia types, are less stable and more motile (Trachtenberg et al. 2002; Kasai et al. 2003; Holtmaat et al. 2005) and as a result are more plastic than large spines such as the mushroom and stubby types (Grutzendler et al. 2002). The size and morphology of the spine head is correlated with the number of docked presynaptic vesicles (Schikorski and Stevens 1999) and the number of postsynaptic receptors (Nusser et al. 1998), and hence with the size of synaptic currents and synapse strength.
In addition, a small head size permits fast diffusion of calcium within the spine, while the neck length shapes the time constant for calcium compartmentalization (for review see Yuste and Bonhoeffer 2001; Nimchinsky et al. 2002), modulating postsynaptic mechanisms that play an important role in synaptic plasticity linked to functions, such as learning and memory (Holthoff et al. 2002; Alvarez and Sabatini 2007; Bloodgood and Sabatini 2007). Spine neck length and diameter also affect diffusional coupling between dendrite and spine (Svoboda et al. 1996; Yuste et al. 2000; Bloodgood and Sabatini 2005), and spine density and shape regulate the degree of anomalous diffusion of chemical signals within the dendrite (Santamaria et al. 2006). Spine morphology also varies dynamically in response to synaptic activity (Lendvai et al. 2000; Hering and Sheng 2001; Zuo et al. 2005).

Over the past few decades, it has become increasingly evident that local dendritic spine structure and distribution (Matus and Shepherd 2000) play a key role in the electrical and biochemical signaling of dendrites (Nimchinsky et al. 2002; Matus 2005; Bourne and Harris 2007). However, spines present challenges to the standard cable model of dendrites. The common way to model the effects of spines in a passive cable equation model is to reduce the membrane resistance and increase the membrane capacitance by a factor proportional to the increased membrane surface area due to spines (Jaslove 1992). This modification predicts that voltage should attenuate more drastically in space along spiny dendrites, relative to their smooth counterparts. Because of their capacity for plasticity and because they are the location of most excitatory synapses in the cerebral cortex, accurately characterizing the structure of dendritic spines is essential for understanding their contributions to neuronal, and ultimately to cognitive function.
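The spine correction described above, folding spine membrane into an equivalent smooth cable by rescaling membrane resistance and capacitance, can be sketched in a few lines. This is a generic illustration of the standard "F-factor" trick, not code from the review, and the numeric values are illustrative rather than measured:

```python
def spine_corrected_cable_params(r_m, c_m, shaft_area, spine_area):
    """Fold spine membrane into an equivalent smooth cable.

    Total membrane area rises by a factor F relative to the bare shaft,
    so specific membrane resistance (ohm*cm^2) is divided by F and
    specific membrane capacitance (uF/cm^2) is multiplied by F, which
    predicts steeper voltage attenuation along spiny dendrites.
    """
    F = (shaft_area + spine_area) / shaft_area
    return r_m / F, c_m * F

# Hypothetical segment where spines add 50% extra membrane area
rm2, cm2 = spine_corrected_cable_params(r_m=20000.0, c_m=1.0,
                                        shaft_area=2000.0, spine_area=1000.0)
```

Because the length constant scales with the square root of membrane resistance, dividing `r_m` by F shortens the effective length constant, which is exactly the "voltage should attenuate more drastically in space" prediction stated above.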


http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3045830/bin/nihms273899f2.jpg

Analytical methods to quantify morphological structure. a xy projection of the montage of CLSM tiled image stacks from a wild-type mouse layer 3 frontal cortical pyramidal neuron. Scale bar 40 μm. b Tree structure from the data shown in (a), extracted using NeuronStudio. c Automatic spine detection and visualization in NeuronStudio. Left automatically detected spines overlaid on a maximum projection of a typical dataset; right the same data and spines volume-rendered in 3D. d Left 2D Rayburst sample used for diameter estimation. Rays cast using the sampling core are shown in orange, with the one chosen as the diameter shown in blue. The green line indicates the surface detected by the Rayburst samples. Right spine diameter estimation using a 2D Rayburst run at the center of mass (small green squares) of a single layer. The blue line indicates the resulting width of the structure as calculated by Rayburst, and provides an approximate length of 0.7 μm. e Optimized fits of scaling exponents (black fitted lines) and optimal scaling regions (gray shaded bands) for the apical dendrites of a typical layer 3 pyramidal cell projecting from superior temporal cortex to area 46 (see Kabaso et al. (2009) for details). f Contributions of branching and tapering exponents to global mass scaling (measured by total area exponent dA) in spine-corrected apical dendrites of long projection neurons of young rhesus monkeys, in the proximal and medial scaling regions (I, II in e). The total branching and tapering vary across the two scaling regions, yet the total area in each region is conserved [modified from Wearne et al. (2005) and Kabaso et al. (2009)]


http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3045830/bin/nihms273899f4.jpg

Dendritic spine loss in pyramidal neurons from Tg2576 and rTg4510 mice. CLSM images of dendritic segments with spines of neurons from wild-type (top) and transgenic (bottom) mice. Scale bars 10 μm [modified from Rocher et al. (2008, 2009)]

3D reconstruction and analytical approaches

Early methods for digitizing 3D neuronal structures relied on interactive manual tracing from a computer screen (Capowski 1985). These methods were time-consuming, subjective, and lacked precision. In recent years, automated methods have been developed that use image analysis algorithms to extract neuronal morphology directly from 3D microscopy and overcome the limitations of manual techniques (Koh et al. 2002; He et al. 2003; Wearne et al. 2005). Newer methods use pattern recognition routines to track or detect a structure locally without the need for global image operations (Al-Kofahi et al. 2002; Streekstra and van Pelt 2002; Schmitt et al. 2004; Myatt et al. 2006; Santamaria-Pang et al. 2007). Most are designed to work on a broad range of signal-to-noise ratios and even on multiple imaging modalities. This results in increased computational complexity, which makes the use of these methods as interactive reconstruction tools for high-resolution data less than optimal.

In our studies of neuronal structure, morphological reconstruction is performed using NeuronStudio, a neuron morphology reconstruction software tool (http://www.mssm.edu/cnic/tools.html) developed by Wearne et al. (2005) and Rodriguez et al. (2006, 2008, 2009). NeuronStudio has been designed for low computational complexity to allow interactive semi-automated extraction of neuronal morphology from medium- to high-quality de-convolved 3D CLSM and MPLSM image stacks of fluorescently labeled neurons. It features automated extraction of dendrites and dendritic spines as well as a rich set of manual editing and visualization modalities. Figure 2a shows an xy projection of CLSM data from a wild-type mouse layer 3 cortical pyramidal neuron and Fig. 2b shows the automated reconstruction of the neuron's dendritic arbor obtained using NeuronStudio. Because fluorescence intensity can vary with adequacy of filling, imaging depth, and xy spatial extent in CLSM and MPLSM image stacks, data segmentation within NeuronStudio adapts the iterative self-organizing data analysis (ISODATA) method (Ridler and Calvard 1978) to compute local thresholds dynamically. This method is appropriate for datasets exhibiting a bimodal distribution of intensity values, such as the grayscale images characteristic of de-convolved LSM image stacks.
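The ISODATA (Ridler–Calvard) thresholding step mentioned above is simple to sketch. This is a generic illustration of the algorithm on a bimodal intensity sample, not NeuronStudio's implementation, and the sample values are invented:

```python
def isodata_threshold(values, tol=0.5):
    """Ridler-Calvard / ISODATA threshold for a bimodal sample.

    Start from the global mean, split the sample at the current
    threshold, and move the threshold to the midpoint of the two
    class means; repeat until the threshold stabilizes.
    """
    t = sum(values) / len(values)
    while True:
        lo = [v for v in values if v <= t]
        hi = [v for v in values if v > t]
        if not lo or not hi:          # degenerate split: give up
            return t
        t_new = 0.5 * (sum(lo) / len(lo) + sum(hi) / len(hi))
        if abs(t_new - t) < tol:
            return t_new
        t = t_new

# Two well-separated intensity clusters (background ~10, signal ~200)
sample = [8, 10, 12, 9, 11, 198, 202, 200, 199, 201]
t = isodata_threshold(sample)
```

Applying the same procedure to local windows of an image stack, rather than to the whole volume at once, yields the kind of dynamically adapted local thresholds described in the text.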

Automated 3D reconstruction of dendritic spines

The assessment of spine numbers and distribution and their classification into subtypes has historically been a labor-intensive and relatively inaccurate process. Spine numbers could only be estimated because spines extending primarily in the z plane relative to the dendritic shaft could not be counted. With the advent of CLSM, accurate 3D spine assessment came a step closer, but was still a highly time-consuming and labor-intensive undertaking. Improving upon previous spine detection algorithms (Koh et al. 2002; Cheng et al. 2007), Rodriguez et al. (2008) devised an efficient and robust method for automated spine detection, available in NeuronStudio.

3D measures of spatial complexity

Traditionally, Sholl analysis (Sholl 1953) has been used in two dimensions to quantify the spatial complexity of dendritic branching patterns with increasing distance from the soma. Fractal analyses have also been used (Smith et al. 1989; Caserta et al. 1995; Jelinek and Elston 2001; Henry et al. 2002) to quantify spatial complexity as a power-law scaling exponent, describing the rate of change of the number of branches over a large portion of the dendrites, and best visualized as the slope of a log–log plot of these two quantities. We have recently extended this work (Rothnie et al. 2006; Kabaso et al. 2009) to include three power-law exponents describing global spatial complexity: rates of change of dendritic mass, branching, and taper.
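The intersection counting at the heart of Sholl analysis can be sketched directly. This is a minimal illustration on invented toy data, not the authors' analysis code; real analyses operate on full reconstructed arbors:

```python
import math

def sholl_counts(segments, soma, radii):
    """Basic Sholl analysis.

    For each concentric radius around the soma, count dendritic
    segments that cross the sphere of that radius, i.e. segments
    with one endpoint inside and one outside it.
    """
    def d(p):
        return math.dist(p, soma)   # Euclidean distance (Python 3.8+)
    return [sum(1 for a, b in segments if (d(a) - r) * (d(b) - r) < 0)
            for r in radii]

# Toy arbor: three straight segments around a soma at the origin
segments = [((1, 0, 0), (3, 0, 0)),
            ((0, 1, 0), (0, 4, 0)),
            ((2, 0, 0), (2.5, 0, 0))]
counts = sholl_counts(segments, soma=(0, 0, 0), radii=[0.5, 2.2, 3.5])
```

Fitting the slope of log(counts) against log(radii) over a chosen region then gives the kind of power-law scaling exponent the text describes.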

…..

In conclusion, these analytical tools provide rapid, objective analyses of the high-resolution data that we collect from wild-type and transgenic mouse neurons. With NeuronStudio, we are able to perform analyses of differences in dendrite diameter and spine shapes that were not available previously, and these kinds of studies are currently ongoing in our laboratories. Moving forward, we will evaluate whether the global patterns in spatial complexity observed in rhesus monkey pyramidal neurons are similarly present in mouse neurons. We will also compare the spatial complexity of wild-type and transgenic neurons to determine whether global mass homeostasis is conserved in neurodegeneration as it seems to be in aging.

Alterations in the structure of dendrites and spines of cortical pyramidal neurons in Tg2576 and rTg4510 mutant mice

Dendritic changes in neurodegenerative disease ….

Clearly, there is a need for many more studies on the functional electrophysiological consequences to individual neurons of the significant structural changes in neurodegenerative disease. Given the lack of such studies, and also technical considerations such as space clamp limitations and the impossibility of recording from distal dendrites or spines, the use of modeling methods to understand potential functional consequences of structural changes is very important.

Insights from modeling

For nearly 60 years, mathematical models have been used to investigate neuronal function. Hodgkin and Huxley’s (1952) mathematical model of action potential generation predicted the existence of ion channels, decades before ion channels were observed experimentally (Neher and Sakmann 1976). Other models were groundbreaking in describing how dendrites filter signals as passive electrical cables (Rall 1959; Goldstein and Rall 1974), but were limited in their ability to apply directly to realistic morphologic data. Since then, applications of mathematical techniques (Fitzhugh 1961; Nagumo et al. 1962; Rinzel and Ermentrout 1989), advances in computational software (Bower and Beeman 1998; Carnevale and Hines 2006), model reduction (Clements and Redman 1989; Pinsky and Rinzel 1994), and computing power have resulted in models that have great potential for yielding insights into neuronal function. In particular, models have shown that morphology is a critical determinant of neuronal firing properties (Zador et al. 1995; Mainen and Sejnowski 1996; Vetter et al. 2001; Schaefer et al. 2003; Stiefel and Sejnowski 2007). The effects of morphology are further amplified by the actions of ion channels distributed throughout the dendrites (for review see Johnston and Narayanan 2008), both of which shape patterns of synaptic input. Our ability to understand neuronal function depends largely on analysis of the nonlinear interactions between morphology, electrical membrane properties, and synaptic input.
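As an illustration of the reduced models cited above, the FitzHugh–Nagumo system compresses Hodgkin–Huxley dynamics into two variables yet still produces excitable spiking. The sketch below is a generic forward-Euler integration with textbook default parameters, not a model from this review:

```python
def fitzhugh_nagumo(I=0.5, a=0.7, b=0.8, eps=0.08, dt=0.01, steps=20000):
    """Forward-Euler integration of the FitzHugh-Nagumo equations.

    v is a fast voltage-like variable, w a slow recovery variable,
    and I a constant applied current; for this I the fixed point is
    unstable and the system settles onto a spiking limit cycle.
    """
    v, w = -1.0, 1.0
    vs = []
    for _ in range(steps):
        dv = v - v**3 / 3 - w + I
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        vs.append(v)
    return vs

trace = fitzhugh_nagumo()
```

Even this two-variable caricature shows why parameters interact nonlinearly: small changes in the slow-variable kinetics (`eps`, `b`) move the system between quiescent and repetitively firing regimes, the kind of sensitivity that full morphologically detailed models inherit.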

……..

Our modeling results make several predictions that may apply to transgenic mouse models of neurodegeneration. First, changes in dendrite length, diameter, and spine densities/numbers in transgenic neurons compared to wild-type neurons may significantly impact the attenuation of signals to and from the soma. This may happen over an entire neuron, or in limited regions, such as dendrites passing through or near fibrillar amyloid deposits, or in apical tufts that undergo atrophy. Long projection neurons are particularly vulnerable in AD (Hof et al. 1990; Bussière et al. 2003; Hof and Morrison 2004), giving greater weight to the significant differences in voltage attenuation observed in long projection neurons (Kabaso et al. 2009). Second, changes in spine density may affect the diffusion rate of intracellular messengers and ions. Elongated dendrites may result in less trapping of electrical and chemical signals within spines; this phenomenon may be functionally significant. We must also consider how spine loss might impact the amount of excitatory input that a neuron receives. Finally, it is likely that interactions between morphology and active parameters vary between wild-type and transgenic neurons. To study this more fully, we must measure both detailed morphological properties and ionic currents in wild-type and transgenic neurons. To evaluate the degree to which these predictions truly apply to the transgenic mouse models, we will apply the mathematical methods described here directly to the transgenic data that we have collected. These computational studies will likely lead to explicit predictions of which parameters to change, and by how much, to counteract morphological changes that affect physiological function in neurodegenerative disease.

Read Full Post »

Roche is developing a high-throughput low cost sequencer for NGS, How NGS Will Revolutionize Reproductive Diagnostics: November Meeting, Boston MA, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)

Roche is developing a high-throughput low cost sequencer for NGS

Reporter: Stephen J. Williams, PhD

 

Reported from Diagnostic World News

Long-Read Sequencing in the Age of Genomic Medicine

 

 

By Aaron Krol

December 16, 2015 | This September, Pacific Biosciences announced the creation of the Sequel, a DNA sequencer half the cost and seven times as powerful as its previous RS II instrument. PacBio, with its unique long-read sequencing technology, had already secured a place in high-end research labs, producing finished, highly accurate genomes and helping to explore the genetic “dark matter” that other next-generation sequencing (NGS) instruments miss. Now, in partnership with Roche Diagnostics, PacBio is repositioning itself as a company that can serve hospitals as well.

“Pseudogenes, large structural variants, validation, repeat disorders, polymorphic regions of the genome―all those are categories where you practically need PacBio,” says Bobby Sebra, Director of Technology Development at the Icahn School of Medicine at Mount Sinai. “Those are gaps in the system right now for short-read NGS.”

Mount Sinai’s genetic testing lab owns three RS II sequencers, running almost around the clock, and was the first lab to announce it had bought a Sequel just weeks after the new instruments were launched. (It arrived earlier this month and has been successfully tested.) Sebra’s group uses these sequencers to read parts of the genome that, thanks to their structural complexity, can only be assembled from long, continuous DNA reads.

There are a surprising number of these blind spots in the human genome. “HLA is a huge one,” Sebra says, referring to a highly variable region of the genome involved in the immune system. “It impacts everything from immune response, to pharmacogenomics, to transplant medicine. It’s a pretty important and really hard-to-genotype locus.”

Nonetheless, few clinical organizations are studying PacBio or other long-read technologies. PacBio’s instruments, even the Sequel, come with a relatively high price tag, and research on their value in treating patients is still tentative. Mount Sinai’s confidence in the technology is surely at least partly due to the influence of Sebra―an employee of PacBio for five years before coming to New York―and Genetics Department Chair Eric Schadt, at one time PacBio’s Chief Scientific Officer.

Even here, the sequencers typically can’t be used to help treat patients, as the instruments are sold for research use only. Mount Sinai is still working on a limited number of tests to submit as diagnostics to New York State regulators.

Physician Use

Roche Diagnostics, which invested $75 million in the development of the Sequel, wants to change that. The company is planning to release its own, modified version of the instrument in the second half of 2016, specifically for diagnostic use. Roche will initially promote the device for clinical studies, and eventually seek FDA clearance to sell it for routine diagnosis of patients.

In an email to Diagnostics World, Paul Schaffer, Lifecycle Leader for Roche’s sequencing platforms division, wrote that the new device will feature an integrated software pipeline to interpret test results, in support of assays that Roche will design and validate for clinical indications. The instrument will also have at least minor hardware modifications, like near field communication designed to track Roche-branded reagents used during sequencing.

This new version of the Sequel will probably not be the first instrument clinical labs turn to when they decide to start running NGS. Short-read sequencers are sure to outcompete the Roche machine on price, and can offer a pretty useful range of assays, from co-diagnostics in cancer to carrier testing for rare genetic diseases. But Roche can clear away some of the biggest barriers to entry for hospitals that want to pursue long-read sequencing.

Today, institutions like Mount Sinai that use PacBio typically have to write a lot of their own software to interpret the data that comes off the machines. Off-the-shelf analysis, with readable diagnostic reports for doctors, will make it easier for hospitals with less research focus to get on board. To this end, Roche acquired Bina, an NGS analysis company that handles structural variants and other PacBio specialties, in late 2014.

The next question will be whether Roche can design a suite of tests that clinical labs will want to run. Long-read sequencing is beloved by researchers because it can capture nearly complete genomes, finding the correct order and orientation of DNA reads. “The long-read technologies like PacBio’s are going to be, in the future, the showcase that ties it all together,” Sebra says. “You need those long reads as scaffolds to bring it together.”

But that envisions a future in which doctors will want to sequence their patients’ entire genomes. When it comes to specific medical tests, targeting just a small part of the genome connected to disease, Roche will have to content itself with some niche applications where PacBio stands out.

Early Applications

“At this time we are not releasing details regarding the specific assays under development,” Schaffer told Diagnostics World in his email. “However, virology and genetics are a key focus, as they align with other high-priority Roche Diagnostics products.”

Genetic disease is the obvious place to go with any sequencing technology. Rare hereditary disorders are much easier to understand on a genetic level than conditions like diabetes or heart disease; typically, the pathology can be traced back to a single mutation, making it easy to interpret test results.

Some of these mutations are simply intractable for short-read sequencers. A whole class of diseases, the PolyQ disorders and other repeat disorders, develop when a patient has too many copies of a single, repetitive sequence in a gene region. The gene Huntingtin, for example, contains a long stretch of the DNA code CAG; people born with 40 or more CAG repeats in a row will develop Huntington’s disease as they reach early adulthood.

These disorders would be a prime target for Roche's sequencer. The Sequel's long reads, spanning thousands of DNA letters at a stretch, can capture the entire repeat region of Huntingtin in a single pass, unlike short-read sequencers, which would tend to produce a garbled mess of CAG reads impossible to count or put in order.
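As a toy illustration of why read length matters here: once a single read spans the whole locus, counting the repeat tract is trivial. This sketch ignores sequencing errors and interrupting codons that real repeat-genotyping assays must handle, and the example read is invented:

```python
import re

def longest_cag_run(seq):
    """Length, in repeat units, of the longest uninterrupted CAG tract,
    as one could count directly on a long read spanning the locus."""
    runs = re.findall(r"(?:CAG)+", seq.upper())
    return max((len(r) // 3 for r in runs), default=0)

# Toy read: 12 CAG units flanked by other sequence (illustrative only;
# a fully penetrant pathogenic HTT allele carries 40 or more)
read = "GGTC" + "CAG" * 12 + "TTAG"
n = longest_cag_run(read)
```

With short reads, no individual read contains the full tract, so the count must be inferred indirectly; with a spanning long read, it is a direct measurement.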

Nonetheless, the length of reads is not the only obstacle to understanding these very obstinate diseases. “The entire category of PolyQ disorders, and Fragile X and Huntington’s, is really important,” says Sebra. “But to be frank, they’re the most challenging even with PacBio.” He suggests that, even without venturing into the darkest realms of the genome, a long-read sequencer might actually be useful for diagnosing many of the same genetic diseases routinely covered by other instruments.

That’s because, even when the gene region involved in a disease is well known, there’s rarely only one way for it to go awry. “An example of that is Gaucher’s disease, in a gene called GBA,” Sebra says. “In that gene, there are hundreds of known mutations, some of which you can absolutely genotype using short reads. But others, you would need to phase the entire block to really understand.” Long-read sequencing, which is better at distinguishing maternal from paternal DNA and highlighting complex rearrangements within a gene, can offer a more thorough look at diseases with many genetic permutations, especially when tracking inheritance through a family.

“You can think of long-read sequencing as a really nice way to supplement some of the inherited panels or carrier screening panels,” Sebra says. “You can also use PacBio to verify variants that are called with short-read sequencing.”

Virology is, perhaps, a more surprising focus for Roche. Diagnosing a viral (or bacterial, or fungal) infection with NGS only requires finding a DNA read unique to a particular species or strain, something short-read sequencers are perfectly capable of.

But Mount Sinai, which has used PacBio in pathogen surveillance projects, has seen advantages to getting the full, completely assembled genomes of the organisms it’s tracking. With bacteria, for instance, key genes that confer resistance to antibiotics might be found either in the native genome, or inside plasmids, small packets of DNA that different species of bacteria freely pass between each other. If your sequencer can assemble these plasmids in one piece, it’s easier to tell when there’s a risk of antibiotic resistance spreading through the hospital, jumping from one infectious species to another.

Viruses don’t share their genetic material so freely, but a similar logic can still apply to viral infections, even in a single person. “A virus is really a mixture of different quasi-species,” says Sebra, so a patient with HIV or influenza likely has a whole constellation of subtly different viruses circulating in their body. A test that assembles whole viral genomes—which, given their tiny size, PacBio can often do in a single read—could give physicians a more comprehensive view of what they’re dealing with, and highlight any quasi-species that affect the course of treatment or how the virus is likely to spread.

The Broader View

These applications are well suited to the diagnostic instrument Roche is building. A test panel for rare genetic diseases can offer clear-cut answers, pointing physicians to any specific variants linked to a disorder, and offering follow-up information on the evidence that backs up that call.

That kind of report fits well into the workflows of smaller hospital labs, and is relatively painless to submit to the FDA for approval. It doesn’t require geneticists to puzzle over ambiguous results. As Schaffer says of his company’s overall NGS efforts, “In the past two years, Roche has been actively engaged in more than 25 partnerships, collaborations and acquisitions with the goal of enabling us to achieve our vision of sample in to results out.”

But some of the biggest ways medicine could benefit from long-read sequencing will continue to require the personal touch of labs like Mount Sinai’s.

Take cancer, for example, a field in which complex gene fusions and genetic rearrangements have been studied for decades. Tumors contain multitudes of cells with unique patchworks of mutations, and while long-read sequencing can pick up structural variants that may play a role in prognosis and treatment, many of these variants are rarely seen, little documented, and hard to boil down into a physician-friendly answer.

An ideal way to unravel a unique cancer case would be to sequence the RNA molecules produced in the tumor, creating an atlas of the “transcriptome” that shows which genes are hyperactive, which are being silenced, and which have been fused together. “When you run something like IsoSeq on PacBio and you can see truly the whole transcriptome, you’re going to figure out all possible fusions, all possible splicing events, and the true atlas of reads,” says Sebra. “Cancer is so diverse that it’s important to do that on an individual level.”

Occasionally, looking at the whole transcriptome, and seeing how a mutation in one gene affects an entire network of related genes, can reveal an unexpected treatment option―repurposing a drug usually reserved for other cancer types. But that takes a level of attention and expertise that is hard to condense into a mass-market assay.

And, Sebra suggests, there’s another reason for medical centers not to lean too heavily on off-the-shelf tests from vendors like Roche.

Devoted as he is to his onetime employer, Sebra is also a fan of other technologies now emerging to capture some of the same long-range, structural information on the genome. “You’ve now got 10X Genomics, BioNano, and Oxford Nanopore,” he says. “Often, any two or even three of those technologies, when you merge them together, can get you a much more comprehensive story, sometimes faster and sometimes cheaper.” At Mount Sinai, for example, combining BioNano and PacBio data has produced a whole human genome much more comprehensive than either platform can achieve on its own.

The same is almost certainly true of complex cases like cancer. Yet, while companies like Roche might succeed in bringing NGS diagnostics to a much larger number of patients, they have few incentives to make their assays work with competing technologies the way a research-heavy institute like Mount Sinai does.

“It actually drives the commercialization of software packages against the ability to integrate the data,” Sebra says.

Still, he’s hopeful that the Sequel can lead the industry to pay more attention to long-read sequencing in the clinic. “The RS II does a great job of long-read sequencing, but the throughput for the Sequel is so much higher that you can start to achieve large genomes faster,” he says. “It makes it more accessible for people who don’t own the RS II to get going.” And while the need for highly specialized genetics labs won’t be falling off anytime soon, most patients don’t have the luxury of being treated in a hospital with the resources of Mount Sinai. NGS companies increasingly see physicians as some of their most important customers, and as our doctors start checking into the health of our genomes, it would be a shame if ubiquitous short-read sequencing left them with blind spots.

Source: http://diagnosticsworldnews.com/2015/12/16/long-read-sequencing-age-genomic-medicine.aspx

 

 

Read Full Post »

How Will FDA’s new precision FDA Science 2.0 Collaboration Platform Protect Data? Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 1: Next Generation Sequencing (NGS)

How Will FDA’s new precisionFDA Science 2.0 Collaboration Platform Protect Data?

Reporter: Stephen J. Williams, Ph.D.

As reported in MassDevice.com

FDA launches precisionFDA to harness the power of scientific collaboration

FDA VoiceBy: Taha A. Kass-Hout, M.D., M.S. and Elaine Johanson

Imagine a world where doctors have at their fingertips the information that allows them to individualize a diagnosis, treatment or even a cure for a person based on their genes. That’s what President Obama envisioned when he announced his Precision Medicine Initiative earlier this year. Today, with the launch of FDA’s precisionFDA web platform, we’re a step closer to achieving that vision.

PrecisionFDA is an online, cloud-based portal that will allow scientists from industry, academia, government and other partners to come together to foster innovation and develop the science behind a method of "reading" DNA known as next-generation sequencing (or NGS). Next-generation sequencing allows scientists to compile a vast amount of data on a person's exact order, or sequence, of DNA. Recognizing that each person's DNA is slightly different, scientists can look for meaningful differences in DNA that can be used to suggest a person's risk of disease, possible response to treatment and assess their current state of health. Ultimately, what we learn about these differences could be used to design a treatment tailored to a specific individual.

The precisionFDA platform is a part of this larger effort and through its use we want to help scientists work toward the most accurate and meaningful discoveries. precisionFDA users will have access to a number of important tools to help them do this. These tools include reference genomes, such as "Genome in a Bottle," a reference sample of DNA for validating human genome sequences developed by the National Institute of Standards and Technology. Users will also be able to compare their results to previously validated reference results as well as share their results with other users, track changes and obtain feedback.
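In its simplest form, the comparison described here, checking a call set against a validated truth set such as Genome in a Bottle, reduces to set arithmetic over variants. This sketch is only illustrative: real comparators must also normalize equivalent variant representations and handle confident-region boundaries, which are ignored below, and the variants shown are invented:

```python
def compare_to_truth(calls, truth):
    """Naive concordance of a variant call set against a truth set.

    Variants are (chrom, pos, ref, alt) tuples; exact tuple equality
    stands in for the matching logic a real comparison tool performs.
    """
    calls, truth = set(calls), set(truth)
    tp = len(calls & truth)     # called and in the truth set
    fp = len(calls - truth)     # called but absent from the truth set
    fn = len(truth - calls)     # in the truth set but missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return {"tp": tp, "fp": fp, "fn": fn,
            "precision": precision, "recall": recall}

truth = {("chr1", 101, "A", "G"), ("chr1", 250, "T", "C"),
         ("chr2", 77, "G", "A")}
calls = {("chr1", 101, "A", "G"), ("chr2", 77, "G", "A"),
         ("chr3", 5, "C", "T")}
m = compare_to_truth(calls, truth)
```

Reporting precision and recall against a shared reference sample is what lets different labs and pipelines compare accuracy on equal footing.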

Over the coming months we will engage users in improving the usability, openness and transparency of precisionFDA. One way we'll achieve that is by placing the code for the precisionFDA portal on the world's largest open source software repository, GitHub, so the community can further enhance precisionFDA's features. Through such collaboration we hope to improve the quality and accuracy of genomic tests – work that will ultimately benefit patients.

precisionFDA leverages our experience establishing openFDA, an online community that provides easy access to our public datasets. Since its launch in 2014, openFDA has already resulted in many novel ways to use, integrate and analyze FDA safety information. We’re confident that employing such a collaborative approach to DNA data will yield important advances in our understanding of this fast-growing scientific field, information that will ultimately be used to develop new diagnostics, treatments and even cures for patients.

Taha A. Kass-Hout, M.D., M.S., is FDA’s Chief Health Informatics Officer and Director of FDA’s Office of Health Informatics. Elaine Johanson is the precisionFDA Project Manager.

 

The opinions expressed in this blog post are the author’s only and do not necessarily reflect those of MassDevice.com or its employees.

So What Are the Other Successes With Such Open Science 2.0 Collaborative Networks?

In the following post there are highlighted examples of these Open Scientific Networks. As long as

  • transparency
  • equal contributions (lack of hierarchy)

exist, these networks can flourish and add interesting discourse. Scientists are already relying on these networks to collaborate and share; however, resistance by certain members of an “elite” can still exist. Social media platforms are now democratizing this new science 2.0 effort. In addition, the efforts of multiple biocurators (who mainly work for love of science) have organized the plethora of data (genomic, proteomic, and literature) in order to provide ease of access and analysis.

Science and Curation: The New Practice of Web 2.0

Curation: an Essential Practice to Manage “Open Science”

The web 2.0 gave birth to new practices motivated by the will to have broader and faster cooperation in a more free and transparent environment. We have entered the era of an “open” movement: “open data”, “open software”, etc. In science, expressions like “open access” (to scientific publications and research results) and “open science” are used more and more often.

Curation and Scientific and Technical Culture: Creating Hybrid Networks

Another area, where there are most likely fewer barriers, is scientific and technical culture. This broad term involves different actors such as associations, companies, universities’ communication departments, CCSTI (French centers for scientific, technical and industrial culture), journalists, etc. A number of these actors do not limit their work to popularizing the scientific data; they also consider they have an authentic mission of “culturing” science. The curation practice thus offers a better organization and visibility to the information. The sought-after benefits will be different from one actor to the next.

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson and others

  • Using Curation and Science 2.0 to build Trusted, Expert Networks of Scientists and Clinicians

Given the aforementioned problems of:

I. the complex and rapid deluge of scientific information,

II. the need for a collaborative, open environment to produce transformative innovation, and

III. the need for alternative ways to disseminate scientific findings,

CURATION MAY OFFER SOLUTIONS

I. Curation exists beyond the review: curation decreases the time needed to assess current trends, adding multiple insights and analyses WITH an underlying METHODOLOGY (discussed below), while NOT acting as mere reiteration or regurgitation.

II. Curation provides insights from the WHOLE scientific community on multiple WEB 2.0 platforms.

III. Curation makes use of new computational and Web-based tools to provide interoperability of data and reporting of findings (shown in Examples below).

Therefore, a discussion is given on methodologies, definitions of best practices, and tools developed to assist the content curation community in this endeavor, which has created a need for more context-driven scientific search and discourse.

However, another issue is individual bias: if these networks are closed, protocols need to be devised to reduce bias from individual investigators and clinicians. This is where CONSENSUS built from OPEN ACCESS DISCOURSE would be beneficial, as discussed in the following post:

Risk of Bias in Translational Science

As per the article

Risk of bias in translational medicine may take one of three forms:

  1. a systematic error of methodology as it pertains to measurement or sampling (e.g., selection bias),
  2. a systematic defect of design that leads to estimates of experimental and control groups, and of effect sizes that substantially deviate from true values (e.g., information bias), and
  3. a systematic distortion of the analytical process, which results in a misrepresentation of the data with consequential errors of inference (e.g., inferential bias).

This post highlights many important points related to bias, but in summary, methodologies and protocols can be devised to eliminate such bias. Risk of bias can seriously adulterate the internal and the external validity of a clinical study, and, unless it is identified and systematically evaluated, can seriously hamper the process of comparative effectiveness and efficacy research and analysis for practice. The Cochrane Group and the Agency for Healthcare Research and Quality have independently developed instruments for assessing the meta-construct of risk of bias. The present article begins to discuss this dialectic.

  • Information dissemination to all stakeholders is key to increasing their health literacy in order to ensure their full participation
  • threats to internal and external validity represent specific aspects of systematic errors (i.e., bias) in design, methodology and analysis

So what about the safety and privacy of Data?

A while back I did a post and some interviews on how doctors in developing countries are using social networks to communicate with patients, either over established networks like Facebook or more private in-house networks.  In addition, these doctor-patient relationships in developing countries are remote, using the smartphone to communicate with rural patients who don’t have ready access to their physicians.

Located in the post Can Mobile Health Apps Improve Oral-Chemotherapy Adherence? The Benefit of Gamification.

I discuss some of these problems in the following paragraph and associated posts below:

Mobile Health Applications on Rise in Developing World: Worldwide Opportunity

According to International Telecommunication Union (ITU) statistics, world-wide mobile phone use has expanded tremendously in the past 5 years, reaching almost 6 billion subscriptions. By the end of this year it is estimated that over 95% of the world’s population will have access to mobile phones/devices, including smartphones.

This presents a tremendous and cost-effective opportunity in developing countries, and especially rural areas, for physicians to reach patients using mHealth platforms.

How Social Media, Mobile Are Playing a Bigger Part in Healthcare

E-Medical Records Get A Mobile, Open-Sourced Overhaul By White House Health Design Challenge Winners

In summary, although there are restrictions here in the US governing what information can be disseminated over social media networks, developing countries appear to have less strictly defined regulations, as they are more dependent on these types of social networks given the difficulties in patient–physician access.

Therefore the question will be Who Will Protect The Data?

For some interesting discourse please see the following post

Atul Butte Talks on Big Data, Open Data and Clinical Trials

 

Read Full Post »

Atrial Fibrillation Surgery Market worth $1.73 Billion by 2020

Reporter: Aviva Lev-Ari, PhD, RN

 

The report is titled “Atrial Fibrillation Surgery Market by Procedure (Catheter Ablation, Surgical Ablation/Maze), Product (Catheter Ablation, Surgical Ablation), and Geography (Canada, China, France, Germany, India, Japan, U.S., U.K.) – Global Forecast to 2020.”

 

About 4.4% of the diagnosed atrial fibrillation population is treated using ablation technology in the U.S., Europe, and Japan combined. The atrial fibrillation population treated with ablation is expected to grow at a double-digit rate in these regions. The increasing prevalence of atrial fibrillation and advancements in ablation technologies drive the use of ablation products in the treatment of atrial fibrillation.

The European Society of Cardiology guidelines recommend use of surgical and catheter ablation procedures for the treatment of atrial fibrillation, if anti-arrhythmic drug therapy results are inadequate. Further, guidelines recommend using surgical ablation for the treatment of patients failing catheter ablation.

This market is segmented, on the basis of surgical procedures, into catheter ablation and surgical ablation. Catheter ablation procedures hold the maximum share in this market and are expected to grow at the highest CAGR during the forecast period of 2015 to 2020.

The geographical segmentation includes North America, Europe, Asia-Pacific and RoW. North America holds the largest share in this market, both in terms of number of procedures and revenue. However, Asia-Pacific is expected to witness the highest growth rate owing to the presence of a large patient population, growing medical tourism, and high focus of global market players on emerging Asia-Pacific countries such as China and India.

Major Players in AF Surgery Market are

Biosense Webster, Inc.
St. Jude Medical, Inc.
Medtronic, Inc.
Boston Scientific Corporation
AtriCure, Inc.
Cardiofocus, Inc.

SOURCE

MarketsandMarkets newsletter, “Atrial Fibrillation Surgery Market worth $1.73 Billion by 2020,” Tuesday, December 15, 2015.

Read Full Post »

Translational Gene Editing – June 16-17, 2016 in Boston, MA by CHI, Westin Boston Waterfront, Boston, MA, Volume 2 (Volume Two: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS and BioInformatics, Simulations and the Genome Ontology), Part 2: CRISPR for Gene Editing and DNA Repair


Reporter: Aviva Lev-Ari, PhD, RN

 

Gene editing, particularly using the CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)/Cas9 system, is now being extensively used as a research and functional screening tool in drug discovery. Cambridge Healthtech Institute’s second annual conference on Translational Gene Editing will bring together experts from all aspects of preclinical research, from early target discovery to drug delivery, to talk about the progress being made in gene editing and how it’s being applied. Learn about ways in which CRISPR/Cas9 is being used to identify targets, create relevant cell lines and in vivo disease models, set up functional screens, and for targeted drug delivery. What can you do to overcome some of the inherent challenges with design, delivery and off-target effects associated with CRISPR/Cas9? Hear from experts in pharma/biotech, academic and government labs who will share their experiences leveraging the utility of gene editing for diverse applications, particularly in oncology and immunotherapy.

PRELIMINARY AGENDA

Gene Editing for Screening Pathways and Drug Targets

CRISPR-Cas9 Whole Genome Screening: Going Where No Screen Has Gone Before
Ralph Garippa, Ph.D., Director, RNAi Core Facility, Sloan-Kettering Institute, Memorial Sloan-Kettering Cancer Center

Parallel shRNA and CRISPR/Cas9 Screens Reveal Biology of Stress Pathways and Identify Novel Drug Targets

Michael Bassik, Ph.D., Assistant Professor, Department of Genetics, Stanford University

Cross-Species Synthetic Lethal Screens and Applications to Drug Discovery

Norbert Perrimon, Ph.D., Professor, Department of Genetics, Harvard Medical School and Investigator, Howard Hughes Medical Institute

Scouring the Non-Coding Genome by Saturating Edits

Daniel E. Bauer, M.D., Ph.D., Assistant Professor of Pediatrics, Harvard Medical School; Staff Physician in Pediatric Hematology/Oncology, Boston Children’s Hospital and Dana-Farber Cancer Institute; Principal Faculty, Harvard Stem Cell Institute

Building the CRISPR Toolbox

Optimized sgRNA Libraries for Genetic Screens With CRISPR-Cas9

John Doench, Ph.D., Associate Director, Genetic Perturbation Platform, Broad Institute of Harvard and MIT

Beyond Cas9: Discovering Single Effector CRISPR Tools

Jonathan Gootenberg, Member, Laboratories of Dr. Aviv Regev and Dr. Feng Zhang, Department of Systems Biology, Harvard Medical School, and Broad Institute of Harvard and MIT

CRISPR-Cas9 Genome Editing Improves Sub-cellular Localization Studies

Netanya Y. Spencer, M.D., Ph.D., Research Fellow in Medicine, Joslin Diabetes Center, Harvard Medical School

Improving Precision and Delivery

Nucleic Acid Delivery Systems for RNA Therapy and Gene Editing

Daniel G. Anderson, Ph.D., Professor, Department of Chemical Engineering, Institute for Medical Engineering & Science, Harvard-MIT Division of Health Sciences & Technology and David H. Koch Institute for Integrative Cancer Research, Massachusetts Institute of Technology

Understanding and Translating In vivo Findings

Application of Genome Editing Tools to Model Human Genetics Findings in Drug Discovery

Myung Shin, Ph.D., Principal Scientist, Biology-Discovery, Genetics and Pharmacogenomics, Merck Research Laboratories

Translating CRISPR/Cas9 Into Novel Medicines

Alexandra Glucksmann, Ph.D., COO, Editas Medicine

Recommended Short Course*

SC8: A Primer to Gene Editing: Tools and Application – Learn More

Instructors:

-John Doench, Ph.D., Associate Director, Genetic Perturbation Platform, Broad Institute of Harvard and MIT

-Michael Bassik, Ph.D., Assistant Professor, Department of Genetics, Stanford University

-Stephanie Mohr, Ph.D., Lecturer, Genetics & Director of the Drosophila RNAi Screening Center, Harvard Medical School

-Claire Yanhui Hu, Ph.D., Senior Bioinformatician, Drosophila RNAi Screening Center, Department of Genetics, Harvard Medical School

The course will help the novice understand the basics of how gene editing works, what tools are available for use and how those tools differ from each other. For the expert, this course will offer details on the CRISPR technology, how to set up CRISPR-based screens and complement it with existing RNAi-based screens using proper analysis and follow-up studies. The instructors will also cover the use of gene editing in drug discovery and disease modeling and best practices for design and workflows when working with other model systems, besides mammalian cells.

*Separate Registration Required

For questions or suggestions about the meeting, please contact:

Tanuja Koppal, Ph.D.

Conference Director

Cambridge Healthtech Institute

T: (+1) 973-525-4667

E: tkoppal@healthtech.com

For sponsorship and exhibit sales information including sponsored podium presentations, contact:

Joseph Vacca

Associate Director, Business Development

Cambridge Healthtech Institute

T: (+1) 781-972-5431

E: jvacca@healthtech.com

For more information visit

WorldPreclinicalCongress.com/Gene-Editing

Read Full Post »

Are CXCR4 Antagonists Making a Comeback in Cancer Chemotherapy?

Reporter: Stephen J. Williams, Ph.D.

BioSpace News reported that Massachusetts-based X4 Pharmaceuticals is using $37.5 million to launch two clinical trials of its CXCR4 inhibitor X4P-001 in refractory clear cell renal cell carcinoma and refractory epithelial ovarian cancer.

The full report is below:

X4 Pharma Uses $37.5 Million to Push Cancer Therapies into Human Trials

 

December 14, 2015
By Alex Keown, BioSpace.com Breaking News Staff

CAMBRIDGE, Mass. – Massachusetts-based X4 Pharmaceuticals is beginning human trials for its oncology program using CXCR4 inhibitors, the Boston Business Journal reported this morning.

After spending years in stealth mode, the company, helmed by former Genzyme executives, is launching two clinical studies initiating in 2016 in refractory clear cell renal cell carcinoma and refractory epithelial ovarian cancer with its lead drug candidate, X4P-001.

The company’s pipeline is based on drug compounds that originate from a portfolio of oral CXCR4 inhibitors exclusively licensed from Sanofi (SNY), X4 said in a statement. Inhibition of CXCR4, a receptor over-expressed in many cancers, is designed to block non-cancerous immuno-suppressive and pro-angiogenic cells from populating the tumor, thereby disrupting the cancer microenvironment and restoring normal immune surveillance functions. The novel mechanism of CXCR4 inhibition increases the ability of T-Cells to track and destroy cancer. X4 is leveraging its CXCR4 research against past experience working with Genzyme’s plerixafor, which is also a CXCR4 blocker.

In an interview with the Journal, Paula Ragan, X4’s chief executive officer, said the CXCR4 protein “acts as a beacon to attract cells to surround a tumor, effectively hiding the tumor from the body’s T cells that would otherwise destroy them.” Developing a therapy to block the protein will prevent the tumors from hiding and allow it to be treated.

Ragan said the Phase Ia trial for X4P-001 will test safety and dosage in a small trial of about 20 people, the Journal reported. If all goes well the company would start a Phase 2a trial by the end of 2016 in around 50 or 60 patients, the Journal said.

In September, Ragan said a CXCR4 antagonist could potentially be paired with promising oncology drugs like Merck & Co. (MRK)’s Keytruda or Bristol-Myers Squibb (BMY)’s Opdivo. Keytruda has been shown to be effective in treating patients with three types of cancer: melanoma, lung cancer and mesothelioma. Opdivo is a treatment for patients with metastatic squamous non-small cell lung cancer (NSCLC) with progression on or after platinum-based chemotherapy.

Cytokines and cytokine receptors have been investigated before for their utility as chemotherapeutic targets, as they are highly expressed on tumors and promote metastasis. The general idea was that tumor cells secrete cytokines which promote their growth and metastases and attract immune cells, which also secrete growth-promoting cytokines. Many tumor types have shown increased expression of these cytokines and cytokine receptors. However, although some development efforts have shown promise, there have been no approved drugs in this class. As written in a previous post (Tumor Associated Macrophages: The Double-Edged Sword Resolved?), there could be many biological reasons for this, as well as difficulties in interpreting preclinical results in immunocompromised mice.

CXCR4: from the review by Ori Wald, Oz M. Shapira, and Uzi Izhar

 

Source: Wald O, Shapira OM, Izhar U. CXCR4/CXCL12 Axis in Non Small Cell Lung Cancer (NSCLC) Pathologic Roles and Therapeutic Potential. Theranostics 2013; 3(1):26-33. doi:10.7150/thno.4922. Available from http://www.thno.org/v03p0026.htm

 

Chemokines, a family of 48 chemotactic cytokines, interact with their 7-transmembrane G-protein-coupled receptors to guide immune cell trafficking in the body under both physiologic and pathologic conditions (20, 21). Tumor cells, which express a relatively restricted repertoire of chemokines and chemokine receptors, utilize and manipulate the chemokine system in a manner that benefits both local tumor growth and distant dissemination (20, 22, 23). In the tumor microenvironment, autocrine and paracrine chemokine/chemokine receptor loops interact to promote tumor cell survival and growth, and also to enhance tumor neo-angiogenesis (20, 22, 23). At distant sites, it is the tissue-produced chemokine which guides/attracts the metastasis of chemokine receptor-expressing tumor cells (20).

Among the 19 chemokine receptors, CXCR4 is the receptor most widely expressed by malignant tumors and whose role in tumor biology is most thoroughly studied (20). The chemokine CXCL12 is the sole ligand of CXCR4, and the majority of research that focuses on the role of CXCR4 in cancer relates to this chemokine/chemokine receptor pair (24, 25). Nevertheless, in 2006 another receptor for CXCL12 was identified and named CXCR7 (26). CXCR7 is expressed during embryogenesis, angiogenesis, and in various malignant tissues including NSCLC. CXCR7 is thought to act in part as a scavenger of CXCL12; however, additional functions for this receptor have also been reported (26-28). In distinction from CXCR4, CXCR7 binds not only CXCL12 but also the chemokine CXCL11 (26, 27). Moreover, the signaling cascades that are generated upon binding of CXCL12 to CXCR4 or CXCR7 vary, at least partly, depending on which of the receptors is engaged (26, 27). This review focuses mainly on data collected regarding the expression and function of CXCR4 in NSCLC; nevertheless, it is important to keep in mind that whenever CXCL12 is mentioned, the effects related to its expression may be attributed in part to CXCR7 expression and function.

Relative to normal cells in the tumor’s tissue of origin, malignant cells often overexpress CXCR4; this phenotype can be induced by multiple oncogenic alterations and appears to promote tumor cell survival, proliferation, invasion and metastasis (20, 29-35).


Potential roles for CXCR4/CXCL12 in NSCLC. NSCLC tumor cells express CXCR4 and produce CXCL12. Tumor-expressed CXCR4 guides metastatic spread to sites such as the brain, bone marrow and liver that express high levels of CXCL12. In addition, CXCR4/CXCL12 interactions act locally in autocrine and paracrine manners to enhance primary tumor growth and to alter its inflammatory milieu. Tumor- and tumor microenvironment-secreted CXCL12 enhances tumor cell survival and growth and may also guide trafficking of immune and bone marrow-derived cells into the tumor microenvironment. Furthermore, alterations in the tumor microenvironment result from the stimulation of tumor cells with CXCL12, which in turn enhances the production of additional chemokines, such as the pro-inflammatory and pro-proliferative chemokine (CCL20) and the pro-angiogenic and pro-proliferative chemokines (CXCL1, IL-8). Figure from Wald O, Shapira OM, Izhar U. CXCR4/CXCL12 Axis in Non Small Cell Lung Cancer (NSCLC) Pathologic Roles and Therapeutic Potential. Theranostics 2013; 3(1):26-33. doi:10.7150/thno.4922. Available from http://www.thno.org/v03p0026.htm

For further reference on CXCR4 and development of CXCR4 inhibitors please see the following references:

  1. Peled A, Wald O, Burger J. Development of novel CXCR4-based therapeutics. Expert Opin Investig Drugs. 2012 Mar;21(3):341-53.
  2. Balkwill FR. The chemokine system and cancer. J Pathol. 2012 Jan;226(2):148-57.
  3. Burger JA, Kipps TJ. CXCR4: a key receptor in the crosstalk between tumor cells and their microenvironment. Blood. 2006 Mar 1;107(5):1761-7.
  4. Burger JA, Peled A. CXCR4 antagonists: targeting the microenvironment in leukemia and other cancers. Leukemia. 2009 Jan;23(1):43-52.
  5. Otsuka S, Bebb G. The CXCR4/SDF-1 chemokine receptor axis: a new target therapeutic for non-small cell lung cancer. J Thorac Oncol. 2008 Dec;3(12):1379-83.

 

Current CXCR4 inhibitors in development


Figure. Structures of Representative Small Molecule CXCR4 Antagonists and CXCL12 Inhibitors. From JJ Ziarek et al. Fragment-Based Optimization of Small Molecule CXCL12 Inhibitors for Antagonizing the CXCL12/CXCR4 Interaction. Curr Top Med Chem. 2012; 12(24): 2727–2740.

 

A Phase 1 Trial of LY2510924, a CXCR4 Peptide Antagonist, in Patients with Advanced Cancer

This manuscript reports the results of a phase I study designed to evaluate the safety and tolerability of the C-X-C motif receptor 4 (CXCR4) inhibitor LY2510924 in patients with advanced cancer. LY2510924 is a peptide antagonist, which blocks stromal cell-derived factor-1 (SDF-1) from CXCR4 binding. CXCR4 is often overexpressed in many cancers and involved in the metastasis of solid tumors. LY2510924 was tolerated with mostly Grade 1/2 adverse events, revealed favorable pharmacokinetics, and demonstrated evidence of target engagement as indicated by dose dependent increases in CD34+ cells.

Anti-CXCR4 (BMS-936564) Alone and in Combination With Lenalidomide/Dexamethasone or Bortezomib/Dexamethasone in Relapsed/Refractory Multiple Myeloma

The purpose of this study is to determine 1) the safety and tolerability of multiple intravenous doses of anti-CXCR4 (BMS-936564) as monotherapy and as combination, and 2) the maximum tolerated dose (MTD) of BMS-936564 in combination with Lenalidomide/Dexamethasone or Bortezomib/Dexamethasone in subjects with relapsed or refractory multiple myeloma.

 

Novel CXCR4 Antagonist BL-8040 Enters Clinical Testing for CML – See more at: http://www.cancernetwork.com/chronic-myeloid-leukemia/novel-cxcr4-antagonist-enters-clinical-testing-cml

 

Other posts on this Open Access Journal on CXCR4 and Chemokines in Cancer Include

Assessing effects of antimetastatic treatment

Understanding the Stem Cell Niche: A Webinar by The Scientist

Protein regulator of HIV replication

Immunotherapy in Cancer: A Series of Twelve Articles in the Frontier of Oncology by Larry H Bernstein, MD, FCAP

Humanized Mice May Revolutionize Cancer Drug Discovery

Tumor Associated Macrophages: The Double-Edged Sword Resolved?

Read Full Post »

Assessing effects of antimetastatic treatment

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Combining Kinetic Ligand Binding and 3D Tumor Invasion Technologies to Assess Drug Residence Time and Anti-metastatic Effects of CXCR4 Inhibitors

Application Note 3D Cell Culture, ADME/Tox, Cell Imaging, Cell-Based Assays
BioTek Instruments, Inc. P.O. Box 998, Highland Park, Winooski, Vermont 05404-0998
Brad Larson and Leonie Rieger, BioTek Instruments, Inc., Winooski, VT
Nicolas Pierre, Cisbio US, Inc., Bedford, MA
Hilary Sherman, Corning Incorporated, Life Sciences, Kennebunk, ME

http://vertassets.blob.core.windows.net/download/ba9da411/ba9da411-a56c-42d3-a1a0-8c128224947f/cisbio_residence_time_app_note_final.pdf

 

Metastasis, the spread of cancer cells from the original tumor to secondary locations within the body, is linked to approximately 90% of cancer deaths1. The expression of chemokine receptors, such as CXCR4 and CCR7, is tightly correlated with the metastatic properties of breast cancer cells. In vivo, neutralizing the interaction of CXCR4 and its known ligand, SDF1-α (CXCL12), significantly impaired the metastasis of breast cancer cells and cell migration2. Traditionally, the discovery of novel agents has been guided by the affinity of the ligand for the receptor under equilibrium conditions, largely ignoring the kinetic aspects of the ligand-receptor interaction. However, awareness of the importance of binding kinetics has started to increase due to accumulating evidence3, 4, 5, 6 suggesting that the in vivo effectiveness of ligands may be attributed to the time a particular ligand binds to its receptor (drug-target residence time).

Similarly, appropriate in vitro cell models have also been lacking to accurately assess the ability of novel therapies to inhibit tumor invasion. Tumors in vivo exist as a three-dimensional (3D) mass of multiple cell types, including cancer and stromal cells7. Therefore, incorporating a 3D spheroid-type cellular structure that includes co-cultured cell types forming a tumoroid provides a more predictive model than the use of individual cancer cells cultured on the bottom of a well in a traditional two-dimensional (2D) format.

Here we examine the drug-target residence time of various CXCR4 inhibitors using a direct, homogeneous ligand binding assay and a CXCR4-expressing cell line in a kinetic format. This inhibitor panel was further tested in a 3D tumor invasion assay to determine whether there is a correlation between a molecule’s CXCR4 residence time and inhibition of the phenotypic effect of tumor invasion. MDA-MB-231 breast adenocarcinoma cells, known to be invasive and to metastasize to lung from primary mammary fat pad tumors8, were included, in addition to primary human dermal fibroblasts. Cellular analysis algorithms provided accurate quantification of changes to the original tumoroid structure, as well as invadopodia development. The combination presents an accurate, yet easy-to-use method to assess target-based and phenotypic effects of new, potential anti-metastatic drugs.

……

Cytation™ 5 Cell Imaging Multi-Mode Reader

Cytation 5 is a modular multi-mode microplate reader that combines automated digital microscopy and microplate detection. Cytation 5 includes filter- and monochromator-based microplate reading; the microscopy module provides high-resolution microscopy in fluorescence, brightfield, color brightfield and phase contrast. With special emphasis on live-cell assays, Cytation 5 features temperature control to 65 °C, CO2/O2 gas control and dual injectors for kinetic assays. Shaking and Gen5 software are also standard. The instrument was used to image spheroids, as well as individual cell invasion through the Matrigel matrix.

Tag-lite® Receptor Ligand Binding Assay

Figure 1. Tag-lite® Receptor Ligand Binding Assay Procedure. The Tag-lite CXCR4 assay relies on a fully functional SNAP-tag fused CXCR4 receptor and fluorescently labeled ligand SDF1-α. Being homogeneous, the binding assay allows for binding events to be precisely recorded in time. The assay can be used to derive the kinetic binding parameters of unlabeled compounds by application of the Motulsky and Mahan equations.

……

Results and Discussion

Drug-Target Residence Time Determination

Association Kinetics of SDF1-α-d2 Labeled Ligand

The final drug-target residence time value takes into account the observed on and off rates of the unlabeled inhibitors as well as the labeled SDF1-α-d2 ligand, and is computed by incorporation of the Motulsky and Mahan equation9. The first step in calculating the final value was to perform an association binding experiment using a concentration range of 0-100 nM of the d2 acceptor fluorophore-labeled ligand. Binding was monitored kinetically over a period of 40 minutes.

Figure 2. Association binding graph of SDF1-α-d2. Observed associative binding curves calculated from HTRF ratios of wells containing SDF1-α-d2 ligand concentrations ranging from 0-100 nM. Non-specific binding values subtracted from total ratios to determine observed specific binding.

Binding increases over time until it plateaus after several minutes (Figure 2). The plateau in an association experiment depends on the concentration of labeled SDF1-α used; higher plateaus will be obtained with higher concentrations. Fitting of the curves with GraphPad Prism yields the observed association rate value (kobs) for each concentration tested.

The Kd value of the labeled ligand was also determined by plotting the HTRF ratios generated after a binding equilibrium was reached with the different concentrations of ligand tested.

Figure 3. SDF1-α-d2 saturation binding curve. HTRF ratios generated upon the achievement of binding equilibrium of tested [SDF1-α-d2].

In a saturation binding experiment, increasing concentrations of labeled SDF1-α result in increased binding. Saturation is obtained when no further binding can be recorded. The ligand concentration that binds to half the receptor sites at equilibrium or Kd was 29 nM.
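The saturation-binding fit above can be reproduced with any nonlinear least-squares tool. Below is a minimal sketch using SciPy's `curve_fit` on synthetic data generated around the reported Kd of 29 nM; the concentration series, Bmax, and noise level are illustrative assumptions, not the app note's raw HTRF readings.

```python
import numpy as np
from scipy.optimize import curve_fit

# One-site specific binding model: Y = Bmax * [L] / (Kd + [L])
def one_site(L, Bmax, Kd):
    return Bmax * L / (Kd + L)

# Synthetic saturation data (ligand in nM vs. arbitrary HTRF ratio units);
# true parameters chosen to mimic the reported Kd = 29 nM.
true_Bmax, true_Kd = 1000.0, 29.0
L = np.array([0.5, 1.0, 2.5, 5.0, 10.0, 25.0, 50.0, 100.0])
rng = np.random.default_rng(0)
Y = one_site(L, true_Bmax, true_Kd) * (1 + 0.02 * rng.standard_normal(L.size))

(Bmax_fit, Kd_fit), _ = curve_fit(one_site, L, Y, p0=[800.0, 20.0])
print(f"Kd ~ {Kd_fit:.1f} nM, Bmax ~ {Bmax_fit:.0f}")
```

At [L] = Kd the model predicts half of Bmax, which is the operational definition of Kd used in the text.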

An assessment of whether the labeled SDF1-α ligand follows the Law of Mass Action can also be carried out. If the system does follow the Law of Mass Action, then kobs increases linearly with increasing concentrations of SDF1-α.

Due to the linear shape of the curve, and an R2 value >0.9, the Law of Mass Action was confirmed for the labeled SDF1-α ligand. This allowed for the use of GraphPad Prism software to derive association and dissociation rate constants from the linear regression line. The rate constant values experimentally found or mathematically derived are summarized in Table 1. kon and koff for SDF1-α-d2 were 0.001 nM-1 s-1 and 0.04 s-1, respectively.

Table 1. SDF1-α-d2 Kinetic Binding Characterization
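The linear-regression step described above can be sketched directly: under the Law of Mass Action, kobs = kon·[L] + koff, so a straight-line fit of kobs against ligand concentration yields kon as the slope and koff as the intercept. The kobs values below are synthetic, back-generated from the reported constants (kon = 0.001 nM-1 s-1, koff = 0.04 s-1) rather than taken from the app note's curve fits; the concentration range is an assumption.

```python
import numpy as np

# Law of Mass Action: kobs = kon * [L] + koff
kon_true, koff_true = 0.001, 0.04                  # nM^-1 s^-1, s^-1 (reported)
conc = np.array([6.25, 12.5, 25.0, 50.0, 100.0])   # nM, illustrative range
kobs = kon_true * conc + koff_true                 # synthetic "observed" rates

# Linear fit: slope estimates kon, intercept estimates koff
kon_fit, koff_fit = np.polyfit(conc, kobs, 1)
print(f"kon ~ {kon_fit:.4f} nM^-1 s^-1, koff ~ {koff_fit:.3f} s^-1")
```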

Kinetics of Competitive Binding

In the theory developed by Motulsky and Mahan, an unlabeled competitor is co-incubated with a labeled ligand during a kinetic association experiment. Here, a single concentration of the SDF1-α-d2 ligand, 25 nM, was co-incubated with multiple concentrations of the unlabeled SDF1-α competitors in the presence of the CXCR4-expressing cells. Kinetic binding of the labeled ligand was then monitored over time.

Figure 5. Kinetics of Competitive Binding. Plot of specific binding HTRF ratios over time for the SDF1-α-d2 ligand when in the presence of 100, 10, or 1 nM concentrations of (A.) AMD 3100, (B.) AMD 3465, or (C.) IT1t.
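The Motulsky and Mahan competition-kinetics model behind these curve shapes has a closed-form solution (the same equation popularized in GraphPad Prism's "kinetics of competitive binding" analysis). The sketch below implements it and shows that a slowly dissociating competitor produces the "overshoot" described in the text; the labeled ligand's kon/koff are the values reported above, while the competitor rate constants and concentrations are hypothetical.

```python
import numpy as np

def competitive_binding(t, k1, k2, k3, k4, L, I, Bmax=1.0):
    """Specific binding of a labeled ligand over time in the presence of an
    unlabeled competitor (Motulsky & Mahan, 1984).
    k1/k2: on/off rates of the labeled ligand; k3/k4: of the competitor.
    L, I: ligand and competitor concentrations (units consistent with k1, k3)."""
    KA = k1 * L + k2
    KB = k3 * I + k4
    S = np.sqrt((KA - KB) ** 2 + 4 * k1 * k3 * L * I)
    KF, KS = 0.5 * (KA + KB + S), 0.5 * (KA + KB - S)
    Q = Bmax * k1 * L / (KF - KS)
    return Q * (k4 * (KF - KS) / (KF * KS)
                + (k4 - KF) / KF * np.exp(-KF * t)
                - (k4 - KS) / KS * np.exp(-KS * t))

t = np.linspace(0.0, 2400.0, 500)   # 40 min, in seconds
# Labeled ligand: reported kon = 0.001 nM^-1 s^-1, koff = 0.04 s^-1, at 25 nM.
# Competitor on-rates/concentrations below are hypothetical.
fast = competitive_binding(t, 0.001, 0.04, 0.01, 0.05,   L=25, I=10)  # short residence
slow = competitive_binding(t, 0.001, 0.04, 0.01, 0.0005, L=25, I=10)  # long residence
print(f"fast competitor: monotonic rise to {fast[-1]:.3f}")
print(f"slow competitor: peak {slow.max():.3f}, then decline to {slow[-1]:.4f}")
```

When the competitor's residence time exceeds the labeled ligand's, binding overshoots and then decays to a lower equilibrium, exactly the biphasic shape discussed below for Figure 5B and C.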

From the curve fitting of the observed SDF1-α-d2 kinetic binding, and incorporation of the Law of Mass Action linear regression line, koff (min-1) values were then calculated. Final residence time (R) values could then be determined using the following formula:

R = 1/koff

Therefore, molecules having a lower koff rate reside at the target receptor for longer periods of time.

Table 2. SDF1-α Competitor Dissociation Rate and Residence Time Values.
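Converting dissociation rates into residence times is a one-line calculation. The sketch below applies R = 1/koff; only the ligand's koff (0.04 s-1) comes from the text above, while the two competitor values are hypothetical stand-ins rather than Table 2's numbers.

```python
# Residence time R = 1/koff: halving the dissociation rate doubles
# the time a molecule stays bound to its receptor.
koff_per_s = {
    "SDF1-a-d2 (reported)":        0.04,    # s^-1, from the text above
    "competitor A (hypothetical)": 0.004,   # 10x slower dissociation
    "competitor B (hypothetical)": 0.10,    # faster dissociation
}
residence = {name: 1.0 / k for name, k in koff_per_s.items()}
for name, r in residence.items():
    print(f"{name}: R = {r:6.1f} s ({r / 60:.3f} min)")
```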

 

From the shape of the curves in Figure 5, and a comparison of the residence time values generated for the labeled ligand and unlabeled competitors (Table 2), qualitative and quantitative assumptions regarding the various competitors can then be made. First, if the competitor dissociates faster from its target than the ligand (smaller R value), such as is seen with AMD 3100 (Figure 5A), the specific binding of the ligand will slowly and monotonically approach its equilibrium in time. However, when the competitor dissociates slower (larger R value), the association curve of the ligand consists of two phases, starting with a typical “overshoot” and then a decline until a new equilibrium is reached. Competitors whose residence times are greater than that of the SDF1-α-d2 ligand, such as AMD 3465 and IT1t (Figure 5B and C), may then exhibit a stronger inhibitory response when used in the confirmatory phenotypic 3D tumor invasion assay.

Interruption of Invasion via SDF1-α Ligand Binding Inhibition

As stated previously, interruption of the interaction between CXCR4 and its known ligand, SDF1-α, impairs metastasis of breast cancer and cell migration2. Therefore, a phenotypic assessment of the CXCR4 inhibitor panel was then performed to determine whether changes in the level of tumor migration could be detected, and more importantly, whether compounds exhibiting longer residence times compared to SDF1-α-d2 exhibited a higher inhibitory effect on migration through the 3D matrix. MDA-MB-231 breast adenocarcinoma cells, co-cultured with human dermal fibroblasts, were used as the in vitro tumor model. This breast cancer cell line has been previously shown to express the CXCR4 receptor10.


Figure 6. Image-based Monitoring of MDA-MB-231/Fibroblast Tumor Invasion. Overlaid brightfield and fluorescent images captured using a 4x objective, after a 0 and 5 day incubation period with AMD 3465, IT1t, and CTCE 9908. Imaging channel representation: Brightfield – Total cells and invadopodia; GFP – MDA-MB-231 cells; RFP – Fibroblasts.

Figure 7. Quantification of Invasive Tumor Area. 4x overlaid images captured following 5 day (A.) 100 and (B.) 0 μM IT1t incubation with tumoroids. Object masks automatically drawn by Gen5 using the following criteria: Threshold: 5000 RFU; Min. Object Size: 400 μm; Max. Object Size: 1500 μm; Image Smoothing Strength: 0; Background Flattening Size: Auto.


Cellular analysis is performed with the Cytation 5 using the brightfield signal to quantify the extent of invasion. Minimum and maximum object sizes, as well as brightfield threshold values are set such that a precise object mask is automatically drawn around each tumoroid in its entirety (Figure 7A and B). The same criteria are used for all images evaluated during the experiment. This allows for a quantitative comparison of the area covered within each object mask to be completed.
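As a rough analogy to the Gen5 object-mask criteria described above (intensity threshold, minimum and maximum object size), the same threshold-and-size-filter logic can be sketched in a few lines. The image, intensity values, and pixel-based size window here are synthetic assumptions, not the instrument's implementation:

```python
import numpy as np
from scipy import ndimage

def tumoroid_area(image, threshold, min_size_px, max_size_px):
    """Threshold the image, label connected objects, keep those within the
    size window, and return the total masked area in pixels."""
    mask = image > threshold
    labels, n = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = (areas >= min_size_px) & (areas <= max_size_px)
    return int(areas[keep].sum())

# Synthetic "image": one large tumoroid plus small debris (hypothetical data)
img = np.zeros((200, 200))
img[60:140, 60:140] = 8000      # 80x80 px tumoroid above a 5000 RFU threshold
img[10:13, 10:13] = 8000        # 3x3 px debris, below the minimum object size

area = tumoroid_area(img, threshold=5000, min_size_px=400, max_size_px=10000)
print(area)                      # only the tumoroid counts: 6400 px
```

Applying the same criteria to every image, as done in the experiment, makes the masked areas directly comparable across time points and inhibitor concentrations.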

Figure 8. Tumor Invasion Inhibition Determination. Graphs of individual tumoroid areas on day 0, and subsequent to five day invasion period in the presence of inhibitor concentrations.


The 4x images (Figure 6), together with the graphs of total tumoroid area coverage before and after the incubation period (Figure 8), illustrate the ability of CXCR4 inhibitors to interrupt tumor invasion in a manner consistent with the previously determined residence times. AMD 3465 and IT1t, which exhibit residence times longer than that of SDF1-α-d2, effectively minimize tumor invasion in a dose-dependent manner. The decrease in MDA-MB-231 GFP and fibroblast RFP expression after a 5-day 100 μM IT1t incubation, also seen after a 7-day AMD 3465 incubation at the same concentration (data not shown), may also indicate the chronic cytotoxic effects that elevated dosing of these compounds can have on both cancer and stromal cells. All other compounds show little to no effect on the ability of the tumoroid to migrate through the 3D matrix. While AMD 3465 and IT1t display the same sub-nanomolar potency, AMD 3465 prevails as a CXCR4 inhibitor due to its greater residence time.


Conclusions

The Tag-lite CXCR4 ligand binding assay provides a simple, yet robust cell-based approach to determine kinetic binding of known receptor ligands, as well as competitive binding of test molecules. The simultaneous dual emission capture and injection capabilities of the Synergy Neo allow accurate calculations of kinetic association and dissociation rates to be made when used in conjunction with the Tag-lite® assay. Corning Spheroid Microplates then provide an easy-to-use, consistent method to perform spheroid aggregation and confirmatory 3D tumor invasion assays. Imaging of spheroid formation, as well as invading structures, can be performed by the Cytation™ 5 using brightfield or fluorescent channels to easily track tumoroid invasion. The flexible cellular analysis capacity of the Gen5™ Data Analysis Software also allows for accurate assessment of 3D tumor invasion during the entire incubation period. The combination of assay chemistry, cell model, kinetic microplate and image-based monitoring, in addition to cellular analysis, provides an ideal method to better understand the target-based and phenotypic effects of potential inhibitors of tumor invasion and metastasis.

References

1. Saxe, Charles. ‘Unlocking The Mysteries Of Metastasis’. ExpertVoices 2013. http://www.cancer.org/cancer/news/expertvoices/post/2013/01/23/unlocking-the-mysteries-of-metastasis.aspx. Accessed 16 Mar. 2015.

2. Müller, A., Homey, B., Soto, H., Ge, N., Catron, D., Buchanan, M., McClanahan, T., Murphy, E., Yuan, W., Wagner, S., Barrera, J., Mohar, A., Verástegui, E., Zlotnik, A. Involvement of chemokine receptors in breast cancer metastasis. Nature. 2001, 410, 50-56.

3. Swinney, D. Biochemical mechanisms of drug action: what does it take for success? Nat Rev Drug Discov. 2004, 3, 801-808.

4. Copeland, R., Pompliano, D., Meek, T. Drug-target residence time and its implications for lead optimization. Nat Rev Drug Discov. 2006, 5, 730-739.

5. Tummino, P., Copeland, R. Residence time of receptor-ligand complexes and its effect on biological function. Biochemistry. 2008, 47, 5481-5492.

6. Zhang, R., Monsma, F. The importance of drug-target residence time. Curr Opin Drug Discov Devel. 2009, 12, 488-496.

7. Mao, Y., Keller, E., Garfield, D., Shen, K., Wang, J. Stromal cells in tumor microenvironment and breast cancer. Cancer Metast Rev. 2013, 32, 303-315.

8. Kamath, L., Meydani, A., Foss, F., Kuliopulos, A. Signaling from protease-activated receptor-1 inhibits migration and invasion of breast cancer cells. Cancer Res. 2001, 61, 5933-5940.

9. Motulsky, H., Mahan, L. The kinetics of competitive radioligand binding predicted by the law of mass action. Mol Pharmacol. 1984, 25, 1-9.

10. Sun, Y., Mao, X., Fan, C., Liu, C., Guo, A., Guan, S., Jin, Q., Li, B., Yao, F., Jin, F. CXCL12-CXCR4 axis promotes the natural selection of breast cancer cell metastasis. Tumor Biol. 2014, 35, 7765-7773.


Read Full Post »
