Why did this occur? The matter of Individual Actions Undermining Trust, The Patent Dilemma and The Value of a Clinical Trial

Reporter and Curator: Larry H. Bernstein, MD, FCAP

 

The large amount of funding tied to continued research and support of postdoctoral fellows leads one to ask how following the money can lead to discredited work in the elite scientific community.

Moreover, the pressure to publish in prestigious journals with high impact factors is a road to academic promotion.  In the last twenty years, it has become unusual to find submissions for review with fewer than 6-8 authors, accompanied by the statement that all contributed to the work.  These factors can’t be discounted outright, but it is easy for work to fall through the cracks when a key investigator has over 200 publications and holds tenure in a great research environment.  But that is where we find ourselves today.

There is another issue that comes up, also related to carrying out research and then protecting the work for commercialization.  It is more complicated in the sense that it is necessary to determine whether there is prior art, and then there is the possibility that, after the cost of filing a patent and a six-year delay in obtaining protection, there is as great a cost in bringing the patent to final production.

I.  Individual actions undermining trust.

II. The patent dilemma.

III. The value of a clinical trial.

IV. The value contributions of RAP physicians (radiologists, anesthesiologists, and pathologists – the last discussed here), those who maintain and inform the integrity of medical and surgical decisions

 

I. Top heart lab comes under fire

Kelly Servick

Science 18 July 2014: Vol. 345, no. 6194, p. 254. DOI: 10.1126/science.345.6194.254

 

In the study of cardiac regeneration, Piero Anversa is among the heavy hitters. His research into the heart’s repair mechanisms helped kick-start the field of cardiac cell therapy (see main story). After more than 4 decades of research and 350 papers, he heads a lab at Harvard Medical School’s Brigham and Women’s Hospital (BWH) in Boston that has more than $6 million in active grant funding from the National Institutes of Health (NIH). He is also an outspoken voice in a field full of disagreement.

So when an ongoing BWH investigation of the lab came to light earlier this year, Anversa’s colleagues were transfixed. “Reactions in the field run the gamut from disbelief to vindication,” says Mark Sussman, a cardiovascular researcher at San Diego State University in California who has collaborated with Anversa. By Sussman’s account, Anversa’s reputation for “pushing the envelope” and “challenging existing dogma” has generated some criticism. Others, however, say that the disputes run deeper—to doubts about a cell therapy his lab has developed and about the group’s scientific integrity. Anversa told Science he was unable to comment during the investigation.

“People are talking about this all the time—at every scientific meeting I go to,” says Charles Murry, a cardiovascular pathologist at the University of Washington, Seattle. “It’s of grave concern to people in the field, but it’s been frustrating,” because no information is available about BWH’s investigation. BWH would not comment for this article, other than to say that it addresses concerns about its researchers confidentially.

In April, however, the journal Circulation agreed to Harvard’s request to retract a 2012 paper on which Anversa is a corresponding author, citing “compromised” data. The Lancet also issued an “Expression of Concern” about a 2011 paper reporting results from a clinical trial, known as SCIPIO, on which Anversa collaborated. According to a notice from the journal, two supplemental figures are at issue.

For some, Anversa’s status has earned him the benefit of the doubt. “Obviously, this is very disconcerting,” says Timothy Kamp, a cardiologist at the University of Wisconsin, Madison, but “I would be surprised if it was an implication of a whole career of research.”

Throughout that career, Anversa has argued that the heart is a prolific, lifelong factory for new muscle cells. Most now accept the view that the adult heart can regenerate muscle, but many have sparred with Anversa over his high estimates for the rate of this turnover, which he maintained in the retracted Circulation paper.

Anversa’s group also pioneered a method of separating cells with potential regenerative abilities from other cardiac tissue based on the presence of a protein called c-kit. After publishing evidence that these cardiac c-kit+ cells spur new muscle growth in rodent hearts, the group collaborated in the SCIPIO trial to inject them into patients with heart failure. In The Lancet, the scientists reported that the therapy was safe and showed modest ability to strengthen the heart—evidence that many found intriguing and provocative. Roberto Bolli, the cardiologist whose group at the University of Louisville in Kentucky ran the SCIPIO trial, plans to test c-kit+ cells in further clinical trials as part of the NIH-funded Cardiovascular Cell Therapy Research Network.

But others have been unable to reproduce the dramatic effects Anversa saw in animals, and some have questioned whether these cells really have stem cell–like properties. In May, a group led by Jeffery Molkentin, a molecular biologist at Cincinnati Children’s Hospital Medical Center in Ohio, published a paper in Nature tracing the genetic lineage of c-kit+ cells that reside in the heart. He concluded that although they did make new muscle cells, the number is “astonishingly low” and likely not enough to contribute to the repair of damaged hearts. Still, Molkentin says that he “believe[s] in their therapeutic potential” and that he and Anversa have discussed collaborating.

Now, an anonymous blogger claims that problems in the Anversa lab go beyond controversial findings. In a letter published on the blog Retraction Watch on 30 May, a former research fellow in the Anversa lab described a lab culture focused on protecting the c-kit+ cell hypothesis: “[A]ll data that did not point to the ‘truth’ of the hypothesis were considered wrong,” the person wrote. But another former lab member offers a different perspective. “I had a great experience,” says Federica Limana, a cardiovascular disease researcher at IRCCS San Raffaele Pisana in Rome who spent 2 years of her Ph.D. work with the group in 1999 and 2000, as it was beginning to investigate c-kit+ cells. “In that period, there was no such pressure” to produce any particular result, she says.

Accusations about the lab’s integrity, combined with continued silence from BWH, are deeply troubling for scientists who have staked their research on theories that Anversa helped pioneer. Some have criticized BWH for requesting retractions in the midst of an investigation. “Scientific reputations and careers hang in the balance,” Sussman says, “so everyone should wait until all facts are clearly and fully disclosed.”

 

II.  Trolling Along: Recent Commotion About Patent Trolls

July 17, 2014

PricewaterhouseCoopers recently released a study of 2014 patent litigation. PwC’s ultimate conclusion was that case volume has increased vastly while damages continue a general decline, but what’s making headlines everywhere is that “patent trolls” now account for 67% of all new patent lawsuits (see, e.g., Washington Post and Fast Company).

Surprisingly, the word “troll” appears nowhere in PwC’s study. So, with regard to patent trolls, what does this study really mean for companies, patent owners and casual onlookers?

First of all, who are these trolls?

“Patent Troll” is a label applied to patent owners who do not make or manufacture a product, or offer a service. Patent trolls live (and die) by suing others for allegedly practicing an invention that is claimed by their patents.

The politically correct term is Non-practicing Entity (NPE). PwC solely uses the term NPE, which it defines as an entity that does not have the capability to design, manufacture, or distribute products with features protected by the patent.

So, what’s so bad about them?

The common impression of an NPE is a business venture looking to collect and monetize assets (i.e., patents). In the most basic strategy, an NPE buys patents with broad claims that cover a wide variety of technologies and markets, and then sues a large group of alleged patent infringers in the hope of collecting a licensing royalty or a settlement. NPEs typically don’t want to spend money on a trial unless they have to, and one tactic uses settlements with smaller businesses to build a “war chest” for potential suits against larger companies.

NPE litigation can be viewed positively, such as the just defense of the lowly inventor who sold his patent to someone (with deeper pockets) who could fund the litigation to protect the inventor’s hard work against a mega-conglomerate that ripped off his idea.

Or NPE litigation can be seen negatively, such as an attorney’s demand letter on behalf of an anonymous shell corporation to shake down dozens of five-figure settlements from all the local small businesses that have ever used a fax machine.

NPEs can waste a company’s valuable time and resources with lawsuits, yet also bring value to their patent portfolios by energizing a patent sales and licensing market. There are unscrupulous NPEs, but it’s hardly the black and white situation that some media outlets are depicting.

What did PwC say about trolls?

Well, the PwC study looked at the success rates and awards of patent litigation decisions. One conclusion is that damages awards for NPEs averaged more than triple those for practicing entities over the last four years. We’ll come back to this statistic.

Another key observation is that NPEs have been successful 25% of the time overall, versus 35% for practicing entities. This makes sense given the burden of proof NPEs carry as plaintiffs at trial and their relative lack of success at summary judgment. However, PwC’s report states that both types of entities win about two-thirds of their trials.

But what about this “67% of all patent trials are initiated by trolls” discussion?

The 67% number comes from the RPX Corporation’s litigation report (produced January 2014) that quantified the percentage of NPE cases filed in 2013 as 67%, compared to 64% in 2012, 47% in 2011, 30% in 2010 and 28% in 2009.

PwC refers to the RPX statistics to accentuate that its new study indicates that only 20% of decisions in 2013 involved NPE-filed cases, so the general conclusion would be that NPE cases tend to settle or be dismissed prior to a court’s decision. Admittedly, this is indicative of the prevalent “spray and pray” strategy in which NPEs prefer to collect many settlement checks from several “targets” and avoid the courtroom.
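A back-of-the-envelope calculation makes the settlement inference concrete. The sketch below uses the 2013 figures quoted above; the simplifying assumption that NPE and practicing-entity cases are drawn from comparable pools of filings and decisions in the same period is mine, not PwC’s or RPX’s:

```python
# Rough comparison of how often filed cases reach a court decision for NPEs
# versus practicing entities, using the 2013 shares quoted above.
npe_share_of_filings = 0.67     # RPX: share of 2013 cases filed by NPEs
npe_share_of_decisions = 0.20   # PwC: share of 2013 decisions in NPE-filed cases

pe_share_of_filings = 1 - npe_share_of_filings       # 0.33
pe_share_of_decisions = 1 - npe_share_of_decisions   # 0.80

# Each rate below is proportional to decisions-per-filing; the unknown totals
# of filings and decisions cancel when we take the ratio.
npe_rate = npe_share_of_decisions / npe_share_of_filings  # ~0.30
pe_rate = pe_share_of_decisions / pe_share_of_filings     # ~2.42

print(f"NPE cases reach a decision at {npe_rate / pe_rate:.2f}x "
      f"the rate of practicing-entity cases")  # ~0.12x, roughly one-eighth
```

In other words, under that assumption an NPE-filed case is about eight times less likely to reach a decision than a practicing-entity case, which is consistent with a settle-early strategy.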

In this study, who else is an NPE?

If someone were looking to dramatize the role of “trolls,” the name can be thrown around liberally (and hurtfully) to anyone who owns and asserts a patent without offering a product or a service. For instance, colleges and universities fall under the NPE umbrella as their research and development often ends with a series of published papers rather than a marketable product on an assembly line.

In fact, PwC distinguishes universities and non-profits from companies and individuals within their NPE analysis, with only about 5% of the NPE cases from 1995 to 2013 being attributed to universities and non-profits. Almost 50% of the NPE cases are attributed to an “individual,” who could be the listed inventor for the patent or a third-party assignee.

The word “troll” is obviously a derogatory term used to connote greed and hiding (under a bridge), but the term has taken on a newer, meme-like status: trolls are currently depicted as lacking any contribution to society and merely living off of others’ misfortunes and fears. [Three Billy Goats Gruff]. This is not always the truth with NPEs (e.g., universities).

No one wants to be called a troll—especially in front of a jury—so we’ve even recently seen courts bar defendants from referring to NPEs by such colorful terms as “corporate shell,” “bounty hunter,” “privateer,” or someone “playing the lawsuit lottery.” [Judge Koh Bans Use of Term “Patent Troll” in Apple Jury Trial]

Regardless of the portrayal of an NPE, most people in the patent world distinguish the “trolls” by the strength of the patent, merits of the alleged infringement and their behavior upon notification. Often these are expressed as “frivolity” of the case and “gamesmanship” of the attorneys. Courts are able to punish plaintiffs who bring frivolous claims against a party and state bar associations are tasked with monitoring the ethics of attorneys. The USPTO is tasked with working to strengthen the quality of patents.

What’s the take-away from this study regarding NPEs?

The study focuses on patent litigation that produced a decision; therefore, the most important and relevant conclusion is that, over the last four years, average damages awards for NPEs were more than triple the damages for practicing entities. Everything else in these articles, such as the initiation of litigation by NPEs, settlement percentages, and the general behavior of patent trolls, is pure inference beyond the scope of the study.

This may sound sympathetic to trolls, but keep in mind that the study highlights that NPEs win more than triple the damages on average compared to practicing entities, and it is meant to shock the reader a bit. One explanation is that NPEs are in the best position to choose the patents they want to assert and the targets they wish to sue—especially when the NPE is willing to ride that patent all the way to the end of a long, expensive trial. Sometimes settling is not an option. Chart 2b indicates that the disparity in damages awarded to NPEs relative to practicing entities has been large since 2000, but perhaps the growth from two-fold in 2000–2009 to three-fold in the past four years indicates that NPEs are getting better at finding patents and/or picking battles to take all the way to a court decision. More than anything, this seems to reflect the growth of the concept of patents as a business asset.

The PwC report is chock-full of interesting patterns and trends in litigation results, so it’s a shame that the 67% number makes the headlines. Far more interesting are the charts comparing success rates by four-year periods (Chart 6b) or success rates for NPEs and practicing entities in front of a jury versus in front of a bench (Chart 6c), as well as other tables that reveal statistics for specific districts of the federal courts. Even the stats on the success rates of each type of NPE are telling, because the reader sees that universities and non-profits have a higher success rate than non-practicing companies or individuals.

What do we do about the trolls?

The White House has recently called for Congress to do something about the trolls as horror stories of scams and shake-downs are shared. A bill was gaining momentum in the Senate when Senator Leahy took it off the agenda in early July. That bill had miraculously passed 325-91 in the House, and President Obama was willing to sign it if the Senate were to pass it. The bill was opposed by trial attorneys, universities, and bio-pharmaceutical businesses who felt that, in trying to hinder just the trolls, the law would severely inhibit everyone’s access to the courts. Regardless, most people think that the sitting Congressmen merely wanted a “win” prior to the mid-term elections and that patent reform is unlikely to reappear until next term.

In the meantime, the Supreme Court has recently reiterated rules concerning attorney fee-shifting on frivolous patent cases, as well as clarifying the validity of software patents. Time will tell if these changes have any effects on the damages awards that PwC’s study examined or even if they cause a chilling of the number of patent lawsuit filings.

Furthermore, new ways to challenge the validity of asserted patents have been introduced via the America Invents Act. For example, Inter Partes Review (IPR) has yielded frightening preliminary statistics on slowing, if not killing, patents that have been asserted in a suit. While these administrative trials are not cheap, many view these new tools at the Patent Trial and Appeal Board as anti-troll measures. It will be interesting to watch how the USPTO implements these procedures in the near future, especially while former Google counsel, Acting Director Michelle K. Lee, oversees the office.

In the private sector, Silicon Valley has recently seen a handful of tech companies come together as the License on Transfer (LOT) Network, a group hoping to disarm the “Patent Assertion Entities.” Joining the LOT Network means signing an agreement under which, once a member’s patent is sold, a license to that patent is created for everyone in the LOT Network. The thought is that NPEs considering purchasing patents from companies in the LOT Network will have fewer companies to sue, since the license to the other active LOT participants will have been triggered upon the transfer, and thus the NPE will not be as inclined to “troll.” For instance, if a member company such as Google were to sell a patent to a non-member company and an NPE bought that patent, the NPE would not be able to sue any members of the LOT Network with that patent.
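The trigger-on-transfer mechanism is, at bottom, a simple conditional rule, and it can be sketched in code. The model below is my own loose illustration of the arrangement as described above, not the actual agreement; the class, method, company, and patent names are all hypothetical:

```python
class LotNetwork:
    """Loose model of a license-on-transfer arrangement (illustrative only)."""

    def __init__(self, members):
        self.members = set(members)
        self.licenses = {}  # patent -> set of companies licensed to practice it

    def transfer_patent(self, patent, seller, buyer):
        # The key clause: when a member sells a patent outside the network,
        # a license to every current member triggers at the moment of transfer.
        if seller in self.members and buyer not in self.members:
            self.licenses.setdefault(patent, set()).update(self.members)

    def can_sue_on(self, patent, target):
        # A later owner (e.g., an NPE) cannot sue anyone already licensed.
        return target not in self.licenses.get(patent, set())


# The Google example from the text: the patent leaves the network, an NPE
# eventually acquires it, but every member was licensed at the first transfer.
lot = LotNetwork(members={"Google", "MemberCo A", "MemberCo B"})
lot.transfer_patent("US-HYPOTHETICAL-1", seller="Google", buyer="NonMemberCo")

assert not lot.can_sue_on("US-HYPOTHETICAL-1", "MemberCo A")  # licensed
assert lot.can_sue_on("US-HYPOTHETICAL-1", "OutsiderCo")      # still exposed
```

Note the design consequence this models: the license follows the patent’s exit from the network, so resale through intermediaries does not strip members of their protection.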

Other notes

NPEs are only as evil as the people who run them—that being said, there are plenty of horror stories of small businesses receiving phantom demand letters that threaten a patent infringement suit without identifying the sender or the patent. This is an out-and-out scam and a plague on society that results in wasted time and resources, and inevitably higher prices on the consumer end.

It is a sin and a shame that patent rights can be misused in scams and shake-downs of businesses around us, but there is a reason that U.S. courts are so often used to defend patent rights. The PwC study, at minimum, reflects the high stakes of the patent market and perhaps its fragility. Nevertheless, merely monitoring the courts may not keep the trolls at bay.

I’d love to hear your thoughts.

*This is provided for informational purposes only, and does not constitute legal or financial advice. The information expressed is subject to change at any time and should be checked for completeness, accuracy and current applicability. For advice, consult a suitably licensed attorney or patent agent.

 

III. Large-scale analysis finds majority of clinical trials don’t provide meaningful evidence

Categories: Ineffective Treatments, Medical Ethics • Tags: Center for Drug Evaluation and Research, Clinical trial, CTTI, Duke University Hospital, FDA, Food and Drug Administration, National Institutes of Health, United States National Library of Medicine

04 May 2012

DURHAM, N.C.— The largest comprehensive analysis of ClinicalTrials.gov finds that clinical trials are falling short of producing the high-quality evidence needed to guide medical decision-making. The analysis, published today in JAMA, found that the majority of clinical trials are small and that there are significant differences among methodological approaches, including randomization, blinding and the use of data monitoring committees.

“Our analysis raises questions about the best methods for generating evidence, as well as the capacity of the clinical trials enterprise to supply sufficient amounts of high quality evidence to ensure confidence in guideline recommendations,” said Robert Califf, M.D., first author of the paper, vice chancellor for clinical research at Duke University Medical Center, and director of the Duke Translational Medicine Institute.

The analysis was conducted by the Clinical Trials Transformation Initiative (CTTI), a public-private partnership founded by the Food and Drug Administration (FDA) and Duke. It extends the usability of the data in ClinicalTrials.gov for research by placing the data through September 27, 2010 into a database structured to facilitate aggregate analysis. This publicly accessible database facilitates assessment of the clinical trials enterprise in a more comprehensive manner than ever before and enables the identification of trends by study type.

 

The National Library of Medicine (NLM), a part of the National Institutes of Health, developed and manages ClinicalTrials.gov. This site maintains a registry of past, current, and planned clinical research studies.

“Since 2007, the Food and Drug Administration Amendment Act has required registration of clinical trials, and the expanded scope and rigor of trial registration policies internationally is producing more complete data from around the world,” stated Deborah Zarin, MD, director, ClinicalTrials.gov, and assistant director for clinical research projects, NLM. “We have amassed over 120,000 registered clinical trials. This rich repository of data has a lot to say about the national and international research portfolio.”

This CTTI project was a collaborative effort by informaticians, statisticians and project managers from NLM, FDA and Duke. CTTI comprises more than 60 member organizations with the goal of identifying practices that will improve the quality and efficiency of clinical trials.

“Since the ClinicalTrials.gov registry contains studies sponsored by multiple entities, including government, industry, foundations and universities, CTTI leaders recognized that it might be a valuable source for benchmarking the state of the clinical trials enterprise,” stated Judith Kramer, MD, executive director of CTTI.

The project goal was to produce an easily accessible database incorporating advances in informatics to permit a detailed characterization of the body of clinical research and facilitate analysis of groups of studies by therapeutic areas, by type of sponsor, by number of participants and by many other parameters.

“Analysis of the entire portfolio will enable the many entities in the clinical trials enterprise to examine their practices in comparison with others,” says Califf. “For example, 96% of clinical trials have ≤1000 participants, and 62% have ≤ 100. While there are many excellent small clinical trials, these studies will not be able to inform patients, doctors and consumers about the choices they must make to prevent and treat disease.”
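Because the CTTI database is publicly accessible and structured for aggregate analysis, statistics like Califf’s can be recomputed directly. A minimal sketch follows, assuming the study records have been exported to a CSV with one row per trial; the file and column names are illustrative, not the database’s actual schema:

```python
# Recompute headline enrollment statistics from an export of the registry,
# assuming a CSV with one row per study and an 'enrollment' column
# (file name and column name are illustrative).
import pandas as pd

trials = pd.read_csv("clinicaltrials_studies.csv")
enrollment = pd.to_numeric(trials["enrollment"], errors="coerce").dropna()

print(f"Trials with <=1000 participants: {(enrollment <= 1000).mean():.0%}")
print(f"Trials with <=100 participants:  {(enrollment <= 100).mean():.0%}")

# The same grouping idea yields the median-size comparison discussed below,
# e.g.: trials.groupby("therapeutic_area")["enrollment"].median()
```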

The analysis showed heterogeneity in median trial size, with cardiovascular trials tending to be twice as large as those in oncology and trials in mental health falling in the middle. It also showed major differences in the use of randomization, blinding, and data monitoring committees, critical issues often used to judge the quality of evidence for medical decisions in clinical practice guidelines and systematic overviews.

“These results reinforce the importance of exploration, analysis and inspection of our clinical trials enterprise,” said Rachel Behrman Sherman, MD, associate director for the Office of Medical Policy at the FDA’s Center for Drug Evaluation and Research. “Generation of this evidence will contribute to our understanding of the number of studies in different phases of research, the therapeutic areas, and ways we can improve data collection about clinical trials, eventually improving the quality of clinical trials.”


IV.  Lawmakers urge CMS to extend MU hardship exemption for pathologists

 

Eighty-nine members of Congress have asked the Centers for Medicare & Medicaid Services to give pathologists a break and extend the hardship exemption they currently enjoy for all of Stage 3 of the Meaningful Use program.

In the letter, dated July 10 and addressed to CMS Administrator Marilyn Tavenner, the lawmakers point out that CMS had recognized in its 2012 final rule implementing Stage 2 of the program that it was difficult for pathologists to meet the Meaningful Use requirements, and granted a one-year exception for 2015, the first year that penalties will be imposed. They now are asking that the exception be expanded to include the full five-year maximum allowed under the American Recovery and Reinvestment Act.

“Pathologists have limited direct contact with patients and do not operate in EHRs,” the letter states. “Instead, pathologists use sophisticated computerized laboratory information systems (LISs) to support the work of analyzing patient specimens and generating test results. These LISs exchange laboratory and pathology data with EHRs.”

Interestingly, the lawmakers’ exemption request is only on behalf of pathologists, even though CMS had granted the one-year hardship exception to pathologists, radiologists and anesthesiologists.

Rep. Tom Price (R-Ga.), one of the members spearheading the letter, had also introduced a bill (H.R. 1309) in March 2013 that would exclude pathologists from the incentives and penalties of the Meaningful Use program. The bill, which has 31 cosponsors, is currently sitting in committee. That bill also does not include relief for radiologists or anesthesiologists.

CMS has provided some flexibility about the hardship exceptions in the past, most recently by allowing providers to apply for one due to EHR vendor delays in upgrading to Stage 2 of the program.

However, CMS also noted in the 2012 rule granting the one-year exception that it was granting the exception in large part because of the then-current lack of health information exchange and that “physicians in these three specialties should not expect that this exception will continue indefinitely, nor should they expect that we will grant the exception for the full 5-year period permitted by statute.”

To learn more:
– read the letter (.pdf)


A Great University engaged in Drug Discovery: University of Pittsburgh

 

Reporter and Curator: Larry H. Bernstein, MD, FCAP

 

US-based pharmaceutical companies have been consolidating and are now moving offshore to reduce taxes and other costs.  Part of the problem has been the large cost of clinical trials, the failure to detect toxicities in the early phases, and late-phase failure or drug resistance conferring only short-term success.  The failure rate has been above 60%.  The result is that Big Pharma is looking to recycle old drugs for repurposing. Whatever success that yields, there is a larger problem: the lack of a comprehensive biological understanding of the complexity involved.  I present here a major university, very well recognized in genetics, proteomics, and experimental pathology, that is engaged in the drug development effort with reasonable promise of success.

 

Perspective On: A Drug Discovery Lab

As lab manager at the University of Pittsburgh Drug Discovery Institute (UPDDI), Celeste Reese and her team use high-content imaging strategies and work with many other labs, both within and outside the university, on a wide range of projects.

By Rachel Muenz | July 03, 2014

 


Finding Clinically Relevant Solutions

Hard work, teamwork, and a whole lot of multitasking help this lab overcome a tough economic environment

“We try to use new technologies and approaches and quantitative systems pharmacology (QSP) to complement the traditional drug discovery strategies that are used by the large pharmacy companies,” she explains, adding that, on average, they have seven to ten active projects going on at any given time. “Right now we have a metastatic breast cancer program, a head and neck cancer project, and a Huntington’s disease project. We do some zebra fish modeling, some development of novel HIV diagnostics, liver modeling, and a variety of other things.”

Those projects take place in the institute’s 11,000 square feet of space, which covers two floors of the building the institute occupies and includes a large open lab on the top floor and an imaging lab, automation lab, and tissue culture facility on the floor below. Working in that space are 34 staff, including seven faculty, four graduate students, and five undergraduates, with the rest made up of technical specialists, administrative staff, and Reese herself. As in many other labs, staff members have a wide range of education levels—from high school for the undergrads all the way up to extensive post-doctoral experience for the faculty, Reese says, adding that staff receive quite a bit of training when they begin.

“The university has a lot of training modules that we send people to for such things as chemical hygiene, safety, and blood-borne pathogens, even things like safe shipping,” she says. “Then there are modules like conflict of interest training and research integrity training, which are also provided by the university. In-house, we train everyone on our equipment and on the procedures and protocols that we use within our institute.”

Training the grads and undergrads on those lab procedures is a big part of Reese’s role as lab manager, a task that she considers one of the highlights of the position.

“I really like working with the graduate students who come into the lab,” Reese says. “They always have a fresh perspective and they’re always challenging established protocols. They’re fresh and enthusiastic.”

The Catalyst Express robot is used to load plates onto a high-content imaging platform.

A similar enthusiasm for science led Reese to pursue the field in university and then to a job in a pharmacology lab after graduation, which got her interested in the drug discovery field and—after 14 years staying home to raise her children—eventually brought her to the UPDDI, where she has worked for the past eight years.

“I’ve always loved science in general but then after college I got the job in the pharmacology lab and I just really liked experimental design and problem solving and implementation—which eventually led into the lab management position,” says Reese, who has now been lab manager at the UPDDI for four years.

Because of her enjoyment of experimenting, along with her other management duties of looking after supplies and equipment, Reese also likes to keep a hand in what’s going on in the lab.

“I keep an active role in at least one of the research projects that we have going on,” she explains. “I find that that’s very helpful in the lab management area as well, because I see key things while I’m doing experiments that I normally wouldn’t see on a walkthrough.”

Blocking out the day

Liquid nitrogen cell bank.

For Reese, scheduling chunks of time for certain tasks is critical in ensuring she meets her goals for the day.

“Time management’s key when you’re trying to cover as many roles as it takes to do this job,” she says. “I try to keep the mornings for the lab management tasks and then the afternoons are usually taken up with meetings, experimental design and implementation, or data analysis.”

That means Reese’s mornings typically involve coming in, checking on what’s happening in the lab, looking after the ordering of supplies for the week, and attending to any equipment problems and emails. Along with meetings, her afternoons are usually taken up with running or designing experiments or analyzing data. Of course, the rest of the staff have a variety of different roles.

A few programs and regular inventory checks help keep everything organized.

“One of the big tools we have is a purchasing program that we have developed in-house—an access program that we use and a similar one for equipment reservations and things like that,” Reese says. “We do a weekly inventory. We have two stockroom areas and we have two student workers who go out and stock all the individual work areas for people every day. And then we also have written protocols and established procedures for things like routine equipment maintenance and buffer preparations and such.”

She adds that the main challenge her lab faces is the same one that many other labs face—doing more with less in the current tough economic climate. For her lab, multitasking and teamwork are a big part of solving that issue.

“We just have really talented people here,” Reese says of her staff. “Everybody takes on a variety of roles. Everybody pitches in with things like routine equipment maintenance and … rather than having one person in each job, everybody covers a variety of tasks.” Because of that strong teamwork, Reese finds she doesn’t need to do much to motivate members of the lab.

“I don’t manage people—I just try to lead by example and try to take care of any issues that come up promptly rather than put things off,” she explains. “Everybody’s pretty self-motivated and hardworking here.”

An automated compound storage system is used to store the institute’s screening libraries. The UPDDI has six separate tissue culture facilities equipped with biosafety cabinets, incubators, and microscopes.

The tech side

Along with the aforementioned high-content imaging, Reese’s lab also uses automated liquid handling platforms, biosensors, microfluidics, and immunofluorescence and fluorescence microscopy, and they are starting to implement 3D cell culture strategies to tackle their many projects.

“These fluorescent proteins react to the physiological changes in the cell in real time,” Reese says of the lab’s work with biosensors. “And [with] microfluidics you actually have a moving system. The system is more clinically relevant— it’s a better model for the in vivo systems.”

By “clinically relevant” Reese says she basically means the center is trying to more closely model what is actually going on in the human body, rather than relying on traditional 2D cell culture models or high throughput methods. That focus on clinically relevant methods is a result of big changes in the pharmaceutical industry in recent years.

Top 5 Instruments in the Lab

  • GE InCell6000 Imaging System
  • Agilent (Velocity 11) Bravo Liquid Handling Platform
  • Thermo Scientific Multidrop Combi Dispenser
  • PerkinElmer EnVision 2103 Multilabel Plate Reader
  • Brooks (Matrical) Ministore Automated Compound Management System

“In the drug discovery field in general, big pharma has been using the mass-scale high throughput screening for a long time and of course now we’re coming to the patent cliff for a lot of the pharmaceutical companies, when a lot of their moneymakers are going off patent,” Reese explains. “So here, we’re trying to move away from that high throughput screening toward a more high-content [screening] where we’re looking at more clinically relevant methods and QSP approaches for drug discovery.”

And the most interesting work the lab is doing right now?

“I would say the coolest thing we have going on is a liver microphysiology project,” Reese says. “We’re making a liver biomimetic, which will be integrated with other organ biomimetics to create a human-on-a-chip for use as a model for drug toxicity and other kinds of organ analysis.”

Categories: Research-Specific Labs

Tags: Drug Discovery Labs

 


Genomics, Proteomics and Standards

Larry H. Bernstein, MD, FCAP, Curator

http://pharmaceuticalintelligence.com/2014/07/06/genomics-proteomics-and-standards/

This article is a look at where the biomedical research sciences stand in developing standards for the near term.

 

Let’s Not Wait for the FDA: Raising the Standards of Biomarker Development – A New Series

published by Theral Timpson on Tue, 07/01/2014 – 15:03

We talk a lot on this show about the potential of personalized medicine. Never before have we learned at such breakneck speed just how our bodies function. The pace of biological research staggers the mind and hints at a time when we will “crack the code” of the system that is Homo sapiens, going from picking the low-hanging fruit to a more rational approach. The high-tech world has put at the fingertips of biologists just the tools to do it. There is plenty of compute and plenty of storage available to untangle, or decipher, the human body. Yet still, we talk of potential.

Chat with anyone heavily involved in the life science industry – be it diagnostics or pharma – and you’ll quickly hear that we must have better biomarkers.

Next week we launch a series, Let’s Not Wait for the FDA: Raising the Standards of Biomarker Development, where we will pursue the “hotspots” that are haunting those in the field.

The National Biomarker Development Alliance (NBDA) is a nonprofit organization based at Arizona State University and led by the formidable Anna Barker, former deputy director of the NCI. The aim of the NBDA is to identify problem areas in biomarker development–from biospecimen and sampling issues to experimental design to bioinformatics challenges–and raise the standards in each area. This series of interviews is based on their approach. We will pursue each of these topics with a special guest.

The place to start is with samples. The majority of researchers who are working on biomarker assays don’t give much thought to the “story” of their samples. Yet the quality of their research will never exceed the quality of the samples with which they start–a very scary thought, according to Carolyn Compton, a former pathologist, now professor of pathology at ASU and Johns Hopkins. Carolyn worked originally as a clinical pathologist and knows firsthand the issues around sample degradation. She left the clinic when she was recruited to the NCI with the mission of bringing more awareness to the issue of biospecimens. She joins us as our first guest in the series.

That Carolyn has straddled the world of the clinic and the world of research is key to her message. And it’s key to this series. As we see an increased push to “translate” research into clinical applications, we find that these two worlds do not work enough together.

Researchers spend a lot of time analyzing data and developing causal relationships from certain biological molecules to a disease. But how often do these researchers consider how the history of a sample might be altering their data?

“Garbage in, garbage out,” says Carolyn, who links low-quality samples to the abysmal irreproducibility rate of most published research.

Two of our guests in the series have worked on the adaptive I-SPY breast cancer trials. These are innovative clinical trials that have been designed to “adapt” to the specific biology of those in the trial. Using the latest advances in genetics, the I-SPY trials aim to match experimental drugs with the molecular makeup of the tumors most likely to respond to them. And the trials are testing multiple drugs at once.

Don Berry is known for bringing statistics to clinical trials. He designed the I-SPY trials and joins us to explain how these new trials work and the promise of the adaptive design.

Laura Esserman is the director of the breast cancer center at UCSF and has been heavily involved in the implementation of the I-SPY trials. Esserman is concerned that “if we keep doing conventional clinical trials, people are going to give up on doing them.” An MBA as well as an MD, Esserman brings what she learned about innovation in the high-tech industry to treatment for breast cancer.

From there we turn to the topic of “systems biology,” where we will chat with George Poste, a tour de force when it comes to considering all of the various aspects of biology. Anyone who has ever been present for one of George’s presentations has no doubt come away scratching their head, wondering if we’ll ever really glimpse the whole system that is a human being. If there is one brain that has seen all the rooms and hallways of our complex system, it’s George Poste.

We’ll finish the series by interviewing David Haussler from UCSC, of Genome Browser fame. Recently Haussler has worked extensively on an NCI project, The Cancer Genome Atlas, to bring together data sets and connect cancer researchers around the world. What are the promise and pitfalls David sees in the latest bioinformatics tools?

George Poste says that in the literature we have identified 150,000 biomarkers that have causal linkage to disease. Yet only 100 of these have been commercialized and are used in the clinic. Why is the number so low? We hope to come up with some answers in this series.

Why Hasn’t Clinical Genetics Taken Off? (part 2)

published by Sultan Meghji on Fri, 06/20/2014 – 14:49

 

In my previous post, I made the broad comment that education of the patient and front line doctors was the single largest barrier to entry for clinical genetics. Here I look at the steps in the scientific process and where the biggest opportunities lie:

The Sequencing (still)

PCR is a perfectly reasonable technology for sequencing in the research lab today, but the current configuration of technologies needs to change. We need to move away from an expert-level skill set and a complicated chemistry process in the lab to a disposable, consumer-friendly set of technologies. I’m not convinced PCR is the right technology for that and would love to see nanopore be a serious contender, but lack of funding for a broad spectrum of both physics-only and physical-electrical startups has slowed the progress of these technologies. And waiting in the wings, other technologies are spinning up in research labs around the world. Price is no longer a serious problem in the space – reliable, repeatable, easy-to-use sequencing technologies are. The complexity of the current technology (both in terms of sample preparation and machine operation) is a big hurdle.

The Analysis (compute)

Over the last few years, quite a bit of commentary and effort has gone into making the case that the compute is a significant challenge (including more than a few comments by yours truly in that vein!). Today, it can be said with total confidence that compute is NOT a problem. Compute has been commoditized. Through excellent new software, advanced platforms and new hardware, the analysis is a trivial exercise that costs tiny amounts of money (under $25 per sample on a cloud provider appears to be the going rate for a clinical exome in terms of platform and infrastructure cost). Integration with the sequencer and downstream medical middleware is the biggest opportunity.

The Analysis (value)

The bigger challenge on the analysis side is mapping the specific things being analyzed to the needs of the patient. We are still in a world where the vast majority of sequencing work is done in support of a specific patient with a specific disease. There isn’t even broad consensus yet in the scientific community about the basics of the pipeline (see my blog post here for an attempt at capturing what I’m seeing in the market). A movement away from the recent trend of studying specific indications (especially cancer) is called for. Broadening the sample population will allow us to pick simpler, clearer and easier pipelines, which will then make them more adoptable. It would be a massive benefit to the world if the scientific, medical and regulatory communities would get together and start creating, in a crowdsourced manner, a small number of databases that are specifically useful to healthy people, targeting things like nutrition, athletics, metabolism, and other normal aspects of daily life: a dataset that, when any one person’s DNA is referenced against it, would find something useful. Including the regulators is key so that we can begin to move away from the old-fashioned model of clearances that still permeates the industry.

The Regulators

Beyond the broader issues around education I referenced in my previous post, a massive upgrade in the regulatory infrastructure is needed. We still live in a world where fax machines, overnight shipping of paper documents and personal relationships all matter more than the quality of the science you, as an innovator, are bringing to bear.

Consider the recent massive growth in wearables, fitness trackers and other instrumentation local to the human body. Why must we treat clinical genetics simply as a diagnostic and not, as it should be, as a fundamental set of quantitative data about your body that you can leverage in a myriad of ways? Direct-to-consumer (DTC) genetics companies, most notably 23andMe, have approached this problem poorly: instead of making it valuable to the average consumer, they have attempted to straddle the line between medical and not. The Fitbit model has shown very clearly that lifestyle activities can be directly harnessed to build commercial value in scaling health-related activities without becoming a regulatory issue. It’s time for genetics to do the same thing.

Development and Role of the Human Reference Sequence in Personal Genomics

Posted by @finchtalk on July 3, 2014

discovery in a digital world

A few weeks back, we published a review about the development and role of the human reference genome. A key point is that the reference genome is not a single sequence. Instead it is an assembly of consensus sequences that are designed to deal with variation in the human population and uncertainty in the data. The reference is a map and, like a geographical map, evolves through increased understanding over time.

From the Wiley Online Library site:

Abstract

Genome maps, like geographical maps, need to be interpreted carefully. Although maps are essential to exploration and navigation they cannot be completely accurate. Humans have been mapping the world for several millennia, but genomes have been mapped and explored for just a single century with the greatest advancements in making a sequence reference map of the human genome possible in the past 30 years. After the deoxyribonucleic acid (DNA) sequence of the human genome was completed in 2003, the reference sequence underwent several improvements and today provides the underlying comparative resource for a multitude of genetic assays and biochemical measurements. However, the ability to simplify genetic analysis through a single comprehensive map remains an elusive goal.

Key Concepts:

  • Maps are incomplete and contain errors.
  • DNA sequence data are interpreted through biochemical experiments or comparisons to other DNA sequences.
  • A reference genome sequence is a map that provides the essential coordinate system for annotating the functional regions of the genome and comparing differences between individuals’ genomes.
  • The reference genome sequence is always a product of understanding at a set point in time and continues to evolve.
  • DNA sequences evolve through duplication and mutation and, as a result, contain many repeated sequences of different sizes, which complicates data analysis.
  • DNA sequence variation happens on large and small scales with respect to the lengths of the DNA differences to include single base changes, insertions, deletions, duplications and rearrangements.
  • DNA sequences within the human population undergo continual change and vary highly between individuals.
  • The current reference genome sequence is a collection of sequences, an assembly, that includes sequences assembled into chromosomes, sequences that are part of structurally complex regions that cannot be assembled, patches (fixes) that cannot be included in the primary sequence, and high-variability sequences that are organised into alternate loci.
  • Genetic analysis is error prone and the data require validation because the methods for collecting DNA sequences create artifacts and the reference sequence used for comparative analyses is incomplete.

Keywords: DNA sequencing
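The coordinate-system role described in the key concepts above is easy to demonstrate. Below is a minimal sketch, assuming a local copy of the GRCh38 FASTA and the pyfaidx package; the region queried is illustrative, and any real analysis would need to note the exact assembly release and patch level:

```python
# Fetch bases from the reference by coordinate, the basic operation that
# annotation and variant comparison build on. Assumes a local GRCh38 FASTA;
# pyfaidx creates the .fai index on first use if it is missing.
from pyfaidx import Fasta

genome = Fasta("GRCh38.fa")

# An annotation such as "chr17:43,044,295-43,125,483" names the same bases
# for everyone using this assembly version (coordinates illustrative).
region = genome["chr17"][43044294:43125483]  # 0-based, half-open slice
print(len(region.seq), "bp retrieved from chr17")

# Because the reference evolves (patches, alternate loci), a coordinate is
# only meaningful together with the assembly release, e.g. GRCh38 vs. GRCh37.
```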

 


Cancer Labs at School of Medicine @ Technion: Janet and David Polak Cancer and Vascular Biology Research Center

Reporter: Aviva Lev-Ari, PhD, RN

Article ID #139: Cancer Labs at School of Medicine @ Technion: Janet and David Polak Cancer and Vascular Biology Research Center. Published on 5/28/2014

WordCloud Image Produced by Adam Tubman

Janet and David Polak Cancer and Vascular Biology Research Center
The Rappaport Faculty of Medicine Research Institute and Faculty of Medicine, Technion – Israel Institute of Technology, Haifa, Israel

The center was established in 2003 to promote in-depth interdisciplinary basic and clinical research on the control of cellular and molecular processes that are involved in cancer initiation and progression. We strongly believe that understanding the basic biological processes that underlie normal development, and their deregulation in cancer, is crucial for our ability to identify molecular targets for early detection, intervention, and cure of the disease. We are interested in a broad view of cancer – from the single malignantly transformed cell and its microenvironment through the entire tumor in the animal. We focus on targeted ubiquitin-mediated degradation of key regulatory proteins that are involved in malignant transformation [Prof. Aaron Ciechanover (Nobel Prize in Chemistry 2004)], angiogenesis and cancer progression (Prof. Gera Neufeld), metastasis and the tumor microenvironment (Prof. Israel Vlodavsky), and genetic and genomic dissection of embryonic and cancer transcriptional networks (Dr. Amir Orian). Towards these objectives, we combine molecular, biochemical, and cell biological approaches with Drosophila genetics and genomics, and employ advanced models of angiogenesis and metastasis.

We believe that scientific excellence and collegiality go together. Therefore, the center has an open and friendly atmosphere, creating a highly stimulating environment. The center is located on the 11th floor of the Rappaport Faculty of Medicine building. It currently trains 45 graduate students, post-doctoral fellows, clinicians and researchers who are at the heart of our research. Formal and informal collaborations between individuals and laboratories are ongoing and encouraged. We run a series of joint seminars to which we invite researchers from Israel and abroad. The center has advanced, state-of-the-art microscopic and image-analysis equipment, as well as other shared pieces of infrastructural equipment. The center is an integral part of the Faculty of Medicine and the Rappaport Research Institute, which are home to excellent research groups, and enjoys their advanced Interdepartmental Equipment Unit. It is also adjacent to the Rambam Medical Center – the major hospital in the north of Israel – which provides us with access to rich clinical material and collaboration with clinicians. Many of them spend active research periods in our laboratories and bring the bench closer to the patient’s bed and vice versa. The center is in an active phase of growth and offers excellent research opportunities, space and facilities for students, post-doctoral fellows, and physicians.

Research Groups

The Ubiquitin System and Cellular Protein Turnover and Interactions

Immunity and Host Defense

Cardiovascular Biology

The Central Nervous System in Health and Disease

Developmental Biology and Cancer Research

Genetics

SOURCE 

http://www.rappaport.org.il/Rappaport/Templates/ShowPage.asp?DBID=1&TMID=842&FID=76

The cancer and vascular biology research center was established in 2003 to promote an in-depth interdisciplinary basic and clinical research on the control of cellular and molecular processes that are involved in cancer development and progression. Our goal is to advance knowledge in fundamental biological questions that are highly relevant for cancer.

SOURCE

http://www.technioncancer.co.il/index.php


Aaron Ciechanover – Protein Turnover
Intracellular protein degradation and mechanisms of cancer

Israel Vlodavsky – Cancer Biology
Impact of heparanase and the tumor microenvironment on cancer progression: Basic aspects and clinical implications

Gera Neufeld – Tumor Progression & Angiogenesis
Blood vessels and tumor progression: The neuropilin connection

Amir Orian – Genetic Networks
Genetic networks in development and cancer

Ms. Sigal Alfasi-Izrael, Center coordinator
e-mail: gsigal@tx.technion.ac.il
Tel: +972-4-829-5424
Fax: +972-4-852-3947

SOURCE

http://www.technioncancer.co.il/ResearchGroups.php

Yuval Shaked, PhD

Assistant Professor of Molecular Pharmacology

PhD, 2004 – Hebrew University, Israel

Understanding host–tumor interactions during cancer therapy

Personalized medicine holds the promise of better cures with fewer side effects for many diseases. Individualized cancer therapy is sometimes utilized after multiple attempts at standard therapies and is based on several considerations, such as tumor type, acquired resistance to a specific therapy, previous treatment protocols, and other tumor-related factors. We have recently demonstrated that many cancer therapies can induce pro-tumorigenic or metastatic effects that derive not only from the tumor cells themselves, but also from host cells within the tumor microenvironment. The focus of research in my laboratory is to identify, characterize, and seek ways to block such pro-tumorigenic host effects observed after anti-cancer therapy, and thus potentially improve the outcome of current cancer therapies. Our findings may foster a paradigm shift in cancer therapy by minimizing the gap between preclinical findings and the clinical setting, laying the foundation for the development of entirely new strategies for improving cancer therapy.

SOURCE

http://www.rappaport.org.il/Rappaport/Templates/ShowPage.asp?DBID=1&TMID=610&FID=77&PID=0&IID=1268

 

Other related articles published on this Open Access Online Scientific Journal include the following:

D&D NT’s Solution: Galectin Proteins for Therapy and Diagnosis of Autoimmune Inflammatory and Cancer Diseases, Dr. Itshak Golan, CEO

http://pharmaceuticalintelligence.com/2014/05/28/dd-nts-solution-galectin-proteins-for-therapy-and-diagnosis-of-autoimmune-inflammatory-and-cancer-diseases-dr-itshak-golan-ceo/

MaimoniDex RA:  Monoclonal Antibodies for Therapy and Diagnosis of Cancer and Autoimmune Inflammatory Diseases – Dr. Itshak Golan, CEO

http://pharmaceuticalintelligence.com/2014/05/28/maimonidex-ra-monoclonal-antibodies-for-therapy-and-diagnosis-of-cancer-and-autoimmune-inflammatory-diseases-dr-itshak-golan-ceo/


Leaders in Biobanking Congress 2014, September 15-17, 2014 | Seattle, WA | Healthtech.com/Biobanking

Reporter: Aviva Lev-Ari, PhD, RN

Article ID #137: Leaders in Biobanking Congress 2014 – Final Agenda, September 15-17, 2014 | Seattle, WA| Healthtech.com/Biobanking. Published on 5/20/2014

WordCloud Image Produced by Adam Tubman

 

FINAL AGENDA ANNOUNCEMENT

 

Leaders in Biobanking Congress 2014

September 15-17, 2014 | Seattle, WA | Healthtech.com/Biobanking

Register by May 30th & SAVE! | Download Conference Brochure

 

Today, biospecimen collections are used by multiple research groups for varying research aims, from basic research through clinical trials. A well-managed biobank is a critical prerequisite for high-quality biological research. The proper collection, processing, storage and tracking of biospecimens are critical components allowing researchers to better link molecular and clinical information. Thus, by necessity, biobanking is both a science and a business. Cambridge Healthtech Institute’s Sixth Annual Leaders in Biobanking Congress: Maximizing Your Investment in Biospecimens addresses both the business and science of biobanking, bringing together biomedical and biopharmaceutical researchers, regulators, biorepository managers and practitioners to investigate the best strategies for effective use of biospecimens within today’s cutting-edge research.

Topics Include:

 

Plenary Keynote Presentations on Defining Precision Medicine – It Takes a Village, in which four speakers emphasize the diverse insights, broad resources and expansive collaborations needed to meet the promise of this field:

  • Dr. Carolyn Compton of National Biomarkers Development Alliance and Arizona State University
  • Dr. James Olson of Seattle Children’s Hospital and Presage Biosciences
  • Dr. Nathan Price of the Institute for Systems Biology
  • Dr. John Slattery of University of Washington School of Medicine

Biobanking Is a Business, which highlights costs, resource optimization, self-sustainability and other issues managers must consider to run successful biorepositories

Informed Consent: Biosamples Start with Patients, which considers how to balance scientific utility with individual privacy, with examples on iPSC research, pediatric biobanking and unexpected results

Putting Biosamples to Use, which emphasizes connections between good biobanking practices and biomarker development, translational research and personalized medicine

Optimizing Value from Your Biospecimens, which offers insights on improving quality, biobanking informatics and global infrastructures for that purpose

Biospecimen Processing and Storage Optimization, which explores temperature control, freeze-thaw effects, stabilization and other important factors that affect the quality of bio- and cryopreserved samples

Case Study Presentations, which illustrate active Biobanker/Biouser Partnerships and each party’s needs, contributions, bottlenecks and scientific results

Confirmed Speakers Include:

  • Cary D. Austin, M.D., Ph.D., D.A.B.P., Pathologist, Pathology, gRED, Genentech, Inc.
  • Geoffrey Baird, M.D., Ph.D., Assistant Professor, Laboratory Medicine, University of Washington
  • Stefano Begolo, Ph.D., Research Scientist, Rustem F. Ismagilov Laboratory, Division of Chemistry and Chemical Engineering, California Institute of Technology
  • Andrew Brooks, Ph.D., COO, RUCDR Infinite Biologics; Associate Professor, Genetics, Rutgers University
  • Wylie Burke, M.D., Ph.D., Professor, Bioethics and Humanities, University of Washington
  • Carolyn Compton, M.D., Ph.D., CMO, National Biomarkers Development Alliance and Professor, School of Life Sciences, Arizona State University
  • Hoda Elgendy, Ph.D., CSO, BioActive Technologies, Inc.
  • Donna Farley, Associate Consultant, Eli Lilly and Company
  • Dayong Gao, Ph.D., Professor, Mechanical Engineering and Bioengineering, University of Washington
  • Allison Hubel, Ph.D., Professor, Mechanical Engineering and Director, Biopreservation Core Resource, University of Minnesota
  • Devon D. Kelly, Director, OHSU Knight BioLibrary, Oregon Health & Science University, Knight Cancer Institute
  • Bernard A. LaSalle, Director, Operations, BMIC, Biomedical Informatics, Center for Clinical and Translational Science, University of Utah
  • Sylvia M. Lee, M.D., Medical Oncologist, Seattle Cancer Care Alliance
  • Michael Liebman, Ph.D., Managing Director, IPQ Analytics, LLC
  • Geoffrey P. Lomax, Ph.D., Senior Officer, Standards Working Group, California Institute for Regenerative Medicine
  • James M. Olson, M.D., Ph.D., Member, Clinical Research Division, Fred Hutchinson Cancer Research Center; Professor, Pediatric Hematology and Oncology, University of Washington; Attending Physician, Seattle Children’s Hospital; Founder, Presage Biosciences and Blaze Bioscience
  • Bruce Pharr, Vice President, Life Science Research Systems, Remedy Informatics
  • Nathan D. Price, Ph.D., Associate Director, Institute for Systems Biology
  • Shannon Sewards, Associate Director, Human Subjects Division, University of Washington
  • Stephen Schmechel, M.D., Ph.D., Associate Professor, UW Medicine Pathology, University of Washington
  • John T. Slattery, Ph.D., Vice Dean, Research and Graduate Education, School of Medicine and Professor, Pharmacology and Medicine, University of Washington School of Medicine
  • Stephanie Elaine Soares, MS, CCRP, Research Administrator, Urology, University of California Davis
  • Suzane Vercauteren, M.D., Ph.D., FRCPC, Head, Division of Hematopathology, BC Children’s Hospital and Clinical Assistant Professor, Pathology and Laboratory Medicine, University of British Columbia
  • Amelia Wall Warner, President and CEO, Gentris
  • Wendell G. Yarbrough, M.D., MMHC, FACS, Professor of Surgery, Otolaryngology and Pathology; Section Chief, Otolaryngology; Co-Director, Molecular Virology Research Program; Director, Head and Neck Cancer Program, Smilow Cancer Hospital, Yale University
  • William H. Yong, M.D., Professor, Pathology and Laboratory Medicine; Neuropathologist; Director, National Neurologic AIDS Bank and Brain Tumor Translational Resource, David Geffen School of Medicine, University of California, Los Angeles

  • Claire Zhu, Ph.D., Program Director, Division of Cancer Prevention, National Cancer Institute, National Institutes of Health

SOURCE

From: Leaders in Biobanking Congress <kerris@healthtech.com>
Date: Tue, 20 May 2014 13:43:27 -0400
To: <avivalev-ari@alum.berkeley.edu>

Read Full Post »

The Bank Where Doctors Can Stash Your Genome

Reporter: Aviva Lev-Ari, PhD, RN

See on Scoop.it – Cardiovascular and vascular imaging

A new company offers a “gene vault” for doctors who want to add genomics to patient care.

 

Genomic sequencing might be more common in medicine if doctors had a simple way to send for the test and keep track of the data.

 

That’s the hope of Coriell Life Sciences in Camden, N.J., a startup that grew out of a partnership between the Coriell Institute for Medical Research and IBM. The company wants to streamline the ordering, storage, and interpretation of whole-genome-sequence data for doctors. It launched in January and is now working with different health-care providers to set up its service.

 

“The intent is that the doctor would order a test like any other diagnostic test they order today,” says Scott Megill, president of Coriell Life Sciences. The company would facilitate sequencing the patient’s DNA (through existing sequencing companies such as Illumina or Ion Torrent), store it in its so-called gene vault, and act as the middleman between doctors and companies that offer interpretation services. Finally, “we will return the genetic result in the human readable form back to the electronic medical record so the doctor can read it and interpret it for the patient,” says Megill.

 

“You need a robust software infrastructure for storing, analyzing, and presenting information,” says Jon Hirsch, who founded Syapse, a California-based company developing software to analyze biological data sets for diagnosing patients. “Until that gets built, you can generate all the data you want, but it’s not going to have any impact outside the few major centers of genomics medicine,” he says.

 

The company will use a board of scientific advisors to guide it to the best interpretation programs available. “No one company is in a position to interpret the entire genome for its meaning,” says Michael Christman, CEO of the Coriell Institute for Medical Research. “But by having one’s sequence in the gene vault, the physician will be able to order interpretative engines, analogous to apps for the iPhone,” he says. Doctors could order an app to analyze a patient’s genome for DNA variants linked to poor drug response at one point and, later on, order another for variants linked to heart disease.
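To make the “interpretative engines” analogy concrete, the following is a minimal sketch in Python of how such a workflow might be organized: the sequence is stored once in a vault, and each interpretation “app” is just a pluggable module run against it on demand. This is an illustrative assumption, not Coriell Life Sciences’ actual software; all names (GeneVault, register_engine, and so on) are hypothetical.

from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class GeneVault:
    # patient identifier -> stored whole-genome sequence data
    sequences: Dict[str, str] = field(default_factory=dict)
    # engine name -> function turning raw sequence into a readable report
    engines: Dict[str, Callable[[str], str]] = field(default_factory=dict)

    def store(self, patient_id: str, sequence: str) -> None:
        # Store a patient's sequence once; interpret it many times later.
        self.sequences[patient_id] = sequence

    def register_engine(self, name: str, engine: Callable[[str], str]) -> None:
        # Adding an engine is analogous to installing an app.
        self.engines[name] = engine

    def interpret(self, patient_id: str, engine_name: str) -> str:
        # Run one engine over a stored sequence and return a report that
        # could be written back to an electronic medical record.
        return self.engines[engine_name](self.sequences[patient_id])

if __name__ == "__main__":
    vault = GeneVault()
    vault.store("patient-001", "ACGTACGT")  # placeholder sequence data

    # Two independent "apps" run against the same stored genome.
    vault.register_engine(
        "pharmacogenomics",
        lambda seq: f"Drug-response variants screened across {len(seq)} bases.")
    vault.register_engine(
        "cardiac-risk",
        lambda seq: f"Heart-disease-linked variants screened across {len(seq)} bases.")

    print(vault.interpret("patient-001", "pharmacogenomics"))
    print(vault.interpret("patient-001", "cardiac-risk"))

The point of such a design is that the genome is sequenced and stored only once; ordering a pharmacogenomics report today and a heart-disease report next year would each be a new interpretation pass, not a new sequencing run.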

 

The cloud-based workflow could help doctors in different locations take advantage of expert interpretations anywhere, says Christman. “This would allow a doctor at a community clinic in Tulsa, Okla., to order an interpretation of breast cancer sequences derived at Sloan Kettering,” he says.

See on mashable.com

Read Full Post »
