
Human genome: UK to become world number 1 in DNA testing

Reporter: Aviva Lev-Ari, PhD, RN
A £300 million investment that will transform how diseases are diagnosed and treated was announced by the Prime Minister today.
The UK is set to become the world leader in ground-breaking genetic research into cancer and rare diseases, which will transform how diseases are diagnosed and treated, thanks to a package of investment worth more than £300 million, the Prime Minister will announce today.

The 4-year project will allow scientists to do pioneering new research to decode 100,000 human genomes – a patient’s personal DNA code. The landmark project is on a scale not seen anywhere else in the world.

It is part of the Prime Minister’s commitment to ensure the NHS as well as the UK’s research and life science sector is at the forefront of global advances in modern medicine.

Sequencing the genome of a person with cancer or someone with a rare disease will help scientists and doctors understand how disease works. The project has the potential to transform the future of health care, with new and better tests, drugs and treatment. It is expected to provide a lifeline to thousands of families affected by rare genetic diseases and cancers.

The Prime Minister has pledged that the UK will map 100,000 human genomes by 2017.

Now, as world leading research organisations join forces, the 100,000 Genomes Project has reached a major milestone in a package of new investment.

The Prime Minister is today unveiling a new partnership between Genomics England and the company Illumina that will deliver infrastructure and expertise to turn the plan into reality. As part of this, Illumina’s services for whole genome sequencing have been secured in a deal worth around £78 million.

In turn, Illumina will invest around £162 million into the work in England over 4 years, creating new knowledge and jobs in the field of genome sequencing. The investment will not only help the life science industry to thrive, but potentially create opportunities for talented UK scientists to lead the world. It will also pave the way for all NHS patients to eventually benefit from this exciting new technology.

This research puts the NHS at the forefront of scientific discovery. This is in line with the Prime Minister’s vision for the NHS to be the first mainstream health service in the world to offer genomic medicine as part of routine care.

Prime Minister David Cameron said:

This agreement will see the UK lead the world in genetic research within years. I am determined to do all I can to support the health and scientific sector to unlock the power of DNA, turning an important scientific breakthrough into something that will help deliver better tests, better drugs and above all better care for patients.

As our plan becomes a reality, I believe we will be able to transform how devastating diseases are diagnosed and treated in the NHS and across the world, while supporting our best scientists and life science businesses to discover the next wonder drug or breakthrough technology.

The Wellcome Trust has invested more than £1 billion in genomic research and has agreed to spend £27 million on a world class sequencing hub at its Genome Campus near Cambridge. This will house Genomics England’s operations alongside those of the internationally respected Sanger Institute.

The agreement will place Genomics England at the heart of one of the world’s most vibrant genomic science and technology clusters, and allow scientists to work with world-class researchers from the Sanger Institute, the European Bioinformatics Institute, and biotechnology companies based on the same site.

The Medical Research Council has also earmarked £24 million to help provide the computing power to make sure that the data of participants will be properly analysed, interpreted and made available to doctors and researchers securely.

NHS England has started the process of selecting the first NHS Genomics Medicine Centres. Successful centres will help to progress this ambitious project by inviting cancer and rare disease patients to take part to have their genome sequenced. NHS England has agreed to underwrite an NHS contribution of up to £20 million over the life of the project.

The cash injection – and new partnerships – will mean excellent progress can be made on the 100,000 Genomes Project. It is expected that around 40,000 NHS patients could benefit directly from the research. Ultimately this work will pave the way for genomics-based medicine to become part of everyday practice throughout the NHS.

Participation in the project will be based on consent, and people’s data will be strictly protected through Genomics England’s secure data services.

Life Sciences Minister George Freeman said:

Genomics England’s ground breaking partnership with Illumina confirms Britain’s position as a world leader in the field of genetic medicine. This project will help us map genomes on an unprecedented scale and bring better treatments to people with cancers and rare diseases for generations to come.

This project is also very important for the economy and the development of life sciences in this country – including creating valuable jobs in Cambridge and beyond.

Sir John Chisholm, Executive Chair of Genomics England said:

This is a real milestone in turning this ambitious project into what we always intended which is a world leading project capable of delivering immense benefit to current and future patients.

Jay Flatley, CEO of Illumina said:

This is a momentous day for the UK to push the boundaries of medical science and create the first comprehensive national program for genomic healthcare.

Illumina is committed to partnering with Genomics England as they look to implement vital changes in the way healthcare is practiced. This project confirms the UK as a leader in the global race to implement genomic technology and create a lasting legacy for patients, the NHS and the UK economy.

Jeremy Farrar, Director of the Wellcome Trust, said:

Understanding humanity’s genetic code is not only going to be fundamental to the medicine of the future. It is an essential part of medicine today. In rare congenital diseases, in cancer and in infections, genomic insights are already transforming diagnosis and treatment.

The Wellcome Trust has invested more than £1 billion in genome research that has built this understanding, including pivotal contributions to the Human Genome Project, the world-leading science of the Sanger Institute, and critical work in global health, medical ethics and public engagement. Genomics England will further exploit this knowledge for medical advances that help patients, within a robust ethical framework that relies on their informed consent, so supporting its efforts is a logical next step. We will be proud to host its sequencing hub alongside Sanger’s at our Hinxton genome campus, and to fund researchers who use its data to investigate disease.

Simon Stevens, NHS England’s Chief Executive said:

The NHS is now set to become one of the world’s ‘go-to’ health services for the development of innovative genomic tests and patient treatments, building on our long track record as the nation that brought humanity antibiotics, vaccines, modern nursing, hip replacements, IVF, CT scanners, and breakthrough discoveries from the circulation of blood to the existence of DNA.

The NHS’ comparative advantage in unlocking patient benefits from the new genomic revolution stems from our unique combination of a large and diverse population, with universal access to care, multi-year data that spans care settings, world-class medicine and science, and an NHS funding system that enables upstream investment in prevention and new ways of working, as demonstrated by this ground-breaking 100,000 Genomes Project.

Genetic disorders and genomics

Rare diseases are individually uncommon, but there are between 5,000 and 8,000 known genetic disorders. Around 3 million people are affected by them; half of these are children.

When the Human Genome Project was undertaken in the early 1990s, it took 13 years and over £2 billion to sequence the first whole human genome. But now, with advances in technology, the time and cost of sequencing a human genome have fallen dramatically.

Our understanding of how to use this information has also increased. We still have a lot to learn, but these advances have opened up the potential use of genomic medicine within mainstream healthcare.

Genomics England is wholly owned by the Department of Health. It was set up to deliver the 100,000 Genomes Project. This flagship project will sequence 100,000 whole genomes from NHS patients by 2017.

Genomics England has 4 main aims:

  • to bring benefit to patients
  • to create an ethical and transparent programme based on consent
  • to enable new scientific discovery and medical insights
  • to kickstart the development of a UK genomics industry

The project is focusing on patients with rare diseases, and their families, as well as patients with common cancers. The project is currently in its pilot phase and the main project begins in 2015.

SOURCE

https://www.gov.uk/government/news/human-genome-uk-to-become-world-number-1-in-dna-testing

Why did this occur? The matter of Individual Actions Undermining Trust, The Patent Dilemma and The Value of a Clinical Trial


Reporter and Curator: Larry H. Bernstein, MD, FCAP

 

The large amount of funding tied to continued research and support of postdoctoral fellows leads one to ask how following the money can lead to discredited work in the elite scientific community.

Moreover, the pressure to publish in prestigious journals with high impact factors is a road to academic promotion. In the last twenty years, it has become unusual to find submissions for review with fewer than 6-8 authors, with the statement that all contributed to the work. These factors can’t be discounted outright, but it is easy for work to fall through the cracks when a key investigator has over 200 publications and holds tenure in a great research environment. But that is where we find ourselves today.

There is another issue that comes up, which is also related to carrying out research and then protecting the work for commercialization. It is more complicated in the sense that it is necessary to determine whether there is prior art, and then there is the possibility that, after the cost of filing a patent and a 6-year delay in obtaining protection, there is as great a cost in bringing the patent to final production.

I.  Individual actions undermining trust.

II. The patent dilemma.

III. The value of a clinical trial.

IV. The value contributions of RAP physicians (radiologists, anesthesiologists, and pathologists – the last for discussion): those who maintain and inform the integrity of medical and surgical decisions

 

I. Top heart lab comes under fire

Kelly Servick

Science 18 July 2014: Vol. 345 no. 6194 p. 254 DOI: 10.1126/science.345.6194.25

 

In the study of cardiac regeneration, Piero Anversa is among the heavy hitters. His research into the heart’s repair mechanisms helped kick-start the field of cardiac cell therapy (see main story). After more than 4 decades of research and 350 papers, he heads a lab at Harvard Medical School’s Brigham and Women’s Hospital (BWH) in Boston that has more than $6 million in active grant funding from the National Institutes of Health (NIH). He is also an outspoken voice in a field full of disagreement.

So when an ongoing BWH investigation of the lab came to light earlier this year, Anversa’s colleagues were transfixed. “Reactions in the field run the gamut from disbelief to vindication,” says Mark Sussman, a cardiovascular researcher at San Diego State University in California who has collaborated with Anversa. By Sussman’s account, Anversa’s reputation for “pushing the envelope” and “challenging existing dogma” has generated some criticism. Others, however, say that the disputes run deeper—to doubts about a cell therapy his lab has developed and about the group’s scientific integrity. Anversa told Science he was unable to comment during the investigation.

“People are talking about this all the time—at every scientific meeting I go to,” says Charles Murry, a cardiovascular pathologist at the University of Washington, Seattle. “It’s of grave concern to people in the field, but it’s been frustrating,” because no information is available about BWH’s investigation. BWH would not comment for this article, other than to say that it addresses concerns about its researchers confidentially.

In April, however, the journal Circulation agreed to Harvard’s request to retract a 2012 paper on which Anversa is a corresponding author, citing “compromised” data. The Lancet also issued an “Expression of Concern” about a 2011 paper reporting results from a clinical trial, known as SCIPIO, on which Anversa collaborated. According to a notice from the journal, two supplemental figures are at issue.

For some, Anversa’s status has earned him the benefit of the doubt. “Obviously, this is very disconcerting,” says Timothy Kamp, a cardiologist at the University of Wisconsin, Madison, but “I would be surprised if it was an implication of a whole career of research.”

Throughout that career, Anversa has argued that the heart is a prolific, lifelong factory for new muscle cells. Most now accept the view that the adult heart can regenerate muscle, but many have sparred with Anversa over his high estimates for the rate of this turnover, which he maintained in the retracted Circulation paper.

Anversa’s group also pioneered a method of separating cells with potential regenerative abilities from other cardiac tissue based on the presence of a protein called c-kit. After publishing evidence that these cardiac c-kit+cells spur new muscle growth in rodent hearts, the group collaborated in the SCIPIO trial to inject them into patients with heart failure. In The Lancet, the scientists reported that the therapy was safe and showed modest ability to strengthen the heart—evidence that many found intriguing and provocative. Roberto Bolli, the cardiologist whose group at the University of Louisville in Kentucky ran the SCIPIO trial, plans to test c-kit+ cells in further clinical trials as part of the NIH-funded Cardiovascular Cell Therapy Research Network.

But others have been unable to reproduce the dramatic effects Anversa saw in animals, and some have questioned whether these cells really have stem cell–like properties. In May, a group led by Jeffery Molkentin, a molecular biologist at Cincinnati Children’s Hospital Medical Center in Ohio, published a paper in Nature tracing the genetic lineage of c-kit+ cells that reside in the heart. He concluded that although they did make new muscle cells, the number is “astonishingly low” and likely not enough to contribute to the repair of damaged hearts. Still, Molkentin says that he “believe[s] in their therapeutic potential” and that he and Anversa have discussed collaborating.

Now, an anonymous blogger claims that problems in the Anversa lab go beyond controversial findings. In a letter published on the blog Retraction Watch on 30 May, a former research fellow in the Anversa lab described a lab culture focused on protecting the c-kit+ cell hypothesis: “[A]ll data that did not point to the ‘truth’ of the hypothesis were considered wrong,” the person wrote. But another former lab member offers a different perspective. “I had a great experience,” says Federica Limana, a cardiovascular disease researcher at IRCCS San Raffaele Pisana in Rome who spent 2 years of her Ph.D. work with the group in 1999 and 2000, as it was beginning to investigate c-kit+ cells. “In that period, there was no such pressure” to produce any particular result, she says.

Accusations about the lab’s integrity, combined with continued silence from BWH, are deeply troubling for scientists who have staked their research on theories that Anversa helped pioneer. Some have criticized BWH for requesting retractions in the midst of an investigation. “Scientific reputations and careers hang in the balance,” Sussman says, “so everyone should wait until all facts are clearly and fully disclosed.”

 

II.  Trolling Along: Recent Commotion About Patent Trolls

July 17, 2014

PricewaterhouseCoopers recently released a study about 2014 patent litigation. PwC’s ultimate conclusion was that case volume has increased vastly while damages continue a general decline, but what’s making headlines everywhere is that “patent trolls” now account for 67% of all new patent lawsuits (see, e.g., Washington Post and Fast Company).

Surprisingly, the word “troll” is nowhere to be found in PwC’s study. So, with regard to patent trolls, what does this study really mean for companies, patent owners and casual onlookers?

First of all, who are these trolls?

“Patent Troll” is a label applied to patent owners who do not make or manufacture a product, or offer a service. Patent trolls live (and die) by suing others for allegedly practicing an invention that is claimed by their patents.

The politically correct term is Non-practicing Entity (NPE). PwC solely uses the term NPE, which it defines as an entity that does not have the capability to design, manufacture, or distribute products with features protected by the patent.

So, what’s so bad about them?

The common impression of an NPE is a business venture looking to collect and monetize assets (i.e., patents). In the most basic strategy, an NPE typically buys patents with broad claims that cover a wide variety of technologies and markets, and then sues a large group of alleged patent infringers in the hope of collecting a licensing royalty or a settlement. NPEs typically don’t want to spend money on a trial unless they have to, and one tactic uses settlements with smaller businesses to build a “war chest” for potential suits with larger companies.

NPEs initiating a lawsuit can be viewed positively, such as a just defense of the lowly inventor who sold his patent to someone (with deeper pockets) who could fund the litigation to protect the inventor’s hard work against a mega-conglomerate who ripped off his idea.

Or NPE litigation can be seen negatively, such as an attorney’s demand letter on behalf of an anonymous shell corporation to shake down dozens of five-figure settlements from all the local small businesses that have ever used a fax machine.

NPEs can waste a company’s valuable time and resources with lawsuits, yet also bring value to their patent portfolios by energizing a patent sales and licensing market. There are unscrupulous NPEs, but it’s hardly the black and white situation that some media outlets are depicting.

What did PwC say about trolls?

Well, the PwC study looked at the success rates and awards of patent litigation decisions. One conclusion is that damages awards for NPEs averaged more than triple those for practicing entities over the last four years. We’ll come back to this statistic.

Another key observation is that NPEs have been successful 25% of the time overall, versus 35% for practicing entities. This makes sense because of the burden of proof the NPEs carry as a plaintiff at trial and the relative lack of success for NPEs at summary judgment. However, PwC’s report states that both types of entities win about two-thirds of their trials.

But what about this “67% of all patent trials are initiated by trolls” discussion?

The 67% number comes from the RPX Corporation’s litigation report (produced January 2014) that quantified the percentage of NPE cases filed in 2013 as 67%, compared to 64% in 2012, 47% in 2011, 30% in 2010 and 28% in 2009.

PwC refers to the RPX statistics to accentuate that this new study indicates that only 20% of decisions in 2013 involved NPE-filed cases, so the general conclusion would be that NPE cases tend to settle or be dismissed prior to a court’s decision. Admittedly, this is indicative of the prevalent “spray and pray” strategy where NPEs prefer to collect many settlement checks from several “targets” and avoid the courtroom.

In this study, who else is an NPE?

If someone were looking to dramatize the role of “trolls,” the name can be thrown around liberally (and hurtfully) to anyone who owns and asserts a patent without offering a product or a service. For instance, colleges and universities fall under the NPE umbrella as their research and development often ends with a series of published papers rather than a marketable product on an assembly line.

In fact, PwC distinguishes universities and non-profits from companies and individuals within their NPE analysis, with only about 5% of the NPE cases from 1995 to 2013 being attributed to universities and non-profits. Almost 50% of the NPE cases are attributed to an “individual,” who could be the listed inventor for the patent or a third-party assignee.

The word “troll” is obviously a derogatory term used to connote greed and hiding (under a bridge), but the term has adopted a newer, meme-like status as trolls are currently depicted as lacking any contribution to society and merely living off of others’ misfortunes and fears. [Three Billy Goats Gruff]. This is not always the truth with NPEs (e.g., universities).

No one wants to be called a troll—especially in front of a jury—so we’ve even recently seen courts bar defendants from referring to NPEs with such colorful terms as “corporate shell,” “bounty hunter,” “privateer,” or someone “playing the lawsuit lottery.” [Judge Koh Bans Use Of Term “Patent Troll” In Apple Jury Trial]

Regardless of the portrayal of an NPE, most people in the patent world distinguish the “trolls” by the strength of the patent, merits of the alleged infringement and their behavior upon notification. Often these are expressed as “frivolity” of the case and “gamesmanship” of the attorneys. Courts are able to punish plaintiffs who bring frivolous claims against a party and state bar associations are tasked with monitoring the ethics of attorneys. The USPTO is tasked with working to strengthen the quality of patents.

What’s the take-away from this study regarding NPEs?

The study focuses on patent litigation that produced a decision; therefore, the most important and relevant conclusion is that, over the last four years, average damages awards for NPEs are more than triple the damages for practicing entities. Everything else in these articles, such as the initiation of litigation by NPEs, settlement percentages, and the general behavior of patent trolls, is pure inference beyond the scope of the study.

This may sound sympathetic to trolls, but keep in mind that the study highlights that NPEs have more than triple the damages on average compared to practicing entities and it is meant to shock the reader a bit. One explanation for this is that NPEs are in the best position to choose the patents they want to assert and choose the targets they wish to sue—especially when the NPE is willing to ride that patent all the way to the end of a long, expensive trial. Sometimes settling is not an option. Chart 2b indicates that the disparity in the damages awarded to NPEs relative to practicing entities has always been big (since 2000), but perhaps going from two-fold from 2000 – 2009 to three times as much in the past 4 years indicates that NPEs are improving at finding patents and/or picking battles to take all the way to a court decision. More than anything, this seems to reflect the growth in the concept of patents as a business asset.
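The two-fold versus three-fold comparison above is just a ratio of typical awards across the two groups. A minimal sketch of that calculation, with invented award amounts rather than PwC’s actual data:

```python
# Ratio of average damages awards, NPEs vs. practicing entities.
# All dollar amounts below are hypothetical, for illustration only.
from statistics import mean

npe_awards = [2.0, 9.5, 15.0, 30.0, 45.0]        # awards in $M (invented)
practicing_awards = [1.0, 3.0, 5.0, 8.0, 12.0]   # awards in $M (invented)

ratio = mean(npe_awards) / mean(practicing_awards)
print(f"NPE average is {ratio:.1f}x the practicing-entity average")
```

With these invented figures the ratio comes out a bit above three, mirroring the “more than triple” headline; the real disparity of course depends on the actual award data PwC analyzed.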

The PwC report is chock full of interesting patterns and trends of litigation results, so it’s a shame that the 67% number makes the headlines—far more interesting are the charts comparing success rates by 4-year periods (Chart 6b) or success rates for NPEs and practicing entities in front of a jury versus in front of a bench (Chart 6c), as well as other tables that reveal statistics for specific districts of the federal courts. Even the stats that look at the success rates of each type of NPE are telling because the reader sees that universities and non-profits have a higher success rate than non-practicing companies or individuals.

What do we do about the trolls?

The White House has recently called for Congress to do something about the trolls as horror stories of scams and shake-downs are shared. A bill was gaining momentum in the Senate when Senator Leahy took it off the agenda in early July. That bill had miraculously passed 325-91 in the House, and President Obama was willing to sign it if the Senate were to pass it. The bill was opposed by trial attorneys, universities, and bio-pharmaceutical businesses who felt that, in trying to hinder just the trolls, the law would severely inhibit everyone’s access to the courts. Regardless, most people think that the sitting Congressmen merely wanted a “win” prior to the mid-term elections and that patent reform is unlikely to reappear until next term.

In the meantime, the Supreme Court has recently reiterated rules concerning attorney fee-shifting on frivolous patent cases, as well as clarifying the validity of software patents. Time will tell if these changes have any effects on the damages awards that PwC’s study examined or even if they cause a chilling of the number of patent lawsuit filings.

Furthermore, new ways to challenge the validity of asserted patents have been initiated via the America Invents Act. For example, the Inter Partes Review (IPR) has yielded frightening preliminary statistics as to slowing, if not killing, patents that have been asserted in a suit. While these administrative trials are not cheap, many view these new tools at the Patent Trial and Appeal Board as anti-troll measures. It will be interesting to watch how the USPTO implements these procedures in the near future, especially while former Google counsel, Acting Director Michelle K. Lee, oversees the office.

In the private sector, Silicon Valley has recently seen a handful of tech companies come together as the License on Transfer Network, a group hoping to disarm the “Patent Assertion Entities.” Joining the LOT Network comes via an agreement that creates a license for use of a patent by anyone in the LOT network once that patent is sold. The thought is that the NPEs who consider purchasing patents from companies in the LOT Network will have fewer companies to sue since the license to the other active LOT participants will have triggered upon the transfer and, thus, the NPE will not be as inclined to “troll.” For instance, if a member-company such as Google were to sell a patent to a non-member company and an NPE bought that patent, the NPE would not be able to sue any members of the LOT Network with that patent.
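The transfer-triggered license described above can be illustrated with a toy model. Everything here (the class, the member names, the patent ID) is hypothetical and greatly simplified relative to the actual LOT agreement:

```python
# Toy model of the License-on-Transfer (LOT) rule described above:
# when a member sells a patent to a non-member, every current member
# automatically receives a license, so a later buyer (e.g. an NPE)
# cannot assert that patent against the members.

class LOTNetwork:
    def __init__(self):
        self.members = set()
        # patent -> set of parties holding a triggered license
        self.licenses = {}

    def join(self, company):
        self.members.add(company)

    def transfer(self, patent, seller, buyer):
        """Record a patent sale; trigger licenses if the seller is a member."""
        if seller in self.members and buyer not in self.members:
            # Licenses vest in favor of all members at the moment of transfer.
            self.licenses.setdefault(patent, set()).update(self.members)

    def can_assert(self, patent, target):
        """Can the patent's current owner sue `target` on this patent?"""
        return target not in self.licenses.get(patent, set())


net = LOTNetwork()
for c in ["MemberA", "MemberB", "MemberC"]:   # hypothetical member companies
    net.join(c)

# MemberA sells a patent outside the network; an NPE later acquires it.
net.transfer("patent-123", seller="MemberA", buyer="Outsider")

print(net.can_assert("patent-123", "MemberB"))   # False: license already triggered
print(net.can_assert("patent-123", "Startup"))   # True: non-members are unprotected
```

The key design point the sketch captures is that the license vests at the moment of transfer, so it survives any later resale of the patent to an NPE.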

Other notes

NPEs are only as evil as the people who run them—that being said, there are plenty of horror stories of small businesses receiving phantom demand letters that threaten a patent infringement suit without identifying the sender or the patent. This is an out-and-out scam and a plague on society that results in wasted time and resources, and inevitably higher prices on the consumer end.

It is a sin and a shame that patent rights can be misused in scams and shake-downs of businesses around us, but there is a reason that U.S. courts are so often used to defend patent rights. The PwC study, at minimum, reflects the high stakes of the patent market and perhaps the fragility. Nevertheless, merely monitoring the courts may not keep the trolls at bay.

I’d love to hear your thoughts.

*This is provided for informational purposes only, and does not constitute legal or financial advice. The information expressed is subject to change at any time and should be checked for completeness, accuracy and current applicability. For advice, consult a suitably licensed attorney or patent agent.

 

III. Large-scale analysis finds majority of clinical trials don’t provide meaningful evidence


04 May 2012

DURHAM, N.C.— The largest comprehensive analysis of ClinicalTrials.gov finds that clinical trials are falling short of producing the high-quality evidence needed to guide medical decision-making. The analysis, published today in JAMA, found that the majority of clinical trials are small and that there are significant differences among methodological approaches, including randomization, blinding and the use of data monitoring committees.

“Our analysis raises questions about the best methods for generating evidence, as well as the capacity of the clinical trials enterprise to supply sufficient amounts of high quality evidence to ensure confidence in guideline recommendations,” said Robert Califf, M.D., first author of the paper, vice chancellor for clinical research at Duke University Medical Center, and director of the Duke Translational Medicine Institute.

The analysis was conducted by the Clinical Trials Transformation Initiative (CTTI), a public-private partnership founded by the Food and Drug Administration (FDA) and Duke. It extends the usability of the data in ClinicalTrials.gov for research by placing the data through September 27, 2010 into a database structured to facilitate aggregate analysis. This publicly accessible database facilitates the assessment of the clinical trials enterprise in a more comprehensive manner than ever before and enables the identification of trends by study type.

 

The National Library of Medicine (NLM), a part of the National Institutes of Health, developed and manages ClinicalTrials.gov. This site maintains a registry of past, current, and planned clinical research studies.

“Since 2007, the Food and Drug Administration Amendment Act has required registration of clinical trials, and the expanded scope and rigor of trial registration policies internationally is producing more complete data from around the world,” stated Deborah Zarin, MD, director, ClinicalTrials.gov, and assistant director for clinical research projects, NLM. “We have amassed over 120,000 registered clinical trials. This rich repository of data has a lot to say about the national and international research portfolio.”

This CTTI project was a collaborative effort by informaticians, statisticians and project managers from NLM, FDA and Duke. CTTI comprises more than 60 member organizations with the goal of identifying practices that will improve the quality and efficiency of clinical trials.

“Since the ClinicalTrials.gov registry contains studies sponsored by multiple entities, including government, industry, foundations and universities, CTTI leaders recognized that it might be a valuable source for benchmarking the state of the clinical trials enterprise,” stated Judith Kramer, MD, executive director of CTTI.

The project goal was to produce an easily accessible database incorporating advances in informatics to permit a detailed characterization of the body of clinical research and facilitate analysis of groups of studies by therapeutic areas, by type of sponsor, by number of participants and by many other parameters.

“Analysis of the entire portfolio will enable the many entities in the clinical trials enterprise to examine their practices in comparison with others,” says Califf. “For example, 96% of clinical trials have ≤1000 participants, and 62% have ≤ 100. While there are many excellent small clinical trials, these studies will not be able to inform patients, doctors and consumers about the choices they must make to prevent and treat disease.”
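Once the registry data sits in a database built for aggregate analysis, enrollment statistics like those Califf cites are simple threshold queries. A minimal sketch, using invented records and an assumed `enrollment` field rather than the real ClinicalTrials.gov schema:

```python
# Sketch: compute the share of registered trials at or below a given
# enrollment size, the kind of aggregate statistic quoted above.
# The sample records and field names are illustrative only.

trials = [
    {"id": "T001", "area": "cardiovascular", "enrollment": 2400},
    {"id": "T002", "area": "oncology",       "enrollment": 85},
    {"id": "T003", "area": "mental health",  "enrollment": 140},
    {"id": "T004", "area": "oncology",       "enrollment": 60},
    {"id": "T005", "area": "cardiovascular", "enrollment": 900},
]

def share_at_or_below(records, threshold):
    """Fraction of trials with enrollment <= threshold."""
    hits = sum(1 for r in records if r["enrollment"] <= threshold)
    return hits / len(records)

print(f"<=100 participants:  {share_at_or_below(trials, 100):.0%}")
print(f"<=1000 participants: {share_at_or_below(trials, 1000):.0%}")
```

Run over the full registry rather than this toy sample, queries of this shape yield the 62% and 96% figures in the quote, and grouping by the therapeutic-area field would give the size comparisons discussed below.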

The analysis showed heterogeneity in median trial size, with cardiovascular trials tending to be twice as large as those in oncology and trials in mental health falling in the middle. It also showed major differences in the use of randomization, blinding, and data monitoring committees, critical issues often used to judge the quality of evidence for medical decisions in clinical practice guidelines and systematic overviews.

“These results reinforce the importance of exploration, analysis and inspection of our clinical trials enterprise,” said Rachel Behrman Sherman, MD, associate director for the Office of Medical Policy at the FDA’s Center for Drug Evaluation and Research. “Generation of this evidence will contribute to our understanding of the number of studies in different phases of research, the therapeutic areas, and ways we can improve data collection about clinical trials, eventually improving the quality of clinical trials.”


IV.  Lawmakers urge CMS to extend MU hardship exemption for pathologists

 

Eighty-nine members of Congress have asked the Centers for Medicare & Medicaid Services to give pathologists a break and extend the hardship exemption they currently enjoy for all of Stage 3 of the Meaningful Use program. In the letter, dated July 10 and addressed to CMS Administrator Marilyn Tavenner, the lawmakers point out that CMS had recognized in its 2012 final rule implementing Stage 2 of the program that it was difficult for pathologists to meet the Meaningful Use requirements, and granted a one-year exception for 2015, the first year that penalties will be imposed. They now are asking that the exception be expanded to cover the full five-year maximum allowed under the American Recovery and Reinvestment Act.

“Pathologists have limited direct contact with patients and do not operate in EHRs,” the letter states. “Instead, pathologists use sophisticated computerized laboratory information systems (LISs) to support the work of analyzing patient specimens and generating test results. These LISs exchange laboratory and pathology data with EHRs.”

Interestingly, the lawmakers’ exemption request is only on behalf of pathologists, even though CMS had granted the one-year hardship exception to pathologists, radiologists and anesthesiologists.

Rep. Tom Price (R-Ga.), one of the members spearheading the letter, had also introduced a bill (H.R. 1309) in March 2013 that would exclude pathologists from the incentives and penalties of the Meaningful Use program. The bill, which has 31 cosponsors, is currently sitting in committee. That bill also does not include relief for radiologists or anesthesiologists.

CMS has provided some flexibility about the hardship exceptions in the past, most recently by allowing providers to apply for one due to EHR vendor delays in upgrading to Stage 2 of the program.

However, CMS also noted in the 2012 rule granting the one-year exception that it was granting the exception in large part because of the then-current lack of health information exchange and that “physicians in these three specialties should not expect that this exception will continue indefinitely, nor should they expect that we will grant the exception for the full 5-year period permitted by statute.”

To learn more:
– read the letter (.pdf)


Life-work in Engineering of Improved Heart Valve

Curator and Reporter: Larry H Bernstein, MD, FCAP

 

An authority on cardiovascular valve devices, and author of the leading book on the subject, is challenged by a patient’s mother to go beyond what is available. The results are splendid after re-engineering the design to fit the problem.

 

Reverse Engineering A Human Heart Valve

By Jim Pomager

aortic valve – a remarkable piece of biomechanical engineering

The aortic valve is a remarkable piece of biomechanical engineering. On any given day, the leaflets (or cusps) of a healthy aortic valve will open and close 100,000+ times, allowing the proper amount of blood to flow from the heart to the rest of the body. Over a lifetime, a healthy valve endures more than 3.4 billion heartbeats.

Unfortunately, the aortic valve doesn’t always remain healthy. (What organ does?) According to the American Heart Association, up to 1.5 million people in the United States suffer from aortic stenosis (AS), a calcification of the aortic valve that narrows its opening and restricts blood flow. In the early stages, the disease is often asymptomatic, but as it progresses, it can cause chest pain, weakness, and difficulty breathing. And in approximately 300,000 people worldwide, the condition develops into severe AS, which has a one-year survival rate of approximately 50 percent, if left untreated.

Fortunately, there are treatment options. The most common and successful is aortic valve replacement (AVR), wherein a mechanical or tissue-based valve is substituted for the diseased valve. For decades, replacement valves were implanted via open heart surgery, which involves an extended hospital stay and months of recovery. But in recent years, a promising new approach has emerged: transcatheter aortic valve implantation (TAVI), also known as transcatheter aortic valve replacement (TAVR). In TAVI, a tissue-based artificial valve is delivered into the diseased heart valve via a blood vessel, rather than through a large incision in the chest.

TAVI has many benefits, the most obvious (and compelling) of which is its noninvasiveness, which means shorter recovery times and faster attainment of quality-of-life outcomes for the patient. Replacement of a transcatheter aortic valve (TAV) can also be a minimally invasive exercise — a second TAV can simply be implanted within the first.

On the other hand, the use of TAVI procedures in U.S. hospitals is not yet widespread (though it is growing rapidly). The longevity of current-generation TAVs also remains unknown because the technology is still emerging, compared with the 15+ years of evidence behind surgically implanted heart valves. Plus, TAVI is only approved in the U.S. for use in AS patients who are either ineligible for surgical valve replacement or at high risk. (TAVI has been available in Europe since 2007, and clinical trials are underway in the U.S. for its use in intermediate-risk patients.)

What’s really needed is an improved TAV — one that outperforms current transcatheter valves, is as durable as a surgical valve, and operates more like … well, a healthy human aortic valve. Such a valve would open the door to TAVI’s use in the hundreds of thousands of lower-risk (and generally younger) AS patients whose only current option is a surgically implanted valve, and who would rather not have their chest opened.

Now, a man who has dedicated his professional career to studying the aortic valve has invented a new artificial valve design that he says will revolutionize TAVI. And if everything goes according to plan, his TAV will reach European patients in 2015 and U.S. patients soon after. How did he and his startup company design such technology? By reverse engineering the aortic valve.

The Man Behind The Valve

Mano Thubrikar

Mano Thubrikar quite literally wrote the book on heart valves and heart disease — two of them, in fact. His The Aortic Valve (1989) and Vascular Mechanics and Pathology (2007) are leading textbooks in cardiovascular studies, and the former is widely used as a guide in the design of bioprosthetic heart valves.

After earning an undergraduate degree in metallurgy, a master’s in materials science, and a Ph.D. in biomedical engineering, Dr. Thubrikar spent the first 30 years of his career exclusively in academic research. He studied the aortic valve and bioprostheses from almost every conceivable angle while working at the University of Virginia (UVA) and at the Carolinas Medical Center and the University of North Carolina (UNC) at Charlotte.

But in 2003, Dr. Thubrikar received a phone call that would change the trajectory of his career and set him on the path to develop a novel TAV technology. A woman contacted him to discuss her son, a 35-year-old athlete with a calcified aortic valve. The condition was the result of a bicuspid valve, a congenital condition where the aortic valve has two cusps, rather than the customary three. The man needed a valve replacement, and his only choice was to have a mechanical heart valve surgically implanted. However, the surgical valve meant he would have to stay on anticoagulants for the rest of his life, effectively ending his athletic pursuits. Dr. Thubrikar informed the mother that there just weren’t any treatments available that would allow her son to continue his active lifestyle.

“Didn’t you write the book on the aortic valve?” she asked. “Why didn’t you make a valve that my son could use?”

The conversation and question deeply affected the researcher. “I went home and was so disturbed,” he told me during a recent visit to his office. “I talked to my wife and said, ‘You know what? Years of research, writing papers, and giving presentations — that’s done. I now need to make a heart valve.’”

Soon after, Dr. Thubrikar left Carolinas Medical Center to embark on his new mission. He joined artificial heart valve pioneer Edwards Lifesciences as a Distinguished Scientist, but left after it became clear that the company’s plans for him didn’t align with his own.

So in 2007 — coincidentally, the same year Edwards launched the first commercially available TAV device — Dr. Thubrikar returned to academia, joining the staff at the South Dakota School of Mines & Technology. There he spent the next three years working on a new artificial valve design — one based on decades of research on the physics behind the human aortic valve.

Looking To The Human Body For Design Output
According to Dr. Thubrikar’s research, the natural aortic valve follows four strong design principles for maximum longevity and optimal hemodynamic performance. Those criteria are:

1. A specific coaptation height — When the valve’s three leaflets come together to close the valve, there is some surface-to-surface contact between the leaflets, rather than an edge-to-edge seal. This safety margin helps prevent blood leakage back into the left ventricle.

2. No folds in the leaflets — Natural aortic valve cusps flex without folding. Folds would crease the tissue and cause unwanted stress on the leaflets, negatively impacting durability.

3. Minimum overall height — Extra height would produce dead space, which can lead to a variety of issues.

4. Minimum leaflet flexion — The human aortic valve manages to open completely with the leaflets moving only 70 degrees, not the 90 degrees you might expect. Again, this improves the valve’s longevity.

“You almost need to be a solid geometry design engineer to understand the math and the equations behind these principles,” he explained. “With these criteria, however, you have design parameters for the aortic valve. The mathematical equations give you the output of how an artificial valve should be designed.”

Dimensions of the natural aortic valve

Based on these four principles, Dr. Thubrikar reverse engineered the aortic heart valve, developing a new artificial valve design that mimics the aortic valve’s precise geometry. In October 2010, he launched a startup company called Thubrikar Aortic Valve, Inc. to commercialize his new creation, which he calls Optimum TAV and touts as “nature’s valve by design.”

“When someone asks me, ‘How does your valve compare with Edwards’?’ or ‘How does your valve compare with Medtronic’s?’, I say ‘We don’t compare our valve to them,'” Dr. Thubrikar told me. “We compare our valve with the natural aortic valve.”

On the surface, Optimum TAV looks similar to other artificial heart valves on the market, with three leaflets of bovine pericardium tissue mounted on a metal stent-frame. (In fact, the design is often mistaken for another widely used surgical valve.) But according to Dr. Thubrikar, it has a unique combination of features that will help it overcome the major design limitations of current-generation TAVs (if we’re going to compare). Those design limitations include:

  • Suture holes in the leaflet body — While all TAVs (including Optimum TAV) are constructed by sewing animal tissue to a metal frame, piercing the flexion zone of the leaflets leads to potential wear. Optimum TAV does not have a single suture hole in the working portion of the leaflet body.
  • Blood flow through frame — Some TAV frames are as tall as 5 cm, extending up into the aorta once implanted. As a result, blood must pass through the frame to enter the coronary arteries. Proteins in the blood will accumulate on the frame and can eventually break loose and cause thromboembolisms (blood clots). Optimum TAV is only 2 cm in height. (Relatedly, the low height of the Thubrikar valve also makes it less likely to require a pacemaker.)
  • Thick outer frame — The thicker the frame, the smaller the valve opening will be, allowing less blood to pass through. This opening is referred to as the valve’s EOA, or effective orifice area. The average EOA of a surgical valve is around 1.9 cm2, and some TAVs have EOAs as small as 1.5 cm2 (technically, a mild form of stenosis). In bench tests, Optimum TAV’s EOA was 2.3 to 2.4 cm2. (A healthy aortic valve has an EOA of approximately 2.7 cm2.)
  • Clipped calcified leaflets — Some current TAVs are anchored to the patient’s original valve using a paper-clip-like mechanism. In this design, there is the potential that the TAV’s leaflets will come into contact with the old, calcified leaflets during the operation, causing wear. Optimum TAV’s design eliminates the possibility of contact between the leaflets and the native valve.
  • Paravalvular leakage — In some cases, a space forms between the outside of a TAV and the surrounding heart tissue, and blood can leak through. Optimum TAV has a high skirt to prevent this type of gap from developing. In addition, Optimum TAV’s novel frame architecture allows it to conform to and seal off either a round or elliptical annulus (the ring-shaped base of the original valve). This is particularly helpful in minimizing or eliminating leakage in bicuspid patients, who often have an irregularly shaped annulus.
  • Balloon expansion — TAV frames made of stainless steel must be forced open by a balloon. The TAV’s tissue can get caught between the balloon and the frame and potentially tear. Optimum TAV’s frame is made of nitinol, which automatically expands once deployed from the catheter.

 

Optimum TAV

“Other technologies have built-in issues,” Dr. Thubrikar said. “To be able to avoid those problems in a comprehensive fashion is no small feat.”

Trial By Fire
During the two and a half years following the establishment of Thubrikar Aortic Valve, Optimum TAV seemed to be moving steadily toward market. The company raised enough funding to get started, primarily from friends, family, physicians, entrepreneurs, and technology industry executives. Patent applications were filed, suppliers were selected, valves were painstakingly produced (by hand, over one-and-a-half to two days each), and preclinical testing began.

Members of the Thubrikar Aortic Valve team (left to right): Deodatt Wadke, member of the board of directors and cofounder; Samir Wadke, executive director of business development and cofounder; Dr. Mano Thubrikar, president and founder; Samuel Evans, research engineer II; and Nikhil Heble, counsel, secretary, and cofounder

But the fledgling company was dealt a major setback in April 2013, when a fire destroyed the Horsham, Pa. office building to which the Thubrikar Aortic Valve laboratory had recently relocated (from South Dakota). All of its equipment was destroyed and needed to be replaced. The company had to relocate to nearby Norristown, Pa. Not an ideal scenario for a startup trying to make the most of extremely limited resources.

The company was undeterred by the fire, and the last year has been a successful one for Thubrikar. The company completed most of its preclinical testing (including implants in 12 animals and two diseased human cadaver hearts), reached design freeze on Optimum TAV, filed a provisional patent application for its proprietary delivery catheter, and achieved almost $2 million in total funding. Perhaps the biggest milestone came in August 2013, when Optimum TAV met the International Organization for Standardization’s (ISO’s) durability requirements by surpassing 200 million cycles in a third-party, ISO-certified laboratory.

The durability testing has continued, and Optimum TAV continues to function beyond 390 million cycles, which approximates 11 years in vivo. Surgical valves typically last anywhere from 12 to 18 years, and Thubrikar expects his valve to last at least that long.
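The cycle-to-years conversion follows directly from the ~100,000 cycles a valve sees per day. A small sketch of the arithmetic behind both the ISO milestone and the ongoing test:

```python
# Convert bench-test cycle counts to approximate in vivo years,
# assuming ~100,000 valve cycles per day.
cycles_per_day = 100_000
cycles_per_year = cycles_per_day * 365

iso_requirement = 200e6   # ISO durability milestone (cycles)
tested_cycles = 390e6     # cycles reached so far in testing

print(f"ISO milestone:   {iso_requirement / cycles_per_year:.1f} years")
print(f"Testing to date: {tested_cycles / cycles_per_year:.1f} years")
```

The 390 million cycles reached so far correspond to about 10.7 years, matching the roughly 11 years in vivo cited above.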

“I would not be surprised if it surpasses the longevity of even the surgical valve,” he said.

The company also received its first institutional investment, from Delaware Crossing Investor Group (DCIG), in 2014. The primary DCIG investor, Marv Woodall, led the commercialization of the world’s first stents as president of Johnson & Johnson Interventional Systems (now Cordis) and sat on the board of directors of the first TAV company, Percutaneous Valve Technologies (PVT, now part of Edwards Lifesciences). Thubrikar has recruited him as its business advisor.

What Lies Ahead
Like many other developers of novel medical devices, Thubrikar Aortic Valve has decided to take its product to market through Europe initially, given European regulators’ comfort level with TAV and the FDA’s steep requirements for clinical trials. “We have spoken to the FDA and will continue to do so on a regular basis,” according to Dr. Thubrikar. “But they asked for a lot more preclinical testing than the European Notified Bodies to start a clinical trial.”

The company is now working to raise an additional $2 million to $10 million, and expects its patent for Optimum TAV to be granted in 2014. The funds will enable Thubrikar not only to conduct a first-in-human (FIH) feasibility study in up to 15 patients this year, but also to expand to a full European clinical trial of about 65 additional patients in 2015. If all goes well, a 2015 CE Mark for Optimum TAV isn’t out of the question.

However, trial success is vital, since today’s investors — and large companies in search of technology acquisitions — wait for significant clinical data to accumulate before backing a medical device. “We realize that until we actually implant the valve in a patient, other companies will think, ‘You don’t know what can go wrong,'” Dr. Thubrikar explained. “We had one big company say, ‘We will pay you four times as much once the product is in a patient.’ They want you to de-risk everything, to work out all the bugs yourself on your own dime.”

Yet Dr. Thubrikar thinks it’s only a matter of time until his life’s work finally arrives in the hands of interventional cardiologists, who he said have been “knocking at his door” since he first presented a paper on the technology in 2012. Since then, he has spoken at several of the largest interventional cardiology conferences, and word continues to spread about Optimum TAV. Like many other researchers-turned-entrepreneurs, he steadfastly believes that his invention will eventually reach the market, where it can begin helping patients — like the one whose mother contacted him a decade ago.

“If hell freezes over, if we don’t get any money, I don’t care,” he said. “I don’t care how it happens. We are going to make a heart valve. That’s the only mission in my life.”

For more information on Thubrikar Aortic Valve and Optimum TAV, visit http://tavi.us/.

 

 

 

 


USPTO Guidance On Patentable Subject Matter

Curator and Reporter: Larry H Bernstein, MD, FCAP


Revised 4 July, 2014

http://pharmaceuticalintelligence.com/2014/07/03/uspto-guidance-on-patentable-subject-matter

 

I came across a few recent articles on US Patent Office guidance on patentability, as well as on Supreme Court rulings on claims. I filed several patents on clinical laboratory methods early in my career, on the recommendation of my brother-in-law, now deceased. Years later, with both my brother-in-law and my patent attorney gone, I look back, more than $100,000 later, with many trips to the USPTO, opportunities not taken, and a one-year provisional patent behind me, and ask what I have learned.

My conclusions are:

(1) Patents protect the innovator, who may realize legal protection, but the cost and time investment can well exceed the cost of starting up and building the small enterprise that would be the next step.

(2) The other thing to consider is the capability of the lawyer or firm that represents you. Even a well-executed patent can be expected to take 5 to 7 years to go through with due diligence. I would not expect it to be done well by a university with many other competing demands, though I might be wrong in this respect: the climate has changed, research universities have sprouted engines for change, and experienced, productive faculty are encouraged, or at least allowed, to form their own such entities.

(3) The emergence of Big Data, computational biology, and very large data warehouses for data use and integration has changed the landscape. The resources required to pursue research along these lines are quite beyond an individual’s sole capacity without outside funding. In addition, the change to a first-to-file requirement has muddied the waters.

Of course, one can propose without anything published in the public domain. That makes it possible for corporate entities to file thousands of patents, whether or not there is actual validation at the time of filing. It would be a trying experience for anyone to pursue in the USPTO without some litigation over ownership of patent rights. At this stage of technology development, I have come to realize that the organization of research, peer review, and archiving of data is still at a stage where some of the best systems available for storing and accessing data fall considerably short of what is needed for the most complex tasks, even though improvements have come at an exponential pace.

I shall not comment on the contested views held by physicists, chemists, biologists, and economists over the completeness of strongly held guiding theories. Only history will tell. Beliefs can hold a strong sway, and have many times held us back.

I am not an expert in legal matters, but it is incomprehensible to me that issues concerning technology innovation are adjudicated in the Supreme Court, as has occurred in recent years. I have postgraduate degrees in medicine and developmental anatomy, and post-medical training in pathology and laboratory medicine, as well as experience in analytical and research biochemistry. These types of cases exceed the competencies expected of the Supreme Court, or even of the Federal District Courts, where we see them with increasing frequency, as has occurred with respect to the development and application of the human genome.

I’m not sure that these developments can be resolved for the public good without a fuller development of an open-access system of publishing. Now I present some recent publications about, or published by, the USPTO.

DR ANTHONY MELVIN CRASTO

Dr. Anthony Melvin Crasto – Organic Chemistry and New Drug Development

USPTO Guidance On Patentable Subject Matter: Impediment to Biotech Innovation

Joanna T. Brougher, David A. Fazzolare J Commercial Biotechnology 2014 20(3):Brougher


Abstract

In June 2013, the U.S. Supreme Court issued a unanimous decision upending more than three decades’ worth of established patent practice when it ruled that isolated gene sequences are no longer patentable subject matter under 35 U.S.C. Section 101. While many practitioners in the field believed that the USPTO would interpret the decision narrowly, the USPTO actually expanded its scope when it issued its guidelines for determining whether an invention satisfies Section 101.

The guidelines were met with intense backlash, with many arguing that they unnecessarily expanded the scope of the Supreme Court cases in a way that could unduly restrict the scope of patentable subject matter, weaken the U.S. patent system, and create a disincentive to innovation. By undermining patentable subject matter in this way, the guidelines may end up harming not only the companies that patent medical innovations, but also the patients who need medical care. This article examines the guidelines and their impact on various technologies.

Keywords:   patent, patentable subject matter, Myriad, Mayo, USPTO guidelines

Full Text: PDF

References

35 U.S.C. Section 101 states: “Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.”

Prometheus Laboratories, Inc. v. Mayo Collaborative Services, 566 U.S. ___ (2012)

Association for Molecular Pathology et al., v. Myriad Genetics, Inc., 569 U.S. ___ (2013).

Parke-Davis & Co. v. H.K. Mulford Co., 189 F. 95, 103 (C.C.S.D.N.Y. 1911)

USPTO. Guidance For Determining Subject Matter Eligibility Of Claims Reciting Or Involving Laws of Nature, Natural Phenomena, & Natural Products.

http://www.uspto.gov/patents/law/exam/myriad-mayo_guidance.pdf

Funk Brothers Seed Co. v. Kalo Inoculant Co., 333 U.S. 127, 131 (1948)


Courtney C. Brinckerhoff, “The New USPTO Patent Eligibility Rejections Under Section 101.” PharmaPatentsBlog, published May 6, 2014, accessed http://www.pharmapatentsblog.com/2014/05/06/the-new-patent-eligibility-rejections-section-101/


DOI: http://dx.doi.org/10.5912/jcb664

 

Science 4 July 2014; 345 (6192): pp. 14-15  DOI: http://dx.doi.org/10.1126/science.345.6192.14
  • IN DEPTH

INTELLECTUAL PROPERTY

Biotech feels a chill from changing U.S. patent rules

A 2013 Supreme Court decision that barred human gene patents is scrambling patenting policies.


A year after the U.S. Supreme Court issued a landmark ruling that human genes cannot be patented, the biotech industry is struggling to adapt to a landscape in which inventions derived from nature are increasingly hard to patent. It is also pushing back against follow-on policies proposed by the U.S. Patent and Trademark Office (USPTO) to guide examiners deciding whether an invention is too close to a natural product to deserve patent protection. Those policies reach far beyond what the high court intended, biotech representatives say.

“Everything we took for granted a few years ago is now changing, and it’s generating a bit of a scramble,” says patent attorney Damian Kotsis of Harness Dickey in Troy, Michigan, one of more than 15,000 people who gathered here last week for the Biotechnology Industry Organization’s (BIO’s) International Convention.

At the meeting, attorneys and executives fretted over the fate of patent applications for inventions involving naturally occurring products—including chemical compounds, antibodies, seeds, and vaccines—and traded stories of recent, unexpected rejections by USPTO. Industry leaders warned that the uncertainty could chill efforts to commercialize scientific discoveries made at universities and companies. Some plan to appeal the rejections in federal court.

USPTO officials, meanwhile, implored attendees to send them suggestions on how to clarify and improve its new policies on patenting natural products, and even announced that they were extending the deadline for public comment by a month. “Each and every one of you in this room has a moral duty … to provide written comments to the PTO,” patent lawyer and former USPTO Deputy Director Teresa Stanek Rea told one audience.

At the heart of the shake-up are two Supreme Court decisions: the ruling last year in Association for Molecular Pathology v. Myriad Genetics Inc. that human genes cannot be patented because they occur naturally (Science, 21 June 2013, p. 1387); and the 2012 Mayo v. Prometheus decision, which invalidated a patent on a method of measuring blood metabolites to determine drug doses because it relied on a “law of nature” (Science, 12 July 2013, p. 137).

Myriad and Mayo are already having a noticeable impact on patent decisions, according to a study released here. It examined about 1000 patent applications that included claims linked to natural products or laws of nature that USPTO reviewed between April 2011 and March 2014. Overall, examiners rejected about 40%; Myriad was the basis for rejecting about 23% of the applications, and Mayo about 35%, with some overlap, the authors concluded. That rejection rate would have been in the single digits just 5 years ago, asserted Hans Sauer, BIO’s intellectual property counsel, at a press conference. (There are no historical numbers for comparison.) The study was conducted by the news service Bloomberg BNA and the law firm Robins, Kaplan, Miller & Ciresi in Minneapolis, Minnesota.
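The reported shares imply, via inclusion-exclusion, how often the two decisions were cited together. A hedged sketch, assuming every rejection in the sample cited at least one of the two rulings (the study itself says only that there was "some overlap"):

```python
# Inclusion-exclusion on the study's reported shares:
# P(Myriad or Mayo) = P(Myriad) + P(Mayo) - P(both)
rejected_either = 0.40  # share of applications rejected (about 40%)
myriad = 0.23           # share rejected on the basis of Myriad
mayo = 0.35             # share rejected on the basis of Mayo

both = myriad + mayo - rejected_either
print(f"Rejections citing both decisions: {both:.0%}")
```

Under that assumption, roughly 18% of the sampled applications would have been rejected under both rationales.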


The numbers suggest USPTO is extending the decisions far beyond diagnostics and DNA, attorneys say. Harness Dickey’s Kotsis, for example, says a client recently tried to patent a plant extract with therapeutic properties; it was different from anything in nature, Kotsis argued, because the inventor had altered the relative concentrations of key compounds to enhance its effect. Nope, decided USPTO, too close to nature.

In March, USPTO released draft guidance designed to help its examiners decide such questions, setting out 12 factors for them to weigh. For example, if an examiner deems a product “markedly different in structure” from anything in nature, that counts in its favor. But if it has a “high level of generality,” it gets dinged.

The draft has drawn extensive criticism. “I don’t think I’ve ever seen anything as complicated as this,” says Kevin Bastian, a patent attorney at Kilpatrick Townsend & Stockton in San Francisco, California. “I just can’t believe that this will be the standard.”

USPTO officials appear eager to fine-tune the draft guidance, but patent experts fear the Supreme Court decisions have made it hard to draw clear lines. “The Myriad decision is hopelessly contradictory and completely incoherent,” says Dan Burk, a law professor at the University of California, Irvine. “We know you can’t patent genetic sequences,” he adds, but “we don’t really know why.”

Get creative in using Draft Guidelines!

For now, Kotsis says, applicants will have to get creative to reduce the chance of rejection. Rather than claim protection for a plant extract itself, for instance, an inventor could instead patent the steps for using it to treat patients. Other biotech attorneys may try to narrow their patent claims. But there’s a downside to that strategy, they note: narrower patents can be harder to protect from infringement, making them less attractive to investors. Others plan to wait out the storm, predicting USPTO will ultimately rethink its guidance and ease the way for new patents.

 

Public comment period extended

USPTO has extended the deadline for public comment to 31 July, with no schedule for issuing final language. Regardless of the outcome, however, Stanek Rea warned a crowd of riled-up attorneys that, in the world of biopatents, “the easy days are gone.”

 

United States Patent and Trademark Office

Today we published and made electronically available a new edition of the Manual of Patent Examining Procedure (MPEP): http://www.uspto.gov/web/offices/pac/mpep/index.html

Summary of Changes

PDF  Title Page
PDF  Foreword
PDF  Introduction
PDF  Table of Contents
PDF  Chapter 600 – Parts, Form, and Content of Application
PDF  Chapter 700 – Examination of Applications
PDF  Chapter 800 – Restriction in Applications Filed Under 35 U.S.C. 111; Double Patenting
PDF  Chapter 900 – Prior Art, Classification, and Search
PDF  Chapter 1000 – Matters Decided by Various U.S. Patent and Trademark Office Officials
PDF  Chapter 1100 – Statutory Invention Registration (SIR); Pre-Grant Publication (PGPub) and Preissuance Submissions
PDF  Chapter 1200 – Appeal
PDF  Chapter 1300 – Allowance and Issue
PDF  Appendix L – Patent Laws
PDF  Appendix R – Patent Rules
PDF  Appendix P – Paris Convention
PDF  Subject Matter Index
PDF  Zipped version of the MPEP current revision in the PDF format.

Manual of Patent Examining Procedure (MPEP), Ninth Edition, March 2014

The USPTO continues to offer an online discussion tool for commenting on selected chapters of the Manual. To participate in the discussion and to contribute your ideas go to:
http://uspto-mpep.ideascale.com.


Note: For current fees, refer to the Current USPTO Fee Schedule.
Consolidated Laws – The patent laws in effect as of May 15, 2014.
Consolidated Rules – The patent rules in effect as of May 15, 2014.
MPEP Archives (1948 – 2012)
Current MPEP: Searchable MPEP

The documents updated in the Ninth Edition of the MPEP, dated March 2014, include changes that became effective in November 2013 or earlier. All of the documents have been updated for the Ninth Edition except Chapters 800, 900, 1000, 1300, 1700, 1800, 1900, 2000, 2300, 2400, 2500, and Appendix P. More information about the changes and updates is available from the “Blue Page – Introduction” of the Searchable MPEP or from the “Summary of Changes” link to the HTML and PDF versions provided below.

Discuss the Manual of Patent Examining Procedure (MPEP)

Welcome to the MPEP discussion tool!

We have received many thoughtful ideas on Chapters 100-600 and 1800 of the MPEP as well as on how to improve the discussion site. Each and every idea submitted by you, the participants in this conversation, has been carefully reviewed by the Office, and many of these ideas have been implemented in the August 2012 revision of the MPEP and many will be implemented in future revisions of the MPEP. The August 2012 revision is the first version provided to the public in a web based searchable format. The new search tool is available at http://mpep.uspto.gov. We would like to thank everyone for participating in the discussion of the MPEP.

We have some great news! Chapters 1300, 1500, 1600 and 2400 of the MPEP are now available for discussion. Please submit any ideas and comments you may have on these chapters. Also, don’t forget to vote on ideas and comments submitted by other users. As before, our editorial staff will periodically post proposed new material for you to respond to, and in some cases will post responses to some of the submitted ideas and comments.

Recently, we have received several comments concerning the Leahy-Smith America Invents Act (AIA). Please note that comments regarding the implementation of the AIA should be submitted to the USPTO via email to aia_implementation@uspto.gov or via postal mail, as indicated at the America Invents Act website. Additional information regarding the AIA is available at www.uspto.gov/americainventsact. We have also received several comments suggesting policy changes, which have been routed to the appropriate offices for consideration. We really appreciate your thinking and recommendations!

FDA Guidance for Industry: Electronic Source Data in Clinical Investigations

The FDA published its new Guidance for Industry (GfI) – “Electronic Source Data in Clinical Investigations” in September 2013.
The Guidance defines the expectations of the FDA concerning electronic source data generated in the context of clinical trials. Find out more about this Guidance.
http://www.gmp-compliance.org/enews_4288_FDA%20Guidance%20for%20Industry%3A%20Electronic%20Source%20Data%20in%20Clinical%20Investigations_8534,8457,8366,8308,Z-COVM_n.html

After more than 5 years and two draft versions, the final version of the Guidance for Industry (GfI) – “Electronic Source Data in Clinical Investigations” was published in September 2013. This new FDA Guidance defines the FDA’s expectations for sponsors, CROs, investigators and other persons involved in the capture, review and retention of electronic source data generated in the context of FDA-regulated clinical trials.

In an effort to encourage the modernization and increased efficiency of processes in clinical trials, the FDA clearly supports the capture of electronic source data and emphasizes the agency’s intention to support activities aimed at ensuring the reliability, quality, integrity and traceability of this source data, from its electronic source to the electronic submission of the data in the context of an authorization procedure.

The Guidance addresses aspects such as data capture, data review and record retention. When the computerized systems used in clinical trials are described, the FDA recommends that the description focus not only on the intended use of the system, but also on data protection measures and the flow of data across system components and interfaces. In practice, the pharmaceutical industry needs to meet significant requirements regarding the organisation, planning, specification and verification of computerized systems in the field of clinical trials. The FDA also notes in the Guidance that it does not intend to apply 21 CFR Part 11 to electronic health records (EHR).

Author: Oliver Herrmann, Q-Infinity
Source: http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM328691.pdf
Webinar: https://collaboration.fda.gov/p89r92dh8wc

 
