
Why did this occur? The matter of Individual Actions Undermining Trust, The Patent Dilemma and The Value of a Clinical Trial

Reporter and Curator: Larry H. Bernstein, MD, FCAP

 

The large amount of funding tied to continued research and support of postdoctoral fellows leads one to ask how following the money can lead to discredited work in the elite scientific community.

Moreover, the pressure to publish in prestigious journals with high impact factors is a road to academic promotion.  In the last twenty years, it has become unusual to find submissions for review with fewer than 6-8 authors, accompanied by the statement that all contributed to the work.  These factors can’t be discounted outright, but it is easy for work to fall through the cracks when a key investigator has over 200 publications and holds tenure in a great research environment.  But that is where we find ourselves today.

There is another issue that comes up, also related to carrying out research and then protecting the work for commercialization.  It is more complicated in the sense that it is first necessary to determine whether there is prior art, and then there is the possibility that, after the cost of filing a patent and a six-year delay in obtaining protection, there is as great a cost in bringing the patent to final production.

I.  Individual actions undermining trust.

II. The patent dilemma.

III. The value of a clinical trial.

IV. The value contributions of RAP physicians
(radiologists, anesthesiologists, and pathologists – the last for discussion)
Those who maintain and inform the integrity of medical and surgical decisions

 

I. Top heart lab comes under fire

Kelly Servick

Science 18 July 2014: Vol. 345 no. 6194 p. 254 DOI: 10.1126/science.345.6194.254

 

In the study of cardiac regeneration, Piero Anversa is among the heavy hitters. His research into the heart’s repair mechanisms helped kick-start the field of cardiac cell therapy (see main story). After more than 4 decades of research and 350 papers, he heads a lab at Harvard Medical School’s Brigham and Women’s Hospital (BWH) in Boston that has more than $6 million in active grant funding from the National Institutes of Health (NIH). He is also an outspoken voice in a field full of disagreement.

So when an ongoing BWH investigation of the lab came to light earlier this year, Anversa’s colleagues were transfixed. “Reactions in the field run the gamut from disbelief to vindication,” says Mark Sussman, a cardiovascular researcher at San Diego State University in California who has collaborated with Anversa. By Sussman’s account, Anversa’s reputation for “pushing the envelope” and “challenging existing dogma” has generated some criticism. Others, however, say that the disputes run deeper—to doubts about a cell therapy his lab has developed and about the group’s scientific integrity. Anversa told Science he was unable to comment during the investigation.

“People are talking about this all the time—at every scientific meeting I go to,” says Charles Murry, a cardiovascular pathologist at the University of Washington, Seattle. “It’s of grave concern to people in the field, but it’s been frustrating,” because no information is available about BWH’s investigation. BWH would not comment for this article, other than to say that it addresses concerns about its researchers confidentially.

In April, however, the journal Circulation agreed to Harvard’s request to retract a 2012 paper on which Anversa is a corresponding author, citing “compromised” data. The Lancet also issued an “Expression of Concern” about a 2011 paper reporting results from a clinical trial, known as SCIPIO, on which Anversa collaborated. According to a notice from the journal, two supplemental figures are at issue.

For some, Anversa’s status has earned him the benefit of the doubt. “Obviously, this is very disconcerting,” says Timothy Kamp, a cardiologist at the University of Wisconsin, Madison, but “I would be surprised if it was an implication of a whole career of research.”

Throughout that career, Anversa has argued that the heart is a prolific, lifelong factory for new muscle cells. Most now accept the view that the adult heart can regenerate muscle, but many have sparred with Anversa over his high estimates for the rate of this turnover, which he maintained in the retracted Circulation paper.

Anversa’s group also pioneered a method of separating cells with potential regenerative abilities from other cardiac tissue based on the presence of a protein called c-kit. After publishing evidence that these cardiac c-kit+ cells spur new muscle growth in rodent hearts, the group collaborated in the SCIPIO trial to inject them into patients with heart failure. In The Lancet, the scientists reported that the therapy was safe and showed modest ability to strengthen the heart—evidence that many found intriguing and provocative. Roberto Bolli, the cardiologist whose group at the University of Louisville in Kentucky ran the SCIPIO trial, plans to test c-kit+ cells in further clinical trials as part of the NIH-funded Cardiovascular Cell Therapy Research Network.

But others have been unable to reproduce the dramatic effects Anversa saw in animals, and some have questioned whether these cells really have stem cell–like properties. In May, a group led by Jeffery Molkentin, a molecular biologist at Cincinnati Children’s Hospital Medical Center in Ohio, published a paper in Nature tracing the genetic lineage of c-kit+ cells that reside in the heart. He concluded that although they did make new muscle cells, the number is “astonishingly low” and likely not enough to contribute to the repair of damaged hearts. Still, Molkentin says that he “believe[s] in their therapeutic potential” and that he and Anversa have discussed collaborating.

Now, an anonymous blogger claims that problems in the Anversa lab go beyond controversial findings. In a letter published on the blog Retraction Watch on 30 May, a former research fellow in the Anversa lab described a lab culture focused on protecting the c-kit+ cell hypothesis: “[A]ll data that did not point to the ‘truth’ of the hypothesis were considered wrong,” the person wrote. But another former lab member offers a different perspective. “I had a great experience,” says Federica Limana, a cardiovascular disease researcher at IRCCS San Raffaele Pisana in Rome who spent 2 years of her Ph.D. work with the group in 1999 and 2000, as it was beginning to investigate c-kit+ cells. “In that period, there was no such pressure” to produce any particular result, she says.

Accusations about the lab’s integrity, combined with continued silence from BWH, are deeply troubling for scientists who have staked their research on theories that Anversa helped pioneer. Some have criticized BWH for requesting retractions in the midst of an investigation. “Scientific reputations and careers hang in the balance,” Sussman says, “so everyone should wait until all facts are clearly and fully disclosed.”

 

II.  Trolling Along: Recent Commotion About Patent Trolls

July 17, 2014

PricewaterhouseCoopers recently released a study of 2014 patent litigation. PwC’s ultimate conclusion was that case volume has increased vastly while damages continue a general decline, but what’s making headlines everywhere is that “patent trolls” now account for 67% of all new patent lawsuits (see, e.g., Washington Post and Fast Company).

Surprisingly, the word “troll” appears nowhere in PwC’s study. So, with regard to patent trolls, what does this study really mean for companies, patent owners and casual onlookers?

First of all, who are these trolls?

“Patent Troll” is a label applied to patent owners who do not make or manufacture a product, or offer a service. Patent trolls live (and die) by suing others for allegedly practicing an invention that is claimed by their patents.

The politically correct term is Non-practicing Entity (NPE). PwC solely uses the term NPE, which it defines as an entity that does not have the capability to design, manufacture, or distribute products with features protected by the patent.

So, what’s so bad about them?

The common impression of an NPE is of a business venture looking to collect and monetize assets (i.e., patents). In the most basic strategy, an NPE buys patents with broad claims covering a wide variety of technologies and markets, and then sues a large group of alleged patent infringers in the hope of collecting a licensing royalty or a settlement. NPEs typically don’t want to spend money on a trial unless they have to, and one tactic uses settlements with smaller businesses to build a “war chest” for potential suits against larger companies.

NPEs initiating a lawsuit can be viewed positively, such as a just defense of the lowly inventor who sold his patent to someone (with deeper pockets) who could fund the litigation to protect the inventor’s hard work against a mega-conglomerate who ripped off his idea.

Or NPE litigation can be seen negatively, such as an attorney’s demand letter on behalf of an anonymous shell corporation to shake down dozens of five-figure settlements from all the local small businesses that have ever used a fax machine.

NPEs can waste a company’s valuable time and resources with lawsuits, yet also bring value to their patent portfolios by energizing a patent sales and licensing market. There are unscrupulous NPEs, but it’s hardly the black and white situation that some media outlets are depicting.

What did PwC say about trolls?

Well, the PwC study looked at the success rates and awards of patent litigation decisions. One conclusion is that damages awards for NPEs averaged more than triple those for practicing entities over the last four years. We’ll come back to this statistic.

Another key observation is that NPEs have been successful 25% of the time overall, versus 35% for practicing entities. This makes sense because of the burden of proof the NPEs carry as a plaintiff at trial and the relative lack of success for NPEs at summary judgment. However, PwC’s report states that both types of entities win about two-thirds of their trials.

But what about this “67% of all patent trials are initiated by trolls” discussion?

The 67% number comes from the RPX Corporation’s litigation report (produced January 2014) that quantified the percentage of NPE cases filed in 2013 as 67%, compared to 64% in 2012, 47% in 2011, 30% in 2010 and 28% in 2009.

PwC refers to the RPX statistics to accentuate that its new study indicates that only 20% of decisions in 2013 involved NPE-filed cases, so the general conclusion would be that NPE cases tend to settle or be dismissed before a court’s decision. Admittedly, this is indicative of the prevalent “spray and pray” strategy, in which NPEs prefer to collect many settlement checks from several “targets” and avoid the courtroom.

In this study, who else is an NPE?

If someone were looking to dramatize the role of “trolls,” the name can be thrown around liberally (and hurtfully) to anyone who owns and asserts a patent without offering a product or a service. For instance, colleges and universities fall under the NPE umbrella as their research and development often ends with a series of published papers rather than a marketable product on an assembly line.

In fact, PwC distinguishes universities and non-profits from companies and individuals within their NPE analysis, with only about 5% of the NPE cases from 1995 to 2013 being attributed to universities and non-profits. Almost 50% of the NPE cases are attributed to an “individual,” who could be the listed inventor for the patent or a third-party assignee.

The word “troll” is obviously a derogatory term used to connote greed and hiding (under a bridge), but the term has adopted a newer, meme-like status as trolls are currently depicted as lacking any contribution to society and merely living off of others’ misfortunes and fears. [Three Billy Goats Gruff]. This is not always the truth with NPEs (e.g., universities).

No one wants to be called a troll—especially in front of a jury—so we’ve even recently seen courts bar defendants from referring to NPEs with such colorful terms as “corporate shell,” “bounty hunter,” “privateer,” or someone “playing the lawsuit lottery.” [Judge Koh Bans Use of Term “Patent Troll” in Apple Jury Trial]

Regardless of the portrayal of an NPE, most people in the patent world distinguish the “trolls” by the strength of the patent, merits of the alleged infringement and their behavior upon notification. Often these are expressed as “frivolity” of the case and “gamesmanship” of the attorneys. Courts are able to punish plaintiffs who bring frivolous claims against a party and state bar associations are tasked with monitoring the ethics of attorneys. The USPTO is tasked with working to strengthen the quality of patents.

What’s the take-away from this study regarding NPEs?

The study focuses on patent litigation that produced a decision; therefore the most important and relevant conclusion is that, over the last four years, average damages awards for NPEs are more than triple the damages for practicing entities. Everything else in these articles, such as the initiation of litigation by NPEs, settlement percentages, and the general behavior of patent trolls, is pure inference beyond the scope of the study.

This may sound sympathetic to trolls, but keep in mind that the study highlights that NPEs collect more than triple the damages, on average, of practicing entities, and that figure is meant to shock the reader a bit. One explanation is that NPEs are in the best position to choose the patents they want to assert and the targets they wish to sue, especially when the NPE is willing to ride a patent all the way to the end of a long, expensive trial; sometimes settling is not an option. Chart 2b indicates that the disparity in damages awarded to NPEs relative to practicing entities has been large since 2000, but the shift from roughly two-fold in 2000-2009 to three-fold in the past four years suggests that NPEs are getting better at finding patents and/or picking battles to take all the way to a court decision. More than anything, this seems to reflect the growth of the concept of patents as a business asset.

The PwC report is chock full of interesting patterns and trends in litigation results, so it’s a shame that the 67% number makes the headlines. Far more interesting are the charts comparing success rates by four-year periods (Chart 6b), success rates for NPEs and practicing entities in front of a jury versus in front of a bench (Chart 6c), and the tables that reveal statistics for specific districts of the federal courts. Even the stats on the success rates of each type of NPE are telling, because the reader sees that universities and non-profits have a higher success rate than non-practicing companies or individuals.

What do we do about the trolls?

The White House has recently called for Congress to do something about the trolls as horror stories of scams and shake-downs are shared. A bill was gaining momentum when Senator Leahy took it off the Senate agenda in early July. That bill had passed 325-91 in the House, and President Obama was willing to sign it if the Senate passed it. The bill was opposed by trial attorneys, universities, and bio-pharmaceutical businesses, who felt the law would severely inhibit everyone’s access to the courts in order to hinder just the trolls. Regardless, most observers think the sitting Congressmen merely wanted a “win” before the mid-term elections and that patent reform is unlikely to reappear until next term.

In the meantime, the Supreme Court has recently reiterated rules concerning attorney fee-shifting on frivolous patent cases, as well as clarifying the validity of software patents. Time will tell if these changes have any effects on the damages awards that PwC’s study examined or even if they cause a chilling of the number of patent lawsuit filings.

Furthermore, new ways to challenge the validity of asserted patents have been created via the America Invents Act. For example, Inter Partes Review (IPR) has yielded frightening preliminary statistics as to slowing, if not killing, patents that have been asserted in a suit. While these administrative trials are not cheap, many view these new tools at the Patent Trial and Appeal Board as anti-troll measures. It will be interesting to watch how the USPTO implements these procedures in the near future, especially while former Google counsel Michelle K. Lee oversees the office as Acting Director.

In the private sector, Silicon Valley has recently seen a handful of tech companies come together as the License on Transfer Network, a group hoping to disarm the “Patent Assertion Entities.” Joining the LOT Network comes via an agreement that creates a license for use of a patent by anyone in the LOT network once that patent is sold. The thought is that the NPEs who consider purchasing patents from companies in the LOT Network will have fewer companies to sue since the license to the other active LOT participants will have triggered upon the transfer and, thus, the NPE will not be as inclined to “troll.” For instance, if a member-company such as Google were to sell a patent to a non-member company and an NPE bought that patent, the NPE would not be able to sue any members of the LOT Network with that patent.
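To make the license-on-transfer mechanics concrete, here is a minimal sketch in Python. It is purely illustrative (the member names are hypothetical, and the real agreement’s terms are more nuanced): the license to existing members vests the moment a member’s patent leaves the network, so a later NPE owner cannot assert that patent against them.

lot_members = {"CompanyA", "CompanyB", "CompanyC"}  # hypothetical LOT members

class Patent:
    def __init__(self, owner):
        self.owner = owner
        self.licensees = set()  # parties immune from suit under this patent

def transfer(patent, new_owner):
    # The LOT trigger: a sale outside the network licenses all current members.
    if patent.owner in lot_members and new_owner not in lot_members:
        patent.licensees |= lot_members
    patent.owner = new_owner

def can_sue(patent, target):
    return target not in patent.licensees

p = Patent(owner="CompanyA")
transfer(p, "SomeNPE")          # sale to a non-member NPE
print(can_sue(p, "CompanyB"))   # False: the license vested on transfer
print(can_sue(p, "Startup_X"))  # True: non-members receive no such license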

Other notes

NPEs are only as evil as the people who run them. That being said, there are plenty of horror stories of small businesses receiving phantom demand letters that threaten a patent infringement suit without identifying the sender or the patent. This is an out-and-out scam and a plague on society that results in wasted time and resources, and inevitably higher prices on the consumer end.

It is a sin and a shame that patent rights can be misused in scams and shake-downs of businesses around us, but there is a reason that U.S. courts are so often used to defend patent rights. The PwC study, at minimum, reflects the high stakes of the patent market and perhaps the fragility. Nevertheless, merely monitoring the courts may not keep the trolls at bay.

I’d love to hear your thoughts.

*This is provided for informational purposes only, and does not constitute legal or financial advice. The information expressed is subject to change at any time and should be checked for completeness, accuracy and current applicability. For advice, consult a suitably licensed attorney or patent agent.

 

III. Large-scale analysis finds majority of clinical trials don’t provide meaningful evidence

Ineffective Treatments, Medical Ethics • Tags: Center for Drug Evaluation and Research, Clinical trial, CTTI, Duke University Hospital, FDA, Food and Drug Administration, National Institutes of Health, United States National Library of Medicine

04 May 2012

DURHAM, N.C.— The largest comprehensive analysis of ClinicalTrials.gov finds that clinical trials are falling short of producing the high-quality evidence needed to guide medical decision-making. The analysis, published today in JAMA, found that the majority of clinical trials are small, and that there are significant differences among methodological approaches, including randomization, blinding and the use of data monitoring committees.

“Our analysis raises questions about the best methods for generating evidence, as well as the capacity of the clinical trials enterprise to supply sufficient amounts of high quality evidence to ensure confidence in guideline recommendations,” said Robert Califf, M.D., first author of the paper, vice chancellor for clinical research at Duke University Medical Center, and director of the Duke Translational Medicine Institute.

The analysis was conducted by the Clinical Trials Transformation Initiative (CTTI), a public-private partnership founded by the Food and Drug Administration (FDA) and Duke. It extends the usability of the ClinicalTrials.gov data for research by placing the data through September 27, 2010 into a database structured to facilitate aggregate analysis. This publicly accessible database facilitates assessment of the clinical trials enterprise in a more comprehensive manner than ever before and enables the identification of trends by study type.

 

The National Library of Medicine (NLM), a part of the National Institutes of Health, developed and manages ClinicalTrials.gov. This site maintains a registry of past, current, and planned clinical research studies.

“Since 2007, the Food and Drug Administration Amendment Act has required registration of clinical trials, and the expanded scope and rigor of trial registration policies internationally is producing more complete data from around the world,” stated Deborah Zarin, MD, director, ClinicalTrials.gov, and assistant director for clinical research projects, NLM. “We have amassed over 120,000 registered clinical trials. This rich repository of data has a lot to say about the national and international research portfolio.”

This CTTI project was a collaborative effort by informaticians, statisticians and project managers from NLM, FDA and Duke. CTTI comprises more than 60 member organizations with the goal of identifying practices that will improve the quality and efficiency of clinical trials.

“Since the ClinicalTrials.gov registry contains studies sponsored by multiple entities, including government, industry, foundations and universities, CTTI leaders recognized that it might be a valuable source for benchmarking the state of the clinical trials enterprise,” stated Judith Kramer, MD, executive director of CTTI.

The project goal was to produce an easily accessible database incorporating advances in informatics to permit a detailed characterization of the body of clinical research and facilitate analysis of groups of studies by therapeutic areas, by type of sponsor, by number of participants and by many other parameters.

“Analysis of the entire portfolio will enable the many entities in the clinical trials enterprise to examine their practices in comparison with others,” says Califf. “For example, 96% of clinical trials have ≤1000 participants, and 62% have ≤ 100. While there are many excellent small clinical trials, these studies will not be able to inform patients, doctors and consumers about the choices they must make to prevent and treat disease.”
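Since the CTTI database is publicly accessible, summaries like the one Califf quotes are straightforward to recompute. Here is a minimal sketch in Python, assuming a hypothetical CSV export (trials.csv) with one row per registered trial and an integer enrollment column; the file name and column name are assumptions, not part of the actual database schema.

import pandas as pd

trials = pd.read_csv("trials.csv")  # hypothetical export, one row per trial
pct_le_100 = (trials["enrollment"] <= 100).mean() * 100
pct_le_1000 = (trials["enrollment"] <= 1000).mean() * 100
print(f"{pct_le_100:.0f}% of trials enroll <=100 participants")
print(f"{pct_le_1000:.0f}% of trials enroll <=1000 participants")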

The analysis showed heterogeneity in median trial size, with cardiovascular trials tending to be twice as large as those in oncology and trials in mental health falling in the middle. It also showed major differences in the use of randomization, blinding, and data monitoring committees, critical issues often used to judge the quality of evidence for medical decisions in clinical practice guidelines and systematic overviews.

“These results reinforce the importance of exploration, analysis and inspection of our clinical trials enterprise,” said Rachel Behrman Sherman, MD, associate director for the Office of Medical Policy at the FDA’s Center for Drug Evaluation and Research. “Generation of this evidence will contribute to our understanding of the number of studies in different phases of research, the therapeutic areas, and ways we can improve data collection about clinical trials, eventually improving the quality of clinical trials.”


 

IV.  Lawmakers urge CMS to extend MU hardship exemption for pathologists

 

Eighty-nine members of Congress have asked the Centers for Medicare & Medicaid Services to give pathologists a break and extend the hardship exemption they currently enjoy through all of Stage 3 of the Meaningful Use program. In the letter, dated July 10 and addressed to CMS Administrator Marilyn Tavenner, the lawmakers point out that CMS had recognized in its 2012 final rule implementing Stage 2 of the program that it was difficult for pathologists to meet the Meaningful Use requirements, and granted a one-year exception for 2015, the first year that penalties will be imposed. They are now asking that the exception be expanded to the full five-year maximum allowed under the American Recovery and Reinvestment Act.

“Pathologists have limited direct contact with patients and do not operate in EHRs,” the letter states. “Instead, pathologists use sophisticated computerized laboratory information systems (LISs) to support the work of analyzing patient specimens and generating test results. These LISs exchange laboratory and pathology data with EHRs.”

Interestingly, the lawmakers’ exemption request is only on behalf of pathologists, even though CMS had granted the one-year hardship exception to pathologists, radiologists and anesthesiologists.

Rep. Tom Price (R-Ga.), one of the members spearheading the letter, had also introduced a bill (H.R. 1309) in March 2013 that would exclude pathologists from the incentives and penalties of the Meaningful Use program. The bill, which has 31 cosponsors, is currently sitting in committee. That bill also does not include relief for radiologists or anesthesiologists.

CMS has provided some flexibility about the hardship exceptions in the past, most recently by allowing providers to apply for one due to EHR vendor delays in upgrading to Stage 2 of the program.

However, CMS also noted in the 2012 rule granting the one-year exception that it was granting the exception in large part because of the then-current lack of health information exchange and that “physicians in these three specialties should not expect that this exception will continue indefinitely, nor should they expect that we will grant the exception for the full 5-year period permitted by statute.”

To learn more:
– read the letter (.pdf)


USPTO Guidance On Patentable Subject Matter

Curator and Reporter: Larry H Bernstein, MD, FCAP


Revised 4 July, 2014

http://pharmaceuticalintelligence.com/2014/07/03/uspto-guidance-on-patentable-subject-matter

 

I came across a few recent articles on the subject of US Patent Office guidance on patentability, as well as on Supreme Court rulings on claims. I filed several patents on clinical laboratory methods early in my career on the recommendation of my brother-in-law, now deceased.  Years later, with both my brother-in-law and my patent attorney gone, I look back and ask what I have learned: over $100,000 spent, many trips to the USPTO, opportunities not taken, and a one-year provisional patent behind me.

My conclusion is

(1) that patents are for the protection of the innovator, who might realize legal protection, but the cost and time investment can well exceed the cost of starting up and building the small enterprise that would be the next step.

(2) The other thing to consider is the capability of the lawyer or firm that represents you.  A patent that is well done can be expected to take 5-7 years to go through with due diligence.  I would not expect it to be done well by a university with many other competing demands. I might be wrong in this respect, as the climate has changed and research universities have sprouted engines for change; experienced and productive faculty are encouraged or allowed to form their own such entities.

(3) The emergence of Big Data, computational biology, and very large data warehouses for data use and integration has changed the landscape. The resources required to pursue research along these lines are quite beyond an individual’s sole capacity without outside funding.  In addition, the change to a first-to-file requirement has muddied the water.

Of course, one can propose without anything published in the public domain. That makes it possible for corporate entities to file thousands of patents, whether or not there is actual validation at the time of filing.  It would be a trying experience for anyone to pursue in the USPTO without some litigation over ownership of patent rights. At this stage of technology development, I have come to realize that the organization of research, peer review, and archiving of data is still at a stage where some of the best systems available for storing and accessing data come considerably short of what is needed for the most complex tasks, even though improvements have come at an exponential pace.

I shall not comment on the contested views held by physicists, chemists, biologists, and economists over the completeness of strongly held guiding theories.  Only history will tell.  Beliefs can hold a strong sway, and have many times held us back.

I am not an expert on legal matters, but it is incomprehensible to me that issues concerning technology innovation can be adjudicated in the Supreme Court, as has occurred in recent years. I have postgraduate degrees in Medicine and Developmental Anatomy, post-medical training in pathology and laboratory medicine, and experience in analytical and research biochemistry.  Cases of this type are beyond the competencies expected of the Supreme Court, or even the Federal District Courts, yet they come before them with increasing frequency, as has occurred with respect to the development and application of the human genome.

I’m not sure that these developments can be resolved for the public good without a fuller development of an open-access system of publishing. I now present some recent publications about, or published by, the USPTO.


USPTO Guidance On Patentable Subject Matter: Impediment to Biotech Innovation

Joanna T. Brougher and David A. Fazzolare. J Commercial Biotechnology 2014; 20(3).


Abstract: In June 2013, the U.S. Supreme Court issued a unanimous decision upending more than three decades’ worth of established patent practice when it ruled that isolated gene sequences are no longer patentable subject matter under 35 U.S.C. Section 101. While many practitioners in the field believed that the USPTO would interpret the decision narrowly, the USPTO actually expanded the scope of the decision when it issued its guidelines for determining whether an invention satisfies Section 101.

The guidelines were met with intense backlash, with many arguing that they unnecessarily expanded the scope of the Supreme Court cases in a way that could unduly restrict the scope of patentable subject matter, weaken the U.S. patent system, and create a disincentive to innovation. By undermining patentable subject matter in this way, the guidelines may end up harming not only the companies that patent medical innovations, but also the patients who need medical care. This article examines the guidelines and their impact on various technologies.

Keywords: patent, patentable subject matter, Myriad, Mayo, USPTO guidelines

Full Text: PDF

References

35 U.S.C. Section 101 states: “Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.”

Prometheus Laboratories, Inc. v. Mayo Collaborative Services, 566 U.S. ___ (2012)

Association for Molecular Pathology et al., v. Myriad Genetics, Inc., 569 U.S. ___ (2013).

Parke-Davis & Co. v. H.K. Mulford Co., 189 F. 95, 103 (C.C.S.D.N.Y. 1911)

USPTO. Guidance For Determining Subject Matter Eligibility Of Claims Reciting Or Involving Laws of Nature, Natural Phenomena, & Natural Products.

http://www.uspto.gov/patents/law/exam/myriad-mayo_guidance.pdf

Funk Brothers Seed Co. v. Kalo Inoculant Co., 333 U.S. 127, 131 (1948)


Courtney C. Brinckerhoff, “The New USPTO Patent Eligibility Rejections Under Section 101.” PharmaPatentsBlog, published May 6, 2014, accessed http://www.pharmapatentsblog.com/2014/05/06/the-new-patent-eligibility-rejections-section-101/


DOI: http://dx.doi.org/10.5912/jcb664

 

Science 4 July 2014; 345 (6192): pp. 14-15  DOI: http://dx.doi.org/10.1126/science.345.6192.14

Biotech feels a chill from changing U.S. patent rules

A 2013 Supreme Court decision that barred human gene patents is scrambling patenting policies.


A year after the U.S. Supreme Court issued a landmark ruling that human genes cannot be patented, the biotech industry is struggling to adapt to a landscape in which inventions derived from nature are increasingly hard to patent. It is also pushing back against follow-on policies proposed by the U.S. Patent and Trademark Office (USPTO) to guide examiners deciding whether an invention is too close to a natural product to deserve patent protection. Those policies reach far beyond what the high court intended, biotech representatives say.

“Everything we took for granted a few years ago is now changing, and it’s generating a bit of a scramble,” says patent attorney Damian Kotsis of Harness Dickey in Troy, Michigan, one of more than 15,000 people who gathered here last week for the Biotechnology Industry Organization’s (BIO’s) International Convention.

At the meeting, attorneys and executives fretted over the fate of patent applications for inventions involving naturally occurring products—including chemical compounds, antibodies, seeds, and vaccines—and traded stories of recent, unexpected rejections by USPTO. Industry leaders warned that the uncertainty could chill efforts to commercialize scientific discoveries made at universities and companies. Some plan to appeal the rejections in federal court.

USPTO officials, meanwhile, implored attendees to send them suggestions on how to clarify and improve its new policies on patenting natural products, and even announced that they were extending the deadline for public comment by a month. “Each and every one of you in this room has a moral duty … to provide written comments to the PTO,” patent lawyer and former USPTO Deputy Director Teresa Stanek Rea told one audience.

At the heart of the shake-up are two Supreme Court decisions: the ruling last year in Association for Molecular Pathology v. Myriad Genetics Inc. that human genes cannot be patented because they occur naturally (Science, 21 June 2013, p. 1387); and the 2012 Mayo v. Prometheus decision, which invalidated a patent on a method of measuring blood metabolites to determine drug doses because it relied on a “law of nature” (Science, 12 July 2013, p. 137).

Myriad and Mayo are already having a noticeable impact on patent decisions, according to a study released here. It examined about 1000 patent applications that included claims linked to natural products or laws of nature that USPTO reviewed between April 2011 and March 2014. Overall, examiners rejected about 40%; Myriad was the basis for rejecting about 23% of the applications, and Mayo about 35%, with some overlap, the authors concluded. That rejection rate would have been in the single digits just 5 years ago, asserted Hans Sauer, BIO’s intellectual property counsel, at a press conference. (There are no historical numbers for comparison.) The study was conducted by the news service Bloomberg BNA and the law firm Robins, Kaplan, Miller & Ciresi in Minneapolis, Minnesota.
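A quick check shows how those percentages fit together: Myriad (23%) plus Mayo (35%) accounts for 58% of applications, yet only about 40% were rejected overall, so by inclusion-exclusion at least 58% − 40% = 18% of the applications, i.e., nearly half of all the rejections, must have cited both decisions. (This assumes every rejection cited at least one of the two, which the study’s framing suggests but does not state.)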

USPTO is extending the decisions far beyond diagnostics and DNA

The numbers suggest USPTO is extending the decisions far beyond diagnostics and DNA, attorneys say. Harness Dickey’s Kotsis, for example, says a client recently tried to patent a plant extract with therapeutic properties; it was different from anything in nature, Kotsis argued, because the inventor had altered the relative concentrations of key compounds to enhance its effect. Nope, decided USPTO, too close to nature.

In March, USPTO released draft guidance designed to help its examiners decide such questions, setting out 12 factors for them to weigh. For example, if an examiner deems a product “markedly different in structure” from anything in nature, that counts in its favor. But if it has a “high level of generality,” it gets dinged.

The draft has drawn extensive criticism. “I don’t think I’ve ever seen anything as complicated as this,” says Kevin Bastian, a patent attorney at Kilpatrick Townsend & Stockton in San Francisco, California. “I just can’t believe that this will be the standard.”

USPTO officials appear eager to fine-tune the draft guidance, but patent experts fear the Supreme Court decisions have made it hard to draw clear lines. “The Myriad decision is hopelessly contradictory and completely incoherent,” says Dan Burk, a law professor at the University of California, Irvine. “We know you can’t patent genetic sequences,” he adds, but “we don’t really know why.”

Get creative in using Draft Guidelines!

For now, Kotsis says, applicants will have to get creative to reduce the chance of rejection. Rather than claim protection for a plant extract itself, for instance, an inventor could instead patent the steps for using it to treat patients. Other biotech attorneys may try to narrow their patent claims. But there’s a downside to that strategy, they note: Narrower patents can be harder to protect from infringement, making them less attractive to investors. Others plan to wait out the storm, predicting USPTO will ultimately rethink its guidance and ease the way for new patents.

 

Public comment period extended

USPTO has extended the deadline for public comment to 31 July, with no schedule for issuing final language. Regardless of the outcome, however, Stanek Rea warned a crowd of riled-up attorneys that, in the world of biopatents, “the easy days are gone.”

 

United States Patent and Trademark Office

Today we published and made electronically available a new edition of the Manual of Patent Examining Procedure (MPEP): http://www.uspto.gov/web/offices/pac/mpep/index.html (see the Summary of Changes).

PDF: Title Page
PDF: Foreword
PDF: Introduction
PDF: Table of Contents
PDF: Chapter 600 – Parts, Form, and Content of Application
PDF: Chapter 700 – Examination of Applications
PDF: Chapter 800 – Restriction in Applications Filed Under 35 U.S.C. 111; Double Patenting
PDF: Chapter 900 – Prior Art, Classification, and Search
PDF: Chapter 1000 – Matters Decided by Various U.S. Patent and Trademark Office Officials
PDF: Chapter 1100 – Statutory Invention Registration (SIR); Pre-Grant Publication (PGPub) and Preissuance Submissions
PDF: Chapter 1200 – Appeal
PDF: Chapter 1300 – Allowance and Issue
PDF: Appendix L – Patent Laws
PDF: Appendix R – Patent Rules
PDF: Appendix P – Paris Convention
PDF: Subject Matter Index
PDF: Zipped version of the MPEP current revision in the PDF format.

Manual of Patent Examining Procedure (MPEP), Ninth Edition, March 2014

The USPTO continues to offer an online discussion tool for commenting on selected chapters of the Manual. To participate in the discussion and to contribute your ideas, go to: http://uspto-mpep.ideascale.com.

Note: For current fees, refer to the Current USPTO Fee Schedule.
Consolidated Laws – The patent laws in effect as of May 15, 2014.
Consolidated Rules – The patent rules in effect as of May 15, 2014.
MPEP Archives (1948 – 2012)
Current MPEP: Searchable MPEP

The documents updated in the Ninth Edition of the MPEP, dated March 2014, include changes that became effective in November 2013 or earlier. All of the documents have been updated for the Ninth Edition except Chapters 800, 900, 1000, 1300, 1700, 1800, 1900, 2000, 2300, 2400, 2500, and Appendix P. More information about the changes and updates is available from the “Blue Page – Introduction” of the Searchable MPEP or from the “Summary of Changes” link to the HTML and PDF versions provided below.

Discuss the Manual of Patent Examining Procedure (MPEP)

Welcome to the MPEP discussion tool!

We have received many thoughtful ideas on Chapters 100-600 and 1800 of the MPEP, as well as on how to improve the discussion site. Each and every idea submitted by you, the participants in this conversation, has been carefully reviewed by the Office; many of these ideas have been implemented in the August 2012 revision of the MPEP, and many will be implemented in future revisions. The August 2012 revision is the first version provided to the public in a web-based searchable format. The new search tool is available at http://mpep.uspto.gov. We would like to thank everyone for participating in the discussion of the MPEP.

We have some great news! Chapters 1300, 1500, 1600 and 2400 of the MPEP are now available for discussion. Please submit any ideas and comments you may have on these chapters. Also, don’t forget to vote on ideas and comments submitted by other users. As before, our editorial staff will periodically post proposed new material for you to respond to, and in some cases will post responses to some of the submitted ideas and comments.

Recently, we have received several comments concerning the Leahy-Smith America Invents Act (AIA). Please note that comments regarding the implementation of the AIA should be submitted to the USPTO via email to aia_implementation@uspto.gov or via postal mail, as indicated at the America Invents Act Web site. Additional information regarding the AIA is available at www.uspto.gov/americainventsact. We have also received several comments suggesting policy changes, which have been routed to the appropriate offices for consideration. We really appreciate your thinking and recommendations!

FDA Guidance for Industry: Electronic Source Data in Clinical Investigations


The FDA published its new Guidance for Industry (GfI) – “Electronic Source Data in Clinical Investigations” – in September 2013. The Guidance defines the expectations of the FDA concerning electronic source data generated in the context of clinical trials. Find out more about this Guidance:
http://www.gmp-compliance.org/enews_4288_FDA%20Guidance%20for%20Industry%3A%20Electronic%20Source%20Data%20in%20Clinical%20Investigations_8534,8457,8366,8308,Z-COVM_n.html

After more than five years and two draft versions, the final version of the Guidance for Industry (GfI) – “Electronic Source Data in Clinical Investigations” – was published in September 2013. This new FDA Guidance defines the FDA’s expectations for sponsors, CROs, investigators and other persons involved in the capture, review and retention of electronic source data generated in the context of FDA-regulated clinical trials. In an effort to encourage the modernization and increased efficiency of processes in clinical trials, the FDA clearly supports the capture of electronic source data and emphasizes the agency’s intention to support activities aimed at ensuring the reliability, quality, integrity and traceability of this source data, from its electronic source to the electronic submission of the data in the context of an authorization procedure. The Guidance addresses aspects such as data capture, data review and record retention. When the computerized systems used in clinical trials are described, the FDA recommends that the description not only focus on the intended use of the system, but also on data protection measures and the flow of data across system components and interfaces. In practice, the pharmaceutical industry needs to meet significant requirements regarding the organisation, planning, specification and verification of computerized systems in the field of clinical trials. The FDA also mentions in the Guidance that it does not intend to apply 21 CFR Part 11 to electronic health records (EHR).

Author: Oliver Herrmann, Q-Infinity
Source: http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM328691.pdf
Webinar: https://collaboration.fda.gov/p89r92dh8wc
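To illustrate the element-level traceability the Guidance emphasizes (an originator, a time of capture, and a change history that preserves prior values), here is a minimal, hypothetical sketch in Python. The field names are illustrative assumptions, not terms taken from the FDA text.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataElement:
    name: str        # e.g., "systolic_bp" (hypothetical element name)
    value: str
    originator: str  # person, device, or system that first captured it
    captured_at: datetime
    audit_trail: list = field(default_factory=list)  # (who, when, old, new)

    def amend(self, who, new_value):
        # Corrections are appended to the trail, never overwritten silently.
        self.audit_trail.append((who, datetime.now(timezone.utc), self.value, new_value))
        self.value = new_value

bp = DataElement("systolic_bp", "128", "Device-07", datetime.now(timezone.utc))
bp.amend("Dr. Smith", "126")  # a traceable correction; the original value is preserved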

 


Larry H Bernstein, MD, FCAP, Author and Curator

Proteomics – The Pathway to Understanding and Decision-making in Medicine
http://pharmaceuticalintelligence.com/2014/06/22/

This dialogue is a series of discussions introducing several perspectives on proteomics discovery, an emerging scientific enterprise in the -OMICS- family of disciplines that aims to clarify many of the challenges in understanding disease, aiding diagnosis, and guiding treatment decisions. Beyond that focus, it will contribute to personalized medical treatment by facilitating the identification of treatment targets for the pharmaceutical industry. Despite enormous advances in genomics research over the last two decades, there is still a problem in reaching anticipated goals for introducing new targeted treatments: repeated failures in phase III clinical trials, and even when success has been achieved, it is often temporary.  The other problem has been the toxicity of agents widely used in chemotherapy.  Even though the genomic approach brings relief to the issues of toxicity found in organic chemistry derivative blocking reactions, specificity for the target cell without an effect on normal cells has been elusive.

This is not confined to cancer chemotherapy; it can also be seen in pain medication, and it has been a growing problem in antimicrobial therapy.  The stumbling block has been the inability to manage a multiplicity of reactions that also have to be modulated in a changing environment, based on the 3-dimensional structure of proteins, pH changes, ionic balance, micro- and macrovascular circulation, and protein-protein and protein-membrane interactions. There is reason to believe that the present problems can be overcome through much better modification of target cellular metabolism as we peel away the confounding and blinding factors with multivariable control of these imbalances, like removing the skin of an onion.

This is the first of a series of articles; for convenience, we shall here emphasize only the progress in applying proteomics to cardiovascular disease.

growth in funding proteomics 1990-2010

Part I.

Panomics: Decoding Biological Networks  (Clinical OMICs 2014; 5)

Technological advances such as high-throughput sequencing are transforming medicine from symptom-based diagnosis and treatment to personalized medicine as scientists employ novel rapid genomic methodologies to gain a broader comprehension of disease and disease progression. As next-generation sequencing becomes more rapid, researchers are turning toward large-scale pan-omics, the collective use of all omics such as genomics, epigenomics, transcriptomics, proteomics, metabolomics, lipidomics and lipoprotein proteomics, to better understand, identify, and treat complex disease.

Genomics has been a cornerstone in understanding disease, and the sequencing of the human genome has led to the identification of numerous disease biomarkers through genome-wide association studies (GWAS). The goal of these studies was that these biomarkers would serve to predict individual disease risk, enable early detection of disease, help make treatment decisions, and identify new therapeutic targets. In reality, however, only a few have become established in clinical practice. For example, in human GWAS for heart failure, at least 35 biomarkers have been identified, but only the natriuretic peptides have moved into clinical practice, where they are used primarily as a diagnostic tool.

Proteomics Advances Will Rival the Genetics Advances of the Last Ten Years

Seventy percent of the decisions made by physicians today are influenced by results of diagnostic tests, according to N. Leigh Anderson, founder of the Plasma Proteome Institute and CEO of SISCAPA Assay Technologies. Imagine the changes that will come about when future diagnostics tests are more accurate, more useful, more economical, and more accessible to healthcare practitioners. For Dr. Anderson, that’s the promise of proteomics, the study of the structure and function of proteins, the principal constituents of the protoplasm of all cells.

In explaining why proteomics is likely to have such a major impact, Dr. Anderson starts with a major difference between the genetic testing common today, and the proteomic testing that is fast coming on the scene. “Most genetic tests are aimed at measuring something that’s constant in a person over his or her entire lifetime. These tests provide information on the probability of something happening, and they can help us understand the basis of various diseases and their potential risks. What’s missing is, a genetic test is not going to tell you what’s happening to you right now.”

Mass Spec-Based Multiplexed Protein Biomarkers

Clinical proteomics applications rely on the translation of targeted protein quantitation technologies and methods to develop robust assays that can guide diagnostic, prognostic, and therapeutic decision-making. The development of a clinical proteomics-based test begins with the discovery of disease-relevant biomarkers, followed by validation of those biomarkers.

“In common practice, the discovery stage is performed on a MS-based platform for global unbiased sampling of the proteome, while biomarker qualification and clinical implementation generally involve the development of an antibody-based protocol, such as the commonly used enzyme linked ELISA assays,” state López et al. in Proteome Science (2012; 10: 35–45). “Although this process is potentially capable of delivering clinically important biomarkers, it is not the most efficient process as the latter is low-throughput, very costly, and time-consuming.”

Part II.  Proteomics for Clinical and Research Use: Combining Protein Chips, 2D Gels and Mass Spectrometry

The Next Step: Exploring the Proteome – Translation and Beyond

N. Leigh Anderson, Ph.D., Chief Scientific Officer, Large Scale Proteomics Corporation

Three streams of technology will play major roles in quantitative (expression) proteomics over the coming decade. Two-dimensional electrophoresis and mass spectrometry represent well-established methods for, respectively, resolving and characterizing proteins, and both have now been automated to enable the high-throughput generation of data from large numbers of samples.

These methods can be powerfully applied to discover proteins of interest as diagnostics, small molecule therapeutic targets, and protein therapeutics. However, neither offers a simple, rapid, routine way to measure many proteins in common samples like blood or tissue homogenates.

Protein chips do offer this possibility, and thus complete the triumvirate of technologies that will deliver the benefits of proteomics to both research and clinical users. Integration of efforts across all three approaches is discussed, highlighting the application of the Human Protein Index® database as a source of protein leads.

N. Leigh Anderson

N. Leigh Anderson, Ph.D., is Chief Scientific Officer of the Proteomics subsidiary of Large Scale Biology Corporation (LSBC). Dr. Anderson obtained his B.A. in Physics with honors from Yale and a Ph.D. in Molecular Biology from Cambridge University (England), where he worked with M. F. Perutz as a Churchill Fellow at the MRC Laboratory of Molecular Biology. Subsequently he co-founded the Molecular Anatomy Program at the Argonne National Laboratory (Chicago), where his work in the development of 2D electrophoresis and molecular database technology earned him, among other distinctions, the American Association for Clinical Chemistry’s Young Investigator Award for 1982, the 1983 Pittsburgh Analytical Chemistry Award, the 2008 AACC Outstanding Research Award, and the 2013 National Science Medal.

In 1985 Dr. Anderson co-founded LSBC in order to pursue commercial development and large-scale applications of 2D electrophoretic protein mapping technology. This effort has resulted in a large-scale proteomics analytical facility supporting research work for LSBC and its pharmaceutical industry partners. Dr. Anderson’s current primary interests are in the automation of proteomics technologies and the expansion of LSBC’s proteomics databases describing drug effects and disease processes in vivo and in vitro. Large Scale Biology went public in August 2000.

Part III. Plasma Proteomics: Lessons in Biomarkers and Diagnostics

Exposome Workshop
N Leigh Anderson
Washington 8 Dec 2011

QUESTIONS AND LESSONS:

– CLINICAL DIAGNOSTICS AS A MODEL FOR EXPOSOME INDICATORS
– TECHNOLOGY OPTIONS FOR MEASURING PROTEIN RESPONSES TO EXPOSURES
– SCALE OF THE PROBLEM: EXPOSURE SIGNALS VS POPULATION NOISE

The Clinical Plasma Proteome
• Plasma and serum are the dominant non-invasive clinical sample types – the standard materials for in vitro diagnostics (IVD)
• Proteins measured in clinically available tests in the US:
– 109 proteins via FDA-cleared or approved tests (clinical test costs range from $9 (albumin) to $122 (Her2); 90% of those ever approved are still in use)
– 96 additional proteins via laboratory-developed tests (not FDA cleared or approved)
– Total: 205 proteins (≅ products of 211 genes, excluding Ig’s)
• Clinically applied proteins thus account for:
– About 1% of the baseline human proteome (1 gene : 1 protein)
– About 10% of the 2,000+ proteins observed in deep discovery plasma proteome datasets
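As a rough arithmetic check on those two fractions, assuming roughly 20,000 protein-coding genes in the baseline human proteome: 205/20,000 ≈ 1%, and 205/2,000 ≈ 10%, consistent with the two bullets above.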

“New” Protein Diagnostics Are FDA-Cleared at a Rate of ~1.5/yr: Insufficient to Meet Dx or Rx Development Needs

FDA clearance of protein diagnostics

A Major Technology Gulf Exists Between Discovery Proteomics and Routine Diagnostic Platforms

Two Streams of Proteomics

A. Problem
Basic biology: maximum proteome coverage (including PTMs, splice variants) to provide unbiased discovery of mechanistic information
• Critical: depth and breadth
• Not critical: cost, throughput, quantitative precision

B. Technology
Discovery proteomics: a specialized proteomics field, large groups, complex workflows and informatics

Part IV.  Addressing the Clinical Proteome with Mass Spectrometric Assays

N. Leigh Anderson, PhD, SISCAPA Assay Technologies, Inc.

protein changes in biological mechanisms

No Increase in FDA Cleared Protein Tests in 20 yr

“New” Protein Tests in Plasma Are FDA-Cleared at a Rate of ~1.5/yr: Insufficient to Meet Dx or Rx Development Needs

See figure above

An Explanation: the Biomarker Pipeline is Blocked at the Verification Step

Immunoassay Weaknesses Impact Biomarker Verification

1) Specificity: what actually forms the immunoassay sandwich – or prevents its formation – is not directly visualized

2) Cost: an assay developed to FDA-approvable quality costs $2-5M per protein

Major plasma proteins

Immunoassay vs Hybrid MS-based assays

MASS SPECTROMETRY: MRMs provide what is missing in immunoassays:

– SPECIFICITY
– INTERNAL STANDARDIZATION
– MULTIPLEXING
– RAPID CONFIGURATION, PROVIDED A PROTEIN CAN ACT LIKE A SMALL MOLECULE

MRM of Proteotypic Tryptic Peptides Provides Highly Specific Assays for Proteins > 1 µg/mL in Plasma

Peptide-Level MS Provides High Structural Specificity
Multiple Reaction Monitoring (MRM) Quantitation
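The arithmetic behind MRM quantitation with internal standardization is simple: a stable isotope-labeled standard (SIS) peptide is spiked in at a known concentration, and the endogenous peptide is quantified from the ratio of the light and heavy transition peak areas. A minimal sketch in Python, with illustrative numbers rather than data from any actual assay:

def quantify(endogenous_area, sis_area, sis_spiked_fmol_per_ul):
    # Light/heavy peak-area ratio scaled by the known spiked SIS amount.
    return (endogenous_area / sis_area) * sis_spiked_fmol_per_ul

# e.g., endogenous transition area 4.2e5, SIS area 2.1e5, SIS spiked at 50 fmol/uL
print(quantify(4.2e5, 2.1e5, 50.0))  # -> 100.0 fmol/uL of the target peptide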

ADDRESSING MRM LIMITATIONS VIA SPECIFIC ENRICHMENT OF ANALYTE PEPTIDES: SISCAPA

– SENSITIVITY
– THROUGHPUT (LC-MS/MS CYCLE TIME)

SISCAPA combines best features of immuno and MS

SISCAPA Process Schematic Diagram
Stable Isotope Standards and Capture by Anti-Peptide Antibodies

Figure: The automated SISCAPA process for targeted protein quantitation uses high-affinity capture antibodies immobilized on magnetic beads

Figure: Antibodies provide sequence-specific peptide binding

Figure: SISCAPA target enrichment

Figure: Multiple reaction monitoring (MRM) quantitation
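The internal-standardization step that MRM adds over immunoassays reduces to a ratio calculation: a known amount of stable isotope-labeled ("heavy") peptide is spiked into the digest, and the endogenous ("light") amount is read off the light/heavy peak-area ratio, since the two forms behave identically in the instrument. A minimal sketch; the function and numbers are illustrative, not from the slides:

```python
def mrm_quantify(light_area: float, heavy_area: float,
                 heavy_spike_fmol: float, sample_volume_ul: float) -> float:
    """Endogenous peptide concentration (fmol/µL) from the light/heavy
    MRM peak-area ratio and a known heavy-standard spike."""
    ratio = light_area / heavy_area   # same peptide chemistry, so no response factor needed
    return ratio * heavy_spike_fmol / sample_volume_ul

# Illustrative: 50 fmol heavy standard spiked into 10 µL of digested plasma.
conc = mrm_quantify(light_area=1.2e6, heavy_area=8.0e5,
                    heavy_spike_fmol=50.0, sample_volume_ul=10.0)
print(f"endogenous peptide: {conc:.1f} fmol/µL")   # 7.5 fmol/µL
```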

Figure: Protein quantitation via signature peptides

Figure: First SISCAPA assay – thyroglobulin

Figure: Personalized reference range within the population range
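The point of a personalized reference range is that an individual's serial results usually scatter within a much narrower band than the population interval, so a result that is population-normal can still be abnormal for that person. A sketch with invented values (not real thyroglobulin data):

```python
import statistics

def reference_interval(values, k=2.0):
    """Mean ± k·SD reference interval (assumes roughly Gaussian values)."""
    m, s = statistics.mean(values), statistics.stdev(values)
    return m - k * s, m + k * s

# Invented values for illustration (ng/mL).
population = [5, 9, 14, 20, 26, 33, 41, 18, 11, 7, 24, 37]
individual = [12.1, 11.8, 12.4, 12.0, 11.9, 12.3]   # one person's serial results
new_result = 15.0

pop_lo, pop_hi = reference_interval(population)
ind_lo, ind_hi = reference_interval(individual)
print(f"population: {pop_lo:.1f} to {pop_hi:.1f}; personal: {ind_lo:.1f} to {ind_hi:.1f}")
print("flag for this patient:", not ind_lo <= new_result <= ind_hi)  # True, though population-normal
```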

Figure: Glycemic control in diabetes mellitus (DM)

Part IV. National Heart, Lung, and Blood Institute Clinical Proteomics Working Group Report
Christopher B. Granger, MD; Jennifer E. Van Eyk, PhD; Stephen C. Mockrin, PhD;
N. Leigh Anderson, PhD; on behalf of the Working Group Members*
Circulation. 2004;109:1697-1703 doi: 10.1161/01.CIR.0000121563.47232.2A
http://circ.ahajournals.org/content/109/14/1697

Abstract—The National Heart, Lung, and Blood Institute (NHLBI) Clinical Proteomics Working Group
was charged with identifying opportunities and challenges in clinical proteomics and using these as a
basis for recommendations aimed at directly improving patient care. The group included representatives
of clinical and translational research, proteomic technologies, laboratory medicine, bioinformatics, and
2 of the NHLBI Proteomics Centers, which form part of a program focused on innovative technology development.

This report represents the results from a one-and-a-half-day meeting on May 8 and 9, 2003. For the purposes
of this report, clinical proteomics is defined as the systematic, comprehensive, large-scale identification of
protein patterns (“fingerprints”) of disease and the application of this knowledge to improve patient care
and public health through better assessment of disease susceptibility, prevention of disease, selection of
therapy for the individual, and monitoring of treatment response. (Circulation. 2004;109:1697-1703.)
Key Words: proteins, diagnosis, prognosis, genetics, plasma

Part V.  Overview: The Maturing of Proteomics in Cardiovascular Research

Jennifer E. Van Eyk
Circ Res. 2011;108:490-498  doi: 10.1161/CIRCRESAHA.110.226894
http://circres.ahajournals.org/content/108/4/490

Abstract: Proteomic technologies are used to study the complexity of proteins, their roles, and their biological functions. The field is based on the premise that the diversity of proteins, comprising their isoforms and posttranslational modifications (PTMs), underlies biology.

In an annotated human cardiac protein database, 62% of proteins have at least one PTM (phosphorylation currently dominating), whereas 25% have more than one type of modification.

The field of proteomics strives to observe and quantify this protein diversity. It represents a broad group of technologies
and methods arising from analytic protein biochemistry, analytic separation, mass spectrometry, and bioinformatics.
Since the 1990s, the application of proteomic analysis has been increasingly used in cardiovascular research.

Figure: Prevalence of cardiovascular diseases in US adults, by age and sex, 2007–2010

Technology development and adaptation have been at the heart of this progress. Each technology matures, becoming routine and ultimately obsolete as it is replaced by newer methods. Because of extensive methodological improvements, many proteomic studies today observe 1000 to 5000 proteins.

Only 5 years ago, this was not feasible. Even so, there are still roadblocks. The current focus is on obtaining better characterization of protein isoforms and specific PTMs. Consequently, new techniques for identification and quantification of modified amino acid residues are required, as is the assessment of single-nucleotide polymorphisms, in addition to determination of their structural and functional consequences.

In this series, 4 articles provide concrete examples of how proteomics can be incorporated into cardiovascular
research and address specific biological questions. They also illustrate how novel discoveries can be made and
how proteomic technology has continued to evolve. (Circ Res. 2011;108:490-498.)
Key Words: proteomics, technology, protein isoform, posttranslational modification, polymorphism

Part VI.   The -omics era: Proteomics and lipidomics in vascular research

Athanasios Didangelos, Christin Stegemann, Manuel Mayr∗

King’s British Heart Foundation Centre, King’s College London, UK

Atherosclerosis 2012; 221: 12– 17     http://dx.doi.org/10.1016/j.atherosclerosis.2011.09.043

Abstract

A main limitation of the current approaches to atherosclerosis research is the focus on the investigation of individual
factors, which are presumed to be involved in the pathophysiology and whose biological functions are, at least in part, understood.

These molecules are investigated extensively while others are not studied at all. In comparison to our detailed
knowledge about the role of inflammation in atherosclerosis, little is known about extracellular matrix remodelling
and the retention of individual lipid species rather than lipid classes in early and advanced atherosclerotic lesions.

The recent development of mass spectrometry-based methods and advanced analytical tools is transforming our ability to profile extracellular proteins and lipid species in animal models and clinical specimens, with the goal of illuminating pathological processes and discovering new biomarkers.

Fig. 1. ECM in atherosclerosis. The bulk of the vascular ECM is synthesised by smooth muscle cells and composed primarily of collagens, proteoglycans and glycoproteins. During the early stages of atherosclerosis, LDL binds to the proteoglycans of the vessel wall, becomes modified, i.e. by oxidation (ox-LDL), and sustains a proinflammatory cascade that is proatherogenic.


Fig. 2. Lipidomics of atherosclerotic plaques. Lipids were separated by ultra-performance reverse-phase liquid chromatography on a Waters® ACQUITY UPLC® (HSS T3 column, 100 mm × 2.1 mm i.d., 1.8 µm particle size, 55 °C, flow rate 400 µL/min; Waters, Milford, MA, USA) and analyzed on a quadrupole time-of-flight mass spectrometer (Waters® SYNAPT™ HDMS™ system) in both positive (A) and negative (C) ion mode. In positive MS mode, lysophosphatidylcholines (lPCs) and lysophosphatidylethanolamines (lPEs) eluted first, followed by phosphatidylcholines (PCs), sphingomyelins (SMs), phosphatidylethanolamines (PEs) and cholesteryl esters (CEs); diacylglycerols (DAGs) and triacylglycerols (TAGs) had the longest retention times. In negative MS mode, fatty acids (FAs) were followed by phosphatidylglycerols (PGs), phosphatidylinositols (PIs), phosphatidylserines (PSs) and PEs. The chromatographic peaks corresponding to the different classes were detected as retention time–mass-to-charge ratio (m/z) pairs, and their areas were recorded. Principal component analysis on 629 variables from triplicate analyses (C1, 2, 3 = control 1, 2, 3; P1, 2, 3 = endarterectomy patient 1, 2, 3) demonstrated a clear separation of atherosclerotic plaques and control radial arteries in positive (B) and negative (D) ion mode. The clustering of the technical replicates and the central projection of the pooled sample within the scores plot confirm the reproducibility of the analyses; the goodness-of-fit test returned a chi-squared of 0.4 and an R-squared value of 0.6.
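The PCA step described in the caption can be sketched in a few lines. Here scikit-learn stands in for whatever software the authors actually used, and the data matrix is synthetic: 6 samples (3 controls, 3 plaques) by 629 retention time–m/z peak-area features, with an artificial group shift so the separation is visible:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for the real peak-area table: 6 samples x 629 features.
controls = rng.normal(0.0, 1.0, size=(3, 629))
plaques = rng.normal(0.8, 1.0, size=(3, 629))   # artificial disease-related shift
X = np.vstack([controls, plaques])

# Autoscale each feature, then project onto the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
for label, (pc1, pc2) in zip(["C1", "C2", "C3", "P1", "P2", "P3"], scores):
    print(f"{label}: PC1={pc1:+.2f}  PC2={pc2:+.2f}")   # groups separate along PC1
```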

Challenges in mass spectrometry

Mass spectrometry is an evolving technology, and ongoing technological advances facilitate the detection and quantification of scarce proteins. Nonetheless, the enrichment of specific subproteomes, using differential solubility or isolation of cellular organelles, will remain important to increase coverage and, at least partially, to overcome the inhomogeneity of diseased tissue, one of the major factors affecting sample-to-sample variation.

Proteomics is also the method of choice for the identification of post-translational modifications, which play an essential role in protein function, i.e. enzymatic activation, binding ability and the formation of ECM structures. Again, efficient enrichment is essential to increase the likelihood of identifying modified peptides in complex mixtures. Lipidomics faces similar challenges. While the extraction of lipids is more selective, new enrichment methods are needed for scarce lipids as well as labile lipid metabolites that may have important bioactivity. Another pressing issue in lipidomics is data analysis, in particular the lack of automated search engines that can analyze mass spectra obtained from instruments of different vendors. Efforts to overcome this issue are currently underway.
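One common workaround for the vendor-format problem, on the proteomics side at least, is to convert raw files to the open mzML standard (for example with ProteoWizard's msconvert) and run downstream analysis from there. A minimal sketch using the pyteomics library; the file name is hypothetical:

```python
from pyteomics import mzml   # pip install pyteomics

# Iterate over spectra in a vendor-neutral mzML file (path is hypothetical).
with mzml.read("plaque_lipidome.mzML") as reader:
    for spectrum in reader:
        if spectrum.get("ms level") == 1:          # survey (MS1) scans only
            mz = spectrum["m/z array"]
            intensity = spectrum["intensity array"]
            # Report the base peak of each MS1 scan as a simple example.
            print(spectrum["id"], mz[intensity.argmax()], intensity.max())
```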

Conclusions

Proteomics and lipidomics offer an unbiased platform for the investigation of the ECM and lipids within atherosclerosis. In combination, these innovative technologies will reveal key differences in the proteolytic processes responsible for plaque rupture and advance our understanding of ECM–lipoprotein interactions in atherosclerosis.


Figure: the active sites of eNOS (PDB 1P6L) and nNOS (PDB 1P6H)
Table: metabolic targets
Figure: HK-II phosphorylation
