Posts Tagged ‘Eric Topol’

2013 – Personal Perspectives on Revolutionizing Medicine and Top Stories in Cardiology

Reporter: Aviva Lev-Ari, PhD, RN

Topol Reviews 2013: A Year of Revolutionizing Medicine

 Medscape > Eric Topol on Medscape

Director, Scripps Translational Science Institute; Chief Academic Officer, Scripps Health; Professor of Genomics, The Scripps Research Institute, La Jolla, California; Editor-in-Chief, Medscape

Disclosure: Eric J. Topol, MD, has disclosed the following relevant financial relationships:
Serve(d) as a director, officer, partner, employee, advisor, consultant, or trustee for: AltheaDX; Biological Dynamics; Cypher Genomics (Co-founder); Dexcom; Genapsys; Gilead Sciences, Inc.; Portola Pharmaceuticals; Quest Diagnostics; Sotera Wireless; Volcano. Received research grant from: National Institutes of Health; Qualcomm Foundation

December 11, 2013

Practice Changers: Lab Innovations and Genetic Testing

It was almost a year ago that I signed on as Editor-in-Chief of Medscape, and I’m extremely grateful for the opportunity and for the extensive input so many of you have provided. In my last monthly newsletter for the year, I would like to expound on why I think this is the most exciting time ever in the history of medicine and how it will be imminently practice-changing.

Looking at the Laboratory

Let me first turn to laboratories, a big part of how we practice. We send our patients to the clinic or hospital lab, or a central facility, to get their blood drawn. Typically, multiple tubes of blood are obtained; the costs are not transparent; and perhaps even worse, the results are not easily or routinely accessible for most patients. Last month, I highlighted a new entity on the scene — Theranos — and interviewed Elizabeth Holmes, the young CEO.

Theranos will be in all Walgreens stores before long, leveraging microfluidic technology to do hundreds of assays with a droplet of blood, with a fully transparent cost list, and ultimately with results directly going to both the patient and doctor. After 60 years of unchanged laboratory medicine practice, this new, innovative model will help drive disruption — just the kind of shake-up that we have needed.

Second, while on the topic of labs, there has recently been a big flap between 23andMe, a direct-to-consumer genomics company, and the US Food and Drug Administration (FDA).[1] This was probably attributable to a prolonged lapse of communication on the part of the company, concurrent with aggressive marketing of the product. 23andMe has temporarily stopped providing health-related genetic testing, such as disease susceptibility, carrier state, and pharmacogenomics. Their intention is to work things out with the FDA and get their full $99 panel back up in the months ahead. Why is this an important issue? A Nature editorial[2] posited, “so even if regulators or doctors want to, they will not be able to stand between ordinary people and their DNA for very long.”

Genetic Tests and the “Angelina Effect”

In May of this year, Angelina Jolie published her “My Medical Choice” op-ed,[3] signaling her decision not only to have her BRCA1 and BRCA2 genes sequenced but also to undergo bilateral mastectomy. The impact of the so-called “Angelina effect” has been felt worldwide, with a large spike in BRCA testing driven by consumers, and challenges to prevailing cultural norms in places such as Israel, where there is a very high rate of pathogenic BRCA mutations but close to the lowest rate of preventive surgery.[4]

The issue at hand is the availability of genetic tests to patients, which didn’t exist before. Pregnant women can now, in their first trimester, have a single tube of blood drawn to screen for multiple chromosomal aberrations. Amniocentesis is quickly becoming a bygone procedure.[5] The power of genetic testing in practice is just starting to be felt and will be increasingly transformative in the years ahead.

Restructuring and Digitizing Medicine

The Out-of-Hospital Experience

Third, the structural icons of medicine are undergoing reassessment. What do we do with hospitals and clinics in a digital medicine world? George Halvorson, the outgoing CEO of Kaiser Permanente, has weighed in on this by saying that we should start delivering healthcare “farther and farther” from the hospital setting and “even out of doctors’ offices.”[6]

Cisco did a large consumer survey and found that over 70% of patients would prefer a virtual rather than a physical office visit.[7] In response to a Fast Company article,[8] I tweeted that “the idea of going down to your doctor’s office is going to feel as foreign as going to the video store”; the tweet attracted considerable attention.

Just this week, a large Intel poll of 12,000 consumers found that most believe that hospitals as we know them today will be “obsolete in the near future.”[9] The fact that we are even now questioning what to do with our hospitals and clinics is telling in itself and reflects the profound forthcoming changes in medicine.

Tracking the Human Body 

What Made Medical News in 2013?

Fourth and finally, the explosion of sensors is especially worth noting. This year, the FDA approval of smartphone ECGs and digitized pills heralded the beginning of many more novel digital ways that we will be tracking patients in the future. A watch that passively and continuously captures blood pressure from every heartbeat is just around the corner. We don’t even know what “normal” blood pressure is when it can be assessed 24/7, throughout the night, and during any time of stress, and this is representative of what the era of wireless sensor tracking will bring.

I hope that I have convinced you, with just a few examples, that this is an extraordinary time in medicine. We are all lucky to be a part of it, to see it go through its major reconfiguration and refinement. I will continue to post the links to anything that I think is particularly interesting on a daily basis via Twitter, and you are welcome to follow me @EricTopol.

Wishing you and your family all the best in the New Year. Despite some counterforces, let’s hope that 2014 takes medicine to new heights, with ever more palpable changes and improvements in the way that we render healthcare for our patients.

Eric J. Topol, MD

Editor-in-Chief, Medscape


  1. The FDA and thee. The Wall Street Journal. November 25, 2013. http://online.wsj.com/news/articles/SB10001424052702304465604579220003539640102 Accessed December 10, 2013.
  2. The FDA and me. Nature. December 3, 2013. http://www.nature.com/news/the-fda-and-me-1.14289 Accessed December 10, 2013.
  3. Jolie A. My medical choice. The New York Times. May 14, 2013. http://www.nytimes.com/2013/05/14/opinion/my-medical-choice.html Accessed December 10, 2013.
  4. Rabin RC. In Israel, a push to screen for cancer gene leaves many conflicted. The New York Times. November 26, 2013. http://www.nytimes.com/2013/11/27/health/in-israel-a-push-to-screen-for-cancer-gene-leaves-many-conflicted.html Accessed December 10, 2013.
  5. Topol EJ. Topol predicts genomic screening will replace amniocentesis. Medscape. November 11, 2013. http://www.medscape.com/viewarticle/814052 Accessed December 10, 2013.
  6. Friedman B. The future of healthcare: virtual physician visits & bedless hospitals. Lab Soft News. April 1, 2013. http://labsoftnews.typepad.com/lab_soft_news/2013/04/the-future-of-healthcare-less-emphasis-on-hospital-visits.html Accessed December 10, 2013.
  7. Cisco. Cisco study reveals 74 percent of consumers open to virtual doctor visit. March 4, 2013. http://newsroom.cisco.com/release/1148539/Cisco-Study-Reveals-74-Percent-of-Consumers-Open-to-Virtual-Doctor-Visit Accessed December 11, 2013.
  8. Fast Company. Could ePatient networks become the superdoctors of the future? http://www.fastcoexist.com/1680617/could-epatient-networks-become-the-superdoctors-of-the-future
  9. Fisher N. Global study finds majority believe traditional hospitals will be obsolete in the near future. Forbes. December 9, 2013. http://www.forbes.com/sites/theapothecary/2013/12/09/global-study-finds-majority-believe-traditional-hospitals-will-be-obsolete-in-the-near-future/ Accessed December 10, 2013.



John Mandrola’s Top 11 Cardiology Stories of 2013

by John M. Mandrola, MD

Clinical Electrophysiologist, Baptist Medical Associates, Louisville, Kentucky

Disclosure: John M. Mandrola, MD, has disclosed the following relevant financial relationships:
Served as a speaker or member of a speakers bureau for: Biosense/Webster

In Medscape Cardiology, December 20, 2013


1. Obamacare/Affordable Care Act

The reforms that sweep in with the tidal waves of Obamacare will transform the landscape of cardiology. Things look different already, but even more change is coming. Optimism is healthier than pessimism, so my assessment is: Obamacare will be associated with better heart disease outcomes.

Here’s why: What single factor limits improvement of outcomes in heart disease? It’s surely not a lack of access to echocardiograms, or new antiplatelet drugs, or LAA occlusion devices. Rather, it’s the lack of patients’ adherence to healthy lifestyle choices. Cardiologists have reached a therapeutic threshold. Gains in the treatment of heart disease have become and will likely stay incremental. The next big jump in heart disease outcomes will require patients’ actions — not doctors’.

The chief strength of Obamacare is that it ushers in the era of cost-shifting to patients. People will pay more for care. This, I believe, will favor the adoption of healthy lifestyles. Skin in the game will, on the whole, do great things for heart health. The car analogy: We get our car’s oil changed because preventive maintenance is cost-effective. If you never had to pay for a new car, there’d be little incentive not to trash your current one.

I can hear the naysayers: Placing more of the costs on patients will keep them from getting care. Yes, in isolated cases (which will surely be amplified), this might be true. But overall, 3 arguments refute this thinking: First, in the past decade, both deaths from heart disease and the number of cardiology procedures have declined. Patients are doing better while we do less. Second, countries that do far fewer procedures boast better CV outcomes. Third, you don’t really believe that doctors control outcomes, do you?

2. The George Bush Stent Case

More than 2 decades ago, a mentor at Indiana taught me that squishing a high-grade coronary lesion did not reduce the risk for heart attack or death. I still remember where I was when I heard that. It was that counterintuitive. The notion that the vulnerable plaque is not the one that looks like a baddie on an angiogram has been proven time and time again. What’s truly remarkable is the resistance of the cardiology community to accept it. Perchance, our visceral reactions to angiograms have clouded our interpretation of science.

Cynics would believe that the overuse of stents — in the face of contrary clinical evidence — is due to financial incentives. They point to examples of outrageous behavior by a tiny few outliers. I can’t deny that incentives play a role, but I think this story has more to do with the cognitive bias stemming from the success of acute primary angioplasty. It’s tempting to extend the stunning benefits of intervening in an acute MI to nonacute situations.

The George Bush story is big because the media attention forced us to look again at the science of the COURAGE trial.[1] What’s more, this story gave strength to those who question the entrenched paradigm of ischemia-guided revascularization. Imagine the implications for cardiology if there was little reason to look for asymptomatic ischemia.

3. Cholesterol Guidelines: Who Decides the “Need” for a Statin?

The cholesterol guidelines[2] had some obvious practice-changing revelations: (1) the end of nonstatin cholesterol-lowering drugs; (2) cessation of treating to numbers; (3) the notion of using statins as cardiovascular risk reducers, rather than cholesterol-lowering drugs; (4) the fight over what level of CV risk warrants statin intervention.

These are big issues, but I don’t see them as the biggest part of the 2013 cholesterol guideline story. I think what makes this a tipping point in clinical cardiology is the notion that the ultimate decision to take a statin falls to the patient.

Writing to patients in Forbes, Dr. Harlan Krumholz says:

It is your decision. Your doctors can guide you, but you deserve to be informed about the decision and make the choice that feels most comfortable to you. You do not know if you will be the person who avoids a heart attack or will suffer a side effect. You should have the information about what you are likely to gain by taking the medication — and what risks you are incurring. The decision to take the drug should mean that you believe that you are more likely to benefit from the drug than to be harmed by it. And even if a drug has a benefit for you, you have a right to decide whether it is right for you.

This is huge because it brings patient-centered, shared decision-making to the mainstream. Before the cholesterol guidelines, shared decision-making was something you read about in academic journals. But now, across doctors’ offices throughout the United States, low-risk patients will have to decide whether their 1-in-100 chance of preventing a heart attack is worth the 1-in-100 chance of developing diabetes or other statin side effects. Getting patients to see tradeoffs and NNTs, and aligning their care with their goals, isn’t just a story of 2013; it’s a story of the decade.
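The 1-in-100 framing maps directly onto number-needed-to-treat (NNT) and number-needed-to-harm (NNH) arithmetic. The sketch below uses the article’s own illustrative round numbers, not trial data:

```python
# Back-of-the-envelope only: the article's rough 1-in-100 figures for a
# low-risk patient considering a statin for primary prevention.
arr = 1 / 100   # absolute risk reduction: chance the statin prevents an MI
ari = 1 / 100   # absolute risk increase: chance of statin-induced diabetes

# NNT is the reciprocal of the absolute risk reduction; NNH is the
# reciprocal of the absolute risk increase.
nnt = 1 / arr
nnh = 1 / ari

print(f"NNT = {nnt:.0f}")  # treat 100 low-risk patients to prevent 1 MI
print(f"NNH = {nnh:.0f}")  # treat 100 to cause 1 case of diabetes
```

With NNT and NNH of the same magnitude, the tradeoff is close to a wash on paper, which is exactly why the decision belongs with the patient.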

JNC-8, Obesity and AF, and NOACs

4. High Blood Pressure Guidelines

I often tell this story to patients: When I was a younger doctor, I would take my 94-year-old grandfather around to see the best doctors in town. We both held to the fantasy that doctors could “fix” him. Mostly he had age-related problems. He did, however, own one shining beacon of good health: He had perfect blood pressure, without medication. My message to patients is that my grandfather lived to 94 because of those BP readings.

What I learned from my grandfather’s case, which has now been borne out in the new JNC 8 guidelines,[3] is that it matters how one achieves good blood pressure. The new guidelines, whose writing panel was chaired by a family medicine professor (how cool is that?), continue to disrupt the concept that more drug treatment leads to better outcomes.

It is indeed striking what can be found when one looks carefully and systematically at absolute benefits of treatments from randomized clinical trials. Truthfully, did you know that there was essentially no evidence that treating mild high blood pressure in patients younger than 60 improves outcomes? I didn’t.

Here the affect heuristic looms large. I find great pleasure in the idea that the medical establishment is now poised to embrace common sense. Namely, that modifying a single risk factor with a chemical that surely has multiple system-wide effects does not necessarily improve outcomes.

5. In Electrophysiology, Treat the Underlying Cause of AF

There are a few landmark studies I keep around the exam room for show-and-tell. 2013 brought another keeper. Dr. Prashanthan Sanders and colleagues (from Adelaide, Australia) are authors of the most impactful study in all of cardiology in 2013.[4]

Here is the story: Atrial fibrillation is increasing exponentially. Electrophysiologists see patients at the end of the disease spectrum. Rate control, rhythm control, and anticoagulation are each important treatment strategies, but they don’t address the root cause of AF. In previous work in animal models, this group of researchers showed that obesity increases the susceptibility to AF.

The hypothesis was that weight loss (and aggressive attention to other cardiometabolic risk factors) would reduce AF burden. They randomly assigned patients on their waiting list for AF ablation to 2 groups: (1) a physician-led aggressive program that targeted primarily weight loss, but also hypertension, sleep apnea, glucose control, and alcohol reduction; or (2) standard care with lifestyle counseling.

The findings were striking. Compared with the group of patients receiving standard care, patients in the physician-directed program lost weight, reported fewer AF symptoms, and had fewer AF episodes recorded. Most impressive were the structural effects noted on echocardiograms. Patients in the intervention group had regression of left ventricular hypertrophy and a reduction in left atrial size.

Though this is a small trial, it is practice-changing for cardiology. It shows that treating modifiable risk factors remodels the heart and in so doing reduces the burden of AF. In an interview in JAMA, Dr. Sanders says aggressive risk factor treatment should be a standard of care. I agree. Right now, AF ablation is too often thought of in terms of a supraventricular tachycardia ablation — a fix for a fluke of nature. It’s not that way. In the majority of AF cases, the same excesses that cause atherosclerosis also cause AF. Rather than make 50 burns in the atria, it makes much more sense to address the root cause.


6. Novel Anticoagulants Face Value-Based Headwinds

Tell me you haven’t been in this situation: You are making rounds on a patient with newly diagnosed AF, admitted the night before. She has multiple risk factors for stroke. Her heart rate has been controlled and her symptoms improved. There are now 2 choices for anticoagulation: (1) Start warfarin, and while waiting for an adequate INR, cover with IV-heparin (days in hospital) or low-molecular-weight heparin (teaching- and dollar-intensive); or (2) Begin a novel oral anticoagulant (NOAC) and discharge the patient that day. It’s so much easier to use NOAC drugs.

But then what happens when the “starter” kits run out and the patient faces a massive bill at the pharmacy, or her third-party payer denies payment? Now our patient has a problem. She is in AF and has risk factors for stroke. A gap in anticoagulation is not desirable.

At the heart of this issue is the value and superiority of NOAC drugs compared with warfarin. At the 2013 American Heart Association Sessions, the ENGAGE-AF trial showed that the newest NOAC drug, edoxaban, compared favorably to warfarin.[5] All 4 clinical trials of NOAC drugs vs warfarin looked strikingly similar — namely, that in absolute benefits (stroke reduction) and harm (bleeding), NOAC drugs and warfarin performed similarly, within 1% of each other. In the cost-conscious, evidence-based climate of 2013, NOAC drugs are increasingly recognized as overvalued. Warfarin, with all its imperfections, remains steady.

Transparency, End-of-Life Care, and TACT

7. The Sunshine Act

Cardiology is a drug- and device-intensive field. Collaboration with industry is necessary. Skillful use of stents, ICDs, ablation, and pharmaceutical agents has enhanced and saved the lives of millions of patients. Yet, there is clear evidence of overuse and misuse of expensive technology. Look no further than studies that show huge geographic practice variation,[6] which I wrote about here.

The 2013 Sunshine Act has changed the landscape of cardiology education and influence. The upside of transparency is that knowing the financial relationships of investigators is an important part of judging science. Perhaps more important, though, is the possibility that the Sunshine Act will help remove those with financial relationships from guideline writing. Given the influence of guidelines, it’s important that writers be free of conflicts.

The potential downsides of too much Sunshine are noteworthy. After being interviewed in the Wall Street Journal this August,[7] I wrote the following on my blog:

Doctors are a conservative lot. Concern over perception will surely decrease physicians’ interactions with industry, both the useful and not so useful ones. The effect on physician education might suffer. Though the Ben Goldacres of the world rightly emphasize bias when industry entwines itself with medical education, I can attest to have learned a lot from industry-sponsored programs. And this too: one thing that happens when industry sponsors a learning session is that doctors come to it. They talk; they share cases; they come together face-to-face. Such interactions are critical. Will the disappearance of sponsored sessions decrease the amount of face-to-face learning?

We shall soon learn whether all this sunshine enhances health or causes burns.

8. Compassionate Care of the Elderly

Cardiologists are programmed to see death as the enemy. This is a very good thing when treating diseases like STEMI. But a side effect of improving life-prolonging interventions is that patients live long enough to develop other problems. Cardiologists are increasingly asked to treat the elderly and the frail. And this is a challenge because in these patients, treating death as if it’s avoidable is perilous. Delaying death is not the same as prolonging life. Treating a disease is not the same as treating a person.

It’s possible that 2013 will be the year in which things changed for the better in the care of the elderly. And if it is, we will have Katy Butler, an author and investigative journalist, to thank. Ms. Butler’s 2013 book, Knocking on Heaven’s Door, poignantly chronicles the difficulties that both of her parents struggled with as they approached the end of life.[8] In both cases, suffering occurred because of a disconnect with cardiologists who behaved as if death were optional.

Writing in the Wall Street Journal this September, Ms. Butler describes her mother’s decision to forego aggressive intervention for valvular heart disease.[9] Despite being cared for in one of the nation’s elite heart hospitals, Ms. Butler’s mother was forced to fight hard for her right to self-determination. Perhaps she mustered the strength to fight for a good death because of the lessons she learned as a caregiver for her chronically ill husband, whose death was tragically prolonged at the hands of paternalistic cardiologists. In Ms. Butler’s father’s case, which she describes in an award-winning New York Times Magazine essay, cardiologists implanted an unnecessary pacemaker and then refused to deactivate it, against the family’s wishes.[10]

As the American College of Cardiology begins an awareness campaign for aortic stenosis, and transcatheter approaches to valvular disease begin their long road to clinical utility, no topic could be timelier than compassionate patient-centered care for the elderly. 2013 is the year that the oath of Maimonides — “Oh, God, Thou hast appointed me to watch over the life and death of Thy creatures” — becomes even more relevant to cardiologists, the guardians of technology.

9. Chelation Therapy

Nothing has become more virtuous in the practice of medicine than clinical evidence. We have set out the rules: The scientific method will determine the best treatments for our patients. One group gets treatment A and the other treatment B. Then we measure outcomes — the simpler the better. These are the rules of the game; they can’t be changed when we don’t like how the game turns out.

The TACT investigators have followed the rules. They compared 322 diabetic patients with coronary heart disease who were treated with chelation vs 311 similarly matched patients treated with placebo infusions.[11] The primary endpoint, a composite of death, MI, stroke, revascularization, and hospitalization for angina, occurred in 80 of 322 patients (25%) treated with chelation and 117 of 311 (38%) on placebo. That’s an absolute — not relative — reduction of 13%, and an astounding NNT of 7. For comparison, statin drugs for primary prevention, or NOAC drugs vs warfarin in patients with AF, have NNTs greater than 100.
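Those figures can be sanity-checked directly from the raw event counts quoted above; nothing here is new data, just the arithmetic behind the absolute risk reduction (ARR) and NNT:

```python
# Recompute ARR and NNT for the TACT diabetic subgroup from the
# event counts given in the text.
chelation_events, chelation_n = 80, 322   # 25% primary-endpoint rate
placebo_events, placebo_n = 117, 311      # ~38% primary-endpoint rate

# ARR is the difference in absolute event rates; NNT is its reciprocal.
arr = placebo_events / placebo_n - chelation_events / chelation_n
nnt = 1 / arr

print(f"ARR = {arr:.1%}")   # ~12.8%, matching the ~13% absolute reduction
print(f"NNT = {nnt:.1f}")   # ~7.8, the single-digit NNT the text cites
```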

What makes chelation in diabetics a top story of the year is more than just the data. By the authors’ own account, these findings need to be replicated. What’s really big here is the ferocity of opposition from the establishment. I re-read what I said in my opinion piece from November. I’m sticking to it: “It would be a huge mistake to dismiss this science because chelation does not conform to preconceived notions or because it is practiced outside the mainstream of medicine. Let’s not forget about the patients with this terrible disease. It’s not as if we have good treatments for them.”

EMRs and the Blogosphere

10. EMR and the Danger It Poses to the Patient-Doctor Connection

Among Mr. Obama’s broken promises (if you like your insurance plan…) was the claim that the efficiency inherent in electronic medical records (EMRs) would solve the problem of rising healthcare costs.

In 2013, nearly every doctor is being forced to adopt an EMR. Medicine is replete with examples of good ideas gone awry, and there is no better example than current EMR systems. The list is long: EMRs interface poorly with their users (doctors). Completing a medical record on an encounter for a common heart rhythm ailment requires me to click more than 25 times. (Fact: EMRs either decrease the number of patients one can see or, worse, cause a doctor to spend less face time with each patient.) EMRs don’t talk to each other — and in their current form, never will. There is not a shred of evidence that they improve real outcomes. EMRs function more as a billing invoice than as a useful medical record.

Doctors are the end-users but not the customers of EMR companies, so our feedback carries little weight. EMR companies effectively answer to no one. And talk about conflict of interest: Anointed EMR companies have become immensely profitable. Even the New York Times took notice.[12]

None of this is the worst part. The worst aspect of EMR systems (in their current form) is that they threaten to remove the humanity from something that at its heart should be human: the patient-doctor connection. In 2013, EMR is one of the many forces that threaten the patient-doctor relationship. If this situation improves in 2014, I’ll report it; but I’m not optimistic. (Full disclosure: I love computers.)

11. Social Media

The American College of Cardiology, the Heart Rhythm Society, the BMJ, and the New England Journal of Medicine are all actively engaged in social media and blogging. I gave a talk at an Indiana University medical student leadership conference this year. Nearly every medical student was on Twitter. So are the presidents of the ACC and SCAI, as are millions of patients.

The democracy of information on social media enhances patient involvement in medical decision-making. When patients have information, decisions improve. AF patient Mellanie True Hills has made her Website, StopAfib, a go-to resource for patients, a place where influential academic leaders in electrophysiology have taken the time to be interviewed. Social media empowers patient advocates.

Social media is also transforming influence. In the past, the only influencers in cardiology were academic leaders — those who have access to medical journals. That is changing. Look at me: I am a nobody in the academic world, yet Dr. Rich Fogel, the former president of the Heart Rhythm Society (HRS), put me on the same stage with Dr. Douglas Zipes, Dr. Brian Olshansky, and Dr. Anne Curtis at the 2013 HRS sessions to speak about ICDs.

Finally, this is speculative, but I believe that social media has the power to transform medical education. This year, the biggest electrophysiology story from the 2013 European Society of Cardiology Congress was the Echo-CRT trial.[13] This was a practice changer because it put a stop to implanting CRT devices in patients destined to be nonresponders. Dr. Jay Schloss (Christ Hospital, Cincinnati, Ohio), writing on his personal blog, provided clear and useful coverage for free, without the need for registration. Another example: I think IV-diltiazem is overused and misused. In the academic literature, you cannot find a contemporary piece to support this view. But you can on social media.

This is my top 11 for 2013. I invite you to use the comments section to share your top cardiology picks.







The Young Surgeon and The Retired Pathologist: On Science, Medicine and HealthCare Policy – The Best Writers Among the WRITERS

Curator: Aviva Lev-Ari, PhD, RN

Updated on 2/18/2016

Since January 2005, I have been a reader, curator, and author of scientific articles in Life Sciences and Medicine.

As of 2/18/2016, the Open Access Online Scientific Journal, launched in 4/2012, reports the following statistics.


Site Statistics


Date         Views to Date   # of Articles   NIH Clicks   Nature Clicks
7/29/2013    217,356         1,138           1,389        705
12/11/2013   293,694         1,464           1,693        828
10/17/2015   765,762         3,444           2,726        1,683
02/18/2016   886,454         4,162           2,911        1,813

By 2/18/2016, I had curated 2,333 articles; the list of titles spans 109 pages on http://pharmaceuticalintelligence.com

Links to each article can be found at


Frontiers in Cardiology – 653 articles


These articles have been viewed, since the first article was published on 4/30/2012, by more than 886,454 viewers.

Author and Curator             e-Readers   Journal Articles
Aviva Lev-Ari, PhD, RN         248,163     2,333
Larry H. Bernstein, MD, FCAP




Of all the readings and reviews I have completed to date, two Science and Medicine writers stand out:


I invite the e-Readers to join me on a language immersion during a LITERARY TOUR in Science, Medicine and HealthCare Policy.

Part One: The Young Surgeon

Eric J. Topol, MD: Editor’s Note on The Young Surgeon

Atul Gawande, MD, MPH, wears many hats, including those of surgeon, researcher, journalist, and author. In this segment of Medscape One-on-One, Dr. Gawande talks with Eric J. Topol, MD, about what inspires him, his plans for the future, and why he’s secretly a frustrated rock singer.

WATCH the INTERVIEW of December 06, 2013 on VIDEO

Eric Topol on Medscape > Medscape One-on-One

Atul Gawande on the Secrets of a Puzzle-Filled Career

Atul Gawande, MD, MPH


Atul Gawande is a surgeon, writer, and public health researcher. He practices general and endocrine surgery at Brigham and Women’s Hospital in Boston, and is Director of Ariadne Labs, a joint center for health systems innovation. He is Professor in the Department of Health Policy and Management at the Harvard School of Public Health and Professor of Surgery at Harvard Medical School. And he is also co-founder and chairman of Lifebox, an international not-for-profit implementing systems and technologies to reduce surgical deaths globally.

Soon after he began his residency, his friend Jacob Weisberg, editor of Slate, asked him to contribute to the online magazine. His pieces on the life of a surgical resident caught the eye of The New Yorker, which published several pieces by him before making him a staff writer in 1998.

A June 2009 New Yorker essay by Gawande[12] compared the health care of two towns in Texas to show why health care was more expensive in one than in the other. Using the town of McAllen, Texas, as an example, it argued that a revenue-maximizing, businessman-like culture (which can provide substantial amounts of unnecessary care) was an important factor in driving up costs, in contrast to the culture of low-cost, high-quality care provided by the Mayo Clinic and other efficient health systems.

Ezra Klein of The Washington Post called it “the best article you’ll see this year on American health care—why it’s so expensive, why it’s so poor, [and] what can be done.”[13] The article was cited by President Barack Obama during his attempt to get health care reform legislation passed by the United States Congress. The article “made waves”[14] and, according to Senator Ron Wyden, “affected [Obama’s] thinking dramatically”; Obama showed it to a group of senators and said, “This is what we’ve got to fix.”[15] After reading the New Yorker article, Warren Buffett’s long-time business partner Charlie Munger mailed Gawande a check for $20,000 as a thank-you for providing something so socially useful.[16] Gawande donated the $20,000 to the Brigham and Women’s Hospital Center for Surgery and Public Health.[17]

In addition to his popular writing, Gawande has published studies on topics including military surgery techniques and error in medicine in the New England Journal of Medicine. He is also the director of the World Health Organization‘s Global Patient Safety Challenge. His essays have appeared in The Best American Essays 2003, The Best American Science Writing 2002, The Best American Science Writing 2009 and The Best American Science and Nature Writing 2011.

He has been a staff writer for The New Yorker since 1998. He has written three bestselling books: Complications, which was a finalist for the National Book Award in 2002; Better, which was selected as one of the ten best books of 2007 by Amazon.com; and The Checklist Manifesto. He has won two National Magazine Awards, AcademyHealth’s Impact Award for highest research impact on health care, and a MacArthur Fellowship, and has been named one of the world’s hundred most influential thinkers by Foreign Policy and TIME.



RESEARCH by Dr. Atul Gawande

Tsai TC, Joynt KE, Orav EJ, Gawande AA, Jha AK. Variation in Surgical-Readmission Rates and Quality of Hospital Care. New England Journal of Medicine. Published online September 2013.

Funk LM, Conley DM, Berry WR, Gawande AA. Hospital Management Practices and Availability of Surgery in Sub-Saharan Africa: A Pilot Study of Three Hospitals. World Journal of Surgery. Published online August 2013.

Nehs MA, Ruan DT, Gawande AA, Moore FD Jr, Cho NL. Bilateral neck exploration decreases operative time compared to minimally invasive parathyroidectomy in patients with discordant imaging. World Journal of Surgery. Published online July 2013.

Joynt KE, Gawande AA, Orav EJ, Jha AK. Contribution of Preventable Acute Care Spending to Total Spending for High-Cost Medicare Patients. JAMA. Published online June 24, 2013.

McCrum ML, Joynt KE, Orav EJ, Gawande AA, Jha AK. Mortality for Publicly Reported Conditions and Overall Hospital Mortality Rates. JAMA. Published online June 24, 2013.

Spector JM, Lashoher A, Agrawal P, Lemer C, Dziekan G, Bahl R, Mathai M, Merialdi M, Berry W, and Gawande AA. Designing the WHO Safe Childbirth Checklist Program to Improve Quality of Care at Childbirth. International Journal of Gynecology & Obstetrics. Published online June 5, 2013.

Barnet CS, Arriaga AF, Hepner DL, Correll DJ, Gawande AA, Bader AM. Surgery at the End of Life. The Journal of the American Society of Anesthesiologists. Published online June 2013.

Bowman KG, Jovic G, Rangel S, Berry WR, Gawande AA. Pediatric emergency and essential surgical care in Zambian hospitals: A nationwide study. Journal of Pediatric Surgery. Published online June 2013.

Rice-Townsend S, Gawande A, Lipsitz S, Rangel SJ. Relationship between unplanned readmission and total treatment-related hospital days following management of complicated appendicitis at 31 children’s hospitals. Journal of Pediatric Surgery. Published online June 2013.

Eappen S, Lane BH, Rosenberg B, Lipsitz SA, Sadoff D, Matheson D, Berry WR, Lester M, Gawande AA. Relationship Between Occurrence of Surgical Complications and Hospital Finances. JAMA. April 17, 2013;309:1599-1606.

Kwok AC, Funk LM, Baltaga R, Lipsitz SR, Merry AF, Dziekan G, Ciobanu G, Berry WR, Gawande AA. Implementation of the World Health Organization Surgical Safety Checklist, Including Introduction of Pulse Oximetry, in a Resource-Limited Setting. Annals of Surgery. April 4, 2013.

Molina G, Funk LM, Rodriguez V, Lipsitz SR, Gawande A. Evaluation of Surgical Care in El Salvador Using the WHO Surgical Vital Statistics. World Journal of Surgery. Published online March 2013.

Arriaga AF, Bader AM, Wong JM, Lipsitz SR, Berry WR, Ziewacz JE, Hepner DL, Boorman DJ, Pozner CN, Smink DS, Gawande AA. Simulation-Based Trial of Surgical-Crisis Checklists. New England Journal of Medicine 2013;368:246-53.

Spector JM, Reisman J, Lipsitz S, Desai P, and Gawande AA. Access to Essential Technologies for Safe Childbirth: A Survey of Health Workers in Africa and Asia. BMC Pregnancy and Childbirth. February 20, 2013;13:43-49.

Wong JM, Panchmatia JR, Ziewacz JE, Bader AM, Dunn IF, Laws ER, Gawande AA. Patterns in neurosurgical adverse events: intracranial neoplasm surgery. Journal of Neurosurgery 2012;33(5):E16.

Wong JM, Ziewacz JE, Ho AL, Panchmatia JR, Kim AH, Bader AM, Thompson BG, Du R, Gawande AA. Patterns in neurosurgical adverse events: open cerebrovascular neurosurgery. Journal of Neurosurgery 2012;33(5):E15.




Nanevicz TM, Prince MR, Gawande AA, Puliafito CA. Excimer laser ablation of the lens. Archives of Ophthalmology. 1986;104(12):1825-9.

Selected References

  1. Dr Atul Gawande – 2014 Reith Lectures. BBC Radio 4. Retrieved October 18, 2014.
  2. Atul Gawande on Twitter.
  3. Atul Gawande: ‘If I haven’t succeeded in making you itchy, disgusted or cry I haven’t done my job’. The Guardian.
  4. Former Policymaker Opts for Hands-On Health Care. International Herald Tribune.
  5. MacArthur Fellows 2006: Atul Gawande.
  6. “Atul Gawande Named MacArthur Fellow”. Press release by Brigham and Women’s Hospital. September 19, 2006. Retrieved February 25, 2010.
  7. “Q&A with Atul Gawande, Part 2”. H&HN. June 30, 2011. Retrieved July 7, 2011.
  8. Why Do Doctors Fail? The Reith Lectures, Dr Atul Gawande: The Future of Medicine, Episode 1 of 4. BBC.
  9. “Atul Gawande: surgeon, health policy scholar, and writer”. Harvard Magazine. Sep–Oct 2009.
  10. Bates, D. W.; Gawande, A. A. (2003). “Improving Safety with Information Technology”. New England Journal of Medicine 348(25):2526. doi:10.1056/NEJMsa020847.
  11. Weiser, T. G.; Regenbogen, S. E.; Thompson, K. D.; Haynes, A. B.; Lipsitz, S. R.; Berry, W. R.; Gawande, A. A. (2008). “An estimation of the global volume of surgery: A modelling strategy based on available data”. The Lancet 372(9633):139. doi:10.1016/S0140-6736(08)60878-8.
  12. Gawande, A. A.; Studdert, D. M.; Orav, E. J.; Brennan, T. A.; Zinner, M. J. (2003). “Risk factors for retained instruments and sponges after surgery”. New England Journal of Medicine 348(3):229–35. doi:10.1056/NEJMsa021721. PMID 12529464.
  13. Gawande, A. A.; Thomas, E. J.; Zinner, M. J.; Brennan, T. A. (1999). “The incidence and nature of surgical adverse events in Colorado and Utah in 1992”. Surgery 126(1):66–75. doi:10.1067/msy.1999.98664. PMID 10418594.

Dr. Atul Gawande’s Articles in the New Yorker

States of Health
New Yorker
October 7, 2013

Slow Ideas
New Yorker
July 29, 2013

Why Boston’s Hospitals Were Ready
New Yorker
April 17, 2013

Big Med
New Yorker
August 6, 2012

Something Wicked This Way Comes
New Yorker
June 28, 2012

Failure and Rescue
New Yorker
June 4, 2012

200 Years of Surgery
New England Journal of Medicine
May 2, 2012

Personal Best
The New Yorker
September 26, 2011

A Townie Speaks
Ohio University Commencement Address
June 11, 2011

Cowboys and Pit Crews
2011 Harvard Medical School Commencement Address
May 26, 2011

The Hot Spotters
The New Yorker
January 17, 2011

Seeing Spots
The New Yorker News Desk
January 27, 2011

Letting Go
The New Yorker
July 26, 2010

Now What?
The New Yorker
Apr 5, 2010

Testing, Testing 
The New Yorker
Dec 14, 2009

The Cost Conundrum Redux
The New Yorker
News Desk Blog
Jun 23, 2009

The Cost Conundrum 
The New Yorker
Jun 1, 2009

The New Yorker
Mar 30, 2009

Getting There from Here 
The New Yorker
Jan 26, 2009

The Itch 
The New Yorker
Jun 30, 2008

A Lifesaving Checklist 
The New York Times
Dec 30, 2007

The Checklist 
The New Yorker
Dec 10, 2007

Sick and Twisted
The New Yorker
Jul 23, 2007

The Obama Health Plan
The New York Times
May 31, 2007

A Katrina Health Care System 
The New York Times
May 26, 2007

Rethinking Old Age
The New York Times
May 24, 2007

Let’s Talk About Sex 
The New York Times
May 19, 2007

Doctors, Drugs, and the Poor 
The New York Times
May 17, 2007

Bad Medicine, Sneaking In 
The New York Times
May 12, 2007

Curing the System
The New York Times
May 10, 2007

Can This Patient Be Saved? 
The New York Times
May 5, 2007

The Power of Negative Thinking
The New York Times
May 1, 2007

The Way We Age Now 
The New Yorker
Apr 30, 2007

The Score
The New Yorker
Oct 9, 2006

The Malpractice Mess
The New Yorker
Nov 14, 2005

The New Yorker
Apr 4, 2005

The Bell Curve
The New Yorker
Dec 6, 2004

The Mop-Up
The New Yorker
Jan 12, 2004

Desperate Measures
The New Yorker
May 5, 2003

Cold comfort
The New Yorker
Mar 11, 2002

The learning curve
The New Yorker
Jan 28, 2002

The man who couldn’t stop eating
The New Yorker
Jul 9, 2001

Final cut
The New Yorker
Mar 19, 2001

Crimson tide
The New Yorker
Feb 12, 2001

Under suspicion
The New Yorker
Jan 8, 2001

When good doctors go bad
The New Yorker
Aug 7, 2000


The Gist: Persian Gulf War Syndrome
The Gist
Oct 25, 1996


The Checklist Manifesto – A New York Times Bestseller and an Amazon Best Book of the Month, December 2009

Better – One of Amazon.com’s 10 Best Books of 2007

Complications – “Essential Reading For Anyone Involved In Medicine” (Amazon.com, 2002)
An avalanche of unnecessary medical care is harming patients physically and financially. What can we do about it?
Annals of Health Care MAY 11, 2015

It was lunchtime before my afternoon surgery clinic, which meant that I was at my desk, eating a ham-and-cheese sandwich and clicking through medical articles. Among those that caught my eye: a British case report on the first 3-D-printed hip implanted in a human being, a Canadian analysis of the rising volume of emergency-room visits by children who have ingested magnets, and a Colorado study finding that the percentage of fatal motor-vehicle accidents involving marijuana had doubled since its commercial distribution became legal. The one that got me thinking, however, was a study of more than a million Medicare patients. It suggested that a huge proportion had received care that was simply a waste.

Millions of Americans get tests, drugs, and operations that won’t make them better, may cause harm, and cost billions.
Our medical systems are broken. Doctors are capable of extraordinary (and expensive) treatments, but they are losing their core focus: actually treating people. Doctor and writer Atul Gawande suggests we take a step back and look at new ways to do medicine — with fewer cowboys and more pit crews.

Being Mortal: Medicine and What Matters in the End (October 7, 2014)

Part Two: The Retired Pathologist 

On Science, Medicine and HealthCare Policy – The Best Writers Among the WRITERS

Roles at http://pharmaceuticalintelligence.com

Chief Scientific Officer, Member of the Board

Research Categories OWNER:

  • Biomarkers & medical diagnosis in Pathology (Co-Owner)
  • Clinical Trials and IRB related Issues
  • Acute and Chronic Disease Classifications
  • Biomarker Discovery and Validation
  • Cardiovascular Research
  • Clinical Laboratory-Related Issues
  • Healthcare and Hospital Costs
  • Health Information Technology  and Workflow Redesign
  • Metabolomics
  • Metabolic Derangements
  • Nutraceuticals
  • Nutrigenomics
  • Nutrition
  • Nutrition and Phytochemistry
  • Proteomics
  • Statistical Methods for Research Evaluation
  • Systemic Inflammatory Response Related Disorders


Larry H. Bernstein, M.D., FCAP – My Life in Medicine 


I retired in 2008 from a five-year position as Chief of the Division of Clinical Pathology (Laboratory Medicine) at New York Methodist Hospital, a Weill Cornell affiliate in Park Slope, Brooklyn, followed by an interim consultancy at Norwalk Hospital in 2010. I then became engaged with a medical informatics project called “Second Opinion” with Gil David and Ronald Coifman, Emeritus Professor and Chairman of the Department of Mathematics in the Program in Applied Mathematics at Yale. I went to Prof. Coifman with a large database of 30,000 hemograms, the most commonly ordered test in medicine because it elucidates the red cell, white cell, and platelet populations in the blood. The problem boiled down to the level of noise that exists in such data, and to developing a primary evidence-based classification that technology did not support until the first decade of the 21st century.

Realtime Clinical Expert Support and Validation System

Gil David and Larry Bernstein have developed, in consultation with Prof. Ronald Coifman, in the Yale University Applied Mathematics Program, a software system that is the equivalent of an intelligent Electronic Health Records Dashboard that provides empirical medical reference and suggests quantitative diagnostics options.

Our dashboard is a visual display of essential metrics. The primary purpose is to gather medical information, generate metrics, analyze them in realtime and provide a differential diagnosis, meeting the highest standard of accuracy. The diagnosis provides a risk assessment to the patient’s medical condition, while locating and presenting similar cases of other patients with the same anomalous profile and their corresponding treatment and followup. Given medical information of a patient, the system builds its unique characterization and provides a list of other patients that share this unique profile, therefore utilizing the vast aggregated knowledge (diagnosis, analysis, treatment, etc.) of the medical community.

The main mathematical breakthroughs are provided by accurate patient profiling and inference methodologies in which anomalous subprofiles are extracted and compared to potentially relevant cases. Our methodologies organize numerical medical data profiles into demographics and characteristics relevant for inference and case tracking. As the model grows and its knowledge database is extended, the diagnostic and the prognostic become more accurate and precise.

We anticipate that implementing this diagnostic amplifier would result in higher physician productivity at a time of great human resource limitations, safer prescribing practices, rapid identification of unusual patients, better assignment of patients to observation, inpatient beds, intensive care, or clinic referral, and shortened ICU and inpatient stays.

[Second Opinion 2009-2011 Proprietary]

As an example, inputs from test data such as hematology results are processed for anomaly characterization and compared with similar anomalies in a database of 30,000 patients, providing diagnostic statistics, warning flags, and risk assessment. These are based on prior experience, including diagnoses and treatment outcomes (collective experience). The system was trained on this database of patients to build its learning knowledge base, and was then used to analyze and diagnose 5,000 new patients. Our system successfully identified the main risks with very high accuracy (more than 96%) and a very low false rate (less than 0.5%).
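The anomaly-characterization step described above can be sketched, under simple assumptions, as flagging which components of a new patient's result vector deviate from a reference population. This is an illustrative stand-in, not the proprietary Second Opinion implementation; parameter names and the z-score cut-off are assumptions:

```python
import numpy as np

def anomaly_profile(reference: np.ndarray, patient: np.ndarray, z_cut: float = 2.0):
    """Return per-parameter z-scores for one patient against a reference
    population (rows = patients, columns = lab parameters), plus a boolean
    flag for each value lying more than z_cut standard deviations from
    the reference mean. Flagged parameters form the 'anomalous subprofile'
    that would then be compared with similar cases in the database."""
    mu = reference.mean(axis=0)
    sigma = reference.std(axis=0)
    z = (patient - mu) / sigma
    return z, np.abs(z) > z_cut
```

A usage sketch: given a matrix of 30,000 historical hemograms as `reference`, a new hemogram's flagged components index which prior patients share the same anomalous profile.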

The main benefit is a real-time assessment, together with diagnostic options based on comparable cases and flags for risk and potential problems, as illustrated in the following case acquired on 04/21/10. The patient was diagnosed by our system with severe SIRS at a grade of 0.61.

The patient was treated for SIRS and the blood tests were repeated during the following week. Following treatment, the SIRS risk was eliminated as a major concern, and the system provided positive feedback on the physician’s treatment.

Our demo system, which can be used with our existing database or with your own data, resides online at:

http://netlab2.math.yale.edu:30049/cgi-bin/second opinion.py

[Second Opinion 2009-2011 Proprietary]

I have been reviewing manuscripts somewhat frequently for Nutrition, Clin Chem Lab Med, Clin Biochem, and J Ped Hem Oncol., and serve on the Editorial Advisory Board of Nutrition.

I was Chief of Clinical Pathology at NY Methodist Hospital Weill-Cornell, a 600+ bed hospital in Park Slope, Brooklyn (2 hours from Bridgeport, CT), where I worked for 5 years. I was previously Chief of Clinical Chemistry and Chief of the Blood Bank at Bridgeport Hospital for 20 years, and Acting Chairman of the Yale University Department of Pathology at Bridgeport Hospital for one year prior to my time at NY Methodist.

My work with nutrition is extensive. As a consulting pathologist on the Nutritional Support Team, I worked closely with the Burn Unit at Bridgeport Hospital, led by Dr. Walter Pleban, the first physician expert in burn and wound care to use TPN in Connecticut. I rejected the dependence on serum albumin and implemented the first use of prealbumin (transthyretin, TTR; half-life of 2 days) to follow the return of severely stressed patients to anabolic status, starting with immunodiffusion plates from Behring Diagnostics, then converting to batch turbidimetric assays on the Roche centrifugal analyzer, and finally running on a Beckman. My lab was the only one to achieve reliable measurements down to 20 mg/L. I co-chaired the First International Transthyretin Congress in Strasbourg, chaired the 14th Ross Roundtable on Nutrition and was an invited participant in the 17th, organized and chaired the Beckman Roundtable on Prealbumin in Los Angeles, was responsible, with Lawrence Kaplan, for the first AACC document on Standards of Clinical Laboratory Practice, and received the Labbe/Garry award of the Nutrition Division of the AACC. I did some of the earliest work on point-of-care diagnostics in neonatal care. My work with creatine kinase isoenzyme MB and isoenzyme 1 of LD goes back to my residency and my long-term contact with Burton Sobel. The improved use of troponins and NT-proBNP has been an ongoing project for the last 10 years, some of it supported by Roche Diagnostics on the recommendation of Pauline Lau and Bernard Statland. The project of normalizing NT-proBNP for age and estimated glomerular filtration rate (eGFR) was successful, but widespread implementation has been even more gradual than it was for TTR.

I have served on the Board of Directors of NAACLS and the American Library Association Commission on Accreditation, am listed in America’s Top Physicians, Marquis Who’s Who in Science and Engineering and Marquis’ Who’s Who in Medicine, Who’s Who in Pathology, Continental Who’s Who, Strathmore’s Who’s Who, and have 3 patents.

Selected Peer Reviewed publications

1. Rosser A. Rudolph, Larry H. Bernstein, and Joseph Babb. Information Induction for Predicting Acute Myocardial Infarction. CLIN CHEM 1988; 34(10): 2031-2038.

2. Zarich SW, Bradley K, Mayall ID, Bernstein LH. Minor elevations in troponin T values enhance risk assessment in emergency department patients with suspected myocardial ischemia: analysis of novel troponin T cut-off values. Clin Chim Acta 2004; 343:223-29.

3. Bernstein, L.H.; Devakonda, A.; Engelman, E.; Pancer, G.; Ferrara, J.; Rucinski, J.; Raoof, S.; George, L.; Melniker, L. The Role of Procalcitonin in the Diagnosis of Sepsis and Patient Assignment to Medical Intensive Care.  J Clinical Ligand Assay, 2007; 30 (3-4):98-104

Older patients make up a large part of the ICU population and tend to have an acute stressful condition superimposed on chronic illness. The effects of anorexia, hypermetabolism, and malabsorption in these patients lead to substantial nitrogen losses. The most widely used methods for assessing malnutrition are the Subjective Global Assessment (SGA), TTR, and a combination of laboratory and biochemical features. The simplest of these, transthyretin (TTR), has become a commonly assayed protein for assessing PEM status. Clinical studies indicate that determination of the TTR level may allow earlier recognition of malnutrition risk and timely intervention. Since TTR has a relatively short circulating half-life, it is expected to respond rapidly to metabolic support. TTR production decreases after 14 days of consuming a diet that provides only 60% of required protein. Rapid-turnover proteins such as transthyretin (half-life < 2 days) respond early to nutrition support and reflect a delayed return to anabolic status. TTR is particularly helpful in removing interpretation bias, and it is an excellent measure of the systemic inflammatory response concurrent with a preexisting state of chronic inanition. In the ICU patients we studied, TTR removed interpretation bias because the sickest patients experienced an uncommonly delayed return of TTR to normal levels with adequate nutritional support.
Devakonda A, et al. Transthyretin as a marker to predict outcome in critically ill patients. Clin Biochem (2008), doi:10.1016/j.clinbiochem.2008.06.016
Keywords: Protein energy malnutrition; Critically ill patients; Stress hypermetabolism; Transthyretin; Multivariate classification.

4. Bernstein LH, Zions MY, Haq SA, Zarich S, Rucinski J, Seamonds B, Berger S, Lesley DY, Fleischman W, Heitner JF: Effect of renal function loss on NT-proBNP level variations. Clin Biochem; 2009;42(10-11):1091-8 [PMID: 19298805]

OBJECTIVE: The NT-proBNP level is used for the detection of acute CHF and as a predictor of survival. However, a number of factors, including renal function, may affect NT-proBNP levels. This study aims to provide a more precise way of interpreting NT-proBNP levels based on GFR, independent of age. METHODS: This study includes 247 patients in whom CHF and known confounders of elevated NT-proBNP were excluded, to show the relationship of GFR to NT-proBNP in association with age. The effect of eGFR on the NT-proBNP level was adjusted by dividing 1000 x log(NT-proBNP) by eGFR and then further adjusting for age in order to determine a normalized NT-proBNP value. RESULTS: The normalized NT-proBNP levels were affected by eGFR independent of the age of the patient. CONCLUSION: A normalizing function based on eGFR eliminates the need for age-based reference ranges for NT-proBNP.
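The renal adjustment described in the Methods can be written down directly. A minimal sketch, assuming a base-10 logarithm and NT-proBNP in pg/mL (the abstract specifies neither), and omitting the further age adjustment, which is not detailed here:

```python
from math import log10

def normalized_nt_probnp(nt_probnp: float, egfr: float) -> float:
    """Renal-function adjustment from the study: 1000 * log(NT-proBNP)
    divided by eGFR. Log base and units are assumptions; the subsequent
    age adjustment described in the abstract is intentionally omitted
    because its form is not given."""
    return 1000.0 * log10(nt_probnp) / egfr
```

For example, an NT-proBNP of 1000 pg/mL with an eGFR of 50 mL/min/1.73 m² yields a normalized value of 60 under these assumptions.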

5. David G, Bernstein LH, Coifman RR.  Generating Evidence Based Interpretation of Hematology Screens via Anomaly Characterization. The Open Clinical Chemistry Journal, 2011; 4:10-16. ISSN 1874-2416/11. Bentham Journal.
Introduction: We propose an automated nutritional assessment (ANA) algorithm that provides a method for malnutrition risk prediction with high accuracy and reliability. Materials and Methods: The database used for this study is a file of 432 patients, where each patient is described by 4 laboratory parameters and 11 clinical parameters. A malnutrition risk assessment of low (1), moderate (2) or high (3) was assigned by a dietitian for each patient. An algorithm for data organization and classification via characteristic metrics is proposed. For each patient, the algorithm characterizes its unique profile and builds a characteristic metric to identify similar patients, who are mapped into a classification. Results: The algorithm assigned a malnutrition risk level for each patient based on different training sizes taken out of the data. Our method resulted in an average error (distance between the automated score and the real score) of 0.386, 0.3507, 0.3454, 0.34 and 0.2907 for 10%, 30%, 50%, 70% and 90% training sizes, respectively. Our method outperformed the compared method even when it used a smaller training set than the compared method. In addition, we show that the laboratory parameters themselves are sufficient for the automated risk prediction, and adding the clinical parameters does not improve the accuracy. We present an organization of the patients into several clusters and sub-clusters. These clusters correspond to low-risk, low-moderate-risk, moderate-risk, moderate-high-risk and high-risk areas. The organization and visualization methods provide a tool for exploration and navigation of the data points. Discussion: The problem of rapidly identifying the risk and severity of malnutrition is crucial for minimizing medical and surgical complications associated with previsit under-nutrition, chronic illness affecting swallowing and eating, and weight loss.
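The profile-matching idea in this abstract, assigning a new patient the risk level of the most similar training profiles, can be sketched as a nearest-neighbour classifier. This is an illustrative analogue under simple assumptions (Euclidean distance, majority vote), not the published characteristic-metric algorithm:

```python
import numpy as np

def predict_risk(train_X: np.ndarray, train_y: np.ndarray,
                 x: np.ndarray, k: int = 5) -> int:
    """Assign the majority risk level (1=low, 2=moderate, 3=high) among
    the k training patients whose lab-parameter profiles are closest to
    profile x. Distance metric and k are illustrative choices."""
    d = np.linalg.norm(train_X - x, axis=1)       # distance to each training profile
    nearest = train_y[np.argsort(d)[:k]]          # risk levels of the k nearest
    vals, counts = np.unique(nearest, return_counts=True)
    return int(vals[np.argmax(counts)])           # majority vote
```

In the study's terms, `train_X` would hold the 4 laboratory parameters per patient and `train_y` the dietitian-assigned risk levels; varying the training fraction mimics the 10%-90% experiments.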

6. Brugler L, Stankovic AK, Schlefer M, Bernstein L. A simplified nutrition screen for hospitalized patients using readily available laboratory and patient information. Nutrition 2005; 21(6): 650-658

Results: The analysis demonstrated that the characteristics correlating best with MRC risk-level assignment were: the occurrence of a wound (p=2.5e-14), poor oral intake (p=3.2e-14), malnutrition-related admission diagnosis (p=3.9e-9), serum albumin value (p=1.4e-31), hemoglobin value (p=3.3e-10), and total lymphocyte count (p=1.4e-29). The 6-variable model had an R2 of 0.773 and p = 4.6e-116. A second model had 4 variables (malnutrition-related admission diagnosis, serum albumin value, hemoglobin value and total lymphocyte count) and 3 (high, moderate and low) versus 4 (high, moderate, low and no) MRC risk levels, with an R2 of 0.721 and p = 1.6e-104. Discussion: The ability of admission information to accurately reflect MRC risk is crucial to early initiation of restorative medical nutrition therapy (MNT), the efficient utilization of nutrition care resources, and compliance with regulatory requirements. There is currently no uniform or proven standard for identifying MRC risk within 24 hours of acute care admission. The ideal nutrition screen correlates well with the occurrence of MRCs and also contains parameters that can be quickly and routinely obtained at admission. The six- and even four-parameter models described above meet both criteria, and they can be uniformly used by hospitals to screen patients for MRC risk.

7. Larry H. Bernstein and James Rucinski. The relationship between granulocyte maturation and the septic state: measurement of granulocyte maturation may improve the early diagnosis of the septic state. Clin Chem Lab Med 2011;49. DOI 10.1515/CCLM.2011.688

Methods: This study calibrates and validates the measurement of granulocyte maturation with immature granulocytes (IG) for the identification of sepsis, carried out on a Sysmex Analyzer, model XE 2100 (Kobe, Japan). The Sysmex IG parameter is a crucial measure of immature granulocyte counts and includes metamyelocytes and myelocytes, but not band neutrophils. Results and conclusions: We found agreement with previous work that designated an IG measurement cut-off of 3.2 as optimal. The analysis was then carried a step further with a multivariable discriminator.

8. Larry H Bernstein and Johannes Everse. Studies on the Mechanism of the Malate Dehydrogenase Reaction. J Biol Chemistry.  Dec 25, 1978; 253(24): 8702-8707.

These studies determine the levels of malate dehydrogenase isoenzymes in cardiac muscle by a steady state kinetic method which depends on the differential inhibition of these isoenzyme forms by high concentrations of oxaloacetate. This inhibition is similar to that exhibited by lactate dehydrogenase in the presence of high concentrations of pyruvate. The results obtained by this method are comparable in resolution to those obtained by CM-Sephadex fractionation and by differential centrifugation for the analyses of mitochondrial malate dehydrogenase and cytoplasmic malate dehydrogenase in tissues. The use of standard curves of percent inhibition of malate dehydrogenase activity plotted against the ratio of mitochondrial MDH activity to the total of mMDH and cMDH activities [ malate dehydrogenase ratio] (percent m-type) is introduced for studies of comparative mitochondrial function in heart muscle of different species or in different tissues of the same species.

9. MB Grisham, LH Bernstein, J Everse. The cytoplasmic malate dehydrogenase in neoplastic tissues: presence of a novel isoenzyme? Br J Cancer 1983; 47: 727-731

Malate dehydrogenase (MDH, EC 1.1.1.37) catalyzes the reversible reduction of oxaloacetate to malate in the presence of NADH. In eukaryotic cells the enzyme is generally present as two distinct isoenzymes; one form is found in the cellular cytosol and the other exclusively in the mitochondria. These 2 isoenzymes form part of a shuttle system (the malate-aspartate shuttle) that functions as the major mechanism for the transport of reducing equivalents between the cytosol and the mitochondria. As part of our ongoing studies on the mechanism of action and metabolic function of the malate dehydrogenases (Bernstein et al. 1978; Bernstein & Everse, 1978; Bernstein & Grisham 1978), we recently investigated the kinetic properties of the 2 isoenzymes present in rat Novikoff hepatoma tissues. These studies were initiated to evaluate whether or not the enzymes in the malate-aspartate shuttle of tumour tissues are structurally and functionally identical to those of normal tissues.

Fresh tumour or liver was homogenized with a glass tissue homogenizer in 0.1 M potassium phosphate buffer, pH 7.5, containing 0.25 M sucrose, centrifuged to remove tissue debris, and the supernatant was then centrifuged to obtain a supernatant containing the cytoplasmic enzymes. This supernatant did not contain any isocitrate dehydrogenase or transhydrogenase activity and was therefore judged to be free of mitochondrial enzymes. This high-speed supernatant was used without further fractionation for the determination of the cytoplasmic MDH activity. Mitochondria were prepared by suspending the pellet in 0.1 M phosphate buffer, pH 7.5, containing 0.25 M sucrose, centrifuging the suspension at 600 g, re-centrifuging at 20,000 g for 30 min, and collecting and washing the precipitate, which was then suspended in phosphate buffer and sonicated for 1 min. The resulting solution was used for the assays of the mitochondrial enzyme. The assays were performed in 0.1 M phosphate buffer, pH 7.0, at room temperature with a Beckman Model 24 recording spectrophotometer.

We found that the Km values of the mitochondrial enzyme from the hepatoma tissue were identical to those obtained with the enzyme from normal liver mitochondria. The cytoplasmic enzymes also have identical Km values for the coenzyme; however, the Lineweaver-Burk plots for oxaloacetate were non-identical. Whereas the Km value for oxaloacetate obtained with the liver enzyme was ~55 µM, the Lineweaver-Burk plot obtained with the hepatoma enzyme displayed 2 slopes. One of the slopes corresponded to a Km value approximately identical to that of the liver enzyme, whereas the other slope yielded a Km value for oxaloacetate of ~1 mM. We interpret these data to indicate that Novikoff hepatoma tissue contains 2 cytoplasmic enzymes that possess MDH activity, one of which closely resembles that present in rat liver cytoplasm. The other enzyme, having a Km of ~1 mM, is not found in normal liver tissue.

Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View?

Author: Larry H. Bernstein, MD, FCAP  

Article Published 10/17/2012 — 4,111 VIEWS on 12/10/2013

Top Author Views in 12 mo
larryhbern 40,730

Electronic Books EDITORIAL 

Series A: e-Books on Cardiovascular Diseases

Content Consultant: Justin D Pearlman, MD, PhD, FACC

Volume One: Perspectives on Nitric Oxide

Sr. Editor: Larry Bernstein, MD, FCAP, Editor: Aviral Vatsa, PhD and Content Consultant: Stephen J Williams, PhD

available on Kindle Store @ Amazon.com


Volume Two: Cardiovascular Original Research: Cases in Methodology Design for Content Co-Curation

Curators: Justin D Pearlman, MD, PhD, FACC, Larry H Bernstein, MD, FCAP, Aviva Lev-Ari, PhD, RN

  • Causes
  • Risks and Biomarkers
  • Therapeutic Implications

Volume Three: Etiologies of CVD: Epigenetics, Genetics & Genomics

Curators: Larry H Bernstein, MD, FCAP and Aviva Lev-Ari, PhD, RN

  • Causes
  • Risks and Biomarkers
  • Therapeutic Implications

Genomics and Medicine by Prof. Marcus Feldman, Stanford University

Volume Four: Therapeutic Promise: CVD, Regenerative & Translational Medicine

Curators: Larry H Bernstein, MD, FCAP and Aviva Lev-Ari, PhD, RN

  • Causes
  • Risks and Biomarkers
  • Therapeutic Implications

Volume Five: Pharmaco-Therapies for CVD

Curators: Justin D Pearlman, MD, PhD, FACC and Aviva Lev-Ari, PhD, RN

  • Causes
  • Risks and Biomarkers
  • Therapeutic Implications

Volume Six: Interventional Cardiology and Cardiac Surgery

Curators: Justin D Pearlman, MD, PhD, FACC, Larry H Bernstein, MD, FCAP, Aviva Lev-Ari, PhD, RN

  • Causes
  • Risks and Biomarkers
  • Therapeutic Implications

Series B: e-Books on Genomics & Medicine

Content Consultant: Larry H Bernstein, MD, FCAP

Volume 1: Genomics and Individualized Medicine

Sr. Editor: Stephen J Williams, PhD

Editors: Larry H Bernstein, MD, FCAP and Aviva Lev-Ari, PhD, RN

Volume 2: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS & BioInformatics, Simulations and the Genome Ontology

Editors: Stephen J Williams, PhD and Aviva Lev-Ari, PhD, RN

Volume 3: Institutional Leadership in Genomics

Editors: Marcus Feldman, PhD and Aviva Lev-Ari, PhD, RN 

Series C: e-Books on Cancer & Oncology

Content Consultant: Larry H Bernstein, MD, FCAP

Volume 1: Cancer and Genomics

Sr. Editor: Stephen J Williams, PhD

Editors: Ritu Saxena, PhD, Tilda Barliya, PhD

Volume 2: Radiation Oncology & Immunotherapy in Cancer

Editor: Larry H Bernstein, MD, FCAP

Volume 3: Nanotechnology and Drug Delivery

Editor and Author: Tilda Barliya, PhD

Series D: e-Books on BioMedicine

Volume 1: Metabolomics

Sr. Editor: Larry H Bernstein, MD, FCAP

Series E: Patient-Centered Medicine

Expert, Author, Writer: Larry H Bernstein, MD, FCAP

Editor: Larry H Bernstein, MD, FCAP

ARTICLES on http://pharmaceuticalintelligence.com

12/10/2013: 276 Scientific Articles
FIRST Article on This Open Access Scientific Journal, 7/28/2013, 569 Views:

The role of biomarkers in the diagnosis of sepsis and patient management

L. H. Bernstein, MD, FCAP

LIST of 276 ARTICLES on http://pharmaceuticalintelligence.com

Recommended Reading by the Curator of this article:

The Essential Role of Nitric Oxide and Therapeutic NO Donor Targets in Renal Pharmacotherapy


Author’s Selection of his Top Articles to date on the Journal

  • Developments in the Genomics and Proteomics of Type 2 Diabetes Mellitus and Treatment Targets


  • Vegan Diet is Sulfur Deficient and Heart Unhealthy
  • Erythropoietin (EPO) and Intravenous Iron (Fe) as Therapeutics for Anemia in Severe and Resistant CHF: The Elevated N-terminal proBNP Biomarker

Selected citations to peer reviewed publications

 Clinical value of NT-proBNP assay in the emergency department for the diagnosis of heart f…
Archives of gerontology and geriatrics 05/2015; 61(2). DOI:10.1016/j.archger.2015.05.001
 Diagnostic Yield of Targeted Next-Generation Sequencing in Various Cancer Types: An Inform…
Cancer Genetics 05/2015; 208(9). DOI:10.1016/j.cancergen.2015.05.030
Information induction for predicting acute myocardial infarction…
The economic cost of hospital malnutrition in Europe; a narrative review
e-SPEN the European e-Journal of Clinical Nutrition and Metabolism 04/2015; DOI:10.1016/j.clnesp.2015.04.003

Plasma Transthyretin as a Biomarker of Lean Body Mass and Catabolic States

Ingenbleek, Y., Bernstein, L.H.
Advances in Nutrition, Sep 2015

Transthyretin as a marker to predict outcome in critically ill patients
Devakonda, A., George, L., Raoof, S., Esan, A., Saleh, A., Bernstein, L.H.
Clinical Biochemistry  2008; 41(14-15):1126 – 1130

Biomarkers in critically ill patients with systemic inflammatory response syndrome or sepsis supplemented with high-dose selenium
Brodska, H., Valenta, J., Malickova, K., Kohout, P., Kazda, A., Drabek, T.
Journal of Trace Elements in Medicine and Biology 2015; 31:25-32

Read Full Post »

Stanford Dropout is Already Drawing Comparisons with Steve Jobs

Larry H Bernstein, MD, Reporter

Eric Topol’s Medscape interview with a 29-year-old Stanford University dropout is fascinating.

Editor’s Note:

If 29-year-old Elizabeth Holmes has her way, patients will no longer have to go to physicians’ offices, hospitals or laboratories to get high-complexity diagnostic blood tests. Nor will vial after vial of blood draws be necessary to do these tests.

Barely out of the gate after a decade of secrecy, the Stanford dropout is already drawing comparisons with Steve Jobs (she often wears the same black turtleneck). And her company, Theranos, Inc., which emerged from the shadows in September, just might be healthcare’s answer to Apple.[1] The so-called disruptive technology that Ms. Holmes, a former engineering major, and Theranos have created is said to have the potential to shake up and forever change the way laboratory medicine is conducted. Since forgoing college at 19, Ms. Holmes has secured millions of dollars in funding for her new venture, including $45 million in private equity funding in 2010.[2] The board of directors of her company is a Who’s Who of distinguished former and current technology, academic, and government officials.[2,3]

In an exclusive interview, Ms. Holmes talks to Medscape Editor-in-Chief Eric J. Topol, MD, about the decade she spent building her company; plans for the present and the future, including a recent deal with Walgreens drugstores; and whether she’s on the path to the creative destruction of laboratory medicine.

Leaving Stanford at Age 19

Dr. Topol: Hello. I’m Dr. Eric Topol, Editor-in-Chief of Medscape. Joining me today for Medscape One-on-One is Elizabeth Holmes, Founder, President, and CEO of Theranos.  We are here in Palo Alto, California, at the company’s headquarters. Elizabeth, welcome. This is going to be a fascinating discussion.

Ms. Holmes: Thank you. It’s wonderful to be here and have you here.

Dr. Topol: This is a story that has been brewing for a long time. You were at Stanford University, and at age 19 you decided to change your path. Is that right?

Ms. Holmes: Yes.

Dr. Topol: What made you think, “I’m on to something, and I don’t want to do college; I’ve got something else that’s  probably bigger than that”?

Ms. Holmes: I knew that I wanted to do something that could make a difference in the world.

To me, there was nothing greater that I could build than something that would change the reality in our healthcare system today, which is that when someone you love gets really, really sick, usually by the time you find that out, it’s too late to be able to do something about it. And in those moments it’s heartbreaking, because there is nothing you wouldn’t do.


Read Full Post »

Larry H Bernstein, MD, FCAP, Reporter and curator

αIIbβ3 Antagonists As An Example of Translational Medicine Therapeutics

http://phrmaceuticalintelligence.com/2013-10-12/larryhbern_BS-Coller/αllbβ3 Antagonists As An Example of Translational Medicine Therapeutics

by Barry S. Coller, MD
Rockefeller University


This article is a segment in a series of articles about platelets, platelet function, and advances in applying the surge of knowledge to therapy.  In acute coronary syndromes, plaque rupture leads to thrombotic occlusion.  We have also seen that the development of a plaque occurs in 3 stages, only the last of which involves plaque rupture.  Platelets interact with the vascular endothelium, and platelet-endothelial as well as platelet-platelet interactions are known to be important in atherogenesis.  We learned that platelets are derived from megakaryocytes that fragment, releasing these elements into the blood stream.  It has recently been discovered that platelets can replicate in the circulation.  The turnover of platelets is rapid; platelets are stored at room temperature with shaking and are viable for perhaps only 3-4 days once they are received in the blood bank for use.  In cardiology, the identification, isolation, and characterization of GPIIb/IIIa from the platelet was a huge advance in the potential for coronary intervention, and that potential became of paramount importance with the introduction of GPIIb/IIIa inhibitors as a standard in coronary vascular therapeutic procedures.   The following manuscript by Barry Coller, of Rockefeller University, presents the GPIIb/IIIa story as an excellent example of translational medicine.

The Search for Inhibitors of the αIIbβ3 (GPIIb/IIIa) Receptor

The deliberate search for drugs to inhibit the αIIbβ3 (GPIIb/IIIa) receptor ushered in the era of rationally designed antiplatelet therapy and thus represents an important milestone in the evolution of antiplatelet drug development. The selection of the αIIbβ3 receptor as a therapeutic target rested on a broad base of basic and clinical research conducted by many investigators in the 1960s and 1970s working in the fields of platelet physiology, the rare bleeding disorder Glanzmann thrombasthenia, platelet membrane glycoproteins, integrin receptors, coronary artery pathology, and experimental thrombosis. Thus, αIIbβ3 was found to mediate platelet aggregation by virtually all of the physiologic agonists (e.g., ADP, epinephrine, and thrombin) through a mechanism in which platelet activation by these agents results in a change in the conformation of the receptor. This is followed by increased affinity of the receptor for the multivalent ligands fibrinogen and von Willebrand factor, both of which are capable of binding to receptors on two platelets simultaneously, producing platelet crosslinking and aggregation. At about the same time, experimental studies demonstrated platelet thrombus formation at sites of vascular injury, and biochemical studies in humans demonstrated evidence of platelet activation during acute ischemic cardiovascular events.

Our own studies initially focused on platelet-fibrinogen interactions using an assay in which normal platelets agglutinated fibrinogen-coated beads. The agglutination was enhanced with platelet activators. Platelets from patients with Glanzmann thrombasthenia, who lack the αIIbβ3 receptor, did not agglutinate the beads. We adapted this assay to a microtiter plate system to identify monoclonal antibodies that inhibited platelet-fibrinogen interactions and then demonstrated that these antibodies bound to αIIbβ3. They were also more potent inhibitors of platelet aggregation than any known antiplatelet agent and produced a pattern of aggregation that was virtually identical to that found using platelets from patients with Glanzmann thrombasthenia.

I recognized the theoretical potential of using an antibody to inhibit platelets in vivo but also recognized the challenges and limitations. Since experimental models of thrombosis had been developed in the dog, and since the antibody we initially worked with did not react with dog platelets, we had to go back to our original samples to identify an antibody (7E3) that reacted with dog platelets in addition to human platelets. Since coating platelets with immunoglobulins results in their rapid elimination from the circulation, and since the clearance is mediated by the immunoglobulin Fc region, we prepared F(ab’)2 fragments of 7E3 for our in vivo studies. Additional challenges included preparing large quantities of antibody on a very limited budget and purifying the antibodies so they contained only minimal amounts of endotoxin. With the small amount of 7E3-F(ab’)2 we initially prepared, we were able to show dose-response inhibition of platelet aggregation in three dogs, achieving greater inhibition than with aspirin or ticlopidine, the only antiplatelet agents approved for human use at that time. We also devised an assay using radiolabeled 7E3 to quantify the percentage of platelet αIIbβ3 receptors that were blocked when a specific dose of 7E3-F(ab’)2 was administered in vivo. This allowed us to directly measure the effect of the agent on its target receptor on its target cell.
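The receptor-blockade measurement described above can be sketched numerically. This is an illustrative calculation with invented numbers; the function names, counts, and Kd are my assumptions, not values from the manuscript:

```python
# Minimal sketch of how a radiolabeled-antibody assay estimates receptor
# blockade: after dosing, the labeled probe binds only unblocked receptors,
# so percent blockade is inferred from the drop in labeled binding relative
# to an untreated control. All numbers here are made up for illustration.

def percent_blockade(bound_after_dose_cpm, bound_control_cpm):
    """Receptors blocked (%) from labeled-probe binding (counts per minute)."""
    return 100.0 * (1.0 - bound_after_dose_cpm / bound_control_cpm)

def fraction_occupied(drug_conc_nM, kd_nM):
    """Simple one-site equilibrium binding: occupancy = [D] / ([D] + Kd)."""
    return drug_conc_nM / (drug_conc_nM + kd_nM)

blockade = percent_blockade(bound_after_dose_cpm=2_000, bound_control_cpm=10_000)
print(f"{blockade:.0f}% of receptors blocked")   # 80% in this made-up example
print(f"occupancy at 45 nM, Kd = 5 nM: {fraction_occupied(45, 5):.0%}")
```

The key design point the text makes is that this reads the drug's effect directly on its target receptor on its target cell, rather than through a downstream functional surrogate.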

I considered two criteria most important in selecting the initial animal models in which to test the efficacy and safety of administering 7E3-F(ab’)2:

  • 1) the model had to convincingly simulate a human vascular disease, and
  • 2) aspirin had to have failed to produce complete protection from thrombosis.

The latter criterion was particularly important because I planned to stop this line of research if the 7E3-F(ab’)2 was not more efficacious than aspirin.

Ultimately, we collaborated with Dr. John Folts of the University of Wisconsin, who had developed a dog model of unstable angina by attaching a short cylindrical ring to partially occlude a coronary artery and using a hemostat to induce vascular injury. Pretreatment of the animal with 7E3-F(ab’)2 was more effective than aspirin or any other compound Dr. Folts had previously tested in preventing platelet thrombus formation, as judged by its effects on the characteristic repetitive cycles of platelet deposition and embolization. Electron microscopy of the vessels confirmed the reduction in platelet thrombi by 7E3-F(ab’)2, with only a monolayer of platelets typically deposited.

Dr. Chip Gold and his colleagues at Massachusetts General Hospital had developed a dog model to assess the effects of tissue plasminogen activator (t-PA) on experimental thrombi induced in the dog coronary artery. Although t-PA was effective in lysing the thrombi, the blood vessels rapidly reoccluded with new thrombi that were rich in platelets. Aspirin could not prevent reocclusion, whereas 7E3-F(ab’)2 not only prevented reocclusion, but also increased the speed of reperfusion by t-PA.

The next steps in drug development could not be performed in my laboratory because they required resources far in excess of those in my grant from the National Heart, Lung, and Blood Institute to study basic platelet physiology. As a result, in 1986 the Research Foundation of the State University of New York licensed the 7E3 antibody to Centocor, Inc., a new biotechnology company specializing in the diagnostic and therapeutic application of monoclonal antibodies.

Subsequent Development of 7E3

The subsequent development of 7E3 as a therapeutic agent required extensive collaboration among myself, a large number of outstanding scientists at Centocor, and many leading academic cardiologists. Many decisions and hurdles remained for us, including the decision to develop a mouse/human chimeric 7E3 Fab (c7E3 Fab); the design and execution of the toxicology studies; the assessment of the potential toxicity of 7E3 crossreactivity with αVβ3; the development of sensitive and specific assays to assess immune responses to c7E3 Fab; the design, execution, and analysis of the Phase I, II, and III studies; and the preparation, submission, and presentation of the Product Licensing  Application to the Food and Drug Administration, and comparable documents to European and Scandinavian agencies.

Based on the results of the 2,099-patient EPIC trial, in which conjunctive treatment with a bolus plus infusion of c7E3 Fab significantly reduced the risk of developing an ischemic complication (death, myocardial infarction, or need for urgent intervention) after coronary artery angioplasty or atherectomy in patients at high risk of such complications, the Food and Drug Administration approved the conjunctive use of c7E3 Fab (generic name, abciximab) in high-risk angioplasty and atherectomy on December 22, 1994. Since then it has been administered to more than 2.5 million patients in the U.S., Europe, Scandinavia, and Asia. Its optimal role in treating cardiovascular disease continues to evolve in response to the introduction of new anticoagulants, antiplatelet agents, stents, and procedures.

Extended Investigations

We have also been able to apply the monoclonal antibodies we prepared to αIIbβ3 to the prenatal detection of Glanzmann thrombasthenia, and have used the antibodies as probes for characterizing both the biogenesis of the receptor and the conformational changes that the receptor undergoes with activation. We have been able to precisely map the 7E3 epitope on β3, providing additional insights into the mechanism by which it prevents ligand binding. We have also exploited the ability of another antibody to αIIbβ3 to stabilize the receptor complex in order to facilitate production of crystals of the αIIbβ3 headpiece; the x-ray diffraction properties of these crystals were studied in collaboration with Dr. Timothy Springer’s group at Harvard and provide the first structural information on the receptor.

In landmark studies in the 1980s, Pierschbacher and Ruoslahti demonstrated the importance of the arginine-glycine-aspartic acid (RGD) sequence in the interaction of the integrin α5β1 with fibronectin, and they went on to show that peptides with the RGD sequence could inhibit this interaction. Subsequent studies by many groups demonstrated that these peptides could also inhibit the interaction of platelet αIIbβ3 with fibrinogen and von Willebrand factor. Dr. David Phillips and Dr. Robert Scarborough led the team at Cor Therapeutics that made a cyclic pentapeptide with high selectivity for αIIbβ3 over αVβ3 by patterning their compound on the KGD sequence in the snake venom barbourin. The resulting antiplatelet agent, eptifibatide, received FDA approval in May 1998. At Merck, Dr. Robert Gould led the team that developed the nonpeptide RGD-mimetic tirofiban, which also is selective for αIIbβ3 compared to αVβ3. It also received FDA approval in May 1998. Our recent x-ray crystallographic studies in collaboration with Dr. Springer’s group provided structural information on the mechanisms and sites of binding of these drugs with αIIbβ3.

Translation of Basic Science into Therapy

Many important elements and an enormous amount of good fortune were needed for the translation of the basic science information about platelet aggregation into the drug abciximab, including, but not limited to:

  • 1) the support of basic studies of platelet physiology by the National Institutes of Health in my laboratory and many other laboratories,
  • 2) the creation and ongoing funding of a core facility available to all faculty members to prepare monoclonal antibodies at the State University of New York at Stony Brook under the direction of Dr. Arnold Levine,
  • 3) the 1980 Bayh-Dole Act and its subsequent amendments, and the expertise of the Technology Transfer Office at Stony Brook in licensing 7E3 to Centocor, which then provided the capital and additional expertise required for its development, and
  • 4) the expert and enthusiastic collaboration by two large and disciplined cooperative groups of interventional cardiologists (TAMI, EPIC) under the dynamic leadership of Drs. Eric Topol and Rob Califf,

that were eager to test the safety and efficacy of the 7E3 derivatives. Although the translation of each new scientific discovery into improved health via novel preventive, diagnostic, or therapeutic strategies requires the blazing of a unique path, optimizing these elements and similar ones may allow the path to be shorter and/or to be traversed more easily, at a lower cost, or in a shorter period of time.


Related articles in Pharmaceutical Intelligence:

Platelets in Translational Research – 1   Larry H. Bernstein, MD, FCAP
Platelets in Translational Research – 2  Larry H. Bernstein, MD, FCAP

Do Novel Anticoagulants Affect the PT/INR? The Cases of XARELTO (rivaroxaban) and PRADAXA (dabigatran)
Vivek Lal, MBBS, MD, FCIR, Justin D Pearlman, MD, PhD, FACC and Aviva Lev-Ari, PhD, RN


Read Full Post »

Genome Sequencing of the Healthy

Curators: Larry H. Bernstein, MD, FCAP and Aviva Lev-Ari, PhD, RN


Key Issues in Genome Sequencing of Healthy Individuals
Eric Topol, MD, Genomic Medicine

I briefly review 3 important articles that recently appeared, each touching on important controversies in the use of whole genome sequencing:
1. Should Healthy People Have Their Genomes Sequenced At This Time? Wall Street Journal, February 15, 2013.
2. A Genetic Code for Genius? Wall Street Journal, February 15, 2013.
3. Francke U, Djamco C, Kiefer AK, et al. Dealing with the unexpected: consumer responses to direct-access BRCA mutation testing. PeerJ. 2012;1:e8. DOI 10.7717/peerj.8
Welcome to another segment on genomic medicine. Today, I want to get into 3 different articles: 2 from the Wall Street (“Medical”) Journal and the other from a new open access journal, PeerJ. All of them are related to the issues of genome sequencing.
First, there was a debate about whether all healthy people should have their genomes sequenced. It was a debate between Atul Butte from Stanford and Robert Green from Harvard. In this debate, they made a number of really good points, and I have linked you to that article if you’re interested.
Basically, the question is whether it is too early to get sequencing, because we need millions of healthy people to have whole-genome sequencing in order for that information to be truly informative. The price continues to drop. So even though the sequencing that is done today would still be valid if it’s done accurately, the problem we have, of course, is a lack of enough people who are phenotyped with a particular condition to extract all the best information that is truly informative from whole-genome sequencing.
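To make the sample-size point concrete, here is a rough sketch of my own (not from the debate) using a standard two-proportion approximation at genome-wide significance; the carrier frequencies and effect sizes are invented for illustration:

```python
# Back-of-the-envelope illustration of why rare variants demand huge cohorts:
# an approximate two-proportion sample-size formula for detecting a difference
# in carrier frequency between cases and controls at 80% power, with
# alpha = 5e-8 (a conventional genome-wide significance threshold).
from math import sqrt
from statistics import NormalDist

def n_per_group(p_cases, p_controls, alpha=5e-8, power=0.80):
    """Approximate subjects needed per group (normal approximation)."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p_cases + p_controls) / 2
    num = (z_a * sqrt(2 * p_bar * (1 - p_bar))
           + z_b * sqrt(p_cases * (1 - p_cases)
                        + p_controls * (1 - p_controls))) ** 2
    return num / (p_cases - p_controls) ** 2

# A fairly common variant vs. a rare one, each with the same 1.2x frequency ratio:
print(round(n_per_group(0.12, 0.10)))       # common: roughly tens of thousands per group
print(round(n_per_group(0.0012, 0.0010)))   # rare: on the order of millions per group
```

The hundredfold jump in required cohort size for the rarer variant is the arithmetic behind the claim that millions of sequenced, well-phenotyped genomes are needed before sequencing the healthy becomes broadly informative.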
The second article concerned a project to sequence the genomes of individuals with very high IQ. It’s unlikely that even 2000 individuals with high IQ will be particularly informative, and there is also, of course, the question of what this could do from a bioethical standpoint. I’ll leave that to your imagination and thoughts as to where this could go – that is, trying to understand, even with limited power, the genomics of intelligence.
The third article, which is also very interesting, comes from this new journal called PeerJ. I’m on the editorial board of that journal, and I think it’s terrific to see open access, high-quality biomedical articles.
This one comes from the company 23andMe. Among the very large number of individuals who have had genome scans – now over 200,000 and rapidly approaching 1 million – a large number of women received information about the BRCA gene and whether they carried a significant mutation. From the women who volunteered to participate in this study, we learned that they had no serious psychological repercussions from knowledge of this highly actionable pathogenic BRCA mutation.
This goes along with a previous study that we had done at Scripps, led by my colleague Cinnamon Bloss and published in the New England Journal of Medicine: in thousands of individuals who had genome scans and learned such data as their ApoE4 status for the first time, there were no significant negative psychological repercussions.

Should Healthy People Have Their Genomes Sequenced At This Time?

‘Patients in Waiting’

Injecting so much uncertain genetic information into the doctor-patient relationship could create legions of “patients in waiting” leading to unnecessary tests, harmful outcomes and lifelong anxiety. As private software companies compete to provide more genomic “findings” to a medical culture that is trained to search for diagnostic fire when they smell the smoke of disease risk, there are potential benefits. But there is also a real possibility that medical resources will be squandered and patients could be harmed.

Perhaps we all underestimated how complicated it would be to move genomic knowledge into the practice of medicine and public health. Now is the time to make sure we get this right through rigorous basic and clinical studies that define which mutations are dangerous, and distinguish useful from unnecessary interventions. Soon, genomic insights will give us early warnings about life-threatening illnesses that we may be able to prevent. Soon, standards will be available to guide doctors about which findings are meaningful and which are not.

Soon, there may be evidence to support the benefits of screening healthy individuals. But not today.

Table 1. Performance values for genome sequencing technologies, including Sanger methods and Massively Parallel Sequencing methods. Sinville, R. and Soper, S. A. High resolution DNA separations using microchip electrophoresis. J. Sep. Sci. 2007, 30, 1714–1728; Morozova, O. and Marra, M. A. Applications of next-generation sequencing technologies in functional genomics. Genomics 92 (2008) 255–264.


Read Full Post »

Larry H Bernstein, MD, FACP, Reporter

Clinical Trials
Hi. I’m Dr. Eric Topol, Director of the Scripps Translational Science Institute and Editor-in-Chief of Medscape Genomic Medicine and theheart.org.
In our series The Creative Destruction of Medicine, I’m trying to get into critical aspects of how we can Schumpeter or reboot the future of healthcare by leveraging the big innovations that are occurring in the digital world, including digital medicine.

But one of the things that has been missed along the way is that how we do clinical research will be radically affected as well. We have this big thing about evidence-based medicine and, of course, the sanctimonious randomized, placebo-controlled clinical trial. Well, that’s great if one can do that, but often we’re talking about needing thousands, if not tens of thousands, of patients for these types of clinical trials. And things are changing so fast with respect to medicine and, for example, genomically guided interventions that it’s going to become increasingly difficult to justify these very large clinical trials.

For example, there was a drug trial for melanoma targeting a mutation of BRAF, the gene that is mutated in about 60% of people with malignant melanoma. When that trial was done, there was a placebo control, and there was a big ethical charge asking whether it is justifiable to have a body count. This was a matched drug for the biology underpinning metastatic melanoma, which is essentially a fatal condition within 1 year, and researchers were giving some individuals a placebo.

Would we even do that kind of trial in the future when we now have such elegant matching of the biological defect and the specific drug intervention? A remarkable example of a trial of the future was announced in May.[1] For this trial, the National Institutes of Health is working with [Banner Alzheimer’s Institute] in Arizona, the University of Antioquia in Colombia, and Genentech to have a specific mutation studied in a large extended family living in the country of Colombia in South America. There is a family of 8000 individuals who have the so-called Paisa mutation, a presenilin gene mutation, which results in every member of this family developing dementia in their 40s.


Read Full Post »

Reporter: Aviva Lev-Ari, PhD, RN

From Medscape: Topol on The Creative Destruction of Medicine

Topol: Consumer-Driven Healthcare Is an Uncomfortable Concept

Eric J. Topol, MD

Posted: 09/17/2012

Hi. I’m Dr. Eric Topol, Director of the Scripps Translational Science Institute and Editor-in-Chief of Medscape Genomic Medicine and theheart.org.

In this series, The Creative Destruction of Medicine, emanating from the book I wrote, I am trying to zoom in on critical aspects of how the digital world will create better healthcare. The segment that we are getting into today is on consumer-driven healthcare.

This is a concept that a lot of physicians are very uncomfortable with. If you go back to the Gutenberg printing press, it was only then, at the close of the Middle Ages, that the Bible and all printed information could be read by people other than the high priests. In fact, that’s an analogy for what is going to happen in medicine, because until now there has been this tremendous information asymmetry.

Essentially, all the data, information, and knowledge were in the domain of doctors and healthcare professionals, while the consumer, patient, and individual was out there without that information – without even their own data. But that’s changing very quickly.

Patients will have the capability of accessing notes from an office visit and hospital records, as well as laboratory data and DNA sequencing — and on one’s smartphone, for example, blood pressure and glucose and all the key physiologic metrics.

When each individual has access to all this critical data, there will be a real shakeup to the old way that medicine was practiced. In the past, the Internet was supposed to be empowering for consumers, but that really didn’t matter because what the consumer could get through the Internet was data about a population. Now, one can get data about oneself, and, of course, a center hub for that data-sharing will be the smartphone.

Even critical information based on one’s genomic sequencing, such as drug interactions, will have a whole different look. We’ve already learned so much about the direct-to-consumer movement from the pharmaceutical industry in which patients were directed to go to their doctors and ask them for a prescription drug. That had a very powerful impact.

But in the future, with each person potentially armed with so much data and information, the role of the doctor is a very different one: It is to provide guidance, wisdom, knowledge, and judgment and, of course, the critical aspects of compassion, empathy, and communication. That is a whole different look for the consumer-driven healthcare world of the future.

Thanks so much for your attention to this segment. We will be back with more on The Creative Destruction of Medicine.



Read Full Post »

Reporter: Aviva Lev-Ari, PhD, RN

Genomics and the State of Science Clarity

Projects supported by the US National Institutes of Health will have produced 68,000 total human genomes — around 18,000 of those whole human genomes — through the end of this year, National Human Genome Research Institute estimates indicate. And in his book, The Creative Destruction of Medicine, the Scripps Research Institute‘s Eric Topol projects that 1 million human genomes will have been sequenced by 2013 and 5 million by 2014.

“There’s a lot of inventory out there, and these things are being generated at a fiendish rate,” says Daniel MacArthur, a group leader in Massachusetts General Hospital‘s Analytic and Translational Genetics Unit. “From a capacity perspective … millions of genomes are not that far off. If you look at the rate that we’re scaling, we can certainly achieve that.”

The prospect of so many genomes has brought clinical interpretation into focus — and for good reason. Save for regulatory hurdles, it seems to be the single greatest barrier to the broad implementation of genomic medicine.

But there is an important distinction to be made between the interpretation of an apparently healthy person’s genome and that of an individual who is already affected by a disease, whether known or unknown.

In an April Science Translational Medicine paper, Johns Hopkins University School of Medicine‘s Nicholas Roberts and his colleagues reported that personal genome sequences for healthy monozygotic twin pairs are not predictive of significant risk for 24 different diseases in those individuals. The researchers then concluded that whole-genome sequencing was not likely to be clinically useful for that purpose. (See sidebar, story end.)

“The Roberts paper was really about the value of omniscient interpretation of whole-genome sequences in asymptomatic individuals and what were the likely theoretical limits,” says Isaac Kohane, chair of the informatics program at Children’s Hospital Boston. “That was certainly an important study, and it was important to establish what those limits of knowledge are in asymptomatic populations. But, in fact, the major and most important use cases [for whole-genome sequencing] may be in cases of disease.”

Still, targeted clinical interpretations are not cut and dried. “Even in cases of disease, it’s not clear that we know now how to look across multiple genes and figure out which are relevant, which are not,” Kohane adds.

While substantial progress has been made — in particular, for genetic diseases, including certain cancers — ambiguities have clouded even the most targeted interpretation efforts to date. Technological challenges, meager sample sizes, and a need for increased, fail-safe automation all have hampered researchers’ attempts to reliably interpret the clinical significance of genomic variation. But perhaps the greatest problem, experts say, is a lack of community-wide standards for the task.

Genes to genomes

When scientists analyzed James Watson’s genome — his was the first personal sequence, completed in 2007 and published in Nature in 2008 — they were surprised to find that he harbored two putative homozygous SNPs matching Human Gene Mutation Database entries that, were they truly homozygous, would have produced severe clinical phenotypes.

But Watson was not sick.

As researchers search more and more genomes, such inconsistencies are increasingly common.

“My take on what has happened is that the people who were doing the interpretation of the raw sequence largely were coming from a SNPs world, where they were thinking about sequence variants that have been observed before, or that have an appreciable frequency, and weren’t thinking very much about the singleton sequence variants,” says Sean Tavtigian, associate professor of oncology at the University of Utah.

“There is a qualitative difference between looking at whole-genome sequences and looking at single genes or, even more typically, small numbers of variants that have been previously implicated in a disease,” Boston’s Kohane adds.
“Previously, because of the cost and time limitations around sequencing and genotyping, we only looked at variants in genes for which we had a clinical indication. Now, since we can essentially see that in the near future we will be able to do a full genome sequence for essentially the same cost as just a focused set-of-variants test, all of a sudden we have to ask ourselves: What is the meaning of variants that fall outside where we would have ordinarily looked for a given disease or, in fact, if there is no disease at all?”

Mass General’s MacArthur says it has been difficult to pinpoint causal variants because they are enriched for both sequencing and annotation errors. “In the genome era, we can generate those false positives at an amazing rate, and we need to work hard to filter them back out,” he says.

“Clinical geneticists have been working on rare diseases for a long time, and have identified many genes, and are used to working in a world where there is sequence data available only from, say, one gene with a strong biological hypothesis. Suddenly, they’re in this world where they have data from patients on all 20,000 genes,” MacArthur adds. “There’s a fundamental mind-shift there, in shifting from one gene through to every gene. My impression is that the community as a whole hasn’t really internalized that shift; people still have a sense in their head that if you see a strongly damaging variant that segregates with the disease, and maybe there’s some sort of biological plausibility around it as well, that that’s probably the causal variant.”

Studies have shown that that’s not necessarily so. Because of this, “I do worry that in the next year or so we’ll see increasing numbers of mutations published that later prove to just be benign polymorphisms,” MacArthur adds.

“The meaning of whole-genome sequence I think is very much front-and-center of where genomics is going to go. What is the true, clinical meaning? What is the interpretation? And, there’s really a double-edged sword,” Kohane says. On one hand, “if you only focus on the genes that you believe are relevant to the condition you’re studying, then you might miss some important findings,” he says. Conversely, “if you look at everything, the likelihood of a false positive becomes very, very high. Because, if you look at enough things, invariably you will find something abnormal,” he adds.

False positives are but one of several challenges facing scientists working to analyze genomes in a clinical context.

Technical difficulties

That advances in sequencing technologies are far outstripping researchers’ abilities to analyze the data they produce has become a truism of the field. But current sequencing platforms are still far from perfect, making most analyses complicated and nuanced. Among other things, improvements in both read length and quality are needed to enable accurate and reproducible interpretations.

“The most promising thing is the rate at which the cost-per-base-pair of massively parallel sequencing has dropped,” Utah’s Tavtigian says. Still, the cost of clinical sequencing is not inconsequential. “The $1,000, $2,000, $3,000 whole-genome sequences that you can do right now do not come anywhere close to 99 percent probability to identify a singleton sequence variant, especially a biologically severe singleton sequence variant,” he says. “Right now, the real price of just the laboratory sequencing to reach that quality is at least $5,000, if not $10,000.”

However, Tavtigian adds, “techniques for multiplexing many samples into a channel for sequencing have come along. They’re not perfect yet, but they’re going to improve over the next year or so.”

Using next-generation sequencing platforms, researchers have uncovered a variety of SNPs, copy-number variants, and small indels. But to MacArthur’s mind, current read lengths are not up to par when it comes to clinical-grade sequencing, and they have made additional quality-control measures necessary.

“There’s no question that we’re already seeing huge improvements. … And as we add in to that changes in technology — for instance much, much longer sequencing reads, more accurate reads, possibly combining different platforms — I think these sorts of [quality-control] issues will begin to go away over the next couple of years,” MacArthur says. “But at this stage, there is still a substantial quality-control component in any sort of interpretation process. We don’t have perfect genomes.”

In a 2011 Nature Biotechnology paper, Stanford University’s Michael Snyder and his colleagues sought to examine the accuracy and completeness of single-nucleotide variant and indel calls from both the Illumina and Complete Genomics platforms by sequencing the genome of one individual using both technologies. Though the researchers found that more than 88 percent of the unique single-nucleotide variants they detected were concordant between the two platforms, only around one-quarter of the indel calls they generated matched up. Overall, the authors reported having found tens of thousands of platform-specific variant calls, around 60 percent of which they later validated by genotyping array.
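The concordance bookkeeping behind a comparison like Snyder’s can be sketched with simple set operations; the variant records below are made-up illustrations, not data from the paper:

```python
# Illustrative sketch: cross-platform concordance of variant calls.
# Each call is (chromosome, position, ref, alt); the tuples below are
# invented examples, not data from the Snyder et al. study.

illumina_calls = {
    ("chr1", 10177, "A", "AC"),
    ("chr1", 10352, "T", "TA"),
    ("chr2", 21280, "G", "A"),
    ("chr3", 52144, "C", "T"),
}
complete_genomics_calls = {
    ("chr1", 10177, "A", "AC"),
    ("chr2", 21280, "G", "A"),
    ("chr4", 88321, "T", "C"),
}

concordant = illumina_calls & complete_genomics_calls
union = illumina_calls | complete_genomics_calls

# Fraction of all unique calls detected on both platforms
concordance = len(concordant) / len(union)
platform_specific = union - concordant

print(f"Concordance: {concordance:.2f}")
print(f"Platform-specific calls: {len(platform_specific)}")
```

Real comparisons must also normalize variant representation (left-alignment, multi-allelic splitting) before intersecting, which is a large part of why indel concordance is so much lower than SNV concordance.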

For clinical sequencing to ever become widespread, “we’re going to have to be able to show the same reproducibility and test characteristic modification as we have for, let’s say, an LDL cholesterol level,” Boston’s Kohane says. “And if you measure it in one place, it should not be too different from another place. … Even before we can get to the clinical meaning of the genomes, we’re going to have to get some industry-wide standards around quality of sequencing.”
Scripps’ Topol adds that when it comes to detecting rare variants, “there still needs to be a big upgrade in accuracy.”

Analytical issues

Beyond sequencing, technological advances must also be made on the analysis end. “The next thing, of course, is once you have better accuracy … being able to do all of the analytical work,” Topol says. “We’re getting better at the exome, but everything outside of protein-coding elements, there’s still a tremendous challenge.”

Indeed, that challenge has inspired another — a friendly competition among bioinformaticians working to analyze pediatric genomes in a pedigree study.

With enrollment closed and all sequencing completed, participants in the Children’s Hospital Boston-sponsored CLARITY Challenge have rolled up their shirtsleeves and begun to dig into the data — de-identified clinical summaries and exome or whole-genome sequences generated by Complete Genomics and Life Technologies for three children affected by rare diseases of unknown genetic basis, and their parents. According to its organizers, the competition aims to help set standards for genomic analysis and interpretation in a clinical setting, and for returning actionable results to clinicians and patients.

“A bunch of teams have signed up to provide clinical-grade reports that will be checked by a blue-ribbon panel of judges later this year to compare and contrast the different forms of clinical reporting at the genome-wide level,” Kohane says. The winning team will be announced this fall and will receive a $25,000 prize, he adds.

While the competition covers all aspects of clinical sequencing — from readout to reporting — it is important to recognize that, more generally, there may not be one right answer and that the challenges are far-reaching, affecting even the most basic aspects of analysis.

“There is a lot of algorithm investment still to be made in order to get very good at identifying the very rare or singleton sequence variants from the massively parallel sequencing reads efficiently, accurately, [and with] sensitivity,” Utah’s Tavtigian says.

Picking up a variant that has been seen before is one thing, but detecting a potentially causal, though as-yet-unclassified variant is a beast of another nature.

“Novel mutations usually need extensive knowledge but also validation. That’s one of the challenges,” says Zhongming Zhao, associate professor of biomedical informatics at Vanderbilt University. “Validation in terms of a disease study is most challenging right now, because it is very time-consuming, and usually you need to find a good number of samples with similar disease to show this is not by chance.”

Search for significance

Much as sequencing a human genome is now far less laborious than it was in the early to mid-2000s, genome interpretation has also become increasingly automated.

Beyond standard quality-control checks, the process of moving from raw data to calling variants is now semiautomatic. “There’s essentially no manual intervention required there, apart from running our eyes over [the calls], making sure nothing has gone horribly wrong,” says Mass General’s MacArthur. “The step that requires manual intervention now is all about taking that list of variants that comes out of that and looking at all the available biological data that exists on the Web, [coming] up with a short-list of genes, and then all of us basically have a look at all sorts of online resources to see if any of them have some kind of intuitive biological profile that fits with the disease we’re thinking about.”
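The manual step MacArthur describes — collapsing a raw variant list into a reviewable shortlist — can be caricatured in a few lines. The field names, thresholds, and consequence categories here are illustrative assumptions, not his group’s actual pipeline:

```python
# Minimal sketch of the post-calling filtering step: reduce a list of
# called variants to a shortlist worth manual review. Field names,
# thresholds, and gene names are hypothetical.

SEVERE_CONSEQUENCES = {"stop_gained", "frameshift", "splice_donor"}

def shortlist(variants, max_pop_freq=0.001):
    """Keep rare variants with a severe predicted consequence."""
    return [
        v for v in variants
        if v["pop_freq"] <= max_pop_freq
        and v["consequence"] in SEVERE_CONSEQUENCES
    ]

calls = [
    {"gene": "GENE_A", "pop_freq": 0.0,    "consequence": "stop_gained"},
    {"gene": "GENE_B", "pop_freq": 0.15,   "consequence": "stop_gained"},
    {"gene": "GENE_C", "pop_freq": 0.0002, "consequence": "missense"},
    {"gene": "GENE_D", "pop_freq": 0.0,    "consequence": "frameshift"},
]

short = shortlist(calls)
print([v["gene"] for v in short])  # common and mild variants drop out
```

The hard part, as the quote notes, is not this filter but what comes after it: deciding which of the surviving candidates is actually causal.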

Of course, intuitive leads are not foolproof, nor are current mutation databases. (See sidebar, story end.) And so, MacArthur says, “we need to start replacing the sort of intuitive biological approach with a much more data-informed approach.”

Developing such an approach hinges in part on having more genomes. “If we get thousands — tens of thousands — of people sequenced with various different phenotypes that have been crisply identified, that’s going to be so important because it’s the coupling of the processing of the data with having rare variants, structural variants, all the other genomic variations to understand the relationship of whole-genome sequence of any particular phenotype and a sequence variant,” Scripps’ Topol says.

Vanderbilt’s Zhao says that sample size is still an issue. “Right now, the number of samples in each whole-genome sequencing-based publication is still very limited,” he says. At the same time, he adds, “when I read peers’ grant applications, they are proposing more and more whole-genome sequencing.”

When it comes to disease studies, sequencing a whole swath of apparently healthy people is not likely to ever be worthwhile. According to Utah’s Tavtigian, “the place where it is cost-effective is when you test cases and then, if something is found in the case, go on and test all of the first-degree relatives of the case — reflex testing for the first-degree relatives,” he says. “If there is something that’s pathogenic for heart disease or colon cancer or whatever is found in an index case, then there is a roughly 50 percent chance that the first-degree relatives are going to carry the same thing, whereas if you go and apply that same test to someone in the general population, the probability that they carry something of interest is a lot lower.”
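Tavtigian’s cost-effectiveness argument is back-of-envelope arithmetic; a minimal sketch, with the general-population carrier frequency a made-up illustrative figure:

```python
# Back-of-envelope sketch of the reflex-testing argument: a first-degree
# relative of a positive index case has ~50 percent odds of carrying the
# same variant, versus a much lower carrier frequency (hypothetical
# figure here) in the general population.

p_relative = 0.5       # first-degree relative of a positive index case
p_population = 0.001   # assumed carrier frequency, general population

# Expected positives per 1,000 tests under each strategy
yield_reflex = 1000 * p_relative
yield_screening = 1000 * p_population

print(f"Reflex testing: {yield_reflex:.0f} positives per 1,000 tests")
print(f"Population screening: {yield_screening:.0f} positives per 1,000 tests")
```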

But more genomes, even familial ones, are not the only missing elements. To fill in the functional blanks, researchers require multiple data types.

“We’ve been pretty much sequence-centric in our thinking for many years now because that was where the attention [was],” Topol says. “But that leaves the other ‘omes out there.”

From the transcriptome to the proteome, the metabolome, the microbiome, and beyond — Topol says that because all the ‘omes contribute to human health, they all merit review.

“The ability to integrate information about the other ‘omics will probably be a critical direction to understand the underpinnings of disease,” he says. “I call it the ‘panoromic’ view — that is really going to become a critical future direction once we can do those other ‘omics readily. We’re quite a ways off from that right now.”

Mass General’s MacArthur envisages “rolling in data from protein-protein interaction networks and tissue expression data — pulling all of these together into a model that predicts, given the phenotype, given the systems that appear to be disrupted by this variant, what are the most likely set of genes to be involved,” he says. From there, whittling that set down to putative causal variants would be simpler.

“And at the end of that, I think we’ll end up with a relatively small number of variants, each of which has a probability score associated with it, along with a whole host of additional information that a clinician can just drill down into in an intuitive way in making a diagnosis in that individual,” he adds.

According to MacArthur, “we’re already moving in this direction — in five years I think we will have made substantial progress toward that.” He adds, “I certainly think within five years we will be diagnosing the majority of severe genetic disease patients; the vast majority of those we’ll be able to assign a likely causal variant using this type of approach.”

Tavtigian, however, highlights a potential pitfall. While he says that “integration of those [multivariate] data helps a lot with assessing unclassified variants,” it is not enough to help clinicians ascertain causality. Functional assays, which can be both inconclusive and costly, will be needed for some unclassified variant hits, particularly those that are thought to be clinically meaningful.

“I don’t see how you’re going to do a functional assay for less than like $1,000,” he says. “That means that unless the cost of the sequencing test also includes a whole bunch of money for assessing the unclassified variants, a sequencing test is going to create more of a mess than it cleans up.”

Rare, common

Despite the challenges, there have been plenty of clinical sequencing success stories. Already, Scripps’ Topol says, there have been “two big fronts in 2012: One is the unknown diseases [and] the other one, of course, is cancer.” But scientists say that whole-genome sequencing might also become clinically useful for asymptomatic individuals in the future.

Down the line, scientists have their sights set on sequencing asymptomatic individuals to predict disease risk. “The long-term goal is to have any person walk off the street, be able to take a look at their genome and, without even looking at them clinically, say: ‘This is a person who will almost certainly have phenotype X,'” MacArthur says. “That is a long way away. And, of course, there are many phenotypes that can’t be predicted from genetic data alone.”

Nearer term, Boston’s Kohane imagines that newborns might have their genomes screened for a number of neonatal or pediatric conditions.

Overall, he says, it’s tough to say exactly where all of the chips might fall. “It’s going to be an interesting few years where the sequencing companies will be aligning themselves with laboratory testing companies and with genome interpretation companies,” Kohane says.

Even if clinical sequencing does not show utility for cases other than genetic diseases, it could still become common practice.

“Worldwide, there are certainly millions of people with severe diseases that would benefit from whole-genome sequencing, so the demand is certainly there,” MacArthur says. “It’s just a question of whether we can develop the infrastructure that is required to turn the research-grade genomes that we’re generating at the moment into clinical-grade genomes. Given the demand and the practical benefit of having this information … I don’t think there is any question that we will continue to drive, pretty aggressively, towards large-scale genome sequencing.”

Kohane adds that “although rare diseases are rare, in aggregate they’re actually not — 5 percent of the population, or 1 in 20, is beginning to look common.”

Despite conflicting reports as to its clinical value, given the rapid declines in cost, Kohane says it’s possible that a whole-genome sequence could be less expensive than a CT scan in the next five years. Confident that many of the interpretation issues will be worked out by then, he adds, “this soon-to-be-very-inexpensive test will actually have a lot of clinical value in a variety of situations. I think it will become part [of] the decision procedure of most doctors.”

[Sidebar] ‘Predictive Capacity’ Challenged

In Science Translational Medicine in April, Johns Hopkins University School of Medicine’s Nicholas Roberts and his colleagues showed that personal genome sequences for healthy monozygotic twin pairs are not predictive of significant risk for 24 different diseases in those individuals and concluded that whole-genome sequencing was unlikely to be useful for that purpose.

As the Scripps Research Institute’s Eric Topol says, that Roberts and his colleagues examined the predictive capacity of personal genome sequencing “without any genome sequences” was but one flaw of their interpretation.

In a comment appearing in the same journal in May, Topol elaborated on this criticism, and noted that the Roberts et al. study essentially showed nothing new. “We cannot know the predictive capacity of whole-genome sequencing until we have sequenced a large number of individuals with like conditions,” Topol wrote.

Elsewhere in the journal, Tel Aviv University’s David Golan and Saharon Rosset noted that slightly tweaking the gene-environment parameters of the mathematical model used by Roberts et al. showed that the “predictive capacity of genomes may be higher than their maximal estimates.”

Colin Begg and Malcolm Pike from Memorial Sloan-Kettering Cancer Center also commented on the study in Science Translational Medicine, reporting their alternative calculation of the predictive capacity of personal sequencing and their analysis of cancer occurrence in the second breast of breast cancer patients, both of which, they wrote, “offer a more optimistic view of the predictive value of genetic data.”

In response to those comments, Bert Vogelstein — who co-authored the Roberts et al. study — and his colleagues wrote in Science Translational Medicine that their “group was the first to show that unbiased genome-wide sequencing could illuminate the basis for a hereditary disease,” adding that they are “acutely aware of its immense power to elucidate disease pathogenesis.” However, Vogelstein and his colleagues also said that recognizing the potential limitations of personal genome sequencing is important to “minimize false expectations and foster the most fruitful investigations.”

[Sidebar] ‘The Single Biggest Problem’

That there is currently no comprehensive, accurate, and openly accessible database of human disease-causing mutations “is the single greatest failure of modern human genetics,” Massachusetts General Hospital’s Daniel MacArthur says.

“We’ve invested so much effort and so much money in researching these Mendelian diseases, and yet we have never managed as a community to centralize all of those mutations in a single resource that’s actually useful,” MacArthur says. While he notes that several groups have produced enormously helpful resources and that others are developing more, currently “none covers anywhere close to the whole of the literature with the degree of detail that is required to make an accurate interpretation.”

Because of this, he adds, researchers are pouring time and resources into rehashing one another’s efforts and chasing down false leads.

“As anyone at the moment who is sequencing genomes can tell you, when you look at a person’s genome and you compare it to any of these databases, you find things that just shouldn’t be there — homozygous mutations that are predicted to be severe, recessive, disease-causing variants and dominant mutations all over the place, maybe a dozen or more, that they’ve seen in every genome,” MacArthur says. “Those things are clearly not what they claim to be, in the sense that a person isn’t sick.” Most often, he adds, the researchers who reported that variant as disease-causing were mistaken. Less commonly, the database moderators are at fault.

“The single biggest problem is that the literature contains a lot of noise. There are things that have been reported to be mutations that just aren’t. And, of course, a lot of the databases are missing a lot of mutations as well,” MacArthur adds. “Until we have a complete database of severe disease mutations that we can trust, genome interpretation will always be far more complicated than it should be.”

Tracy Vence is a senior editor of Genome Technology.



NIST Consortium Embarks on Developing ‘Meter Stick of the Genome’ for Clinical Sequencing

September 05, 2012

The National Institute of Standards and Technology has founded a consortium, called “Genome in a Bottle,” to develop reference materials and performance metrics for clinical human genome sequencing.

Following an initial workshop in April, consortium members – which include stakeholders from industry, academia, and the government – met at NIST last month to discuss details and timelines for the project.

The current aim is to have the first reference genome — consisting of genomic DNA for a specific human sample and whole-genome sequencing data with variant calls for that sample — available by the end of next year, and another, more complete version by mid-2014.

“At present, there are no widely accepted genomics standards or quantitative performance metrics for confidence in variant calling,” the consortium wrote in its work plan, which was discussed at the meeting. Its main motivation is “to develop widely accepted reference materials and accompanying performance metrics to provide a strong scientific foundation for the development of regulations and professional standards for clinical sequencing.”

“This is like the meter stick of the genome,” said Marc Salit, leader of the Multiplexed Biomolecular Science group in NIST’s Materials Measurement Laboratory and one of the consortium’s organizers. He and his colleagues were approached by several vendors of next-generation sequencing instrumentation about the possibility of generating standards for assessing the performance of next-gen sequencing in clinical laboratories. The project, he said, will focus on whole-genome sequencing but will also include targeted sequencing applications.

The consortium, which receives funding from NIST and the Food and Drug Administration, is open for anyone to participate. About 100 people, representing 40 to 50 organizations, attended last month’s meeting, among them representatives from Illumina, Life Technologies, Pacific Biosciences, Complete Genomics, the FDA, the Centers for Disease Control and Prevention, commercial and academic clinical laboratories, and a number of large-scale sequencing centers.

Four working groups will be responsible for different aspects of the project: a group led by Andrew Grupe at Celera will select and design the reference materials; a group headed by Elliott Margulies at Illumina will characterize the reference materials experimentally, using multiple sequencing platforms; Steve Sherry at the National Center for Biotechnology Information is heading a bioinformatics, data integration, and data representation group to analyze and represent the experimental data; and Justin Johnson from EdgeBio is in charge of a performance metrics and “figures of merit” group to help laboratories use the reference materials to characterize their own performance.

The reference materials will include both human genomic DNA and synthetic DNA that can be used as spike-in controls. Eventually, NIST plans to release the references as Standard Reference Materials that will be “internationally recognized as certified reference materials of higher order.”

According to Salit, there was some discussion at the meeting about what sample to select for a national reference genome. The initial plan was to use a HapMap sample – NA12878, a female from the CEPH pedigree from Utah – but it turned out that HapMap samples are consented for research use only and not for commercial use, for example in an in vitro diagnostic or for potential re-identification from sequence data.

The genome of NA12878 has already been extensively characterized, and the CDC is developing it as a reference for clinical laboratories doing targeted sequencing. “We were going to build on that momentum and make our first reference material the same genome,” Salit said. But because of the consent issues, NIST’s institutional review board and legal experts are currently evaluating whether the sample can be used.

In the meantime, consortium members have been “quite enthusiastic” about using samples from Harvard University’s Personal Genome Project, which are broadly consented, Salit said.

The reference material working group issued a recommendation to develop a set of genomes from eight ethnically diverse parent-child trios as references, he said. For cancer applications, the references may also potentially include a tumor-normal pair.

The consortium will characterize all reference materials by several sequencing platforms. Several instrument vendors, as well as a couple of academic labs, have offered to contribute to data production. According to Justin Zook, a biomedical engineer at NIST and another organizer of the consortium, the current plan is to use sequencing technology from Illumina, Life Technologies, Complete Genomics, and – at least for the first genome – PacBio. Some of the sequencing will be done internally at NIST, which has Life Tech’s 5500 and Ion Torrent PGM available. In addition, the consortium might consider fosmid sequencing, which would provide phasing information and lower the error rate, as well as optical mapping to gain structural information, Zook said.

He and his colleagues have developed new methods for calling consensus variants from different data sets already available for the NA12878 sample, which they are planning to submit for publication in the near future. A fraction of the genotype calls will be validated using other methods, such as microarrays and Sanger sequencing. Consensus genotypes with associated confidence levels will eventually be released publicly as NIST Reference Data.
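The consensus-calling methods themselves are not yet published, but the general idea — combining per-platform calls for a site into a consensus genotype with an attached confidence — can be sketched as a naive majority vote. This is an illustration of the concept, not NIST’s actual algorithm:

```python
# Hedged sketch of consensus genotype calling across platforms: majority
# vote plus an agreement-based confidence, standing in for the
# probabilistic estimates NIST is developing. Genotypes use VCF-style
# "ref/alt" strings.

from collections import Counter

def consensus(genotype_calls):
    """Return (consensus genotype, confidence) from per-platform calls.

    Confidence here is simply the fraction of platforms that agree.
    """
    counts = Counter(genotype_calls)
    genotype, votes = counts.most_common(1)[0]
    return genotype, votes / len(genotype_calls)

# Calls for one site from three hypothetical platforms
gt, conf = consensus(["0/1", "0/1", "1/1"])
print(gt, round(conf, 2))
```

A real implementation would weight platforms by their known error profiles rather than counting them equally, which is exactly where the probabilistic confidence estimates come in.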

An important part of NIST’s work on the data analysis will be to develop probabilistic confidence estimates for the variant calls. It will also be important to distinguish between homozygous reference genotypes and areas in the genome “where you’re not sure what the genotype is,” Zook said, adding that this will require new data formats.

Coming up with confidence estimates for the different types of variants will be challenging, Zook said, particularly for indels and structural variants. Also, representing complex variants has not been standardized yet.

Several meeting participants called for “reproducible research and transparency in the analysis,” Salit said, and there were discussions about how to implement that at the technical level, including data archives so anyone can re-analyze the reference data.

One of the challenges will be to establish the infrastructure for hosting the reference data, which will require help from the NCBI, Salit said. Also, analyzing the data collaboratively is “not a solved problem,” and the consortium is looking into cloud computing services for that.

The consortium will also develop methods that describe how to use the reference materials to assess the performance of a particular sequencing method, including both experimental protocols and open source software for comparing genotypes. “We could throw this over the fence and tell someone, ‘Here is the genome and here is the variant table,'” Salit said, but, he noted, the consortium would like to help clinical labs use those tools to understand their own performance.
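Benchmarking a lab’s calls against the reference “truth” set — the kind of comparison the consortium’s open-source tools would support — reduces to a few set operations; the variant tuples here are invented examples:

```python
# Sketch of benchmarking a lab's variant calls against a reference
# ("truth") call set. Variants are (chromosome, position, ref, alt)
# tuples; the values are illustrative.

truth = {("chr1", 100, "A", "G"), ("chr1", 200, "C", "T"),
         ("chr2", 300, "G", "A"), ("chr2", 400, "T", "C")}
lab_calls = {("chr1", 100, "A", "G"), ("chr2", 300, "G", "A"),
             ("chr3", 500, "A", "T")}

tp = len(truth & lab_calls)   # true positives: called and in the truth set
fn = len(truth - lab_calls)   # truth variants the lab missed
fp = len(lab_calls - truth)   # lab calls absent from the truth set

sensitivity = tp / (tp + fn)
precision = tp / (tp + fp)
print(f"Sensitivity: {sensitivity:.2f}, Precision: {precision:.2f}")
```

Note that genuine false positives can only be scored in regions where the reference material is confidently genotyped, which is why distinguishing "homozygous reference" from "no-call" regions, as Zook describes, matters so much.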

Edge Bio’s Johnson, who is chairing the working group in charge of this effort, is also involved in developing bioinformatic tools to judge the quality of genomes for the Archon Genomics X Prize (CSN 11/2/2011). Salit said that NIST is “leveraging some excellent work coming out of the X Prize” and is collaborating with a member of the X Prize team on the consensus genotype calling project.

By the end of 2013, the consortium wants to have its first “genome in a bottle” and reference data with SNV and maybe indel calls available, which will not yet include all confidence estimates. Another version, to be released in mid-2014, will include further analysis of error rates and uncertainties, as well as additional types of variants, such as structural variation.

Julia Karow tracks trends in next-generation sequencing for research and clinical applications for GenomeWeb’s In Sequence and Clinical Sequencing News. E-mail her here or follow her GenomeWeb Twitter accounts at @InSequence and @ClinSeqNews.

At AACC, NHGRI’s Green Lays out Vision for Genomic Medicine

July 16, 2012

LOS ANGELES – The age of genomic medicine is within “striking distance,” Eric Green, director of the National Human Genome Research Institute, told attendees of the American Association of Clinical Chemistry’s annual meeting here on Sunday.

Speaking at the conference’s opening plenary session, Green discussed NHGRI’s roadmap for moving genomic findings into clinical practice. While this so-called “helix to healthcare” vision may take many years to fully materialize, “I predict absolutely that it’s coming,” he said.

Green noted that rapid advances in DNA sequencing have put genomics on a similar development path as clinical chemistry, which is also a technology-driven field. “If you look over the history of clinical chemistry, whenever there were technology advances, it became incredibly powerful and new opportunities sprouted up left and right,” he said.

Green likened next-gen sequencing to the autoanalyzers that “changed the face of clinical chemistry” by providing a generic platform that enabled a range of applications. In a similar fashion, low-cost sequencing is becoming a “general purpose technology” that can not only read out DNA sequence but can also provide information about RNA, epigenetic modifications, and other associated biology, he said.

The “low-hanging fruit” for genomic medicine is cancer, where molecular profiling is already being used alongside traditional histopathology to provide information on prognosis and to help guide treatment, he said.

Another area where Green said that genomic medicine is already bearing fruit is pharmacogenomics, where genomic data is proving useful in determining which patients will respond to specific drugs.

Nevertheless, while it’s clear that “sequencing is already altering the clinical landscape,” Green urged caution. “We have to manage expectations and realize it’s going to be many years from going from the most basic information about our genome sequence to actually changing medical care in any serious way,” he said.

In particular, he noted that the clinical interpretation of genomic data is still a challenge. Not only are the data volumes formidable, but the functional role of most variants is still unknown, he noted.

This knowledge gap should be addressed over the next several years as NHGRI and other organizations worldwide sequence “hundreds of thousands” of human genomes as part of large-scale research studies.

“We’re increasingly thinking about how to use that data to actually do clinical care, but I want to emphasize that the great majority of this data being generated will and should be part of research studies and not part of primary clinical care quite yet,” Green said.



Startup Aims to Translate Hopkins Team’s Cancer Genomics Expertise into Patient Care

May 16, 2012

Researchers at Johns Hopkins University who helped pioneer cancer genome sequencing have launched a commercial effort intended to translate their experience into clinical care.

Personal Genome Diagnostics, founded in 2010 by Victor Velculescu and Luis Diaz, aims to commercialize a number of cancer genome analysis methods that have been developed at Hopkins over the past several decades. Velculescu, chief scientific officer of PGDx, is director of cancer genetics at the Ludwig Center for Cancer Genetics and Therapeutics at Hopkins; while Diaz, chief medical officer of the company, is director of translational medicine at the Ludwig Center.

Other founders include Ludwig Center Director Bert Vogelstein as well as Hopkins researchers Ken Kinzler, Nick Papadopoulos, and Shibin Zhou. The team has led a number of seminal cancer sequencing projects, including the first effort to apply large-scale sequencing to cancer genomes, one of the first cancer exome sequencing studies, and the discovery of a number of cancer-related genes, including TP53, PIK3CA, APC, IDH1 and IDH2.

Velculescu told Clinical Sequencing News that the 10-person company, headquartered in the Science and Technology Park at Johns Hopkins in Baltimore, is a natural extension of the Hopkins group’s research activities.

Several years ago, “we began receiving requests from other researchers, other physicians, collaborators, and then actually patients, family members, and friends, wanting us to do these whole-exome analyses on cancer samples,” he said. “We realized that doing this in the laboratory wasn’t really the best place to do it, so for that reason we founded Personal Genome Diagnostics.”

The goal of the company, he said, “is to translate this history of our group’s experience of cancer genetics and our understanding of cancer biology, together with the technology that has now become available, and to ultimately perform these analyses for individual patients.”

The fledgling company has reached two commercial milestones in the last several weeks. First, it gained CLIA certification for cancer exome sequencing using the HiSeq 2000. In addition, it secured exclusive licensing rights from Hopkins for a technology called digital karyotyping, developed by Velculescu and colleagues to analyze copy number changes in cancer genomes.

PGDx offers a comprehensive cancer genome analysis service that combines exome sequencing with digital karyotyping, which isolates short sequence tags from specific genomic loci in order to identify chromosomal changes as well as amplifications and deletions.
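The tag-counting logic behind that description can be sketched in a few lines: bin the mapped tag positions, normalize tumor counts against the matched normal, and flag bins whose density ratio departs sharply from one. This is an illustrative simplification, not PGDx’s actual method; the bin size and ratio thresholds below are arbitrary assumptions.

```python
from collections import Counter

def copy_number_calls(tumor_tags, normal_tags, bin_size=100_000,
                      amp_ratio=2.0, del_ratio=0.5):
    """Crude copy-number caller: compare tag density per genomic bin.

    tumor_tags / normal_tags: iterables of (chromosome, position) tuples
    marking where each sequence tag mapped. Returns a dict of bins
    flagged as amplified or deleted based on the tumor/normal ratio.
    """
    def bin_counts(tags):
        return Counter((chrom, pos // bin_size) for chrom, pos in tags)

    t_counts, n_counts = bin_counts(tumor_tags), bin_counts(normal_tags)
    # Normalize for different total tag yields between the two libraries.
    scale = sum(n_counts.values()) / max(sum(t_counts.values()), 1)

    calls = {}
    for b, n in n_counts.items():
        ratio = t_counts.get(b, 0) * scale / n
        if ratio >= amp_ratio:
            calls[b] = ("amplification", round(ratio, 2))
        elif ratio <= del_ratio:
            calls[b] = ("deletion", round(ratio, 2))
    return calls
```

Real implementations work from experimentally characterized tag densities and statistical significance tests rather than fixed ratio cutoffs, but the principle — departures in relative tag density reveal amplifications and deletions — is the same.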

The company sequences tumor-normal pairs and promises a turnaround time of six to 10 weeks, though Velculescu said that ongoing improvements in sequencing technology and the team’s analysis methods should reduce that time “significantly.” In practice, the company is already seeing turnaround times of under a month.

To date, the company has focused solely on the research market. Customers have included pharmaceutical and biotech companies, individual clinicians and researchers, and contract research organizations, while the scale of these projects has ranged from individual patients to thousands of exomes for clinical trials.

While the company performs its own sequencing for smaller projects, it relies on third-party service providers for larger studies.

PGDx specializes in all aspects of cancer genome analyses, but has a particular focus on the front and back end of the workflow, Velculescu said, including “library construction, pathologic review of the samples, dissection of tumor samples to enrich tumor purity, next generation sequencing, identification of tumor-specific alterations, and linking of these data to clinical and biologic information about human cancer.”

The sequencing step in the middle, however, “is really almost becoming a commodity,” he noted. “Although we’ve done it in house, we typically do outsource it and that allows us to scale with the size of these projects.”

He said that PGDx typically works with “a number of very high-quality sequence partners to do that part of it,” but he declined to disclose these partners.

On the front end, PGDx has developed “a variety of techniques that we’ve licensed and optimized from Hopkins that have allowed us to improve extraction of DNA from both frozen tissue and [formalin-fixed, paraffin-embedded] tissue, even at very small quantities,” Diaz said. The team has also developed methods “to maximize our ability to construct libraries, capture, and then perform exomic sequencing with digital karyotyping.”

Once the sequence data is in hand, “we have a pipeline that takes that information and deciphers the changes that are most likely to be related to the cancer and its genetic make-up,” he said. “That’s not trivial. It requires inspection by an experienced cancer geneticist.”
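The first filtering step in such a pipeline is conceptually simple: subtract the germline variants seen in the matched normal, then prioritize what remains. A minimal sketch follows, assuming a tuple-based variant representation invented here for illustration; the actual PGDx pipeline is far more involved and, as Diaz notes, still requires expert review.

```python
def somatic_variants(tumor_variants, normal_variants, known_cancer_genes):
    """Keep tumor-only (somatic) variants and surface likely relevant ones.

    Variants are (chromosome, position, ref, alt, gene) tuples. Variants
    also seen in the matched normal are treated as germline and discarded;
    the remainder is sorted so hits in known cancer genes come first.
    """
    germline = {v[:4] for v in normal_variants}
    somatic = [v for v in tumor_variants if v[:4] not in germline]
    # False sorts before True, so known-gene hits lead the list.
    return sorted(somatic, key=lambda v: v[4] not in known_cancer_genes)
```

Distinguishing drivers from passengers among the surviving somatic calls is exactly the part that, per the article, cannot yet be fully automated.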

While the firm is working on automating the analysis, “it’s not something that is entirely automatable at this time and therefore cannot be commoditized,” Diaz said.

The firm issues a report for its customers that “provides information not only on the actual sequence changes which are of high quality, but what these changes are likely to do,” Velculescu said, including “information about diagnosis, prognosis, therapeutic targeting [information] or predictive information about the therapy, and clinical trials.”

So far, the company has relied primarily on word of mouth to raise awareness of its offerings. “We’ve literally been swamped with requests from people who just know us,” Velculescu said. “I think one of the major reasons people have been coming to us for either these small or very large contracts is that people are getting this type of NGS data and they don’t know what to do with it — whether it’s a researcher who doesn’t have a lot of experience in cancer or a clinician who hasn’t seen this type of data before.”

While there’s currently “a wealth in the ability to get data, there’s an inadequacy in being able to understand and interpret the data,” he said.

Pricing for the company’s services is set on a case-by-case basis, but Diaz estimated that retail costs are currently between $5,000 and $10,000 per tumor-normal pair for research purposes. Clinical cases are more costly because the depth of coverage is greater and additional analyses, as well as a physician interpretation, are required.

A Cautious Approach

While the company’s ultimate goal is to help oncologists use genomic information to inform treatment for their patients, PGDx is “proceeding cautiously” in that direction, Diaz said.

The firm has so far sequenced around 50 tumor-normal pairs for individual patients, but these have been for “informational purposes,” he said, stressing that the company believes the field of cancer genomics is still in the “discovery” phase.

“I think we’re really at the beginning of the genomic revolution in cancer,” Diaz said. “We are partnering with pharma, with researchers, and with certain clinicians to start bringing this forward — not only as a discovery tool but eventually as a clinical application.”

“We do think that rushing into this right now is too soon, but we are building the infrastructure — for example our recent CLIA approval for cancer genome analyses — to do that,” he added.

This cautious approach sets the firm apart from some competitors, including Foundation Medicine, which is about to launch a targeted sequencing test that it is marketing as a diagnostic aid to help physicians tailor therapy for their patients. Diagnostic firm Asuragen is also offering cancer sequencing services based on a targeted approach (CSN 1/12/12), as are a number of academic labs.

Diaz said that PGDx’s comprehensive approach also sets it apart from these groups. “We think there’s a lot of clinically actionable information in the genome … and we don’t want to limit ourselves by just looking at a set of genes and saying that these may or may not have importance.”

While the genes in targeted panels “may have some data surrounding them with regard to prognosis, or in relation to a therapy, that’s really only a small part of the story when it comes to the patient’s cancer,” Diaz said.

“That’s why we would like to remain the company that looks at the entire cancer genome in a comprehensive fashion, because we don’t know enough yet to break it down to a few genes,” he said.

The company’s proprietary use of digital karyotyping to find copy number alterations is another differentiator, Velculescu said, because many cancer-associated genes — such as p16, EGFR, MYC, and HER2/neu — are frequently altered by copy number changes rather than by point mutations.

Ultimately, “we want to develop something that has value for the clinician,” Diaz said. “A clinician currently sees 20 to 30 patients a day and may have only a few minutes to look at a report. If [information from sequencing] doesn’t have immediate high-impact value, it’s going to be very hard to justify its use down the road.”

He added that the company is “thinking very hard about what we can squeeze out of the cancer genome to provide that high-impact clinical value — something that isn’t just going to improve the outcome of patients by a few months or weeks, but actually change the outlook of that patient substantially.”



Bernadette Toner is editorial director for GenomeWeb’s premium content. E-mail her here or follow her GenomeWeb Twitter account at @GenomeWeb.

In Educational Symposium, Illumina to Sequence, Interpret Genomes of 50 Participants for $5K Each

June 27, 2012

This story was originally published June 25.

As part of a company-sponsored symposium this fall to “explore best practices for deploying next-generation sequencing in a clinical setting,” Illumina plans to sequence and analyze the genomes of around 50 participants for $5,000 each, Clinical Sequencing News has learned.

According to Matt Posard, senior vice president and general manager of Illumina’s translational and consumer genomics business, the event is part of a “multi-step process to engage experts in the field around whole-genome sequencing, and to support the conversation.”

The “Understand your Genome” symposium will take place Oct. 22-23 at Illumina’s headquarters in San Diego.

The company sent out invitations to the event over the last few months, targeting individuals with a professional interest in whole-genome sequencing, including medical geneticists, pathologists, academics, and industry or business leaders, Posard told CSN this week. To provide potential participants with more information about the symposium, Illumina also hosted a webinar this month that included a Q&A session.

Registration closed June 14 and has exceeded capacity — initially 50 spots, a number that may increase slightly, Posard said. Everyone else is currently waitlisted, and Illumina plans to host additional symposia next year.

“There has been quite a bit of unanticipated enthusiasm around this from people who are speaking at the event or planning to attend the event,” including postings on blogs and listservs, Posard said.

As part of their $5,000 registration fee, which does not include travel and lodging, participants will have their whole genome sequenced in Illumina’s CLIA-certified and CAP-accredited lab prior to the event. It is also possible to participate without having one’s genome sequenced, but only as a companion to a full registrant, according to Illumina’s website. The company prefers that participants submit their own sample, but as an alternative, they may submit a patient sample.

The general procedure is very similar to Illumina’s Individual Genome Sequencing, or IGS, service in that it requires a prescription from a physician, who also receives the results to review them with the participant. However, participants pay less than they would through IGS, where a single human genome currently costs $9,500.

Participants will also have a one-on-one session with an Illumina geneticist prior to being sequenced, and they can choose to not receive certain medical information as part of the genome interpretation.

Doctors will receive the results and review them with the participants sometime before the event. “There will be no surprises for these participants when they come to the symposium,” Posard said.

Results will include not only a list of variants but also a clinical interpretation of the data by Illumina geneticists. This is currently not part of IGS, which requires an interpretation of the data by a third party, but Illumina plans to start offering interpretation services for IGS before the symposium, Posard said.

“Our stated intent has always been that we want to fill in all of the pieces that the physicians require, so we are building a human resource, as well as an informatics team, to provide that clinical interpretation, and we are using that apparatus for the ‘Understand your Genome’ event,” Posard said.

The interpretation will include “a specified subset of genes relating to Mendelian conditions, drug response, and complex disease risks,” according to the website, which notes that “as with any clinical test, the patient and physician must discuss any medically significant results.”

The first day of the symposium will feature presentations on clinical, laboratory, ethical, legal, and social issues around whole-genome sequencing by experts in the field. Speakers include Eric Topol from the Scripps Translational Science Institute, Matthew Ferber from the Mayo Clinic, Robert Green from Brigham and Women’s Hospital and Harvard Medical School, Heidi Rehm from the Harvard Partners Center for Genetics and Genomics, Gregory Tsongalis from the Dartmouth Hitchcock Medical Center, Robert Best from the University of South Carolina School of Medicine, Kenneth Chahine from Ancestry.com, as well as Illumina’s CEO Jay Flatley and chief scientist David Bentley.

On the second day, participants will receive their genome data on an iPad and learn how to analyze their results using the iPad MyGenome application that Illumina launched in April.

The planned symposium stirred some controversy at the European Society of Human Genetics annual meeting in Nuremberg, Germany, this week. During a presentation in a session on the diagnostic use of next-generation sequencing, Gert Matthijs, head of the Laboratory for Molecular Diagnostics at the Center for Human Genetics in Leuven, Belgium, said he was upset because the invitation to Illumina’s event apparently not only reached selected individuals but also patient organizations.

“To me, personally, [the event] tells that some people are really exploring the limits of business, and business models, to get us to genome sequencing,” he said.

“We have to be very careful when we put next-generation sequencing direct to the consumer, or to patient testing, but it’s a free world,” he added later.

Posard said that Illumina welcomes questions about and criticism of the symposium. “This is another example of us being extremely responsible and transparent in how we’re handling this novel application that everybody acknowledges is the wave of the future,” he said. “We want to responsibly introduce that wave, and I believe we’re doing so, through such things as the ‘Understand your Genome’ event, but not limited to this event.”

Julia Karow tracks trends in next-generation sequencing for research and clinical applications for GenomeWeb’s In Sequence and Clinical Sequencing News. E-mail her here or follow her GenomeWeb Twitter accounts at @InSequence and @ClinSeqNews.

Federal Court Rules Helicos Patent Invalid; Company Reaches Payment Agreement with Lenders

August 30, 2012

NEW YORK (GenomeWeb News) – A federal court has ruled in Illumina’s favor in a lawsuit filed by Helicos BioSciences that had alleged patent infringement.

In a decision dated Aug. 28, District Judge Sue Robinson of the US District Court for the District of Delaware granted Illumina’s motion for summary judgment declaring US Patent No. 7,593,109, held by Helicos, invalid for “lack of written description.”

Titled “Apparatus and methods for analyzing samples,” the patent relates to an apparatus, systems, and methods for biological sample analysis.

The ‘109 patent was the last of three patents that Helicos accused Illumina of infringing, following Helicos’ voluntary dismissal, with prejudice, of the other two patents earlier this year. In October 2010, Helicos added Illumina and Life Technologies to a lawsuit that originally accused Pacific Biosciences of patent infringement.

Helicos dropped its lawsuit against Life Tech and settled with PacBio earlier this year, leaving Illumina as the sole defendant.

In seeking a motion for summary judgment, Illumina argued that the ‘109 patent does not disclose “a focusing light source operating with any one of the analytical light sources to focus said optical instrument on the sample.” Illumina’s expert witness further said that the patent “does not describe how focusing light source works” nor does it provide an illustration of such a system, according to court documents.

In handing down her decision, Robinson said, “In sum, and based on the record created by the parties, the court concludes that Illumina has demonstrated, by clear and convincing evidence, that the written description requirement has not been met.”

In a statement, Illumina President and CEO Jay Flatley said he was pleased with the court’s decision.

“The court’s ruling on the ‘109 patent, and Helicos’ voluntary dismissal of the other patents in the suit, vindicates our position that we do not infringe any valid Helicos patent,” he said. “While we respect valid and enforceable intellectual property rights of others, Illumina will continue to vigorously defend against unfounded claims of infringement.”

After the close of the market Wednesday, Helicos also disclosed that it had reached an agreement with lenders to waive defaults arising from Helicos’ failure to pay certain risk premium payments in connection with prior liquidity transactions. The transactions are part of a risk premium payment agreement that Helicos entered into with funds affiliated with Atlas Venture and Flagship Ventures in November 2010.

The lenders have agreed to defer the risk premium payments “until [10] business days after receipt of a written notice from the lenders demanding the payment of such risk premium payments,” Helicos said in a document filed with the US Securities and Exchange Commission.

The Cambridge, Mass.-based firm also disclosed that Noubar Afeyan and Peter Barrett have resigned from its board.

Helicos said two weeks ago that its second-quarter revenues dipped 29 percent year over year to $577,000. In an SEC document, it also warned that existing funds were not sufficient to support its operations and related litigation expenses through the planned September trial date for its dispute with Illumina.

In Thursday trade on the OTC market, shares of Helicos closed down 20 percent at $0.04.



State of the Science: Genomics and Cancer Research

April 2012
Basic research allows for a better understanding of cancer and, eventually, improved patient outcomes. Zhu Chen, China’s minister of health, and Shanghai Jiao Tong University’s Zhen-Yi Wang received the seventh annual Szent-Györgyi Prize from the National Foundation for Cancer Research for their work on a treatment for acute promyelocytic leukemia. Genome Technology’s Ciara Curtin spoke to Chen, Wang, and past prize winners about the state of cancer research.

Genome Technology: Doctors Wang and Chen, can you tell me a bit about the work you did that led to you receiving the Szent-Györgyi prize?

Zhen-Yi Wang: I am a physician. I am working in the clinic, so I have to serve the patients. … I know the genes very superficially, not very deeply, but the question raised to me is: There are so many genes, but how are [we] to judge what is the most important?

Zhu Chen: The work that is recognized by this year’s Szent-Györgyi Prize concerns … acute promyelocytic leukemia. Over the past few decades, we have been involved in developing new treatment strategies against this disease.

You have two [therapies — all-trans retinoic acid and arsenic trioxide] — that target the same protein but with slightly different mechanisms, so we call this synergistic targeting. When the two drugs combine together for the induction therapy, then we see very nice response in terms of the complete remission rate. But more importantly, we see that this synergistic targeting, together with the effect of the chemotherapy, can achieve a very high five-year disease-free survival — as high as 90 percent.

But we were more interested in the functional aspects of the genome, to understand what each gene does and also to particularly understand the network behavior of the genes.

GT: There are a number of consortiums looking at the genome sequences of many cancer types. What do you hope to see from such studies?

Webster Cavenee: This is a way that tumors are being sequenced in a rational kind of way. It would have been done anyway by labs individually, which would have taken a lot more money and taken a lot longer, too. The human genome sequence, everybody said, ‘Why are you going to do that?’ … But that now turns out to be a tremendous resource. … From the point of view of The Cancer Genome Atlas, having the catalog of all of the kinds of mutations which are present in tumors can be very useful because you can see patterns. For example, in the glioblastoma cancer genome project, they found an unexpected association of some mutations and combinations of mutations with drug sensitivity. Nobody would have thought that.

The problem, of course, is that when you are sequencing all these tumors, it’s a very static thing. You get one point in time and you sequence whatever comes out of this big lump of tissue. That big lump is made up of a lot of different kinds of pieces, so when you see a mutation, you can’t know where it came from and you don’t know whether it actually does anything. That then leads into what’s going to be the functionalizing of the genome. Because in the absence of knowing that it has a function, it’s not going to be of very much use to develop drugs or anything like that. And that’s a much bigger exercise because that involves a lot of experiments, not just stuffing stuff into a sequencer.

Peter Vogt: [The genome] has to be used primarily to determine function. Without function, there’s not much you can do with these mutations, because the distinction between a driver mutation and a passenger mutation can’t be made just on the basis of sequence.

Carlo Croce: After that, you have to be able to validate all of the genetic operations in model systems where you can reproduce the same changes and see whether there are the same consequences. Otherwise, without validation, to develop therapy doesn’t make much sense because maybe those so-called driver mutations will turn out to be something else.

GT: Will sequencing of patients’ tumors come to the clinic?

CC: It is inevitable. Naturally, there are a lot of bottlenecks. To do the sequencing is the, quote, trivial part and it is going to cost less and less. But then interpreting the data might be a little bit more cumbersome.

Sujuan Ba: Dr. Chen, there is an e-health card in China right now. Do you think some day gene sequencing will be stored in that card?

ZC: We are developing a digital healthcare in China. We started with electronic health records and now by providing the e-health card to the people, that will facilitate the individualized health management and also the supervision of our healthcare system. In terms of the use of genetic information for clinical purposes, as Professor Croce said, it’s going to happen.

GT: What do you think are the major questions in cancer research that still need to be addressed?

PV: There are increasingly two schools of thought on cancer. One is that it is all an engineering problem: We have all the information we need, we just need to engineer the right drugs. The other school says it’s still a basic knowledge problem. I think more and more people think it’s just an engineering problem — give us the money and we’ll do it all. A lot of things can be done, but we still don’t have complete knowledge.

Roundtable Participants
Sujuan Ba, National Foundation for Cancer Research
Webster Cavenee, University of California, San Diego
Zhu Chen, Ministry of Health, China
Carlo Croce, Ohio State University
Peter Vogt, Scripps Research Institute
Zhen-Yi Wang, Shanghai Jiao Tong University




Reporter: Aviva Lev-Ari, PhD, RN

Eric Topol: Get Rid of the Randomized Trial; Here’s a Better Way

Eric J. Topol, MD


Hi. I’m Dr. Eric Topol, Director of the Scripps Translational Science Institute and Editor-in-Chief of Medscape Genomic Medicine and theheart.org. In our series The Creative Destruction of Medicine, I’m trying to get into critical aspects of how we can Schumpeter or reboot the future of healthcare by leveraging the big innovations that are occurring in the digital world, including digital medicine.

But one of the things that has been missed along the way is that how we do clinical research will be radically affected as well. We have this big thing about evidence-based medicine and, of course, the sanctimonious randomized, placebo-controlled clinical trial. Well, that’s great if one can do that, but often we’re talking about needing thousands, if not tens of thousands, of patients for these types of clinical trials. And things are changing so fast with respect to medicine and, for example, genomically guided interventions that it’s going to become increasingly difficult to justify these very large clinical trials.

For example, there was a drug trial for melanoma targeting a mutation of BRAF, the gene that is mutated in about 60% of people with malignant melanoma. When that trial was done, there was a placebo control, and there was a big ethical charge asking whether it is justifiable to have a body count. This was a drug matched to the biology underpinning metastatic melanoma, which is essentially a fatal condition within 1 year, and researchers were giving some individuals a placebo.

Would we even do that kind of trial in the future when we now have such elegant matching of the biological defect and the specific drug intervention? A remarkable example of a trial of the future was announced in May.[1] For this trial, the National Institutes of Health is working with [Banner Alzheimer’s Institute] in Arizona, the University of Antioquia in Colombia, and Genentech to have a specific mutation studied in a large extended family living in the country of Colombia in South America. There is a family of 8000 individuals who have the so-called Paisa mutation, a presenilin gene mutation, which results in every member of this family developing dementia in their 40s.

Researchers will be testing a drug that binds amyloid, a monoclonal antibody, in just [300][1] family members. They’re not following these patients out to the point of where they get dementia. Instead, they are using surrogate markers to see whether or not the process of developing Alzheimer’s can be blocked using this drug. This is an exciting way in which we can study treatments that can potentially prevent Alzheimer’s in a very well-demarcated, very restricted population with a genetic defect, and then branch out to a much broader population of people who are at risk for Alzheimer’s. These are the types of trials of the future and, in fact, it would be great if we could get rid of the randomization and the placebo-controlled era going forward.

One of the things that I’ve been trying to push is that we need a different position at the FDA. Now, we can find great efficacy, but the problem is that establishing safety often also requires thousands, or tens of thousands, of patients. That is not going to happen in the contrived clinical trial world. We need to get to the real world and into this digital world where we would have electronic surveillance of every single patient who is admitted and enrolled in a trial. Why can’t we do that? Why can’t we have conditional approval for a new drug or device, or even a diagnostic test, and then monitor it very carefully? Then, if the data are supportive, we can grant final approval.

I hope that we can finally get an innovative spirit, a whole new way of a conditional and then final approval in phases in the real world, rather than continuing in this contrived clinical trial environment. These are some things that can change in the rebooting or in the creative destruction, or reconstruction, of medicine going forward.

Thanks so much for your attention and your continued support of The Creative Destruction of Medicine series on Medscape.


  1. Banner Alzheimer’s Institute. Groundbreaking Alzheimer’s disease prevention trial announced. Press release. http://banneralz.org/media/28067/api_prevention_trial_release_5_15_12_final.pdf Accessed July 31, 2012.

On other topics in Medicine:

Topol on The Creative Destruction of Medicine


