
Vinod Khosla: “20% doctor included”: speculations & musings of a technology optimist or “Technology will replace 80% of what doctors do”

Reporter: Aviva Lev-Ari, PhD, RN


WordCloud Image Produced by Adam Tubman

Vinod Khosla of Khosla Ventures, in a Fortune Magazine article, “Technology will replace 80% of what doctors do”, December 4, 2012

http://tech.fortune.cnn.com/2012/12/04/technology-doctors-khosla/?source=linkedin&goback=%2Egde_4346921_member_240383059

Vinod Khosla is the founder of Khosla Ventures, a venture capital firm in Menlo Park, CA. A longer version of this story can be found on the firm’s website.

The Longer Version:

“20% doctor included”: speculations & musings of a technology optimist

by Vinod Khosla

In recent talks at the University of California, San Francisco and other venues, I laid out my thesis on how mobile devices, big data, and artificial intelligence will disrupt healthcare. I believe it is inevitable that the majority of physicians’ diagnostic and prescription work (physicians do other work too) will be replaced with smart hardware and software, and healthcare will become better, more consistent across physicians, and more scientific. The remaining 20% of physicians’ work will be AMPLIFIED, giving them better capability. Given the importance of having clarity on what I hypothesized (my forecasts are directional guesses rather than precise predictions) about how healthcare might change, I’d like to clarify my views and respond to some of the recent commentary.

Let me start with a summary:

  1. Healthcare today is often really the “practice of medicine” rather than the “science of medicine”. In the worst cases of the practice of medicine, doctors just take moderately educated shots in the dark when it comes to patient care. Physicians should be much more scientific and data-driven. That’s hard for the average physician to pull off without technology, because of the increasing amount of data and research released every year. Next-generation medicine will be the scientific arrival at diagnostic and treatment conclusions and real testing of what’s actually going on in your body. And, it will be much more personalized than your physician can provide. Data science will be key to this.
  2. Technology makes up for human deficiencies and amplifies strengths – MDs and even other less trained medical professionals can do much more than they do now. By 2025 more data-driven, automated healthcare will displace up to 80% of physicians’ diagnostic and prescription work. It will AMPLIFY physicians by arming them with more complete, synthesized, and up-to-date research data, all leading to better patient outcomes. Computers are much better than people at organizing and recalling information. They have larger and less corruptible memories, remember more complex information much more quickly and completely, and make far fewer mistakes than a hot shot MD from Harvard. Contrary to popular opinion, they’re also better at integrating and balancing considerations of patient symptoms, history, demeanor, environmental factors, and population management guidelines than the average physician.
  3. The healthcare transition will start incrementally and develop slowly in sophistication, much like a great MD who starts with seven years of med school and then spends a decade training with the best practitioners by watching, learning, and experiencing. Expect many laughing-stock attempts by “toddler MD computer systems” early in their evolution and learning. These systems will grow with the help of the top MDs and AMPLIFY them so everyone can have the best, most researched care, not the average, overburdened, and rushed doctor the average person gets today!  In fact, the very best MDs will be an integral part of designing and building these advanced systems.
  4. The systems of 10-15 years from now (which is the time frame I am talking about) will overcome many of the short-term deficiencies of today’s technologies. By analogy, using today’s technology then would be like our carrying the multi-pound phones from 1987 (they were floor mounted cell phones with big handsets and heavy cords) in our pockets rather than iPhones. There will be point innovations that seem immaterial, but, when there are enough of them, they will integrate with each other and start to feel like a revolution. In the meantime, expect these early systems and tools to be the butt of jokes from many a writer or MD. Early printers, typically “dot matrix”, did not exactly cut it for business correspondence, let alone replace traditional means. Early medical systems will be used in non-critical roles or under physician supervision. Eventually, this shift in how healthcare is delivered will allow for less money to be spent on capital equipment, cutting health care costs. It will allow us to provide care to those who don’t have it now. And, it will prevent simple things from getting worse before being addressed.
  5. The human element of care, provided by humane humans, not rushed, overloaded MDs, will still be around. These people may have MDs anyway, but they won’t need ten or twenty years of medical school training. Or, when they have the training, they will become much better diagnosticians and caregivers. Beyond diagnosis and treatment, there are many things doctors do that won’t be replaced.
  6. The problem in healthcare is not with doctors, many of whom are accomplished, caring, honest, and compassionate providers. The problem is the incredible increase in the complexity of newly enabled data (some of it extrinsic), vast amounts of research, and longitudinal health records and histories free of patients’ self-reporting inaccuracies, all of which allow for the much more integrative analysis that is now possible. The problem is also the misalignment of incentives in medicine, where organizations try to maximize revenue (extra surgeries, anyone?) at the expense of optimizing care. This is why innovation will most likely happen from the outside. It is also important to realize that I refer to the “AVERAGE GLOBAL doctor” or healthcare concern, not every doctor or company. The standards of performance in some parts of the world and in some parts of this country are very different than those in the best metropolitan hospitals in the United States. I also note that 50% of MDs are below average, though every doctor most people know is above average!

Practice vs. science

One of my principal concerns about healthcare today is that it’s often really the “practice of medicine” rather than the “science of medicine”. Take modern medicine’s view towards fever as an example. For 150 years, doctors have routinely prescribed antipyretics (aspirin, acetaminophen, etc.) to help reduce fever, since fever was viewed as an inability of the body to regulate itself and therefore needed to be reduced aggressively in all cases. But in 2005, researchers at the University of Miami, Florida, ran a study of 82 intensive care patients, for whom protection from high temperatures was traditionally thought to be important. The patients were randomly assigned to receive antipyretics either if their temperature rose beyond 101.3°F (the “standard treatment”) or only if their temperature reached 104°F. As the trial progressed, seven people getting the standard treatment died, while there was only one death in the group of patients allowed to have a higher fever. At this point, the trial was stopped because the team felt it would be unethical to allow any more patients to get the standard treatment. So when something as basic as fever reduction is a hallmark of the “practice of medicine” and hasn’t been challenged for 100+ years, we have to ask “what else might be practiced due to tradition rather than science?”

In the worst cases of the practice of medicine, the “average” doctors just take moderately educated shots in the dark when it comes to patient care. A diagnosis is partially informed by the patient’s medical history (but often not really), partially informed by symptoms (but patients aren’t very good at communicating what’s really going on), and mostly informed by pharma advertising and the doctor’s half-remembered lessons from medical school (which are laden with cognitive biases, recency biases, and other very human errors, besides potentially having been obsoleted by more recent research). Many times, if you ask three doctors to look at the same problem, you’ll get three different diagnoses and three different treatment plans. As a patient, how do you feel when a doctor keeps changing his mind about your disease over time? How do you feel if different doctors say different things about your disease? Today, this happens often. In some areas, psychiatry for example, doctors frequently disagree on diagnoses. Research has found that psychiatrists using the Diagnostic and Statistical Manual of Mental Disorders (DSM), the standard desk reference for psychiatric diagnoses, have dangerously low diagnostic agreement. The DSM V uses a statistic called the “kappa” to measure the level of agreement between psychiatrists (ranging from 0 for no agreement to 1 for complete agreement). In research trials, the DSM V, which is set to be published in May 2013, generates a kappa of 0.2 for generalized anxiety disorder and 0.3 for major depressive disorder. Scientific American described these results for the standard of psychiatric care as “two pitiful kappas”. And often, there are errors of omission where a diagnosis is just missed entirely.
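The kappa mentioned above is straightforward to compute; here is a minimal sketch of Cohen’s kappa for two raters (plain Python; the ratings below are invented, with “GAD”/“MDD” standing in for the two diagnoses discussed):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement between two raters, corrected for chance."""
    n = len(rater_a)
    # Observed agreement: fraction of cases where the two raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: expected matches given each rater's label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b.get(label, 0) for label in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Invented diagnoses from two psychiatrists for the same ten patients.
a = ["GAD", "MDD", "GAD", "MDD", "GAD", "GAD", "MDD", "GAD", "MDD", "GAD"]
b = ["GAD", "GAD", "GAD", "MDD", "MDD", "GAD", "MDD", "MDD", "MDD", "GAD"]
print(round(cohens_kappa(a, b), 2))  # → 0.4
```

A kappa in the 0.2–0.3 range quoted from the DSM-V trials means the raters agree only slightly more often than chance alone would produce.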

The net effect of all this is patient outcomes that are far inferior to and more expensive than what they should be. The current benchmarks of performance aren’t good enough, and it’s trivially easy to find study after study that demonstrates the shortcomings of the practice of medicine. A Johns Hopkins study found that as many as 40,500 patients die in an ICU in the US each year due to misdiagnosis, rivaling the number of deaths from breast cancer. Yet another study found that ‘system-related factors’, e.g. poor processes, teamwork, and communication, were involved in 65% of studied diagnostic error cases.  ‘Cognitive factors’ were involved in 75%, with ‘premature closure’ (sticking with the initial diagnosis and ignoring reasonable alternatives, or, more fancifully termed, the “confirmation bias”) as the most common cause. These types of diagnostic errors also add to rising healthcare expenditures, costing $300,000 per malpractice claim.

Physicians should be much more scientific and data-driven in providing patient care. That’s hard to pull off without technology, because of the increasing amount of data and research released every year. For example, standard operating procedure involves giving the same drug to millions of people even though we know that each patient metabolizes medication at different rates and with different effectiveness. Many of us are even resistant to aspirin. Each person should be treated differently, but the average doctor can’t handle the information required to do that. Nor does he have enough time or knowledge to do it. Healthcare needs to become much more about data-driven deduction and less about trial-and-error. Next-generation medicine will be the scientific arrival at diagnostic and treatment conclusions based on probabilities and real testing of what’s actually going on in your body. And, it will be much more personalized than your physician can provide. Systems will utilize more complex models of interactions within the human body and much more sensor data than a human MD could comprehend to suggest diagnoses. Thousands of baseline and disease multi-omic (genomic, metabolomic, microbiomic, and other) data points, more integrative history, and demeanor will go into each diagnosis. Ever-improving dialog manager systems will help make data capture and exploration from patients more accurate and comprehensive. Data science will be key to this. In the end, this would reduce costs, reduce physician workloads, and improve patient care. Doctors will also be able to tailor their explanations to the health literacy level of patients using common-language terms and adapting the sophistication level using computerized dialog managers. These computerized managers will be patient, unlike your typical “doctor in a hurry” with the usual unfortunate case overload.
This matters because, according to the Institute of Medicine, nearly 100M US adults have “limited health literacy skills” that are most likely to affect their health outcomes!
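The probability-based diagnostic reasoning described above can be sketched with a toy naive-Bayes model (every prior and likelihood below is invented for illustration; this is not a clinical tool):

```python
# Toy probabilistic diagnosis via Bayes' rule; all numbers are invented.
priors = {"flu": 0.05, "cold": 0.20, "healthy": 0.75}
# P(symptom | condition), treated as conditionally independent (naive Bayes).
likelihoods = {
    "flu":     {"fever": 0.90, "cough": 0.80},
    "cold":    {"fever": 0.10, "cough": 0.70},
    "healthy": {"fever": 0.01, "cough": 0.05},
}

def posterior(symptoms):
    """Posterior probability of each condition given the observed symptoms."""
    scores = {c: priors[c] for c in priors}
    for c in scores:
        for s in symptoms:
            scores[c] *= likelihoods[c][s]
    total = sum(scores.values())
    return {c: scores[c] / total for c in scores}

p = posterior(["fever", "cough"])
print(max(p, key=p.get))  # → flu, despite its 5% prior
```

The point of the sketch: observed evidence can overturn a base rate, which is exactly the integration-of-probabilities work that is hard to do reliably in a rushed appointment.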

 

Replacing 80% of what doctors do?

At the heart of my view is that much of what physicians do (checkups, testing, diagnosis, prescription, behavior modification, etc.) can be done better by well-designed sensors, passive and active data collection, and analytics, without necessarily taking away from the human element of care (especially in the 2020s, when this technology will be in its fifth or tenth evolution)! Let’s take a brief look at a standard physical as an example. During a physical, a patient will get measured in multiple ways – from weight and blood pressure to pulse and respiration. A nurse, doctor, or other caregiver will spend 15–30 minutes to run through all these different routines. Every single one of these measurements could be done in real-time by the patient in more representative environments at home if he or she just had the right sensors hooked up wirelessly to a mobile device. All that data could be measured and transmitted to the doctor’s office in less time and for less money than it takes to gas up the car and get to the hospital in the first place (leave aside sitting in the waiting room). ZocDoc* allows patients to check in for an appointment and provide basic information ahead of time. This type of form could easily be expanded to include vitals. And our “dialog manager” could ask many follow-on questions and probe other symptom possibilities, providing a more complete patient record WHILE reducing the amount of time the doctor has to spend with the patient (if any)! No doubt, the best doctors will likely do this better than our dialog manager, even for the next one or two decades. But, the system will be an immediate dramatic improvement compared to the average hurried and overloaded doctor (or worse, the developing world “no medical-school doctor” living within 50 miles of a rural patient)!

Of course, doctors aren’t supposed to just measure. They’re supposed to consume all that data, carefully consider it in context of the latest medical findings and the patient’s history, and figure out if something’s wrong. Computers can take on much of diagnosis and treatment work, and even do these functions better than an average doctor could (while considering more options and making fewer errors). What if you’re a heart patient? It’s a simple fact that most doctors couldn’t possibly read and digest all of the latest 5,000 research articles on heart disease. In fact, most of the average doctor’s medical knowledge is from when they were in medical school, and cognitive limitations prevent them from remembering the 10,000+ diseases humans can get.

Computers are much better than people at organizing and recalling information. They have larger and less corruptible memories, remember more complex information much more quickly and completely, and make far fewer mistakes than a hotshot MD from Harvard. Contrary to popular opinion, they’re also better at integrating and balancing considerations of patient symptoms, history, demeanor, environmental factors, and population management guidelines than the average physician. Besides, who wants to be treated by the average or below-average physician? Remember, 50% of MDs are below-average!  Not only that, computers have much lower error rates. Shouldn’t we take advantage of that when it comes to our health?!

Technology makes up for human deficiencies and amplifies strengths – MDs and even other less trained medical professionals can do much more. Eventually, computers will replace 80% of what doctors do while amplifying their capabilities in the rest, reducing misdiagnosis by arming them with more complete, synthesized, and up-to-date data, all leading to better patient outcomes. Physicians spend too much time doing things computers can do, and we should give them more time for things that uniquely require human involvement, like providing patients “warm & fuzzies”, comforting kids in pediatric care, making inherently subjective decisions that require empathy or a consideration of ethics, and providing a friendly ear for lonely patients. Some of these functions may not need medical school training at all, but rather draw on more empathic skills and could actually be done by non-MDs. Lifecom, an AI diagnostics engine company, showed in clinical trials that medical assistants using a knowledge engine were 91% accurate in diagnosis without using labs, imaging, or exams. Another clinical study by the same company demonstrated that greater than 75% of cases can be safely triaged to be treated by RNs, with the remainder handled by doctors. Another study at MassGen found that 25% of the time, a medical record for patients who wound up with ‘high risk diagnoses’ had ‘high information clinical findings’ before a physician eventually made the diagnosis — in other words, there was a significant delay that might have been avoided had a clinical decision support system been used to parse the notes!

Initially, many doctors will be against this transition and won’t support it, but new technologies will make the receptive doctors much better at their job – quicker, more accurate, and more fact-based. There is a tremendous opportunity here in the influx of data that has never before been available. At first, computers will just help in decision support, starting by leveraging guidance from the very best doctors. Eventually (sometime in the next 10-15 years), computers will become better diagnosticians than your average doctor. Once we have a large enough dataset on different people in different situations, along with an addressable database of research studies, we will be amazed at how much better computers can do over today’s patient outcomes. We’ll be able to identify patterns and interactions among various areas of physiology in ways that weren’t possible before, making the link between changes in one area of the body causing symptoms in another area. Doctors will struggle to keep up, but then will increasingly rely on these tools to make decisions. Over time, they will increase their reliance on technology for triage, diagnosis, and decision-making, so that we’ll need fewer doctors and every patient will receive the best care. Diagnosis and treatment planning will be done by a computer, used in concert with the empathetic support from medical personnel selected and trained more for their caring personalities than for their diagnostic abilities. No brilliant diagnostician with bad manners, a la “Dr. House”, will be needed in direct patient contact. He can best serve as the trainer for the new “Dr. Algorithm”, which we’ll use to provide the diagnosis, while the most humane humans (nurse practitioners or other medical professionals) will provide the care.

Eventually, computers will model and track your entire health state. They’ll read your mood by analyzing your facial expression; gauge your social activity through the number of emails you sent, calls you made, or things you tweeted; track your mobility and activity from your GPS, as Ginger.io* does in assisting mental health patients, or with a Jawbone* UP band; and monitor your vitals through your food intake, galvanic skin response, heart rate, and skin temperature, among other things. No doctor could be this integrative. There are already numerous startups (and others soon to be launched) that plan to collect health data in a frictionless, easy way in order to create better baseline systems models of the body for patients. Others will do predictive analytics on that information and head off problems before they arise. Still more will suggest lifestyle approaches to improve the way people live. Some of this change is already happening, but this quiet rumbling of data-driven diagnostics will become an avalanche in the future, playing out similarly to the explosion of cellphones. Imagine today’s systems, improved 10X by a decade or two of evolution and competition. Nobody expected cell phones to take over India in 2000, but now in 2012, few people remember that cell phones weren’t expected to be universal (AT&T even killed off their mobile business in the 80s because McKinsey told them the total US market would be less than a million devices by 2000).

Device- and data-driven healthcare will also extend the reach of medicine. It will change public health, especially in the developing world. A country like India needs ten times as many doctors to serve everyone well, and that’s not affordable. Most doctors there don’t have access to the latest, expensive research journals and couldn’t assimilate all the information contained in them even if they had the time, patience, and inclination to read them. A mobile phone could provide the needed testing and diagnosis to the remotest villages and at very affordable costs. This type of care is just not possible using today’s medical school graduates, who tend to be clustered around cities.

Systems will start as clumsy toddlers and develop to maturity and efficiency!

Don’t expect ace diagnosis systems overnight. They may start as seemingly minor point innovations or as clumsy-sounding systems not ready for prime time.

Imagine using a device like the AliveCor* iPhone case to take an ECG after every workout. What about every time you feel lightheaded or numb? What about every single morning, just like diabetics who measure their blood sugar multiple times a day? Now what if you could get an ECG case for free and do measurements for less than $1/test? If you’re a heart patient, this device and others like it would capture a lot more information than your annual or semiannual ECG check at the doctor’s office. Not only that, the office check will cost hundreds to even thousands of dollars, and what’s more, you probably wouldn’t be exhibiting any symptoms during the in-person visit anyway if your condition is intermittent. What if you instead sent 500 ECGs to your doctor over the course of a year for less than it costs to get one ECG done in the hospital? What do you think the average physician would do with all that valuable data? He or she would have no clue what to do with it, which is why we’d need software to “auto-diagnose” the ECG. Today, most heart disease is identified only after patients have heart attacks. But imagine having preventative cardiac care, with every at-risk patient having an ECG every morning for less than a buck. After being trained to identify abnormalities, machine-learning software could predict episodes and flag the ten or so ECGs out of that set of 500 that the cardiologist should pay attention to (before reaching the point where computers can read an ECG themselves for pennies), simultaneously making his or her job easier AND more effective. We could discover most heart disease well before a heart attack or stroke and address it at a fraction of the cost of care that would be needed following such a trauma. But we need a decade of data to be really good at it.
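The triage step — surfacing the ten or so worrying traces out of 500 — can be sketched with a simple per-patient baseline anomaly score (a real system would run trained classifiers on the full waveform; the heart-rate feature, data, and z-score threshold here are illustrative assumptions):

```python
import statistics

def flag_anomalies(heart_rates, z_threshold=3.0):
    """Return indices of readings far outside the patient's own baseline."""
    mean = statistics.mean(heart_rates)
    stdev = statistics.stdev(heart_rates)
    return [i for i, hr in enumerate(heart_rates)
            if abs(hr - mean) / stdev > z_threshold]

# Invented daily resting-heart-rate readings: mostly ~62 bpm, three outliers.
readings = [62.0] * 497 + [110.0, 38.0, 95.0]
print(flag_anomalies(readings))  # → [497, 498, 499]
```

Only the flagged readings would reach the cardiologist; the other 99% never consume physician time, which is the cost structure the paragraph above argues for.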

Many dermatology appointments could be handled by CellScope* which produces low-cost iPhone attachments for imaging skin moles, rashes, ear infections, and (in the future) your retina or throat. The resulting images, taken at home, could be processed by sophisticated algorithms running in the cloud to detect patterns that warrant closer inspection (e.g. SkinVision uses the fractal nature of patterns in a skin image to determine more accurately than most general physicians whether you have skin cancer). You might get a diagnosis a lot faster than it takes to get a doctor’s appointment and then take your child to the clinic. And follow-up ‘visits’ could happen every six or twelve hours! A device like the Eyenetra* could give you an eye test and fit you for eyeglasses at little cost or hassle. Technology could handle diagnoses, lab orders, and writing prescriptions, asking for human assistance or input only when necessary.

Every metabolic process that has a volatile byproduct causes changes in your breath. Adamant* is a very risky startup that’s attempting to produce a chip that can detect hundreds of gases in your breath. If you’re asthmatic, it can measure the level of nitric oxide and predict whether you’re at risk of having an attack. If you’re diabetic and have ketones in your breath, it can detect that too and tell you that your body is undergoing ketosis. It can detect if you have lung cancer and even tell you what type of lung cancer. It will even detect whether you’re burning fat or sugars during exercise, because each results in different component concentrations in your breath. This little chip will do all this inexpensively, for far less than a big, expensive CT scanner that’ll just tell you that you have a nodule in your lungs but can’t tell you what kind of lung cancer you have. Eventually, you won’t have a doctor stare at you and tell you that you look well; instead, your doctor will be able to look at the levels of hundreds of compounds in your breath and know whether you’re well.

Speaking of looking well, a startup named Ginger.io* determines patients’ mental health based on a variety of metrics. It can monitor your rate of emailing, tweeting, texting, and calling to gauge your social activity. Using motion sensors and your phone’s GPS, it can even know if you’re hiding in your bedroom, eating in your kitchen, or just staying in bed. By watching for changes in your behavior, it can tell how you’re doing far better than a psychiatrist could possibly determine and actually calls your psychiatrist if you’re in the danger zone of an episode. For example, detecting a behavioral pattern change that’s indicative of bipolar disorder could help us prevent shooting sprees of the type we’ve seen recently.

There are many other startups doing innovative things in healthcare. Proteus is helping address drug non-compliance, one of the biggest problems in medicine. They’ve designed a clever system that combines a pill sensor, a body patch, and a mobile app. They attach a tiny, ingestible sensor to pills that gets activated by stomach acids. When a patient takes the pill, the sensor sends a signal to the body patch, which then relays the signal to the app. This system will allow caregivers to remotely monitor patient adherence by individual, time-stamped pill consumption events. This is a far better solution than having your doctor base a diagnosis on the one-time blood test done in the clinic. Empatica uses sensors on patients’ wrists to measure bio-signals that correlate with emotion. Imagine having a continuous and, more importantly, accurate and objective measurement of your emotions for a month instead of only the latest, biased description that you might give (or forget to give) to your hurried physician. Several companies are also improving remote monitoring and diagnosis in the clinical setting. AirStrip Technologies provides real-time vital signs to physicians’ mobile devices. Sotera Wireless provides a battery-powered mobile device for monitoring vital signs. Agile Diagnosis and Lifecom are improving clinical decision-making by providing decision trees with probability-based outcomes for physicians at the point of care. These and other startups are forcing us to rethink healthcare from diagnostics to treatment.
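The adherence monitoring that a system like Proteus enables reduces, at its simplest, to comparing time-stamped ingestion events against the prescribed schedule; a minimal sketch under the assumption of a once-daily prescription (the event data are invented):

```python
from datetime import date, timedelta

def adherence_rate(ingestion_dates, start, days):
    """Fraction of scheduled once-daily doses confirmed by a sensor event."""
    scheduled = {start + timedelta(days=i) for i in range(days)}
    taken = scheduled & set(ingestion_dates)
    return len(taken) / days

# Invented week of sensor-confirmed doses; the dose on Jan 4 was missed.
events = [date(2013, 1, d) for d in (1, 2, 3, 5, 6, 7)]
print(round(adherence_rate(events, date(2013, 1, 1), 7), 2))  # → 0.86
```

A continuous per-dose record like this is what replaces the one-time clinic blood test the paragraph contrasts it with.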

This is only the beginning of the many generations of improvement that are likely to happen in the next two decades, but we have already begun to see meaningful impacts on healthcare. Studies have demonstrated the ability of computerized clinical decision support systems to lower diagnostic errors of omission significantly, directly countering the ‘premature closure’ cognitive bias. Isabel is a differential diagnosis tool and, according to a Stony Brook study, matched the diagnoses of experienced clinicians in 74% of complex cases. The system improved to a 95% match after a more rigorous entry of patient data. Even IBM’s fancy Watson computer, after beating all humans at the very human intelligence-based task of playing Jeopardy, is now turning its attention to medical diagnosis. It can process natural language questions and is fast at parsing high volumes of medical information, reading and understanding 200 million pages of text in 3 seconds.

Right now, Isabel and Watson require physicians to ask follow-up questions of the patient, a point of inefficiency and potential cognitive bias introduction. But, Lifecom has the medical text-parsing, knowledge-base generation, and runtime diagnostic capabilities of Watson, while also being able to automatically propose a follow-on question or test to eliminate candidates from the differential diagnosis list. This accelerates the diagnostic process and lessens biases introduced by the human element of question-framing. Watson also relies on machine-learned, purely statistical relationships among symptoms, findings, and causes, whereas a system like Lifecom adds to that ability the use of a medical ontology, the web of physiological, anatomical, and other concepts, as well as their interrelationships. It’s too early to tell which approach will work better in the long term, but the point is that these systems will continue to evolve and, eventually, won’t need physicians as intermediaries.
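The follow-on question selection described above can be sketched as a greedy search for the yes/no question that best splits the remaining candidate diagnoses (the disease-symptom table is invented; real systems would weigh probabilities, test costs, and an ontology rather than a flat table):

```python
# Greedy narrowing of a differential diagnosis; the symptom table is invented.
profiles = {
    "flu":       {"fever", "cough", "aches"},
    "cold":      {"cough", "congestion"},
    "strep":     {"fever", "sore_throat"},
    "allergies": {"congestion", "itchy_eyes"},
}

def best_question(candidates):
    """Pick the symptom whose yes/no answer splits the candidates most evenly."""
    symptoms = set().union(*(profiles[c] for c in candidates))
    return min(symptoms,
               key=lambda s: abs(sum(s in profiles[c] for c in candidates)
                                 - len(candidates) / 2))

def prune(candidates, symptom, answer):
    """Keep only candidates consistent with the patient's yes/no answer."""
    return [c for c in candidates if (symptom in profiles[c]) == answer]

candidates = list(profiles)
while len(candidates) > 1:
    q = best_question(candidates)
    # A real system asks the patient; here, assume the patient actually has flu.
    candidates = prune(candidates, q, q in profiles["flu"])
print(candidates)  # → ['flu']
```

Each question halves the candidate list where possible, which is the bias-free elimination of differential-diagnosis candidates the paragraph credits to systems like Lifecom.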

In the beginning, these point innovations will seem immaterial, but, when there are enough of them, they will integrate with each other and start to feel like a revolution. The medical devices and software systems of 2020 will be as different from today’s computers as the car floor-mounted, multi-pound cell phones with bulky handset cords of 1986 are from today’s iPhones!

Digital first-aid kits

The confluence of all these different sensing, monitoring, and communication technologies will naturally lead to the creation of ‘digital first aid kits’ that cost < $100 and can be used at home between doctor visits. These kits will include mobile apps to help patients determine how serious a new medical problem might be, as well as monitoring devices that can track and analyze blood pressure, A1c levels, ECG waveforms, blood oxygen levels, skin/ear/ENT conditions, social interaction, and other indicators of how well patients are managing their diabetes, asthma, depression, and other chronic conditions. Clinical staff will help train patients on these devices and apps, and, using software, they’ll help them interpret health trend data and provide recommendations during return visits.

Healthcare service stations of the future

The emergence of healthcare service stations, found in your local pharmacy or supermarket, will be another exciting phenomenon resulting from the proliferation of devices and data. These walk-in stations will be convenient — 75% of the US population lives within a 5-minute drive of a Walgreens. Right now, you can go to most of these stores and get some basic advice, receive a flu shot, or have your eyes checked, but not much else. In the near future, new technologies will enable many more services, providing high-quality care at low cost. In these centers, pharmacists and nurse practitioners will take patient histories, do simple exams, and prescribe medications or follow-on tests, aided by computerized diagnostics and telemedicine well before 2025.

Patients will be able to walk into these advanced healthcare stations without appointments and give their medical updates to personally-chosen avatars on private screens. These avatars will serve as healthcare concierges, elucidating symptoms, wirelessly uploading data collected by sensors on the patient’s phone, suggesting tests (to be done by the patient using easy-to-use tools on-site or at home), and giving prescriptions through a clinical support system, whose ability to diagnose and recommend effective treatments will have been validated against the best general practitioners and specialists. Software will help RNs diagnose illnesses, recommend effective treatment options, and refer patients to specialists when necessary. In ten years, it’s likely that genetic testing will be routine, extending diagnostic capability and allowing treatment selection based on genotype. Nurse practitioners will perform exams and take image scans using inexpensive, disposable tools (some of which might be part of the digital first-aid kit). Images will be analyzed in real-time by diagnostic software and transmitted to an on-call specialist, who pulls up patient information, stored in the cloud, and connects with the NP and patient using telepresence systems. Assessment and treatment apps will be prescribed, downloaded, and installed on patients’ phones, just as easily as prescriptions for cold medicine. And avatars will be accessible while patients are at home, providing real continuity of care. Ultimately, with ubiquitous walk-in clinics in the retail setting driven by powerful decision-support software, the distribution of healthcare professionals will change in parallel with the distribution channel. The pyramid will flatten, with many more RN-level professionals trained to effectively leverage software and interact with patients, and many fewer expensive MDs specialized to handle what will become the long tail of care.

The human element

Some of the critics of more automated healthcare argue that medicine isn’t just about inputting symptoms and receiving a diagnosis; it’s about building personal relationships of trust between providers and patients. The move towards sensor-based, data-driven healthcare doesn’t mean that we’ll get rid of human interaction. Serotonin administered by comforting humans and the placebo effect are only a few of many ways that complex interactions help recovery, and we shouldn’t lose these tools (instead, let’s understand them better, quantify them, and amplify them). Providing good bedside manner, giving comfort, and answering certain types of questions can often be handled better by a person than a machine, but you don’t need a medical degree to do that (except in some specialty areas like surgery or research). Nurses, nurse practitioners, social workers, and other types of less expensive, non-MD caregivers could do this just as well as doctors (if not better) and spend more time providing personal, compassionate care. For a while, surgery and other procedures may require human doctors with greater knowledge than a nurse practitioner. The point is that not every function doctors do will be replaced, but the majority will be done in less time and with greater efficacy over the next two decades. At the other end, some surgical procedures requiring extreme precision (e.g. tumor irradiation) are best handled by surgical robots or robots and humans working in concert.

Consider hospital discharge for a moment. You might think this is a function that could (or should) only be performed by a human being. Interestingly, a Boston Medical Center study showed that patients preferred receiving discharge instructions from a computer instead of a human and appreciated the amount of time and information provided by the computer. Additionally, patients with lower health literacy reported a significantly greater bond with the computer compared to patients with higher health literacy, supporting the notion that computers can provide adequate emotional support in certain circumstances.

I’m not advocating the removal of the human front-end to patient care. I’m arguing that we should focus on building robust back-end sensor technology and diagnostics through sophisticated machine learning and artificial intelligence operating on clinical and non-clinical data in greater volumes than humans can handle, in order to create a much more comprehensive understanding of patients. What’s more, these combined hardware/software systems will do better at follow-ups than over-burdened, time-constrained doctors. They’ll recognize and flag exceptions in ECGs, respiration, movement, or anything else for caregivers to respond to in more targeted ways, actually improving provider-patient interactions and health outcomes.

A transition to automation has already happened in several other areas where we once thought human judgment was required. When you’re on board that cross-country flight from New York to Los Angeles, most of the flying is being done by autopilot, not by a human (though a human is there for certain emergency situations). Investing was long considered a unique bastion of human judgment. Algorithmic trading now drives the vast majority of volume in the stock markets, with computers long ago replacing gesturing traders wearing funny-looking coats in the stock pits. Cars have been parking themselves and avoiding accidents using lane-assist for some time now, and Google already has a self-driving car that’s had zero accidents driving more than 300,000 miles on normal streets (a much harder problem than automating many physician functions). Most humans couldn’t drive that much without at least hitting a curb or two. Within a few decades, vehicles with steering wheels will become as quaint a concept as hand-cranked engines. David Cope at the University of California, Santa Cruz has even developed software whose compositions have matched original Bach, and a music professor composing in the Bach tradition, in the music’s “Bachness” — to the chagrin of many. The same mental shift about human involvement and its gradual replacement by computers will also happen in healthcare, creating a much more comprehensive understanding of patients and improving provider-patient interactions and health outcomes with more personalized treatment. In the end, physicians will be able to spend more, higher-quality time with each patient, as if they had only 300 to manage rather than 3,000 (though they will be able to manage 10,000+ patients with computer assistance)!
Caregivers will actually have MORE time to spend talking to their patients, making sure they understand, socializing care, and finding out the harder-to-measure pieces of information from patients because they will be spending a lot less time gathering data and referring to old notes. And they will be able to handle many more patients, reducing costs in the system.

The source of healthcare innovation

Where will all this innovation in healthcare come from? Some believe we have to work within the constraints of the medical establishment in order to advance it. I disagree. Some will follow reluctantly, and some will try to lead, but most organizations in traditional healthcare will fight this trend towards reduced costs because it reduces profits. That’s why the system will most likely be disrupted by outsiders. Land-line phone call rates didn’t decline until mobile operators changed the rules of what a phone call should cost. Remember how expensive long distance calling was not very long ago?

Innovation seldom happens from the inside because existing incentives are usually set up to discourage disruption, and doctors and hospitals are all invested in doing things the same way. If a hospital could cure you in half the time, would they be willing to cut their business in half? Pharma companies push marginally different drugs instead of generic solutions that may actually be better for patients because they don’t necessarily want to cure you; they want you to be a drug subscriber and generate recurring revenue for as long as possible. They’d rather sell you a cholesterol-lowering drug than encourage you to eat healthier and reduce their own profits. If they permanently reduced your cholesterol, they would lose a customer! Psychiatrists in general have the same problem with incentives. They don’t get paid to cure you; they get paid to treat you over multiple expensive therapy sessions that never end. What would you rather do as a therapist: have a steady stream of repeat patients that fill your hours or have to attract new patients? The former has minimal patient acquisition costs. Medical device manufacturers, like those that build and sell huge scanning systems, don’t want to cannibalize sales of their expensive equipment by providing cheaper, more accessible monitoring devices like a $29* ECG machine. The traditional players will lobby/goad/pay/intimidate doctors and regulators to reject these new devices, emptily claiming that they aren’t “as good” and provide only 90% of the functionality (but at 5% of the cost). Expecting the medical establishment to do anything different is like expecting them to reduce their own profits. To be fair, these are generalizations and there are many great doctors and many ethical organizations and people. The point is that the incentives in healthcare make innovation from within extremely unlikely. Fortunately, it doesn’t matter if the establishment tries to do this or not, because it will happen regardless.
And it may start at the periphery, e.g. with the 40 million uninsured people in this country or the hundreds of millions of people in India with no access to any doctor. There aren’t enough rural doctors in India and few of them have access to the New England Journal of Medicine or a CT scanner or even reliable electricity. But, most potential patients have cell phones. This shift in how healthcare is delivered will allow for less money to be spent on capital equipment, cutting health care costs. It will allow us to provide care to those who don’t have it now. It will help avoid errors and provide basic services to those who cannot afford full healthcare services. And, it will prevent simple things from getting worse before being addressed.

There has been much ado in the blogs about how Silicon Valley and outsiders don’t understand healthcare and hence should not or cannot try to understand and innovate in it. As I explained above, and granting the rare exceptions, it is hard for insiders to innovate within a system, at least when it comes to radical innovation. That’s not to say these Silicon Valley and other outsider “innovators” won’t leverage the system or have partners and doctors from inside helping them. Lifecom’s CEO is a trauma surgeon who teaches surgery and critical care on faculty. One of Proteus’ founders is an MD, as is one of the founders of AirStrip Technologies. IBM Watson is working with established medical institutions, and many others work with clinical partners in testing and studying their ideas and technologies. Most startups we are funding have MDs on their team and collaborate with other healthcare partners.

The reality is that healthcare has to move in this direction in order to make it affordable to everyone. There are many arguments and challenging questions in the blogs about this point of view. Some have answers, many of which reflect naivety about how technology increments and evolves, and many questions and criticisms don’t have answers. But, just because an answer doesn’t exist today does not mean that it won’t be found or that we won’t find workarounds. Some things will come as tradeoffs to make healthcare more affordable. My guesstimates will be wrong on many counts as new technologies and approaches, and sometimes unforeseen problems, emerge. Many comments have come from good and passionate doctors (there are plenty of them around) and bloggers who have health insurance and can afford good care. I personally worry about the bottom half of doctors globally who are too rushed, too overburdened, too mercenary, or too out of date with their education, especially in the developing world.

Entrepreneurs can come at these challenges from outside or inside the system and inject new insight. They can ask naïve questions that get at the heart of assumptions that may be both pervasive and unperceived. They can leverage the many insiders at the right time to provide real understanding of medicine. They can build smart computers to be objective cost minimizers WHILE being care optimizers. Domain expertise can have a place, and the smartest doctors aren’t outraged at this idea (just the ones with knee-jerk reactions). People always react against technological progress, and many don’t have the imagination to see how the world is changing. Eric Topol (author of “The Creative Destruction of Medicine”) and Dr. Daniel Kraft have called for a data-driven approach to healthcare and are examples of insiders who think like outsiders. There’s no question that many naïve innovators from outside the system, maybe even 90% of them, will attempt this change and fail. But, a few of these outsiders will succeed and change the system, getting the appropriate help from insiders and leveraging their expertise. And there will be many good doctors willing to assist in this transition.

This evolution from an entirely human-based to an increasingly automated healthcare system will take time, and there are many ways in which it can happen, but it won’t take as long as people think. The move will happen in fits and starts along different pathways, with many course corrections, steps backward, and mistakes as we figure out the best approach forward. It’s impossible to predict how this will ultimately happen. It may be the case that all significant efforts will have to be catalyzed by outsiders. The healthcare system might actually start responding to these threats from the inside and change as a result. Maybe we’ll start seeing disruption at the fringes along slippery but shallow slopes. The transition could start as a hundred small changes in different areas of medicine and in different ways, ending with an overhaul of healthcare that takes place over a couple decades. During all this, many or most in this effort will fail, but a few will succeed and change the world. For those of us who support entrepreneurs and companies that help create this change, most investments will be lost but more money will be made than lost through the few successes. None of us knows for sure how this space will turn out, but there’s a huge opportunity for technologists, entrepreneurs, and other forward-thinkers to reduce healthcare expenditures and improve patient care at the very same time.

* A Khosla Ventures investment

Graphene Becomes Magnetic for First Time

Reporter: Aviva Lev-Ari, PhD, RN

 

See on Scoop.itAmazing Science

Researchers from both the Universidad Complutense de Madrid and the Universidad Autónoma, working together at the IMDEA-Nanociencia Institute in Spain, have for the first time given graphene magnetic properties, opening up the potential for the material to find new applications in future spintronic devices. Unlike electronics, in which an electron’s charge-carrying capability is exploited to create circuits, spintronics exploits the quantum mechanical spin of electrons, which creates a magnetic moment that makes each electron behave like a tiny magnet. In the presence of a magnetic field, the spin of an electron aligns either parallel or antiparallel to the field, and this positioning can be translated into a binary signal.

 

The trials and tribulations of trying to make graphene applicable to electronics, despite its lack of an inherent band gap, have been well documented. However, what many have overlooked in the quest to bring graphene to electronics is that it doesn’t really lend itself very well to spintronics either. Since 2007, researchers have looked at graphene as the material for channels in spintronic devices, a function at which it appears to excel. In fact, just this year record distances were achieved for carrying information using the spin of electrons.

 

Unfortunately, when two-dimensional graphene is laid out flat, the motion of electrons moving through the material doesn’t influence the spin of other electrons that they pass. Instead, the direction and the spin of electrons remain random rather than patterned. More than two years ago, researchers at the University of Copenhagen discovered that all of that changed if you curved the graphene into a cylinder. In that shape, the movement of electrons did influence the spin of other electrons, opening the door to their potential in spintronics.

 

In order for a material to have magnetic properties, a majority of the electrons in the material must be spinning in the same direction. Despite the work of the Copenhagen researchers and many others, it has remained a challenge to get graphene’s electrons to spin in the same direction instead of just randomly. But the Spanish researchers believe they have accomplished it.

 

“In spite of the huge efforts to date of scientists all over the world, it has not been possible to add the magnetic properties required to develop graphene-based spintronics. However these results pave the way to this possibility,” says Prof. Rodolfo Miranda, Director of IMDEA-Nanociencia, in a press release.

 

For the research, which was published in the journal Nature Physics (“Long-range magnetic order in a purely organic 2D layer adsorbed on epitaxial graphene”), the team first grew an ultra-pure graphene film over a crystal inside a vacuum. While still in the vacuum, the researchers evaporated molecules of a semiconductor onto the graphene’s surface. When they observed the material with a scanning tunneling microscope, they were surprised to discover that the semiconductor molecules were organized and regularly distributed across the surface of the graphene and its crystal substrate.

 

Since spintronics hasn’t really progressed into devices beyond hard-disk drives, the ability to give graphene magnetic properties likely won’t bring spintronic devices into other applications any sooner. But these kinds of breakthroughs do have a way of opening up unexpected possibilities.

See on spectrum.ieee.org

133 Lectures about the Foundations of Modern Physics (Stanford Courses – Prof. Leonard Susskind)

Reporter: Aviva Lev-Ari, PhD, RN

 

See on Scoop.itAmazing Science

Free video course on Foundations of Modern Physics by Leonard Susskind of Stanford. This Stanford Continuing Studies course is a six-quarter sequence of classes exploring the essential theoretical foundations of modern physics.

 

The topics covered in this course focus on classical mechanics, quantum mechanics, the general and special theories of relativity, electromagnetism, cosmology, black holes, and statistical mechanics. While these courses build upon one another, each section of the course also stands on its own, and both individually and collectively they will allow the students to attain the “theoretical minimum” for thinking intelligently about physics. Quantum theory governs the universe at its most basic level. In the first half of the 20th century, physics was turned on its head by the radical discoveries of Max Planck, Albert Einstein, Niels Bohr, Werner Heisenberg, and Erwin Schroedinger. An entire new logical and mathematical foundation – quantum mechanics – eventually replaced classical physics. This course explores the quantum world, including the particle theory of light, the Heisenberg Uncertainty Principle, and the Schroedinger Equation. The course is taught by Leonard Susskind, the Felix Bloch Professor of Physics at Stanford University.

 

Here is a comprehensive listing of all lectures from Dr. Susskind:

 

http://www.youtube.com/view_play_list?p=189C0DCE90CB6D81
http://www.youtube.com/playlist?list=PLA27CEA1B8B27EB67
http://www.youtube.com/playlist?list=PL5F9D6DB4231291BE
http://www.youtube.com/view_play_list?p=84C10A9CB1D13841
http://www.youtube.com/view_play_list?p=CCD6C043FEC59772
http://www.youtube.com/view_play_list?p=6C8BDEEBA6BDC78D
http://www.youtube.com/view_play_list?p=F363FFF951EC0673
http://www.youtube.com/view_play_list?p=B72416C707D85AB0
http://www.youtube.com/view_play_list?p=888811AA667C942F
http://www.youtube.com/playlist?list=PL8BCB4981DD1A0108
http://www.youtube.com/playlist?list=PLA2FDCCBC7956448F
http://www.youtube.com/playlist?list=PL3E633552E58EB230
http://www.youtube.com/playlist?list=PL47F408D36D4CF129
http://www.youtube.com/playlist?list=PL701CD168D02FF56F

 

http://glenmartin.wordpress.com/home/leonard-susskinds-online-lectures/

See on www.academicearth.org

What Most Schools Don’t Teach – How To Code

Reporter: Aviva Lev-Ari, PhD, RN

 

 

See on Scoop.itCardiovascular and vascular imaging

Learn about a new “superpower” that isn’t being taught in 90% of US schools.

Starring Bill Gates, Mark Zuckerberg, will.i.am, Chris Bosh, Jack Dorsey, Tony Hsieh, Drew Houston, Gabe Newell, Ruchi Sanghvi, Elena Silenok, Vanessa Hurst, and Hadi Partovi.

Directed by Leslie Chilcott. Executive producers Hadi and Ali Partovi.

 

See on www.youtube.com

Metabolomics: its Applications in Food and Nutrition Research

Reporter and Curator: Sudipta Saha, Ph.D.

 

Metabolomics is a relatively new field of “omics” research concerned with the high-throughput identification and quantification of small molecule (<1500 Da) metabolites in the metabolome. The metabolome is formally defined as the collection of all small molecule metabolites or chemicals that can be found in a cell, organ or organism. These small molecules can include a range of endogenous and exogenous chemical entities such as peptides, amino acids, nucleic acids, carbohydrates, organic acids, vitamins, polyphenols, alkaloids, minerals and just about any other chemical that can be used, ingested or synthesized by a given cell or organism.

Metabolomics is ideally positioned to be used in many areas of food science and nutrition research, including food component analysis, food quality/authenticity assessment, food consumption monitoring, and physiological monitoring in food intervention studies. However, the potential impact of metabolomics is still limited by two factors: (1) technology and (2) databases. In terms of instrumentation, it is clear that significant improvements need to be made to make metabolite detection and quantification technology more robust, automated, and comprehensive. While promising advances have been made, current techniques are only capable of detecting perhaps one tenth of the relevant metabolome. Expanding the breadth and depth of coverage is particularly important in food and nutrition studies.

Many more reference spectral or chromatographic databases on metabolites, food components and phytochemicals need to be developed and made public. It is only through these databases that nutritionally relevant compounds can be routinely identified or quantified. Indeed a comprehensive effort, similar to that undertaken to annotate the human metabolome, needs to be made to complete and annotate the “food metabolome”. Similar efforts also need to be directed towards creating publicly accessible, comprehensive nutritional phenotype databases that include quantitative metabolomic (and other omic) data collected from diet-challenge or food intervention experiments. While these kinds of endeavours may take years to complete and cost millions of dollars, hopefully the food science community (and its funding agencies) will find a way of coordinating its activities to complete these efforts. Indeed, having a public resource like a food metabolome database or a nutritional phenotype database could be as valuable to food scientists as GenBank has been to molecular biologists.

Source References:

http://www.sciencedirect.com/science/article/pii/S0924224408000770

http://www.sciencedirect.com/science/article/pii/B9780123945983000010

http://www.sciencedirect.com/science/article/pii/S092422440900226X

http://www.sciencedirect.com/science/article/pii/S1359644605036093

http://www.sciencedirect.com/science/article/pii/B9780080885049000520

http://www.sciencedirect.com/science/article/pii/B9780123744135000051

Other articles related to this topic were published on this Open Access Online Scientific Journal, including the following:

Ca2+ signaling: transcriptional control

Larry H. Bernstein, MD, FCAP, Reporter, RN 03/06/2013

http://pharmaceuticalintelligence.com/2013/03/06/ca2-signaling-transcriptional-control/

Harnessing Personalized Medicine for Cancer Management, Prospects of Prevention and Cure: Opinions of Cancer Scientific Leaders @ http://pharmaceuticalintelligence.com

Aviva Lev-Ari, PhD, RN 01/12/2013

http://pharmaceuticalintelligence.com/2013/01/12/harnessing-personalized-medicine-for-cancer-management-prospects-of-prevention-and-cure-opinions-of-cancer-scientific-leaders-httppharmaceuticalintelligence-com/

Breakthrough Digestive Disorders Research: Conditions affecting the Gastrointestinal Tract.

Aviva Lev-Ari, PhD, RN 12/12/2012

http://pharmaceuticalintelligence.com/2012/12/12/breakthrough-digestive-disorders-research-conditions-affecting-the-gastrointestinal-tract/

A Second Look at the Transthyretin Nutrition Inflammatory Conundrum

Larry H. Bernstein, MD, FCAP, Reporter, RN 12/03/2012

http://pharmaceuticalintelligence.com/2012/12/03/a-second-look-at-the-transthyretin-nutrition-inflammatory-conundrum/

Metabolic drivers in aggressive brain tumors

Prabodh Kandala, PhD, RN 11/11/2012

http://pharmaceuticalintelligence.com/2012/11/11/metabolic-drivers-in-aggressive-brain-tumors/

Metabolite Identification Combining Genetic and Metabolic Information: Genetic association links unknown metabolites to functionally related genes

Aviva Lev-Ari, PhD, RN 10/22/2012

http://pharmaceuticalintelligence.com/2012/10/22/metabolite-identification-combining-genetic-and-metabolic-information-genetic-association-links-unknown-metabolites-to-functionally-related-genes/

Advances in Separations Technology for the “OMICs” and Clarification of Therapeutic Targets

Larry H. Bernstein, MD, FCAP, Reporter, RN 10/22/2012

http://pharmaceuticalintelligence.com/2012/10/22/advances-in-separations-technology-for-the-omics-and-clarification-of-therapeutic-targets/

Expanding the Genetic Alphabet and linking the genome to the metabolome

Larry H. Bernstein, MD, FCAP, Reporter, RN 09/24/2012

http://pharmaceuticalintelligence.com/2012/09/24/expanding-the-genetic-alphabet-and-linking-the-genome-to-the-metabolome/

Therapeutic Targets for Diabetes and Related Metabolic Disorders

Aviva Lev-Ari, PhD, RN 08/20/2012

http://pharmaceuticalintelligence.com/2012/08/20/therapeutic-targets-for-diabetes-and-related-metabolic-disorders/

The Automated Second Opinion Generator

Larry H. Bernstein, MD, FCAP, Reporter, RN 08/13/2012

http://pharmaceuticalintelligence.com/2012/08/13/the-automated-second-opinion-generator/

 

Pros and Cons of Drug Stabilizers for Arterial Elasticity as an Alternative or Adjunct to Diuretics and Vasodilators in the Management of Hypertension.

Author, and Content Consultant to e-SERIES A: Cardiovascular Diseases: Justin Pearlman, MD, PhD, FACC

and

Article Curator: Aviva Lev-Ari, PhD, RN

This article presents the 2013 Thought Frontier on Hypertension and Vascular Compliance.

Conceptual development of the subject is presented in the following nine parts:

1. Physiology of Circulation and Role of Arterial Elasticity

2. Isolated Systolic Hypertension caused by Arterial Stiffening may be inadequately treated by Diuretics or Vasodilatation Antihypertensive Medications

3. Physiology of Circulation and Compensatory Mechanism of Arterial Elasticity

4. Vascular Compliance – The Potential for Novel Therapies

  • Novel Mechanism for Disease Etiology: Modulation of Nuclear and Cytoskeletal Actin Polymerization
  • Genetic Therapy targeting Vascular Conductivity
  • Regenerative Medicine for Vasculature Function Protection

5. In addition to curtailing high pressures, stabilizing BP variability is a potential target for management of hypertension

6. Mathematical Modeling: Arterial stiffening explains much of primary hypertension

7. Classification of Blood Pressure and Hypertensive Treatment Best Practice of Care in the US

8. Genetic Risk for High Blood Pressure

9. Is it Hypertension or Physical Inactivity: Cardiovascular Risk and Mortality – New results in 3/2013

Summary By Justin D. Pearlman MD ME PhD MA FACC

1. Physiology of Circulation and Role of Arterial Elasticity

  • Simplistically, high blood pressure stems from too much volume (salt water) for the vascular space, or conversely, too little space for the volume. Biological signals, such as endothelin, hypoxia, acidosis, and nitric oxide, can modify the vascular space by constricting or relaxing the muscles in blood vessel walls. Less simplistically, the physics of circulation is governed by numerous factors, with essentials detailed below.
  • The vascular space has two major circuits: pulmonary (lungs) and systemic (body).
  • Compliance (C) relates change in volume (ΔV) to change in pressure (ΔP) as a measure of the strength of elasticity, where elasticity summarizes the intrinsic forces that return a vessel to its original shape after deformation: C = ΔV/ΔP. These values can be estimated by ultrasound imaging with Doppler blood velocity estimation, by MRI, or invasively. Related properties can also be measured, such as wave propagation time or fractional flow reserve.
  • The vascular system is dynamic, with frequency components and reactive elements. The fundamental frequency is governed by the heart rate delivering a stroke volume forward into the vasculature; a heart rate of 60/minute corresponds to the frequency of 1 Hertz (1 cycle/second). The pressure rise due to the ejection of stroke volume is called the pulse pressure.
  • Numerous factors affect blood flow, including blood composition (affected by anemia or blood dilution), leakiness of vessels, elasticity, wave propagation, streamlines, viscosity, and osmotic pressure (affected by protein deficiency and other factors).
  • In a static system, the driving force relates linearly to flow by way of resistance (R, in units of dyn·s·cm−5), in analogy to Ohm’s law V = IR; vascular resistance is computed as:
    • Pulmonary: 80 × (mean pulmonary arterial pressure − mean pulmonary artery wedge pressure) / cardiac output
    • Systemic: 80 × (mean arterial pressure − mean right atrial pressure) / cardiac output
  • In a dynamic, reactive system, the relation between the driving potential (pressure gradient), and current (blood flow) is governed by a differential equation. However, use of complex numbers and exponentials recovers simplicity similar to Ohm’s law:
    • Variables take the form Ae^{st}, where t is time, s is a complex parameter, and A is a complex scalar. Complex values simply mean two dimensional, e.g., magnitude (as in resistance) plus phase shift (to account for reactive components).
    • Complex version of Ohm’s law: \boldsymbol{V} = \boldsymbol{I} \cdot \boldsymbol{Z}, where V and I are the complex representations of the voltage and current, respectively, and Z is the complex impedance.
    • Frequency dependent “resistance” is captured by the term impedance.
  • Breathing in increases the return of blood to the heart, adding to pulse variation.
  • Dynamic arterial elastance (Eadyn) relates pulse pressure variation (PPV) to stroke volume variation (SVV): Eadyn = PPV/SVV
    • PPV(%) = 100% × (PPmax − PPmin)/[(PPmax + PPmin)/2]
      • where PPmax and PPmin are the maximum and minimum pulse pressures determined during a single respiratory cycle
    • SVV(%) = 100% × [(SVmax − SVmin)/SVmean]
      • where SVmax, SVmin, and SVmean are the maximum, minimum, and mean stroke volumes during a single respiratory cycle
  • The nervous system provides both stimulants and inhibitors (sympathetic and vagal nerves) to regulate blood vessel wall muscle tone and also heart rate. Many medications, and anesthetic agents in particular, reduce those responses to stimuli, so the vessels dilate, vascular impedance lowers, pressures drop, and autoregulation is impaired.
  • Diuretics aim to decrease volume of circulating fluid, vasodilators aim to increase the vascular space, and elasticity treatments will aim to preserve or improve the ability to accommodate changes in volume of fluid.
    • Vessel dilation near the skin promotes heat loss.
  • Vascular elasticity is impaired by atherosclerosis, menopause, and endothelial dysfunction (impaired nitric oxide response, impaired endothelin response).
  • Elastance in a cyclic pressure system of systole-diastole (contraction-relaxation) presents impedance as a pulsatile load on the heart. Inotropy describes the generation of pressure by cardiac contraction; lusitropy describes the compliance of the heart to accept filling with minimal back pressure to the lungs. Chronic exposure to elevated vascular impedance leads to impairment of lusitropy (diastolic failure, stiff heart) and of inotropy (systolic failure, weak heart).
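The static and dynamic formulas above can be collected into a short sketch. The function names and all numeric inputs below are invented example values, not reference hemodynamics:

```python
# Hemodynamic calculations following the formulas in the list above.
# Inputs are illustrative values, not patient data.

def vascular_resistance(mean_upstream, mean_downstream, cardiac_output):
    """Resistance in dyn·s·cm−5: 80 × (pressure gradient, mmHg) / CO (L/min)."""
    return 80.0 * (mean_upstream - mean_downstream) / cardiac_output

def pulse_pressure_variation(pp_max, pp_min):
    """PPV(%) over one respiratory cycle, normalized to the mean pulse pressure."""
    return 100.0 * (pp_max - pp_min) / ((pp_max + pp_min) / 2.0)

def stroke_volume_variation(sv_max, sv_min, sv_mean):
    """SVV(%) over one respiratory cycle."""
    return 100.0 * (sv_max - sv_min) / sv_mean

# Systemic example: MAP 93 mmHg, mean right atrial pressure 5 mmHg, CO 5 L/min
svr = vascular_resistance(93, 5, 5.0)        # 1408 dyn·s·cm−5
# Pulmonary example: mPAP 15 mmHg, wedge pressure 8 mmHg, CO 5 L/min
pvr = vascular_resistance(15, 8, 5.0)        # 112 dyn·s·cm−5

ppv = pulse_pressure_variation(48, 40)       # ≈ 18.2 %
svv = stroke_volume_variation(75, 65, 70)    # ≈ 14.3 %
eadyn = ppv / svv                            # dynamic elastance, dimensionless
```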

2.      Isolated Systolic Hypertension caused by Arterial Stiffening may be inadequately treated by Diuretics or Vasodilating Antihypertensive Medications

3. Physiology of Circulation and Compensatory Mechanism of Arterial Elasticity

Antihypertensive agents have focused on the following approaches:

  1. Hydrochlorothiazide (HCTZ) – The most commonly prescribed agent, this mild diuretic is known to improve blood vessel compliance by reducing cell turgor, which explains why its full onset of benefit, as well as its slow offset when stopped, can take more than one month.
  2. Chlorthalidone – Some evidence suggests that chlorthalidone may be superior to hydrochlorothiazide for the treatment of hypertension. However, a recent study concluded that chlorthalidone in older adults was not associated with fewer adverse cardiovascular events or deaths than hydrochlorothiazide, and that it was associated with a greater incidence of electrolyte abnormalities, particularly hypokalemia.
  • Increased vascular space (vasodilation)

    • Alternatively, the pressure can be lowered by increasing the vascular space for a given vascular volume. Examples of mediators for arterial tone (degree of dilation) include nitric oxide, prostacyclin and endothelin.

 

Classes of vasodilation, by mechanism:

  • Hyperpolarization mediated (calcium channel blocker): Changes in the resting membrane potential of the cell affect the level of intracellular calcium through modulation of voltage-sensitive calcium channels in the plasma membrane.
  • cAMP mediated: Adrenergic stimulation results in elevated levels of cAMP and protein kinase A, which increases calcium removal from the cytoplasm.
  • cGMP mediated (nitrovasodilator): Acts through stimulation of protein kinase G. Not until 2002 was the enzyme for this conversion identified as mitochondrial aldehyde dehydrogenase. Proc. Natl. Acad. Sci. USA 102 (34): 12159–12164. doi:10.1073/pnas.0503723102 http://www.pnas.org/content/102/34/12159.long

Examples, by class:

  • Hyperpolarization mediated (calcium channel blocker): adenosine, amlodipine (Norvasc), diltiazem (Cardizem, Dilacor XR), and nifedipine (Adalat, Procardia)
  • cAMP mediated: prostacyclin
  • cGMP mediated (nitrovasodilator): nitric oxide
  • Reduced pulsatile force (beta blockers)

These work by blocking certain nerve and hormonal signals to the heart and blood vessels, thus lowering blood pressure. Frequently prescribed beta blockers include

  • metoprolol (Lopressor, Toprol XL)
  • carvedilol (Coreg)
  • nadolol (Corgard)
  • penbutolol (Levatol).
  • Nebivolol, after metabolism, increases vascular NO production via endothelial β2-adrenergic receptor ligation, with a subsequent rise in endothelial free [Ca2+]i and endothelial NO synthase–dependent NO production.
  • Angiotensin-converting enzyme (ACE) inhibitors

These allow blood vessels to widen by preventing the hormone angiotensin from affecting blood vessels. Frequently prescribed ACE inhibitors include captopril (Capoten), lisinopril (Prinivil, Zestril) and ramipril (Altace).

  • Angiotensin II receptor blockers

These help blood vessels relax by blocking the action of angiotensin. Frequently prescribed angiotensin II receptor blockers include losartan (Cozaar), olmesartan (Benicar) and valsartan (Diovan).
Another very commonly prescribed class of medication counteracts hardening of the arteries.

Atheroma lipids are processed by enzyme systems that disassemble cholesterol esters and reconstruct them inside blood vessel walls; anacetrapib, for example, inhibits cholesteryl ester transfer protein. Genetic variants that improve cholesterol levels are stimulating development of additional medications.

We can propose that atheroma build up in arterial blood vessel walls constitutes a maladaptive defense against aneurysm and risk of vessel rupture from hypertension.

Arguably, HMG-CoA reductase inhibitor (statin) therapy is a second example of a medication class that helps protect vascular elasticity, both through its lipid effects and its anti-inflammatory effects.

The best-selling statin is atorvastatin, marketed as Lipitor (manufactured by Pfizer) and Torvast. By 2003, atorvastatin became the best-selling pharmaceutical in history,[4] with Pfizer reporting sales of US$12.4 billion in 2008.[5] As of 2010, a number of statins are on the market: atorvastatin (Lipitor and Torvast), fluvastatin (Lescol), lovastatin (Mevacor, Altocor, Altoprev), pitavastatin (Livalo, Pitava), pravastatin (Pravachol, Selektine, Lipostat), rosuvastatin (Crestor) and simvastatin (Zocor, Lipex).[6] Several combination preparations of a statin and another agent, such as ezetimibe/simvastatin, are also available.

References for Statins from:

http://en.wikipedia.org/wiki/Statin

Clinical Considerations of Statin Therapy’s manifold effects, in

http://pharmaceuticalintelligence.com/2012/10/08/statins-nonlipid-effects-on-vascular-endothelium-through-enos-activation/

Compensatory Effects in the Physiology of Circulation

Before declaring vessel elasticity a new and highly desirable treatment target, consider that it is not firmly established that hardening of arteries (loss of elasticity) is entirely maladaptive.

In parallel with any focus on increasing vascular elasticity or compliance, each of the issues discussed below merits scrutiny and investigation.

Cardiac Circulation Dynamics

Endothelial morphology, the rheological properties of intravascular fluid dynamics, and blood viscosity provide an explanation for the shear stress on vessels under arterial pressure:

http://pharmaceuticalintelligence.com/2012/11/28/special-considerations-in-blood-lipoproteins-viscosity-assessment-and-treatment/

and

http://pharmaceuticalintelligence.com/2012/11/28/what-is-the-role-of-plasma-viscosity-in-hemostasis-and-vascular-disease-risk/

Aging and Vasculature Diminished Elasticity

Arterial stiffening is one among several reasons that the prevalence of hypertension increases with aging.

Yet stiffer vessels are more efficient at transmitting pressure to distal targets. With aging, muscle mass diminishes markedly, and the contribution to circulation from skeletal muscle compression, combined with competent venous valves, fades.

http://pharmaceuticalintelligence.com/2012/08/27/endothelial-dysfunction-diminished-availability-of-cepcs-increasing-cvd-risk-for-macrovascular-disease-therapeutic-potential-of-cepcs/

and

http://pharmaceuticalintelligence.com/2012/10/19/clinical-trials-results-for-endothelin-system-pathophysiological-role-in-chronic-heart-failure-acute-coronary-syndromes-and-mi-marker-of-disease-severity-or-genetic-determination/

and

http://pharmaceuticalintelligence.com/2012/11/13/peroxisome-proliferator-activated-receptor-ppar-gamma-receptors-activation-pparγ-transrepression-for-angiogenesis-in-cardiovascular-disease-and-pparγ-transactivation-for-treatment-of-dia/

Aging and Myocardial Diminished Contractility and Ejection Fraction

With aging, heart contractility diminishes. These issues can cause underperfusion of tissues, inadequate nutrient blood delivery (ischemia), lactic acidosis, tissue dysfunction, and multi-organ failure. Hardened arteries may compensate. Thus, pharmacotherapy to increase arterial elasticity may be contraindicated for patients with mild to progressive CHF.

http://pharmaceuticalintelligence.com/2013/05/05/bioengineering-of-vascular-and-tissue-models/

and

http://pharmaceuticalintelligence.com/2012/10/20/nitric-oxide-and-sepsis-hemodynamic-collapse-and-the-search-for-therapeutic-options/

and

http://pharmaceuticalintelligence.com/2012/10/17/chronic-heart-failure-personalized-medicine-two-gene-test-predicts-response-to-beta-blocker-bucindolol/
Our biosystems are highly interdependent, and we cannot leap to conclusions without careful, thorough evidence. Increasing arterial elasticity will lower vascular impedance and change the frequency components of our pulsatile perfusion system.

MOST comprehensive review of the Human Cardiac Conduction System presented to date:

http://pharmaceuticalintelligence.com/2013/04/28/genetics-of-conduction-disease-atrioventricular-av-conduction-disease-block-gene-mutations-transcription-excitability-and-energy-homeostasis/

Diminished contractility will increase the amount of energy needed to maintain circulation. It will change efficiency dramatically: consider the difference between periodically pushing someone sitting on a swing at the resonance frequency of the pendulum versus significantly off resonance.
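The swing analogy can be made quantitative with the steady-state amplitude of a driven, damped harmonic oscillator; the parameters below are illustrative, not cardiac values:

```python
import math

# Steady-state amplitude of a driven, damped harmonic oscillator:
# A(omega) = F / sqrt((omega0^2 - omega^2)^2 + (gamma*omega)^2).
# Driving at the natural frequency omega0 yields a far larger response per
# unit force than driving off resonance, which is the sense in which
# off-resonance pumping wastes energy.

def steady_state_amplitude(omega, omega0=1.0, gamma=0.1, force=1.0):
    return force / math.sqrt((omega0**2 - omega**2)**2 + (gamma * omega)**2)

at_resonance = steady_state_amplitude(1.0)   # push at the pendulum's own frequency
off_resonance = steady_state_amplitude(2.0)  # push at twice that frequency
# at_resonance / off_resonance ≈ 30: the same force is far less effective off resonance
```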

http://pharmaceuticalintelligence.com/2013/04/14/mitochondrial-metabolism-and-cardiac-function/

and

http://pharmaceuticalintelligence.com/2012/10/28/mitochondrial-damage-and-repair-under-oxidative-stress/

Increased Arterial Elasticity – Potential Risk to Myocardium

Cellular therapies intended to increase vascular compliance may decrease circulatory efficiency and result in worsening right ventricular morphology and the development of dilated cardiomyopathy or hypertrophic cardiomyopathy (muscle thickening and diastolic failure), an undesirable outcome of an attempt to treat hypertension.

4. Vascular Compliance – The Potential of Novel Therapies

  • Novel Mechanism for Disease Etiology for the Cardiac Phenotype: Modulation of Nuclear and Cytoskeletal Actin Polymerization.

Lamin A/C and emerin regulate MKL1–SRF activity by modulating actin dynamics

Chin Yee Ho, Diana E. Jaalouk, Maria K. Vartiainen & Jan Lammerding

Nature (2013) doi:10.1038/nature12105

Published online 05 May 2013

Affiliations

Cornell University, Weill Institute for Cell and Molecular Biology/Department of Biomedical Engineering, Ithaca, New York 14853, USA

Chin Yee Ho &

Jan Lammerding

Brigham and Women’s Hospital/Harvard Medical School, Department of Medicine, Boston 02115, Massachusetts, USA

Chin Yee Ho,

Diana E. Jaalouk &

Jan Lammerding

Institute of Biotechnology, University of Helsinki, 00014 Helsinki, Finland

Maria K. Vartiainen

Present address: American University of Beirut, Department of Biology, Beirut 1107 2020, Lebanon.

Diana E. Jaalouk

Contributions

C.Y.H., D.E.J. and J.L. conceived and designed the overall project, with valuable help from M.K.V. C.Y.H. and D.E.J. performed the experiments. C.Y.H., D.E.J. and J.L. analysed data. C.Y.H. and J.L. wrote the paper.

Corresponding author Jan Lammerding

Laminopathies, caused by mutations in the LMNA gene encoding the nuclear envelope proteins lamins A and C, represent a diverse group of diseases that include Emery–Dreifuss muscular dystrophy (EDMD), dilated cardiomyopathy (DCM), limb-girdle muscular dystrophy, and Hutchinson–Gilford progeria syndrome1. Most LMNA mutations affect skeletal and cardiac muscle by mechanisms that remain incompletely understood. Loss of structural function and altered interaction of mutant lamins with (tissue-specific) transcription factors have been proposed to explain the tissue-specific phenotypes1. Here we report in mice that lamin-A/C-deficient (Lmna−/−) and LmnaN195K/N195K mutant cells have impaired nuclear translocation and downstream signalling of the mechanosensitive transcription factor megakaryoblastic leukaemia 1 (MKL1), a myocardin family member that is pivotal in cardiac development and function2. Altered nucleo-cytoplasmic shuttling of MKL1 was caused by altered actin dynamics in Lmna−/− and LmnaN195K/N195K mutant cells. Ectopic expression of the nuclear envelope protein emerin, which is mislocalized in Lmna mutant cells and also linked to EDMD and DCM, restored MKL1 nuclear translocation and rescued actin dynamics in mutant cells. These findings present a novel mechanism that could provide insight into the disease aetiology for the cardiac phenotype in many laminopathies, whereby lamin A/C and emerin regulate gene expression through modulation of nuclear and cytoskeletal actin polymerization.

 http://www.nature.com/nature/journal/vaop/ncurrent/full/nature12105.html

  • Genetic Therapy for Conduction Disease

http://pharmaceuticalintelligence.com/2012/10/01/ngs-cardiovascular-diagnostics-long-qt-genes-sequenced-a-potential-replacement-for-molecular-pathology/

  • Regenerative Medicine for Vasculature Function Protection

http://pharmaceuticalintelligence.com/2012/08/29/positioning-a-therapeutic-concept-for-endogenous-augmentation-of-cepcs-therapeutic-indications-for-macrovascular-disease-coronary-cerebrovascular-and-peripheral/

and

http://pharmaceuticalintelligence.com/2012/08/28/cardiovascular-outcomes-function-of-circulating-endothelial-progenitor-cells-cepcs-exploring-pharmaco-therapy-targeted-at-endogenous-augmentation-of-cepcs/

and

http://pharmaceuticalintelligence.com/2013/02/28/the-heart-vasculature-protection-a-concept-based-pharmacological-therapy-including-thymosin/

5. Stabilizing BP Variability is the next Big Target in Hypertension Management

Hypertension caused by Arterial Stiffening is Ineffectively Treated by Diuretics and Vasodilating Antihypertensives

Barcelona, Spain – An aging population grappling with rising rates of hypertension and other cardiometabolic risk factors should prompt an overhaul of how hypertension is diagnosed and monitored and should spur development of drugs with entirely new mechanisms of action, one expert says. Speaking here at the 2013 International Conference on Prehypertension and Cardiometabolic Syndrome, meeting cochair Dr Reuven Zimlichman (Tel Aviv University, Israel) argued that the definitions of hypertension, as well as the risk-factor tables used to guide treatment, are no longer appropriate for a growing number of patients.

Most antihypertensives today work by producing vasodilation or decreasing blood volume and so are ineffective treatments in ISH patients. In the future, he predicts, “we will have to start looking for a totally different medication that will aim to improve or at least to stabilize arterial elasticity: medication that might affect factors that determine the stiffness of the arteries, like collagen, like fibroblasts. Those are not the aim of any group of antihypertensive medications today.”

Zimlichman believes existing databases could be used to develop algorithms that take this progression of disease into account, in order to better guide hypertension management. He also points out that new ambulatory blood-pressure-monitoring devices also measure arterial elasticity. “Unquestionably, these will improve our ability to diagnose both the status of the arteries and the changes of the arteries with time as a result of our treatment. So if we treat the patient and we see no improvement in arterial elasticity, or the patient is worse, something is wrong, something is not working—either the patient is not taking the medication, or our choice of medication is not appropriate, or the dose is insufficient, etc.”

http://www.theheart.org/article/1502067.do

Oslo, Norway – New research that is only just starting to be digested by the hypertension community indicates that visit-to-visit variability in blood-pressure readings will likely become another way of looking for “at-risk” hypertensive patients and in fact is likely to be more reliable as an indicator of cardiovascular risk than the currently used mean BP.

The Goal of Stabilizing BP variability 

June 29, 2010  

Discussing the importance of this issue for guidelines and clinical practice, Dr Tony Heagerty (University of Manchester, UK) told the recent European Society of Hypertension (ESH) European Meeting on Hypertension 2010: “We are poking around in the dark, offering treatment blankly across a large community, and probably treating a lot of people who don’t need to be treated, while not necessarily treating the highest-risk patients. We should stop being reassured by ‘occasional’ normal BPs. The whole game now is, can we improve the identification of our ‘at-risk’ individuals?”

Heagerty was speaking at a special plenary session on late-breaking research discussing BP variability as a risk factor. This issue has emerged following new analyses reported at the ACC meeting and published in a number of papers in the Lancet and Lancet Neurology earlier this year, which showed that variability in blood pressure is a much stronger determinant of both stroke and coronary disease outcome than average blood pressure.

http://www.theheart.org/article/1093553.do

Three years later, on 2/1/2013, Zimlichman also argued that definitions of essential and secondary hypertension have changed very little over the past few decades and have typically only been tweaked up or down relative to other CV risk factors. Diastolic hypertension has been the primary goal of treatment; treatment goals have not adequately taken patient age into account (arterial stiffening plays a larger role in older patients); and they have typically relied too heavily on threshold cutoffs rather than the “linear progression” of risk factors and their impact on organ damage.

6. Mathematical Modeling: Arterial stiffening provides sufficient explanation for primary hypertension

Klas H. Pettersen, Scott M. Bugenhagen, Javaid Nauman, Daniel A. Beard, Stig W. Omholt

(Submitted on 3 May 2013 (v1), last revised 6 May 2013 (this version, v2))

Hypertension is one of the most common age-related chronic diseases and by predisposing individuals for heart failure, stroke and kidney disease, it is a major source of morbidity and mortality. Its etiology remains enigmatic despite intense research efforts over many decades. By use of empirically well-constrained computer models describing the coupled function of the baroreceptor reflex and mechanics of the circulatory system, we demonstrate quantitatively that arterial stiffening seems sufficient to explain age-related emergence of hypertension. Specifically, the empirically observed chronic changes in pulse pressure with age, and the impaired capacity of hypertensive individuals to regulate short-term changes in blood pressure, arise as emergent properties of the integrated system. Results are consistent with available experimental data from chemical and surgical manipulation of the cardio-vascular system. In contrast to widely held opinions, the results suggest that primary hypertension can be attributed to a mechanogenic etiology without challenging current conceptions of renal and sympathetic nervous system function. The results support the view that a major target for treating chronic hypertension in the elderly is the reestablishment of a proper baroreflex response.
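The paper’s central claim, that arterial stiffening alone can reproduce the observed rise in pulse pressure, can be illustrated with a toy two-element Windkessel model. This is a drastic simplification of the authors’ full baroreflex-coupled model, and all parameter values below are hypothetical:

```python
# Toy two-element Windkessel: dP/dt = Q(t)/C - P/(R*C), with a square-wave
# systolic inflow. With stroke volume and peripheral resistance held fixed,
# halving the arterial compliance C roughly doubles the pulse pressure,
# the qualitative effect attributed here to age-related arterial stiffening.

def pulse_pressure(C, R=1.0, period=1.0, systole=0.3,
                   stroke_volume=70.0, n_beats=20, dt=1e-4):
    """Pulse pressure (mmHg) over the final beat, by Euler integration.
    C in mL/mmHg, R in mmHg·s/mL, stroke_volume in mL."""
    P = 80.0                              # initial arterial pressure, mmHg
    q_systole = stroke_volume / systole   # constant inflow during systole, mL/s
    p_max, p_min = float("-inf"), float("inf")
    t, t_end = 0.0, n_beats * period
    while t < t_end:
        Q = q_systole if (t % period) < systole else 0.0
        P += dt * (Q / C - P / (R * C))
        if t >= t_end - period:           # record extremes over the last beat only
            p_max, p_min = max(p_max, P), min(p_min, P)
        t += dt
    return p_max - p_min

pp_compliant = pulse_pressure(C=2.0)  # compliant ("young") artery
pp_stiff = pulse_pressure(C=1.0)      # stiffened ("aged") artery: wider pulse pressure
```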

Klas H. Pettersen1, Scott M. Bugenhagen2, Javaid Nauman3, Daniel A. Beard2 & Stig W. Omholt3

1Department of Mathematical and Technological Sciences, Norwegian University of Life Science, Norway

2Department of Physiology, Medical College of Wisconsin, Milwaukee, Wisconsin, USA

3NTNU Norwegian University of Science and Technology, Department of Circulation and Medical Imaging, Cardiac Exercise Research Group, Trondheim, Norway

Correspondence should be addressed to: KHP (klas.pettersen@gmail.com)

Keywords: hypertension, mechanogenic, baroreceptor signaling, cardiovascular model, arterial stiffening

Author contributions: K.H.P. and S.W.O. designed the study. K.H.P. constructed the integrated model and performed the numerical experiments with contributions from D.A.B. and S.M.B. J.N. extracted and compiled empirical test data from the HUNT2 Survey. S.W.O., K.H.P. and D.A.B. wrote the paper.

http://arxiv.org/abs/1305.0727v2

http://arxiv.org/pdf/1305.0727v2.pdf

 

7. Classification of Blood Pressure and Hypertensive Treatment:

Best Practice of Care in the US

8. Genetic Risk for High Blood Pressure

Hypertension. 2013;61:931. doi:10.1161/HYP.0b013e31829399b2

Blood Pressure Single-Nucleotide Polymorphisms and Coronary Artery Disease (page 995)

Blood pressure (BP) is considered a major cardiovascular risk factor that is influenced by multiple genetic and environmental factors. However, the precise genetic underpinning influencing interindividual BP variation is not well characterized; and it is unclear whether BP-associated genetic variants also predispose to clinically apparent cardiovascular disease. Such an association of BP-related variants with cardiovascular disease would strengthen the concept of BP as a causal risk factor for cardiovascular disease. In this issue of Hypertension, analyses within the Coronary ARtery DIsease Genome-Wide Replication And Meta-Analysis consortium indicate that common genetic variants associated with BP in the population, indeed, contribute to the susceptibility for coronary artery disease (CAD). Lieb et al tested 30 single-nucleotide polymorphisms—that based on prior studies were known to affect BP—for their association with CAD. In total, data from 22 233 CAD cases and 64 762 controls were analyzed. The vast majority (88%) of BP-related single-nucleotide polymorphisms were also shown to increase the risk of CAD (as defined by an odds ratio for CAD >1; Figure). On average, each of the multiple BP-raising alleles was associated with a 3% (95% confidence interval, 1.8%–4.3%) risk increase for CAD.

Masked Hypertension in Diabetes Mellitus (page 964)

The first important finding in the IDACO study of masked hypertension (MH) in the population with diabetes mellitus and non–diabetes mellitus was that antihypertensive treatment converted some sustained hypertensives into sustained normotensives; this resulted in an increased cardiovascular disease risk in the treated versus untreated normotensive comparator group (Figure). Not surprisingly, normalization of blood pressure (BP) with treatment did not eliminate the lifetime cardiovascular disease burden associated with prior elevated BP nor did it correct other cardiometabolic risk factors that clustered with the hypertensive state.

The second important IDACO finding was that treatment increased the prevalence of MH by decreasing conventional BP versus daytime ambulatory BP (ABP) by a ratio of ≈3 to 2. The clinical implication of increased prevalence of MH with therapy in the population of both diabetes mellitus and non–diabetes mellitus was that these subjects did not receive sufficient antihypertensive therapy to convert MH into normalized ABP (ie, treated, normalized ABP being the gold standard for minimizing cardiovascular disease risk). Indeed, there is a transformation-continuum from sustained hypertension to MH and finally to sustained normotension with increasing antihypertensive therapy. These IDACO findings strongly suggest that many physicians mistakenly have their primary focus on normalizing in-office rather than out-of-office home BP and/or 24-hour ABP values and this results in an increased prevalence of MH. However, what constitutes optimal normalized ABP will remain empirical until established in randomized controlled trials.

Genetic Risk Score for Blood Pressure (page 987)

Elevated blood pressure (BP) is a strong, independent, and modifiable risk factor for stroke and heart disease. BP is a heritable trait, and genome-wide association studies have identified several genetic loci that are associated with systolic BP, diastolic BP, or both. Although the variants have modest effects on BP, typically 0.5 to 1.0 mm Hg, their presence may act over the entire life course and, therefore, lead to substantial increase in risk of cardiovascular disease (CVD). However, the independent impact of these variants on CVD risk has not been established in a prospective setting. Havulinna et al genotyped 32 common single-nucleotide polymorphisms in several Finnish cohorts, with up to 32 669 individuals after exclusion of prevalent CVD cases. The median follow-up was 9.8 years, during which 2295 incident CVD events occurred. Genetic risk scores were created for systolic BP and diastolic BP by multiplying the risk allele count of each single-nucleotide polymorphism by the effect size estimated in published genome-wide association studies on BP traits. The GRSs were strongly associated with baseline systolic BP, diastolic BP, and hypertension (all P<10–62). Hazard ratios for incident CVD increased roughly linearly by quintile of systolic BP or diastolic BP GRS (Figure). GRSs remained significant predictors of CVD risk after adjustment for traditional risk factors, even including BP and use of antihypertensive medication. These findings are consistent with a lifelong effect of these variants on BP and CVD risk.
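The genetic risk score construction described above, multiplying each SNP’s risk-allele count by its published per-allele effect size, can be sketched as follows. The SNP identifiers and effect sizes below are hypothetical placeholders, not values from the cited studies, and the multiplicative combination of per-allele odds is an assumption:

```python
# Weighted genetic risk score (GRS) for systolic BP: sum over SNPs of
# (risk-allele count, 0-2) x (published per-allele effect, mm Hg).
# SNP ids and effect sizes below are invented for illustration only.

snp_effect_mmHg = {"rsA": 1.0, "rsB": 0.6, "rsC": 0.5}

def genetic_risk_score(genotype):
    """genotype maps SNP id -> risk-allele count in {0, 1, 2}."""
    return sum(snp_effect_mmHg[snp] * count for snp, count in genotype.items())

grs = genetic_risk_score({"rsA": 2, "rsB": 1, "rsC": 0})   # 2*1.0 + 1*0.6 = 2.6

# The CARDIoGRAM analysis quoted above found ~3% higher CAD odds per
# BP-raising allele; if per-allele odds ratios combine multiplicatively
# (an assumption), carrying n such alleles gives roughly 1.03**n.
def cad_odds_ratio(n_alleles, per_allele=1.03):
    return per_allele ** n_alleles
```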

Related Articles on Genetics and Blood Pressure

Genetic Predisposition to Higher Blood Pressure Increases Coronary Artery Disease Risk

  • Wolfgang Lieb,
  • Henning Jansen,
  • Christina Loley,
  • Michael J. Pencina,
  • Christopher P. Nelson,
  • Christopher Newton-Cheh,
  • Sekar Kathiresan,
  • Muredach P. Reilly,
  • Themistocles L. Assimes,
  • Eric Boerwinkle,
  • Alistair S. Hall,
  • Christian Hengstenberg,
  • Reijo Laaksonen,
  • Ruth McPherson,
  • Unnur Thorsteinsdottir,
  • Andreas Ziegler,
  • Annette Peters,
  • John R. Thompson,
  • Inke R. König,
  • Jeanette Erdmann,
  • Nilesh J. Samani,
  • Ramachandran S. Vasan,
  • and Heribert Schunkert,
  • on behalf of CARDIoGRAM

Hypertension. 2013;61:995-1001, published online before print March 11, 2013, doi:10.1161/HYPERTENSIONAHA.111.00275

Masked Hypertension in Diabetes Mellitus: Treatment Implications for Clinical Practice

  • Stanley S. Franklin,
  • Lutgarde Thijs,
  • Yan Li,
  • Tine W. Hansen,
  • José Boggia,
  • Yanping Liu,
  • Kei Asayama,
  • Kristina Björklund-Bodegård,
  • Takayoshi Ohkubo,
  • Jørgen Jeppesen,
  • Christian Torp-Pedersen,
  • Eamon Dolan,
  • Tatiana Kuznetsova,
  • Katarzyna Stolarz-Skrzypek,
  • Valérie Tikhonoff,
  • Sofia Malyutina,
  • Edoardo Casiglia,
  • Yuri Nikitin,
  • Lars Lind,
  • Edgardo Sandoya,
  • Kalina Kawecka-Jaszcz,
  • Jan Filipovský,
  • Yutaka Imai,
  • Jiguang Wang,
  • Hans Ibsen,
  • Eoin O’Brien,
  • and Jan A. Staessen
  • on behalf of the International Database on Ambulatory blood pressure in relation to Cardiovascular Outcomes (IDACO) Investigators

Hypertension. 2013;61:964-971, published online before print March 11, 2013, doi:10.1161/HYPERTENSIONAHA.111.00289

A Blood Pressure Genetic Risk Score Is a Significant Predictor of Incident Cardiovascular Events in 32 669 Individuals

  • Aki S. Havulinna,
  • Johannes Kettunen,
  • Olavi Ukkola,
  • Clive Osmond,
  • Johan G. Eriksson,
  • Y. Antero Kesäniemi,
  • Antti Jula,
  • Leena Peltonen,
  • Kimmo Kontula,
  • Veikko Salomaa,
  • and Christopher Newton-Cheh

Hypertension. 2013;61:987-994, published online before print March 18, 2013, doi:10.1161/HYPERTENSIONAHA.111.00649

9. Is it Hypertension or Physical Inactivity: Cardiovascular Risk and Mortality – New results in 3/2013.

Heart doi:10.1136/heartjnl-2012-303461

  • Epidemiology
  • Original article

Estimating the effect of long-term physical activity on cardiovascular disease and mortality: evidence from the Framingham Heart Study

  1. Susan M Shortreed1,2,
  2. Anna Peeters1,3,
  3. Andrew B Forbes1

Author Affiliations

  1. Department of Epidemiology and Preventive Medicine, Monash University, Melbourne, Australia

  2. Biostatistics Unit, Group Health Research Institute, Seattle, Washington, USA

  3. Obesity and Population Health Unit, Baker IDI Heart and Diabetes Institute, Melbourne, Australia

Correspondence to Dr Susan M Shortreed, Biostatistics Unit, Group Health Research Institute, 1730 Minor Avenue, Suite 1600, Seattle, WA 98101, USA; shortreed.s@ghc.org

  • Published Online First 8 March 2013

Abstract

Objective In the majority of studies, the effect of physical activity (PA) on cardiovascular disease (CVD) and mortality is estimated at a single time point. The impact of long-term PA is likely to differ. Our study objective was to estimate the effect of long-term adult-life PA compared with long-term inactivity on the risk of incident CVD, all-cause mortality and CVD-attributable mortality.

Design Observational cohort study.

Setting Framingham, MA, USA.

Patients 4729 Framingham Heart Study participants who were alive and CVD-free in 1956.

Exposures PA was measured at three visits over 30 years along with a variety of risk factors for CVD. Cumulative PA was defined as long-term active versus long-term inactive.

Main outcome measures Incident CVD, all-cause mortality and CVD-attributable mortality.

Results During 40 years of follow-up there were 2594 cases of incident CVD, 1313 CVD-attributable deaths and 3521 deaths. Compared with long-term physical inactivity, the rate ratio of long-term PA was 0.95 (95% CI 0.84 to 1.07) for CVD, 0.81 (0.71 to 0.93) for all-cause mortality and 0.83 (0.72 to 0.97) for CVD-attributable mortality. Assessment of effect modification by sex suggests greater protective effect of long-term PA on CVD incidence (p value for interaction=0.004) in men (0.79 (0.66 to 0.93)) than in women (1.15 (0.97 to 1.37)).

Conclusions

  • Cumulative long-term PA has a protective effect on incidence of all-cause and CVD-attributable mortality compared with long-term physical inactivity.
  • In men, but not women, long-term PA also appears to have a protective effect on incidence of CVD.

Summary – PENDING


Other related articles were published on this Open Access Online Scientific Journal including the following:

Pearlman, JD and A. Lev-Ari 5/24/2013 Imaging Biomarker for Arterial Stiffness: Pathways in Pharmacotherapy for Hypertension and Hypercholesterolemia Management

http://pharmaceuticalintelligence.com/2013/05/24/imaging-biomarker-for-arterial-stiffness-pathways-in-pharmacotherapy-for-hypertension-and-hypercholesterolemia-management/

Lev-Ari, A. 5/17/2013 Synthetic Biology: On Advanced Genome Interpretation for Gene Variants and Pathways: What is the Genetic Base of Atherosclerosis and Loss of Arterial Elasticity with Aging

http://pharmaceuticalintelligence.com/2013/05/17/synthetic-biology-on-advanced-genome-interpretation-for-gene-variants-and-pathways-what-is-the-genetic-base-of-atherosclerosis-and-loss-of-arterial-elasticity-with-aging/

Bernstein, HL and A. Lev-Ari 5/15/2013 Diagnosis of Cardiovascular Disease, Treatment and Prevention: Current & Predicted Cost of Care and the Promise of Individualized Medicine Using Clinical Decision Support Systems

http://pharmaceuticalintelligence.com/2013/05/15/diagnosis-of-cardiovascular-disease-treatment-and-prevention-current-predicted-cost-of-care-and-the-promise-of-individualized-medicine-using-clinical-decision-support-systems-2/

Pearlman, JD and A. Lev-Ari 5/7/2013 On Devices and On Algorithms: Arrhythmia after Cardiac Surgery Prediction and ECG Prediction of Paroxysmal Atrial Fibrillation Onset

http://pharmaceuticalintelligence.com/2013/05/07/on-devices-and-on-algorithms-arrhythmia-after-cardiac-surgery-prediction-and-ecg-prediction-of-paroxysmal-atrial-fibrillation-onset/

Pearlman, JD and A. Lev-Ari 5/4/2013 Cardiovascular Diseases: Decision Support Systems for Disease Management Decision Making

http://pharmaceuticalintelligence.com/2013/05/04/cardiovascular-diseases-decision-support-systems-for-disease-management-decision-making/

Larry H Bernstein, MD, FACP, 12/10/2012

Genomics & Genetics of Cardiovascular Disease Diagnoses: A Literature Survey of AHA’s Circulation Cardiovascular Genetics, 3/2010 – 3/2013

Aviva Lev-Ari, PhD, RN and Larry H. Bernstein, MD, FACP, 3/7/2013

Mitochondrial Dysfunction and Cardiac Disorders

Curator: Larry H Bernstein, MD, FACP

Aviva Lev-Ari, PhD, RN, 4/7/2013