
Posts Tagged ‘Data analysis’


Worldwide trial uses AI to quickly identify ideal Covid-19 treatments

Reporter : Irina Robu, PhD

The novel coronavirus SARS-CoV-2, which has been spreading around the world, can cause a respiratory illness that can be severe. The disease, COVID-19, appears to have a fatality rate of less than 2 percent, and it forces doctors to choose between two equally unappealing options: try an unproven therapy and hope that it works, or treat patients with standard supportive care for severe respiratory disease until a vaccine is developed.

Randomized controlled trials have now started in dozens of hospitals around the world that fuse two approaches, using artificial intelligence to home in on the most effective treatments for respiratory infections. These randomized trials, known as adaptive trials, allow scientists to adjust the treatment protocols and/or statistical procedures based on participants' outcomes. They are seen as a way to detect promising treatments and to make trials more flexible than traditional randomized trials, which force patients and trial sponsors to wait for an outcome that often turns out to be disappointing. The inadequacies of the traditional randomized approach have been thrown into sharp relief during the pandemic, as thousands of patients lying in intensive care units cannot wait for gold-standard science to play out.
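The adaptive allocation idea described above can be sketched with a toy response-adaptive scheme. This is a minimal illustration, not the actual REMAP-CAP protocol: it assumes two hypothetical treatment arms and uses Thompson sampling (Beta posteriors over response rates) to steer more patients toward the arm that appears to be working.

```python
import random

def thompson_allocate(successes, failures, rng):
    # Draw one sample from each arm's Beta posterior (uniform prior),
    # then assign the next patient to the arm with the highest draw.
    draws = [rng.betavariate(s + 1, f + 1) for s, f in zip(successes, failures)]
    return max(range(len(draws)), key=draws.__getitem__)

def simulate(true_rates, n_patients, seed=0):
    # Toy simulation: `true_rates` are hypothetical response rates per arm.
    rng = random.Random(seed)
    k = len(true_rates)
    successes, failures, counts = [0] * k, [0] * k, [0] * k
    for _ in range(n_patients):
        arm = thompson_allocate(successes, failures, rng)
        counts[arm] += 1
        if rng.random() < true_rates[arm]:  # simulated patient outcome
            successes[arm] += 1
        else:
            failures[arm] += 1
    return counts  # patients allocated to each arm

# With a clearly better second arm, allocation drifts toward it over time.
print(simulate([0.3, 0.6], 500, seed=1))
```

In a real adaptive trial the reallocation would happen per review cycle (weekly, in the trial's stated goal) under formal statistical monitoring, not per patient as in this toy sketch.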

By analyzing data from more than 50 hospitals, researchers hope to supply quick answers to pressing questions, such as whether the antimalarial drug hydroxychloroquine is an effective therapy and, if so, for which types of patients. The trial will also allow the researchers to test multiple therapies at once. While the approach is well suited to providing answers during a pandemic, it still faces many challenges, including the need to rapidly assemble and analyze data from dozens of hospitals with different record-keeping systems on three continents, and then to update the protocols accordingly, all during a crisis that is draining clinical resources.

Because several treatments are being tested at once, carrying out these trials is particularly complicated. But advances in the computing resources needed to share data and analyze it swiftly with artificial intelligence have started to make these designs more practical.

The World Health Organization and the U.S. Food and Drug Administration, along with groups like the Gates Foundation, have offered increasing support for adaptive trial designs in recent years, particularly as a way to evaluate therapies during epidemics.

Nonetheless, that doesn't mean this specific effort will yield results in time to save the first wave of severely ill patients. Once a promising treatment is recognized, more patients will be allocated to receive it during each successive round of therapy. So far, about 130 ICU patients with Covid-19 have been enrolled, in addition to hundreds of other hospitalized patients.

The goal of the REMAP-CAP trial, once all the trial sites are up and running, is to analyze results and adjust treatments on a weekly basis.

SOURCE

International trial uses AI to rapidly identify optimal Covid-19 treatments

Other Resources

Chinese Hospitals Deploy AI to Help Diagnose Covid-19

Software that reads CT lung scans had been used primarily to detect cancer. Now it’s retooled to look for signs of pneumonia caused by coronavirus.

https://www.wired.com/story/chinese-hospitals-deploy-ai-help-diagnose-covid-19/

 

Artificial Intelligence against COVID-19: An Early Review

AI has not yet made an impact, but data scientists have taken up the challenge

https://towardsdatascience.com/artificial-intelligence-against-covid-19-an-early-review-92a8360edaba

 

Can AI Find a Cure for COVID-19?

Alex Woodie

https://www.datanami.com/2020/03/23/can-ai-find-a-cure-for-covid-19/

 

Scientists are racing to find the best drugs to treat COVID-19

The WHO is launching a multicountry trial to collect good data

By Nicole Wetsman Mar 23, 2020, 9:21am EDT

https://www.theverge.com/2020/3/23/21188167/coronavirus-treatment-clinical-trials-drugs-remdesivir-chloroquine-covid

 

Data Scientists Use Machine Learning to Discover COVID-19 Treatments

Researchers are using machine learning algorithms capable of generating millions of therapeutic antibodies to quickly find treatments for COVID-19.

https://healthitanalytics.com/news/data-scientists-use-machine-learning-to-discover-covid-19-treatments

Read Full Post »


Live Conference Coverage @Medcitynews Converge 2018 @Philadelphia: Promising Drugs and Breaking Down Silos

Reporter: Stephen J. Williams, PhD

Promising Drugs, Pricing and Access

The drug pricing debate rages on. What are the solutions to continuing to foster research and innovation while ensuring access and affordability for patients? Will biosimilars and generics be able to expand market access in the U.S.?

Moderator: Bunny Ellerin, Director, Healthcare and Pharmaceutical Management Program, Columbia Business School
Speakers:
Patrick Davish, AVP, Global & US Pricing/Market Access, Merck
Robert Dubois M.D., Chief Science Officer and Executive Vice President, National Pharmaceutical Council
Gary Kurzman, M.D., Senior Vice President and Managing Director, Healthcare, Safeguard Scientifics
Steven Lucio, Associate Vice President, Pharmacy Services, Vizient

What is working and what needs to change in pricing models?

Robert: He sees many players in the oncology space discovering new drugs, while other drugs are going generic (that is what is working). However, are we spending too much on cancer care relative to other diseases? (See their initiative, Going Beyond the Surface.)

Steven:  the advent of biosimilars is good for the industry

Patrick: a large effort in oncology, maybe too much (750 trials on Keytruda); he notes pharma is spending on R&D (though clinical trials take a large chunk of this money)

Robert: cancer has gotten a free ride, but cost per year relative to benefit looks different than for other diseases. Are we overinvesting in cancer, or is that a societal decision?

Gary: as we become more specific with precision medicines, high prices may be a result of our success in targeting a specific mutation. We need to understand the targeted drugs and their outcomes.

Patrick: “Cancer is the last big frontier,” but he says prices will come down in most cases. He gives the example of Hep C treatment: the only previous therapeutic option was a very toxic yearlong treatment, but the newer drugs may be more cost-effective and safer.

Steven: Blockbuster drugs could spread the expense over many patients, but now, with precision medicine, we can't diffuse the expense over a large number of patients.

President’s Cancer Panel Recommendation

Six recommendations

  1. Promote value-based pricing
  2. Enable communication of costs
  3. Address financial toxicity
  4. Stimulate competition (e.g., from biosimilars)
  5. Promote value-based care
  6. Invest in biomedical research

Patrick: the government pricing regime is hurting. There are a lot of practical barriers, but Merck has over 200 studies on a cost basis.

Robert: many of the concerns and much of the impetus on pricing started in Europe, which uses a set-price model (the EU won't pay more than a fixed amount for a drug). The U.S. is moving more toward outcomes-based pricing. For every health-outcome study that showed a benefit, three studies did not. With cancer it is tricky to establish specific health outcomes. Also, Medicare gets best-price status, so there needs to be a safe harbor for payers, and the biggest constraint is regulatory issues.

Steven: Everyone wants value-based pricing, but we don't have it yet, and it is a challenge to understand the nuances of new therapies. It is hard to align all the stakeholders until legislation starts to remove the reimbursement–clinic–patient–pharma obstacles. Possibly the big data efforts discussed here may help align each stakeholder's goals.

Gary: What data is needed to understand what is happening to patients? Until we have that information, it will remain complicated to determine where healthcare investors stand in this discussion.

Robert (on an ICER methods advisory board): 1) there is great concern about costs: how do we determine the fair value of a drug; 2) ICER is the only game in town, as other organizations only give recommendations; 3) ICER evaluates long-term value (cost per quality-adjusted life year) and budget impact (will people go bankrupt); 4) ICER is getting traction in the public eye and among advocates; 5) the problem is that ICER is not ready for prime time, as the evidence keeps changing, it is unclear whether they keep societal factors in mind, and they don't have total transparency in their methodology.

Steven: We need more transparency into all the costs associated with a drug and its therapy, and into value-based outcomes. Right now, pricing is more of a black box.

Moderator: pointed to a recent study showing that outpatient costs are going down while hospital-based care costs are rising rapidly (the cost depends on the site of care), so we need to figure out how to get people into lower-cost settings.

Breaking Down Silos in Research

“Silo” is healthcare’s four-letter word. How are researchers, life science companies and others sharing information that can benefit patients more quickly? Hear from experts at institutions that are striving to tear down the walls that prevent data from flowing.

Moderator: Vini Jolly, Executive Director, Woodside Capital Partners
Speakers:
Ardy Arianpour, CEO & Co-Founder, Seqster @seqster
Lauren Becnel, Ph.D., Real World Data Lead for Oncology, Pfizer
Rakesh Mathew, Innovation, Research, & Development Lead, HealthShareExchange
David Nace M.D., Chief Medical Officer, Innovaccer

Seqster: Seqster is a secure platform that helps you and your family manage medical records, DNA, fitness, and nutrition data—all in one place. The founder has a genomic sequencing background but realized that sequence information needs to be linked with medical records.

HealthShareExchange.org:

HealthShare Exchange envisions a trusted community of healthcare stakeholders collaborating to deliver better care to consumers in the greater Philadelphia region. HealthShare Exchange will provide secure access to health information to enable preventive and cost-effective care, improve the quality of patient care, and facilitate care transitions. They have partnered with multiple players in the healthcare field and have data on over 7 million patients.

Innovaccer

Data can be overwhelming, but it doesn't have to be this way. To drive healthcare efficiency, we designed a modular suite of products for a smooth transition into a data-driven world within 4 weeks. Why does it cost so much, and take so long, to move data around?

What is interoperability?

Ardy: We knew in the genomics field how to build algorithms to analyze big data, but how do we expand this from a consumer standpoint so people can see and share their own data?

Lauren: How can we use the data between patients, doctors, and researchers? On the research side, genomics represents only 2% of the data. Silos are one issue, but the standards for data (collection, curation, analysis) are also not set. We still need to improve semantic interoperability. For example, Flatiron had well-annotated data on male metastatic breast cancer.

David: There are three levels: technical interoperability (platform), semantic interoperability (meaning or word usage), and format (syntactic) interoperability (data structure). There is technical interoperability between health systems, and some semantic interoperability, but the formats are all different (pharmacies use different systems and write prescriptions using different suppliers). In any value-based contract this is a big issue now (if we are going to pay you based on the quality of your performance, there is a big need to coordinate across platforms). We can solve it by bringing data together in real time in one place, using mapping to integrate the formats (with quality control), and then democratizing the data among players.
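As a rough sketch of the mapping idea raised in the panel, the snippet below normalizes records from two hypothetical EHR schemas (all field and source names here are invented for illustration) into one canonical format; real systems would rely on standards such as HL7 FHIR rather than hand-written mappings.

```python
# Two hypothetical source schemas describing the same prescription fact.
EHR_A = {"pt_id": "A-123", "rx_name": "metformin", "rx_dose": "500 mg"}
EHR_B = {"patient": "B-456", "medication": "Metformin", "dose_mg": 500}

# Per-source mapping: canonical field -> function extracting it from a record.
MAPPINGS = {
    "ehr_a": {
        "patient_id": lambda r: r["pt_id"],
        "drug": lambda r: r["rx_name"].lower(),
        "dose_mg": lambda r: int(r["rx_dose"].split()[0]),
    },
    "ehr_b": {
        "patient_id": lambda r: r["patient"],
        "drug": lambda r: r["medication"].lower(),
        "dose_mg": lambda r: r["dose_mg"],
    },
}

def to_canonical(source, record):
    # Map a source-specific record into the shared schema so downstream
    # analytics see one consistent data structure.
    return {field: extract(record) for field, extract in MAPPINGS[source].items()}

print(to_canonical("ehr_a", EHR_A))
print(to_canonical("ehr_b", EHR_B))
```

The point of the sketch is that once every source is mapped into the same canonical schema, quality control and cross-system analysis can run in one place, which is the "bring data together and map the formats" step described above.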

Rakesh: Patients' data should follow the patient. Among Philadelphia's 12 health systems, we took on the challenge of making data interoperable, so we told providers not to use portals and made sure hospitals were sending standardized data. Healthcare data is complex.

David: 80% of clinical data is noise. For example, most electronic medical records are free text. Another problem is defining a universal patient identifier, which the U.S. has not adopted.

Please follow on Twitter using the following #hash tags and @pharma_BI

#MCConverge

#cancertreatment

#healthIT

#innovation

#precisionmedicine

#healthcaremodels

#personalizedmedicine

#healthcaredata

And at the following handles:

@pharma_BI

@medcitynews



 

Reporter: Larry Bernstein, MD

Bioinformatics refers to the creation and maintenance of databases that store biological information such as nucleotide and amino acid sequences. Development of this type of database involves not only design issues but also the development of complex interfaces through which researchers can access existing data and submit new or revised data.

In order to study how normal cellular activities are altered in different disease states, biological data must be combined to form a comprehensive picture of these activities. The field of bioinformatics has therefore grown to encompass many data types, including nucleotide and amino acid sequences, protein domains, and protein structures. The actual process of analyzing and interpreting the data is referred to as computational biology.

The primary goal of bioinformatics is to increase the understanding of biological processes. What sets it apart from other approaches, however, is its focus on developing and applying computationally intensive techniques to achieve this goal.

Bioinformatics elements for NGS data analysis

4–7 December 2012
at the Polo Scientifico e Tecnologico di Careggi
Viale Morgagni 40, Florence
Registration deadline: 3 November 2012

The advanced training course “Bioinformatics for NGS data analysis” is aimed at students and PhD students in mathematics, physics, natural sciences, medicine, biotechnology, pharmacy, and engineering, as well as at employees of public institutions, industry, and university researchers interested in problems of NGS bioinformatics.

The primary objective of the course is to introduce participants to the basic theory and technical knowledge of NGS data analysis for the identification of single-nucleotide polymorphisms, insertions/deletions, and genomic variants, and for the study of gene expression.
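To give a flavor of what SNP identification involves, here is a toy pileup-based caller on invented data. Production pipelines use dedicated aligners and variant callers on real sequencing reads, so this is only a conceptual sketch: the reference, reads, and thresholds below are all made up for illustration.

```python
from collections import Counter

REF = "ACGTACGTAC"
# Toy pre-aligned reads as (start position on REF, read sequence); no indels.
READS = [
    (0, "ACGTA"),
    (2, "GAACG"),  # carries an A at reference position 3 (ref has T)
    (2, "GAACG"),
    (2, "GAACG"),
    (4, "ACGTA"),
]

def call_snps(ref, reads, min_depth=2, min_frac=0.5):
    # Build a pileup: per reference position, count the bases observed there.
    pile = [Counter() for _ in ref]
    for start, seq in reads:
        for i, base in enumerate(seq):
            pile[start + i][base] += 1
    snps = []
    for pos, counts in enumerate(pile):
        depth = sum(counts.values())
        if depth < min_depth:
            continue  # not enough coverage to call anything
        alt, n = counts.most_common(1)[0]
        # Call a SNP when the dominant base differs from the reference
        # and is supported by a sufficient fraction of the reads.
        if alt != ref[pos] and n / depth >= min_frac:
            snps.append((pos, ref[pos], alt))
    return snps

print(call_snps(REF, READS))  # position 3: reference T, observed A
```

Real callers additionally model base qualities, mapping qualities, and diploid genotypes, which is exactly the kind of material a course like this covers.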

The course spans four days, structured as seminars and hands-on computer sessions led by lecturers and professionals.

Contacts
For more information visit: http://sites.google.com/site/corsobioinformatica/
e-mail: corsobioinformatica@gmail.com
telephone: (+39) 055 7949036

 
