In today’s fast-paced business world, effective visual communication is more important than ever, and Apple Vision Pro positions itself as a tool for enhancing visual communication in the workplace. As a spatial computing headset with an accompanying software ecosystem, it changes the way businesses communicate by providing new methods for creating, editing, and sharing visual content. Apple Vision Pro offers a user-friendly interface and advanced features aimed at businesses seeking to streamline their visual communication processes and boost productivity. The platform lets businesses generate professional-looking visuals that are easily shareable and foster collaboration, making it a useful tool for staying ahead in today’s competitive landscape.
Apple Vision Pro sets itself apart with cutting-edge features such as augmented reality and machine learning, enabling users to create immersive and informative content. Through this tool, businesses can effortlessly produce and distribute videos, presentations, and other visual materials, making it a must-have for business owners, marketers, and creative professionals alike.
The impact of Apple Vision Pro is transformative, revolutionizing how businesses communicate visually. With its advanced capabilities and intuitive interface, it empowers users to craft visually stunning content that not only captivates but also educates. The tool offers a diverse range of visual elements like charts, graphs, images, and videos, enabling the effective conveyance of complex information to audiences. Furthermore, Apple Vision Pro facilitates real-time collaboration, making it easier for teams to work together, generate content collectively, and share ideas seamlessly. These capabilities enable businesses to enhance their visual communication efforts and create more impactful content, ultimately driving them towards achieving their goals with greater efficiency and effectiveness.
Some of the potential applications of Apple Vision Pro:
Gaming: Vision Pro could be used for gaming by providing a more immersive experience.
Productivity: Vision Pro could be used for productivity applications by providing a more natural way to interact with computers.
Creative applications: Vision Pro could be used for creative applications by providing a more immersive way to create and edit content.
Education: Vision Pro could be used for education by providing a more immersive way to learn.
Training: Vision Pro could be used for training by providing a more immersive way to learn new skills.
Remote collaboration: Vision Pro could be used for remote collaboration by providing a more immersive way to work with others.
Reporter: Frason Francis Kalapurakal, Research Assistant II
Researchers from MIT and Technion have made a significant contribution to the field of machine learning by developing an adaptive algorithm that addresses the challenge of determining when a machine should follow a teacher’s instructions or explore on its own. The algorithm autonomously decides whether to use imitation learning, which involves mimicking the behavior of a skilled teacher, or reinforcement learning, which relies on trial and error to learn from the environment.
The researchers’ key innovation lies in the algorithm’s adaptability and ability to determine the most effective learning method throughout the training process. To achieve this, they trained two “students” with different learning approaches: one using a combination of reinforcement and imitation learning, and the other relying solely on reinforcement learning. The algorithm continuously compared the performance of these two students, adjusting the emphasis on imitation or reinforcement learning based on which student achieved better results.
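For readers who want a concrete picture of the two-student idea, the toy Python sketch below maintains one student trained with a teacher-guidance bonus and one trained by reinforcement alone, periodically comparing their recent returns and shifting the guidance weight toward whichever approach is working better. The environment, hyperparameters, and weight-update rule are illustrative assumptions, not the authors’ TGRL implementation.

```python
import random

N_STATES = 10          # simple 1-D corridor; the goal is the right-most cell
ACTIONS = [-1, +1]     # move left / move right
EPISODES = 300
ALPHA, GAMMA, EPS = 0.5, 0.95, 0.1


def teacher(state):
    """A competent (but not necessarily optimal) teacher: always move right."""
    return +1


def run_episode(q, imitation_weight):
    """One Q-learning episode; agreeing with the teacher earns a shaped bonus
    proportional to imitation_weight (a stand-in for an imitation-learning term)."""
    state, total_reward = 0, 0.0
    for _ in range(50):
        if random.random() < EPS:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q.get((state, a), 0.0))
        next_state = min(max(state + action, 0), N_STATES - 1)
        reward = 1.0 if next_state == N_STATES - 1 else -0.01
        shaped = reward + (imitation_weight if action == teacher(state) else 0.0)
        best_next = max(q.get((next_state, a), 0.0) for a in ACTIONS)
        q[(state, action)] = q.get((state, action), 0.0) + ALPHA * (
            shaped + GAMMA * best_next - q.get((state, action), 0.0))
        total_reward += reward
        state = next_state
        if state == N_STATES - 1:
            break
    return total_reward


def train():
    q_guided, q_pure = {}, {}          # teacher-guided "student" vs. pure-RL "student"
    imitation_weight = 0.5
    guided_hist, pure_hist = [], []
    for episode in range(EPISODES):
        guided_hist.append(run_episode(q_guided, imitation_weight))
        pure_hist.append(run_episode(q_pure, 0.0))
        # every 20 episodes, compare recent performance and shift the emphasis
        if episode % 20 == 19:
            if sum(guided_hist[-20:]) >= sum(pure_hist[-20:]):
                imitation_weight = min(1.0, imitation_weight * 1.1)
            else:
                imitation_weight = max(0.0, imitation_weight * 0.9)
    return imitation_weight, sum(guided_hist[-20:]) / 20


if __name__ == "__main__":
    w, avg_return = train()
    print(f"final imitation weight: {w:.2f}; recent avg return (guided student): {avg_return:.2f}")
```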
The algorithm’s efficacy was tested through simulated training scenarios, such as navigating mazes or reorienting objects with touch sensors. In all cases, the algorithm demonstrated superior performance compared to non-adaptive methods, achieving nearly perfect success rates and significantly outperforming other methods in terms of both accuracy and speed. This adaptability could enhance the training of machines in real-world situations where uncertainty is prevalent, such as robots navigating unfamiliar buildings or performing complex tasks involving object manipulation and locomotion.
Furthermore, the algorithm’s potential applications extend beyond robotics to various domains where imitation or reinforcement learning is employed. For example, large language models like GPT-4 could be used as teachers to train smaller models to excel in specific tasks. The researchers also suggest that analyzing the similarities and differences between machines and humans learning from their respective teachers could provide valuable insights for improving the learning experience.
The MIT and Technion researchers’ algorithm stands out for its principled approach, efficiency, and versatility across different domains. Unlike existing methods that require brute-force trial and error or manual tuning of parameters, their algorithm dynamically adjusts the balance between imitation and trial-and-error learning based on performance comparisons. Its robustness, adaptability, and promising results make it a noteworthy advancement in the field of machine learning.
References:
“TGRL: Teacher Guided Reinforcement Learning Algorithm for POMDPs.” Reincarnating Reinforcement Learning Workshop at ICLR 2023. https://openreview.net/pdf?id=kTqjkIvjj7
Amodei D, Olah C, Steinhardt J, Christiano P, Schulman J, Mané D. “Concrete Problems in AI Safety.” https://arxiv.org/abs/1606.06565
Other related articles published in this Open Access Online Scientific Journal include the following:
92 articles in the Category: ‘Artificial Intelligence – Breakthroughs in Theories and Technologies’
Science Has A Systemic Problem, Not an Innovation Problem
Curator: Stephen J. Williams, Ph.D.
A recent email asking me to submit a survey got me thinking about the malaise that scientists and industry professionals frequently bemoan: that innovation has been stymied for some reason, and that all sorts of convoluted processes must be altered to spur this mythical void of great new discoveries. It got me thinking about our current state of science, what the perceived issue is, and whether this desert of innovation actually exists or is instead a fundamental problem we have created ourselves.
The email was from an NIH committee asking for opinions on recreating the grant review process, and it arrived on the same day someone complained to me about a shoddy and perplexing grant review they had received.
The email was sent out to multiple researchers involved in NIH grant review on both sides, as well as to those who had taken part in previous questionnaires and studies on grant review and bias. It asked researchers to fill out a survey on the grant review process and on how best to change it to increase innovation of ideas as well as inclusivity. In recent years, there have been multiple survey requests on these matters, with multiple confusing procedural changes to grant format and content requirements, adding more administrative burden to scientists.
The email from the Center for Scientific Review (one of the divisions a grant goes to before review; it sets up review study sections and decides which section a grant should be assigned to) was as follows:
Update on Simplifying Review Criteria: A Request for Information
NIH has issued a request for information (RFI) seeking feedback on revising and simplifying the peer review framework for research project grant applications. The goal of this effort is to facilitate the mission of scientific peer review – identification of the strongest, highest-impact research. The proposed changes will allow peer reviewers to focus on scientific merit by evaluating 1) the scientific impact, research rigor, and feasibility of the proposed research without the distraction of administrative questions and 2) whether or not appropriate expertise and resources are available to conduct the research, thus mitigating the undue influence of the reputation of the institution or investigator.
Currently, applications for research project grants (RPGs, such as R01s, R03s, R15s, R21s, R34s) are evaluated based on five scored criteria: Significance, Investigators, Innovation, Approach, and Environment (derived from NIH peer review regulations 42 C.F.R. Part 52h.8; see Definitions of Criteria and Considerations for Research Project Grant Critiques for more detail) and a number of additional review criteria such as Human Subject Protections.
NIH gathered input from the community to identify potential revisions to the review framework. Given longstanding and often-heard concerns from diverse groups, CSR decided to form two working groups to the CSR Advisory Council—one on non-clinical trials and one on clinical trials. To inform these groups, CSR published a Review Matters blog, which was cross-posted on the Office of Extramural Research blog, Open Mike. The blog received more than 9,000 views by unique individuals and over 400 comments. Interim recommendations were presented to the CSR Advisory Council in a public forum (March 2020 video, slides; March 2021 video, slides). Final recommendations from the CSRAC (report) were considered by the major extramural committees of the NIH that included leadership from across NIH institutes and centers. Additional background information can be found here. This process produced many modifications and the final proposal presented below. Discussions are underway to incorporate consideration of a Plan for Enhancing Diverse Perspectives (PEDP) and rigorous review of clinical trials RPGs (~10% of RPGs are clinical trials) within the proposed framework.
Simplified Review Criteria
NIH proposes to reorganize the five review criteria into three factors, with Factors 1 and 2 receiving a numerical score. Reviewers will be instructed to consider all three factors (Factors 1, 2 and 3) in arriving at their Overall Impact Score (scored 1-9), reflecting the overall scientific and technical merit of the application.
Factor 1: Importance of the Research (Significance, Innovation), numerical score (1-9)
Factor 2: Rigor and Feasibility (Approach), numerical score (1-9)
Factor 3: Expertise and Resources (Investigator, Environment), assessed and considered in the Overall Impact Score, but not individually scored
Within Factor 3 (Expertise and Resources), Investigator and Environment will be assessed in the context of the research proposed. Investigator(s) will be rated as “fully capable” or “additional expertise/capability needed”. Environment will be rated as “appropriate” or “additional resources needed.” If a need for additional expertise or resources is identified, written justification must be provided. Detailed descriptions of the three factors can be found here.
Some of the comments were very illuminating:
I strongly support streamlining the five current main review criteria into three, and the present five additional criteria into two. This will bring clarity to applicants and reduce the workload on both applicants and reviewers. Blinding reviewers to the applicants’ identities and institutions would be a helpful next step, and would do much to reduce the “rich-getting-richer” / “good ole girls and good ole boys” / “big science” elitism that plagues the present review system, wherein pedigree and connections often outweigh substance and creativity.
I support the proposed changes. The shift away from “innovation” will help reduce the tendency to create hype around a proposed research direction. The shift away from Investigator and Environment assessments will help reduce bias toward already funded investigators in large well-known institutions.
As a reviewer for 5 years, I believe that the proposed changes are a step in the right direction, refocusing the review on whether the science SHOULD be done and whether it CAN BE DONE WELL, while eliminating burdensome and unhelpful sections of review that are better handled administratively. I particularly believe that the de-emphasis of innovation (which typically focuses on technical innovation) will improve evaluation of the overall science, and de-emphasis of review of minor technical details will, if implemented correctly, reduce the “downward pull” on scores for approach. The above comments reference blinded reviews, but I did not see this in the proposed recommendations. I do not believe this is a good idea for several reasons: 1) Blinding of the applicant and institution is not likely feasible for many of the reasons others have described (e.g., self-referencing of prior work), 2) Blinding would eliminate the potential to review investigators’ biosketches and budget justifications, which are critically important in review, 3) Making review blinded would make determination of conflicts of interest harder to identify and avoid, 4) Evaluation of “Investigator and Environment” would be nearly impossible.
Most of the comments were in favor of the proposed changes; however, many admitted that they add further confusion on top of the many administrative changes to the format and content of grant sections.
Being a Stephen Covey devotee, and having just listened to The 4 Disciplines of Execution, it became more apparent to me that the issues that hinder many great ideas from coming to fruition, especially in science, are the result of these systemic problems in the process, not of failings at the level of individual researchers or of the small companies trying to get their innovations funded or noticed. In summary, Dr. Covey states that most issues related to the success of any initiative lie NOT in the strategic planning, but in the failure to adhere to a few EXECUTION principles. Primary among these failures of strategic plans is a lack of accounting for what Dr. Covey calls the ‘whirlwind’: those important but recurring tasks that take us away from achieving the wildly important goals. In addition, failure to determine lead and lag measures of success hinders such plans.
In this case, the lag measure is INNOVATION. It appears we have created such a whirlwind, and such a focus on lag measures, that we are incapable of translating great discoveries into INNOVATION.
In the following post, I will focus on how issues relating to Open Access publishing and the dissemination of scientific discovery may be costing us TIME to INNOVATION, and on the systemic reasons why we appear to be stuck in a rut, so to speak.
The first indication is from a paper published by Johan Chu and James Evans in 2021 in PNAS:
Slowed canonical progress in large fields of science
Chu JSG, Evans JA. Slowed canonical progress in large fields of science. Proc Natl Acad Sci U S A. 2021 Oct 12;118(41):e2021636118. doi: 10.1073/pnas.2021636118. PMID: 34607941; PMCID: PMC8522281
Abstract
In many academic fields, the number of papers published each year has increased significantly over time. Policy measures aim to increase the quantity of scientists, research funding, and scientific output, which is measured by the number of papers produced. These quantitative metrics determine the career trajectories of scholars and evaluations of academic departments, institutions, and nations. Whether and how these increases in the numbers of scientists and papers translate into advances in knowledge is unclear, however. Here, we first lay out a theoretical argument for why too many papers published each year in a field can lead to stagnation rather than advance. The deluge of new papers may deprive reviewers and readers the cognitive slack required to fully recognize and understand novel ideas. Competition among many new ideas may prevent the gradual accumulation of focused attention on a promising new idea. Then, we show data supporting the predictions of this theory. When the number of papers published per year in a scientific field grows large, citations flow disproportionately to already well-cited papers; the list of most-cited papers ossifies; new papers are unlikely to ever become highly cited, and when they do, it is not through a gradual, cumulative process of attention gathering; and newly published papers become unlikely to disrupt existing work. These findings suggest that the progress of large scientific fields may be slowed, trapped in existing canon. Policy measures shifting how scientific work is produced, disseminated, consumed, and rewarded may be called for to push fields into new, more fertile areas of study.
So, the summary of this paper is:
The authors examined 1.8 billion citations among 90 million papers over 241 subjects
found that the corpus of papers does not lead to turnover of new ideas in a field, but rather to the ossification or entrenchment of canonical (older) ideas
this is mainly due to older papers being cited more frequently than new papers with new ideas, potentially because authors are trying to get their own papers cited more often for funding and exposure purposes
The authors suggest that “fundamental progress may be stymied if quantitative growth of scientific endeavors is not balanced by structures fostering disruptive scholarship and focusing attention of novel ideas”
The authors note that, in most cases, science policy reinforces this “more is better” philosophy, where metrics of publication productivity are either the number of publications or impact measured by citation rankings. However, an analysis of citation changes in large versus smaller fields makes it apparent that this process favors older, more established papers and the recirculation of older canonical ideas.
“Rather than resulting in faster turnover of field paradigms, the massive amounts of new publications entrenches the ideas of top-cited papers.” New ideas are pushed to the bottom of the citation list and potentially lost in the literature. The authors suggest that this problem will intensify as the “annual mass” of new publications in each field grows, especially in large fields. The issue is exacerbated by the deluge of new online ‘open access’ journals, in which authors tend to cite the more highly cited literature.
We may be at a critical juncture: when many papers are published in a short time, new ideas will not be considered as carefully as older ones. In addition, the proliferation of journals and the blurring of journal hierarchies due to online article-level access can exacerbate this problem.
As a counterpoint, the authors note that even though many of molecular biology’s most highly cited articles date from 1976, there has been a great deal of innovation since then; however, it may take far more experiments and money to reach the citation levels those papers achieved, which registers as lower scientific productivity.
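To make the “citations flow disproportionately to already well-cited papers” claim concrete, the short sketch below computes the share of a field’s citations captured by its most-cited papers. The citation counts are made-up toy numbers, not data from the Chu and Evans study.

```python
def top_share(citation_counts, top_fraction=0.10):
    """Fraction of all citations received by the top `top_fraction` most-cited papers."""
    ranked = sorted(citation_counts, reverse=True)
    k = max(1, int(len(ranked) * top_fraction))
    total = sum(ranked)
    return sum(ranked[:k]) / total if total else 0.0


# toy data: a small field vs. a much larger field with a heavier-tailed distribution
small_field = [30, 25, 20, 10, 5, 5, 3, 2, 1, 1]
large_field = [5000, 2500, 1200] + [10] * 200 + [1] * 5000

print(f"small field, top-10% citation share: {top_share(small_field):.2f}")
print(f"large field, top-10% citation share: {top_share(large_field):.2f}")
```

On this toy data the larger field concentrates a much greater share of its citations in its top papers, which is the qualitative pattern the paper reports.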
This issue is seen in the field of economics as well:
Ellison, Glenn. “Is peer review in decline?” Economic Inquiry, vol. 49, no. 3, July 2011, pp. 635+. Gale Academic OneFile, link.gale.com/apps/doc/A261386330/AONE?u=temple_main&sid=bookmark-AONE&xid=f5891002. Accessed 12 Dec. 2022.
Abstract:
Over the past decade, there has been a decline in the fraction of papers in top economics journals written by economists from the highest-ranked economics departments. This paper documents this fact and uses additional data on publications and citations to assess various potential explanations. Several observations are consistent with the hypothesis that the Internet improves the ability of high-profile authors to disseminate their research without going through the traditional peer-review process. (JEL A14, O30)
The facts part of this paper documents two main facts:
1. Economists in top-ranked departments now publish very few papers in top field journals. There is a marked decline in such publications between the early 1990s and early 2000s.
2. Comparing the early 2000s with the early 1990s, there is a decline in both the absolute number of papers and the share of papers in the top general interest journals written by Harvard economics department faculty.
Although the second fact just concerns one department, I see it as potentially important to understanding what is happening because it comes at a time when Harvard is widely regarded (I believe correctly) as having ascended to the top position in the profession.
The “decline-of-peer-review” theory I allude to in the title is that the necessity of going through the peer-review process has lessened for high-status authors: in the old days peer-reviewed journals were by far the most effective means of reaching readers, whereas with the growth of the Internet high-status authors can now post papers online and exploit their reputation to attract readers.
Many alternate explanations are possible. I focus on four theories: the decline-in-peer-review theory and three alternatives.
1. The trends could be a consequence of top-school authors’ being crowded out of the top journals by other researchers. Several such stories have an optimistic message, for example, there is more talent entering the profession, old pro-elite biases are being broken down, more schools are encouraging faculty to do cutting-edge research, and the Internet is enabling more cutting-edge research by breaking down informational barriers that had hampered researchers outside the top schools. (2)
2. The trends could be a consequence of the growth of revisions at economics journals discussed in Ellison (2002a, 2002b). In this more pessimistic theory, highly productive researchers must abandon some projects and/or seek out faster outlets to conserve the time now required to publish their most important works.
3. The trends could simply reflect that field journals have declined in quality in some relative sense and become a less attractive place to publish. This theory is meant to encompass also the rise of new journals, which is not obviously desirable or undesirable.
The majority of this paper is devoted to examining various data sources that provide additional details about how economics publishing has changed over the past decade. These are intended both to sharpen understanding of the facts to be explained and to provide tests of auxiliary predictions of the theories. Two main sources of information are used: data on publications and data on citations. The publication data include department-level counts of publications in various additional journals, an individual-level dataset containing records of publications in a subset of journals for thousands of economists, and a very small dataset containing complete data on a few authors’ publication records. The citation data include citations at the paper level for 9,000 published papers and less well-matched data that is used to construct measures of citations to authors’ unpublished works, to departments as a whole, and to various journals.
Inside Job or Deep Impact? Extramural Citations and the Influence of Economic Scholarship
Josh Angrist, Pierre Azoulay, Glenn Ellison, Ryan Hill, Susan Feng Lu. Inside Job or Deep Impact? Extramural Citations and the Influence of Economic Scholarship.
Journal of Economic Literature
Abstract
Does academic economic research produce material of general scientific value, or do academic economists write only for peers? Is economics scholarship uniquely insular? We address these questions by quantifying interactions between economics and other disciplines. Changes in the influence of economic scholarship are measured here by the frequency with which other disciplines cite papers in economics journals. We document a clear rise in the extramural influence of economic research, while also showing that economics is increasingly likely to reference other social sciences. A breakdown of extramural citations by economics fields shows broad field influence. Differentiating between theoretical and empirical papers classified using machine learning, we see that much of the rise in economics’ extramural influence reflects growth in citations to empirical work. This growth parallels an increase in the share of empirical cites within economics. At the same time, some disciplines that primarily cite economic theory have also recently increased citations of economics scholarship.
Citation
Angrist, Josh, Pierre Azoulay, Glenn Ellison, Ryan Hill, and Susan Feng Lu. 2020. “Inside Job or Deep Impact? Extramural Citations and the Influence of Economic Scholarship.” Journal of Economic Literature, 58 (1): 3–52. DOI: 10.1257/jel.20181508
So if innovation is there but it may be buried under the massive amount of heavily cited older literature, do we see evidence of this in other fields like medicine?
Why Isn’t Innovation Helping Reduce Health Care Costs?
National health care expenditures (NHEs) in the United States continue to grow at rates outpacing the broader economy: Inflation- and population-adjusted NHEs have increased 1.6 percent faster than the gross domestic product (GDP) between 1990 and 2018. US national health expenditure growth as a share of GDP far outpaces comparable nations in the Organization for Economic Cooperation and Development (17.2 versus 8.9 percent).
Multiple recent analyses have proposed that growth in the prices and intensity of US health care services—rather than in utilization rates or demographic characteristics—is responsible for the disproportionate increases in NHEs relative to global counterparts. The consequences of ever-rising costs amid ubiquitous underinsurance in the US include price-induced deferral of care leading to excess morbidity relative to comparable nations.
These patterns exist despite a robust innovation ecosystem in US health care—implying that novel technologies, in isolation, are insufficient to bend the health care cost curve. Indeed, studies have documented that novel technologies directly increase expenditure growth.
Why is our prolific innovation ecosystem not helping reduce costs? The core issue relates to its apparent failure to enhance net productivity—the relative output generated per unit resource required. In this post, we decompose the concept of innovation to highlight situations in which inventions may not increase net productivity. We begin by describing how this issue has taken on increased urgency amid resource constraints magnified by the COVID-19 pandemic. In turn, we describe incentives for the pervasiveness of productivity-diminishing innovations. Finally, we provide recommendations to promote opportunities for low-cost innovation.
Net Productivity During The COVID-19 Pandemic
The issue of productivity-enhancing innovation is timely, as health care systems have been overwhelmed by COVID-19. Hospitals in Italy, New York City, and elsewhere have lacked adequate capital resources to care for patients with the disease, sufficient liquidity to invest in sorely needed resources, and enough staff to perform all of the necessary tasks.
The critical constraint in these settings is not technology: In fact, the most advanced technology required to routinely treat COVID-19—the mechanical ventilator—was invented nearly 100 years ago in response to polio (the so-called iron lung). Rather, the bottleneck relates to the total financial and human resources required to use the technology—the denominator of net productivity. The clinical implementation of ventilators has been illustrative: Health care workers are still required to operate ventilators on a nearly one-to-one basis, just like in the mid-twentieth century.
High levels of resources required for implementation of health care technologies constrain the scalability of patient care—such as during respiratory disease outbreaks such as COVID-19. Thus, research to reduce health care costs is the same kind of research we urgently require to promote health care access for patients with COVID-19.
Types Of Innovation And Their Relationship To Expenditure Growth
The widespread use of novel medical technologies has been highlighted as a central driver of NHE growth in the US. We believe that the continued expansion of health care costs is largely the result of innovation that tends to have low productivity (exhibit 1). We argue that these archetypes—novel widgets tacked on to existing workflows to reinforce traditional care models—are exactly the wrong properties to reduce NHEs at the systemic level.
Exhibit 1: Relative productivity of innovation subtypes
Such content innovations may be contrasted with process innovations, which address the organized sequences of activities that implement content. Classically, these include clinical pathways and protocols. They can address the delivery of care for acute conditions, such as central line infections, sepsis, or natural disasters. Alternatively, they can target chronic conditions through initiatives such as team-based management of hypertension and hospital-at-home models for geriatric care. Other processes include hiring staff, delegating labor, and supply chain management.
Performance-Enhancing Versus Cost-Reducing Innovation
Performance-enhancing innovations frequently create incremental outcome gains in diagnostic characteristics, such as sensitivity or specificity, or in therapeutic characteristics, such as biomarkers for disease status. Their performance gains often lead to higher prices compared to existing alternatives.
Performance-enhancing innovations can be compared to “non-inferior” innovations capable of achieving outcomes approximating those of existing alternatives, but at reduced cost. Industries outside of medicine, such as the computing industry, have relied heavily on the ability to reduce costs while retaining performance.
In health care though, this pattern of innovation is rare. Since passage of the 2010 “Biosimilars” Act aimed at stimulating non-inferior innovation and competition in therapeutics markets, only 17 agents have been approved, and only seven have made it to market. More than three-quarters of all drugs receiving new patents between 2005 and 2015 were “reissues,” meaning they had already been approved, and the new patent reflected changes to the previously approved formula. Meanwhile, the costs of approved drugs have increased over time, at rates between 4 percent and 7 percent annually.
Moreover, the preponderance of performance-enhancing diagnostic and therapeutic innovations tends to address narrow patient cohorts (such as rare diseases or cancer subtypes), with limited clear clinical utility in broader populations. For example, the recently approved eculizumab is a monoclonal antibody approved for paroxysmal nocturnal hemoglobinuria—which affects 1 in 10 million individuals. At the time of its launch, eculizumab was priced at more than $400,000 per year, making it the most expensive drug in modern history. For clinical populations with no available alternatives, drugs such as eculizumab may be cost-effective, pending society’s willingness to pay, and morally desirable, given a society’s values. But such drugs are certainly not cost-reducing.
Additive Versus Substitutive Innovation
Additive innovations are those that append to preexisting workflows, while substitutive innovations reconfigure preexisting workflows. In this way, additive innovations increase the use of precedent services, whereas substitutive innovations decrease precedent service use.
For example, previous analyses have found that novel imaging modalities are additive innovations, as they tend not to diminish use of preexisting modalities. Similarly, novel procedures tend to incompletely replace traditional procedures. In the case of therapeutics and devices, off-label uses in disease groups outside of the approved indication(s) can prompt innovation that is additive. This is especially true, given that off-label prescriptions classically occur after approved methods are exhausted.
Eculizumab once again provides an illustrative example. As of February 2019, the drug had been used for 39 indications (it had been approved for three of those by that time), 69 percent of which lacked any form of evidence of real-world effectiveness. Meanwhile, the drug generated nearly $4 billion in sales in 2019. Again, these expenditures may be something for which society chooses to pay—but they are nonetheless additive, rather than substitutive.
Sustaining Versus Disruptive Innovation
Competitive market theory suggests that incumbents and disruptors innovate differently. Incumbents seek sustaining innovations capable of perpetuating their dominance, whereas disruptors pursue innovations capable of redefining traditional business models.
In health care, while disruptive innovations hold the potential to reduce overall health expenditures, often they run counter to the capabilities of market incumbents. For example, telemedicine can deliver care asynchronously, remotely, and virtually, but large-scale brick-and-mortar medical facilities invest enormous capital in the delivery of synchronous, in-house, in-person care (incentivized by facility fees).
The connection between incumbent business models and the innovation pipeline is particularly relevant given that 58 percent of total funding for biomedical research in the US is now derived from private entities, compared with 46 percent a decade prior. It follows that the growing influence of eminent private organizations may favor innovations supporting their market dominance—rather than innovations that are societally optimal.
Incentives And Repercussions Of High-Cost Innovation
Taken together, these observations suggest that innovation in health care is preferentially designed for revenue expansion rather than for cost reduction. While offering incremental improvements in patient outcomes, therefore creating theoretical value for society, these innovations rarely deliver incremental reductions in short- or long-term costs at the health system level.
For example, content-based, performance-enhancing, additive, sustaining innovations tend to add layers of complexity to the health care system—which in turn require additional administration to manage. The net result is employment growth in excess of outcome improvement, leading to productivity losses. This gap leads to continuously increasing overall expenditures in turn passed along to payers and consumers.
Nonetheless, high-cost innovations are incentivized across health care stakeholders (exhibit 2). From the supply side of innovation, for academic researchers, “breakthrough” and “groundbreaking” innovations constitute the basis for career advancement via funding and tenure. This is despite stakeholders’ frequent inability to generalize early successes to become cost-effective in the clinical setting. As previously discussed, the increasing influence of private entities in setting the medical research agenda is also likely to stimulate innovation benefitting single stakeholders rather than the system.
Source: Authors’ analysis adapted from Hofmann BM. Too much technology. BMJ. 2015 Feb 16.
From the demand side of innovation (providers and health systems), a combined allure (to provide “cutting-edge” patient care), imperative (to leave “no stone unturned” in patient care), and profit-motive (to amplify fee-for-service reimbursements) spur participation in a “technological arms-race.” The status quo thus remains as Clay Christensen has written: “Our major health care institutions…together overshoot the level of care actually needed or used by the vast majority of patients.”
Christensen’s observations have been validated during the COVID-19 epidemic, as treatment of the disease requires predominantly century-old technology. By continually adopting innovation that routinely overshoots the needs of most patients, layer by layer, health care institutions are accruing costs that quickly become the burden of society writ large.
Recommendations To Reduce The Costs Of Health Care Innovation
Henry Aaron wrote in 2002 that “…the forces that have driven up costs are, if anything, intensifying. The staggering fecundity of biomedical research is increasing…[and] always raises expenditures.” With NHEs spiraling ever-higher, urgency to “bend the cost curve” is mounting. Yet, since much biomedical innovation targets the “flat of the [productivity] curve,” alternative forms of innovation are necessary.
The shortcomings in net productivity revealed by the COVID-19 pandemic highlight the urgent need for redesign of health care delivery in this country, and reevaluation of the innovation needed to support it. Specifically, efforts supporting process redesign are critical to promote cost-reducing, substitutive innovations that can inaugurate new and disruptive business models.
Process redesign rarely involves novel gizmos, so much as rejiggering the wiring of, and connections between, existing gadgets. It targets operational changes capable of streamlining workflows, rather than technical advancements that complicate them. As described above, precisely these sorts of “frugal innovations” have led to productivity improvements yielding lower costs in other high-technology industries, such as the computing industry.
Shrank and colleagues recently estimated that nearly one-third of NHEs—almost $1 trillion—were due to preventable waste. Four of the six categories of waste enumerated by the authors—failure in care delivery, failure in care coordination, low-value care, and administrative complexity—represent ripe targets for process innovation, accounting for $610 billion in waste annually, according to Shrank.
Health systems adopting process redesign methods such as continuous improvement and value-based management have exhibited outcome enhancement and expense reduction simultaneously. Internal processes addressed have included supply chain reconfiguration, operational redesign, outlier reconciliation, and resource standardization.
Despite the potential of process innovation, focus on this area (often bundled into “health services” or “quality improvement” research) occupies only a minute fraction of wallet- or mind-share in the biomedical research landscape, accounting for 0.3 percent of research dollars in medicine. This may be due to a variety of barriers beyond minimal funding. One set of barriers is academic, relating to negative perceptions around rigor and a lack of outlets in which to publish quality improvement research. To achieve health care cost containment over the long term, this dimension of innovation must be destigmatized relative to more traditional manners of innovation by the funders and institutions determining the conditions of the research ecosystem.
Another set of barriers is financial: Innovations yielding cost reduction are less “reimbursable” than are innovations fashioned for revenue expansion. This is especially the case in a fee-for-service system where reimbursement is tethered to cost, which creates perverse incentives for health care institutions to overlook cost increases. However, institutions investing in low-cost innovation will be well-positioned in a rapidly approaching future of value-based care—in which the solvency of health care institutions will rely upon their ability to provide economically efficient care.
Innovating For Cost Control Necessitates Frugality Over Novelty
Restraining US NHEs represents a critical step toward health promotion. Innovation for innovation’s sake—that is content-based, incrementally effective, additive, and sustaining—is unlikely to constrain continually expanding NHEs.
In contrast, process innovation offers opportunities to reduce costs while maintaining high standards of patient care. As COVID-19 stress-tests health care systems across the world, the importance of cost control and productivity amplification for patient care has become apparent.
As such, frugality, rather than novelty, may hold the key to health care cost containment. Redesigning the innovation agenda to stem the tide of ever-rising NHEs is an essential strategy to promote widespread access to care—as well as high-value preventive care—in this country. In the words of investors across Silicon Valley: Cost-reducing innovation is no longer a “nice-to-have,” but a “need-to-have” for the future of health and overall well-being in this country.
So Do We Need A New Way of Disseminating Scientific Information? Can Curation Help?
We had high hopes for Science 2.0, in particular the smashing of data and knowledge silos. However, the digital age, along with 2.0 platforms, seems to have exacerbated the problem somehow. We are still critically short on analysis!
Old Science 1.0 is still the backbone of all scientific discourse, built on the massive body of experimental and review literature. That literature, however, was in analog format, and we have since moved to more accessible, digital, open access formats for both publications and raw data. Science 1.0 had an organizing structure: the scientific method for organizing data and literature, with libraries (and indexing schemes such as the Dewey decimal system) serving as the indexers. Science 2.0 made science more accessible and easier to search thanks to newer digital formats, but it relied on an army of mostly volunteers who had little incentive to co-curate and organize the findings and the massive literature.
The Internet and the Web are rapidly adopting a new “Web 3.0” format, with decentralized networks, enhanced virtual experiences, and greater interconnection between people. Here we start the discussion of what the move from Science 2.0, in which dissemination of scientific findings was revolutionized by piggybacking on Web 2.0 and social media, to a Science 3.0 format will involve, and which paradigms will be turned upside down.
Biden will appoint Dr. Elizabeth Jaffee, Dr. Mitchel Berger and Dr. Carol Brown to the panel, which will advise him and the White House on how to use resources of the federal government to advance cancer research and reduce the burden of cancer in the United States.
Jaffee, who will serve as chair of the panel, is an expert in cancer immunology and pancreatic cancer, according to the White House. She is currently the deputy director of the Sidney Kimmel Comprehensive Cancer Center at Johns Hopkins University and previously led the American Association for Cancer Research.
In this Sept. 8, 2016, file photo, Dr. Elizabeth M. Jaffee of the Pancreatic Dream Team attends Stand Up To Cancer (SU2C), a program of the Entertainment Industry Foundation (EIF), in Hollywood, Calif. (ABC Handout via Getty Images, FILE)
Berger, a neurological surgeon, directs the University of California, San Francisco Brain Tumor Center and previously spent 23 years at the school as a professor of neurological surgery.
Brown, a gynecologic oncologist, is the senior vice president and chief health equity officer at Memorial Sloan Kettering Cancer Center in New York City. According to the White House, much of her career has been focused on eliminating cancer care disparities due to racial, ethnic, cultural or socioeconomic factors.
Additionally, First Lady Jill Biden, members of the Cabinet and other administration officials are holding a meeting Wednesday of the Cancer Cabinet, made up of officials across several governmental departments and agencies, the White House said.
The Cabinet will introduce new members and discuss priorities in the battle against cancer, including closing the screening gap, addressing potential environmental exposures, reducing the number of preventable cancers, and expanding access to cancer research.
It is the second meeting of the cabinet since Biden relaunched the initiative in February, which he originally began in 2016 when he was vice president.
Both Jaffee and Berger were members of the Blue Ribbon Panel for the Cancer Moonshot Initiative led by Biden.
The initiative has personal meaning for Biden, whose son, Beau, died of glioblastoma — one of the most aggressive forms of brain cancer — in 2015.
“I committed to this fight when I was vice president,” Biden said at the time, during an event at the White House announcing the relaunch. “It’s one of the reasons why, quite frankly, I ran for president. Let there be no doubt, now that I am president, this is a presidential, White House priority. Period.”
The initiative has several priority actions including diagnosing cancer sooner; preventing cancer; addressing inequities; and supporting patients, caregivers and survivors.
In this June 14, 2016, file photo, Dr. Carol Brown, physician at Memorial Sloan Kettering Cancer Center, gives a presentation at The White House Summit on The United State of Women, in Washington, D.C. (NurPhoto via Getty Images, FILE)
The White House has also issued a call to action to get cancer screenings back to pre-pandemic levels.
“We have to get cancer screenings back on track and make sure they’re accessible to all Americans,” Biden said at the time.
Since the first meeting of the Cancer Cabinet, the Centers for Disease Control and Prevention has issued more than $200 million in grants to cancer prevention programs, the Centers for Medicare & Medicaid Services implemented a new model to reduce the cost of cancer care, and the U.S. Patent and Trademark Office said it will fast-track applications for cancer immunotherapies.
ABC News’ Sasha Pezenik contributed to this report.
President Joe Biden is expected to pick cancer surgeon Monica Bertagnolli as the next director of the National Cancer Institute (NCI). Bertagnolli, a physician-scientist at Brigham and Women’s Hospital, the Dana-Farber Cancer Institute, and Harvard Medical School, specializes in gastrointestinal cancers and is well known for her expertise in clinical trials. She will replace Ned Sharpless, who stepped down as NCI director in April after nearly 5 years.
The White House has not yet announced the selection, first reported by STAT, but several cancer research organizations closely watching for the nomination have issued statements supporting Bertagnolli’s expected selection. She is “a national leader” in clinical cancer research and “a great person to take the job,” Sharpless told ScienceInsider.
With a budget of $7 billion, NCI is the largest component of the National Institutes of Health (NIH) and the world’s largest funder of cancer research. Its director is the only NIH institute director selected by the president. Bertagnolli’s expected appointment, which does not require Senate confirmation, drew applause from the cancer research community.
Margaret Foti, CEO of the American Association for Cancer Research, praised Bertagnolli’s “appreciation for … basic research” and “commitment to ensuring that such treatment innovations reach patients … across the United States.” Ellen Sigal, chair and founder of Friends of Cancer Research, says Bertagnolli “brings expertise the agency needs at a true inflection point for cancer research.”
Bertagnolli, 63, will be the first woman to lead NCI. Her lab research on tumor immunology and the role of a gene called APC in colorectal cancer led to a landmark trial she headed showing that an anti-inflammatory drug can help prevent this cancer. In 2007, she became the chief of surgery at the Dana-Farber Brigham Cancer Center.
She served as president of the American Society of Clinical Oncology in 2018 and currently chairs the Alliance for Clinical Trials in Oncology, which is funded by NCI’s National Clinical Trials Network. The network is a “complicated” program, and “Monica will have a lot of good ideas on how to make it work better,” Sharpless says.
One of Bertagnolli’s first tasks will be to shape NCI’s role in Biden’s reignited Cancer Moonshot, which aims to slash the U.S. cancer death rate in half within 25 years. NCI’s new leader also needs to sort out how the agency will mesh with a new NIH component that will fund high-risk, goal-driven research, the Advanced Research Projects Agency for Health (ARPA-H).
Bertagnolli will also head NCI efforts already underway to boost grant funding rates, diversify the cancer research workplace, and reduce higher death rates for Black people with cancer.
The White House recently nominated applied physicist Arati Prabhakar to fill another high-level science position, director of the White House Office of Science and Technology Policy (OSTP). But still vacant is the NIH director slot, which Francis Collins, acting science adviser to the president, left in December 2021. And the administration hasn’t yet selected the inaugural director of ARPA-H.
Correction, 22 July, 9 a.m.: This story has been updated to reflect that Francis Collins is acting science adviser to the president, not acting director of the White House Office of Science and Technology Policy.
Developing Machine Learning Models for Prediction of Onset of Type-2 Diabetes
Reporter: Amandeep Kaur, B.Sc., M.Sc.
A recent study reports the development of an advanced AI algorithm that predicts the onset of type 2 diabetes up to five years in advance using routinely collected medical data. The researchers described their AI model as notable and distinctive because of its specific design, which performs assessments at the population level.
The first author, Mathieu Ravaut, M.Sc., of the University of Toronto, and other team members stated that “The main purpose of our model was to inform population health planning and management for the prevention of diabetes that incorporates health equity. It was not our goal for this model to be applied in the context of individual patient care.”
The research group collected data from 2006 to 2016 for approximately 2.1 million patients treated within the same healthcare system in Ontario, Canada. Even though the patients belonged to the same region, the authors highlighted that Ontario encompasses a large and diverse population.
The newly developed algorithm was trained with data from approximately 1.6 million patients, validated with data from about 243,000 patients, and tested with data from more than 236,000 patients. The data used to build the algorithm included each patient’s medical history from the previous two years: prescriptions, medications, lab tests, and demographic information.
When predicting the onset of type 2 diabetes within five years, the model reached a test area under the ROC curve of 80.26.
The authors reported that “Our model showed consistent calibration across sex, immigration status, racial/ethnic and material deprivation, and a low to moderate number of events in the health care history of the patient. The cohort was representative of the whole population of Ontario, which is itself among the most diverse in the world. The model was well calibrated, and its discrimination, although with a slightly different end goal, was competitive with results reported in the literature for other machine learning–based studies that used more granular clinical data from electronic medical records without any modifications to the original test set distribution.”
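As a rough illustration of how discrimination (area under the ROC curve) and calibration are typically checked for this kind of onset-prediction model, the sketch below fits a simple classifier on synthetic data and reports both metrics on a held-out test set. It is not the authors’ model, data, or pipeline; every variable here is a placeholder.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(0)

# synthetic stand-in for "administrative" features and a 5-year onset label
X = rng.normal(size=(20_000, 8))
logits = X @ rng.normal(size=8) - 2.0           # shift makes the positive class rare-ish
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]

# discrimination: area under the ROC curve on the held-out test set
print(f"test ROC AUC: {roc_auc_score(y_test, probs):.4f}")

# calibration: observed event rate vs. mean predicted probability in bins
frac_pos, mean_pred = calibration_curve(y_test, probs, n_bins=10)
for p, o in zip(mean_pred, frac_pos):
    print(f"predicted {p:.2f} -> observed {o:.2f}")
```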
This model could potentially be useful to the healthcare systems of countries equipped with thorough administrative databases, helping them target specific cohorts at elevated risk of poor outcomes.
The research group stated that “Because our machine learning model included social determinants of health that are known to contribute to diabetes risk, our population-wide approach to risk assessment may represent a tool for addressing health disparities.”
Ravaut M, Harish V, Sadeghi H, et al. Development and Validation of a Machine Learning Model Using Administrative Health Data to Predict Onset of Type 2 Diabetes. JAMA Netw Open. 2021;4(5):e2111315. doi:10.1001/jamanetworkopen.2021.11315 https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2780137
Other related articles were published in this Open Access Online Scientific Journal, including the following:
AI in Drug Discovery: Data Science and Core Biology @Merck &Co, Inc., @GNS Healthcare, @QuartzBio, @Benevolent AI and Nuritas
Reporters: Aviva Lev-Ari, PhD, RN and Irina Robu, PhD
Joe Biden Announced Science Team Nominations for the New Administration
Reporter: Stephen J. Williams, PhD
Article ID #287: Joe Biden Announced Science Team Nominations for the New Administration. Published on 1/17/2021
In an announcement televised on C-Span, President Elect Joseph Biden announced his new Science Team to advise on science policy matters, as part of the White House Advisory Committee on Science and Technology. Below is a video clip and the transcript, also available at
Novartis vet Danny Bar-Zohar leaps back into R&D, taking over the development team at Merck KGaA as Luciano Rossetti steps out
John Carroll
Editor & Founder
After a brief stint as a biotech investor at Syncona, Novartis vet Danny Bar-Zohar is back in R&D, and he’s taking the lead position at Merck KGaA’s drug division.
Bar-Zohar had led late-stage clinical development across a variety of areas — neuroscience, immunology, oncology and ophthalmology, among others — before joining the migration of talent out of the Basel-based multinational. He had been at Novartis for 7 years, which followed an earlier chapter in research at Teva.
The scientist is taking the lead on development at Merck KGaA, in place of Luciano Rossetti, who had a mixed record in R&D that nevertheless marked a big improvement over the dismal run the company had endured earlier. Joern-Peter Halle will continue on as global head of research. Rossetti is retiring after 6 years of running the research group, which has extensive operations in Germany as well as Massachusetts.
Their PD-L1 Bavencio — allied with Pfizer — has had a few successes, and a whole slate of failures. Sprifermin was touted as a big potential advance in osteoarthritis, but Merck KGaA is now auctioning off that part of the portfolio. One of the few late-stage bright spots has been their MET inhibitor tepotinib, which won breakthrough status and now is under priority review. That drug faces a rival at Novartis — capmatinib — that won an accelerated OK at the FDA in May.
There’s also a BTK inhibitor, evobrutinib, that’s being developed for MS. But that’s a very crowded field, and Sanofi has been bullish about its prospects in the same research niche after buying out Principia.
Moving back into mid-stage development, there’s a major program underway for bintrafusp alfa, a bifunctional fusion protein targeting TGF-β and PD-L1, which Merck KGaA has high hopes for.
That all marks some bright, though limited, prospects for Merck KGaA, highlighting the need to find something new to beef up the pipeline. Bar-Zohar will get a say in that.
Improving diagnostic yield in pediatric cancer precision medicine
Elaine R Mardis
The advent of genomics has revolutionized how we diagnose and treat lung cancer
We currently need to understand the driver mutations and variants so that we can personalize therapy
PD-L1 and other checkpoint therapy have not really been used in pediatric cancers even though CAR-T have been successful
The incidence rates and mortality rates of pediatric cancers are rising
A large-scale study of over 700 pediatric cancers shows cancers driven by epigenetic drivers or fusion proteins, underscoring the need for transcriptomics. The study also demonstrated that we have underestimated germline mutations and hereditary factors.
They put together a database to nominate patients onto their IGM Cancer protocol, which involves genetic counseling and obtaining germline samples to determine hereditary factors. RNA and protein are evaluated as well as exome sequencing; RNA-Seq and the Archer Dx test are used to identify driver fusions
PECAN curated database from St. Jude used to determine driver mutations. They use multiple databases and overlap within these databases and knowledge base to determine or weed out false positives
They have used these studies to understand the immune infiltrate into recurrent cancers (CytoCure)
They found 40 germline cancer predisposition genes, 47 driver somatic fusion proteins, 81 potential actionable targets, 106 CNV, 196 meaningful somatic driver mutations
They are functioning well at NCI with respect to grant reviews, research, and general operations in spite of the COVID-19 pandemic and the massive demonstrations, while also focusing on the disparities that occur in the cancer research field and in cancer care
There are ongoing efforts at NCI to make a positive difference in racial injustice, diversity in the cancer workforce, and for patients as well
Need a diverse workforce across the cancer research and care spectrum
Data show that areas where the clinicians are successful in putting African Americans on clinical trials are areas (geographic and site specific) where health disparities are narrowing
Grants through NCI's new SeroNet for COVID-19 serologic testing are funded by two RFAs (RFA-CA-20-038 and RFA-CA-20-039), which will close on July 22, 2020
Tuesday, June 23
12:45 PM – 1:46 PM EDT
Virtual Educational Session
Immunology, Tumor Biology, Experimental and Molecular Therapeutics, Molecular and Cellular Biology/Genetics
This educational session will update cancer researchers and clinicians about the latest developments in the detailed understanding of the types and roles of immune cells in tumors. It will summarize current knowledge about the types of T cells, natural killer cells, B cells, and myeloid cells in tumors and discuss current knowledge about the roles these cells play in the antitumor immune response. The session will feature some of the most promising up-and-coming cancer immunologists who will inform about their latest strategies to harness the immune system to promote more effective therapies.
Judith A Varner, Yuliya Pylayeva-Gupta
Introduction
Judith A Varner
New techniques reveal critical roles of myeloid cells in tumor development and progression
Different types of cells, such as myeloid cells, are becoming targets for immune checkpoint therapy
In T cell-excluded or "desert" tumors, T cells are held at the periphery while myeloid cells can still infiltrate, so targeting macrophages might be effective in these T cell-naïve tumors; macrophages are the most abundant type of immune cell in tumors
CXCLs are potential targets
PI3K delta inhibitors reduce the infiltrate of suppressive myeloid cells such as macrophages
The open question is when to give myeloid-directed versus T cell-directed therapy
Judith A Varner
Novel strategies to harness T-cell biology for cancer therapy
Positive and negative roles of B cells in cancer
Yuliya Pylayeva-Gupta
New approaches in cancer immunotherapy: Programming bacteria to induce systemic antitumor immunity
There are numerous examples of highly successful covalent drugs, such as aspirin and penicillin, that have been in use for a long period of time. Despite this historical success, there was a period of reluctance among many to pursue covalent drugs based on concerns about toxicity. With advances in understanding the features of a well-designed covalent drug, new techniques to discover and characterize covalent inhibitors, and the clinical success of new covalent cancer drugs in recent years, there is renewed interest in covalent compounds. This session will provide a broad look at covalent probe compounds and drug development, including a historical perspective, examination of warheads and electrophilic amino acids, the role of chemoproteomics, and case studies.
Benjamin F Cravatt, Richard A. Ward, Sara J Buhrlage
Discovering and optimizing covalent small-molecule ligands by chemical proteomics
Benjamin F Cravatt
Multiple approaches are being investigated to find new covalent inhibitors, such as: 1) cysteine reactivity mapping, 2) mapping cysteine ligandability, and 3) functional screening in phenotypic assays for electrophilic compounds
Fluorescent activity-based probes are used in proteomic screens; they have broad usability across the proteome but can be made specific
They screened quiescent versus stimulated T cells to determine reactive cysteines in a phenotypic screen, analyzed by MS proteomics (cysteine reactivity profiling); they can quantitate 15,000 to 20,000 reactive cysteines (a minimal reactivity-ratio sketch follows these notes)
Isocitrate dehydrogenase 1 and adapter protein LCP-1 are two examples of changes in reactive cysteines they have seen using this method
They use scout molecules to target ligands or proteins with reactive cysteines
For phenotypic screens, they first use a cytotoxicity assay to screen out toxic compounds that simply kill cells rather than modulating T cell activation (e.g., IL-10 secretion)
Interestingly, coupling these MS reactive-cysteine screens with phenotypic screens reveals NONCANONICAL mechanisms for many of these target proteins (many of the compounds hit targets that were not predicted or known)
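As a rough illustration of how reactivity-profiling data can be turned into calls, the sketch below follows the isoTOP-ABPP-style convention in which a competition ratio (probe labeling in control versus compound-treated samples) is computed per cysteine and sites above a cutoff are flagged as liganded. The site names, ratio values, and the cutoff of 4 are assumptions for illustration, not data from the talk.

    # Minimal sketch, not the Cravatt lab code: call "liganded" cysteines from
    # competition ratios R = (labeling in control) / (labeling in compound-treated).
    def call_liganded_cysteines(ratios, cutoff=4.0):
        """ratios: dict mapping 'PROTEIN_Cxxx' -> competition ratio."""
        return {site: r for site, r in ratios.items() if r >= cutoff}

    example_ratios = {
        "IDH1_C000": 6.2,   # hypothetical site labels and values, for illustration only
        "LCP1_C000": 1.1,
        "GENE_X_C000": 4.8,
    }
    print(call_liganded_cysteines(example_ratios))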
Electrophilic warheads and nucleophilic amino acids: A chemical and computational perspective on covalent modifier
The covalent targeting of cysteine residues in drug discovery and its application to the discovery of Osimertinib
Richard A. Ward
Cysteine activation: thiolate form of cysteine is a strong nucleophile
Thiolate form preferred in polar environment
Activation can be assisted by neighboring residues; the pKa will have an effect on deprotonation
The pKas of cysteines vary across EGFR
Cysteines that are too reactive give toxicity, while those that are not reactive enough are ineffective
Accelerating drug discovery with lysine-targeted covalent probes
This Educational Session aims to guide discussion on the heterogeneous cells and metabolism in the tumor microenvironment. It is now clear that the diversity of cells in tumors each require distinct metabolic programs to survive and proliferate. Tumors, however, are genetically programmed for high rates of metabolism and can present a metabolically hostile environment in which nutrient competition and hypoxia can limit antitumor immunity.
Jeffrey C Rathmell, Lydia Lynch, Mara H Sherman, Greg M Delgoffe
T-cell metabolism and metabolic reprogramming antitumor immunity
Jeffrey C Rathmell
Introduction
Jeffrey C Rathmell
Metabolic functions of cancer-associated fibroblasts
Mara H Sherman
Tumor microenvironment metabolism and its effects on antitumor immunity and immunotherapeutic response
Greg M Delgoffe
There are multiple metabolites and reactive oxygen species within the tumor microenvironment; is there heterogeneity within the TME metabolome that can predict which tumors will be immunosensitive?
They took melanoma cells and looked at metabolism using Seahorse assays (glycolysis): there was vast heterogeneity among melanoma tumor cells; some rely only on OXPHOS with no glycolytic metabolism (an inverse Warburg phenotype) (a minimal classification sketch based on Seahorse-style readouts follows these notes)
As they profiled whole tumors, they could separate out the metabolism of each cell type within the tumor, comparing T cells versus stromal CAFs versus tumor cells, and characterized cells as indolent or metabolically active
T cells from tumors with low glycolysis were fine, but T cells from highly glycolytic tumors were more indolent
When they knocked down the glucose transporter, the cells became more glycolytic
If a patient's tumor had high oxidative metabolism, there was low sensitivity to PD-L1 therapy
Showed this result in head and neck cancer as well
With metformin, a complex I inhibitor that is not as toxic as most mitochondrial OXPHOS inhibitors, the T cells experience less hypoxia and can remodel the TME and stimulate the immune response
Metformin now in clinical trials
T cells, though, seem metabolically restricted; T cells that infiltrate tumors have low mitochondrial oxidative phosphorylation
T cells from tumors have defective mitochondria or little respiratory capacity
They have some preliminary findings that metabolic inhibitors may help with CAR-T therapy
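A minimal sketch of the kind of classification described above, assuming Seahorse-style readouts in which ECAR serves as a glycolysis proxy and OCR as an OXPHOS proxy; samples are labeled relative to cohort medians. The sample values and the median cutoff are illustrative assumptions, not the speaker's method.

    # Minimal sketch: bin tumor samples by Seahorse-style readouts.
    from statistics import median

    def classify_metabolism(samples):
        """samples: dict name -> (ecar, ocr); returns dict name -> label."""
        ecar_med = median(v[0] for v in samples.values())
        ocr_med = median(v[1] for v in samples.values())
        labels = {}
        for name, (ecar, ocr) in samples.items():
            if ecar >= ecar_med and ocr >= ocr_med:
                labels[name] = "highly metabolic"
            elif ecar >= ecar_med:
                labels[name] = "glycolytic"
            elif ocr >= ocr_med:
                labels[name] = "oxidative (inverse Warburg-like)"
            else:
                labels[name] = "indolent"
        return labels

    example = {"tumor_A": (80.0, 20.0), "tumor_B": (15.0, 90.0),
               "tumor_C": (10.0, 12.0), "tumor_D": (70.0, 85.0)}   # made-up values
    print(classify_metabolism(example))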
Obesity, lipids and suppression of anti-tumor immunity
Lydia Lynch
Hypothesis: obesity impairs anti-tumor immunity
There are fewer NK cells in obese people, and they also produce less IFN-gamma
RNA-Seq on NOD mice: granzymes and perforins were at the top of the list of genes downregulated in obesity
The genes that were upregulated were involved in lipid metabolism
All were PPAR target genes (a minimal gene-set enrichment sketch follows these notes)
NK cells from obese patients take up palmitate, and this reduces their glycolysis, but OXPHOS is also reduced; they think increased free fatty acids essentially overload the mitochondria
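One standard way to test a statement like "the upregulated genes were PPAR targets" is a gene-set enrichment test. The sketch below uses a hypergeometric test; the gene lists and background size are placeholders, not the study's data.

    # Minimal sketch: hypergeometric enrichment of a gene set among upregulated genes.
    from scipy.stats import hypergeom

    def gene_set_enrichment_p(upregulated, gene_set, background_size):
        overlap = len(set(upregulated) & set(gene_set))
        # P(X >= overlap) when drawing len(upregulated) genes from a background
        # containing len(gene_set) members of the set of interest
        return hypergeom.sf(overlap - 1, background_size,
                            len(set(gene_set)), len(set(upregulated)))

    upregulated = ["CD36", "PLIN2", "FABP4", "GENE_A", "GENE_B"]         # placeholder list
    ppar_targets = ["CD36", "PLIN2", "FABP4", "ACOX1", "CPT1A", "UCP1"]  # placeholder list
    print(gene_set_enrichment_p(upregulated, ppar_targets, background_size=20000))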
Long recognized for their role in cancer diagnosis and prognostication, pathologists are beginning to leverage a variety of digital imaging technologies and computational tools to improve both clinical practice and cancer research. Remarkably, the emergence of artificial intelligence (AI) and machine learning algorithms for analyzing pathology specimens is poised to not only augment the resolution and accuracy of clinical diagnosis, but also fundamentally transform the role of the pathologist in cancer science and precision oncology. This session will discuss what pathologists are currently able to achieve with these new technologies, present their challenges and barriers, and overview their future possibilities in cancer diagnosis and research. The session will also include discussions of what is practical and doable in the clinic for diagnostic and clinical oncology in comparison to technologies and approaches primarily utilized to accelerate cancer research.
Jorge S Reis-Filho, Thomas J Fuchs, David L Rimm, Jayanta Debnath
Both old and new methods are used; with cell counting, you first find the cells and then phenotype them; with quantification approaches such as AQUA, densitometry of the positive signal is used to set a threshold that determines the presence of a cell for counting (a minimal thresholding sketch follows these notes)
Hi-plex versus multiplex imaging, where you have ten channels measured by cycling fluors on antibodies (can get up to 20-plex)
Hi-plex can be coupled with mass spectrometry (imaging mass spectrometry, based on heavy-metal tags on mAbs)
However, it will still take a trained pathologist to define regions of interest or the desired field of view
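A minimal sketch of threshold-based counting in the spirit of the densitometry approach mentioned above: threshold a stain-intensity image and count connected positive regions as candidate cells. The synthetic image and threshold value are assumptions; real workflows use calibrated cutoffs and pathologist-defined regions of interest.

    # Minimal sketch, not a clinical workflow: count positive objects above a cutoff.
    import numpy as np
    from scipy import ndimage

    def count_positive_objects(signal: np.ndarray, threshold: float) -> int:
        mask = signal > threshold                 # densitometry-style intensity cutoff
        labeled, n_objects = ndimage.label(mask)  # connected components ~ candidate cells
        return n_objects

    rng = np.random.default_rng(0)
    img = rng.random((128, 128))                  # stand-in for a stain-intensity channel
    img[30:35, 40:45] += 2.0                      # two synthetic "positive cells"
    img[90:95, 100:105] += 2.0
    print(count_positive_objects(img, threshold=1.5))   # expect 2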
Introduction
Jayanta Debnath
Challenges and barriers of implementing AI tools for cancer diagnostics
Jorge S Reis-Filho
Implementing robust digital pathology workflows into clinical practice and cancer research
Jayanta Debnath
Invited Speaker
Thomas J Fuchs
Founder of a spinout of Memorial Sloan Kettering
He separates AI from purely computational, algorithmic approaches
Dealing with not just machines but integrating human intelligence
Making decisions for patients must involve human decision-making as well
How do we get experts to make these decisions faster?
AI in pathology: what is difficult? There is a range from sandbox scenarios where machines are great, to curated datasets, to human decision-support systems or maps, to trying to predict nature
The four scenarios are: 1) learning rules made by humans (a human-to-human scenario); 2) constrained nature; 3) unconstrained nature, such as images and/or behavior; and 4) predicting nature's response to itself
In the sandbox scenario the rules are set in stone and machines are great, as in chess playing
In the second scenario, you can train a computer to predict what a human would predict
The third scenario is like driving cars
A system working on constrained nature or a constrained dataset will take a long time for the computer to reach a decision
The fourth category is a long-term data-collection project
He is finding that it is still difficult to predict nature, so going from clinical findings to prognosis still does not have good predictability with AI alone; there is a need for human involvement
End-to-end partnering (EPL) is a new way for humans to get more involved with the algorithm and help with the problem of constrained data
An example pathology workflow, from Campanella et al. 2019 Nature Medicine, is as follows: obtain digital images (they digitized a million slides), train on a massive dataset with high-throughput computing (which required a lot of time and a big software-development effort), and then train it using input from the best expert pathologists (nature-to-human, and unconstrained because no data curation was done) (a minimal weakly supervised training sketch follows these notes)
This led to the first clinical-grade machine learning system (Camelyon16 was the challenge for detecting metastatic cells in lymph tissue; it was tested on 12,000 patients from 45 countries)
The first big hurdle was moving from manually annotated slides (a big bottleneck) to data automatically extracted from pathology reports
Now the problem is in prediction: how can we bridge the gap from predicting humans to predicting nature?
With an AI system, pathologists drastically improved their ability to detect very small lesions
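The weakly supervised idea behind this kind of workflow can be sketched as multiple-instance learning: only a slide-level label is available, a small model scores individual tile embeddings, and the slide prediction is the maximum tile score. The tiny model, feature dimension, and random toy data below are assumptions for illustration and are not the Campanella et al. system.

    # Minimal sketch: max-pooling multiple-instance learning on slide tiles.
    import torch
    import torch.nn as nn

    class MaxPoolMIL(nn.Module):
        def __init__(self, feat_dim: int = 64):
            super().__init__()
            self.tile_scorer = nn.Linear(feat_dim, 1)   # scores each tile embedding

        def forward(self, tile_feats: torch.Tensor) -> torch.Tensor:
            # tile_feats: (num_tiles, feat_dim) for one slide
            tile_logits = self.tile_scorer(tile_feats).squeeze(-1)
            return tile_logits.max()                    # slide logit = most suspicious tile

    model = MaxPoolMIL()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.BCEWithLogitsLoss()

    # Toy "slides": random tile embeddings with slide-level labels only.
    slides = [(torch.randn(50, 64), torch.tensor(1.0)),
              (torch.randn(30, 64), torch.tensor(0.0))]

    for epoch in range(5):
        for tiles, label in slides:
            opt.zero_grad()
            loss = loss_fn(model(tiles), label)
            loss.backward()
            opt.step()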
Incidence rates of several cancers (e.g., colorectal, pancreatic, and breast cancers) are rising in younger populations, which contrasts with either declining or more slowly rising incidence in older populations. Early-onset cancers are also more aggressive and have different tumor characteristics than those in older populations. Evidence on risk factors and contributors to early-onset cancers is emerging. In this Educational Session, the trends and burden, potential causes, risk factors, and tumor characteristics of early-onset cancers will be covered. Presenters will focus on colorectal and breast cancer, which are among the most common causes of cancer deaths in younger people. Potential mechanisms of early-onset cancers and racial/ethnic differences will also be discussed.
Stacey A. Fedewa, Xavier Llor, Pepper Jo Schedin, Yin Cao
Cancers that are and are not increasing in younger populations
Stacey A. Fedewa
Early onset cancers, pediatric cancers and colon cancers are increasing in younger adults
Younger people are more likely to be uninsured, and these are their most productive years, so a cancer diagnosis is a horrible life event for a young adult. They will have more financial hardship, and most (70%) of young adults with cancer have had financial difficulties. It is especially hard for women, as they are in their childbearing years, which adds further stress
The types of early-onset cancer vary by age as well as by geographic location. For example, in the 20s thyroid cancer is more common, but in the 30s it is breast cancer. Colorectal and testicular cancers are the most common in the US
SCC of the cervix is decreasing, but adenocarcinoma of the cervix is increasing in women in their 40s, potentially due to changing sexual behaviors
Breast cancer is increasing in younger women: it may be etiologically distinct (e.g., triple-negative), with larger racial disparities in younger African American women
Increased obesity among younger people is becoming a factor in this increasing incidence of early onset cancers
Other Articles on this Open Access Online Journal on Cancer Conferences and Conference Coverage in Real Time Include
Old Industrial Revolution Paradigm of Education Needs to End: How Scientific Curation Can Transform Education
Curator: Stephen J. Williams, PhD.
Dr. Cathy N. Davidson from Duke University gives a talk entitled: Now You See It. Why the Future of Learning Demands a Paradigm Shift
In this talk, shown below, Dr. Davidson shows how our current education system was designed to educate students for the careers and skills needed for success in the Industrial Age, and how this educational paradigm is failing to prepare students for the challenges they will face in their future careers.
Or as Dr. Davidson summarizes
Designing education not for your past but for their future
As the video is almost an hour I will summarize some of the main points below
PLEASE WATCH VIDEO
Summary of talk
Dr. Davidson starts the talk with a thesis: institutions tend to preserve the problems they were created to solve.
All the current work and teaching paradigms that we use today were created for the last information age (the 19th century)
Our job is to remake the institutions of education to work for the future, not the one we inherited
Four information ages or technologies that radically changed communication
advent of writing: in ancient Mesopotamia (B.C.), writing allowed us to record and transfer knowledge and ideas
movable type – first seen in 10th century China
steam powered press – allowed books to be mass-produced and made available to the middle class; the first time the middle class had unlimited access to literature
internet – the ability to publish and share ideas worldwide
Interestingly, in the early phases of each of these information ages, the same four complaints about the new technology/methodology of disseminating information were heard
ruins memory
creates a distraction
ruins interpersonal dialogue and authority
reduces complexity of thought
She gives the example of Socrates, who hated writing and frequently stated that writing ruins memory, creates a distraction, and, worst of all, commits ideas to the page, where they cannot be changed or altered, and so destroys 'free thinking'.
She discusses how our educational institutions are designed for the industrial age.
The need for collaborative (group) learning AND teaching
Designing education not for your past but for the future
In other words preparing students for THEIR future not your past and the future careers that do not exist today.
In the West we were all taught to answer silently and alone. However in Japan, education is arranged in the han or group think utilizing the best talents of each member in the group. In Japan you are arranged in such groups at an early age. The concept is that each member of the group contributes their unique talent and skill for the betterment of the whole group. The goal is to demonstrate that the group worked well together.
In the 19th century, institutions had to solve a problem: how to get people off the farm and into the factory, and/or out of the shop and into the firm
It takes a lot of regulation and institutionalization to convince people that independent thought is not the best way in the corporation
keywords for an industrial age
timeliness
attention to task
standards, standardization
hierarchy
specialization, expertise
metrics (measures, management)
two cultures: separating the curriculum into STEM versus artistic tracks, or dividing the world of science from the world of art
This effort led to the concept of scientific labor management, and from this old paradigm in education came an educational system that is controlled, and whose success is measured, using
grades (A,B,C,D)
multiple choice tests
keywords for our age
workflow
multitasking attention
interactive process (Prototype, Feedback)
data mining
collaboration by difference
Can using a methodology such as scientific curation affect higher education to achieve this goal of teaching students to collaborate in an interactive process using data mining to create a new workflow for any given problem? Can a methodology of scientific curation be able to affect such changes needed in academic departments to achieve the above goal?
This will be the subject of future curations tested using real-world in class examples.
However, it is important to first discern that scientific content curation takes material from peer-reviewed and other expert-vetted sources. This is distinct from other types of content curation, which draw from varied sources, some of which are not expert-reviewed or vetted, or may even be 'fake news' or highly edited materials such as altered video and audio. In this respect, the expert acts not only as curator but as referee. In addition, collaboration is necessary and even compulsory for the methodology of scientific content curation, positioning the curator not as the sole expert but highlighting the CONTENT from experts as the main focus for learning and edification.
Other articles of note on this subject in this Open Access Online Scientific Journal include:
The above articles will give a good background on this newly conceived methodology of Scientific Curation and its applicability in various areas such as Medical Publishing and, as discussed below, Medical Education.
To understand the new paradigm in medical communication and the impact curative networks have or will play in this arena please read the following:
This article discusses a history of medical communication and how science and medical communication initially moved from discussions from select individuals to the current open accessible and cooperative structure using Web 2.0 as a platform.
In Data Science, A Pioneer Practitioner’s Portfolio of Algorithm-based Decision Support Systems for Operations Management in Several Industrial Verticals: Analytics Designer, Aviva Lev-Ari, PhD, RN
An overview of Data Science as a discipline is presented in
Against this landscape of IT, the Internet, Analytics, Statistics, Big Data, Data Science, and Artificial Intelligence, I aim to tell stories about my own pioneering work in data science and the design of algorithm-based decision support systems for different organizations in several sectors of the US economy:
Startups:
TimeØ Group – The leader in Digital Marketplaces Design
Concept Five Technologies, Inc. – Commercialization of DoD funded technologies
MDSS, Inc. – SAAS in Analytical Services
LPBI Group – Pharmaceutical & Media
Top Tier Management Consulting: SRI International, Monitor Group;
OEM: Amdahl Corporation;
Top 6th System Integrator: Perot Systems Corporation;
FFRDC: MITRE Corporation.
Publishing industry: was Director of Research at McGraw-Hill/CTB.
Northeastern University, Researcher on Cardiovascular Pharmacotherapy at Bouve College of Health Sciences (Independent research guided by Professor of Pharmacology)
Pioneering implementations of analytics to business decision making: contributions to domain knowledge conceptualization, research design, methodology development, data modeling and statistical data analysis: Aviva Lev-Ari, UCB, PhD’83; HUJI MA’76
This was prepared for publication in the American Friends of the Hebrew University (AFHU) May 2018 newsletter, in the Hebrew University HUJI Alumni Spotlight section.
Aviva Lev-Ari's profile went up on 5/3/2018 on the AFHU website under the Alumni Spotlight at https://www.afhu.org/
On 5/11/2018, Excerpts were Published in AFHU e-news.