
Science Has A Systemic Problem, Not an Innovation Problem

Curator: Stephen J. Williams, Ph.D.

    A recent email asking me to complete a survey got me thinking about a malaise that scientists and industry professionals frequently bemoan: that innovation has been stymied for some reason, and that all sorts of convoluted processes must be altered to fill this mythical void of great new discoveries… and it got me thinking about the current state of science, what the perceived issue is, and whether this desert of innovation actually exists or is instead a fundamental problem we have created ourselves.

The email was from an NIH committee asking for opinions on restructuring the grant review process… and it arrived on the same day someone complained to me about a shoddy and perplexing grant review they had received.

The email was sent to multiple researchers involved on both sides of NIH grant review, as well as to those who had participated in previous questionnaires and studies on grant review and bias.  It asked researchers to fill out a survey on the grant review process and how best to change it to increase innovation of ideas as well as inclusivity.  In recent years there have been multiple such survey requests on these matters, alongside multiple confusing procedural changes to grant format and content requirements, adding more administrative burden to scientists.

The email, from the Center for Scientific Review (the NIH division that receives a grant before review, sets up the review study sections, and decides which section a grant should be assigned to), was as follows:

Update on Simplifying Review Criteria: A Request for Information

https://www.csr.nih.gov/reviewmatters/2022/12/08/update-on-simplifying-review-criteria-a-request-for-information/

NIH has issued a request for information (RFI) seeking feedback on revising and simplifying the peer review framework for research project grant applications. The goal of this effort is to facilitate the mission of scientific peer review – identification of the strongest, highest-impact research. The proposed changes will allow peer reviewers to focus on scientific merit by evaluating 1) the scientific impact, research rigor, and feasibility of the proposed research without the distraction of administrative questions and 2) whether or not appropriate expertise and resources are available to conduct the research, thus mitigating the undue influence of the reputation of the institution or investigator.

Currently, applications for research project grants (RPGs, such as R01s, R03s, R15s, R21s, R34s) are evaluated based on five scored criteria: Significance, Investigators, Innovation, Approach, and Environment (derived from NIH peer review regulations 42 C.F.R. Part 52h.8; see Definitions of Criteria and Considerations for Research Project Grant Critiques for more detail) and a number of additional review criteria such as Human Subject Protections.

NIH gathered input from the community to identify potential revisions to the review framework. Given longstanding and often-heard concerns from diverse groups, CSR decided to form two working groups of the CSR Advisory Council—one on non-clinical trials and one on clinical trials. To inform these groups, CSR published a Review Matters blog, which was cross-posted on the Office of Extramural Research blog, Open Mike. The blog received more than 9,000 views by unique individuals and over 400 comments. Interim recommendations were presented to the CSR Advisory Council in a public forum (March 2020 video, slides; March 2021 video, slides). Final recommendations from the CSRAC (report) were considered by the major extramural committees of the NIH that included leadership from across NIH institutes and centers. Additional background information can be found here. This process produced many modifications and the final proposal presented below. Discussions are underway to incorporate consideration of a Plan for Enhancing Diverse Perspectives (PEDP) and rigorous review of clinical trials RPGs (~10% of RPGs are clinical trials) within the proposed framework.

Simplified Review Criteria

NIH proposes to reorganize the five review criteria into three factors, with Factors 1 and 2 receiving a numerical score. Reviewers will be instructed to consider all three factors (Factors 1, 2 and 3) in arriving at their Overall Impact Score (scored 1-9), reflecting the overall scientific and technical merit of the application.

  • Factor 1: Importance of the Research (Significance, Innovation), numerical score (1-9)
  • Factor 2: Rigor and Feasibility (Approach), numerical score (1-9)
  • Factor 3: Expertise and Resources (Investigator, Environment), assessed and considered in the Overall Impact Score, but not individually scored

Within Factor 3 (Expertise and Resources), Investigator and Environment will be assessed in the context of the research proposed. Investigator(s) will be rated as “fully capable” or “additional expertise/capability needed”. Environment will be rated as “appropriate” or “additional resources needed.” If a need for additional expertise or resources is identified, written justification must be provided. Detailed descriptions of the three factors can be found here.
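To make the proposed structure concrete, here is a minimal sketch in Python of how the three-factor framework could be represented. This is purely illustrative on my part; the field names and the simple averaging in overall_impact are my own assumptions, not an NIH specification.

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class SimplifiedReview:
    # Factor 1: Importance of the Research (Significance, Innovation), 1 (best) to 9
    importance: int
    # Factor 2: Rigor and Feasibility (Approach), 1 (best) to 9
    rigor_feasibility: int
    # Factor 3: Expertise and Resources -- assessed, but not individually scored
    investigator: Literal["fully capable", "additional expertise/capability needed"]
    environment: Literal["appropriate", "additional resources needed"]
    justification: str = ""  # required when additional expertise/resources flagged

    def overall_impact(self) -> int:
        """Reviewers weigh all three factors; a simple average of the two
        scored factors stands in here for that holistic judgment."""
        return round((self.importance + self.rigor_feasibility) / 2)
```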

Now, some of the comments on the blog post were very illuminating:

I strongly support streamlining the five current main review criteria into three, and the present five additional criteria into two. This will bring clarity to applicants and reduce the workload on both applicants and reviewers. Blinding reviewers to the applicants’ identities and institutions would be a helpful next step, and would do much to reduce the “rich-getting-richer” / “good ole girls and good ole boys” / “big science” elitism that plagues the present review system, wherein pedigree and connections often outweigh substance and creativity.

I support the proposed changes. The shift away from “innovation” will help reduce the tendency to create hype around a proposed research direction. The shift away from Investigator and Environment assessments will help reduce bias toward already funded investigators in large well-known institutions.

As a reviewer for 5 years, I believe that the proposed changes are a step in the right direction, refocusing the review on whether the science SHOULD be done and whether it CAN BE DONE WELL, while eliminating burdensome and unhelpful sections of review that are better handled administratively. I particularly believe that the de-emphasis of innovation (which typically focuses on technical innovation) will improve evaluation of the overall science, and de-emphasis of review of minor technical details will, if implemented correctly, reduce the “downward pull” on scores for approach. The above comments reference blinded reviews, but I did not see this in the proposed recommendations. I do not believe this is a good idea for several reasons: 1) Blinding of the applicant and institution is not likely feasible for many of the reasons others have described (e.g., self-referencing of prior work), 2) Blinding would eliminate the potential to review investigators’ biosketches and budget justifications, which are critically important in review, 3) Making review blinded would make determination of conflicts of interest harder to identify and avoid, 4) Evaluation of “Investigator and Environment” would be nearly impossible.

Most of the comments were in favor of the proposed changes; however, many admitted that the revisions add confusion on top of the many administrative changes already made to the format and content of grant sections.

Being a Stephen Covey devotee, and having just listened to The 4 Disciplines of Execution, it became more apparent to me that the issues hindering many great ideas from coming to fruition, especially in science, result from systemic problems in the process, not from failures at the level of individual researchers or small companies trying to get their innovations funded or noticed.  In summary, Dr. Covey states that most issues related to the success of any initiative lie NOT in the strategic planning, but in the failure to adhere to a few EXECUTION principles.  Primary among these failures of strategic plans is a lack of accounting for what Dr. Covey calls the ‘whirlwind’: those important but recurring tasks that take us away from achieving the wildly important goals.  In addition, failure to determine lead and lag measures of success hinders such plans.

In this case, the lag measure is INNOVATION.  It appears we have created such a whirlwind, and such a focus on lag measures, that we are incapable of translating great discoveries into INNOVATION.

In the following post, I will focus on how issues relating to Open Access publishing and the dissemination of scientific discovery may be costing us TIME to INNOVATION.  And it appears there are systemic reasons why we are stuck in a rut, so to speak.

The first indication is from a paper published by Johan Chu and James Evans in 2021 in PNAS:

 

Slowed canonical progress in large fields of science

Chu JSG, Evans JA. Slowed canonical progress in large fields of science. Proc Natl Acad Sci U S A. 2021 Oct 12;118(41):e2021636118. doi: 10.1073/pnas.2021636118. PMID: 34607941; PMCID: PMC8522281

 

Abstract

In many academic fields, the number of papers published each year has increased significantly over time. Policy measures aim to increase the quantity of scientists, research funding, and scientific output, which is measured by the number of papers produced. These quantitative metrics determine the career trajectories of scholars and evaluations of academic departments, institutions, and nations. Whether and how these increases in the numbers of scientists and papers translate into advances in knowledge is unclear, however. Here, we first lay out a theoretical argument for why too many papers published each year in a field can lead to stagnation rather than advance. The deluge of new papers may deprive reviewers and readers the cognitive slack required to fully recognize and understand novel ideas. Competition among many new ideas may prevent the gradual accumulation of focused attention on a promising new idea. Then, we show data supporting the predictions of this theory. When the number of papers published per year in a scientific field grows large, citations flow disproportionately to already well-cited papers; the list of most-cited papers ossifies; new papers are unlikely to ever become highly cited, and when they do, it is not through a gradual, cumulative process of attention gathering; and newly published papers become unlikely to disrupt existing work. These findings suggest that the progress of large scientific fields may be slowed, trapped in existing canon. Policy measures shifting how scientific work is produced, disseminated, consumed, and rewarded may be called for to push fields into new, more fertile areas of study.

So the summary of this paper is:

  • The authors examined 1.8 billion citations among 90 million papers across 241 subjects
  • They found that a growing corpus of papers does not lead to turnover of new ideas in a field, but rather to the ossification or entrenchment of canonical (older) ideas
  • This is mainly because older papers are cited more frequently than new papers with new ideas, potentially because authors are trying to get their own papers cited more often for funding and exposure purposes
  • The authors suggest that “fundamental progress may be stymied if quantitative growth of scientific endeavors is not balanced by structures fostering disruptive scholarship and focusing attention on novel ideas” (a toy simulation of this citation dynamic is sketched below)
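The core mechanism the authors describe, in which citations flow disproportionately to already well-cited papers as a field grows, is essentially preferential attachment. Below is a minimal toy simulation of that dynamic; this is my own illustrative sketch in Python, not the authors' actual model, and the parameter values are arbitrary.

```python
import random

# Toy preferential-attachment citation model (illustrative only).
# Each new paper cites earlier papers with probability proportional to
# (citations + 1), so early leaders tend to keep accumulating citations.

def simulate(n_papers: int, cites_per_paper: int = 5, seed: int = 0) -> list:
    rng = random.Random(seed)
    citations = [0] * n_papers
    for new in range(1, n_papers):
        weights = [citations[old] + 1 for old in range(new)]
        cited = rng.choices(range(new), weights=weights,
                            k=min(cites_per_paper, new))
        for old in cited:
            citations[old] += 1
    return citations

cites = simulate(10_000)
top10 = sorted(range(len(cites)), key=lambda i: -cites[i])[:10]
print("Top-cited papers, by arrival order:", sorted(top10))
# The top of the list is dominated by early arrivals: the canon ossifies.
```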

The authors note that, in most cases, science policy reinforces this “more is better” philosophy, where the metrics of publication productivity are either the number of publications or impact as measured by citation rankings.  However, an analysis of citation changes in large versus smaller fields makes it apparent that this process favors older, more established papers and the recirculation of older canonical ideas.

“Rather than resulting in faster turnover of field paradigms, the massive amounts of new publications entrenches the ideas of top-cited papers.”  New ideas are pushed down to the bottom of the citation list and potentially lost in the literature.  The authors suggest that this problem will intensify as the “annual mass” of new publications in each field grows, especially in large fields.  The issue is exacerbated by the deluge of new online ‘open access’ journals, in which authors tend to focus on citing the most highly cited literature.

We may be at a critical junction where, if many papers are published in a short time, new ideas will not be considered as carefully as older ones.  In addition, the proliferation of journals and the blurring of journal hierarchies due to online article-level access can exacerbate this problem.

As a counterpoint, the authors do note that even though many of molecular biology’s most highly cited articles date from 1976, there has been a great deal of innovation since then; however, it may now take many more experiments and much more money to achieve the citation levels those papers produced, and hence scientific productivity is lower.

This issue is seen in the field of economics as well:

Ellison, Glenn. “Is peer review in decline?” Economic Inquiry, vol. 49, no. 3, July 2011, pp. 635+. Gale Academic OneFile, link.gale.com/apps/doc/A261386330/AONE?u=temple_main&sid=bookmark-AONE&xid=f5891002. Accessed 12 Dec. 2022.

Abstract

Over the past decade, there has been a decline in the fraction of papers in top economics journals written by economists from the highest-ranked economics departments. This paper documents this fact and uses additional data on publications and citations to assess various potential explanations. Several observations are consistent with the hypothesis that the Internet improves the ability of high-profile authors to disseminate their research without going through the traditional peer-review process. (JEL A14, O30)

The facts part of this paper documents two main facts:

1. Economists in top-ranked departments now publish very few papers in top field journals. There is a marked decline in such publications between the early 1990s and early 2000s.

2. Comparing the early 2000s with the early 1990s, there is a decline in both the absolute number of papers and the share of papers in the top general interest journals written by Harvard economics department faculty.

Although the second fact just concerns one department, I see it as potentially important to understanding what is happening because it comes at a time when Harvard is widely regarded (I believe correctly) as having ascended to the top position in the profession.

The “decline-of-peer-review” theory I allude to in the title is that the necessity of going through the peer-review process has lessened for high-status authors: in the old days peer-reviewed journals were by far the most effective means of reaching readers, whereas with the growth of the Internet high-status authors can now post papers online and exploit their reputation to attract readers.

Many alternate explanations are possible. I focus on four theories: the decline-in-peer-review theory and three alternatives.

1. The trends could be a consequence of top-school authors’ being crowded out of the top journals by other researchers. Several such stories have an optimistic message, for example, there is more talent entering the profession, old pro-elite biases are being broken down, more schools are encouraging faculty to do cutting-edge research, and the Internet is enabling more cutting-edge research by breaking down informational barriers that had hampered researchers outside the top schools. (2)

2. The trends could be a consequence of the growth of revisions at economics journals discussed in Ellison (2002a, 2002b). In this more pessimistic theory, highly productive researchers must abandon some projects and/or seek out faster outlets to conserve the time now required to publish their most important works.

3. The trends could simply reflect that field journals have declined in quality in some relative sense and become a less attractive place to publish. This theory is meant to encompass also the rise of new journals, which is not obviously desirable or undesirable.

The majority of this paper is devoted to examining various data sources that provide additional details about how economics publishing has changed over the past decade. These are intended both to sharpen understanding of the facts to be explained and to provide tests of auxiliary predictions of the theories. Two main sources of information are used: data on publications and data on citations. The publication data include department-level counts of publications in various additional journals, an individual-level dataset containing records of publications in a subset of journals for thousands of economists, and a very small dataset containing complete data on a few authors’ publication records. The citation data include citations at the paper level for 9,000 published papers and less well-matched data that is used to construct measures of citations to authors’ unpublished works, to departments as a whole, and to various journals.

Inside Job or Deep Impact? Extramural Citations and the Influence of Economic Scholarship

Angrist J, Azoulay P, Ellison G, Hill R, Lu SF. Inside Job or Deep Impact? Extramural Citations and the Influence of Economic Scholarship. Journal of Economic Literature. 2020 Mar;58(1):3-52.

So if innovation is there, but buried under the massive amount of heavily cited older literature, do we see evidence of this in other fields, like medicine?

Why Isn’t Innovation Helping Reduce Health Care Costs?

National health care expenditures (NHEs) in the United States continue to grow at rates outpacing the broader economy: Inflation- and population-adjusted NHEs have increased 1.6 percent faster than the gross domestic product (GDP) between 1990 and 2018. US national health expenditure growth as a share of GDP far outpaces comparable nations in the Organization for Economic Cooperation and Development (17.2 versus 8.9 percent).
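As a quick back-of-the-envelope check on what “1.6 percent faster than GDP” compounds to over that period (my own arithmetic, not from the article):

```python
# Growing 1.6 percentage points faster than GDP each year multiplies
# health care's share of GDP by roughly 1.016 per year.
years = 2018 - 1990
multiplier = 1.016 ** years
print(f"Share multiplier over {years} years: {multiplier:.2f}")  # ~1.56
# e.g., a hypothetical 11% share of GDP in 1990 would compound to ~17%
# by 2018 at that pace, the order of magnitude quoted above.
```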

Multiple recent analyses have proposed that growth in the prices and intensity of US health care services—rather than in utilization rates or demographic characteristics—is responsible for the disproportionate increases in NHEs relative to global counterparts. The consequences of ever-rising costs amid ubiquitous underinsurance in the US include price-induced deferral of care leading to excess morbidity relative to comparable nations.

These patterns exist despite a robust innovation ecosystem in US health care—implying that novel technologies, in isolation, are insufficient to bend the health care cost curve. Indeed, studies have documented that novel technologies directly increase expenditure growth.

Why is our prolific innovation ecosystem not helping reduce costs? The core issue relates to its apparent failure to enhance net productivity—the relative output generated per unit resource required. In this post, we decompose the concept of innovation to highlight situations in which inventions may not increase net productivity. We begin by describing how this issue has taken on increased urgency amid resource constraints magnified by the COVID-19 pandemic. In turn, we describe incentives for the pervasiveness of productivity-diminishing innovations. Finally, we provide recommendations to promote opportunities for low-cost innovation.
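Since net productivity is simply output per unit resource, a two-line example (my own illustration, with made-up numbers) shows why an innovation can improve outcomes yet still reduce productivity:

```python
def net_productivity(output: float, resources: float) -> float:
    """Relative output generated per unit resource required."""
    return output / resources

baseline = net_productivity(output=100, resources=50)     # 2.0
# A widget that improves outcomes by 10% but adds 30% more resource cost
# (staff, administration, maintenance) lowers the ratio:
with_widget = net_productivity(output=110, resources=65)  # ~1.69
print(baseline, with_widget)
```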

 

 

Net Productivity During The COVID-19 Pandemic

The issue of productivity-enhancing innovation is timely, as health care systems have been overwhelmed by COVID-19. Hospitals in Italy, New York City, and elsewhere have lacked adequate capital resources to care for patients with the disease, sufficient liquidity to invest in sorely needed resources, and enough staff to perform all of the necessary tasks.

The critical constraint in these settings is not technology: In fact, the most advanced technology required to routinely treat COVID-19—the mechanical ventilator—was invented nearly 100 years ago in response to polio (the so-called iron lung). Rather, the bottleneck relates to the total financial and human resources required to use the technology—the denominator of net productivity. The clinical implementation of ventilators has been illustrative: Health care workers are still required to operate ventilators on a nearly one-to-one basis, just like in the mid-twentieth century. 

High levels of resources required for implementation of health care technologies constrain the scalability of patient care—such as during respiratory disease outbreaks such as COVID-19. Thus, research to reduce health care costs is the same kind of research we urgently require to promote health care access for patients with COVID-19.

Types Of Innovation And Their Relationship To Expenditure Growth

The widespread use of novel medical technologies has been highlighted as a central driver of NHE growth in the US. We believe that the continued expansion of health care costs is largely the result of innovation that tends to have low productivity (exhibit 1). We argue that these archetypes—novel widgets tacked on to existing workflows to reinforce traditional care models—are exactly the wrong properties to reduce NHEs at the systemic level.

Exhibit 1: Relative productivity of innovation subtypes

Source: Authors’ analysis.

Content Versus Process Innovation

Content (also called technical) innovation refers to the creation of new widgets, such as biochemical agents, diagnostic tools, or therapeutic interventions. Contemporary examples of content innovation include specialty pharmaceuticals, molecular diagnostics, and advanced interventions and imaging.

These may be contrasted with process innovations, which address the organized sequences of activities that implement content. Classically, these include clinical pathways and protocols. They can address the delivery of care for acute conditions, such as central line infections, sepsis, or natural disasters. Alternatively, they can target chronic conditions through initiatives such as team-based management of hypertension and hospital-at-home models for geriatric care. Other processes include hiring staff, delegating labor, and supply chain management.

Performance-Enhancing Versus Cost-Reducing Innovation

Performance-enhancing innovations frequently create incremental outcome gains in diagnostic characteristics, such as sensitivity or specificity, or in therapeutic characteristics, such as biomarkers for disease status. Their performance gains often lead to higher prices compared to existing alternatives.  

Performance-enhancing innovations can be compared to “non-inferior” innovations capable of achieving outcomes approximating those of existing alternatives, but at reduced cost. Industries outside of medicine, such as the computing industry, have relied heavily on the ability to reduce costs while retaining performance.

In health care though, this pattern of innovation is rare. Since passage of the 2010 “Biosimilars” Act aimed at stimulating non-inferior innovation and competition in therapeutics markets, only 17 agents have been approved, and only seven have made it to market. More than three-quarters of all drugs receiving new patents between 2005 and 2015 were “reissues,” meaning they had already been approved, and the new patent reflected changes to the previously approved formula. Meanwhile, the costs of approved drugs have increased over time, at rates between 4 percent and 7 percent annually.

Moreover, the preponderance of performance-enhancing diagnostic and therapeutic innovations tends to address narrow patient cohorts (such as rare diseases or cancer subtypes), with limited clear clinical utility in broader populations. For example, the recently approved eculizumab is a monoclonal antibody approved for paroxysmal nocturnal hemoglobinuria—which affects 1 in 10 million individuals. At the time of its launch, eculizumab was priced at more than $400,000 per year, making it the most expensive drug in modern history. For clinical populations with no available alternatives, drugs such as eculizumab may be cost-effective, pending society’s willingness to pay, and morally desirable, given a society’s values. But such drugs are certainly not cost-reducing.

Additive Versus Substitutive Innovation

Additive innovations are those that append to preexisting workflows, while substitutive innovations reconfigure preexisting workflows. In this way, additive innovations increase the use of precedent services, whereas substitutive innovations decrease precedent service use.

For example, previous analyses have found that novel imaging modalities are additive innovations, as they tend not to diminish use of preexisting modalities. Similarly, novel procedures tend to incompletely replace traditional procedures. In the case of therapeutics and devices, off-label uses in disease groups outside of the approved indication(s) can prompt innovation that is additive. This is especially true, given that off-label prescriptions classically occur after approved methods are exhausted.

Eculizumab once again provides an illustrative example. As of February 2019, the drug had been used for 39 indications (it had been approved for three of those by that time), 69 percent of which lacked any form of evidence of real-world effectiveness. Meanwhile, the drug generated nearly $4 billion in sales in 2019. Again, these expenditures may be something for which society chooses to pay—but they are nonetheless additive, rather than substitutive.

Sustaining Versus Disruptive Innovation

Competitive market theory suggests that incumbents and disruptors innovate differently. Incumbents seek sustaining innovations capable of perpetuating their dominance, whereas disruptors pursue innovations capable of redefining traditional business models.

In health care, while disruptive innovations hold the potential to reduce overall health expenditures, often they run counter to the capabilities of market incumbents. For example, telemedicine can deliver care asynchronously, remotely, and virtually, but large-scale brick-and-mortar medical facilities invest enormous capital in the delivery of synchronous, in-house, in-person care (incentivized by facility fees).

The connection between incumbent business models and the innovation pipeline is particularly relevant given that 58 percent of total funding for biomedical research in the US is now derived from private entities, compared with 46 percent a decade prior. It follows that the growing influence of eminent private organizations may favor innovations supporting their market dominance—rather than innovations that are societally optimal.

Incentives And Repercussions Of High-Cost Innovation

Taken together, these observations suggest that innovation in health care is preferentially designed for revenue expansion rather than for cost reduction. While offering incremental improvements in patient outcomes, thereby creating theoretical value for society, these innovations rarely deliver incremental reductions in short- or long-term costs at the health system level.

For example, content-based, performance-enhancing, additive, sustaining innovations tend to add layers of complexity to the health care system—which in turn require additional administration to manage. The net result is employment growth in excess of outcome improvement, leading to productivity losses. This gap leads to continuously increasing overall expenditures in turn passed along to payers and consumers.

Nonetheless, high-cost innovations are incentivized across health care stakeholders (exhibit 2). From the supply side of innovation, for academic researchers, “breakthrough” and “groundbreaking” innovations constitute the basis for career advancement via funding and tenure. This is despite stakeholders’ frequent inability to generalize early successes to become cost-effective in the clinical setting. As previously discussed, the increasing influence of private entities in setting the medical research agenda is also likely to stimulate innovation benefitting single stakeholders rather than the system.

Exhibit 2: Incentives promoting low-value innovation

Source: Authors’ analysis adapted from Hofmann BM. Too much technology. BMJ. 2015 Feb 16.

From the demand side of innovation (providers and health systems), a combined allure (to provide “cutting-edge” patient care), imperative (to leave “no stone unturned” in patient care), and profit-motive (to amplify fee-for-service reimbursements) spur participation in a “technological arms-race.” The status quo thus remains as Clay Christensen has written: “Our major health care institutions…together overshoot the level of care actually needed or used by the vast majority of patients.”

Christensen’s observations have been validated during the COVID-19 epidemic, as treatment of the disease requires predominantly century-old technology. By continually adopting innovation that routinely overshoots the needs of most patients, layer by layer, health care institutions are accruing costs that quickly become the burden of society writ large.

Recommendations To Reduce The Costs Of Health Care Innovation

Henry Aaron wrote in 2002 that “…the forces that have driven up costs are, if anything, intensifying. The staggering fecundity of biomedical research is increasing…[and] always raises expenditures.” With NHEs spiraling ever-higher, urgency to “bend the cost curve” is mounting. Yet, since much biomedical innovation targets the “flat of the [productivity] curve,” alternative forms of innovation are necessary.

The shortcomings in net productivity revealed by the COVID-19 pandemic highlight the urgent need for redesign of health care delivery in this country, and reevaluation of the innovation needed to support it. Specifically, efforts supporting process redesign are critical to promote cost-reducing, substitutive innovations that can inaugurate new and disruptive business models.

Process redesign rarely involves novel gizmos, so much as rejiggering the wiring of, and connections between, existing gadgets. It targets operational changes capable of streamlining workflows, rather than technical advancements that complicate them. As described above, precisely these sorts of “frugal innovations” have led to productivity improvements yielding lower costs in other high-technology industries, such as the computing industry.

Shrank and colleagues recently estimated that nearly one-third of NHEs—almost $1 trillion—were due to preventable waste. Four of the six categories of waste enumerated by the authors—failure in care delivery, failure in care coordination, low-value care, and administrative complexity—represent ripe targets for process innovation, accounting for $610 billion in waste annually, according to Shrank.

Health systems adopting process redesign methods such as continuous improvement and value-based management have exhibited outcome enhancement and expense reduction simultaneously. Internal processes addressed have included supply chain reconfiguration, operational redesign, outlier reconciliation, and resource standardization.

Despite the potential of process innovation, focus on this area (often bundled into “health services” or “quality improvement” research) occupies only a minute fraction of wallet- or mind-share in the biomedical research landscape, accounting for 0.3 percent of research dollars in medicine. This may be due to a variety of barriers beyond minimal funding. One set of barriers is academic, relating to negative perceptions around rigor and a lack of outlets in which to publish quality improvement research. To achieve health care cost containment over the long term, this dimension of innovation must be destigmatized relative to more traditional manners of innovation by the funders and institutions determining the conditions of the research ecosystem.

Another set of barriers is financial: Innovations yielding cost reduction are less “reimbursable” than are innovations fashioned for revenue expansion. This is especially the case in a fee-for-service system where reimbursement is tethered to cost, which creates perverse incentives for health care institutions to overlook cost increases. However, institutions investing in low-cost innovation will be well-positioned in a rapidly approaching future of value-based care—in which the solvency of health care institutions will rely upon their ability to provide economically efficient care.

Innovating For Cost Control Necessitates Frugality Over Novelty

Restraining US NHEs represents a critical step toward health promotion. Innovation for innovation’s sake—that is content-based, incrementally effective, additive, and sustaining—is unlikely to constrain continually expanding NHEs.

In contrast, process innovation offers opportunities to reduce costs while maintaining high standards of patient care. As COVID-19 stress-tests health care systems across the world, the importance of cost control and productivity amplification for patient care has become apparent.

As such, frugality, rather than novelty, may hold the key to health care cost containment. Redesigning the innovation agenda to stem the tide of ever-rising NHEs is an essential strategy to promote widespread access to care—as well as high-value preventive care—in this country. In the words of investors across Silicon Valley: cost-reducing innovation is no longer a “nice-to-have,” but a “need-to-have” for the future of health and overall well-being in this country.

So Do We Need A New Way of Disseminating Scientific Information?  Can Curation Help?

We had high hopes for Science 2.0, in particular the smashing of data and knowledge silos. However, the digital age, along with 2.0 platforms, seems to have exacerbated the problem somehow. We are still critically short on analysis!



Old Science 1.0 is still the backbone of all scientific discourse, built on the massive amount of experimental and review literature. That literature, however, was in analog format, and we have since moved to a more accessible, digital, open access format for both publications and raw data. Where 1.0 had an organizing structure, such as the Dewey decimal system and library indexing, 2.0 made science more accessible and easier to search thanks to the newer digital formats. Yet both needed an organizing structure: for 1.0 it was the scientific method of data and literature organization, with libraries as the indexers; in 2.0 it relied mostly on an army of volunteers who had little incentive to co-curate and organize the findings and the massive literature.



The Internet and the Web are rapidly adopting a new “Web 3.0” format, with decentralized networks, enhanced virtual experiences, and greater interconnection between people. Here we start the discussion of what the move will look like from Science 2.0, in which the dissemination of scientific findings was revolutionized by piggybacking on Web 2.0 and social media, to a Science 3.0 format, and of what it will involve and which paradigms will be turned upside down.

We have discussed this in other posts such as

Will Web 3.0 Do Away With Science 2.0? Is Science Falling Behind?

and

Curation Methodology – Digital Communication Technology to mitigate Published Information Explosion and Obsolescence in Medicine and Life Sciences

For years the pharmaceutical industry has toyed with the idea of creating innovation networks and innovation hubs.

It has been the main focus of whole conferences

Tales from the Translational Frontier – Four Unique Approaches to Turning Novel Biology into Investable Innovations @BIOConvention #BIO2018

However, it still seems these strategies have not worked.

Is it because we did not have an execution plan? Or because we did not understand the lead measures for success?

Other Related Articles on this Open Access Scientific Journal Include:

Old Industrial Revolution Paradigm of Education Needs to End: How Scientific Curation Can Transform Education

Analysis of Utilizing LPBI Group’s Scientific Curation Platform as an Educational Tool: New Paradigm for Student Engagement

Global Alliance for Genomics and Health Issues Guidelines for Data Siloing and Sharing

Multiple Major Scientific Journals Will Fully Adopt Open Access Under Plan S

eScientific Publishing a Case in Point: Evolution of Platform Architecture Methodologies and of Intellectual Property Development (Content Creation by Curation) Business Model 


Will Web 3.0 Do Away With Science 2.0? Is Science Falling Behind?

Curator: Stephen J. Williams, Ph.D.

UPDATED 4/06/2022

A while back (actually many moons ago) I had put up two posts on this site:

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson and others

Twitter is Becoming a Powerful Tool in Science and Medicine

Each of these posts was on the importance of scientific curation of findings within the realm of social media and Web 2.0, a sub-environment known throughout the scientific communities as Science 2.0, in which expert networks collaborate to produce a massive new corpus of knowledge by sharing their views and insights on peer-reviewed scientific findings.  Through this new medium, the process of curation would itself generate new ideas and new directions for research and discovery.

The platform sort of looked like the image below:

 

This system lay above the platform of the original Science 1.0, made up of all the scientific journals, books, and traditional literature:

In the old Science 1.0 format, scientific dissemination took the form of hard-print journals, and library subscriptions were mandatory (and eventually expensive). Open Access has tried to ameliorate the expense problem.

Previous image source: PeerJ.com

To index the massive and voluminous research literature beyond the old Dewey Decimal system, a process of curation was mandatory. Dissemination through the new social media was a natural fit; however, the cost had to be spread out among numerous players. Journals, faced with the high costs of subscriptions, found that their only way into this new media landscape was to become Open Access, a movement first sparked by journals like PLOS and PeerJ but then begrudgingly adopted throughout the landscape. But with any movement or new adoption one gets the Good, the Bad, and the Ugly (as described in the Clive Thompson article cited above). The bad sides of Open Access journals were:

  1. costs are still assumed by the individual researcher, not by the journals
  2. the rise of numerous predatory journals

 

Even PeerJ, in their column celebrating an anniversary of a year’s worth of Open Access success stories, lamented the key issues still facing Open Access in practice, which included the cost and the rise of predatory journals.

In essence, Open Access and Science 2.0 sprang full force BEFORE anyone thought of a way to defray the costs.

 

Can Web 3.0 Finally Offer a Way to Address the High Costs of Scientific Publishing?

What is Web 3.0?

From Wikipedia: https://en.wikipedia.org/wiki/Web3

Web 1.0 and Web 2.0 refer to eras in the history of the Internet as it evolved through various technologies and formats. Web 1.0 refers roughly to the period from 1991 to 2004, where most websites were static webpages, and the vast majority of users were consumers, not producers, of content.[6][7] Web 2.0 is based around the idea of “the web as platform”,[8] and centers on user-created content uploaded to social-networking services, blogs, and wikis, among other services.[9] Web 2.0 is generally considered to have begun around 2004, and continues to the current day.[8][10][4]

Terminology[edit]

The term “Web3”, specifically “Web 3.0”, was coined by Ethereum co-founder Gavin Wood in 2014.[1] In 2020 and 2021, the idea of Web3 gained popularity[citation needed]. Particular interest spiked towards the end of 2021, largely due to interest from cryptocurrency enthusiasts and investments from high-profile technologists and companies.[4][5] Executives from venture capital firm Andreessen Horowitz travelled to Washington, D.C. in October 2021 to lobby for the idea as a potential solution to questions about Internet regulation with which policymakers have been grappling.[11]

Web3 is distinct from Tim Berners-Lee’s 1999 concept for a semantic web, which has also been called “Web 3.0”.[12] Some writers referring to the decentralized concept usually known as “Web3” have used the terminology “Web 3.0”, leading to some confusion between the two concepts.[2][3] Furthermore, some visions of Web3 also incorporate ideas relating to the semantic web.[13][14]

Concept[edit]

Web3 revolves around the idea of decentralization, which proponents often contrast with Web 2.0, wherein large amounts of the web’s data and content are centralized in the fairly small group of companies often referred to as Big Tech.[4]

Specific visions for Web3 differ, but all are heavily based in blockchain technologies, such as various cryptocurrencies and non-fungible tokens (NFTs).[4] Bloomberg described Web3 as an idea that “would build financial assets, in the form of tokens, into the inner workings of almost anything you do online”.[15] Some visions are based around the concepts of decentralized autonomous organizations (DAOs).[16] Decentralized finance (DeFi) is another key concept; in it, users exchange currency without bank or government involvement.[4] Self-sovereign identity allows users to identify themselves without relying on an authentication system such as OAuth, in which a trusted party has to be reached in order to assess identity.[17]

Reception[edit]

Technologists and journalists have described Web3 as a possible solution to concerns about the over-centralization of the web in a few “Big Tech” companies.[4][11] Some have expressed the notion that Web3 could improve data security, scalability, and privacy beyond what is currently possible with Web 2.0 platforms.[14] Bloomberg states that sceptics say the idea “is a long way from proving its use beyond niche applications, many of them tools aimed at crypto traders”.[15] The New York Times reported that several investors are betting $27 billion that Web3 “is the future of the internet”.[18][19]

Some companies, including Reddit and Discord, have explored incorporating Web3 technologies into their platforms in late 2021.[4][20] After heavy user backlash, Discord later announced they had no plans to integrate such technologies.[21] The company’s CEO, Jason Citron, tweeted a screenshot suggesting it might be exploring integrating Web3 into their platform. This led some to cancel their paid subscriptions over their distaste for NFTs, and others expressed concerns that such a change might increase the amount of scams and spam they had already experienced on crypto-related Discord servers.[20] Two days later, Citron tweeted that the company had no plans to integrate Web3 technologies into their platform, and said that it was an internal-only concept that had been developed in a company-wide hackathon.[21]

Some legal scholars quoted by The Conversation have expressed concerns over the difficulty of regulating a decentralized web, which they reported might make it more difficult to prevent cybercrime, online harassment, hate speech, and the dissemination of child abuse images.[13] But, the news website also states that, “[decentralized web] represents the cyber-libertarian views and hopes of the past that the internet can empower ordinary people by breaking down existing power structures.” Some other critics of Web3 see the concept as a part of a cryptocurrency bubble, or as an extension of blockchain-based trends that they see as overhyped or harmful, particularly NFTs.[20] Some critics have raised concerns about the environmental impact of cryptocurrencies and NFTs. Others have expressed beliefs that Web3 and the associated technologies are a pyramid scheme.[5]

Kevin Werbach, author of The Blockchain and the New Architecture of Trust,[22] said that “many so-called ‘web3’ solutions are not as decentralized as they seem, while others have yet to show they are scalable, secure and accessible enough for the mass market”, adding that this “may change, but it’s not a given that all these limitations will be overcome”.[23]

David Gerard, author of Attack of the 50 Foot Blockchain,[24] told The Register that “web3 is a marketing buzzword with no technical meaning. It’s a melange of cryptocurrencies, smart contracts with nigh-magical abilities, and NFTs just because they think they can sell some monkeys to morons”.[25]
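Before moving on, it is worth making tangible the one concrete primitive underneath all of these competing visions: the append-only, tamper-evident ledger. Here is a minimal hash-chain sketch in Python (purely illustrative on my part, not any production blockchain; it omits consensus, signatures, and networking entirely):

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    payload = {k: block[k] for k in ("data", "prev_hash", "time")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(data: dict, prev_hash: str) -> dict:
    # Each block commits to its predecessor, so every participant holding a
    # copy of the chain can detect tampering without a central authority.
    block = {"data": data, "prev_hash": prev_hash, "time": time.time()}
    block["hash"] = block_hash(block)
    return block

def verify(chain: list) -> bool:
    links_ok = all(c["prev_hash"] == p["hash"] for p, c in zip(chain, chain[1:]))
    hashes_ok = all(b["hash"] == block_hash(b) for b in chain)
    return links_ok and hashes_ok

genesis = make_block({"note": "genesis"}, prev_hash="0" * 64)
block1 = make_block({"note": "a curated comment"}, prev_hash=genesis["hash"])
print(verify([genesis, block1]))  # True; mutate genesis["data"] and it fails
```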

Below is an article from the MarketWatch.com Distributed Ledger series about the different blockchains and cryptocurrencies involved.

From Marketwatch: https://www.marketwatch.com/story/bitcoin-is-so-2021-heres-why-some-institutions-are-set-to-bypass-the-no-1-crypto-and-invest-in-ethereum-other-blockchains-next-year-11639690654?mod=home-page

by Frances Yue, Editor of Distributed Ledger, Marketwatch.com

Clayton Gardner, co-CEO of crypto investment management firm Titan, told Distributed Ledger that as crypto embraces broader adoption, he expects more institutions to bypass bitcoin and invest in other blockchains, such as Ethereum, Avalanche, and Terra, in 2022, all of which boast smart-contract features.

Bitcoin traditionally did not support complex smart contracts, which are computer programs stored on blockchains, though a major upgrade in November might have unlocked more potential.

“Bitcoin was originally seen as a macro speculative asset by many funds and for many it still is,” Gardner said. “If anything solidifies its use case, it’s a store of value. It’s not really used as originally intended, perhaps from a medium of exchange perspective.”

For institutions that are looking for blockchains that can “produce utility and some intrinsic value over time,” they might consider some other smart contract blockchains that have been driving the growth of decentralized finance and web 3.0, the third generation of the Internet, according to Gardner. 

“Bitcoin is still one of the most secure blockchains, but I think layer-one, layer-two blockchains beyond Bitcoin will handle the majority of transactions and activities from NFT (nonfungible tokens) to DeFi,” Gardner said. “So I think institutions see that and insofar as they want to put capital to work in the coming months, I think that could be where they just pump the capital.”

Decentralized social media? 

The price of Decentralized Social, or DeSo, a cryptocurrency powering a blockchain that supports decentralized social media applications, surged roughly 74% to about $164 from $94, after Deso was listed at Coinbase Pro on Monday, before it fell to about $95, according to CoinGecko.

In the eyes of Nader Al-Naji, head of the DeSo foundation, decentralized social media has the potential to be “a lot bigger” than decentralized finance.

“Today there are only a few companies that control most of what we see online,” Al-Naji told Distributed Ledger in an interview. But DeSo is “creating a lot of new ways for creators to make money,” Al-Naji said.

“If you find a creator when they’re small, or an influencer, you can invest in that, and then if they become bigger and more popular, you make money and they make money and they get capital early on to produce their creative work,” according to Al-Naji.

BitClout, the first application that was created by Al-Naji and his team on the DeSo blockchain, had initially drawn controversy, as some found that they had profiles on the platform without their consent, while the application’s users were buying and selling tokens representing their identities. Such tokens are called “creator coins.”

Al-Naji responded to the controversy saying that DeSo now supports more than 200 social-media applications including BitClout. “I think that if you don’t like those features, you now have the freedom to use any app you want. Some apps don’t have that functionality at all.”

 

But before I get to the “selling monkeys to morons” quote,

I want to talk about

THE GOOD, THE BAD, AND THE UGLY

THE GOOD

My foray into Science 2.0, and then pondering what the movement into a Science 3.0 would look like, led me to an article by Dr. Vladimir Teif, who studies gene regulation and the nucleosome, and who has created a worldwide group of scientists who discuss chromatin and gene regulation in a journal-club-type format.

For more information on this Fragile Nucleosome journal club see https://generegulation.org/fragile-nucleosome/.

Fragile Nucleosome is an international community of scientists interested in chromatin and gene regulation. Fragile Nucleosome is active in several spaces: one is the Discord server where several hundred scientists chat informally on scientific matters. You can join the Fragile Nucleosome Discord server. Another activity of the group is the organization of weekly virtual seminars on Zoom. Our webinars are usually conducted on Wednesdays 9am Pacific time (5pm UK, 6pm Central Europe). Most previous seminars have been recorded and can be viewed at our YouTube channel. The schedule of upcoming webinars is shown below. Our third activity is the organization of weekly journal clubs detailed at a separate page (Fragile Nucleosome Journal Club).

 

His lab site is at https://generegulation.org/, and he has published a paper describing what he felt the #science2_0 to #science3_0 transition would look like (see his blog page on this at https://generegulation.org/open-science/).

He had coined this concept of Science 3.0 back in 2009.  As Dr. Teif mentioned:

So essentially I first introduced this word Science 3.0 in 2009, and since then we did a lot to implement this in practice. The Twitter account @generegulation is also one of examples

 

This is curious, as we still have an ill-defined concept of what #science3_0 would look like, but it is a good read nonetheless.

His paper, entitled “Science 3.0: Corrections to the Science 2.0 paradigm”, is on the arXiv preprint server (hosted by Cornell) at https://arxiv.org/abs/1301.2522

 

Abstract

Science 3.0: Corrections to the Science 2.0 paradigm

The concept of Science 2.0 was introduced almost a decade ago to describe the new generation of online-based tools for researchers allowing easier data sharing, collaboration and publishing. Although technically sound, the concept still does not work as expected. Here we provide a systematic line of arguments to modify the concept of Science 2.0, making it more consistent with the spirit and traditions of science and Internet. Our first correction to the Science 2.0 paradigm concerns the open-access publication models charging fees to the authors. As discussed elsewhere, we show that the monopoly of such publishing models increases biases and inequalities in the representation of scientific ideas based on the author’s income. Our second correction concerns post-publication comments online, which are all essentially non-anonymous in the current Science 2.0 paradigm. We conclude that scientific post-publication discussions require special anonymization systems. We further analyze the reasons of the failure of the current post-publication peer-review models and suggest what needs to be changed in Science 3.0 to convert Internet into a large journal club. [bold face added]
In this paper it is important to note the transition from Science 1.0, which involved hard-copy journal publications usually accessible only in libraries, to a more digital 2.0 format in which data, papers, and ideas could be easily shared among networks of scientists.

As Dr. Teif states, the term “Science 2.0” was coined back in 2009, and several influential journals, including Science, Nature, and Scientific American, endorsed the term and encouraged scientists to move their discussions online.  However, even though there are now thousands of scientists on Science 2.0 platforms, Dr. Teif notes that membership in many Science 2.0 networking groups, such as those on LinkedIn and ResearchGate, has seemingly saturated over the years, with few new members in recent times.

The consensus is that Science 2.0 networking is beneficial because:

  1. it multiplies the efforts of many scientists, including experts, and adds scientific discourse unavailable in a 1.0 format
  2. online data sharing assists in the process of discovery (evident in preprint servers, bio-curated databases, and GitHub projects)
  3. open-access publishing provides free access to professional articles, and open access may become the only publishing format in the future (although this is highly debatable, as many journals are holding on to a kind of “hybrid open access” format that is not truly open access)
  4. sharing unfinished works, critiques, and opinions creates visibility for scientists, who can receive credit for their expert commentary

There are a few concerns about Science 3.0 that Dr. Teif articulates:

A.  Science 3.0 Still Needs Peer Review

Peer review of scientific findings will always be an imperative in the dissemination of well-done, properly controlled scientific discovery.  Just as Science 2.0 relies on an army of scientific volunteers, the peer review process also relies on an army of scientific experts who give their time to safeguard the credibility of science by ensuring that findings are reliable and data are presented fairly and properly.  It has been very evident, in this time of pandemic and the rapid increase in the volume of preprint-server papers on SARS-CoV-2, that peer review is critical.  Many of the papers on such preprint servers were later either retracted or failed a stringent peer review process.

Many journals of the 1.0 format do not generally reward their peer reviewers, other than the credit researchers can claim on their curricula vitae.  Some journals, like the MDPI journal family, do issue peer-reviewer credits, which can be used to defray the high publication costs of open access (one area of the open access movement that many scientists lament, since the burden of publication cost falls on the individual researcher).

Another issue highlighted is the potential for INFORMATION NOISE arising from the ability to self-publish on Science 2.0 platforms.

 

The NEW BREED was born in 4/2012

An ongoing effort on this platform, https://pharmaceuticalintelligence.com/, is to establish a scientific methodology for curating scientific findings, where one of the goals is to help quell the information noise that can result from the massive amounts of new informatics and data appearing in the biomedical literature.

B.  The Open Access Publishing Model leads to biases and inequalities in idea selection

The open access publishing model has been compared to the model applied by the advertising industry years ago, in which publishers treated journal articles as “advertisements”.  However, NOTHING could be further from the truth.  In advertising, the companies, not the consumer, pay for the ads.  In scientific open access publishing, by contrast, although the consumer (libraries) does not pay for access, the burden of BOTH the cost of doing the research and the cost of publishing the findings is now put on the individual researcher.  Some of these publishing costs can be as high as $4,000 USD per article, which is very high for most researchers.  Many universities try to reimburse these publisher fees for their researchers, but then the cost still falls on the institution as well as the individual researcher, limiting the savings to either.

However, this sets up a situation in which young researchers, who are generally not well funded, struggle with publication costs, creating a biased and inequitable system that rewards well-funded older researchers and bigger academic labs.

C. Post publication comments and discussion require online hubs and anonymization systems

Many recent publications stress the importance of a post-publication review process or system, yet although many big journals like Nature and Science have their own blogs and commentary systems, these are rarely used.  In fact, the data show just one comment per 100 views of a journal article on these systems.  In traditional journals, editors are the referees of comments and have the ability to censor comments or discourse.  The article laments that commenting on journal articles should be as easy as commenting on other social sites, yet scientists are still not offering their comments or opinions.

In a personal experience, a well-written commentary goes through editors, who usually reject a comment as if they were rejecting an original research article.  Thus, I believe, many scientists who fashion a well-researched and referenced reply never see it get the light of day if it is not in the editor's interest.

Therefore anonymity is greatly needed, and its absence may be the hindrance that keeps scientific discourse so limited on these types of Science 2.0 platforms.  Platforms that have had success in this arena include anonymous or pseudonymous platforms like Wikipedia and certain closed LinkedIn professional groups, while more open platforms like Google Knowledge have been failures.
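As a minimal sketch of how a platform might provide anonymity without destroying accountability, the snippet below derives a stable pseudonym from a commenter's identifier using a keyed hash. The secret key, the ORCID-style identifier, and the label format are illustrative assumptions, not a description of any existing platform.

```python
import hmac
import hashlib

# Hypothetical server-side secret; never published. With it, the platform can
# later re-derive (and thus attribute) a pseudonym; without it, readers cannot.
SERVER_SECRET = b"platform-secret-key"

def pseudonym(user_id: str) -> str:
    """Return a stable, non-reversible pseudonym for a registered commenter."""
    digest = hmac.new(SERVER_SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    return f"Reviewer-{digest[:8]}"

# The same user always gets the same handle, so a comment history accrues to
# the pseudonym and can be claimed later if the author chooses.
print(pseudonym("orcid:0000-0002-1825-0097"))
```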

A great example on this platform was a very spirited conversation on LinkedIn on genomics, tumor heterogeneity and personalized medicine which we curated from the LinkedIn discussion (unfortunately LinkedIn has closed many groups) seen here:

Issues in Personalized Medicine: Discussions of Intratumor Heterogeneity from the Oncology Pharma forum on LinkedIn

In this discussion, it was surprising how many scientists from all over the world contributed, over a single weekend, to a great discussion on the topic of tumor heterogeneity.

But many feel such discussions would be safer if they were anonymized.  However, then researchers do not get any credit for their opinions or commentaries.

A major problem is how to take these intangible contributions and turn them into tangible assets that would both promote the discourse and reward those who take the time to improve scientific discussion.

This is where something like NFTs or a decentralized network may become important!

See

https://pharmaceuticalintelligence.com/portfolio-of-ip-assets/
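At the lowest level, a hedged sketch of what "making the intangible tangible" could look like: fingerprint each contribution so that authorship and timing can later be proven, and rewarded, without publishing the author's identity up front. The ledger that would anchor such hashes, whether an NFT contract or another decentralized network, is beyond this sketch; the author ID and comment text are placeholders.

```python
import hashlib
import json
import time

def contribution_record(author_id: str, text: str) -> dict:
    """Fingerprint a comment so authorship and timestamp can be proven later."""
    payload = {"author": author_id, "text": text, "ts": int(time.time())}
    canonical = json.dumps(payload, sort_keys=True).encode()
    digest = hashlib.sha256(canonical).hexdigest()
    # Only the digest need be shared (or anchored on a ledger); revealing the
    # full payload later proves who wrote what, and when.
    return {"sha256": digest, "ts": payload["ts"]}

record = contribution_record("orcid:0000-0002-1825-0097",
                             "Comment on intratumor heterogeneity ...")
print(record["sha256"])
```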

 

UPDATED 5/09/2022

Below is an online @TwitterSpace discussion we had with some young scientists who are just starting out; they gave their thoughts on what SCIENCE 3.0 and the future of the dissemination of science might look like in light of the new metaverse.  However, we have to define each of these terms in the light of science, and not treat the Internet as merely a decentralized marketplace for commonly held goods.

This online discussion was tweeted out and received a fair number of impressions (60) as well as interactions (50).

For the recording, on Twitter as well as in audio format, please see below.

Set a reminder for my upcoming Space! https://t.co/7mOpScZfGN @Pharma_BI @PSMTempleU #science3_0 @science2_0 – Stephen J Williams (@StephenJWillia2), April 28, 2022: https://twitter.com/StephenJWillia2/status/1519776668176502792

 

 

To introduce this discussion, here first is some starting material to frame the discourse.

 






The Internet and the Web are rapidly adopting a new "Web 3.0" format, with decentralized networks, enhanced virtual experiences, and greater interconnection between people. Here we start the discussion of what the move will look like from Science 2.0, in which the dissemination of scientific findings was revolutionized by piggybacking on Web 2.0 and social media, to a Science 3.0 format. What will it involve, and which paradigms will be turned upside down?

Old Science 1.0 is still the backbone of all scientific discourse, built on the massive body of experimental and review literature. That literature, however, was in analog format, and we have since moved to more accessible, digital, open access formats for both publications and raw data. Science 1.0 had an organizing structure: the scientific method for organizing data and literature, with libraries as the indexers using systems like the Dewey decimal system. Science 2.0 made science more accessible and easier to search thanks to the newer digital formats, but it relied on an army of volunteers, mostly without much in the way of incentivization, to co-curate and organize the findings and the massive literature.
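As a minimal sketch of the kind of organizing structure meant here, the digital counterpart of the library card catalog is the inverted index, mapping each term to the documents that mention it; the three article snippets below are invented placeholders.

```python
from collections import defaultdict

# Toy corpus: document IDs mapped to text. Titles are invented placeholders.
articles = {
    "A1": "nitric oxide in platelet aggregation and coagulation",
    "A2": "intratumor heterogeneity and personalized medicine",
    "A3": "nitric oxide and cancer metabolism",
}

# Build the inverted index: term -> set of documents containing it.
index = defaultdict(set)
for doc_id, text in articles.items():
    for term in text.lower().split():
        index[term].add(doc_id)

print(sorted(index["nitric"]))   # ['A1', 'A3']
print(sorted(index["cancer"]))   # ['A3']
```

Search engines automate this step; what 2.0 never solved is who curates, annotates, and maintains such indexes, which is the incentivization problem raised above.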

Each version of Science has its caveats: its benefits as well as its deficiencies. This curation and the ongoing discussion are meant to solidify the basis for the new format, along with definitions and a determination of its structure.

We had high hopes for Science 2.0, in particular for the smashing of data and knowledge silos. However, the digital age, along with the 2.0 platforms, somehow seemed to exacerbate the siloing. We are still critically short on analysis!

 

We really need people and organizations to get on top of this new Web 3.0, or metaverse, so that similar issues do not get in the way: namely, we need to create an organizing structure (perhaps as knowledgebases), we need INCENTIVIZED co-curators, and we need ANALYSIS... lots of it!!

Are these new technologies the cure, or are they just another headache?

 

There were a few overarching themes, whether one was talking about AI, NLP, virtual reality, or other new technologies with respect to this new metaverse, and a consensus of Decentralized, Incentivized, and Integrated was commonly expressed among the attendees.

The following are some slides from representative presentations. [Slide images not reproduced in this text version.]
Other articles of note on this topic in this Open Access Scientific Journal include:

Electronic Scientific AGORA: Comment Exchanges by Global Scientists on Articles published in the Open Access Journal @pharmaceuticalintelligence.com – Four Case Studies

eScientific Publishing a Case in Point: Evolution of Platform Architecture Methodologies and of Intellectual Property Development (Content Creation by Curation) Business Model 

e-Scientific Publishing: The Competitive Advantage of a Powerhouse for Curation of Scientific Findings and Methodology Development for e-Scientific Publishing – LPBI Group, A Case in Point

@PharmaceuticalIntelligence.com –  A Case Study on the LEADER in Curation of Scientific Findings

Real Time Coverage @BIOConvention #BIO2019: Falling in Love with Science: Championing Science for Everyone, Everywhere

Old Industrial Revolution Paradigm of Education Needs to End: How Scientific Curation Can Transform Education

 

Read Full Post »

Article Title, Author/Curator’s Name and Article Views >1,000, 4/2012 – 1/2019 @pharmaceuticalintelligence.com

 

Reporter: Aviva Lev-Ari, PhD, RN

 

Expert, Author, Writer's Initials | Name & Bio | Roles @LPBI Group

LHB | Larry Bernstein, MD, FACP | Member of the Board; Expert, Author, Writer: All Specialties of Medicine & Pathology; Content Consultant to Series B, C, D, E; Editor, Series D Vol. 1 and Series E Vols. 2, 3; Co-Editor, BioMed e-Series (13 of the 16 Vols)

JDP | Justin D. Pearlman, AB, MD, ME, PhD, MA, FACC | Expert, Author, Writer: All Specialties of Medicine, Cardiology and Cardiac Imaging; Content Consultant for Series A, Cardiovascular Diseases; Co-Editor: Vols. 2, 3, 4, 5, 6

ALA | Aviva Lev-Ari, PhD, RN (Ex-SRI Int'l; Ex-MITRE; Ex-McGraw-Hill) | Director and Founder; Editor-in-Chief, @pharmaceuticalintelligence.com; Methodologies Developer: Journal Platform Architect, CURATION of Scientific Findings Modules, REALTIME eProceedings Digital 1-Click Publishing; Expert, Author, Writer: Analytics, Molecular Cardiology, Vascular Biology

TB | Tilda Barliya, PhD (@BIU) | Expert, Author, Writer: Nanotechnology for Drug Delivery; Co-Editor, Series C, Vols. 1, 2

DN | Dror Nir, PhD | Expert, Author, Writer: Cancer & Medical Imaging Algorithms

ZR | Ziv Raviv, PhD (@Technion) | Expert, Author, Writer: Biological Sciences, Cancer

ZS | Zohi Sternberg, PhD | Expert, GUEST Author, Writer: Neurological Sciences

SJW | Stephen J. Williams, PhD Pharmacology, BSc Toxicology (Ex-Fox Chase) | Expert, Author, Writer: Cancer Biology; Co-Editor, Series A, Vol. 1; Co-Editor, Series B, Genomics, Vols. 1, 2; Co-Editor, Series C, Cancer, Vols. 1, 2

DS | Demet Sag, PhD, CRA, GCP | Expert, Author, Writer: Genome Biology, Immunology, Biological Sciences: Cancer

SS | Sudipta Saha, PhD | Expert, Author, Writer: Reproductive Biology, Endocrinology, Bio-Instrumentation; Co-Editor, Series D, Vol. 2, Infectious Diseases

AV | Aviral Vatsa, PhD, MBBS | Expert, Author, Writer: Medical Sciences, Bone Disease, Human Sensation and Cellular Transduction: Physiology and Therapeutics

RS | Ritu Saxena, PhD | Expert, Author, Writer: Biological Sciences, Bone Disease, Cancer (Lung, Liver)

GST | Gail S. Thornton, PhD(c) (Ex-MERCK) | Contributing Editor, Author and Medical Writer; Co-Editor, Series E, Vol. 1, Voices of Patients

RN | Raphael Nir, PhD, MSM, MSc (Ex-Schering-Plough) | Expert, Author, Writer, Member of the Cancer Research Team: Brain Cancer, Liver Cancer, Cytokines; CSO, SBH Sciences, Inc.

MB | Michael R. Briggs, PhD (Ex-Pfizer) | Expert, Author, Writer, Member of the Cancer Research Team: NASH; CSO, Woodland Biosciences

AK | Alan F. Kaul, R.Ph., Pharm.D, M.Sc., M.B.A., FCCP (Ex-Director, BWH Pharmacy) | Expert, Author, Writer: Pharmacology, all aspects of drug development and dispensation; Policy Analyst

AS | Anamika Sarkar, PhD | Expert, Author, Writer: Computational Biology & Bioinformatics

MWF | Marcus Feldman, PhD (Stanford University, Biological Sciences, Center for Genomics; 751 research items, 51,402 reads, 39,126 citations) | Member of the Board; Scientific Counsel: Life Sciences; Content Consultant, Series B, Genomics, Vols. 1, 2; Co-Editor, Vol. 2, NGS

 

Article Title and Views >1,000, 4/2012 – 1/2018

 

 

 

 

Article Title, Author/Curator/Reporter (by initials), Views by eReaders:

Home page / Archives 600,145

Is the Warburg Effect the Cause or the Effect of Cancer: A 21st Century View? LHB 16,720
Do Novel Anticoagulants Affect the PT/INR? The Cases of XARELTO (rivaroxaban) and PRADAXA (dabigatran) JDP, ALA 13,225
Paclitaxel vs Abraxane (albumin-bound paclitaxel) TB 11,872
Recent comprehensive review on the role of ultrasound in breast cancer management DN 11,715
Clinical Indications for Use of Inhaled Nitric Oxide (iNO) in the Adult Patient Market: Clinical Outcomes after Use, Therapy Demand and Cost of Care ALA 7,045
Apixaban (Eliquis): Mechanism of Action, Drug Comparison and Additional Indications ALA 6,435
Mesothelin: An early detection biomarker for cancer (By Jack Andraka) TB 6,309
Our TEAM ALA 6,213
Akt inhibition for cancer treatment, where do we stand today? ZR 4,744
Biochemistry of the Coagulation Cascade and Platelet Aggregation: Nitric Oxide: Platelets, Circulatory Disorders, and Coagulation Effects LHB 4,508
Newer Treatments for Depression: Monoamine, Neurotrophic Factor & Pharmacokinetic Hypotheses ZS 4,188
AstraZeneca’s WEE1 protein inhibitor AZD1775 Shows Success Against Tumors with a SETD2 mutation SJW 4,128
Confined Indolamine 2, 3 dioxygenase (IDO) Controls the Hemeostasis of Immune Responses for Good and Bad DS 3,678
The Centrality of Ca(2+) Signaling and Cytoskeleton Involving Calmodulin Kinases and Ryanodine Receptors in Cardiac Failure, Arterial Smooth Muscle, Post-ischemic Arrhythmia, Similarities and Differences, and Pharmaceutical Targets LHB 3,652
FDA Guidelines For Developmental and Reproductive Toxicology (DART) Studies for Small Molecules SJW 3,625
Cardiovascular Diseases, Volume One: Perspectives on Nitric Oxide in Disease Mechanisms Multiple Authors 3,575
Interaction of enzymes and hormones SS 3,546
AMPK Is a Negative Regulator of the Warburg Effect and Suppresses Tumor Growth In Vivo SJW 3,403
Causes and imaging features of false positives and false negatives on 18F-PET/CT in oncologic imaging DN 3,399
Introduction to Transdermal Drug Delivery (TDD) system and nanotechnology TB 3,371
Founder ALA 3,363
BioMed e-Series ALA 3,246
Signaling and Signaling Pathways LHB 3,178
Sexed Semen and Embryo Selection in Human Reproduction and Fertility Treatment SS 3,044
Alternative Designs for the Human Artificial Heart: Patients in Heart Failure – Outcomes of Transplant (donor)/Implantation (artificial) and Monitoring Technologies for the Transplant/Implant Patient in the Community JDP, LHB, ALA 3,034
The mechanism of action of the drug ‘Acthar’ for Systemic Lupus Erythematosus (SLE) Dr. Karra 3,016
VISION ALA 2,988
Targeting the Wnt Pathway [7.11] LHB 2,961
Bone regeneration and nanotechnology AV 2,922
Pacemakers, Implantable Cardioverter Defibrillators (ICD) and Cardiac Resynchronization Therapy (CRT) ALA 2,892
The History and Creators of Total Parenteral Nutrition LHB 2,846
Funding, Deals & Partnerships ALA 2,708
Paclitaxel: Pharmacokinetic (PK), Pharmacodynamic (PD) and Pharmacogenomics (PG) TB 2,700
LIK 066, Novartis, for the treatment of type 2 diabetes LHB 2,693
FDA Adds Cardiac Drugs to Watch List – TOPROL-XL® ALA 2,606
Mitochondria: Origin from oxygen free environment, role in aerobic glycolysis, metabolic adaptation LHB 2,579
Nitric Oxide and Platelet Aggregation Dr. Karra 2,550
Treatment Options for Left Ventricular Failure – Temporary Circulatory Support: Intra-aortic balloon pump (IABP) – Impella Recover LD/LP 5.0 and 2.5, Pump Catheters (Non-surgical) vs Bridge Therapy: Percutaneous Left Ventricular Assist Devices (pLVADs) and LVADs (Surgical) LHB 2,549
Isoenzymes in cell metabolic pathways LHB 2,535
“The Molecular pathology of Breast Cancer Progression” TB 2,491
In focus: Circulating Tumor Cells RS 2,465
Nitric Oxide Function in Coagulation – Part II LHB 2,444
Monoclonal Antibody Therapy and Market DS 2,443
Update on FDA Policy Regarding 3D Bioprinted Material SJW 2,410
Journal PharmaceuticalIntelligence.com ALA 2,340
A Primer on DNA and DNA Replication LHB 2,323
Pyrroloquinoline quinone (PQQ) – an unproved supplement LHB 2,294
Integrins, Cadherins, Signaling and the Cytoskeleton LHB 2,265
Evolution of Myoglobin and Hemoglobin LHB 2,251
DNA Structure and Oligonucleotides LHB 2,187
Lipid Metabolism LHB 2,176
Non-small Cell Lung Cancer drugs – where does the Future lie? RS 2,143
Biosimilars: CMC Issues and Regulatory Requirements ALA 2,101
The SCID Pig: How Pigs are becoming a Great Alternate Model for Cancer Research SJW 2,092
About ALA 2,076
Sex Hormones LHB 2,066
CD47: Target Therapy for Cancer TB 2,041
Peroxisome proliferator-activated receptor (PPAR-gamma) Receptors Activation: PPARγ transrepression for Angiogenesis in Cardiovascular Disease and PPARγ transactivation for Treatment of Diabetes ALA 2,017
Swiss Paraplegic Centre, Nottwil, Switzerland – A World-Class Clinic for Spinal Cord Injuries GST 1,989
Introduction to Tissue Engineering; Nanotechnology applications TB 1,964
Problems of vegetarianism SS 1,940
The History of Infectious Diseases and Epidemiology in the late 19th and 20th Century LHB 1,817
The top 15 best-selling cancer drugs in 2022 & Projected Sales in 2020 of World’s Top Ten Oncology Drugs ALA 1,816
Nanotechnology: Detecting and Treating metastatic cancer in the lymph node TB 1,812
Unique Selling Proposition (USP) — Building Pharmaceuticals Brands ALA 1,809
Wnt/β-catenin Signaling [7.10] LHB 1,777
The role of biomarkers in the diagnosis of sepsis and patient management LHB 1,766
Neonatal Pathophysiology LHB 1,718
Nanotechnology and MRI imaging TB 1,672
Cardiovascular Complications: Death from Reoperative Sternotomy after prior CABG, MVR, AVR, or Radiation; Complications of PCI; Sepsis from Cardiovascular Interventions JDP, ALA 1,659
Ultrasound-based Screening for Ovarian Cancer DN 1,655
Justin D. Pearlman, AB, MD, ME, PhD, MA, FACC, Expert, Author, Writer, Editor & Content Consultant for e-SERIES A: Cardiovascular Diseases JDP 1,653
Scientific and Medical Affairs Chronological CV ALA 1,619
Competition in the Ecosystem of Medical Devices in Cardiac and Vascular Repair: Heart Valves, Stents, Catheterization Tools and Kits for Open Heart and Minimally Invasive Surgery (MIS) ALA 1,609
Stenting for Proximal LAD Lesions ALA 1,603
Mitral Valve Repair: Who is a Patient Candidate for a Non-Ablative Fully Non-Invasive Procedure? JDP, ALA 1,602
Nitric Oxide, Platelets, Endothelium and Hemostasis (Coagulation Part II) LHB 1,597
Outcomes in High Cardiovascular Risk Patients: Prasugrel (Effient) vs. Clopidogrel (Plavix); Aliskiren (Tekturna) added to ACE or added to ARB LHB 1,588
Diet and Diabetes LHB 1,572
Clinical Trials Results for Endothelin System: Pathophysiological role in Chronic Heart Failure, Acute Coronary Syndromes and MI – Marker of Disease Severity or Genetic Determination? ALA 1,546
Dealing with the Use of the High Sensitivity Troponin (hs cTn) Assays LHB 1,540
Biosimilars: Intellectual Property Creation and Protection by Pioneer and by Biosimilar Manufacturers ALA 1,534
Altitude Adaptation LHB 1,527
Baby’s microbiome changing due to caesarean birth and formula feeding SS 1,498
Interview with the co-discoverer of the structure of DNA: Watson on The Double Helix and his changing view of Rosalind Franklin ALA 1,488
Triple Antihypertensive Combination Therapy Significantly Lowers Blood Pressure in Hard-to-Treat Patients with Hypertension and Diabetes ALA 1,476
IDO for Commitment of a Life Time: The Origins and Mechanisms of IDO, indolamine 2, 3-dioxygenase DS 1,469
CRISPR/Cas9: Contributions on Endoribonuclease Structure and Function, Role in Immunity and Applications in Genome Engineering LHB 1,468
Cancer Signaling Pathways and Tumor Progression: Images of Biological Processes in the Voice of a Pathologist Cancer Expert LHB 1,452
Signaling transduction tutorial LHB 1,443
Diagnostic Evaluation of SIRS by Immature Granulocytes LHB 1,440
UPDATED: PLATO Trial on ACS: BRILINTA (ticagrelor) better than Plavix® (clopidogrel bisulfate): Lowering chances of having another heart attack ALA 1,426
Cardio-oncology and Onco-Cardiology Programs: Treatments for Cancer Patients with a History of Cardiovascular Disease ALA 1,424
Nanotechnology and Heart Disease TB 1,419
Aviva Lev-Ari, PhD, RN, Director and Founder ALA 1,416
Cardiotoxicity and Cardiomyopathy Related to Drugs Adverse Effects LHB 1,415
Nitric Oxide and it’s impact on Cardiothoracic Surgery TB 1,405
A New Standard in Health Care – Farrer Park Hospital, Singapore’s First Fully Integrated Healthcare/Hospitality Complex GST 1,402
Mitochondrial Damage and Repair under Oxidative Stress LHB 1,398
Ovarian Cancer and fluorescence-guided surgery: A report TB 1,395
Sex determination vs. Sex differentiation SS 1,393
LPBI Group ALA 1,372
Closing the Mammography gap DN 1,368
Cytoskeleton and Cell Membrane Physiology LHB 1,367
Crucial role of Nitric Oxide in Cancer RS 1,364
Medical 3D Printing ALA 1,332
Survivals Comparison of Coronary Artery Bypass Graft (CABG) and Percutaneous Coronary Intervention (PCI) / Coronary Angioplasty LHB 1,325
The Final Considerations of the Role of Platelets and Platelet Endothelial Reactions in Atherosclerosis and Novel Treatments LHB 1,310
Disruption of Calcium Homeostasis: Cardiomyocytes and Vascular Smooth Muscle Cells: The Cardiac and Cardiovascular Calcium Signaling Mechanism LHB, JDP, ALA 1,301
Mitochondrial Dynamics and Cardiovascular Diseases RS 1,284
Nitric Oxide and Immune Responses: Part 2 AV 1,282
Liver Toxicity halts Clinical Trial of IAP Antagonist for Advanced Solid Tumors SJW 1,269
Inactivation of the human papillomavirus E6 or E7 gene in cervical carcinoma cells using a bacterial CRISPR/Cas ALA 1,261
Autophagy LHB 1,255
Mitochondrial fission and fusion: potential therapeutic targets? RS 1,246
Summary of Lipid Metabolism LHB 1,239
Nitric Oxide has a Ubiquitous Role in the Regulation of Glycolysis – with a Concomitant Influence on Mitochondrial Function LHB 1,233
Future of Calcitonin…? Dr. Karra 1,211
Transcatheter Aortic Valve Implantation (TAVI): FDA approves expanded indication for two transcatheter heart valves for patients at intermediate risk for death or complications associated with open-heart surgery ALA 1,197
Gamma Linolenic Acid (GLA) as a Therapeutic tool in the Management of Glioblastoma RN, MB 1,193
Nanotechnology and HIV/AIDS Treatment TB 1,181
Patiromer – New drug for Hyperkalemia ALA 1,179
‘Gamifying’ Drug R&D: Boehringer Ingelheim, Sanofi, Eli Lilly ALA 1,177
A Patient’s Perspective: On Open Heart Surgery from Diagnosis and Intervention to Recovery Guest Author: Ferez S. Nallaseth, Ph.D. 1,173
Assessing Cardiovascular Disease with Biomarkers LHB 1,167
Development Of Super-Resolved Fluorescence Microscopy LHB 1,166
Ubiquitin-Proteosome pathway, Autophagy, the Mitochondrion, Proteolysis and Cell Apoptosis: Part III LHB 1,162
Atrial Fibrillation contributing factor to Death, Autopsy suggests CEO Dave Goldberg had heart arrhythmia before death ALA 1,159
Linus Pauling: On Lipoprotein(a) Patents and On Vitamin C ALA 1,156
Bystolic’s generic Nebivolol – Positive Effect on circulating Endothelial Progenitor Cells Endogenous Augmentation ALA 1,154
The History of Hematology and Related Sciences LHB 1,151
Heroes in Medical Research: Barnett Rosenberg and the Discovery of Cisplatin SJW 1,146
Overview of New Strategy for Treatment of T2DM: SGLT2 Inhibiting Oral Antidiabetic Agents AV 1,143
Imatinib (Gleevec) May Help Treat Aggressive Lymphoma: Chronic Lymphocytic Leukemia (CLL) ALA 1,140
Issues in Personalized Medicine in Cancer: Intratumor Heterogeneity and Branched Evolution Revealed by Multiregion Sequencing SJW 1,137
New England Compounding Center: A Family Business AK 1,120
EpCAM [7.4] LHB 1,113
Amyloidosis with Cardiomyopathy LHB 1,110
Can Mobile Health Apps Improve Oral-Chemotherapy Adherence? The Benefit of Gamification. SJW 1,095
Acoustic Neuroma, Neurinoma or Vestibular Schwannoma: Treatment Options ALA 1,089
Treatment of Refractory Hypertension via Percutaneous Renal Denervation ALA 1,088
Proteomics – The Pathway to Understanding and Decision-making in Medicine LHB 1,085
Low Bioavailability of Nitric Oxide due to Misbalance in Cell Free Hemoglobin in Sickle Cell Disease – A Computational Model AS 1,085
Pancreatic Cancer: Genetics, Genomics and Immunotherapy TB 1,083
A NEW ERA OF GENETIC MANIPULATION   DS 1,075
Targeting Mitochondrial-bound Hexokinase for Cancer Therapy ZR 1,074
Normal and Anomalous Coronary Arteries: Dual Source CT in Cardiothoracic Imaging JDP, ALA 1,062
Transdermal drug delivery (TDD) system and nanotechnology: Part II TB 1,057
Lung Cancer (NSCLC), drug administration and nanotechnology TB 1,046
Pharma World: The Pharmaceutical Industry in Southeast Asia – Pharma CPhI 20-22 March, 2013, Jakarta International Expo, Jakarta, Indonesia ALA 1,045
Nitric Oxide and Sepsis, Hemodynamic Collapse, and the Search for Therapeutic Options LHB 1,044
Targeted delivery of therapeutics to bone and connective tissues: current status and challenges- Part I AV 1,044
Press Coverage ALA 1,036
Carbohydrate Metabolism LHB 1,036
Open Abdominal Aortic Aneurysm (AAA) repair (OAR) vs. Endovascular AAA Repair (EVAR) in Chronic Kidney Disease Patients – Comparison of Surgery Outcomes LHB, ALA 1,032
In focus: Melanoma Genetics RS 1,018
Cholesteryl Ester Transfer Protein (CETP) Inhibitor: Potential of Anacetrapib to treat Atherosclerosis and CAD ALA 1,015
Medical Devices Start Ups in Israel: Venture Capital Sourced Locally – Rainbow Medical (GlenRock) & AccelMed (Arkin Holdings) ALA 1,007
The Development of siRNA-Based Therapies for Cancer ZR 1,003

Other related articles published in this Open Access Online Scientific Journal include the following:

FIVE years of e-Scientific Publishing @pharmaceuticalintellicence.com, Top Articles by Author and by e-Views >1,000, 4/27/2012 to 1/29/2018

https://pharmaceuticalintelligence.com/2017/04/28/five-years-of-e-scientific-publishing-pharmaceuticalintellicence-com-top-articles-by-author-and-by-e-views-1000-4272012-to-4272017/

Read Full Post »

Three Genres in e-Scientific Publishing AND Three Scientists’ Dilemmas

Curator: Aviva Lev-Ari, PhD, RN

 

That’s what I tell students. The way to succeed is to get born at the right time and in the right place. If you can do that then you are bound to succeed. You have to be receptive and have some talent as well.

– Professor Sydney Brenner, professor of genetic medicine at the University of Cambridge and Nobel Laureate in Physiology or Medicine in 2002

 

 

Cell/Nature/Science [CNS]:

  • Subscription-based Access
  • Open Access: online journals, to which scientists pay an upfront fee to cover editing costs, which then ensures the work is available free to access for anyone in perpetuity

Curation of Scientific Findings (i.e., Kindle Direct Publishing [KDP], a royalty-based system):

  1. Free content to e-Readers
  2. Experts, Authors, Writers: volunteers
  3. Editors: volunteers

Scientists' dilemmas, shared across both models:

  1. Confirming or disproving past studies
  2. Decades-long pursuit of a risky “moonshot”
  3. Trendy topics with Editors

 

Genres in e-Scientific Publishing

(A) Cell/Nature/Science

 – June 27, 2017

Elizabeth Dzeng — Feb 24th, 2014

  • http://www.cell.com/
  • http://www.sciencemag.org/
  • https://www.nature.com/
  • In 1998, Elsevier rolled out its plan for the internet age, which would come to be called “The Big Deal”. It offered electronic access to bundles of hundreds of journals at a time: a university would pay a set fee each year – according to a report based on freedom of information requests, Cornell University’s 2009 tab was just short of $2m – and any student or professor could download any journal they wanted through Elsevier’s website. Universities signed up en masse. …. Elsevier owned 24% of the scientific journal market, while Maxwell’s old partners Springer, and his crosstown rivals Wiley-Blackwell, controlled about another 12% each. These three companies accounted for half the market. (An Elsevier representative familiar with the report told me that by their own estimate they publish only 16% of the scientific literature.)  – June 27, 2017.  Elsevier published 420,000 papers last year, after receiving 1.5m submissions  – June 28, 2017 [numbers correction to 6/27/2017.]

(B) Open Access Journals and the Phenomenon

  1. Biochemistry
  2. Biophysics and Structural Biology
  3. Cancer Biology
  4. Cell Biology
  5. Computational and Systems Biology
  6. Developmental Biology and Stem Cells
  7. Epidemiology and Global Health
  8. Genomics and Evolutionary Biology
  9. Microbiology and Infectious Disease
  10. Neuroscience

(C) Curation of Scientific Findings

Scientists’ Dilemmas

(1) Confirming or disproving past studies

(2) Decades-long pursuit of a risky “moonshot”

(3) Trendy Topics with Editors 

 

@ PharmaceuticalIntelligence.com –  A Case Study on the LEADER in Curation of Scientific Findings

https://pharmaceuticalintelligence.com/2017/06/29/pharmaceuticalintelligence-com-a-case-study-on-the-leader-in-curation-of-scientific-findings/

Product Details

Cardiovascular Original Research: Cases in Methodology Design for Content Co-Curation: The Art of Scientific & Medical Curation

Nov 29, 2015 | Kindle eBook

by Larry H. Bernstein MD FCAP and Aviva Lev-Ari PhD RN
Subscribers read for free.
Auto-delivered wirelessly
Sold by: Amazon Digital Services LLC

 

Read Full Post »

Reporter and Curator: Dr. Sudipta Saha, Ph.D.

 

Babies born at or before 25 weeks of gestation have quite low survival rates, and in the US premature birth is the leading cause of infant mortality and morbidity. Just a few weeks of extra 'growing time' can be the difference between severe health problems and a relatively healthy baby.

 

Researchers from The Children's Hospital of Philadelphia Research Institute (USA) have shown it is possible to nurture and protect a mammal in the late stages of gestation inside an artificial womb, a technology that could become a lifesaver for many premature human babies in just a few years.

 

The researchers took eight lambs at 105 to 120 days of gestation (the physiological equivalent of 23 to 24 weeks in humans) and placed them inside the artificial womb. The artificial womb is a sealed, sterile bag filled with an electrolyte solution that acts like amniotic fluid in the uterus. The lamb's own heart pumps blood through the umbilical cord into a gas-exchange machine outside the bag.

 

The artificial womb worked in this study: after just four weeks, the lambs' brains and lungs had matured normally. They had also grown wool and could wiggle, open their eyes, and swallow. Although the study looks incredibly promising, getting the research up to scratch for human babies still requires a big leap.

 

Nevertheless, if all goes well, the researchers hope to test the device on premature humans within three to five years. Potential therapeutic applications of this invention may include treatment of fetal growth retardation related to placental insufficiency or the salvage of preterm infants threatening to deliver after fetal intervention or fetal surgery.

 

The technology may also provide the opportunity to deliver infants affected by congenital malformations of the heart, lung and diaphragm for early correction or therapy before the institution of gas ventilation. Numerous applications related to fetal pharmacologic, stem cell or gene therapy could be facilitated by removing the possibility for maternal exposure and enabling direct delivery of therapeutic agents to the isolated fetus.

 

References:

 

https://www.nature.com/articles/ncomms15112

 

 

https://www.sciencealert.com/researchers-have-successfully-grown-premature-lambs-in-an-artificial-womb

 

http://www.npr.org/sections/health-shots/2017/04/25/525044286/scientists-create-artificial-womb-that-could-help-prematurely-born-babies

 

http://www.telegraph.co.uk/science/2017/04/25/artificial-womb-promises-boost-survival-premature-babies/

 

https://www.theguardian.com/science/2017/apr/25/artificial-womb-for-premature-babies-successful-in-animal-trials-biobag

 

http://www.theblaze.com/news/2017/04/25/new-artificial-womb-technology-could-keep-babies-born-prematurely-alive-and-healthy/

 

http://www.theverge.com/2017/4/25/15421734/artificial-womb-fetus-biobag-uterus-lamb-sheep-birth-premie-preterm-infant

 

http://www.abc.net.au/news/2017-04-26/artificial-womb-could-one-day-keep-premature-babies-alive/8472960

 

https://www.theatlantic.com/health/archive/2017/04/preemies-floating-in-fluid-filled-bags/524181/

 

http://www.independent.co.uk/news/health/artificial-womb-save-premature-babies-lives-scientists-create-childrens-hospital-philadelphia-nature-a7701546.html

 

https://www.cnet.com/news/artificial-womb-births-premature-lambs-human-infants/

 

https://science.slashdot.org/story/17/04/25/2035243/an-artificial-womb-successfully-grew-baby-sheep—-and-humans-could-be-next

 

http://newatlas.com/artificial-womb-premature-babies/49207/

 

https://www.geneticliteracyproject.org/2015/06/12/artificial-wombs-the-coming-era-of-motherless-births/

 

http://news.nationalgeographic.com/2017/04/artificial-womb-lambs-premature-babies-health-science/

 

https://motherboard.vice.com/en_us/article/artificial-womb-free-births-just-got-a-lot-more-real-cambridge-embryo-reproduction

 

http://www.disclose.tv/news/The_Artificial_Womb_Is_Born_Welcome_To_The_WORLD_Of_The_MATRIX/114199

 

 

Read Full Post »

Protein profiling in cancer and metabolic diseases

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Deep Protein Profiling Key

The company has been encouraged by two recent reports that emphasise the importance of protein profiling to improve outcomes in cancer treatment.

http://www.technologynetworks.com/Proteomics/news.aspx?ID=190145

Proteome Sciences plc has been strongly encouraged by two recent reports that emphasise the importance of protein profiling to improve outcomes in cancer treatment. These highlight the growing need for more detailed, personal assessment of protein profiles to improve the management of cancer treatment.

In the first study, two groups from University College London and Cancer Research UK demonstrated that genetic mutations in cancer can lead to changes in the proteins on the cell surface [1]. These are new sequences which are seen as foreign by the body's immune system and, with appropriate immunotherapy, the level of response in lung cancer was greatly enhanced.

However, many of the patients with these types of mutations unfortunately still did not respond, which highlights the need for deeper analysis of protein expression in tumours in order to better appreciate the mechanisms that contribute to treatment failure.

The second study, led by Professor Nigel Bundred of Manchester University, reported that use of two drugs acting on the same breast cancer target, an over-expressed protein called Her-2, was able to eradicate detectable tumours in around 10% of those treated in just 11 days, with 87% of those treated showing a proteomic change indicating that cells had stopped growing and/or cell death had increased [2].

Whilst these results appear very promising, it is worth noting that the over-expressed Her-2 target is present in only about 20% of breast tumours, meaning this combination therapy was successful in clearing tumours in just 2% of the total breast cancer population.

Dr. Ian Pike, Chief Operating Officer of Proteome Sciences commented, “Both these recent studies should rightly be recognised as important steps forward towards better cancer treatment. However, in order to overcome the limitations of current drug therapy programs, a much deeper and more comprehensive analysis of the complex protein networks that regulate tumour growth and survival is required and will be essential to achieve a major advance in the battle to treat cancer.

“Our SysQuant® workflows provide that solution. As an example, in pancreatic cancer [3] we have successfully mapped the complex network of regulatory processes and demonstrate the ability to devise personalised treatment combinations on an individual basis for each patient. A retrospective study with SysQuant® to predict response to the targeted drug Sorafenib in liver cancer is in process and we are planning further prospective trials to guide personalised treatment selection in liver cancer.

“We are already delivering systems-wide biology solutions through SysQuant® and TMTcalibrator™ programs to our clients that are generating novel biological data and results using more sensitive profiling that are helping them to better understand their drug development programs and to provide new biomarkers for tracking patient response in clinical trials.

“We are strongly positioned to deliver more comprehensive analysis of proteins and cellular pathways across other areas of disease and in particular to extend the use of SysQuant® with other leading cancer research groups in liver and other cancers.”

Proteome Sciences has also expanded its offering in personalised medicine through the use of its TMTcalibrator™ technology to uniquely identify protein biomarkers that reveal active cancer and other disease processes in body fluid samples. The importance of these ‘mechanistic’ biomarkers is that they are essential to monitor that drugs are being effective and that they can be used as early biomarkers of disease recurrence.

Using SysQuant® and TMTcalibrator™, Proteome Sciences can deliver more comprehensive analysis and provide unparalleled levels of sensitivity and breadth of coverage of the proteome, enabling faster, more efficient drug development and more accurate disease diagnosis.

 

Discovering ‘Outlier’ Enzymes

Researchers at TSRI and Salk Institute have discovered ‘Outlier’ enzymes that could offer new targets to treat type 2 diabetes and inflammatory disorders.

A team led by scientists at The Scripps Research Institute (TSRI) and the Salk Institute for Biological Studies has discovered two enzymes that appear to play a role in metabolism and inflammation, and that might someday be targeted with drugs to treat type 2 diabetes and inflammatory disorders. The discovery is unusual because the enzymes do not bear a resemblance, in their structures or amino-acid sequences, to any known class of enzymes.

The team of scientists nevertheless identified them as “outlier” members of the serine/threonine hydrolase class, using newer techniques that detect biochemical activity. “A huge fraction of the human ‘proteome’ remains uncharacterized, and this paper shows how chemical approaches can be used to uncover proteins of a given functionality that have eluded classification based on sequence or predicted structure,” said co-senior author Benjamin F. Cravatt, chair of TSRI’s Department of Chemical Physiology.

“In this study, we found two genes that control levels of lipids with anti-diabetic and anti-inflammatory activity, suggesting exciting targets for diabetes and inflammatory diseases,” said co-senior author Alan Saghatelian, who holds the Dr. Frederik Paulsen Chair at the Salk Institute. The study, which appeared as a Nature Chemical Biology Advance Online Publication on March 28, 2016, began as an effort in the Cravatt laboratory to discover and characterize new serine/threonine hydrolases using fluorophosphonate (FP) probes—molecules that selectively bind and, in effect, label the active sites of these enzymes.

Pulling FP-binding proteins out of the entire proteome of test cells and identifying them using mass spectrometry techniques, the team matched nearly all to known hydrolases. The major outlier was a protein called androgen-induced gene 1 protein (AIG1). The only other one was a distant cousin in terms of sequence, a protein called ADTRP. “Neither of these proteins had been characterized as an enzyme; in fact, there had been little functional characterization of them at all,” said William H. Parsons, a research associate in the Cravatt laboratory who was co-first author of the study.

Experiments on AIG1 and ADTRP revealed that they do their enzymatic work in a unique way. “It looks like they have an active site that is novel—it had never been described in the literature,” said Parsons. Initial tests with panels of different enzyme inhibitors showed that AIG1 and ADTRP are moderately inhibited by inhibitors of lipases—enzymes that break down fats and other lipids. But on what specific lipids do these newly discovered outlier enzymes normally work?

At the Salk Institute, the Saghatelian laboratory was investigating a class of lipids it had discovered in 2014. Known as fatty acid esters of hydroxy fatty acids (FAHFAs), these molecules showed strong therapeutic potential. Saghatelian and his colleagues had found that boosting the levels of one key FAHFA lipid normalizes glucose levels in diabetic mice and also reduces inflammation.

“[Ben Cravatt’s] lab was screening panels of lipids to find the ones that their new enzymes work on,” said Saghatelian, who is a former research associate in the Cravatt laboratory. “We suggested they throw FAHFAs in there—and these turned out to be very good substrates.” The Cravatt laboratory soon developed powerful inhibitors of the newly discovered enzymes, and the two labs began working together, using the inhibitors and genetic techniques to explore the enzymes’ functions in vitro and in cultured cells.

Co-first author Matthew J. Kolar, an MD-PhD student, performed most of the experiments in the Saghatelian lab. The team concluded that AIG1 and ADTRP, at least in the cell types tested, appear to work mainly to break down FAHFAs and not any other major class of lipid. In principle, inhibitors of AIG1 and ADTRP could be developed into FAHFA-boosting therapies.

“Our prediction,” said Saghatelian, “is that if FAHFAs do what we think they’re doing, then using an enzyme inhibitor to block their degradation would make FAHFA levels go up and should thus reduce inflammation as well as improve glucose levels and insulin sensitivity.” The two labs are now collaborating on further studies of the new enzymes—and the potential benefits of inhibiting them—in mouse models of diabetes, inflammation and autoimmune disease.

“One of the neat things this study shows,” said Cravatt, “is that even for enzyme classes as well studied as the hydrolases, there may still be hidden members that, presumably by convergent evolution, arrived at that basic enzyme mechanism despite sharing no sequence or structural homology.”

Other co-authors of the study, “AIG1 and ADTRP are atypical integral membrane hydrolases that degrade bioactive FAHFAs,” were Siddhesh S. Kamat, Armand B. Cognetta III, Jonathan J. Hulce and Enrique Saez, of TSRI; and co-senior author Barbara B. Kahn of Beth Israel Deaconess Medical Center and Harvard Medical School.

 

New Weapon Against Breast Cancer

Molecular marker in healthy tissue can predict a woman’s risk of getting the disease, research says.

Harvard Stem Cell Institute (HSCI) researchers at Dana-Farber Cancer Institute (DFCI) and collaborators at Brigham and Women’s Hospital (BWH) have identified a molecular marker in normal breast tissue that can predict a woman’s risk for developing breast cancer, the leading cause of death in women with cancer worldwide.

The work, led by HSCI principal faculty member Kornelia Polyak and Rulla Tamimi of BWH, was published in an early online release and in the April 1 issue of Cancer Research.

The study builds on Polyak’s earlier research finding that women already identified as having a high risk of developing cancer — namely those with a mutation called BRCA1 or BRCA2 — or women who did not give birth before their 30s had a higher number of mammary gland progenitor cells.

In the latest study, Polyak, Tamimi, and their colleagues examined biopsies, some taken as many as four decades ago, from 302 participants in the Nurses’ Health Study and the Nurses’ Health Study II who had been diagnosed with benign breast disease. The researchers compared tissue from the 69 women who later developed cancer to the tissue from the 233 women who did not. They found that women were five times as likely to develop cancer if they had a higher percentage of Ki67, a molecular marker that identifies proliferating cells, in the cells that line the mammary ducts and milk-producing lobules. These cells, called the mammary epithelium, undergo drastic changes throughout a woman’s life, and the majority of breast cancers originate in these tissues.

Doctors already test breast tumors for Ki67 levels, which can inform decisions about treatment, but this is the first time scientists have been able to link Ki67 to precancerous tissue and use it as a predictive tool.

“Instead of only telling women that they don’t have cancer, we could test the biopsies and tell women if they were at high risk or low risk for developing breast cancer in the future,” said Polyak, a breast cancer researcher at Dana-Farber and co-senior author of the paper.

“Currently, we are not able to do a very good job at distinguishing women at high and low risk of breast cancer,” added co-senior author Tamimi, an associate professor at the Harvard T.H. Chan School of Public Health and Harvard Medical School. “By identifying women at high risk of breast cancer, we can better develop individualized screening and also target risk reducing strategies.”

To date, mammograms are the best tool for early detection, but there are risks associated with screening. False positive and false negative results and over-diagnosis could cause psychological distress, delay treatment, or lead to overtreatment, according to the National Cancer Institute (NCI).

Mammography machines also use low doses of radiation. While a single mammogram is unlikely to cause harm, repeated screening can potentially cause cancer, though the NCI writes that the benefits “nearly always outweigh the risks.”

“If we can minimize unnecessary radiation for women at low risk, that would be good,” said Tamimi.

Screening for Ki67 levels would “be easy to apply in the current setting,” said Polyak, though the researchers first want to reproduce the results in an independent cohort of women.

 

AIG1 and ADTRP are atypical integral membrane hydrolases that degrade bioactive FAHFAs

William H Parsons, Matthew J Kolar, …, Barbara B Kahn, Alan Saghatelian & Benjamin F Cravatt

Nature Chemical Biology, 28 March 2016. http://dx.doi.org/10.1038/nchembio.2051

Enzyme classes may contain outlier members that share mechanistic, but not sequence or structural, relatedness with more common representatives. The functional annotation of such exceptional proteins can be challenging. Here, we use activity-based profiling to discover that the poorly characterized multipass transmembrane proteins AIG1 and ADTRP are atypical hydrolytic enzymes that depend on conserved threonine and histidine residues for catalysis. Both AIG1 and ADTRP hydrolyze bioactive fatty acid esters of hydroxy fatty acids (FAHFAs) but not other major classes of lipids. We identify multiple cell-active, covalent inhibitors of AIG1 and show that these agents block FAHFA hydrolysis in mammalian cells. These results indicate that AIG1 and ADTRP are founding members of an evolutionarily conserved class of transmembrane threonine hydrolases involved in bioactive lipid metabolism. More generally, our findings demonstrate how chemical proteomics can excavate potential cases of convergent or parallel protein evolution that defy conventional sequence- and structure-based predictions.

Figure 1: Discovery and characterization of AIG1 and ADTRP as FP-reactive proteins in the human proteome.

 

http://www.nature.com/nchembio/journal/vaop/ncurrent/carousel/nchembio.2051-F1.jpg

(a) Competitive ABPP-SILAC analysis to identify FP-alkyne-inhibited proteins, in which protein enrichment and inhibition were measured in proteomic lysates from SKOV3 cells treated with FP-alkyne (20 μM, 1 h) or DMSO using the FP-biotin…

 

  1. Willems, L.I., Overkleeft, H.S. & van Kasteren, S.I. Current developments in activity-based protein profiling. Bioconjug. Chem. 25, 1181–1191 (2014).
  2. Niphakis, M.J. & Cravatt, B.F. Enzyme inhibitor discovery by activity-based protein profiling. Annu. Rev. Biochem. 83, 341–377 (2014).
  3. Berger, A.B., Vitorino, P.M. & Bogyo, M. Activity-based protein profiling: applications to biomarker discovery, in vivo imaging and drug discovery. Am. J. Pharmacogenomics 4, 371–381 (2004).
  4. Liu, Y., Patricelli, M.P. & Cravatt, B.F. Activity-based protein profiling: the serine hydrolases. Proc. Natl. Acad. Sci. USA 96, 14694–14699 (1999).
  5. Simon, G.M. & Cravatt, B.F. Activity-based proteomics of enzyme superfamilies: serine hydrolases as a case study. J. Biol. Chem. 285, 11051–11055 (2010).
  6. Bachovchin, D.A. et al. Superfamily-wide portrait of serine hydrolase inhibition achieved by library-versus-library screening. Proc. Natl. Acad. Sci. USA 107, 20941–20946 (2010).
  7. Jessani, N. et al. A streamlined platform for high-content functional proteomics of primary human specimens. Nat. Methods 2, 691–697 (2005).
  8. Higa, H.H., Diaz, S. & Varki, A. Biochemical and genetic evidence for distinct membrane-bound and cytosolic sialic acid O-acetyl-esterases: serine-active-site enzymes. Biochem. Biophys. Res. Commun. 144, 1099–1108 (1987).

Academic cross-fertilization by public screening yields a remarkable class of protein phosphatase methylesterase-1 inhibitors

Proc Natl Acad Sci U S A. 2011 Apr 26; 108(17): 6811–6816.    doi:  10.1073/pnas.1015248108
National Institutes of Health (NIH)-sponsored screening centers provide academic researchers with a special opportunity to pursue small-molecule probes for protein targets that are outside the current interest of, or beyond the standard technologies employed by, the pharmaceutical industry. Here, we describe the outcome of an inhibitor screen for one such target, the enzyme protein phosphatase methylesterase-1 (PME-1), which regulates the methylesterification state of protein phosphatase 2A (PP2A) and is implicated in cancer and neurodegeneration. Inhibitors of PME-1 have not yet been described, which we attribute, at least in part, to a dearth of substrate assays compatible with high-throughput screening. We show that PME-1 is assayable by fluorescence polarization-activity-based protein profiling (fluopol-ABPP) and use this platform to screen the 300,000+ member NIH small-molecule library. This screen identified an unusual class of compounds, the aza-β-lactams (ABLs), as potent (IC50 values of approximately 10 nM), covalent PME-1 inhibitors. Interestingly, ABLs did not derive from a commercial vendor but rather an academic contribution to the public library. We show using competitive-ABPP that ABLs are exquisitely selective for PME-1 in living cells and mice, where enzyme inactivation leads to substantial reductions in demethylated PP2A. In summary, we have combined advanced synthetic and chemoproteomic methods to discover a class of ABL inhibitors that can be used to selectively perturb PME-1 activity in diverse biological systems. More generally, these results illustrate how public screening centers can serve as hubs to create spontaneous collaborative opportunities between synthetic chemistry and chemical biology labs interested in creating first-in-class pharmacological probes for challenging protein targets.

Protein phosphorylation is a pervasive and dynamic posttranslational protein modification in eukaryotic cells. In mammals, more than 500 protein kinases catalyze the phosphorylation of serine, threonine, and tyrosine residues on proteins (1). A much more limited number of phosphatases are responsible for reversing these phosphorylation events (2). For instance, protein phosphatase 2A (PP2A) and PP1 are thought to be responsible together for > 90% of the total serine/threonine phosphatase activity in mammalian cells (3). Specificity is imparted on PP2A activity by multiple mechanisms, including dynamic interactions between the catalytic subunit (C) and different protein-binding partners (B subunits), as well as a variety of posttranslational chemical modifications (2, 4). Within the latter category is an unusual methylesterification event found at the C terminus of the catalytic subunit of PP2A that is introduced and removed by a specific methyltransferase (leucine carboxylmethyltransferase-1 or LCMT1) (5, 6) and methylesterase (protein phosphatase methylesterase-1 or PME-1) (7), respectively (Fig. 1A). PP2A carboxymethylation (hereafter referred to as “methylation”) has been proposed to regulate PP2A activity, at least in part, by modulating the binding interaction of the C subunit with various regulatory B subunits (8–10). A predicted outcome of these shifts in subunit association is the targeting of PP2A to different protein substrates in cells. PME-1 has also been hypothesized to stabilize inactive forms of nuclear PP2A (11), and recent structural studies have shed light on the physical interactions between PME-1 and the PP2A holoenzyme (12).

There were several keys to the success of our probe development effort. First, screening for inhibitors of PME-1 benefited from the fluopol-ABPP technology, which circumvented the limited throughput of previously described substrate assays for this enzyme. Second, we were fortunate that the NIH compound library contained several members of the ABL class of small molecules. These chiral compounds, which represent an academic contribution to the NIH library, occupy an unusual portion of structural space that is poorly accessed by commercial compound collections. Although at the time of their original synthesis (23) it may not have been possible to predict whether these ABLs would show specific biological activity, their incorporation into the NIH library provided a forum for screening against many proteins and cellular targets, culminating in their identification as PME-1 inhibitors. We then used advanced chemoproteomic assays to confirm the remarkable selectivity displayed by ABLs for PME-1 across (and beyond) the serine hydrolase superfamily. That the mechanism for PME-1 inhibition involves acylation of the enzyme’s conserved serine nucleophile (Fig. 3) suggests that exploration of a more structurally diverse set of ABLs might uncover inhibitors for other serine hydrolases. In this way, the chemical information gained from a single high-throughput screen may be leveraged to initiate probe development programs for additional enzyme targets.

Projecting forward, this research provides an example of how public small-molecule screening centers can serve as a portal for spawning academic collaborations between chemical biology and synthetic chemistry labs. By continuing to develop versatile high-throughput screens and combining them with a small-molecule library of expanding structural diversity conferred by advanced synthetic methodologies, academic biologists and chemists are well-positioned to collaboratively deliver pharmacological probes for a wide range of proteins and pathways in cell biology.
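For readers unfamiliar with the potency figures quoted above (IC50 values of approximately 10 nM), the sketch below shows how an IC50 is typically estimated by fitting a Hill equation to dose-response data. The data points are synthetic, chosen only to be consistent with a ~10 nM IC50; this is not the fluopol-ABPP pipeline the authors used.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc_nM, ic50, slope):
    """Fraction of enzyme activity remaining at a given inhibitor concentration."""
    return 1.0 / (1.0 + (conc_nM / ic50) ** slope)

# Synthetic dose-response points, roughly consistent with an IC50 near 10 nM.
conc = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])        # inhibitor, nM
activity = np.array([0.92, 0.74, 0.49, 0.25, 0.10, 0.04])    # fraction remaining

(ic50, slope), _ = curve_fit(hill, conc, activity, p0=[10.0, 1.0])
print(f"Estimated IC50: {ic50:.1f} nM (Hill slope {slope:.2f})")
```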

 


 

New Group of Aging-Related Proteins Discovered

http://www.genengnews.com/gen-news-highlights/new-group-of-aging-related-proteins-discovered/81252599/

Scientists have discovered a group of six proteins that may help to divulge secrets of how we age, potentially unlocking new insights into diabetes, Alzheimer’s, cancer, and other aging-related diseases.

The proteins appear to play several roles in our bodies’ cells, from decreasing the amount of damaging free radicals and controlling the rate at which cells die to boosting metabolism and helping tissues throughout the body respond better to insulin. The naturally occurring amounts of each protein decrease with age, leading investigators to believe that they play an important role in the aging process and the onset of diseases linked to older age.

The research team led by Pinchas Cohen, M.D., dean and professor of the University of Southern California Leonard Davis School of Gerontology, identified the proteins and observed their origin from mitochondria and their game-changing roles in metabolism and cell survival. This latest finding builds upon prior research by Dr. Cohen and his team that uncovered two significant proteins, humanin and MOTS-c, hormones that appear to have significant roles in metabolism and diseases of aging.

Unlike most other proteins, humanin and MOTS-c are encoded in mitochondria. Dr. Cohen’s team used computer analysis to see if the part of the mitochondrial genome that provides the code for humanin was coding for other proteins as well. The analysis uncovered the genes for six new proteins, which were dubbed small humanin-like peptides, or SHLPs, 1 through 6 (pronounced “schlep”).

After identifying the six SHLPs and successfully developing antibodies to test for several of them, the team examined both mouse tissues and human cells to determine their abundance in different organs as well as their functions. The proteins were distributed quite differently among organs, which suggests that the proteins have varying functions based on where they are in the body. Of particular interest is SHLP 2, according to Dr. Cohen.  The protein appears to have insulin-sensitizing, antidiabetic effects as well as neuroprotective activity that may emerge as a strategy to combat Alzheimer’s disease. He added that SHLP 6 is also intriguing, with a unique ability to promote cancer cell death and thus potentially target malignant diseases.

Proteins That May Protect Against Age Related Illnesses Discovered

 

The cell proliferation antigen Ki-67 organises heterochromatin

 Michal Sobecki, 

Antigen Ki-67 is a nuclear protein expressed in proliferating mammalian cells. It is widely used in cancer histopathology but its functions remain unclear. Here, we show that Ki-67 controls heterochromatin organisation. Altering Ki-67 expression levels did not significantly affect cell proliferation in vivo. Ki-67 mutant mice developed normally and cells lacking Ki-67 proliferated efficiently. Conversely, upregulation of Ki-67 expression in differentiated tissues did not prevent cell cycle arrest. Ki-67 interactors included proteins involved in nucleolar processes and chromatin regulators. Ki-67 depletion disrupted nucleologenesis but did not inhibit pre-rRNA processing. In contrast, it altered gene expression. Ki-67 silencing also had wide-ranging effects on chromatin organisation, disrupting heterochromatin compaction and long-range genomic interactions. Trimethylation of histone H3K9 and H4K20 was relocalised within the nucleus. Finally, overexpression of human or Xenopus Ki-67 induced ectopic heterochromatin formation. Altogether, our results suggest that Ki-67 expression in proliferating cells spatially organises heterochromatin, thereby controlling gene expression.

 

A protein called Ki-67 is only produced in actively dividing cells, where it is located in the nucleus – the structure that contains most of the cell’s DNA. Researchers often use Ki-67 as a marker to identify which cells are actively dividing in tissue samples from cancer patients, and previous studies indicated that Ki-67 is needed for cells to divide. However, the exact role of this protein was not clear. Before cells can divide they need to make large amounts of new proteins using molecular machines called ribosomes and it has been suggested that Ki-67 helps to produce ribosomes.

Now, Sobecki et al. used genetic techniques to study the role of Ki-67 in mice. The experiments show that Ki-67 is not required for cells to divide in the laboratory or to make ribosomes. Instead, Ki-67 alters the way that DNA is packaged in the nucleus. Loss of Ki-67 from mouse cells resulted in DNA becoming less compact, which in turn altered the activity of genes in those cells.

Read Full Post »

Glycobiology advances

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

The Evolution of the Glycobiology Space

The Nascent Stage of another Omics Field with Biomarker and Therapeutic Potential

Enal Razvi, Ph.D., and Gary Oosta, Ph.D.

http://www.genengnews.com/insight-and-intelligence/the-evolution-of-the-glycobiology-space/77900638/

 


 

Glycobiology is an important field of study with medical applications because it is known that tumor cells alter their glycosylation pattern, which may contribute to their metastatic potential as well as potential immune evasion. [iStock/© vjanez]

There is growing interest in the field of glycobiology given the fact that epitopes with physiological and pathological relevance carry glyco moieties. We believe that another “omics” revolution is on the horizon: the study of the glyco modifications on the surface of cells and their potential as biomarkers and therapeutic targets in many disease classes. Not much industry tracking of this field has taken place. Thus, we sought to map this landscape by examining the entire ensemble of academic publications in this space and teasing apart the trends operative in this field from a qualitative and quantitative perspective. We believe that this methodology of en masse capture and annotation of publications provides an effective approach to evaluating this early-stage field.

Identification and Growth of Glycobiology Publications

Figure 1. Publication volumes in genomics, proteomics, and glycobiology/glycomics, 1975–2015.

For this article, we identified 7,000 publications in the broader glycobiology space and analyzed them in detail. It is important to frame glycobiology in the context of genomics and proteomics as a means to assess the scale of the field. Figure 1 presents the relative sizes of these fields as assessed by publications from 1975 to 2015.

Note that the relative scale of genomics versus proteomics and glycobiology/glycomics in this graph strongly suggests that glycobiology is a nascent space, and thus a driver for us to map its landscape today and as it evolves over the coming years.

Figure 2. (A) Segmentation of the glycobiology landscape. (B) Glycobiology versus glycomics publication growth.

 


To examine closely the various components of the glycobiology space, we segmented the publications database, presented in Figure 2A. Note the relative sizes and growth rates (slopes) of the various segments.

Clearly, glycoconjugates currently are the majority of this space and account for the bulk of the publications.  Glycobiology and glycomics are small but expanding and therefore can be characterized as “nascent market segments.”  These two spaces are characterized in more detail in Figure 2B, which presents their publication growth rates.

Note the very recent increased attention directed at these spaces and hence our drive to initiate industry coverage of these spaces. Figure 2B presents the overall growth and timeline of expansion of these fields—especially glycobiology—but it provides no information about the qualitative nature of these fields.
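The growth rates (slopes) referred to above can be made concrete with a small calculation: fit a straight line to the logarithm of the annual publication count, and the slope converts directly into a percentage growth per year. The sketch below uses invented counts as placeholders, not the article's data.

```python
# Minimal sketch: publication growth rate as the slope of
# log(annual count) vs. year. Counts below are invented placeholders.
import math

years = list(range(2005, 2016))
counts = [120, 135, 150, 170, 200, 230, 270, 320, 380, 450, 540]

ys = [math.log(c) for c in counts]
n = len(years)
mx, my = sum(years) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(years, ys))
         / sum((x - mx) ** 2 for x in years))

# exp(slope) - 1 is the implied year-over-year growth fraction
print(f"implied growth: ~{100 * (math.exp(slope) - 1):.1f}% per year")
```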

Focus of Glycobiology Publications


Figure 2C. Word cloud based on titles of publications in the glycobiology and glycomics spaces.

To understand the focus of publications in this field, and indeed the nature of this field, we constructed a word cloud, presented in Figure 2C, based on the titles of the publications that comprise this space.

There is a marked emphasis on terms such as oligosaccharides, and an emphasis on cells (this is, after all, glycosylation on the surface of cells). Overall, a pictorial representation of the types and classes of modifications that comprise this field emerges in this word cloud, demonstrating the expansion, and the character, of the glycobiology and, to a lesser extent, glycomics spaces.
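For readers who want to reproduce this kind of figure, the underlying computation is just a term-frequency tally over publication titles. A minimal sketch, assuming a file titles.txt with one title per line and a hand-picked stop-word list (both placeholders):

```python
# Minimal sketch: tally term frequencies across publication titles,
# i.e. the raw counts behind a word cloud like Figure 2C.
# "titles.txt" (one title per line) and STOP are placeholders.
import re
from collections import Counter

STOP = {"the", "of", "and", "in", "a", "an", "for", "by", "on", "with", "to"}

def title_term_counts(path):
    counts = Counter()
    with open(path, encoding="utf-8") as fh:
        for title in fh:
            words = re.findall(r"[a-z][a-z-]+", title.lower())
            counts.update(w for w in words if w not in STOP)
    return counts

if __name__ == "__main__":
    for term, n in title_term_counts("titles.txt").most_common(25):
        print(f"{term:25s} {n}")
```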

Characterization of the Glycobiology Space in Journals

Figure 3A. Breakout of publications in the glycobiology/glycomics fields.
Having framed the overall growth of the glycobiology field, we wanted to understand its structure and the classes of researchers as well as publications that comprise this field. To do this, we segmented the publications that constitute this field into the various journals in which glycobiology research is published. Figure 3A presents the breakout of publications by journal to illustrate the “scope” of this field.

The distribution of glycobiology publications across the various journals suggests a very concentrated marketplace that is very technically focused. The majority of the publications segregate into specialized journals on this topic, a pattern very indicative of a field in the very early stages of development—a truly nascent marketplace.


Figure 3B. Origin of publications in the glycobiology/glycomics fields.
We also sought to understand the “origin” of these publications—the breakout between academic- versus industry-derived journals. Figure 3B presents this breakout and shows that these publications are overwhelmingly (92.3%) derived from the academic sector. This again testifies to the nascent nature of this marketplace, which has not yet seen significant engagement from the commercial sector, and it is therefore an important field to characterize and track from the ground up.

Select Biosciences, Inc. further analyzed the growth trajectory of the glycobiology papers, shown in Figure 3C. Although there appears to be some wobble along the way, the overall trajectory is upward, and of late the field is expanding significantly.

In Summary

Figure 3C. Trajectory of the glycobiology space.
Glycobiology is the study of what coats living cells—glycans, or carbohydrates, and glycoconjugates. This is an important field of study with medical applications because it is known that tumor cells alter their glycosylation pattern, which may contribute to their metastatic potential as well as potential immune evasion.

At this point, glycobiology is largely basic research and thus it pales in comparison with the field of genomics. But in 10 years, we predict the study of glycobiology and glycomics will be ubiquitous and in the mainstream.

We started our analysis of this space because we’ve been focusing on many other classes of analytes, such as microRNAs, long non-coding RNAs, oncogenes, tumor suppressor genes, etc., whose potential as biomarkers is becoming established. Glycobiology, on the other hand, represents an entirely new space—a whole new category of modifications that could be analyzed for diagnostic potential and perhaps also for therapeutic targeting.

Today, glycobiology and glycomics are where genomics was at the start of the Human Genome Project. They represent a nascent space with full headroom for growth. Select Biosciences will continue to track this exciting field for research developments as well as the development of biomarkers based on glyco-epitopes.

Enal Razvi, Ph.D., conducted his doctoral work on viral immunology and, after receiving his Ph.D., went on to the Rockefeller University in New York to serve as Aaron Diamond Postdoctoral Fellow under Professor Ralph Steinman (2011 Nobel laureate for his discovery, with Zanvil Cohn in the early 1970s, of dendritic cells). Dr. Razvi subsequently completed his research fellowship at Harvard Medical School. For the last two decades he has worked with small and large companies and consulted for more than 100 clients worldwide. He currently serves as Biotechnology Analyst and Managing Director of SelectBio U.S. He can be reached at enal@selectbio.us.

Gary M. Oosta holds a Ph.D. in Biophysics from the Massachusetts Institute of Technology and a B.A. in Chemistry from Eastern Michigan University. He has 25 years of industrial research experience in technology areas including medical diagnostics, thin-layer coating, bio-effects of electromagnetic radiation, and blood coagulation. Dr. Oosta has authored 20 technical publications and is an inventor on 77 patents worldwide. In addition, he has managed research groups responsible for many other patented innovations. He has a long-standing interest in using patents and publications as strategic technology indicators for future technology selection and new product development.


Read Full Post »

High blood pressure can damage the retina’s blood vessels and limit the retina’s function. It can also put pressure on the optic nerve.

Sourced through Scoop.it from: www.healthline.com

See on Scoop.it: Cardiovascular Disease: PHARMACO-THERAPY

Read Full Post »

Einstein and General Theory of Relativity

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

 

General Relativity And The ‘Lone Genius’ Model Of Science

Chad Orzel

http://www.forbes.com/sites/chadorzel/2015/11/24/general-relativity-and-the-lone-genius-model-of-science/

 

(Credit: AP)

 

One hundred years ago this Wednesday, Albert Einstein gave the last of a series of presentations to the Prussian Academy of Sciences, which marks the official completion of his General Theory of Relativity. This anniversary is generating a good deal of press and various celebratory events, such as the premiere of a new documentary special. If you prefer your physics explanations in the plainest language possible, there’s even an “Up Goer Five” version (personally, I don’t find these all that illuminating, but lots of people seem to love it).

Einstein is, of course, the most iconic scientist in history, and much of the attention to this week’s centennial will center on the idea of his singular genius. Honestly, general relativity is esoteric enough that were it not for Einstein’s personal fame, there probably wouldn’t be all that much attention paid to this outside of the specialist science audience.

But, of course, while the notion of Einstein as a lone, unrecognized genius is a big part of his myth, he didn’t create relativity entirely on his own, as this article in Nature News makes clear. The genesis of relativity is a single simple idea, but even in the early stages, when he developed Special Relativity while working as a patent clerk, he honed his ideas through frequent discussions with friends and colleagues. Most notable among these was probably Michele Besso, whom Einstein later referred to as “the best sounding board in Europe.”

And most of the work on General Relativity came not when Einstein was toiling in obscurity, but after he had begun to climb the academic ladder in Europe. In the ten years between the Special and General theories, he went through a series of faculty jobs of increasing prestige. He also laboriously learned a great deal of mathematics in order to reach the final form of the theory, largely with the assistance of his friend Marcel Grossmann. The path to General Relativity was neither simple nor solitary, and the Nature piece documents both the mis-steps along the way and the various people who helped out.

While Einstein wasn’t working alone, though, the Nature piece also makes an indirect case for his status as a genius worth celebrating. Not because of the way he solved the problem, but through the choice of problem to solve. Einstein pursued a theory that would incorporate gravitation into relativity with dogged determination through those years, but he was one of a very few people working on it. There were a couple of other theories kicking around, particularly Gunnar Nordström’s, but these didn’t generate all that much attention. The mathematician David Hilbert nearly scooped Einstein with the final form of the field equations in November of 1915 (some say he did get there first), but Hilbert was a latecomer who only got interested in the problem of gravitation after hearing about it from Einstein, and his success was a matter of greater familiarity with the necessary math. One of the books I used when I taught a relativity class last year quoted Hilbert as saying that “every child in the streets of Göttingen knows more about four-dimensional geometry than Einstein,” but that Einstein’s physical insight got him to the theory before superior mathematicians.

 

History: Einstein was no lone genius

Michel Janssen & Jürgen Renn   

16 November 2015 (corrected 17 November 2015). Nature Nov 2015; 527(7578).

Lesser-known and junior colleagues helped the great physicist to piece together his general theory of relativity, explain Michel Janssen and Jürgen Renn.

http://www.nature.com/news/history-einstein-was-no-lone-genius-1.18793

 


Marcel Grossmann (left) and Michele Besso (right), university friends of Albert Einstein (centre), both made important contributions to general relativity.

 

A century ago, in November 1915, Albert Einstein published his general theory of relativity in four short papers in the proceedings of the Prussian Academy of Sciences in Berlin [1]. The landmark theory is often presented as the work of a lone genius. In fact, the physicist received a great deal of help from friends and colleagues, most of whom never rose to prominence and have been forgotten [2–5]. (For full reference details of all Einstein texts mentioned in this piece, see Supplementary Information.)

Here we tell the story of how their insights were woven into the final version of the theory. Two friends from Einstein’s student days — Marcel Grossmann and Michele Besso — were particularly important. Grossmann was a gifted mathematician and organized student who helped the more visionary and fanciful Einstein at crucial moments. Besso was an engineer, imaginative and somewhat disorganized, and a caring and lifelong friend to Einstein. A cast of others contributed too.

Einstein met Grossmann and Besso at the Swiss Federal Polytechnical School in Zurich [6] — later renamed the Swiss Federal Institute of Technology (Eidgenössische Technische Hochschule; ETH) — where, between 1896 and 1900, he studied to become a school teacher in physics and mathematics. Einstein also met his future wife at the ETH, classmate Mileva Marić. Legend has it that Einstein often skipped class and relied on Grossmann’s notes to pass exams.

 


 

Grossmann’s father helped Einstein to secure a position at the patent office in Berne in 1902, where Besso joined him two years later. Discussions between Besso and Einstein earned the former the sole acknowledgment in the most famous of Einstein’s 1905 papers, the one introducing the special theory of relativity. As well as publishing the papers that made 1905 his annus mirabilis, Einstein completed his dissertation that year to earn a PhD in physics from the University of Zurich.

In 1907, while still at the patent office, he started to think about extending the principle of relativity from uniform to arbitrary motion through a new theory of gravity. Presciently, Einstein wrote to his friend Conrad Habicht — whom he knew from a reading group in Berne mockingly called the Olympia Academy by its three members — saying that he hoped that this new theory would account for a discrepancy of about 43″ (seconds of arc) per century between Newtonian predictions and observations of the motion of Mercury’s perihelion, the point of its orbit closest to the Sun.

Einstein started to work in earnest on this new theory only after he left the patent office in 1909, to take up professorships first at the University of Zurich and two years later at the Charles University in Prague. He realized that gravity must be incorporated into the structure of space-time, such that a particle subject to no other force would follow the straightest possible trajectory through a curved space-time.

In 1912, Einstein returned to Zurich and was reunited with Grossmann at the ETH. The pair joined forces to generate a fully fledged theory. The relevant mathematics was Gauss’s theory of curved surfaces, which Einstein probably learned from Grossmann’s notes. As we know from recollected conversations, Einstein told Grossmann [7]: “You must help me, or else I’ll go crazy.”

Their collaboration, recorded in Einstein’s ‘Zurich notebook’, resulted in a joint paper published in June 1913, known as the Entwurf (‘outline’) paper. The main advance between this 1913 Entwurf theory and the general relativity theory of November 1915 is in the field equations, which determine how matter curves space-time. The final field equations are ‘generally covariant’: they retain their form no matter what system of coordinates is chosen to express them. The covariance of the Entwurf field equations, by contrast, was severely limited.

 


 

Two Theories

In May 1913, as he and Grossmann put the finishing touches to their Entwurf paper, Einstein was asked to lecture at the annual meeting of the Society of German Natural Scientists and Physicians to be held that September in Vienna, an invitation that reflects the high esteem in which the 34-year-old was held by his peers.

In July 1913, Max Planck and Walther Nernst, two leading physicists from Berlin, came to Zurich to offer Einstein a well-paid and teaching-free position at the Prussian Academy of Sciences in Berlin, which he swiftly accepted and took up in March 1914. Gravity was not a pressing problem for Planck and Nernst; they were mainly interested in what Einstein could do for quantum physics.  (It was Walther Nernst who advised that Germany could not engage in WWI and win unless it was a short war).

Several new theories had been proposed in which gravity, like electromagnetism, was represented by a field in the flat space-time of special relativity. A particularly promising one came from the young Finnish physicist Gunnar Nordström. In his Vienna lecture, Einstein compared his own Entwurf theory to Nordström’s theory. Einstein worked on both theories between May and late August 1913, when he submitted the text of his lecture for publication in the proceedings of the 1913 Vienna meeting.

In the summer of 1913, Nordström visited Einstein in Zurich. Einstein convinced him that the source of the gravitational field in both their theories should be constructed out of the ‘energy–momentum tensor’: in pre-relativistic theories, the density and the flow of energy and momentum were represented by separate quantities; in relativity theory, they are combined into one quantity with ten different components.
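A brief aside on where “ten different components” comes from: the energy–momentum tensor is symmetric, so in four space-time dimensions only ten of its sixteen entries are independent. In modern index notation (ours, not the article's):

```latex
% A symmetric 4x4 tensor has 4 diagonal + 6 independent off-diagonal
% entries, i.e. 10 independent components in total:
T^{\mu\nu} = T^{\nu\mu}, \qquad \mu,\nu \in \{0,1,2,3\}
% T^{00}: energy density;  T^{0i}: energy flux / momentum density;
% T^{ij}: momentum flux (stresses), with i,j = 1,2,3.
```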

 


ETH-Bibliothek Zürich, Bildarchiv

ETH Zurich, where Einstein met friends with whom he worked on general relativity.

 

This energy–momentum tensor made its first appearance in 1907–8 in the special-relativistic reformulation of the theory of electrodynamics of James Clerk Maxwell and Hendrik Antoon Lorentz by Hermann Minkowski. It soon became clear that an energy–momentum tensor could be defined for physical systems other than electromagnetic fields. The tensor took centre stage in the new relativistic mechanics presented in the first textbook on special relativity, Das Relativitätsprinzip, written by Max Laue in 1911. In 1912, a young Viennese physicist, Friedrich Kottler, generalized Laue’s formalism from flat to curved space-time. Einstein and Grossmann relied on this generalization in their formulation of the Entwurf theory. During his Vienna lecture, Einstein called for Kottler to stand up and be recognized for this work [8].

Einstein also worked with Besso that summer to investigate whether the Entwurf theory could account for the missing 43″ per century for Mercury’s perihelion. Unfortunately, they found that it could only explain 18″. Nordström’s theory, Besso checked later, gave 7″ in the wrong direction. These calculations are preserved in the ‘Einstein–Besso manuscript’ of 1913.

Besso contributed significantly to the calculations and raised interesting questions. He wondered, for instance, whether the Entwurf field equations have an unambiguous solution that uniquely determines the gravitational field of the Sun. Historical analysis of extant manuscripts suggests that this query gave Einstein the idea for an argument that reconciled him with the restricted covariance of the Entwurf equations. This ‘hole argument’ seemed to show that generally covariant field equations cannot uniquely determine the gravitational field and are therefore inadmissible [9].

Einstein and Besso also checked whether the Entwurf equations hold in a rotating coordinate system. In that case the inertial forces of rotation, such as the centrifugal force we experience on a merry-go-round, can be interpreted as gravitational forces. The theory seemed to pass this test. In August 1913, however, Besso warned him that it did not. Einstein did not heed the warning, which would come back to haunt him.

 


 

In his lecture in Vienna in September 1913, Einstein concluded his comparison of the two theories with a call for experiment to decide. The Entwurf theory predicts that gravity bends light, whereas Nordström’s does not. It would take another five years to find out. Erwin Finlay Freundlich, a junior astronomer in Berlin with whom Einstein had been in touch since his days in Prague, travelled to Crimea for the solar eclipse of August 1914 to determine whether gravity bends light but was interned by the Russians just as the First World War broke out. Finally, in 1919, English astronomer Arthur Eddington confirmed Einstein’s prediction of light bending by observing the deflection of distant stars seen close to the Sun’s edge during another eclipse, making Einstein a household name [10].
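For orientation, the quantity Eddington measured is the bending angle of a light ray passing a mass at impact parameter b. This standard result is quoted here for context (the article itself gives no formula); the full 1915 theory predicts twice the value Einstein had derived in 1911 from the equivalence principle alone:

```latex
% Light deflection by a mass M at impact parameter b (general relativity):
\delta\theta = \frac{4GM}{c^{2}b}
% For a ray grazing the Sun (b = R_sun) this gives about 1.75 arcseconds,
% the prediction the 1919 eclipse observations supported.
```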

Back in Zurich, after the Vienna lecture, Einstein teamed up with another young physicist, Adriaan Fokker, a student of Lorentz, to reformulate the Nordström theory using the same kind of mathematics that he and Grossmann had used to formulate the Entwurf theory. Einstein and Fokker showed that in both theories the gravitational field can be incorporated into the structure of a curved space-time. This work also gave Einstein a clearer picture of the structure of the Entwurf theory, which helped him and Grossmann in a second joint paper on the theory. By the time it was published in May 1914, Einstein had left for Berlin.

 


 

The Breakup

Turmoil erupted soon after the move. Einstein’s marriage fell apart and Mileva moved back to Zurich with their two young sons. Albert renewed the affair he had started and broken off two years before with his cousin Elsa Löwenthal (née Einstein). The First World War began. Berlin’s scientific elite showed no interest in the Entwurf theory, although renowned colleagues elsewhere did, such as Lorentz and Paul Ehrenfest in Leiden, the Netherlands. Einstein soldiered on.

By the end of 1914, his confidence had grown enough to write a long exposition of the theory. But in the summer of 1915, after a series of his lectures in Göttingen had piqued the interest of the great mathematician David Hilbert, Einstein started to have serious doubts. He discovered to his dismay that the Entwurf theory does not make rotational motion relative. Besso was right. Einstein wrote to Freundlich for help: his “mind was in a deep rut”, so he hoped that the young astronomer as “a fellow human being with unspoiled brain matter” could tell him what he was doing wrong. Freundlich could not help him.


The problem, Einstein soon realized, lay with the Entwurf field equations. Worried that Hilbert might beat him to the punch, Einstein rushed new equations into print in early November 1915, modifying them the following week and again two weeks later in subsequent papers submitted to the Prussian Academy. The field equations were generally covariant at last.
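For reference, in compact modern notation the field equations of the final November paper read as below; Einstein's own 1915 papers wrote them in an equivalent form, with the trace term rearranged:

```latex
% Einstein field equations: space-time curvature (left) is sourced by
% the energy-momentum tensor (right). G is Newton's constant.
R_{\mu\nu} - \tfrac{1}{2}\, R\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
% 'Generally covariant': the equations keep this form under any smooth
% change of coordinates x -> x'(x).
```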

In the first November paper, Einstein wrote that the theory was “a real triumph” of the mathematics of Carl Friedrich Gauss and Bernhard Riemann. He recalled in this paper that he and Grossmann had considered the same equations before, and suggested that if only they had allowed themselves to be guided by pure mathematics rather than physics, they would never have accepted equations of limited covariance in the first place.

Other passages in the first November paper, however, as well as his other papers and correspondence in 1913–15, tell a different story. It was thanks to the elaboration of the Entwurf theory, with the help of Grossmann, Besso, Nordström and Fokker, that Einstein saw how to solve the problems with the physical interpretation of these equations that had previously defeated him.

In setting out the generally covariant field equations in the second and fourth papers, he made no mention of the hole argument. Only when Besso and Ehrenfest pressed him a few weeks after the final paper, dated 25 November, did Einstein find a way out of this bind — by realizing that only coincident events and not coordinates have physical meaning. Besso had suggested a similar escape two years earlier, which Einstein had brusquely rejected [2].

In his third November paper, Einstein returned to the perihelion motion of Mercury. Inserting the astronomical data supplied by Freundlich into the formula he derived using his new theory, Einstein arrived at the result of 43″ per century and could thus fully account for the difference between Newtonian theory and observation. “Congratulations on conquering the perihelion motion,” Hilbert wrote to him on 19 November. “If I could calculate as fast as you can,” he quipped, “the hydrogen atom would have to bring a note from home to be excused for not radiating.”
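The 43″ figure can be checked against the standard general-relativistic expression for the perihelion advance per orbit; the numerical values below are modern ones, inserted purely for illustration:

```latex
% Perihelion advance per orbit for semi-major axis a, eccentricity e,
% central mass M:
\Delta\phi = \frac{6\pi G M}{c^{2}\, a\, (1 - e^{2})}
% Mercury: GM_sun ~ 1.327e20 m^3 s^-2, a ~ 5.79e10 m, e ~ 0.206
% => about 5.0e-7 rad per orbit; with ~415 orbits per century (88-day
% period) this accumulates to roughly 43 arcseconds per century,
% exactly the missing amount.
```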

Einstein kept quiet on why he had been able to do the calculations so fast. They were minor variations on the ones he had done with Besso in 1913. He probably enjoyed giving Hilbert a taste of his own medicine: in a letter to Ehrenfest written in May 1916, Einstein characterized Hilbert’s style as “creating the impression of being superhuman by obfuscating one’s methods”.

Einstein emphasized that his general theory of relativity built on the work of Gauss and Riemann, giants of the mathematical world. But it also built on the work of towering figures in physics, such as Maxwell and Lorentz, and on the work of researchers of lesser stature, notably Grossmann, Besso, Freundlich, Kottler, Nordström and Fokker. As with many other major breakthroughs in the history of science, Einstein was standing on the shoulders of many scientists, not just the proverbial giants [4].

 


Berlin’s physics elite (Fritz Haber, Walther Nernst, Heinrich Rubens, Max Planck) and Einstein’s old and new family (Mileva Einstein-Marić and their sons Eduard and Hans Albert; Elsa Einstein-Löwenthal and her daughters Ilse and Margot) watch as Einstein pursues his new theory of gravity and his idée fixe of generalizing the relativity principle, carried by giants of both physics and mathematics (Isaac Newton, James Clerk Maxwell, Carl Friedrich Gauss, Bernhard Riemann) and scientists of lesser stature (Marcel Grossmann, Gunnar Nordström, Erwin Finlay Freundlich, Michele Besso).

Nature 527, 298–300 (19 Nov 2015)    http://dx.doi.org/10.1038/527298a

 

 

Read Full Post »

Twitter, Google, LinkedIn Enter in the Curation Foray: What’s Up With That?

 

Reporter: Stephen J. Williams, Ph.D.

Recently Twitter announced a new feature which it hopes will increase engagement on its platform. Originally dubbed Project Lightning and now called Moments, this feature relies on many human curators who aggregate and curate tweets surrounding individual live events (which used to be grouped under #Live).

As Madhu Muthukumar (@justmadhu), Twitter’s product manager, wrote in a blog post describing Moments:

“Every day, people share hundreds of millions of tweets. Among them are things you can’t experience anywhere but on Twitter: conversations between world leaders and celebrities, citizens reporting events as they happen, cultural memes, live commentary on the night’s big game, and many more,” the blog post noted. “We know finding these only-on-Twitter moments can be a challenge, especially if you haven’t followed certain accounts. But it doesn’t have to be.”

Please see more about Moments on his blog here.

Moments is a new tab on Twitter’s mobile and desktop home screens where the company will curate trending topics as they’re unfolding in real-time — from citizen-reported news to cultural memes to sports events and more. Moments will fall into five total categories, including “Today,” “News,” “Sports,” “Entertainment” and “Fun.” (Source: Fox)

Now It’s Google’s Turn

 

As Dana Blankenhorn wrote in his SeekingAlpha article Twitter, Google Try It Buzzfeed’s Way With Curation:

What’s a challenge for Google is a direct threat to Twitter’s existence.

For all the talk about what doesn’t work in journalism, curation works. Following the news, collecting it and commenting, and encouraging discussion, is the “secret sauce” for companies like Buzzfeed, Vox, Vice and The Huffington Post, which often wind up getting more traffic from a story at, say The New York Times (NYSE:NYT), than the Times does as a result.

Curation is, in some ways, a throwback to the pre-Internet era. It’s done by people. (At least I think I’m a people.) So as odd as it is for Twitter (NYSE:TWTR) to announce it will curate live events it’s even odder to see Google (NASDAQ:GOOG) (NASDAQ:GOOGL) doing it in a project called YouTube Newswire.

BuzzFeed, the content curation model both companies are chasing, is available on the desktop as well as through a mobile app, and allows sharing of curated news and viral videos.

The feel for both Twitter and Google’s content curation will be like a newspaper, with an army of human content curators determining what is the trendiest news to read or videos to watch.

BuzzFeed articles, or at least, the headlines can easily be mined from any social network but reading the whole article still requires that you open the link within the app or outside using a mobile web browser. Loading takes some time–a few seconds longer. Try browsing the BuzzFeed feed on the app and you’ll notice the obvious difference.

However, earlier this summer a Forbes article, Why Apple, Snapchat and Twitter are betting on human editors, but Facebook and Google aren’t, reported that Apple, Snapchat, and Twitter, as well as LinkedIn Pulse and Instagram, were going to use human editors and curators, while Facebook and Google were going to rely on their powerful algorithms. Google (now Alphabet) executive chairman Eric Schmidt has even called Apple’s human-curated playlists “elitist,” although Google Play has human-curated playlists.

Maybe Google is responding to views on its Google News like this review in VentureBeat:

Google News: Less focused on social signals than textual ones, Google News uses its analytic tools to group together related stories and highlight the biggest ones. Unlike Techmeme, it’s entirely driven by algorithms, and that means it often makes weird choices. I’ve heard that Google uses social sharing signals from Google+ to help determine which stories appear on Google News, but have never heard definitive confirmation of that — and now that Google+ is all but dead, it’s mostly moot. I find Google News an unsatisfying home page, but it is a good place to search for news once you’ve found it.

Now WordPress Too!

 

There is also a WordPress-based curation plugin called Curation Traffic.

According to its developers:

You Own the Platform, You Benefit from the Traffic

“The Curation Traffic™ System is a complete WordPress based content curation solution. Giving you all the tools and strategies you need to put content curation into action.

It is push-button simple and seamlessly integrates with any WordPress site or blog.

With Curation Traffic™, curating your first post is as easy as clicking “Curate” and the same post that may originally only been sent to Facebook or Twitter is now sent to your own site that you control, you benefit from, and still goes across all of your social sites.”

The theory: the more you share on your own platform, the more engagement you get, and the better the marketing experience. And with all the WordPress users out there, there is already an army of human curators.

So That’s Great For News But What About Science and Medicine?

 

The news and trendy topics such as fashion and music are common to most people’s experience. However, more technical areas of science, medicine, and engineering are not in most people’s domain, so aggregation of content needs a process of peer review to sort, basically, fact from fiction. On social media this is extremely important, as sensational stories of breakthroughs can spread virally without proper vetting and even influence patients’ decisions about their own personal care.

Expertise Depends on Experience

In steps the human experience. On this site (www.pharmaceuticalintelligence.com) we attempt to do just this: a consortium of M.D.s, Ph.D.s, and other medical professionals spend their own time not only to aggregate topics of interest but to curate specific topics, adding insight from credible sources across the web.

In Power of Analogy: Curation in Music, Music Critique as a Curation and Curation of Medical Research Findings – A Comparison, Dr. Larry Bernstein compares a museum or music curator to the curation of scientific findings and literature, and draws a similar conclusion from each: that curation can be a tool to gain new insights previously unseen by an observer; a way of stepping back to see a different picture, hear a different song.

 

For instance, using the Twitter platform, we curate #live meeting notes and tweets from meeting attendees (please see links below and links within) to give live conference coverage,

http://pharmaceuticalintelligence.com/press-coverage/

and curation and analysis give rise not only to meeting engagement but to unique insights into presentations.

 

In addition, the use of a WordPress platform allows easy sharing among many different social platforms including Twitter, Google+, LinkedIn, Pinterest etc.

Hopefully, this will catch on to the big powers of Twitter, Google and Facebook to realize there exists armies of niche curation communities which they can draw on for expert curation in the biosciences.

Other posts on this site on Curation include:

 

Inevitability of Curation: Scientific Publishing moves to embrace Open Data, Libraries and Researchers are trying to keep up

The Methodology of Curation for Scientific Research Findings

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson and others

The growing importance of content curation

Data Curation is for Big Data what Data Integration is for Small Data

Stem Cells and Cardiac Repair: Content Curation & Scientific Reporting

Cardiovascular Diseases and Pharmacological Therapy: Curations

Power of Analogy: Curation in Music, Music Critique as a Curation and Curation of Medical Research Findings – A Comparison
Read Full Post »
