
Will Web 3.0 Do Away With Science 2.0? Is Science Falling Behind?

Curator: Stephen J. Williams, Ph.D.

UPDATED 4/06/2022

A while back (actually many moons ago) I put up two posts on this site:

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson and others

Twitter is Becoming a Powerful Tool in Science and Medicine

Each of these posts was about the importance of scientific curation within the realm of social media and Web 2.0, a sub-environment known throughout the scientific communities as Science 2.0, in which expert networks collaborate to produce a massive new corpus of knowledge by sharing their views and insights on peer-reviewed scientific findings. Through this new media, the process of curation would itself generate new ideas and new directions for research and discovery.

The platform sort of looked like the image below:

 

This system lay above a platform of the original Science 1.0, made up of all the scientific journals, books, and traditional literature:

In the old Science 1.0 format, scientific dissemination was in the format of hard print journals, and library subscriptions were mandatory (and eventually expensive). Open Access has tried to ameliorate the expense problem.

Previous image source: PeerJ.com

To index the massive and voluminous research and papers beyond the old Dewey Decimal system, a process of curation was mandatory. Dissemination through the new social media was a natural fit, but the cost had to be spread out among numerous players. Journals, faced with the high cost of subscriptions, found that their only way to reach this new media as an outlet was to become Open Access, a movement first sparked by journals like PLOS and PeerJ but then begrudgingly adopted throughout the landscape. But with any movement or new adoption one gets the Good, the Bad, and the Ugly (as described in the Clive Thompson article cited above). The bad side of Open Access journals:

  1. costs are still assumed by the individual researcher, not by the journals
  2. the rise of numerous predatory journals

 

Even PeerJ, in a column celebrating an anniversary of a year’s worth of Open Access success stories, lamented the key issues still facing Open Access in practice, which included cost and the rise of predatory journals.

In essence, Open Access and Science 2.0 sprang up in full force BEFORE anyone thought of a way to defray the costs.

 

Can Web 3.0 Finally Offer a Way to Right the Issues Facing High Costs of Scientific Publishing?

What is Web 3.0?

From Wikipedia: https://en.wikipedia.org/wiki/Web3

Web 1.0 and Web 2.0 refer to eras in the history of the Internet as it evolved through various technologies and formats. Web 1.0 refers roughly to the period from 1991 to 2004, where most websites were static webpages, and the vast majority of users were consumers, not producers, of content.[6][7] Web 2.0 is based around the idea of “the web as platform”,[8] and centers on user-created content uploaded to social-networking services, blogs, and wikis, among other services.[9] Web 2.0 is generally considered to have begun around 2004, and continues to the current day.[8][10][4]

Terminology[edit]

The term “Web3”, specifically “Web 3.0”, was coined by Ethereum co-founder Gavin Wood in 2014.[1] In 2020 and 2021, the idea of Web3 gained popularity[citation needed]. Particular interest spiked towards the end of 2021, largely due to interest from cryptocurrency enthusiasts and investments from high-profile technologists and companies.[4][5] Executives from venture capital firm Andreessen Horowitz travelled to Washington, D.C. in October 2021 to lobby for the idea as a potential solution to questions about Internet regulation with which policymakers have been grappling.[11]

Web3 is distinct from Tim Berners-Lee’s 1999 concept for a semantic web, which has also been called “Web 3.0”.[12] Some writers referring to the decentralized concept usually known as “Web3” have used the terminology “Web 3.0”, leading to some confusion between the two concepts.[2][3] Furthermore, some visions of Web3 also incorporate ideas relating to the semantic web.[13][14]

Concept[edit]

Web3 revolves around the idea of decentralization, which proponents often contrast with Web 2.0, wherein large amounts of the web’s data and content are centralized in the fairly small group of companies often referred to as Big Tech.[4]

Specific visions for Web3 differ, but all are heavily based in blockchain technologies, such as various cryptocurrencies and non-fungible tokens (NFTs).[4] Bloomberg described Web3 as an idea that “would build financial assets, in the form of tokens, into the inner workings of almost anything you do online”.[15] Some visions are based around the concepts of decentralized autonomous organizations (DAOs).[16] Decentralized finance (DeFi) is another key concept; in it, users exchange currency without bank or government involvement.[4] Self-sovereign identity allows users to identify themselves without relying on an authentication system such as OAuth, in which a trusted party has to be reached in order to assess identity.[17]

Reception[edit]

Technologists and journalists have described Web3 as a possible solution to concerns about the over-centralization of the web in a few “Big Tech” companies.[4][11] Some have expressed the notion that Web3 could improve data security, scalability, and privacy beyond what is currently possible with Web 2.0 platforms.[14] Bloomberg states that sceptics say the idea “is a long way from proving its use beyond niche applications, many of them tools aimed at crypto traders”.[15] The New York Times reported that several investors are betting $27 billion that Web3 “is the future of the internet”.[18][19]

Some companies, including Reddit and Discord, explored incorporating Web3 technologies into their platforms in late 2021.[4][20] Discord’s CEO, Jason Citron, tweeted a screenshot suggesting the company might be exploring integrating Web3 into its platform. This led some users to cancel their paid subscriptions over their distaste for NFTs, while others expressed concerns that such a change might increase the amount of scams and spam they had already experienced on crypto-related Discord servers.[20] Two days later, after heavy user backlash, Citron tweeted that the company had no plans to integrate Web3 technologies into its platform, and said that it was an internal-only concept developed in a company-wide hackathon.[21]

Some legal scholars quoted by The Conversation have expressed concerns over the difficulty of regulating a decentralized web, which they reported might make it more difficult to prevent cybercrime, online harassment, hate speech, and the dissemination of child abuse images.[13] But, the news website also states that, “[decentralized web] represents the cyber-libertarian views and hopes of the past that the internet can empower ordinary people by breaking down existing power structures.” Some other critics of Web3 see the concept as a part of a cryptocurrency bubble, or as an extension of blockchain-based trends that they see as overhyped or harmful, particularly NFTs.[20] Some critics have raised concerns about the environmental impact of cryptocurrencies and NFTs. Others have expressed beliefs that Web3 and the associated technologies are a pyramid scheme.[5]

Kevin Werbach, author of The Blockchain and the New Architecture of Trust,[22] said that “many so-called ‘web3’ solutions are not as decentralized as they seem, while others have yet to show they are scalable, secure and accessible enough for the mass market”, adding that this “may change, but it’s not a given that all these limitations will be overcome”.[23]

David Gerard, author of Attack of the 50 Foot Blockchain,[24] told The Register that “web3 is a marketing buzzword with no technical meaning. It’s a melange of cryptocurrencies, smart contracts with nigh-magical abilities, and NFTs just because they think they can sell some monkeys to morons”.[25]

Below is an article from MarketWatch.com’s Distributed Ledger series about the different forms and cryptocurrencies involved.

From Marketwatch: https://www.marketwatch.com/story/bitcoin-is-so-2021-heres-why-some-institutions-are-set-to-bypass-the-no-1-crypto-and-invest-in-ethereum-other-blockchains-next-year-11639690654?mod=home-page

by Frances Yue, Editor of Distributed Ledger, Marketwatch.com

Clayton Gardner, co-CEO of crypto investment management firm Titan, told Distributed Ledger that as crypto embraces broader adoption, he expects more institutions to bypass bitcoin and invest in other blockchains, such as Ethereum, Avalanche, and Terra, in 2022, which all boast smart-contract features.

Bitcoin traditionally did not support complex smart contracts, which are computer programs stored on blockchains, though a major upgrade in November might have unlocked more potential.

“Bitcoin was originally seen as a macro speculative asset by many funds and for many it still is,” Gardner said. “If anything solidifies its use case, it’s a store of value. It’s not really used as originally intended, perhaps from a medium of exchange perspective.”

For institutions that are looking for blockchains that can “produce utility and some intrinsic value over time,” they might consider some other smart contract blockchains that have been driving the growth of decentralized finance and web 3.0, the third generation of the Internet, according to Gardner. 

“Bitcoin is still one of the most secure blockchains, but I think layer-one, layer-two blockchains beyond Bitcoin will handle the majority of transactions and activities, from NFT (nonfungible tokens) to DeFi,” Gardner said. “So I think institutions see that and insofar as they want to put capital to work in the coming months, I think that could be where they just pump the capital.”

Decentralized social media? 

The price of Decentralized Social, or DeSo, a cryptocurrency powering a blockchain that supports decentralized social media applications, surged roughly 74% to about $164 from $94 after DeSo was listed on Coinbase Pro on Monday, before it fell to about $95, according to CoinGecko.

In the eyes of Nader Al-Naji, head of the DeSo foundation, decentralized social media has the potential to be “a lot bigger” than decentralized finance.

“Today there are only a few companies that control most of what we see online,” Al-Naji told Distributed Ledger in an interview. But DeSo is “creating a lot of new ways for creators to make money,” Al-Naji said.

“If you find a creator when they’re small, or an influencer, you can invest in that, and then if they become bigger and more popular, you make money and they get capital early on to produce their creative work,” according to Al-Naji.

BitClout, the first application created by Al-Naji and his team on the DeSo blockchain, had initially drawn controversy, as some found that they had profiles on the platform without their consent, while the application’s users were buying and selling tokens representing their identities. Such tokens are called “creator coins.”

Al-Naji responded to the controversy, saying that DeSo now supports more than 200 social-media applications including BitClout. “I think that if you don’t like those features, you now have the freedom to use any app you want. Some apps don’t have that functionality at all.”
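To make the creator-coin idea concrete, below is a toy sketch of a linear bonding curve, one common pricing scheme for such coins. This is an illustrative assumption, not DeSo’s or BitClout’s actual mechanism: early supporters pay less per coin, and the marginal price rises as supply grows.

```python
# Toy sketch of a "creator coin" priced on a linear bonding curve.
# All names and parameters here are hypothetical illustrations.

class CreatorCoin:
    def __init__(self, slope=0.01):
        self.supply = 0      # coins in circulation
        self.slope = slope   # marginal price rises linearly with supply

    def price(self):
        """Marginal price of the next coin."""
        return self.slope * (self.supply + 1)

    def buy(self, n):
        """Buy n coins; cost is the sum of the marginal prices."""
        cost = sum(self.slope * (self.supply + k) for k in range(1, n + 1))
        self.supply += n
        return cost

coin = CreatorCoin()
early_cost = coin.buy(10)   # first 10 coins are cheap
late_cost = coin.buy(10)    # next 10 coins cost more as supply grows
assert late_cost > early_cost
```

The point of the curve, as described in the interview, is that supporters who “find a creator when they’re small” pay less per coin than those who arrive after the creator becomes popular.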

 

But before I get to the “selling monkeys to morons” quote,

I want to talk about

THE GOOD, THE BAD, AND THE UGLY

THE GOOD

My foray into Science 2.0, and pondering what a movement into Science 3.0 would entail, led me to an article by Dr. Vladimir Teif, who studies gene regulation and the nucleosome, and who created a worldwide group of scientists who discuss chromatin and gene regulation in a journal-club format.

For more information on this Fragile Nucleosome journal club see https://generegulation.org/fragile-nucleosome/.

Fragile Nucleosome is an international community of scientists interested in chromatin and gene regulation. Fragile Nucleosome is active in several spaces: one is the Discord server where several hundred scientists chat informally on scientific matters. You can join the Fragile Nucleosome Discord server. Another activity of the group is the organization of weekly virtual seminars on Zoom. Our webinars are usually conducted on Wednesdays 9am Pacific time (5pm UK, 6pm Central Europe). Most previous seminars have been recorded and can be viewed at our YouTube channel. The schedule of upcoming webinars is shown below. Our third activity is the organization of weekly journal clubs detailed at a separate page (Fragile Nucleosome Journal Club).

 

His lab site is at https://generegulation.org/, where he published a paper describing what he felt the #science2_0 to #science3_0 transition would look like (see his blog page on this at https://generegulation.org/open-science/).

He had coined this concept of Science 3.0 back in 2009. As Dr. Teif mentioned:

So essentially I first introduced this word Science 3.0 in 2009, and since then we did a lot to implement this in practice. The Twitter account @generegulation is also one of examples

 

This is curious, as we still have an ill-defined concept of what #science3_0 would look like, but it is a good read nonetheless.

His paper, entitled “Science 3.0: Corrections to the Science 2.0 paradigm”, is on the Cornell preprint server at https://arxiv.org/abs/1301.2522.

 

Abstract

Science 3.0: Corrections to the Science 2.0 paradigm

The concept of Science 2.0 was introduced almost a decade ago to describe the new generation of online-based tools for researchers allowing easier data sharing, collaboration and publishing. Although technically sound, the concept still does not work as expected. Here we provide a systematic line of arguments to modify the concept of Science 2.0, making it more consistent with the spirit and traditions of science and Internet. Our first correction to the Science 2.0 paradigm concerns the open-access publication models charging fees to the authors. As discussed elsewhere, we show that the monopoly of such publishing models increases biases and inequalities in the representation of scientific ideas based on the author’s income. Our second correction concerns post-publication comments online, which are all essentially non-anonymous in the current Science 2.0 paradigm. We conclude that scientific post-publication discussions require special anonymization systems. We further analyze the reasons of the failure of the current post-publication peer-review models and suggest what needs to be changed in Science 3.0 to convert Internet into a large journal club. [bold face added]

In this paper it is important to note the transition from Science 1.0, which involved hard-copy journal publications usually accessible only in libraries, to a more digital 2.0 format in which data, papers, and ideas could be easily shared among networks of scientists.

As Dr. Teif states, the term “Science 2.0” was coined back in 2009, and several influential journals including Science, Nature, and Scientific American endorsed the term and suggested that scientists move their discussions online. However, even though there are at present thousands on Science 2.0 platforms, Dr. Teif notes that the number of scientists subscribed to many Science 2.0 networking groups, such as those on LinkedIn and ResearchGate, has seemingly saturated over the years, with few new members in recent times.

The consensus is that Science 2.0 networking is:
  1. good because it multiplies the efforts of many scientists, including experts, and adds to a scientific discourse unavailable in the 1.0 format
  2. good because online data sharing assists in the process of discovery (evident with preprint servers, bio-curated databases, and GitHub projects)
  3. beneficial because open-access publishing provides free access to professional articles, and open access may be the only publishing format in the future (although this is highly debatable, as many journals are holding on to a type of “hybrid open access” format which is not truly open access)
  4. good because sharing unfinished works, critiques, or opinions creates visibility for scientists, where they can receive credit for their expert commentary

Dr. Teif articulates a few concerns about Science 3.0:

A.  Science 3.0 Still Needs Peer Review

Peer review of scientific findings will always be imperative in the dissemination of well-done, properly controlled scientific discovery.  Just as Science 2.0 relies on an army of scientific volunteers, the peer-review process also involves an army of scientific experts who give their time to safeguard the credibility of science by ensuring that findings are reliable and data is presented fairly and properly.  It has been very evident, in this time of pandemic and the rapid increase in the volume of preprint-server papers on SARS-CoV-2, that peer review is critical.  Many papers on such preprint servers were later either retracted or failed a stringent peer-review process.

Now, many journals of the 1.0 format do not generally reward their peer reviewers other than the self-credit that researchers use on their curricula vitae.  Some journals, like the MDPI journal family, do issue peer-reviewer credits, which can be used to defray the high publication costs of open access (one area that many scientists lament about the open access movement, where the burden of publication cost lies on the individual researcher).

An issue which is highlighted is the potential for INFORMATION NOISE regarding the ability to self publish on Science 2.0 platforms.

 

The NEW BREED was born in 4/2012

An ongoing effort on this platform, https://pharmaceuticalintelligence.com/, is to establish a scientific methodology for curating scientific findings, where one of the goals is to help quell the information noise that can result from the massive amounts of new informatics and data in the biomedical literature.

B.  Open Access Publishing Model leads to biases and inequalities in the idea selection

The open access publishing model has been compared to the model applied by the advertising industry years ago, in which publishers considered the journal articles as “advertisements”.  However, NOTHING could be further from the truth.  In advertising, the companies, not the consumer, pay for the ads.  In scientific open access publishing, although the consumer (libraries) does not pay for access, the burden of BOTH the cost of doing the research and of publishing the findings is now placed on the individual researcher.  Some of these publishing costs can be as high as $4,000 USD per article, which is very high for most researchers.  Although many universities help reimburse open access publishing fees, the cost is still borne by the institution and the individual researcher, limiting the savings to either.

However, this sets up a situation in which young researchers, who in general are not well funded, are struggling with the publication costs, and this sets up a bias or inequitable system which rewards the well funded older researchers and bigger academic labs.

C. Post publication comments and discussion require online hubs and anonymization systems

Many recent publications stress the importance of a post-publication review process, yet although many big journals like Nature and Science have their own blogs and commentary systems, these are rarely used.  In fact, there is roughly 1 comment per 100 views of a journal article on these systems.  In traditional journals, editors are the referees of comments and have the ability to censor comments or discourse.  The article laments that commenting on journals should be as easy as commenting on other social sites; however, scientists are not offering their comments or opinions on the matter.

In my personal experience, a well-written commentary goes through editors who often reject a comment as if they were rejecting an original research article.  Thus many scientists’ well-researched and referenced replies, I believe, never see the light of day if they are not in the editor’s interest.

Therefore anonymity is greatly needed, and its lack may be the reason scientific discourse is so limited on these types of Science 2.0 platforms.  Platforms that have had success in this arena include anonymous platforms like Wikipedia and certain closed LinkedIn professional groups, but more open platforms like Google Knowledge have been a failure.

A great example on this platform was a very spirited conversation on LinkedIn on genomics, tumor heterogeneity and personalized medicine which we curated from the LinkedIn discussion (unfortunately LinkedIn has closed many groups) seen here:

Issues in Personalized Medicine: Discussions of Intratumor Heterogeneity from the Oncology Pharma forum on LinkedIn

In this discussion, it was surprising that over a weekend so many scientists from all over the world contributed to a great discussion on the topic of tumor heterogeneity.

But many feel such discussions would be safer if they were anonymized.  However, researchers then do not get any credit for their opinions or commentaries.

A major problem is how to take these intangibles and make them into tangible assets which would both promote the discourse and reward those who take their time to improve scientific discussion.

This is where something like NFTs or a decentralized network may become important!
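As a minimal sketch of how a decentralized network might turn curation effort into a tangible, attributable asset: the toy ledger below records each contribution with a content hash (so authorship can later be verified) and awards credits to the curator. Every class and field name here is a hypothetical illustration, not any existing blockchain or token API.

```python
# Hypothetical sketch: a ledger that rewards curation work with credits,
# making the intangible effort of curation a tangible, attributable asset.

import hashlib
from dataclasses import dataclass, field

@dataclass
class CurationLedger:
    credits: dict = field(default_factory=dict)   # curator -> credit balance
    records: list = field(default_factory=list)   # append-only contribution log

    def record(self, curator, contribution, reward=1):
        # Hash the contribution text so authorship can later be verified
        digest = hashlib.sha256(contribution.encode()).hexdigest()
        self.records.append((curator, digest))
        self.credits[curator] = self.credits.get(curator, 0) + reward
        return digest

ledger = CurationLedger()
ledger.record("curator_a", "Curated summary of a tumor heterogeneity discussion")
ledger.record("curator_a", "Annotated preprint with references", reward=2)
assert ledger.credits["curator_a"] == 3
```

On a real decentralized network the log and balances would live on-chain rather than in a Python object, but the design point is the same: discourse contributions become countable, creditable units.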

See

https://pharmaceuticalintelligence.com/portfolio-of-ip-assets/

 

UPDATED 5/09/2022

Below is an online @TwitterSpace discussion we had with some young scientists who are just starting out, who gave their thoughts on what Science 3.0 and the future of the dissemination of science might look like in light of this new Metaverse.  However, we have to define each of these terms in light of science, and not just treat the Internet as merely a decentralized marketplace for commonly held goods.

This online discussion was tweeted out and got a fair amount of impressions (60) as well as interactors (50).

For the recording on Twitter as well as in audio format, please see below.

<blockquote class="twitter-tweet"><p lang="en" dir="ltr">Set a reminder for my upcoming Space! <a href="https://t.co/7mOpScZfGN">https://t.co/7mOpScZfGN</a> <a href="https://twitter.com/Pharma_BI?ref_src=twsrc%5Etfw">@Pharma_BI</a> <a href="https://twitter.com/PSMTempleU?ref_src=twsrc%5Etfw">@PSMTempleU</a> <a href="https://twitter.com/hashtag/science3_0?src=hash&amp;ref_src=twsrc%5Etfw">#science3_0</a> <a href="https://twitter.com/science2_0?ref_src=twsrc%5Etfw">@science2_0</a></p>&mdash; Stephen J Williams (@StephenJWillia2) <a href="https://twitter.com/StephenJWillia2/status/1519776668176502792?ref_src=twsrc%5Etfw">April 28, 2022</a></blockquote> <script async src="https://platform.twitter.com/widgets.js" charset="utf-8"></script>

 

 

To introduce this discussion, first some introductory material to frame this discourse.

The Internet and the Web are rapidly adopting a new “Web 3.0” format, with decentralized networks, enhanced virtual experiences, and greater interconnection between people. Here we start the discussion of what the move will look like from Science 2.0, where dissemination of scientific findings was revolutionized by piggybacking on Web 2.0 and social media, to a Science 3.0 format. What will it involve, and what paradigms will be turned upside down?

Old Science 1.0 is still the backbone of all scientific discourse, built on the massive amount of experimental and review literature. However, this literature was in analog format, and we have moved to a more accessible, digital, open-access format for both publications and raw data. Just as there was a structure for 1.0, like the Dewey Decimal system and indexing, 2.0 made science more accessible and easier to search thanks to the newer digital formats. Yet both needed an organizing structure: for 1.0 it was the scientific method of data and literature organization, with libraries as the indexers. In 2.0 this relied on an army of mostly volunteers who did not have much in the way of incentivization to co-curate and organize the findings and the massive literature.

Each version of science has its caveats: its benefits as well as its deficiencies. This curation and the ongoing discussion are meant to solidify the basis for the new format, along with definitions and determination of structure.

We had high hopes for Science 2.0, in particular the smashing of data and knowledge silos. However, the digital age along with 2.0 platforms seemed to somehow exacerbate this. We are still critically short on analysis!

 

We really need people and organizations to get on top of this new Web 3.0, or Metaverse, so that similar issues do not get in the way: namely, we need to create an organizing structure (maybe as knowledgebases), we need INCENTIVIZED co-curators, and we need ANALYSIS… lots of it!!
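As a thought experiment, the organizing structure called for above could start as something as simple as a tagged knowledgebase that co-curators add entries to and search. The sketch below is a minimal, hypothetical illustration; the class and field names are my own, not any existing standard or platform API.

```python
# Hypothetical sketch of a tiny co-curated knowledgebase: curated findings
# are indexed by topic tag so they can be searched and built upon.

from collections import defaultdict

class KnowledgeBase:
    def __init__(self):
        self.entries = []                 # all curated entries
        self.by_tag = defaultdict(list)   # topic tag -> entries

    def add(self, title, curator, tags):
        """A co-curator contributes an entry indexed under each of its tags."""
        entry = {"title": title, "curator": curator, "tags": tags}
        self.entries.append(entry)
        for tag in tags:
            self.by_tag[tag].append(entry)
        return entry

    def search(self, tag):
        """Return the titles of entries curated under a given topic tag."""
        return [e["title"] for e in self.by_tag.get(tag, [])]

kb = KnowledgeBase()
kb.add("Intratumor heterogeneity discussion", "curator_1", ["oncology", "genomics"])
kb.add("Science 3.0 paradigm corrections", "curator_2", ["publishing"])
assert kb.search("oncology") == ["Intratumor heterogeneity discussion"]
```

The incentivization piece is deliberately left out here; the sketch only shows the indexing structure that would let curated analysis accumulate instead of scattering across silos.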

Are these new technologies the cure or is it just another headache?

 

There were a few overarching themes, whether one was talking about AI, NLP, virtual reality, or other new technologies with respect to this new Metaverse, and a consensus of Decentralized, Incentivized, and Integrated was commonly expressed among the attendees.

The following are some slides from representative presentations.

Other articles of note on this topic in this Open Access Scientific Journal include:

Electronic Scientific AGORA: Comment Exchanges by Global Scientists on Articles published in the Open Access Journal @pharmaceuticalintelligence.com – Four Case Studies

eScientific Publishing a Case in Point: Evolution of Platform Architecture Methodologies and of Intellectual Property Development (Content Creation by Curation) Business Model 

e-Scientific Publishing: The Competitive Advantage of a Powerhouse for Curation of Scientific Findings and Methodology Development for e-Scientific Publishing – LPBI Group, A Case in Point

@PharmaceuticalIntelligence.com –  A Case Study on the LEADER in Curation of Scientific Findings

Real Time Coverage @BIOConvention #BIO2019: Falling in Love with Science: Championing Science for Everyone, Everywhere

Old Industrial Revolution Paradigm of Education Needs to End: How Scientific Curation Can Transform Education

 

Read Full Post »

Comment by Cardiologists posted on LinkedIn’s

European Cardiovascular Medical Devices Group, a subgroup of Cardiovascular Medical Devices Group

on Stenting for Proximal LAD Lesions: In Reference to the Invasive Procedure performed on former President George W. Bush

UPDATED on 8/7/2018

Long-Term Outcomes of Stenting the Proximal LAD

Study Questions:

What are the outcomes of patients undergoing drug-eluting stent (DES) implantation according to lesion location within or outside the proximal left anterior descending (LAD) artery?

Methods:

Among the 8,709 patients enrolled in PROTECT (Patient Related Outcomes With Endeavor Versus Cypher Stenting Trial), a multicenter percutaneous coronary intervention (PCI) trial, the investigators compared the outcomes of 2,534 patients (29.1%; 3,871 lesions [31.5%]) with stents implanted in the proximal LAD with 6,172 patients (70.9%; 8,419 lesions [68.5%]) with stents implanted outside the proximal LAD. For each event, a multivariate model was constructed that examined the effect of several individual baseline clinical and angiographic characteristics, including proximal LAD target lesion, on outcomes (i.e., MACE [major adverse cardiac events], target vessel failure [TVF], and myocardial infarction [MI]).

Results:

At 4-year follow-up, death rates were the same (5.8% vs. 5.8%; p > 0.999), but more MIs occurred in the proximal LAD group (6.2% vs. 4.9%; p = 0.015). The rates of clinically driven TVF (14.8% vs. 13.5%; p = 0.109), MACE (15.0% vs. 13.7%; hazard ratio, 1.1; 95% CI, 0.97-1.31; p = 0.139), and stent thrombosis (2.1% vs. 2.0%; p = 0.800) were similar. DES type had no interaction with MACE or TVF. In multivariate analysis, the proximal LAD was a predictor for MI (p = 0.038), but not for TVF (p = 0.149) or MACE (p = 0.069).

Conclusions:

The authors concluded that proximal LAD location was associated with higher rates of MI during the long-term follow-up, but there were no differences in stent thrombosis, death, TVF, or overall MACE.

Perspective:

This post hoc analysis of a prospective, multicenter study reports no difference in the rates of death, MACE, or TVF at 4 years according to intervention at a proximal LAD or nonproximal LAD lesion. The occurrence of the predefined primary endpoint of stent thrombosis was also not dependent on whether a proximal LAD or nonproximal LAD site was treated. However, of note, stenting of proximal LAD lesions was associated with significantly higher rates of MI compared with stenting of nonproximal LAD lesions. Overall, these findings appear to suggest that proximal LAD lesions may not have additional risk in the contemporary DES era, but the higher risk of MI needs to be studied further. Future studies should compare longer-term clinical outcomes between proximal LAD PCI with DES and minimally invasive left internal mammary artery to LAD.

SOURCE

https://www.acc.org/latest-in-cardiology/journal-scans/2017/03/22/15/11/long-term-outcomes-of-stenting-the-proximal-lad

 

Stenting for Proximal LAD Lesions

Curator: Aviva Lev-Ari, PhD, RN

Michael Reinhardt • First, the media really should not be calling this “stent surgery” its a stent procedure just ask any post-CABG patient… Anyway it really is not possible to determine whether or not is was “unnecessary” without all the relevant patient data; which coronary vessel(s) involved, percent stenosis, etc. Actually I find it interesting that they apparently decided to stent the former president on the basis of a CT Angiogram which is not the standard of care for coronary imaging. I have to assume they performed an additional testing like a CT perfusion analysis and saw a clinically relevant defect and this support the decision to stent. Regarding the post-stent drugs cloplidigrel is not a benign drug but benefits far outweigh the downside of a sub-acute thrombosis which might result in a more serious future event = acute MI.

Rafael Beyar • This was absolutely an indicated procedure and almost all rational physician will treat a young patient with proximal LAD lesions with either a stent or bypass surgery

Dov V Shimon MD • No doubt! Proximal (‘close to origin’) LAD lesions are the leading “Widow makers”. Reestablishing of flow in the artery is saving from cardiac damage and death. Drug eluting stent have 2nd and 3rd generations with very low and acceptable reclosure rates and almost no abrupt closure (thrombosis). True, CTA is a screening test, but it astablishes the need for diagnostic and therapeutic angiogram. We, heart surgeons can provide long-term patency to the LAD using LIMA arterial bypass. The current advantage of stent is the incovenience and pain of surgery. Any responsible physician would opt the procedure even for himself, his relatives , his patients and for definitely for GW Bush.

http://www.linkedin.com/groupItem?view=&gid=3358310&type=member&item=265974376&commentID=157366758&goback=%2Egmr_3358310&report%2Esuccess=8ULbKyXO6NDvmoK7o030UNOYGZKrvdhBhypZ_w8EpQrrQI-BBjkmxwkEOwBjLE28YyDIxcyEO7_TA_giuRN#commentID_157366758

Coronary anatomy and anomalies

[Figure: overview of the coronary arteries in the anterior projection (RCA, LAD and Cx)]

[Figure: overview of the coronary arteries in the lateral projection]

  • Left Main or left coronary artery (LCA)
    • Left anterior descending (LAD)
      • diagonal branches (D1, D2)
      • septal branches
    • Circumflex (Cx)
      • Marginal branches (M1, M2)
  • Right coronary artery
    • Acute marginal branch (AM)
    • AV node branch
    • Posterior descending artery (PDA)

Eur J Cardiothorac Surg. 2004 Apr;25(4):567-71.

Isolated high-grade lesion of the proximal LAD: a stent or off-pump LIMA?

Source

Thoraxcentre, Groningen University Hospital, Groningen, The Netherlands.

Abstract

OBJECTIVES:

The objective of this study was to compare the long-term outcome of patients with an isolated high-grade stenosis of the left anterior descending (LAD) coronary artery randomized to percutaneous transluminal coronary angioplasty with stenting (PCI, stenting) or to off-pump coronary artery bypass grafting (surgery).

METHODS:

Patients with an isolated high-grade stenosis (American College of Cardiology/American Heart Association classification type B2/C) of the proximal LAD were randomly assigned to stenting (n=51) or to surgery (n=51) and were followed for 3-5 years (mean 4 years). Primary composite endpoint was freedom from major adverse cardiac and cerebrovascular events (MACCEs), including cardiac death, myocardial infarction, stroke and repeat target vessel revascularization. Secondary endpoints were angina pectoris status and need for anti-anginal medication at follow-up. Analysis was by intention to treat.

RESULTS:

MACCEs occurred in 27.5% after stenting and 9.8% after surgery (P=0.02; absolute risk reduction 17.7%). Freedom from angina pectoris was 67% after stenting and 85% after surgery (P=0.036). Need for anti-anginal medication was significantly lower after surgery compared to stenting (P=0.002).

CONCLUSION:

Patients with an isolated high-grade lesion of the proximal LAD have a significantly better 4-year clinical outcome after off-pump coronary bypass grafting than after PCI.
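The trial’s reported absolute risk reduction can be translated into a number needed to treat. A minimal Python sketch, using only the MACCE rates from the abstract above; the NNT figure is our own back-of-the-envelope derivation, not a statistic reported in the paper:

```python
# Reproduce the absolute risk reduction (ARR) in MACCE reported by the trial
# and derive the implied number needed to treat (NNT = 1 / ARR).

def arr_and_nnt(event_rate_control: float, event_rate_treatment: float):
    """Return (absolute risk reduction, number needed to treat)."""
    arr = event_rate_control - event_rate_treatment
    return arr, 1.0 / arr

# MACCE at ~4 years: 27.5% after stenting vs. 9.8% after off-pump surgery
arr, nnt = arr_and_nnt(0.275, 0.098)
print(f"ARR = {arr:.1%}, NNT = {nnt:.1f}")  # prints: ARR = 17.7%, NNT = 5.6
```

By this reading, roughly one major adverse cardiac or cerebrovascular event at 4 years would be avoided for every six patients treated surgically rather than with a stent, under the rates this single trial reports.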

Daily Dose

08/12/2013 | 5:48 PM

Was George Bush’s stent surgery really unnecessary?

By Deborah Kotz / Globe Staff


Ever since President George W. Bush had stent surgery last Tuesday to open a blocked artery, leading physicians who weren’t involved in his care have wondered publicly why he had this “unnecessary” procedure. Large clinical trials have demonstrated that stent placement doesn’t extend lives or prevent a future heart attack or stroke in those with stable heart disease.

What’s more, Bush could wind up with complications like a reblockage where the stent was placed, or excessive bruising or internal bleeding from the blood thinners that he will likely have to take for the next year.

Dr Richard Besser, the chief medical correspondent for ABC News, questioned why Bush had an exercise stress test as part of his routine physical exam given that he had no symptoms like chest pain or shortness of breath. The stress test indicated signs of an artery blockage.

“In people who are not having symptoms, the American Heart Association says you should not do a stress test,” Besser said, “since the value of opening that artery is to relieve the symptoms.”

Cleveland Clinic cardiologist Dr. Steve Nissen agreed in his interview with USA Today. Bush, he said, likely “got the classical thing that happens to VIP patients, when they get so-called executive physicals and they get a lot of tests that aren’t indicated. This is American medicine at its worst.”

Two physicians wrote in a Washington Post op-ed column titled “President Bush’s unnecessary surgery” that they worry that the media coverage of Bush’s stent will lead “patients to pressure their own doctors for unwarranted and excessive care.”

But none of these doctors actually treated Bush or examined his medical records, so I’m a little surprised they’re making such firm calls.

Bush, an avid biker who recently completed a 100-kilometer ride, probably shouldn’t have had the exercise stress test if he wasn’t having any heart symptoms. “Routine stress testing used to be done 20 years ago, but isn’t recommended any longer since it doesn’t have any benefit,” said Brigham and Women’s cardiologist Dr. Christopher Cannon.

But Bush’s spokesman insisted the stent was necessary after followup heart imaging via a CT angiogram “confirmed a blockage that required opening.”

Cannon said Bush’s doctors may have seen signs that blood flow wasn’t getting to a significant part of the heart muscle, a condition known as ischemia. Researchers have found that those with moderate to severe ischemia appear to experience a reduction in fatal heart attacks when they have a stent placement along with medical therapy, rather than just taking medications alone. (Larger studies, though, are needed to confirm this finding.)

“If a blockage occurs at the very start of the artery and it’s extensive—95 percent blocked—then chances are it will cause significant ischemia,” Cannon said. While severe ischemia usually causes light-headedness or dizziness during exercise, Bush may have had more moderate ischemia that didn’t cause such symptoms.

It’s impossible to know for certain, he added, without seeing his medical records firsthand.

http://www.boston.com/lifestyle/health/blogs/daily-dose/2013/08/12/was-george-bush-stent-surgery-really-unnecessary/DzklhNCGVlgriNxgpKZtuO/blog.html

President Bush’s unnecessary heart surgery

  • By Vinay Prasad and Adam Cifu, Published: August 9

Vinay Prasad is chief fellow of medical oncology at the National Cancer Institute and the National Institutes of Health. Adam Cifu is a professor of medicine at the University of Chicago.

Former president George W. Bush, widely regarded as a model of physical fitness, received a coronary artery stent on Tuesday. Few facts are known about the case, but what is known suggests the procedure was unnecessary.

Before he underwent his annual physical, Mr. Bush reportedly had no symptoms. Quite the opposite: His exercise tolerance was astonishing for his age, 67. He rode more than 30 miles in the heat on a bike ride for veterans injured in the wars in Iraq and Afghanistan.

If Mr. Bush had visited a general internist practicing sound, evidence-based care, he would not have had cardiac testing. Instead, the doctor would have conducted age-appropriate cancer screening. For the former president, this would include only colon cancer screening; it no longer would include even prostate-specific antigen testing for cancer. The doctor would have screened for cholesterol, checked for hypertension and made sure the patient was up to date on age-appropriate vaccinations, including those for pneumococcal pneumonia and shingles. Presumably Mr. Bush got these things, and he got the cardiac test as well.

What value does a stress test add for an otherwise healthy 67-year-old? No study has shown that this examination improves outcomes. The trials that have been done for so-called routine stress testing examined higher-risk patients. They found that performing stress tests on people at high risk of cardiovascular disease may detect blockages but does not improve symptoms or survival. Routine stress testing does, however, increase the use of procedures such as coronary stenting.

Unfortunately, Mr. Bush, like many VIPs, may be paying the price of these in-depth investigations. His stress test revealed an abnormality, prompting another test: a CT angiogram. This study showed a blockage, which was stented open during an invasive procedure. It is worth noting that at least two large randomized trials show that stenting these sorts of lesions does not improve survival. Because Mr. Bush had no symptoms, it is impossible that he felt better after these procedures.

Instead, George W. Bush will have to take two blood thinners, aspirin and Plavix, for at least a month and probably a year. (The amount of time a blood thinner is needed depends on the type of stent placed). While he takes these medications, he will have a higher risk of bleeding complications with no real benefit.

Although this may seem like an issue important only to the former president, consider the following: Although the price of excessive screening of so-called VIPs is usually paid for privately, follow-up tests, only “necessary” because of the initial unnecessary screening test, are usually paid for by Medicare, further stressing our health-care system. The media coverage of interventions like Mr. Bush’s also leads patients to pressure their own doctors for unwarranted and excessive care.

http://www.washingtonpost.com/opinions/president-bushs-unnecessary-heart-surgery/2013/08/09/c91c439c-0041-11e3-9a3e-916de805f65d_story.html

Read Full Post »

Our FIRST e-Book on Amazon

(Biomed e-Books) [Kindle Edition]


http://www.amazon.com/dp/B00DINFFYC

Perspectives on Nitric Oxide in Disease Mechanisms


  • Please write a REVIEW of One Article on AMAZON

>200,000 VIEWERS

>1,030 BioMed Articles

>1,230 Clicks from the NIH

http://pharmaceuticalintelligence.com

http://www.linkedin.com/groups?gid=4346921&trk=hb_side_g

http://twitter.com/pharma_BI

http://www.facebook.com/LeadersInPharmaceuticalBusinessIntelligence

  • Help us Promote our e-Books and we will do same for yours

Aviva Lev-Ari, PhD, RN

BioMed e-Books Series – Editor-in-Chief

1-617-244-4024

avivalev-ari@alum.berkeley.edu

http://pharmaceuticalintelligence.com/biomed-e-books/

SKYPE: HarpPlayer83

http://www.linkedin.com/in/avivalevari


Leaders in Pharmaceutical Business Intelligence

Founder & Director of Pharmaceutical Business Intelligence Services

 

 

Read Full Post »

Reporter: Aviva Lev-Ari, PhD, RN

 

Mark Levin’s business is biotechnology, so it’s no surprise he knew zilch about a tech company called LinkedIn as recently as two years ago. But these days Levin sounds like he can barely do his job without it.

“I’m not the most social media savvy person. I haven’t used a lot of these tools at all,” Levin says, referring to blogs and Twitter. “But I’ll never forget, the first message I got from LinkedIn was an e-mail from what looked like someone called link-a-din. I remember asking myself about Mr. Link-a-din. I was trying to figure out ‘who the hell is this person?’”

Levin, a founding partner of Boston-based Third Rock Ventures and one of the more prominent biotech venture capitalists in the U.S., was a LinkedIn Luddite two years ago. To some extent, he still looks like one: his profile contains no photo, no professional biography, and only tidbits of information posted about his employment history. But appearances can be deceiving. He says he has amassed more than 5,000 connections, and the number keeps growing daily. He says he spends at least half an hour per day on the site, sifting through more than 100 incoming connection requests a week, and firing off dozens more requests to people he wants to get to know. LinkedIn’s algorithms have gotten to know his tendencies so well, the site is constantly suggesting new people in biotech and pharma companies that he might want to meet. He often does.

Mark Levin of Third Rock Ventures

Levin became so obsessive at one point this year that LinkedIn temporarily shut down his account, until he called the company and assured them he’s a real person using the site for business. Just during a 15-minute phone interview with me on Friday, Levin said he got three new connection requests. One was from an MD that caught his eye immediately.

“About 18 months ago or so, I realized that is an extraordinary way to be in contact with people,” Levin says. “Our biggest challenge is to find great people. We don’t know everybody. And you can find a lot of great people here.”

While many in the tech press mock LinkedIn as an oh-so-boring compiler of mere resumes, it has become the indispensable online hub for networking in life sciences—an industry where relationships make the world go round. LinkedIn has a relatively puny user base of 187 million members around the world, compared to Facebook’s 1 billion, and there’s no question people spend way more time engaging with Mark Zuckerberg’s social network. But it’s also true there’s no question which site matters more to the life sciences. LinkedIn is the singular site for finding people in biotech, whether they are biologists, chemists, toxicologists, admin assistants, business development people, finance pros, or CEOs. There were more than 513,000 people in the LinkedIn database who self-identify as members of the “biotechnology” or “pharmaceutical” industry when I searched on those keywords Friday afternoon.

For journalists like me, this is an everyday reporting tool with almost as much value as Twitter, and possibly more. Even though I only use the basic free version of the site, it’s become an awesome clearinghouse of sources that I call on for help with scoops and analysis. I can slice and dice my network of 2,900 contacts by industry, title, location and more. It’s become a treasure trove of personal e-mails for sources, which I never have to manually update when people leave for new jobs, as they often do. It’s even turned into a place where people read a lot of my stories and the resource where I sometimes find new stories to pursue. In fact, I got the idea for this story by noticing that Levin and I have more than 500 connections in common.

for different reasons, but he raves all the same. Nothing great in biotech can happen without a magical mix of an idea, technology, people, and money.

“Our No. 1 goal in life is to know the best people in the industry who are going to make a difference in our companies,” Levin says. “I don’t remember when it exactly became clear, but it was clear to me that a lot of people were using it to stay in touch. We’ve realized it’s an extraordinary recruiting tool. The more I’ve spent time there, the more aggressive I have gotten.”

Levin isn’t kidding about the emphasis on recruiting at Third Rock, which has a “recruiting partner” in Craig Greaves, a former recruiter at Biogen Idec (NASDAQ: BIIB) and Cubist Pharmaceuticals (NASDAQ: CBST). Levin says all this connecting and re-connecting sometimes leads somewhere fruitful, sometimes not, just like with all other recruiting techniques.

But Levin and his partners at Third Rock aren’t just fiddling around making random contacts; they are being systematic about the connections they form. Once he forms a connection on LinkedIn, he says, he sends the new contact a short follow-up note to see what’s new in their lives or careers. He then e-mails his fellow partners to see if any of them know the person. Third Rock uses a premium version of LinkedIn, which has an application that automatically downloads all of Levin’s new contacts into a central database so that all members of the firm can see the person’s profile, Greaves says.

Sometimes an in-person meeting gets scheduled to follow up right away to see if there might be some kind of potential for a match at a Third Rock company. Often Third Rock uses the site for targeted searches, like, say, for an antibody engineer, Greaves says. Other times, it’s just to get acquainted with people who aren’t looking for work now, but might be able to join a startup, consult, or form a valuable partnership with a Third Rock company sometime later, he says.

“We are laying the groundwork and building a network for the long term,” Greaves says.

No doubt, LinkedIn has its potential for misuse just like any other technology, and users need to think about how to use it properly. Back when the site was formed in 2003, people were urged to connect only with people they knew well, because otherwise people might think you were tainted if a shady operator ended up appearing in your network. I think that stigma has largely gone away, because a connection is perceived now as really just like trading business cards, and not an endorsement or recommendation. People have also long worried about whether bosses might be able to use it to spy on their workers and suspect that they were getting restless, looking for a new job. I used to leave my entire connections list accessible on the web for anyone who connected with me, until I started connecting with people I don’t really know, and realized some may have ulterior motives that might interfere with my ability to break news.

There are plenty of areas on the site that leave something to be desired. LinkedIn Groups have always struck me as spammy, so I’ve unsubscribed from most of them. The site can be annoying with its constant urges to “update your profile” or “add skills to your profile” or now to “endorse” various people in your network. The whole site appears to be trying really hard to keep people glued to it like Facebook, by constantly prompting users to update their status and check other people’s employment status, which can be annoying and a waste of time.

But the most irritating thing about LinkedIn, to me anyway, is that even though it has achieved critical mass, many C-suite executives and venture capitalists still resist signing up. For example, when I searched on the 40 names of “young and proven” biotech venture capitalists listed in this column two weeks ago, only 24 of the 40 (60 percent) showed up in the LinkedIn database.

I find it baffling that so many senior people in the industry still resist taking advantage of this resource, and have to wonder if they have some better idea on how to network. There’s no getting around the importance of networking. Biotech is a geographically far-flung industry, with hundreds of companies and vendors, who all need to work together in trusting relationships to keep the whole enterprise afloat.

Industry conferences have always been, and still remain, the gold standard way of networking. But those events take time and money, and nobody can do it every day of the week. LinkedIn is becoming the indispensable resource that glues an entire industry together, and helps people make connections between people and ideas and opportunities that would otherwise never be made. While biotech could certainly use a few more groundbreaking advances to make the drug development process more efficient, one of the fastest-growing new tools for the industry is a free resource just a click away on the Web.

Luke Timmerman is the National Biotech Editor of Xconomy. E-mail him at ltimmerman@xconomy.com

SOURCE:

 

Read Full Post »
