Gian M. Volpicelli is a senior writer at WIRED, where he covers cryptocurrency, decentralization, politics, and technology regulation. He received a master’s degree in journalism from City University of London after studying politics and international relations in Rome. He lives in London.
The duo behind Twitter Crypto say NFT profile pics and crypto tipping are just the beginning.
YOU MIGHT HAVE heard of crypto Twitter, the corner of the social network where accounts have Bored Apes as profile pictures, posts are rife with talk of tokens, blockchains, and buying the Bitcoin dip, and Elon Musk is venerated.
You might be less familiar with Twitter Crypto, the business unit devoted to developing the social network’s strategy for cryptocurrency, blockchains, and that grab-bag of decentralized technologies falling under the rubric of Web3. The team’s unveiling came in November 2021 via a tweet from the newly hired project lead, Tess Rinearson, a Berlin-based American computer scientist whose career includes stints at blockchain companies such as Tendermint and Interchain.
Rinearson joined Twitter at a crucial moment. Jack Dorsey, the company’s vociferously pro-Bitcoin CEO, would leave a few weeks later, to be replaced by CTO Parag Agrawal. Agrawal had played an instrumental role in Bluesky, a Twitter-backed project to create a protocol—possibly with blockchain components—for building decentralized social networks.
As crypto went mainstream globally and crypto Twitter burgeoned, the company tried to dominate the space. Under the stewardship of product manager Esther Crawford, in September 2021 Twitter introduced a “tipping” feature that lets creators on Twitter receive Bitcoin contributions through Lightning—a network for fast Bitcoin payments. In January, Twitter allowed subscribers to Twitter’s premium service, Twitter Blue, to flaunt their NFTs as hexagonal profile pictures, through a partnership with the NFT marketplace OpenSea.
Twitter Crypto is just getting started. While Rinearson works with people all across the company, her team is still under 10 people, although more hires are in the pipeline, judging from recent job postings. So it’s worth asking what is next. I caught up over a video call with Rinearson and Crawford to talk about where Twitter Crypto is headed.
The conversation has been edited for clarity and brevity.
WIRED: Let’s start with the basics. Why does Twitter have a crypto unit?
Tess Rinearson: We really see crypto—and what we’re now calling Web3—as something that could be this incredibly powerful tool that would unlock a lot for our users. The whole crypto world is like an internet of money, an internet of value that our users can potentially tap into to create new ways of owning their content, monetizing their content, owning their own identity, and even relating to each other.
One of my goals is to build Twitter’s crypto unit in such a way that it caters to communities that go beyond just that core crypto community. I love the crypto Twitter space, obviously—I’m a very proud member of the crypto community. And at the same time, I recognize that people who are really deep in the crypto space may not relate to concepts like, for instance, blockchain’s immutability in the same way that someone who’s less intensely involved might.
So a lot of what we try to think about is, what can we learn from this group of people who are super engaged and really, really, creative? And then, how can we translate some of that stuff into a format or a mechanism or a product that’s a little bit more accessible to people who don’t have that background?
How are you learning from crypto Twitter? Do you just follow a lot of accounts, do you actually talk to them? How does that learning experience play out?
Esther Crawford: It’s a combination. We have an amazing research team that sets up panel interviews and surveys. But we’re also embedded in the community itself and follow a bunch of accounts, sit in on Twitter Spaces, go to conferences and events, and engage with customers in that way. That’s how the research piece of it works. But we also encounter it as end users: Twitter is the discovery platform today for all things crypto.
One of the things we do differently at Twitter is we build out in the open. And so this means having a dialog with customers in real time—designers will take something that is very early-stage and post it as a tweet and then get real-time feedback. They’ll hop into Spaces with product managers and engineering managers, talk about it live with real customers, and then incorporate that feedback into the designs and what we ultimately end up launching.
Rinearson: One of the things I wanted to make sure of before I came to Twitter was to know that we would be able to build features in the open and solicit feedback and show rough drafts. And so this is something I asked Parag Agrawal, who’s now the CEO, and was the person who hired me. Pretty early in the job interview process, I said this was going to be really important, and he said, “If you think it’s important to the success of this work, great, do it—thumbs up.” He also shares that openness.
As you said, Tess, you come from crypto. When you were out there, what did you think Twitter was getting right? What did you think Twitter was getting wrong?
Rinearson: I had been a Twitter power user for a really long time. The thing that I saw was a lot of aesthetic alignment between how Twitter exists in the world and the way that crypto exists in the world. Twitter has decentralized user experiences in its DNA. And, this is a bit cheesy, but people use Twitter sometimes in ways that they use a public blockchain, as a public database where everything’s time stamped and people can agree on what happened.
And for most people it’s open, it is there for public conversation. And then obviously it was also the place—a place—where the crypto community really found its footing. I think it’s been a place where an enormous amount of discovery happens, and education and learning for the whole community. I joined when there were some murmurings about Twitter starting to do crypto stuff, mostly stuff Esther had led actually, and I was excited to see where it was going. And then Twitter’s investment in Bluesky also gave me a lot of confidence.
Let’s talk about the two main things you have delivered so far: The crypto tipping feature and NFT pictures. Can you give me just a potted history of how each came about and why?
Crawford: Those are our first set of early explorations, and the reason we started there was we really wanted to make sure that what we built benefited creators, their audiences, and then all the conversations that are happening on Twitter. For creators in particular, we know that they rely on platforms like Twitter to monetize and earn a living, and not all people are able to use traditional currencies. Not everybody has a traditional bank account set up.
And so we wanted to provide an opportunity for a borderless payment solution, and that’s why we decided to go ahead and use Bitcoin Lightning as our first big integration. One of the reasons we chose Bitcoin Lightning was also because of the low transaction fees. And we have Bitcoin and Ethereum addresses that you can also put in there [on your Twitter “tipping jar”]. We noticed that people were actually adding information about their crypto wallet addresses in their profiles. And so we wanted to make a more seamless experience, so that people could just tip through the platform, so that it felt native.
With NFT profile pictures, the way that came about was, again, looking at user behavior. People were adding NFTs that they owned as avatars, but you didn’t really know whether they owned those NFTs or not. So we decided to go ahead and build out that feature so that one could actually prove ownership.
That’s similar to how other things developed on Twitter, right? The hashtag, or even the retweet, were initially just things users invented—by adding the # sign, or by pasting other users’ tweets—and then Twitter made them features.
Crawford: Yeah, exactly. Many of the best ideas come from watching user behavior on the platform, and then we just productize that.
Rinearson: Sometimes I’ve heard people call that the “help wanted signs,” and like, keeping an eye out for the “help wanted signs” across the platform. The NFT profile picture was a clear example of that.
How do all these things—these two things and possibly other crypto features coming further down the line—really help Twitter’s bottom line?
Crawford: With creator monetization, our goal was to help creators get paid, not Twitter, so Twitter takes a really small cut of earnings. For more successful creators, we take a larger percentage. The way we think about this is, it is part of our revenue diversification.
Twitter today is a wholly ad-based business. In the future we imagine Twitter making money from a variety of different product areas. So Twitter Blue is one of those products—you can pay $2.99 a month and you get additional features, such as the NFT profile pictures. We really think that revenue diversification sits across a variety of areas, and creator monetization is one really small component of that.
As you said, these are just early experiments. Where is Twitter Crypto going next? What’s your vision for crypto technology’s role within Twitter?
Rinearson: The real trick here is to find the right parts of Twitter to decentralize, and to not try to decentralize everything at once—or, you know, make every user suddenly responsible for taking care of some private keys or something like that.
We have to find the right ways to open up some access to a decentralized economic layer, or give people ways that they can take their identity with them, without relying on a single centralized service.
We’re really early in these explorations, and even looking at things like Bitcoin tipping or the NFT profile pictures—we view those features as experiments themselves in a lot of ways, and learning experiences. We’re learning things about how our users relate to these concepts, what they understand about them, what they find confusing, and what’s most useful to them. We really want to use this technology to bring utility to people and, you know, not just sprinkle a little blockchain on it for the sake of it. So creator monetization is an area that I’m really excited about because I think there’s a really clear path forward. But again, we’re looking beyond that: We’re also looking at using crypto technology in areas like [digital] identity and [digital] ownership, and figuring out how we can better serve crypto communities on the platform.
Are you going to put Twitter verified users’ blue ticks on a blockchain, then?
[Laughter]
No?
[More laughter]
OK, moving on. How does the kind of work you do dovetail with Bluesky’s plan to create a protocol for a decentralized social media platform? Is there any synergy there?
Rinearson: I have known Jay [Graber], the Bluesky lead, for a long time, and she and I are in pretty close contact. We check in with each other regularly and talk a lot about problems we might have in common that we’ll both need to solve. There’s an overlap looking at things in the identity area, but at the end of the day, it’s a separate project. She’s pretty focused on hiring her team, and they’re very focused on building a prototype of a protocol. That is different from what Esther and I are thinking about, which is like: There are all these blockchain protocols that exist, and we need to figure out how to make them useful and accessible for real people.
And when I say “real people,” I mean that in a sort of tongue-in-cheek contrast to hardcore crypto nerds like me. Jay is thinking much more about building for people who are creating decentralized networks. That is a very different focus area. Beyond that, I would just say it’s too early to say what Bluesky will mean for Twitter as a product. We are in touch, we have aligned values. But at the end of the day—separate teams.
Why is a centralized Silicon Valley company like Twitter the right place to start bringing more decentralization to internet users? Don’t we just have to start from scratch and build a new platform that is already decentralized?
Rinearson: I started in crypto in 2015, and I have a very vivid memory from those years of watching some of my coworkers—crypto engineers—trying to figure out how to secure some of their Bitcoin before one of the Bitcoin forks [in which the Bitcoin blockchain split, creating new currencies], and they were panicking and freaking out. I thought there was no way that a normal person would be able to handle this in a way that would be safe. And so I was a little bit disillusioned with crypto, especially from a consumer perspective.
And then last year, I started seeing more interest from people whom I’d known for a long time and who weren’t crypto people. They were just starting to perk their heads up and take notice, and start creating NFTs or talking about DAOs. And I thought that was interesting, that we were coming around a corner, and it might be time to start thinking about what this could mean for people beyond that hardcore crypto group.
And that was when Twitter reached out. You know, I don’t think that just any centralized platform would be able to bring crypto to the masses, so to speak. But I think Twitter has the right stuff. I think you have to meet people where they are with new technologies: find ways to onboard them, bring them along, show them what this might mean for them, and make things accessible. And it’s really, really hard to do that with just a protocol. You need to have some kind of community, you need to have some kind of user base, you need to have some kind of platform. And Twitter’s just right there.
I don’t think I would say that a centralized platform is definitely the way to “bring crypto to the masses.” I do think that Twitter is the way to do it.
But why do the masses need crypto right now?
Rinearson: I don’t know that anyone needs crypto, and our goal is not to get everyone into crypto. Let’s be clear about that. But I do think that crypto is a potentially very powerful tool for people. And so I think what we are trying to do is show people how powerful it is and unlock those possibilities. It’s also possible that we create some products and features, where people actually don’t even really know what’s happening under the hood.
Like maybe we’re using crypto as a payment rail or again as an identity layer—users don’t necessarily need to know all of those implementation details. And that’s actually something we come back to a lot: What level of abstraction are we talking about with users? What story are we telling them about what’s happening under the hood? But yeah, I would just like to reiterate that the goal is not to just shovel everyone into crypto. We want to provide value for people.
Do you think there is a case for Twitter to launch its own cryptocurrency—a Twittercoin?
Rinearson: I think there’s a case for a lot of things—honestly, there’s a case for a lot of things. We’re trying to think really, really broadly about it.
Crawford: We’re actively exploring a lot of things. It’s not something we would be making an announcement about.
Rinearson: I think it is really important to stress that when you say “Twittercoin” you probably have a slightly different idea of what it is than we do. And are we exploring those ideas? Yes, we want to think about all of them. Do we have road maps for them? No. But are we trying to think about things really creatively and be really, really open-minded? Yes. We have this new economic technology that we think could unlock a lot of things for people. And we want to go down a bunch of rabbit holes and see what we come up with.
Each of these posts was about the importance of scientific curation of findings within the realm of social media and Web 2.0, a sub-environment known throughout the scientific communities as Science 2.0, in which expert networks collaborate to produce a massive new corpus of knowledge by sharing their views and insights on peer-reviewed scientific findings. Through this new media, the process of curation would itself generate new ideas and new directions for research and discovery.
This system sat atop the platform of the original Science 1.0, made up of all the scientific journals, books, and traditional literature.
In the old Science 1.0 era, scientific dissemination took the form of hard-print journals, and library subscriptions were mandatory (and eventually expensive). Open Access has tried to ameliorate the expense problem.
To index the massive and voluminous research literature beyond the old Dewey Decimal system, a process of curation was mandatory. The new social media were a natural outlet for this dissemination; however, the cost had to be spread out among numerous players. Journals, faced with the high costs of subscriptions, found that their only way into this new media as an outlet was to become Open Access, a movement first sparked by journals like PLOS and PeerJ but then grudgingly adopted throughout the landscape. But with any movement or new adoption one gets the Good, the Bad, and the Ugly (as described in the Clive Thompson article cited above). The downsides of Open Access journals were:
costs are still assumed by the individual researcher, not the journals
the rise of numerous predatory journals
Even PeerJ, in a column celebrating a year’s worth of Open Access success stories, lamented the key issues still facing Open Access in practice, which included the cost and the rise of predatory journals.
In essence, Open Access and Science 2.0 sprang up in full force BEFORE anyone thought of a way to defray the costs.
Can Web 3.0 Finally Offer a Way to Address the High Costs of Scientific Publishing?
Web 1.0 and Web 2.0 refer to eras in the history of the Internet as it evolved through various technologies and formats. Web 1.0 refers roughly to the period from 1991 to 2004, where most websites were static webpages, and the vast majority of users were consumers, not producers, of content.[6][7] Web 2.0 is based around the idea of “the web as platform”,[8] and centers on user-created content uploaded to social-networking services, blogs, and wikis, among other services.[9] Web 2.0 is generally considered to have begun around 2004, and continues to the current day.[8][10][4]
The term “Web3”, specifically “Web 3.0”, was coined by Ethereum co-founder Gavin Wood in 2014.[1] In 2020 and 2021, the idea of Web3 gained popularity. Particular interest spiked towards the end of 2021, largely due to interest from cryptocurrency enthusiasts and investments from high-profile technologists and companies.[4][5] Executives from venture capital firm Andreessen Horowitz travelled to Washington, D.C. in October 2021 to lobby for the idea as a potential solution to questions about Internet regulation with which policymakers have been grappling.[11]
Web3 is distinct from Tim Berners-Lee’s 1999 concept for a semantic web, which has also been called “Web 3.0”.[12] Some writers referring to the decentralized concept usually known as “Web3” have used the terminology “Web 3.0”, leading to some confusion between the two concepts.[2][3] Furthermore, some visions of Web3 also incorporate ideas relating to the semantic web.[13][14]
Web3 revolves around the idea of decentralization, which proponents often contrast with Web 2.0, wherein large amounts of the web’s data and content are centralized in the fairly small group of companies often referred to as Big Tech.[4]
Specific visions for Web3 differ, but all are heavily based in blockchain technologies, such as various cryptocurrencies and non-fungible tokens (NFTs).[4] Bloomberg described Web3 as an idea that “would build financial assets, in the form of tokens, into the inner workings of almost anything you do online”.[15] Some visions are based around the concepts of decentralized autonomous organizations (DAOs).[16] Decentralized finance (DeFi) is another key concept; in it, users exchange currency without bank or government involvement.[4] Self-sovereign identity allows users to identify themselves without relying on an authentication system such as OAuth, in which a trusted party has to be reached in order to assess identity.[17]
Technologists and journalists have described Web3 as a possible solution to concerns about the over-centralization of the web in a few “Big Tech” companies.[4][11] Some have expressed the notion that Web3 could improve data security, scalability, and privacy beyond what is currently possible with Web 2.0 platforms.[14] Bloomberg states that sceptics say the idea “is a long way from proving its use beyond niche applications, many of them tools aimed at crypto traders”.[15] The New York Times reported that several investors are betting $27 billion that Web3 “is the future of the internet”.[18][19]
In late 2021, some companies, including Reddit and Discord, explored incorporating Web3 technologies into their platforms.[4][20] Discord’s CEO, Jason Citron, tweeted a screenshot suggesting the company might be exploring Web3 integrations. This led some users to cancel their paid subscriptions over their distaste for NFTs, and others expressed concerns that such a change might increase the amount of scams and spam they had already experienced on crypto-related Discord servers.[20] Two days later, after heavy user backlash, Citron tweeted that the company had no plans to integrate Web3 technologies into its platform, and said the screenshot showed an internal-only concept that had been developed in a company-wide hackathon.[21]
Some legal scholars quoted by The Conversation have expressed concerns over the difficulty of regulating a decentralized web, which they reported might make it more difficult to prevent cybercrime, online harassment, hate speech, and the dissemination of child abuse images.[13] But the news website also states that the “[decentralized web] represents the cyber-libertarian views and hopes of the past that the internet can empower ordinary people by breaking down existing power structures.” Some other critics of Web3 see the concept as a part of a cryptocurrency bubble, or as an extension of blockchain-based trends that they see as overhyped or harmful, particularly NFTs.[20] Some critics have raised concerns about the environmental impact of cryptocurrencies and NFTs. Others have expressed beliefs that Web3 and the associated technologies are a pyramid scheme.[5]
Kevin Werbach, author of The Blockchain and the New Architecture of Trust,[22] said that “many so-called ‘web3’ solutions are not as decentralized as they seem, while others have yet to show they are scalable, secure and accessible enough for the mass market”, adding that this “may change, but it’s not a given that all these limitations will be overcome”.[23]
David Gerard, author of Attack of the 50 Foot Blockchain,[24] told The Register that “web3 is a marketing buzzword with no technical meaning. It’s a melange of cryptocurrencies, smart contracts with nigh-magical abilities, and NFTs just because they think they can sell some monkeys to morons”.[25]
Below is an article from MarketWatch’s Distributed Ledger series about the different blockchains and cryptocurrencies involved.
by Frances Yue, Editor of Distributed Ledger, MarketWatch.com
Clayton Gardner, co-CEO of crypto investment management firm Titan, told Distributed Ledger that as crypto embraces broader adoption, he expects more institutions to bypass bitcoin and invest in other blockchains, such as Ethereum, Avalanche, and Terra, in 2022, which all boast smart-contract features.
Bitcoin traditionally did not support complex smart contracts, which are computer programs stored on blockchains, though a major upgrade in November might have unlocked more potential.
“Bitcoin was originally seen as a macro speculative asset by many funds and for many it still is,” Gardner said. “If anything solidifies its use case, it’s a store of value. It’s not really used as originally intended, perhaps from a medium of exchange perspective.”
Institutions that are looking for blockchains that can “produce utility and some intrinsic value over time” might consider some other smart-contract blockchains that have been driving the growth of decentralized finance and Web 3.0, the third generation of the Internet, according to Gardner.
“Bitcoin is still one of the most secure blockchains, but I think layer-one, layer-two blockchains beyond Bitcoin, will handle the majority of transactions and activities from NFT (nonfungible tokens) to DeFi,“ Gardner said. “So I think institutions see that and insofar as they want to put capital to work in the coming months, I think that could be where they just pump the capital.”
Decentralized social media?
The price of Decentralized Social, or DeSo, a cryptocurrency powering a blockchain that supports decentralized social media applications, surged roughly 74% to about $164 from $94 after DeSo was listed on Coinbase Pro on Monday, before it fell back to about $95, according to CoinGecko.
In the eyes of Nader Al-Naji, head of the DeSo foundation, decentralized social media has the potential to be “a lot bigger” than decentralized finance.
“Today there are only a few companies that control most of what we see online,” Al-Naji told Distributed Ledger in an interview. But DeSo is “creating a lot of new ways for creators to make money,” Al-Naji said.
“If you find a creator when they’re small, or an influencer, you can invest in that, and then if they become bigger and more popular, you make money, and they get capital early on to produce their creative work,” according to Al-Naji.
BitClout, the first application that was created by Al-Naji and his team on the DeSo blockchain, had initially drawn controversy, as some found that they had profiles on the platform without their consent, while the application’s users were buying and selling tokens representing their identities. Such tokens are called “creator coins.”
Al-Naji responded to the controversy, saying that DeSo now supports more than 200 social-media applications including BitClout. “I think that if you don’t like those features, you now have the freedom to use any app you want. Some apps don’t have that functionality at all.”
But before I get to the “selling monkeys to morons” quote,
I want to talk about
THE GOOD, THE BAD, AND THE UGLY
THE GOOD
My foray into Science 2.0, and my pondering of what a move to Science 3.0 would mean, led me to an article by Dr. Vladimir Teif, who studies gene regulation and the nucleosome and has created a worldwide group of scientists who discuss chromatin and gene regulation in a journal-club format.
Fragile Nucleosome is an international community of scientists interested in chromatin and gene regulation. Fragile Nucleosome is active in several spaces: one is the Discord server where several hundred scientists chat informally on scientific matters. You can join the Fragile Nucleosome Discord server. Another activity of the group is the organization of weekly virtual seminars on Zoom. Our webinars are usually conducted on Wednesdays 9am Pacific time (5pm UK, 6pm Central Europe). Most previous seminars have been recorded and can be viewed at our YouTube channel. The schedule of upcoming webinars is shown below. Our third activity is the organization of weekly journal clubs detailed at a separate page (Fragile Nucleosome Journal Club).
Dr. Teif had coined this concept of Science 3.0 back in 2009. As he mentioned:
So essentially I first introduced this word Science 3.0 in 2009, and since then we did a lot to implement this in practice. The Twitter account @generegulation is also one of examples
This is curious, as we still have an ill-defined concept of what #science3_0 would look like, but it is a good read nonetheless.
The concept of Science 2.0 was introduced almost a decade ago to describe the new generation of online-based tools for researchers allowing easier data sharing, collaboration and publishing. Although technically sound, the concept still does not work as expected. Here we provide a systematic line of arguments to modify the concept of Science 2.0, making it more consistent with the spirit and traditions of science and Internet. Our first correction to the Science 2.0 paradigm concerns the open-access publication models charging fees to the authors. As discussed elsewhere, we show that the monopoly of such publishing models increases biases and inequalities in the representation of scientific ideas based on the author’s income. Our second correction concerns post-publication comments online, which are all essentially non-anonymous in the current Science 2.0 paradigm. We conclude that scientific post-publication discussions require special anonymization systems. We further analyze the reasons of the failure of the current post-publication peer-review models and suggest what needs to be changed in Science 3.0 to convert Internet into a large journal club. [bold face added]
In this paper it is important to note the transition from Science 1.0, which involved hard-copy journal publications usually accessible only in libraries, to a more digital 2.0 format, where data, papers, and ideas could be easily shared among networks of scientists.
As Dr. Teif states, the term “Science 2.0” was coined some years ago, and several influential journals, including Science, Nature, and Scientific American, endorsed the term and suggested that scientists move their discussions online. However, even though thousands of scientists are now on Science 2.0 platforms, Dr. Teif notes that membership in many Science 2.0 networking groups, such as those on LinkedIn and ResearchGate, has seemingly saturated over the years, with few new members in recent times.
The consensus is that Science 2.0 networking is:
good because it multiplies the efforts of many scientists, including experts, and adds scientific discourse unavailable in the 1.0 format
good because online data sharing assists the process of discovery (as is evident with preprint servers, bio-curated databases, and GitHub projects)
good because open-access publishing gives free access to professional articles, and open access may be the only publishing format in the future (although this is highly debatable, as many journals are holding on to a type of “hybrid open access” format which is not truly open access)
good because sharing unfinished works, critiques, and opinions creates visibility for scientists, who can receive credit for their expert commentary
Dr. Teif articulates a few concerns about Science 3.0:
A. Science 3.0 Still Needs Peer Review
Peer review of scientific findings will always be imperative in the dissemination of well-done, properly controlled scientific discovery. Just as Science 2.0 relies on an army of scientific volunteers, the peer-review process involves an army of scientific experts who give their time to safeguard the credibility of science by ensuring that findings are reliable and data is presented fairly and properly. It has been very evident, in this time of pandemic and the rapid increase in the volume of preprint-server papers on SARS-CoV-2, that peer review is critical. Many papers on such preprint servers were later either retracted or failed a stringent peer-review process.
Many journals of the 1.0 format do not generally reward their peer reviewers other than with the self-credit that researchers list on their curricula vitae. Some journals, like the MDPI journal family, do issue peer-reviewer credits, which can be used to defray the high publication costs of open access (one area that many scientists lament about the open access movement, where the burden of publication cost lies on the individual researcher).
An issue which is highlighted is the potential for INFORMATION NOISE arising from the ability to self-publish on Science 2.0 platforms.
The NEW BREED was born in April 2012.
An ongoing effort on this platform, https://pharmaceuticalintelligence.com/, is to establish a scientific methodology for curating scientific findings, where one of the goals is to help quell the information noise that can result from the massive amounts of new informatics and data appearing in the biomedical literature.
B. Open Access Publishing Model leads to biases and inequalities in idea selection
The open access publishing model has been compared to the model applied by the advertising industry years ago, when publishers considered journal articles as “advertisements.” However, NOTHING could be further from the truth. In advertising, the companies, not the consumers, pay for the ads. In scientific open access publishing, although the consumers (libraries) do not pay for access, the burden of BOTH the cost of doing the research and the cost of publishing the findings is now placed on the individual researcher. Some of these publishing costs can be as high as $4,000 USD per article, which is very high for most researchers. Many universities try to reimburse these open access publishing fees, but the money still comes from the consumer or the individual researcher, limiting the cost savings to either.
This sets up a situation in which young researchers, who in general are not well funded, struggle with the publication costs, creating a bias or an inequitable system that rewards well-funded senior researchers and bigger academic labs.
C. Post-publication comments and discussion require online hubs and anonymization systems
Many recent publications stress the importance of a post-publication review process or system, yet although many big journals like Nature and Science have their own blogs and commentary systems, these are rarely used. In fact, there is roughly one comment per 100 views of a journal article on these systems. In traditional journals, editors are the referees of comments and have the ability to censor comments or discourse. The article laments that commenting on journal articles should be as easy as commenting on other social sites, yet scientists are not offering their comments or opinions.
In my personal experience, a well-written commentary goes through editors, who often reject a comment the way they would reject an original research article. Thus many scientists, I believe, after fashioning a well-researched and referenced reply, never see it get the light of day if it is not in the editor’s interest.
Therefore anonymity is greatly needed, and its absence may be the reason scientific discourse is so limited on these types of Science 2.0 platforms. Platforms that have had success in this arena include anonymized ones like Wikipedia and certain closed LinkedIn professional groups, but more open platforms like Google Knowledge have been a failure.
A great example on this platform was a very spirited conversation on LinkedIn on genomics, tumor heterogeneity, and personalized medicine, which we curated from the LinkedIn discussion (unfortunately LinkedIn has closed many groups).
In this discussion, it was surprising that over a weekend so many scientists from all over the world contributed to a great discussion on the topic of tumor heterogeneity.
But many feel such discussions would be safer if they were anonymized. However, then researchers do not get any credit for their opinions or commentaries.
A major problem is how to take these intangibles and make them into tangible assets that would both promote the discourse and reward those who take the time to improve scientific discussion.
This is where something like NFTs or a decentralized network may become important!
Below is an online @TwitterSpace discussion we had with some young scientists who are just starting out and gave their thoughts on what SCIENCE 3.0 and the future of the dissemination of science might look like, in light of this new Metaverse. However, we have to define each of these terms in light of science, and not treat the Internet as merely a decentralized marketplace for commonly held goods.
This online discussion was tweeted out and got a fair amount of impressions (60) as well as interactors (50).
To introduce this discussion, here is some start-off material to frame the discourse.
The Internet and the Web are rapidly adopting a new “Web 3.0” format, with decentralized networks, enhanced virtual experiences, and greater interconnection between people. Here we start the discussion of what the move will look like from Science 2.0, where the dissemination of scientific findings was revolutionized by piggybacking on Web 2.0 and social media, to a Science 3.0 format. What will it involve, and what paradigms will be turned upside down?
Old Science 1.0 is still the backbone of all scientific discourse, built on a massive amount of experimental and review literature. However, this literature was in analog format, and we have moved to a more accessible, digital, open-access format for both publications and raw data. Where 1.0 had a structure, like the Dewey Decimal system and indexing, 2.0 made science more accessible and easier to search owing to the newer digital formats. Yet both needed an organizing structure: for 1.0 it was the scientific method of data and literature organization, with libraries as the indexers; in 2.0 it has relied on an army of mostly volunteers, who did not have much in the way of incentivization to co-curate and organize the findings and the massive literature.
Each version of Science has its caveats: its benefits as well as its deficiencies. This curation and the ongoing discussion are meant to solidify the basis for the new format, along with definitions and a determination of structure.
We had high hopes for Science 2.0, in particular the smashing of data and knowledge silos. However, the digital age along with 2.0 platforms seemed to exacerbate this somehow. We are still critically short on analysis!
We really need people and organizations to get on top of this new Web 3.0 or metaverse so that similar issues do not get in the way: namely, we need to create an organizing structure (maybe as knowledgebases), we need INCENTIVIZED co-curators, and we need ANALYSIS… lots of it!!
Are these new technologies the cure or is it just another headache?
There were a few overarching themes, whether one was talking about AI, NLP, virtual reality, or other new technologies with respect to this new metaverse, and a consensus of Decentralized, Incentivized, and Integrated was commonly expressed among the attendees.
The following are some slides from representative presentations.
Other articles of note on this topic in this Open Access Scientific Journal include:
With the explosive development of decentralized finance, we witness a phenomenal growth in tokenization of all kinds of assets, including equity, funds, debt, and real estate. By taking advantage of blockchain technology, digital assets are broadly grouped into fungible and non-fungible tokens (NFT). Here non-fungible tokens refer to those with unique and non-substitutable properties. NFT has widely attracted attention, and its protocols, standards, and applications are developing exponentially. It has been successfully applied to digital fantasy artwork, games, collectibles, etc. However, there is a lack of research in utilizing NFT in issues such as Intellectual Property. Applying for a patent and trademark is not only a time-consuming and lengthy process but also costly. NFT has considerable potential in the intellectual property domain. It can promote transparency and liquidity and open the market to innovators who aim to commercialize their inventions efficiently. The main objective of this paper is to examine the requirements of presenting intellectual property assets, specifically patents, as NFTs. Hence, we offer a layered conceptual NFT-based patent framework. Furthermore, a series of open challenges about NFT-based patents and the possible future directions are highlighted. The proposed framework provides fundamental elements and guidance for businesses in taking advantage of NFTs in real-world problems such as grant patents, funding, biotechnology, and so forth.
Introduction
Distributed ledger technologies (DLTs) such as blockchain are emerging technologies posing a threat to existing business models. Traditionally, most companies used centralized authorities in various aspects of their business, such as financial operations and setting up a trust with their counterparts. With the emergence of blockchain, centralized organizations can be substituted with a decentralized group of resources and actors. The blockchain mechanism was introduced in the Bitcoin white paper in 2008, and it lets users generate transactions and spend their money without the intervention of banks [1]. Ethereum, a second generation of blockchain, was introduced in 2014, allowing developers to run smart contracts on a distributed ledger. With smart contracts, developers and businesses can create financial applications that use cryptocurrencies and other forms of tokens for applications such as decentralized finance (DeFi), crowdfunding, decentralized exchanges, data record keeping, etc. [2]. Recent advances in distributed ledger technology have developed concepts that lead to cost reduction and the simplification of value exchange. Nowadays, by leveraging the advantages of blockchain and taking into account the governance issues, digital assets can be represented as tokens that exist in the blockchain network, which facilitates their transmission and traceability, increases their transparency, and improves their security [3].
In the landscape of blockchain technology, two types of tokens can be defined: fungible tokens, in which all the tokens have equal value, and non-fungible tokens (NFTs), which feature unique characteristics and are not interchangeable. Non-fungible tokens are digital assets with a unique identifier that is stored on a blockchain [4]. The NFT was initially suggested in Ethereum Improvement Proposal (EIP) 721 [5], and it was later expanded in EIP-1155 [6]. NFTs became one of the most widespread applications of blockchain technology, reaching worldwide attention in early 2021. They can be digital representations of real-world objects. NFTs are tradable rights to digital assets (pictures, music, films, and virtual creations) where ownership is recorded in blockchain smart contracts [7].
In particular, fungibility is the ability to exchange one item for another of the same kind, an essential currency feature. A non-fungible token is unique and therefore cannot be substituted [8]. Recently, blockchain enthusiasts have indicated significant interest in various types of NFTs, enthusiastically participating in NFT-related games and trades. CryptoPunks [9], one of the first NFT projects on Ethereum, generated almost 10,000 collectible punks and helped popularize the ERC-721 standard. With the gamification of its breeding mechanics, CryptoKitties [10] officially placed NFTs at the forefront of the market in 2017. CryptoKitties is an early blockchain game that enables users to buy, sell, collect, and digitally breed cats. Another example is NBA Top Shot [11], an NFT trading platform for buying and selling digital short films of NBA events.
NFTs are developing remarkably and have found many applications, such as artist royalties, in-game assets, and educational certificates. However, the concept is relatively new, and many areas of application remain to be explored. Intellectual Property, including patents, trademarks, and copyrights, is an important area where NFTs can be applied usefully to solve existing problems.
Although NFTs have had many applications so far, they have rarely been used to solve real-world problems. In fact, NFTs are an exciting concept for Intellectual Property (IP). Applying for a patent or trademark is not only a time-consuming and lengthy process, but also a costly one: registering a copyright or trademark may take months, while securing a patent can take years. By contrast, with the help of the unique features of NFT technology, it is possible to accelerate this process with considerable confidence and assurance about protecting the ownership of an IP. NFTs can offer IP protection while an applicant waits for the government to grant more formal protection. There is cause for excitement among people who believe NFTs and blockchain would make buying and selling patents easier, offering new opportunities for companies, universities, and inventors to make money off their innovations [12]. Patent holders would benefit from such innovation: it would give them the ability to ‘tokenize’ their patents, and because every transaction would be logged on a blockchain, it would be much easier to trace patent ownership changes. NFTs would also facilitate revenue generation from patents by democratizing patent licensing. NFTs support the intellectual property market by embedding automatic royalty-collecting methods inside inventors’ works, providing them with financial benefits any time their innovation is licensed. For example, each inventor’s patent could be minted as an NFT, and these NFTs could be joined together to form a commercial IP portfolio and minted as a compounded NFT. Each investor would automatically get their fair share of royalties whenever licensing revenue is generated, without anyone having to track them down.
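The portfolio royalty flow described above is, at bottom, a pro-rata split of licensing revenue across the patent NFTs joined into the compounded NFT. As a toy sketch of that arithmetic only (our own illustration, with hypothetical inventors, share weights, and revenue; not the paper’s implementation):

```python
# Toy sketch: pro-rata royalty split for a compounded IP-portfolio NFT.
# Holder names, share weights, and the revenue figure are hypothetical.

def distribute_royalties(licensing_revenue: float,
                         shares: dict[str, float]) -> dict[str, float]:
    """Split licensing revenue among patent-NFT holders in proportion to their shares."""
    total = sum(shares.values())
    return {holder: licensing_revenue * s / total for holder, s in shares.items()}

# Three inventors whose patent NFTs were compounded into one portfolio NFT.
portfolio_shares = {"inventor_a": 50, "inventor_b": 30, "inventor_c": 20}

print(distribute_royalties(10_000.0, portfolio_shares))
# {'inventor_a': 5000.0, 'inventor_b': 3000.0, 'inventor_c': 2000.0}
```

On-chain, a smart contract would perform this split automatically each time licensing revenue arrives, which is what removes the need to track each investor down.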
In [13], an overview of NFTs’ applications in different areas such as gambling, games, and collectibles is discussed. In addition, [4] provides a prototype for an event-tracking application based on an Ethereum smart contract, and NFTs as a solution for art and real estate auction systems are described in [14]. However, these studies have not discussed existing standards or a generalized architecture enabling NFTs to be applied in diverse applications. The authors in [15] provide two general design patterns for creating and trading NFTs and discuss existing token standards for NFTs; however, the proposed designs are limited to Ethereum, and other blockchains are not considered [16]. Moreover, the different technologies for each step of the proposed procedure are not discussed. In [8], the authors provide a conceptual framework for token design and management and discuss five views: token view, wallet view, transaction view, user interface view, and protocol view. However, no research provides a generalized conceptual framework for generating, recording, and tracing NFT-based IP in a blockchain network.
Even with the clear benefits that NFT-backed patents offer, there are a number of impediments to actually achieving such a system. For example, convincing patent owners to put current ownership records for their patents into NFTs poses an initial obstacle. Because there is no reliable framework for NFT-based patents, this paper provides a conceptual framework for presenting NFT-based patents with a comprehensive discussion on many aspects, ranging from the background, model components, token standards to application domains and research challenges. The main objective of this paper is to provide a layered conceptual NFT-based patent framework that can be used to register patents in a decentralized, tamper-proof, and trustworthy peer-to-peer network to trade and exchange them in the worldwide market. The main contributions of this paper are highlighted as follows:
Providing a comprehensive overview on tokenization of IP assets to create unique digital tokens.
Discussing the components of a distributed and trustworthy framework for minting NFT-based patents.
Highlighting a series of open challenges of NFT-based patents and outlining possible future trends.
The rest of the paper is structured as follows: “Background” section describes the Background of NFTs, Non-Fungible Token Standards. The NFT-based patent framework is described in “NFT-based patent framework” section. The Discussion and challenges are presented in “Discussion” section. Lastly, conclusions are given in “Conclusion” section.
Background
Colored Coins could be considered the first step toward NFTs, designed on top of the Bitcoin network. Bitcoins are fungible, but it is possible to mark them to be distinguishable from other bitcoins. These marked coins have special properties representing real-world assets like cars and stocks, and owners can prove their ownership of physical assets through the colored coins. By utilizing Colored Coins, users can transfer their marked coins’ ownership like a usual transaction and benefit from Bitcoin’s decentralized network [17]. However, Colored Coins had limited functionality due to the Bitcoin script limitations. Pepe is a green frog meme originated by Matt Furie; users created tokens for Pepes and traded them through the Counterparty platform, where newly submitted Pepe images were vetted for sufficient rarity. Rare Pepe allows users to preserve scarcity, manage ownership, and transfer their purchased Pepes.
In 2017, Larva Labs developed the first Ethereum-based NFT project, named CryptoPunks. It contains 10,000 unique human-like characters generated randomly. The official ownership of each character is stored in an Ethereum smart contract, and owners can trade characters. The CryptoPunks project inspired the CryptoKitties project. CryptoKitties, launched in late 2017, attracted attention to NFTs and is a pioneer in blockchain games. It is a blockchain-based virtual game in which users collect and trade characters with unique features that shape kitties. The game was developed as an Ethereum smart contract, and it pioneered ERC-721, the first standard token in the Ethereum blockchain for NFTs. After the 2017 hype in NFTs, many projects started in this context. Due to the increased attention to NFTs’ use cases and their growing market cap, different blockchains like EOS, Algorand, and Tezos started to support NFTs, and various marketplaces like SuperRare, Rarible, and OpenSea were developed to help users trade NFTs. In general, assets are categorized into two main classes: fungible and non-fungible assets. Fungible assets are those that another similar asset can replace. Fungible items have two main characteristics: replicability and divisibility.
Currency is a fungible item because a ten-dollar bill can be exchanged for another ten-dollar bill or divided into ten one-dollar bills. Unlike fungible items, non-fungible items are unique and distinguishable. They cannot be divided or exchanged for another identical item. The first tweet on Twitter is a non-fungible item with these characteristics: another tweet cannot replace it, and it is unique and not divisible. An NFT is a non-fungible cryptographic asset that is declared in a standard token format and has a unique set of attributes. Due to transparency, proof of ownership, and traceable transactions in the blockchain network, NFTs are created using blockchain technology.
Blockchain-based NFT platforms help enthusiasts create NFTs in a standard token format in the blockchain, transfer the ownership of their NFTs to a buyer, assure the uniqueness of NFTs, and manage NFTs completely. In addition, there are semi-fungible tokens that have characteristics of both fungible and non-fungible tokens. Semi-fungible tokens are fungible within the same class or specific time and non-fungible across other classes or different times. A plane ticket can be considered a semi-fungible token because a charter ticket can be exchanged for another charter ticket but cannot be exchanged for a first-class ticket. The concept of semi-fungible tokens plays a main role in blockchain-based games and reduces NFTs’ overhead. In Fig. 1, we illustrate fungible, non-fungible, and semi-fungible tokens. The main properties of NFTs are described as follows [15]:
Figure 1: Fungible, non-fungible, and semi-fungible tokens.
Ownership: Because of the blockchain layer, the owner of an NFT can easily prove possession with his/her keys. Other nodes can verify the user’s ownership publicly.
Transferability: Users can freely transfer ownership of their NFTs to others on dedicated markets.
Transparency: By using blockchain, all transactions are transparent, and every node in the network can confirm and trace the trades.
Fraud Prevention: Fraud is one of the key problems in trading assets; hence, using NFTs ensures buyers buy a non-counterfeit item.
Immutability: Metadata, token ID, and history of transactions of NFTs are recorded in a distributed ledger, and it is impossible to change the information of the purchased NFTs.
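To make the fungibility distinctions above concrete, here is a minimal sketch in Python (our own illustration, not from the paper; the token classes are hypothetical) of the exchange rules the three token types obey:

```python
# Minimal sketch of fungible / semi-fungible / non-fungible exchange rules.
# Token classes such as "charter_ticket" are hypothetical examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class Token:
    token_class: str   # asset class, e.g. "usd_10_bill" or "charter_ticket"
    token_id: int      # meaningful mainly for non-fungible tokens
    fungible: bool     # True if interchangeable within its own class

def exchangeable(a: Token, b: Token) -> bool:
    """Fungible (and semi-fungible) tokens swap only within the same class;
    a non-fungible token is unique and cannot be substituted at all."""
    if a.fungible and b.fungible:
        return a.token_class == b.token_class
    return False

ten_dollars = Token("usd_10_bill", 0, fungible=True)     # fungible
charter_a = Token("charter_ticket", 1, fungible=True)    # semi-fungible: fungible in-class
first_class = Token("first_class_ticket", 2, fungible=True)
first_tweet = Token("tweet", 1, fungible=False)          # non-fungible

print(exchangeable(charter_a, first_class))                      # False: different classes
print(exchangeable(ten_dollars, Token("usd_10_bill", 3, True)))  # True: same class
print(exchangeable(first_tweet, Token("tweet", 2, False)))       # False: unique items
```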
Non-fungible standards
The Ethereum blockchain pioneered the implementation of NFTs. The ERC-721 token was the first standard token for NFTs accepted in the Ethereum network. With the increase in popularity of NFTs, developers started developing and enhancing NFT standards on different blockchains like EOS, Algorand, and Tezos. This section provides a review of the NFT standards implemented on these blockchains.
Ethereum
ERC-721, a free and open-source standard, was the first standard for NFTs developed on Ethereum. ERC-721 is an interface that a smart contract should implement to be able to transfer and manage NFTs. Each ERC-721 token has unique properties and a different token ID. ERC-721 tokens include the owner’s information, a list of approved addresses, a transfer function that implements transferring tokens from owner to buyer, and other useful functions [5].
In ERC-721, smart contracts can group tokens with the same configuration, but each token has different properties, so ERC-721 does not support fungible tokens. ERC-1155 is another standard on Ethereum, developed by Enjin, with richer functionality than ERC-721: it supports fungible, non-fungible, and semi-fungible tokens. In ERC-1155, IDs define classes of assets: different IDs represent different classes, and each ID may cover many assets of the same class. Using ERC-1155, a user can transfer different types of tokens in a single transaction and mix multiple fungible and non-fungible types of tokens in a single smart contract [6]. ERC-721 and ERC-1155 both support operators, whereby the owner can let an operator originate transfers of the token.
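As a client-side sketch of the difference (using the web3.py library; the RPC endpoint, contract addresses, and token IDs below are placeholders, not real deployments): an ERC-721 token ID has exactly one owner, while an ERC-1155 ID is queried as a balance per account.

```python
# Sketch: querying ERC-721 ownership vs. an ERC-1155 balance with web3.py.
# RPC_URL, contract addresses, and token IDs are placeholders.
from web3 import Web3

RPC_URL = "https://example-rpc.invalid"  # placeholder endpoint
w3 = Web3(Web3.HTTPProvider(RPC_URL))

# Minimal ABIs containing only the functions called below.
ERC721_ABI = [{
    "name": "ownerOf", "type": "function", "stateMutability": "view",
    "inputs": [{"name": "tokenId", "type": "uint256"}],
    "outputs": [{"name": "", "type": "address"}],
}]
ERC1155_ABI = [{
    "name": "balanceOf", "type": "function", "stateMutability": "view",
    "inputs": [{"name": "account", "type": "address"},
               {"name": "id", "type": "uint256"}],
    "outputs": [{"name": "", "type": "uint256"}],
}]

nft = w3.eth.contract(address="0x0000000000000000000000000000000000000001", abi=ERC721_ABI)
multi = w3.eth.contract(address="0x0000000000000000000000000000000000000002", abi=ERC1155_ABI)

# ERC-721: every token ID maps to exactly one owning address.
owner = nft.functions.ownerOf(42).call()

# ERC-1155: one ID can represent a whole class of assets,
# so ownership is expressed as a per-account balance instead.
balance = multi.functions.balanceOf(owner, 7).call()
print(owner, balance)
```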
EOSIO
EOSIO is an open-source blockchain platform released in 2018 that claims to eliminate transaction fees and increase transaction throughput. EOSIO differs from Ethereum in its wallet-creation algorithm and its procedure for handling transactions. dGoods is a free standard developed on the EOS blockchain for assets, focused on large-scale use cases. It supports a hierarchical naming structure in smart contracts: each contract has a unique symbol and a list of categories, and each category contains a list of token names. Therefore, a single contract in dGoods can contain many tokens, which makes transferring a group of tokens efficient. Using this hierarchy, dGoods supports fungible, non-fungible, and semi-fungible tokens. It also supports batch transferring, whereby the owner can transfer many tokens in one operation [18].
Algorand
Algorand is a new high-performance public blockchain launched in 2019. It provides scalability while maintaining security and decentralization. It supports smart contracts and tokens for representing assets [19]. Algorand defines the Algorand Standard Assets (ASA) concept to create and manage assets in the Algorand blockchain. Using ASA, users are able to define fungible and non-fungible tokens. In Algorand, users can create NFTs or FTs without writing smart contracts; they need only run a single transaction in the Algorand blockchain. Each transaction contains some mutable and immutable properties [20].
Each account in Algorand can create up to 1000 assets, and for every asset an account creates or receives, the minimum balance of the account increases by 0.1 Algos. Algorand also supports fractional NFTs by splitting an NFT into a group of divided FTs or NFTs, where each part can be exchanged independently [21]. Algorand uses a Clawback Address that operates like an operator in ERC-1155; it is allowed to transfer tokens of an owner who has permitted the operator.
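As a sketch of the single-transaction mint described above (using py-algorand-sdk; the node address and token, the freshly generated account, and the metadata URL are placeholders, and the import path assumes the v2 SDK, where earlier versions used algosdk.future.transaction): an ASA with a total supply of 1 and 0 decimals behaves as an NFT.

```python
# Sketch: minting an NFT as an Algorand Standard Asset (ASA) with py-algorand-sdk.
# Node address/token, the throwaway account, and the metadata URL are placeholders.
from algosdk import account, transaction
from algosdk.v2client import algod

client = algod.AlgodClient("a" * 64, "http://localhost:4001")  # placeholder sandbox node
private_key, sender = account.generate_account()  # throwaway, unfunded account

params = client.suggested_params()
txn = transaction.AssetConfigTxn(
    sender=sender,
    sp=params,
    total=1,             # supply of exactly one token...
    decimals=0,          # ...that cannot be subdivided: an NFT
    default_frozen=False,
    unit_name="PAT",
    asset_name="Patent NFT #1",
    url="ipfs://example-metadata-cid",  # placeholder metadata pointer
    manager=sender,
    clawback=sender,     # Algorand's analogue of an ERC-1155 operator
    strict_empty_address_check=False,   # reserve/freeze intentionally left unset
)
signed = txn.sign(private_key)
txid = client.send_transaction(signed)
print("submitted asset-creation transaction:", txid)
```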
Tezos
Tezos is another decentralized open-source blockchain. Tezos supports the meta-consensus concept: in addition to using a consensus protocol on the ledger’s state, as Bitcoin and Ethereum do, it also attempts to reach consensus on how nodes and the protocol itself should change or upgrade22. FA2 (TZIP-12) is a standard for a unified token contract interface on the Tezos blockchain. FA2 supports different token types, including fungible, non-fungible, and fractionalized NFT contracts. In Tezos, a token is identified by a pair consisting of a token contract address and a token ID. Tezos also supports batch token transferring, which reduces the cost of transferring multiple tokens.
Flow
Flow was developed by Dapper Labs to remove the scalability limitations of the Ethereum blockchain. Flow is a fast, decentralized blockchain focused on games and digital collectibles. Its architecture improves throughput and scalability without sharding. Flow supports smart contracts written in Cadence, a resource-oriented programming language. In Cadence, an NFT can be described as a resource with a unique ID. Resources follow strict ownership rules: a resource has exactly one owner and cannot be copied or lost. These features protect the NFT owner. In Flow, NFT metadata, including images and documents, can be stored on-chain or off-chain. In addition, Flow defines a Collection concept, in which each collection is an NFT resource that can include a list of resources; it is a dictionary whose keys are resource IDs and whose values are the corresponding NFTs.
The Collection concept enables batch transferring of NFTs. Moreover, an NFT can own another NFT: in CryptoKitties, for instance, a unique cat (an NFT) can own a unique hat (another NFT). Flow uses Cadence’s second layer of access control to allow designated operators to access certain fields of an NFT23. Table 1 compares the standards described above in terms of support for fungible tokens, non-fungible tokens, batch transferring (the owner can transfer multiple tokens in one operation), operators (the owner can approve an operator to initiate token transfers), and fractionalized NFTs (an NFT can be divided into separate tokens, each exchangeable independently).
Table 1 Comparing NFT standards.
In this section, we propose a framework for presenting NFT-based patents. We describe details of the proposed distributed and trustworthy framework for minting NFT-based patents, as shown in Fig. 2. The proposed framework includes five main layers: Storage Layer, Authentication Layer, Verification Layer, Blockchain Layer, and Application Layer. Details of each layer and the general concepts are presented as follows.
Figure 2
Storage layer
The continuous growth of data in blockchain technology is pushing various information systems towards decentralized storage networks, which were created to bring additional benefits to the technological world24. Two benefits of decentralized storage systems stand out: (1) cost savings, achieved by making optimal use of existing storage, and (2) redundancy, since multiple copies are kept on different nodes, avoiding bottlenecks on central servers and speeding up downloads. This foundation layer provides the infrastructure required for storage. Items on NFT platforms have unique characteristics that must be included for identification.
Non-fungible token metadata provides information that describes a particular token ID. NFT metadata is stored either on-chain or off-chain. On-chain storage means incorporating the metadata directly into the NFT’s smart contract, which represents the tokens; off-chain storage means hosting the metadata separately25.
Blockchains provide decentralization but are expensive for data storage and never allow data to be removed. For example, because of the Ethereum blockchain’s current storage limits and high maintenance costs, many projects maintain their metadata off-chain. Developers use the ERC-721 standard’s tokenURI method, which lets applications discover the location of the metadata for a specific item. Currently, there are three main solutions for off-chain storage: the InterPlanetary File System (IPFS), Pinata, and Filecoin.
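As a rough illustration of how an application might follow tokenURI to off-chain metadata, the sketch below uses the web3.py library; the RPC endpoint and contract address are placeholders, not real deployments.

```python
# Sketch: resolving an ERC-721 token's off-chain metadata via tokenURI.
# Requires `pip install web3 requests`. The RPC URL and contract address
# below are placeholders (assumptions), not real deployments.
import json
import requests
from web3 import Web3

# Minimal ABI fragment containing only the tokenURI view function.
TOKEN_URI_ABI = json.loads("""[{
  "name": "tokenURI", "type": "function", "stateMutability": "view",
  "inputs": [{"name": "tokenId", "type": "uint256"}],
  "outputs": [{"name": "", "type": "string"}]
}]""")

w3 = Web3(Web3.HTTPProvider("https://example-rpc.invalid"))   # placeholder
nft = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",     # placeholder
    abi=TOKEN_URI_ABI,
)

uri = nft.functions.tokenURI(1).call()
if uri.startswith("ipfs://"):          # ipfs:// URIs go through a gateway
    uri = "https://ipfs.io/ipfs/" + uri[len("ipfs://"):]
metadata = requests.get(uri, timeout=10).json()
print(metadata.get("name"), metadata.get("image"))
```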
IPFS
The InterPlanetary File System (IPFS) is a peer-to-peer hypermedia protocol for decentralized storage of media content. Because storing NFT-related media files directly on a blockchain is expensive, IPFS can be the most affordable and efficient solution. IPFS combines multiple technologies inspired by Git and BitTorrent, including a block exchange system, distributed hash tables (DHT), and a version control system26. On a peer-to-peer network, the DHT is used to coordinate and maintain metadata.
In other words, hash values must be mapped to the objects they represent. When storing an object such as a file, IPFS generates a hash value that starts with the prefix Qm and acts as a reference to that specific item. Objects larger than 256 KB are divided into blocks of up to 256 KB, and a hash tree is used to interconnect all the blocks belonging to the same object. IPFS uses the Kademlia DHT. The block exchange system, BitSwap, is a BitTorrent-inspired protocol used to exchange blocks. Asymmetric encryption can be used to prevent unauthorized access to content stored on IPFS27.
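The following simplified Python sketch illustrates this content-addressing scheme: content is split into 256 KB blocks, each block is hashed, and the block hashes are linked into a single root identifier. Real IPFS CIDs use multihash encoding and a Merkle DAG, which this sketch abstracts away.

```python
# Simplified illustration of IPFS-style content addressing with SHA-256.
import hashlib

BLOCK_SIZE = 256 * 1024  # 256 KB, as described above

def block_hashes(data: bytes):
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).digest()
            for i in range(0, len(data), BLOCK_SIZE)]

def root_id(data: bytes) -> str:
    hashes = block_hashes(data)
    if len(hashes) == 1:        # small objects fit in a single block
        return hashes[0].hex()
    # Interconnect the blocks by hashing their concatenated digests.
    return hashlib.sha256(b"".join(hashes)).hex()

document = open("patent.pdf", "rb").read()  # hypothetical file name
print("content id:", root_id(document))
```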
Pinata
Pinata is a popular platform for managing and uploading files on IPFS. It provides secure and verifiable files for NFTs. Most NFTs store their data off-chain, with the NFT on the blockchain merely pointing to a URL for the data. The main problem is that the information behind that URL can change.
This means that an NFT supposed to describe a certain patent can be changed without anyone knowing, defeating the purpose of the NFT in the first place. This is where Pinata comes in. Pinata uses IPFS to create content-addressable hashes of data, known as Content Identifiers (CIDs). CIDs serve both as a way of retrieving data and as a means of ensuring data validity. Anyone looking to retrieve data simply asks the IPFS network for the data associated with a certain CID, and if any node on the network holds that data, it is returned to the requester. The data is automatically rehashed on the requester’s computer upon retrieval to make sure it matches the original CID requested. This process ensures that the data received is exactly what was asked for; if a malicious node attempts to send fake data, the resulting CID on the requester’s end will differ, alerting the requester that they are receiving incorrect data28.
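A minimal sketch of that verification step, with a plain SHA-256 hex digest standing in for a real CID:

```python
# Sketch of retrieve-and-verify: the requester rehashes whatever bytes a
# node returns and compares the digest with the CID it asked for.
import hashlib

def fetch_and_verify(cid: str, fetch) -> bytes:
    data = fetch(cid)                  # e.g. a call to an IPFS gateway
    if hashlib.sha256(data).hexdigest() != cid:
        raise ValueError("content does not match requested CID")
    return data

honest = lambda cid: b"patent document"
cid = hashlib.sha256(b"patent document").hexdigest()
assert fetch_and_verify(cid, honest) == b"patent document"

malicious = lambda cid: b"tampered document"
try:
    fetch_and_verify(cid, malicious)
except ValueError as err:
    print("rejected:", err)            # the fake data is detected
```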
Filecoin
Another decentralized storage network is Filecoin. Built on top of IPFS, it is designed to store the most important data, such as media files. Truffle Suite has also launched an NFT Development Template with Filecoin Box. NFT.Storage (free decentralized storage for NFTs)29 is a service backed by Protocol Labs and Pinata specifically for storing NFT data. It lets users store NFT content and metadata easily, securely, and free of charge using IPFS and Filecoin. Through content addressing and decentralized storage, NFT.Storage allows developers to protect their NFT assets and associated metadata, ensuring that all NFTs follow best practices and remain accessible for the long term. The details of this system are as follows30:
Content addressing
Once users upload data to NFT.Storage, they receive a CID, an IPFS hash of the content. CIDs are the data’s unique fingerprints: universal addresses that can be used to refer to the content regardless of how or where it is stored. Using CIDs to reference NFT data avoids problems such as broken links and “rug pulls”, since CIDs are generated from the content itself.
Provable storage
NFT.Storage uses Filecoin for long-term decentralized data storage. Filecoin uses cryptographic proofs to assure the NFT data’s durability and persistence over time.
Resilient retrieval
Data stored via IPFS and Filecoin can be fetched directly in the browser through any public IPFS gateway.
Authentication Layer
The second layer is the authentication layer, whose functions we briefly highlight in this section. The Decentralized Identity (DID) approach helps users collect credentials from a variety of issuers, such as governments, educational institutions, or employers, and save them in a digital wallet. A verifier then uses these credentials to check a person’s validity, following the “identity and access management (IAM)” process against a blockchain-based ledger. DID thus puts users in control of their identity. A lack of NFT verifiability also leads to intellectual property and copyright infringements; the chain of custody can, of course, be traced back to the creator’s public address to check whether a similar patent was filed from that address, but there is no quick and foolproof way to verify an NFT creator’s legitimacy. Without such verification built into the NFT, an NFT proves ownership only over itself and nothing more.
Self-sovereign identity (SSI)31 is a solution to this problem. SSI is a new series of standards guiding a new identity architecture for the Internet. With a focus on privacy, security, and interoperability, SSI applications use public-key cryptography with public blockchains to generate persistent identities for people, with private and selective information disclosure. Blockchain technology offers a way to establish trust and transparency and to provide a secure, publicly verifiable KYC (Know Your Customer) process. The blockchain architecture allows information from various service providers to be collected into a single, cryptographically secure, and unchanging database that does not need a third party to verify its authenticity.
The proposed platform generates patent-related smart contracts that act as programs running on the blockchain to receive and send transactions. These contracts are unalterable, and clients are privately identified through a thorough KYC process; after KYC approval, an NFT is minted on the blockchain as a certificate of verification32. At this layer, we use a decentralized authentication solution. This solution has been applied in various blockchain settings (e.g., smart cities and the Internet of Things33,34), but here we use it for the proposed framework (patents as NFTs). Details of this solution are presented in the following.
Decentralized authentication
This section presents the authentication layer, which, similar to35, builds validated communication in a secure and decentralized manner via blockchain technology. As shown in Fig. 3, the authentication protocol comprises two processes: registration and login.
Figure 3
Registration
In the registration process of the proposed authentication protocol, we first initialize the user’s public key as their identity key (UserName). We then upload this identity key to the blockchain, where transactions can later be verified by other users. Finally, the user generates an identity transaction.
Login
After registration, a user logs in to the system. The login process is described as follows:
1. The user commits identity information and imports their secret key into the service application to log in.
2. A user who needs to log in sends a login request to the network’s service provider.
3. The service provider analyzes the login request, extracts the hash, queries the blockchain, and obtains identity information from an identity list (identity transactions).
4. The service provider responds with an authentication request when the above process is completed. A timestamp (to avoid a replay attack), the user’s UserName, and a signature are all included in the authentication request.
5. The user creates a signature over five parameters: the timestamp, their own UserName and PK, and the service provider’s UserName and PK. This signature serves as the user’s authentication credential.
6. The service provider verifies the received information; if it is valid, the authentication succeeds. Otherwise, the authentication fails and the user’s login is denied. (A sketch of steps 4 to 6 follows.)
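A minimal sketch of steps 4 to 6, assuming Ed25519 signatures; the protocol description above does not fix a signature scheme, so the choice here is an illustration only. Requires `pip install cryptography`.

```python
# Sketch of the signature-based login exchange. Ed25519 is an assumption;
# the UserName/PK strings are placeholders for the values named in step 5.
import time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Registration: the user's public key doubles as their identity key.
user_sk = Ed25519PrivateKey.generate()
user_pk = user_sk.public_key()

# Login: sign the five parameters named in step 5.
message = b"|".join([
    str(int(time.time())).encode(),   # timestamp, to block replay attacks
    b"user-name", b"user-pk",         # user's UserName and PK (placeholders)
    b"provider-name", b"provider-pk"  # service provider's UserName and PK
])
credential = user_sk.sign(message)

# Step 6: the service provider verifies against the on-chain identity key.
try:
    user_pk.verify(credential, message)
    print("authentication succeeded")
except InvalidSignature:
    print("authentication failed; login denied")
```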
Today, a patent application must be assessed by the World Intellectual Property Organization (WIPO) and multiple target patent offices in various nations or regions, resulting in inefficiency, high costs, and uncertainty. This study presents a conceptual NFT-based patent framework for issuing, validating, and sharing patent certificates. The platform aims to support counterfeit protection as well as secure access to and management of certificates according to the needs of applicants, companies, patent offices, and certification authorities.
Here, a certification authority (CA) is used to authenticate patent offices. A patent will first be validated if it is provided with a digital certificate that meets the X.509 standard. Certificate authorities are introduced into the system to authenticate both the nodes and the clients connected to the blockchain network.
Verification layer
In permissioned blockchains, only identified nodes can read and write to the distributed ledger. Nodes can act in different roles and hold various permissions. A distributed system can therefore be designed in which patent-granting offices serve as the identified nodes. Here the system is described conceptually at a high level; Figure 4 illustrates the sequence diagram of this layer. The layer includes four levels, as follows:
Figure 4
Digitalization
For a patent to be published as an NFT on the blockchain, it must be in a digitized format. This level corresponds to the “filing step” of traditional patent registration. An application in the application layer could allow users to enter the various pieces of patent information online.
Recording
Patents carry valuable information and can bring financial benefits to their owners. If a patent were published openly on a blockchain network, miners could reject it and claim the innovation for themselves; at a minimum, this would weaken consensus reliability and encourage miners to misbehave. To prevent this, the inventor should first record the innovation privately using proof of existence: the inventor generates the hash of the patent document and records it on the blockchain. As soon as it is recorded, the timestamp and the hash become publicly available, and the inventor can prove the existence of the patent document whenever needed.
Furthermore, using methods like Decision Thinking36, an inventor can record each phase of patent development separately. At each stage, the user generates the hash of the finished part and publishes it together with the previous part’s hash. The result is a linked series of hashes that documents the patent’s development, and the inventor can prove the existence of each phase using the original documents. This prevents others from appropriating the patent, and the inventor can be sure that the patent document is recorded confidentially and immutably37.
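A minimal sketch of this chained recording scheme, assuming SHA-256 as the hash function:

```python
# Each phase's record hashes the phase document together with the previous
# record, so the final hash commits to the whole development history.
import hashlib

def record_phases(phase_documents):
    chain, prev = [], b""
    for doc in phase_documents:
        h = hashlib.sha256(prev + hashlib.sha256(doc).digest()).digest()
        chain.append(h)   # publish h on-chain; the document stays private
        prev = h
    return chain

phases = [b"initial idea", b"prototype notes", b"final claims"]
for i, h in enumerate(record_phases(phases), 1):
    print(f"phase {i} record: {h.hex()[:16]}...")
```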
Different hash algorithms exist, with different architectures, time complexities, and security properties. A hash function should satisfy two main requirements. Pre-image resistance: it should be computationally hard to recover the input of a hash function when only the output and the algorithm are publicly known. Collision resistance: it should be computationally hard to find two distinct inputs x and y that produce the same hash output. Both requirements are vital for recording patents. First, pre-image resistance makes it impossible for others to reconstruct the patent documentation from the published hash; otherwise, anybody could read the patent even before its official publication. Second, collision resistance precludes users from changing their document after recording it; otherwise, a user could record one document and later replace it with another.
Among the many hash algorithms, the MD and SHA families are the most widely used. According to38, collisions have been found for the MD2, MD4, MD5, SHA-0, and SHA-1 hash functions, so they are not a good choice for recording patents. The SHA-2 family is secure, and no collisions have been found. Although SHA-2 is noticeably slower than earlier hash algorithms, the recording phase is not highly time-sensitive, so SHA-2 is the better choice and provides excellent security for users.
Validating
In this phase, inventors first create NFTs for their patents and publish them to the miners/validators. Miners are identified nodes that validate NFTs for recording on the blockchain. Because patent validation is a specialized task, miners cannot be inexpert members of the public; at the same time, there are too few patent offices to make the network fully decentralized. The miners can therefore be specialists certified by the patent offices: they receive a digital certificate from a patent office attesting to their eligibility to referee a patent.
Digital certificate
Digital certificates are digital credentials used to verify the online identities of networked entities. They usually include a public key as well as the owner’s identification, and they are issued by Certification Authorities (CAs), who must verify the certificate holder’s identity. Certificates contain cryptographic keys for signing, encryption, and decryption. X.509 is a standard that defines the format of public-key certificates signed by a certificate authority. The X.509 standard has multiple fields, and its structure is shown in Fig. 5:
Version: indicates the version of the X.509 standard. X.509 has multiple versions, each with a different structure; depending on the CA, validators can choose their desired version.
Serial Number: distinguishes a certificate from all others, so each certificate has a unique serial number.
Signature Algorithm Identifier: indicates the cryptographic algorithm used by the certificate authority to sign the certificate.
Issuer Name: the name of the issuer, generally the certificate authority.
Validity Period: each certificate is valid for a defined period. This limited period partly protects certificates against exposure of the CA’s private key.
Subject Name: the name of the requester; in our proposed framework, the validator’s name.
Subject Public Key Info: the public key of the subject for which the certificate was issued.
These fields are identical across all versions of the X.509 standard39.
Figure 5
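As an illustration of these fields in practice, the sketch below issues a certificate with the `cryptography` Python package; the patent office is modeled as a self-signed CA, and all names, the key type, and the validity window are illustrative assumptions.

```python
# Sketch: issuing an X.509 certificate with the fields listed above.
# Requires `pip install cryptography`. Names and dates are illustrative.
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

ca_key = ec.generate_private_key(ec.SECP256R1())  # the CA's signing key
issuer = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Patent Office CA")])
subject = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Validator 42")])

now = datetime.datetime.utcnow()
cert = (
    x509.CertificateBuilder()
    .subject_name(subject)                       # Subject Name: the validator
    .issuer_name(issuer)                         # Issuer Name: the CA
    .public_key(ca_key.public_key())             # Subject Public Key Info
    .serial_number(x509.random_serial_number())  # unique Serial Number
    .not_valid_before(now)                       # Validity Period start
    .not_valid_after(now + datetime.timedelta(days=365))
    .sign(ca_key, hashes.SHA256())               # Signature Algorithm Identifier
)
print("version:", cert.version, "serial:", cert.serial_number)
```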
Certificate authority
A Certificate Authority (CA) issues digital certificates. A CA signs each certificate with its private key, which is never disclosed, and anyone can verify the certificate using the CA’s public key.
Here, the patent office creates certificates for the requested patent referees. The patent office writes the validator’s information into the certificate and signs it with the patent office’s private key. The validator can use the certificate to assure others of their eligibility, and other nodes can check a requesting node’s information by verifying the certificate with the patent office’s public key. In this way, individuals can join the network’s miners/validators using their credentials. In this phase, miners perform formal examinations, prior-art research, and substantive examinations, and vote to grant or refuse the patent.
The miners reach consensus about the patent and record it on the blockchain: the NFT is recorded together with the corresponding comments, whether granting the patent or requesting reformations. If the miners detect that the NFT is a malicious request, they do not record it on the blockchain.
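A toy sketch of this voting step; the two-thirds threshold is an assumption, since the paper leaves the exact consensus protocol open.

```python
# Certified validators vote to grant, request reformations for, or refuse
# a patent NFT; it is recorded only if a qualified majority agrees.
from collections import Counter

def decide(votes: dict, threshold: float = 2 / 3) -> str:
    tally = Counter(votes.values())
    outcome, count = tally.most_common(1)[0]
    if outcome != "refuse" and count / len(votes) >= threshold:
        return f"record NFT with status: {outcome}"
    return "do not record (refused or no qualified majority)"

votes = {"validator1": "grant", "validator2": "grant",
         "validator3": "needs-reformation", "validator4": "grant"}
print(decide(votes))  # -> record NFT with status: grant
```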
Blockchain layer
This layer acts as middleware between the verification layer and the application layer in the patents-as-NFTs architecture. Its main purpose is to provide IP management. Transitioning to a blockchain-based patents-as-NFTs record system enables many previously suggested improvements to current patent systems in a flexible, scalable, and transparent manner.
Multiple blockchain platforms can be used, including Ethereum, EOS, Flow, and Tezos. Blockchain systems can be classified into two major types based on their consensus mechanism: permissionless (public) and permissioned (private) blockchains. In a public blockchain, any node can participate in the peer-to-peer network, and the blockchain is fully decentralized; a node can leave the network without the consent of the other nodes.
Bitcoin is one of the most popular examples of a public, permissionless blockchain. Proof of Work (PoW), Proof of Stake (PoS), and directed acyclic graph (DAG) schemes are examples of consensus mechanisms in permissionless blockchains. Bitcoin and Ethereum, two famous and trusted blockchain networks, use PoW, while platforms like Cardano and EOS adopt PoS40.
In a private blockchain, nodes require specific access or permission to join the network. Hyperledger is among the most popular private blockchains, admitting only permissioned members after authentication. This provides security for a group of entities that do not completely trust one another but want to achieve a common objective, such as exchanging information. Entities in a permissioned blockchain network can use Byzantine fault-tolerant (BFT) consensus. Hyperledger Fabric has a membership identity service that manages user IDs and authenticates network participants.
Members are therefore aware of each other’s identity while privacy and secrecy are preserved, because they are unaware of each other’s activities41. Owing to their more secure nature, private blockchains have sparked great interest among banking and financial organizations, which believe these platforms can disrupt current centralized systems. Hyperledger, Quorum, Corda, and EOS are examples of permissioned blockchains42.
Reaching consensus in a distributed environment is a challenge. Blockchain is a decentralized network with no central node to observe and check all transactions, so protocols are needed to establish that transactions are valid. Consensus algorithms are thus considered the core of every blockchain43. In distributed systems, consensus is the problem of getting all network members (nodes) to agree on accepting or rejecting a block; once all members accept a new block, it can be appended to the previous block.
As mentioned, the main concern in blockchains is how to reach consensus among network members. A wide range of consensus algorithms has been designed, each with its own pros and cons42. Blockchain consensus algorithms fall into three main groups, shown in Table 2. The first group, proof-based consensus algorithms, requires nodes joining the verifying network to demonstrate their qualification for the appending task. The second group, voting-based consensus, requires validators in the network to share their results of validating a new block or transaction before the final decision is made. The third group, DAG-based consensus, is a new class of algorithms that allows several different blocks to be published and recorded on the network simultaneously.
Table 2 Consensus algorithms in blockchain networks.
The proposed patents-as-NFTs platform builds intellectual property management on the blockchain and empowers the entire patent ecosystem, removing barriers by addressing fundamental issues in the traditional patent ecosystem. Blockchain can handle patents and trademarks efficiently, effectively reducing approval wait times and other required resources. The user entities involved in intellectual property management are creators, patent consumers, and copyright managing entities. Patent creators are the users who own the original data, e.g., inventors, writers, and researchers. Patent consumers are the users who wish to consume the content and support the creator’s work. Copyright managing entities, e.g., lawyers, are the users responsible for protecting the creators’ intellectual property. The patents-as-NFTs solution for IP management in the blockchain layer works through the following steps62:
Creators sign up to the platform
Creators need to sign up on the blockchain platform to patent their creative work; identity information is required at sign-up.
Creators upload IP on the blockchain network
Next, the creator adds the intellectual property for which a patent application is required, uploading the IP-related information and data to the blockchain network. Blockchain ensures traceability and auditability, preventing duplication and manipulation of the data. Once uploaded, the patent becomes visible to all network members.
Consumers generate request to use the content
Consumers who want to access the content must first register on the blockchain network. After signing up, consumers can ask creators to grant access to the patented content. Before the patent owner authorizes the request, a smart contract is created to give customers access to information such as the owner’s data. Consumers must also pay fees, in either fiat money or dedicated tokens, to use the creator’s original information. When the creator approves the request, an NDA (non-disclosure agreement) is produced and signed by both parties. Blockchain manages the agreement and guarantees that all parties accept the terms and conditions filed.
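A minimal sketch of this access-request flow as a small state machine; fee amounts and the NDA step are modeled abstractly, not as real contract code.

```python
# Toy model of the request -> pay -> sign NDA -> access sequence above.
class PatentAccessContract:
    def __init__(self, owner: str, fee: int):
        self.owner, self.fee = owner, fee
        self.requests = {}   # consumer -> state

    def request_access(self, consumer: str) -> None:
        self.requests[consumer] = "requested"

    def pay(self, consumer: str, amount: int) -> None:
        if self.requests.get(consumer) == "requested" and amount >= self.fee:
            self.requests[consumer] = "paid"

    def sign_nda(self, consumer: str, owner_signed: bool, consumer_signed: bool) -> None:
        # Access is granted only after payment and a mutually signed NDA.
        if self.requests.get(consumer) == "paid" and owner_signed and consumer_signed:
            self.requests[consumer] = "granted"

    def can_access(self, consumer: str) -> bool:
        return self.requests.get(consumer) == "granted"

c = PatentAccessContract(owner="inventor", fee=100)
c.request_access("consumer")
c.pay("consumer", 100)
c.sign_nda("consumer", owner_signed=True, consumer_signed=True)
assert c.can_access("consumer")
```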
Patent management entities leverage blockchain to protect copyrights and solve related disputes
Blockchain assists patent management entities in resolving a variety of disputes, including those involving the sharing of confidential information, establishing proof of authorship, transferring IP rights, and making defensive publications. Suppose someone uses a patented invention in their company without the inventor’s consent; the inventor can report this to the patent office and assert ownership of the invention.
Application layer
The patent platform’s global marketplace technology would allow enterprises, governments, universities, and small and medium-sized enterprises (SMEs) worldwide to tokenize patents as NFTs, creating an infrastructure for storing patent records on a blockchain-based network and a decentralized marketplace in which patent holders could easily sell or otherwise monetize their patents. An NFT-based patent can use smart contracts to set a price for a license or purchase.
Any buyer satisfied with the conditions can pay and immediately unlock the rights to the patent, without either party ever having to interact directly. While patents are currently regulated jurisdiction by jurisdiction around the world, a blockchain-based patent marketplace using NFTs can reduce the geographical barriers between patent systems with a tool as simple as a search query. Easy global access to patents can help aspiring inventors accelerate innovation by building on others’ patented inventions through licenses. Patent NFTs have a wide variety of use cases, including SME financing, patent organizations, grants and funding, and fundraising or transferring information relating to patents. These applications keep growing, and new ways of utilizing these tokens are constantly being found. Some of the most common applications are described below.
SMEs
The aim is to move intellectual property assets onto a digital, unified, and secure blockchain network, enabling easier commercialization of patents, especially for small and medium-sized enterprises (SMEs). Smart contracts can be attached to NFTs so that terms of use and ownership can be outlined and agreed upon without incurring the legal fees typical of traditional IP transfers. This is believed to help SMEs secure funding, as they could more easily leverage the previously undisclosed value of their patent portfolios63.
Transfer ownership of patents
NFTs can be used to transfer ownership of patents. The blockchain keeps track of patent owners, and the tokens include self-executing contracts that transfer the legal rights associated with the patents when the tokens are transferred. A partnership between IBM and IPwe has spearheaded the use of NFTs to secure patent ownership; the two companies have teamed up to build the infrastructure for an NFT-based patent marketplace.
Discussion
There are exciting proposals in the legal and economic literature that suggest seemingly straightforward solutions to many of the issues plaguing current patent systems. However, most solutions would constitute major administrative disruptions and place significant and continuous financial burdens on patent offices or their users. An NFT-based patents system not only makes many of these ideas administratively feasible but can also be examined in a step-wise, scalable, and very public manner.
Furthermore, NFT-based patents may facilitate reliable information sharing among offices and patentees worldwide, reducing the burden on examiners and perhaps even accelerating harmonization efforts. NFT-based patents also have transparency and archival attributes baked in. A patent should be a privilege bestowed on those who take resource-intensive risks to explore the frontier of technological capabilities, and full transparency of these rewards is very much in the public interest: it is society that pays for the administrative and economic inefficiencies of today’s systems, and NFT-based patents can enhance this transparency. From an organizational perspective, NFT-based patents can remove current bottlenecks in patent processes by making them more efficient, rapid, and convenient for applicants without compromising the quality of granted patents.
The proposed framework faces several challenges that must be solved before a mature patent verification platform can emerge. First are the technical problems. The consensus method used in the verification layer is not addressed in detail; given the permissioned structure of the miners in NFT-based patents, consensus algorithms designed for permissioned blockchains, such as PBFT, Federated Consensus, and Round Robin, can be applied. Also, miners/validators spend time validating patents, so a protocol should be designed to compensate them. Challenges such as proving the miners’ time and effort, the price inventors should pay miners, and other economic trade-offs must be considered.
Different NFT standards were discussed above. If various patent services adopt different NFT standards, cross-platform problems will arise; for instance, transferring an NFT from the Ethereum blockchain (an ERC-721 token) to the EOS blockchain is not straightforward and needs careful consideration. Also, people usually trade NFTs in marketplaces such as Rarible and OpenSea, which are centralized and may pose challenges because of that centralized nature. Other challenges stem from the novelty of NFT-based patents and blockchain services.
Blockchain-based patent services have not been tested before, and the registration procedure and concepts of the patents-as-NFTs system may seem ambiguous to people who still prefer conventional centralized patent systems over decentralized ones. There are also open issues in the mining part: miners must receive certificates from accepted organizations, and determining these organizations and how they accredit referees as validators needs further consideration. In some countries, certain types of inventions are prohibited and cannot be registered; with NFT-based patents, inventors can register their patents publicly, so conflicts may arise between inventors and governments. Finally, there are misunderstandings about NFT ownership rights: it is not clear exactly which rights a person acquires when buying an NFT, for instance, whether they receive only property rights or moral rights as well.
Conclusion
Blockchain technology provides strong timestamping, the potential for smart contracts, and proof of existence. It enables a transparent, distributed, cost-effective, and resilient environment that is open to all and in which every transaction is auditable. Blockchain is thus a definite boon to the IP industry, benefiting patent owners: when blockchain technology’s intrinsic characteristics are applied to the IP domain, copyright protection benefits as well. This paper provided a conceptual framework for NFT-based patents, with a comprehensive discussion of many aspects, from background, model components, and token standards to application areas and research challenges. The proposed framework includes five main layers: the storage layer, authentication layer, verification layer, blockchain layer, and application layer. The primary purpose of this patent framework is to provide an NFT-based concept for a decentralized, tamper-resistant, and reliable network for trading and exchanging patents around the world. Finally, we addressed several open challenges facing NFT-based patents.
References
Nakamoto, S. Bitcoin: A peer-to-peer electronic cash system. Decent. Bus. Rev. 21260, https://bitcoin.org/bitcoin.pdf (2008).
Buterin, V. A next-generation smart contract and decentralized application platform. White Paper 3 (2014).
Nofer, M., Gomber, P., Hinz, O. & Schiereck, D. Blockchain. Bus. Inf. Syst. Eng. 59, 183–187 (2017).
Entriken, W., Shirley, D., Evans, J. & Sachs, N. EIP 721: ERC-721 non-fungible token standard. Ethereum Improv. Propos.. https://eips.ethereum.org/EIPS/eip-721 (2018).
Radomski, W. et al. Eip 1155: Erc-1155 multi token standard. In Ethereum, Standard (2018).
Fairfield, J. Tokenized: The law of non-fungible tokens and unique digital property. Indiana Law J. forthcoming (2021).
Chevet, S. Blockchain technology and non-fungible tokens: Reshaping value chains in creative industries. Available at SSRN 3212662 (2018).
Bal, M. & Ner, C. NFTracer: a Non-Fungible token tracking proof-of-concept using Hyperledger Fabric. arXiv preprint arXiv:1905.04795 (2019).
Wang, Q., Li, R., Wang, Q. & Chen, S. Non-fungible token (NFT): Overview, evaluation, opportunities and challenges. arXiv preprint arXiv:2105.07447 (2021).
Qu, Q., Nurgaliev, I., Muzammal, M., Jensen, C. S. & Fan, J. On spatio-temporal blockchain query processing. Future Gener. Comput. Syst. 98, 208–218 (2019).
Rosenfeld, M. Overview of colored coins. White paper, bitcoil. co. il41, 94 (2012).
Benisi, N. Z., Aminian, M. & Javadi, B. Blockchain-based decentralized storage networks: A survey. J. Netw. Comput. Appl. 162, 102656 (2020).
NFTReview. On-chain vs. Off-chain Metadata (2021).
Nizamuddin, N., Salah, K., Azad, M. A., Arshad, J. & Rehman, M. Decentralized document version control using Ethereum blockchain and IPFS. Comput. Electr. Eng. 76, 183–197 (2019).
Tut, K. Who Is Responsible for NFT Data? (2020).
nft.storage. Free storage for NFTs. https://nft.storage/ (accessed 16 May 2021).
Psaras, Y. & Dias, D. In 2020 50th Annual IEEE/IFIP International Conference on Dependable Systems and Networks, Supplemental Volume (DSN-S), 80–80 (IEEE, 2020).
Tanner, J. & Roelofs, C. NFTs and the need for Self-Sovereign Identity (2021).
Martens, D., Tuyll van Serooskerken, A. V. & Steenhagen, M. Exploring the potential of blockchain for KYC. J. Digit. Bank. 2, 123–131 (2017).
Hammi, M. T., Bellot, P. & Serhrouchni, A. In 2018 IEEE Wireless Communications and Networking Conference (WCNC). 1–6 (IEEE).
Khalid, U. et al. A decentralized lightweight blockchain-based authentication mechanism for IoT systems. Cluster Comput. 1–21 (2020).
Zhong, Y. et al. Distributed blockchain-based authentication and authorization protocol for smart grid. Wirel. Commun. Mobile Comput. (2021).
Schönhals, A., Hepp, T. & Gipp, B. In Proceedings of the 1st Workshop on Cryptocurrencies and Blockchains for Distributed Systems. 105–110.
Verma, S. & Prajapati, G. A survey of cryptographic hash algorithms and issues. Int. J. Comput. Secur. Source Code Anal. (IJCSSCA) 1, 17–20 (2015).
SDK, I. X.509 Certificates (1996).
Helliar, C. V., Crawford, L., Rocca, L., Teodori, C. & Veneziani, M. Permissionless and permissioned blockchain diffusion. Int. J. Inf. Manag. 54, 102136 (2020).
Frizzo-Barker, J. et al. Blockchain as a disruptive technology for business: A systematic review. Int. J. Inf. Manag. 51, 102029 (2020).
Bamakan, S. M. H., Motavali, A. & Bondarti, A. B. A survey of blockchain consensus algorithms performance evaluation criteria. Expert Syst. Appl. 154, 113385 (2020).
Bamakan, S. M. H., Bondarti, A. B., Bondarti, P. B. & Qu, Q. Blockchain technology forecasting by patent analytics and text mining. Blockchain Res. Appl. 100019 (2021).
Castro, M. & Liskov, B. Practical Byzantine fault tolerance and proactive recovery. ACM Trans. Comput. Syst. (TOCS) 20, 398–461 (2002).
Muratov, F., Lebedev, A., Iushkevich, N., Nasrulin, B. & Takemiya, M. YAC: BFT consensus algorithm for blockchain. arXiv preprint arXiv:1809.00554 (2018).
Bessani, A., Sousa, J. & Alchieri, E. E. In 2014 44th Annual IEEE/IFIP International Conference on Dependable Systems and Networks. 355–362 (IEEE).
Todd, P. Ripple protocol consensus algorithm review. May 11th (2015).
Ongaro, D. & Ousterhout, J. In 2014 {USENIX} Annual Technical Conference ({USENIX}{ATC} 14). 305–319.
Dziembowski, S., Faust, S., Kolmogorov, V. & Pietrzak, K. In Annual Cryptology Conference. 585–605 (Springer).
Bentov, I., Lee, C., Mizrahi, A. & Rosenfeld, M. Proof of activity: Extending Bitcoin’s proof of work via proof of stake. IACR Cryptology ePrint Archive 2014, 452 (2014).
Bramas, Q. The Stability and the Security of the Tangle (2018).
Baird, L. The swirlds hashgraph consensus algorithm: Fair, fast, byzantine fault tolerance. In Swirlds Tech Reports SWIRLDS-TR-2016–01, Tech. Rep (2016).
LeMahieu, C. Nano: A feeless distributed cryptocurrency network. Nano [online resource]. https://nano.org/en/whitepaper (accessed 24 March 2018).
Casino, F., Dasaklis, T. K. & Patsakis, C. A systematic literature review of blockchain-based applications: Current status, classification and open issues. Telematics Inform. 36, 55–81 (2019).
bigredawesomedodo. Helping Small Businesses Survive and Grow With Marketing, Retrieved 3 June, 2021, from https://bigredawesomedodo.com/nft/. (2020).
This work has been partially supported by CAS President’s International Fellowship Initiative, China [grant number 2021VTB0002, 2021] and National Natural Science Foundation of China (No. 61902385).
Author information
Affiliations
Department of Industrial Management, Yazd University, Yazd, Iran: Seyed Mojtaba Hosseini Bamakan
Department of Electrical and Computer Engineering, Isfahan University of Technology, Isfahan, Iran: Nasim Nezhadsistani
School of Electrical and Computer Engineering, University of Tehran, Tehran, Iran: Omid Bodaghi
Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen 518055, China: Seyed Mojtaba Hosseini Bamakan & Qiang Qu
Developing Machine Learning Models for Prediction of Onset of Type-2 Diabetes
Reporter: Amandeep Kaur, B.Sc., M.Sc.
A recent study reports the development of an advanced AI algorithm that predicts the onset of type 2 diabetes up to five years in advance using routinely collected medical data. The researchers described their AI model as notable and distinctive because of its specific design, which performs assessments at the population level.
The first author, Mathieu Ravaut, M.Sc., of the University of Toronto, and other team members stated: “The main purpose of our model was to inform population health planning and management for the prevention of diabetes that incorporates health equity. It was not our goal for this model to be applied in the context of individual patient care.”
The research group collected data from 2006 to 2016 on approximately 2.1 million patients treated within the same healthcare system in Ontario, Canada. Even though the patients all came from the same region, the authors highlighted that Ontario encompasses a large and diverse population.
The newly developed algorithm was trained with data from approximately 1.6 million patients, validated with data from about 243,000 patients, and tested with data from more than 236,000 patients. The data used to train the algorithm included each patient’s medical history from the previous two years: prescriptions, medications, lab tests, and demographic information.
When predicting the onset of type 2 diabetes within five years, the model reached a test area under the receiver operating characteristic (ROC) curve of 80.26 (on a 0–100 scale).
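For readers unfamiliar with the metric, the sketch below shows how such a held-out test AUC might be computed with scikit-learn; the data and model here are synthetic stand-ins, not the study’s administrative health records or its actual model.

```python
# Hedged sketch: fit a classifier on a train split, then report the area
# under the ROC curve on a held-out test set, scaled to 0-100.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 20))                         # stand-in features
y = (X[:, 0] + rng.normal(size=5000) > 1).astype(int)   # stand-in labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
scores = model.predict_proba(X_test)[:, 1]
print(f"test AUC: {100 * roc_auc_score(y_test, scores):.2f}")
```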
The authors reported that “Our model showed consistent calibration across sex, immigration status, racial/ethnic and material deprivation, and a low to moderate number of events in the health care history of the patient. The cohort was representative of the whole population of Ontario, which is itself among the most diverse in the world. The model was well calibrated, and its discrimination, although with a slightly different end goal, was competitive with results reported in the literature for other machine learning–based studies that used more granular clinical data from electronic medical records without any modifications to the original test set distribution.”
This model could potentially improve the healthcare systems of countries equipped with thorough administrative databases and could be aimed at specific cohorts that may face adverse outcomes.
The research group stated: “Because our machine learning model included social determinants of health that are known to contribute to diabetes risk, our population-wide approach to risk assessment may represent a tool for addressing health disparities.”
Ravaut M, Harish V, Sadeghi H, et al. Development and Validation of a Machine Learning Model Using Administrative Health Data to Predict Onset of Type 2 Diabetes. JAMA Netw Open. 2021;4(5):e2111315. doi:10.1001/jamanetworkopen.2021.11315 https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2780137
Other related articles were published in this Open Access Online Scientific Journal, including the following:
AI in Drug Discovery: Data Science and Core Biology @Merck &Co, Inc., @GNS Healthcare, @QuartzBio, @Benevolent AI and Nuritas
Reporters: Aviva Lev-Ari, PhD, RN and Irina Robu, PhD
re:Invent 2020 – Virtual 3-Week Conference, Nov. 30 – Dec. 18, 2020: How Healthcare & Life Sciences leaders are using AWS to transform their businesses and innovate on behalf of their customers.
Preview the tracks that will be available, along with the general agenda, on the website.
AWS re:Invent routinely fills several Las Vegas venues with standing-room-only crowds, but this year we are bringing it to you as an all-virtual, free event. This year’s conference is gearing up to be our biggest yet, with an exciting program of five keynotes, 18 leadership sessions, and over 500 breakout sessions beginning November 30. Hear how AWS experts and inspiring members of the Life Sciences & Genomics industry are using cloud technology to transform their businesses and innovate on behalf of their customers. Life Sciences attendees looking to get the most out of their experience should follow these steps:
Take a look at all of the Life Sciences sessions available, as well as lots of other information and additional activities, in our curated Life Sciences Attendee Guide coming soon!
Check back on this post regularly, as we’ll continually update it to reflect the newest information.
Life Sciences at re:Invent 2020
AWS enables pharma and biotech companies to transform every stage of the pharma value chain, with services that enhance data liquidity, operational excellence, and customer engagement. AWS is a trusted technology provider with the cost-effective compute and storage, machine learning capabilities, and customer-centric know-how to help companies embrace innovation and bring differentiated therapeutics to market faster.
Dates and Presentation Titles
Blockchain Architecture
DEC 1, 2020 | 12:00 AM – 12:20 AM EST
Nestlé brings supply chain transparency with Amazon Managed Blockchain
Chain of origin is the Nestlé answer to complete supply chain transparency, from crop to coffee cup, with technology at its heart. Today, consumers want to know about the quality of their product and know where it is sourced from. Using AWS and Amazon Man
Architecture Blockchain
DEC 10, 2020 | 11:15 PM – 11:45 PM EST
Trust and transparency in live virtual events
This session demonstrates how to build interactive, trusted, and transparent live virtual experiences using the Amazon Interactive Video Service, Amazon Chime, Amazon QLDB, and Amazon Managed Blockchain to capture cryptographic, immutable, and verifiable records of real-time audience interactions.
Architecture Blockchain
Tuesday, December 15
DEC 15, 2020 | 4:00 PM – 4:30 PM EST
What’s new in Amazon Managed Blockchain
Amazon Managed Blockchain is a fully managed service that makes it easy for you to create and manage scalable blockchain networks using popular open-source technologies. Blockchain technologies enable groups of organizations to securely transact and
Building robots to help hospitals become safer and smarter
In hospitals and other healthcare venues, robots increasingly perform contactless delivery and autonomous maintenance services to reduce the risk of exposing patients and medical staff to harmful viruses and bacteria. In this session, see how Solaris JetBrain and M
Transform research environments with Service Workbench on AWS
Reenvison how research environments are spun up by reducing wait times from days to minutes. Service Workbench on AWS promotes repeatability, multi-institutional collaboration, and transparency in the research process. In this session, learn how Harvard Medical School is procuring and deploying domain-specific data, tools, and secure IT environments to accelerate research.
Public Sector Healthcare
Wednesday, December 2
DEC 2, 2020 | 11:00 AM – 11:30 AM EST
Reinventing medical imaging with machine learning on AWS
It is hard to imagine the future of medical imaging without machine learning (ML) as its central innovation engine. Countless researchers, developers, startups, and larger enterprises are engaged in building, training, and deploying ML solutions for medical i
Making healthcare more personal with MetroPlus Health
COVID has made a huge impact across the world, and organizations have had to adapt quickly to changing requirements as a result. Learn how MetroPlus Health, a New York City health plan covering over half a million people, leveraged AWS technology to quickly
Securing protected health information and high-risk datasets
Join this session featuring Jonathan Cook, Chief Technology Officer at Arcadia, for a discussion around securing mission-critical and high-risk datasets such as personal health information (PHI) in the cloud. Learn how Arcadia developed a HITRUST CSF-certified plat
ML and analytics addressing nationwide COVID-19 impact and recovery (sponsored …
In this session, learn how Fractal.AI delivered a platform to help analyze data and make decisions related to COVID-19 progression for the government of Telangana, India, deploying it on AWS in five days. The solution, based on Intel processors, delivered more than 100 dashboards using anonymized government and public datasets with hundreds of thousands of COVID-19 d… Learn More
Public Sector Partner Solutions for Business
DEC 2, 2020 | 6:30 PM – 7:00 PM EST
How Vyaire uses AWS analytics to scale ventilator production & save lives
When COVID-19 hit the US, Vyaire Medical, one of the country’s only ventilator manufacturers, knew it would have to scale rapidly while still offering quality machinery. In order to scale to 20 times more than its usual production and gain insights from all its new
Productionizing R workloads using Amazon SageMaker, featuring Siemens
R language and its 16,000+ packages dedicated to statistics and ML are used by statisticians and data scientists in industries such as energy, healthcare, life science, and financial services. Using R, you can run simulations and ML securely and at scale with Amazon S
Accelerating the transition to telehealth with AWS
Learn how AWS is helping healthcare organizations develop and deploy telehealth solutions quickly and at scale. Join speakers from AWS and MedStar Health as they discuss their experience developing and deploying two call centers in less than a week using Amaz
How Erickson Living built a COVID-19 outbreak management solution
As innovators in independent living, assisted living, and skilled nursing care, Erickson Living responded to the COVID-19 outbreak by building an infectious disease management system with the AWS Data Lab to prevent the spread of the virus.
Rapidly deploying social services on Amazon Connect
Organizations that respond to disruptive, large-scale events need the ability to rapidly scale and iterate on their contact centers to provide services to their constituents. Amazon Connect can be set up in minutes and scale to handle virtually any number of contacts.
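To make the "set up in minutes" claim concrete, the sketch below provisions a new Amazon Connect instance with boto3. It is a minimal illustration, and the instance alias is invented; a working contact center would still need contact flows, queues, and phone numbers configured before taking live contacts.

```python
# Minimal sketch: provisioning an Amazon Connect instance programmatically.
# The alias is hypothetical; flows, queues, and numbers are configured later.
import boto3

connect = boto3.client("connect")

response = connect.create_instance(
    IdentityManagementType="CONNECT_MANAGED",  # let Connect manage agent identities
    InstanceAlias="example-social-services",   # hypothetical alias
    InboundCallsEnabled=True,
    OutboundCallsEnabled=True,
)
print("New instance ARN:", response["Arn"])
```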
Improving data liquidity in Roche’s personalized healthcare platform
Roche’s personalized healthcare mission is to accelerate drug discovery and transform the patient journey by using digital technologies and advanced analytics to facilitate greater scientific collaboration and insight sharing.
Life Sciences Artificial Intelligence & Machine Learning
DEC 9, 2020 | 10:30 AM – 11:00 AM EST
BlueJeans’ explosive growth journey with AWS during the pandemic
Global video provider BlueJeans (a Verizon company) supports employees working from home, healthcare providers shifting to telehealth, and educators moving to distance learning. With so many people now working from home, BlueJeans saw explosive growth.
Using AI to automate clinical workflows
Learn how healthcare organizations can harness the power of AI and machine learning to automate clinical workflows, digitize medical information, extract and summarize medical information, and protect patient data.
Building patient-centric virtualized trials
As clinical trials increasingly become decentralized and virtual, engaging effectively with patients can be challenging. In this session, hear how Evidation Health architects on AWS to create patient-centric experiences and ingests data from millions of devices in real time.
Edge computing innovation with AWS Snowcone and AWS Snowball Edge
In this session, learn how the AWS Snow Family can help you run operations in harsh, non-data-center environments and in locations that lack consistent network connectivity. The AWS Snow Family, comprising AWS Snowcone and AWS Snowball Edge, offers a number of physical devices and capacity points with built-in computing capabilities.
Healthcare Executive Outlook: Accelerating Transformation
Join AWS Healthcare and Life Science Leader Shez Partovi, MD, for a look into how cloud technology can reshape the future of healthcare. In this executive overview, Dr. Partovi shares a vision of a digitally enhanced, data-driven future. Learn how AWS is working with industry leaders to increase their pace of innovation, unlock the potential of their healthcare data, help predict patient health events, and personalize the healthcare journey for their patients, consumers, and members.
Intelligent document processing for the insurance industry
Organizations in the insurance industry, both in healthcare and financial services, extract sensitive information such as names, dates, claims, or medical procedures from scanned images and documents to perform their business operations.
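One plausible building block for this kind of extraction pipeline, assumed here rather than taken from the session itself, is Amazon Textract's document analysis. The sketch below runs forms analysis on a hypothetical scanned claim; the file name is a placeholder.

```python
# Minimal sketch: forms analysis on a scanned document with Amazon Textract.
# Textract accepts PNG/JPEG bytes directly for synchronous analysis.
import boto3

textract = boto3.client("textract")

with open("claim-form.png", "rb") as f:  # hypothetical scanned claim
    result = textract.analyze_document(
        Document={"Bytes": f.read()},
        FeatureTypes=["FORMS"],  # detect key-value pairs as well as raw text
    )

# Print every detected line of text; pairing keys with their values requires
# walking the KEY_VALUE_SET blocks' Relationships, omitted here for brevity.
for block in result["Blocks"]:
    if block["BlockType"] == "LINE":
        print(block["Text"])
```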
This session is open to anyone, but it is intended for current and potential AWS Partners. The COVID-19 pandemic has been a formative event, affecting our world physically, emotionally, and economically. Despite the challenges created, AWS Partners have responded quickly and proven their resiliency by enabling digital transformation at unprecedented rates.
Global Partner Summit (GPS) Session
DEC 15, 2020 | 9:00 PM – 9:30 PM EST
Wednesday, December 16
DEC 16, 2020 | 11:45 AM – 12:15 PM EST
Simplify healthcare compliance with third-party solutions
Sensitive health data must be protected to ensure patient privacy, and healthcare organizations need to ensure that IT infrastructure is compliant with changing policies and regulations. In this session, learn how healthcare providers can address compliance with third-party solutions.
An introduction to healthcare interoperability and FHIR Works on AWS
Fast Healthcare Interoperability Resources (FHIR) is gaining popularity around the world as the standard for exchanging healthcare data, and it is being increasingly adopted in Europe and Australasia. In the US, it is mandated in the 21st Century Cures Act.
Achieving healthcare interoperability with FHIR Works on AWS
The Fast Healthcare Interoperability Resources (FHIR) standard has become increasingly important for enabling interoperability between healthcare applications and organizations. Join this session for a deep dive into the FHIR Works on AWS open-source toolkit.
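Because FHIR is ultimately a REST API over standardized JSON resources, the client side of interoperability can be quite small. The sketch below creates a Patient resource against a hypothetical FHIR endpoint of the kind FHIR Works on AWS deploys; the base URL and bearer token are placeholders.

```python
# Minimal sketch: creating a FHIR Patient resource over plain REST.
# The endpoint URL and bearer token are hypothetical placeholders.
import requests

FHIR_BASE = "https://example.execute-api.us-east-1.amazonaws.com/dev"  # placeholder
TOKEN = "..."  # placeholder auth token

patient = {
    "resourceType": "Patient",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "gender": "female",
    "birthDate": "1980-01-01",
}

resp = requests.post(
    f"{FHIR_BASE}/Patient",
    json=patient,
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/fhir+json",
    },
)
resp.raise_for_status()
print("Created Patient id:", resp.json()["id"])  # server assigns the id
```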
AWS at the edge: Using AWS IoT to optimize Amazon wind farms
AWS IoT and edge computing solutions move data processing and analysis closer to where data is generated, enabling customers to innovate and achieve more sustainable operations. In this session, learn how Amazon renewable energy projects use AWS IoT to collect and analyze wind farm data.
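As a generic illustration of the pattern rather than the wind farm implementation itself, the sketch below publishes a turbine telemetry reading to an AWS IoT Core topic with boto3; the topic name and payload fields are invented.

```python
# Minimal sketch: publishing a telemetry reading to an AWS IoT Core topic.
# Topic and payload are hypothetical; a real edge device would typically
# publish over MQTT with the AWS IoT Device SDK instead of boto3.
import json
import boto3

iot_data = boto3.client("iot-data")

iot_data.publish(
    topic="windfarm/turbine-01/telemetry",  # hypothetical topic
    qos=1,
    payload=json.dumps({"rpm": 14.2, "blade_pitch_deg": 3.5, "power_kw": 1820}),
)
```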
Share information by removing language barriers with Amazon Translate
In this session, see how to use Amazon Translate and Amazon Transcribe to create subtitles for educational videos and translate them into the language of the viewer’s choice. The session includes a demonstration of how this process reduced the time it took for information about COVID-19 to be translated into many languages, spreading accurate information quickly.
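The workflow the session demonstrates can be approximated in a few service calls: transcribe the video's audio, then translate the transcript. The sketch below is a simplified, hypothetical version; the S3 URI and job name are placeholders, and it omits converting the timestamped transcript into subtitle files.

```python
# Minimal sketch: transcribe a video's audio, then translate the transcript.
# S3 URI, job name, and languages are placeholders; production code would
# also turn the timestamped transcript into subtitle (e.g., SRT) files.
import time
import json
import urllib.request
import boto3

transcribe = boto3.client("transcribe")
translate = boto3.client("translate")

job = "example-covid-briefing"  # hypothetical job name
transcribe.start_transcription_job(
    TranscriptionJobName=job,
    Media={"MediaFileUri": "s3://example-bucket/briefing.mp4"},  # placeholder
    MediaFormat="mp4",
    LanguageCode="en-US",
)

# Poll until the job finishes (simplified; use backoff in real code).
while True:
    status = transcribe.get_transcription_job(TranscriptionJobName=job)
    if status["TranscriptionJob"]["TranscriptionJobStatus"] in ("COMPLETED", "FAILED"):
        break
    time.sleep(10)

# Fetch the transcript JSON from its presigned URL and pull out the text.
uri = status["TranscriptionJob"]["Transcript"]["TranscriptFileUri"]
with urllib.request.urlopen(uri) as f:
    text = json.load(f)["results"]["transcripts"][0]["transcript"]

# Translate into the viewer's language (Spanish here); TranslateText only
# accepts limited-size text, so long transcripts must be chunked.
result = translate.translate_text(
    Text=text[:9000],
    SourceLanguageCode="en",
    TargetLanguageCode="es",
)
print(result["TranslatedText"])
```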
Bookmark this blog and check back for direct links to each session, so you can add them to your re:Invent schedule as soon as the session catalog is released:
LFS201: Life Sciences Industry: Executive Outlook
Learn how AWS technology is helping organizations improve their data liquidity, achieve operational excellence, and enhance customer engagement.
LFS202: Improving data liquidity in Roche’s personalized healthcare platform
Learn how Roche’s personalized healthcare platform is accelerating drug discovery and transforming the patient journey with digital technology.
LFS302: AstraZeneca genomics on AWS: from petabytes to new medicines
Learn how AstraZeneca built an industry-leading genomics pipeline on AWS to analyze 2 million genomes in support of precision medicine.
LFS303: Building patient-centric virtualized trials
Learn how Evidation Health architects on AWS to create patient-centric experiences in decentralized and virtual clinical trials.
LFS304: Streamlining manufacturing and supply chain at Novartis
Learn how Novartis is creating real-time analytics and transparency in the pharma manufacturing process and supply chain to bring innovative medicines to market.
LFS305: Accelerating regulatory assessments in life sciences manufacturing
Learn how Merck leveraged Amazon Machine Learning to build an evaluation and recommendation engine for streamlining pharma manufacturing change requests.
Other related sessions of interest:
ENT203: How BMS automates GxP compliance for SAP systems on AWS
HLC203: Securing Personal Health Information and High Risk Data Sets
WPS202: Transform research environments with Service Workbench on AWS
AIM310: Intelligent document processing for the insurance industry
Healthcare Attendee Guide
AWS re:Invent routinely fills several Las Vegas venues with standing-room-only crowds, but this year we are bringing it to you as a free, all-virtual event. This year’s conference is gearing up to be our biggest yet, and we have an exciting program planned for the Healthcare industry, with five keynotes, 18 leadership sessions, and over 500 breakout sessions beginning November 30. See how AWS experts and talented members of the Healthcare industry are using cloud technology to transform their businesses and innovate on behalf of their customers. For Healthcare attendees looking to get the most out of their experience, follow these steps:
Take a look at all of the Healthcare sessions available, as well as lots of other information and additional activities, in our curated Healthcare Attendee Guide coming soon!
Check back on this post regularly, as we’ll continually update it to reflect the newest information.
Healthcare at re:Invent 2020
AWS is the trusted technology partner to the global healthcare industry. For over 12 years, AWS has established itself as the most mature, comprehensive, and broadly adopted cloud platform, and it is trusted by thousands of healthcare customers around the world—including the fastest-growing startups, the largest enterprises, and leading government agencies. Secure and compliant AWS technology enables the highly regulated healthcare industry to improve outcomes and lower costs by providing the tools to unlock the potential of healthcare data, predict healthcare events, and build closer relationships with patients and consumers. The healthcare track at re:Invent 2020 will feature customer-led sessions focused on each of these critical components, accelerating the transformation of healthcare.
Healthcare sessions
Learn more and bookmark each Healthcare session:
HLC201: Healthcare Executive Outlook: Accelerating Transformation
Learn how AWS is working with industry leaders to increase their pace of innovation, unlock the potential of their healthcare data, help predict patient health events, and personalize the healthcare journey for their patients, consumers, and members.
HLC202: Making Healthcare More Personal with MetroPlus Health
Learn how MetroPlus Health leveraged AWS technology to quickly build and deploy an application that personally and proactively reached out to its members during a time of critical need.
HLC203: Securing Personal Health Information and High Risk Data Sets
Learn how Arcadia developed a HITRUST CSF-certified platform by leveraging AWS technology to enable the secure management of data from over 100 million patients.
HLC204: Accelerating the Transition to Virtual Care with AWS
Learn how MedStar Health developed and deployed two call centers in less than a week that are supporting more than 3,500 outpatient telehealth sessions a day.
WPS202: Transform research environments with Service Workbench on AWS
Learn how Harvard Medical School is using AWS to procure and deploy domain-specific data, tools, and secure IT environments to accelerate research.
WPS209: Reinventing medical imaging with machine learning on AWS
Learn how Radboud University Medical Center uses AWS to power its machine learning imaging platform with 45,000+ registered researchers and clinicians from all over the world.
WPS211: An introduction to healthcare interoperability and FHIR Works on AWS
Learn about FHIR Works on AWS, an open-source project designed to accelerate the industry’s use of the Fast Healthcare Interoperability Resources (FHIR) interoperability standard.
WPS304: Achieving healthcare interoperability with FHIR Works on AWS
Learn how Black Pear Software leveraged AWS to build an integration toolkit that helps its customers share healthcare data more effectively.
Extras you won’t want to miss out on!
LFS201: Life Sciences Industry: Executive outlook
LFS302: AstraZeneca genomics on AWS: From petabytes to new medicines
LFS303: Building patient-centric virtualized trials
AIM303: Using AI to automate clinical workflows
AIM310: Intelligent document processing for the insurance industry
INO204: Solving societal challenges with digital innovation on AWS
ZWL208: Using cloud-based genomic research to reduce health care disparities