
From the journal Nature: NFT, Patents, and Intellectual Property: Potential Design

Reporter: Stephen J. Williams, Ph.D.

 

From the journal Nature

Source: https://www.nature.com/articles/s41598-022-05920-6

Patents and intellectual property assets as non-fungible tokens; key technologies and challenges

Scientific Reports volume 12, Article number: 2178 (2022)

Abstract

With the explosive development of decentralized finance, we witness a phenomenal growth in the tokenization of all kinds of assets, including equity, funds, debt, and real estate. By taking advantage of blockchain technology, digital assets are broadly grouped into fungible and non-fungible tokens (NFTs), where non-fungible tokens refer to those with unique, non-substitutable properties. NFTs have attracted wide attention, and their protocols, standards, and applications are developing exponentially. They have been successfully applied to digital fantasy artwork, games, collectibles, etc. However, there is a lack of research on utilizing NFTs for issues such as Intellectual Property. Applying for a patent or trademark is not only a time-consuming and lengthy process but also a costly one. NFTs have considerable potential in the intellectual property domain: they can promote transparency and liquidity and open the market to innovators who aim to commercialize their inventions efficiently. The main objective of this paper is to examine the requirements for presenting intellectual property assets, specifically patents, as NFTs. Hence, we offer a layered conceptual NFT-based patent framework. Furthermore, a series of open challenges for NFT-based patents and possible future directions are highlighted. The proposed framework provides fundamental elements and guidance for businesses in applying NFTs to real-world problems such as patent granting, funding, biotechnology, and so forth.

Introduction

Distributed ledger technologies (DLTs) such as blockchain are emerging technologies that pose a threat to existing business models. Traditionally, most companies relied on centralized authorities for various aspects of their business, such as financial operations and establishing trust with their counterparts. With the emergence of blockchain, centralized organizations can be substituted with a decentralized group of resources and actors. The blockchain mechanism was introduced in the Bitcoin white paper in 2008, and it lets users generate transactions and spend their money without the intervention of banks [1]. Ethereum, a second-generation blockchain, was introduced in 2014, allowing developers to run smart contracts on a distributed ledger. With smart contracts, developers and businesses can create financial applications that use cryptocurrencies and other forms of tokens for applications such as decentralized finance (DeFi), crowdfunding, decentralized exchanges, record keeping, etc. [2]. Recent advances in distributed ledger technology have produced concepts that reduce costs and simplify the exchange of value. Nowadays, by leveraging the advantages of blockchain and taking governance issues into account, digital assets can be represented as tokens on the blockchain network, which facilitates their transmission and traceability, increases their transparency, and improves their security [3].

In the landscape of blockchain technology, two types of tokens can be defined: fungible tokens, in which all tokens have equal value, and non-fungible tokens (NFTs), which feature unique characteristics and are not interchangeable. Non-fungible tokens are digital assets with a unique identifier that is stored on a blockchain [4]. NFTs were initially suggested in Ethereum Improvement Proposal (EIP) 721 [5] and later expanded in EIP-1155 [6]. NFTs became one of the most widespread applications of blockchain technology and reached worldwide attention in early 2021. They can be digital representations of real-world objects. NFTs are tradable rights to digital assets (pictures, music, films, and virtual creations) where ownership is recorded in blockchain smart contracts [7].

In particular, fungibility is the ability to exchange one item for another of the same kind, an essential feature of currency. A non-fungible token is unique and therefore cannot be substituted [8]. Recently, blockchain enthusiasts have shown significant interest in various types of NFTs and enthusiastically participate in NFT-related games and trades. CryptoPunks [9], one of the first NFT projects on Ethereum, generated almost 10,000 collectible punks and helped popularize the ERC-721 standard. With the gamification of its breeding mechanics, CryptoKitties [10] officially placed NFTs at the forefront of the market in 2017. CryptoKitties is an early blockchain game that enables users to buy, sell, collect, and breed digital cats. Another example is NBA Top Shot [11], an NFT trading platform for buying and selling short digital films of NBA events.

NFTs are developing remarkably and have enabled many applications, such as artist royalties, in-game assets, educational certificates, etc. However, the concept is relatively new, and many areas of application remain to be explored. Intellectual Property, including patents, trademarks, and copyrights, is an important area where NFTs can be usefully applied to solve existing problems.

Although NFTs have found many applications so far, they have rarely been used to solve real-world problems. NFTs are, in fact, an exciting prospect for Intellectual Property (IP). Applying for a patent or trademark is not only a time-consuming and lengthy process but also a costly one: registering a copyright or trademark may take months, while securing a patent can take years. In contrast, the unique features of NFT technology make it possible to accelerate this process with considerable confidence and assurance about protecting the ownership of an IP asset. NFTs can offer IP protection while an applicant waits for the government to grant more formal protection. Some believe that NFTs and blockchain would make buying and selling patents easier, offering new opportunities for companies, universities, and inventors to monetize their innovations [12]. Patent holders would benefit from such innovation, as it would give them the ability to 'tokenize' their patents. Because every transaction would be logged on a blockchain, it would be much easier to trace changes in patent ownership. NFTs would also facilitate revenue generation from patents by democratizing patent licensing. NFTs can support the intellectual property market by embedding automatic royalty-collecting methods inside inventors' works, providing them with financial benefits whenever their innovation is licensed. For example, each inventor's patent could be minted as an NFT, and these NFTs could be joined to form a commercial IP portfolio minted as a compounded NFT. Each inventor would then automatically receive their fair share of royalties whenever licensing revenue is generated, without anyone having to track them down.

In [13], an overview of NFT applications in different areas such as gambling, games, and collectibles is discussed. In addition, [4] provides a prototype for an event-tracking application based on Ethereum smart contracts, and NFTs as a solution for art and real estate auction systems are described in [14]. However, these studies do not discuss existing standards or a generalized architecture that would enable NFTs to be applied in diverse applications. The authors in [15] provide two general design patterns for creating and trading NFTs and discuss existing NFT token standards; however, the proposed designs are limited to Ethereum, and other blockchains are not considered [16]. Moreover, the different technologies for each step of the proposed procedure are not discussed. In [8], the authors provide a conceptual framework for token design and management and discuss five views: token view, wallet view, transaction view, user interface view, and protocol view. However, no research provides a generalized conceptual framework for generating, recording, and tracing NFT-based IP in a blockchain network.

Even with the clear benefits that NFT-backed patents offer, there are a number of impediments to actually achieving such a system. For example, convincing patent owners to put the current ownership records for their patents into NFTs poses an initial obstacle. Because there is no reliable framework for NFT-based patents, this paper provides a conceptual framework for presenting NFT-based patents with a comprehensive discussion of many aspects, ranging from background, model components, and token standards to application domains and research challenges. The main objective of this paper is to provide a layered conceptual NFT-based patent framework that can be used to register patents in a decentralized, tamper-proof, and trustworthy peer-to-peer network and to trade and exchange them in the worldwide market. The main contributions of this paper are highlighted as follows:

  • Providing a comprehensive overview of the tokenization of IP assets to create unique digital tokens.
  • Discussing the components of a distributed and trustworthy framework for minting NFT-based patents.
  • Highlighting a series of open challenges of NFT-based patents and possible future trends.

The rest of the paper is structured as follows: the "Background" section reviews the background of NFTs and non-fungible token standards. The NFT-based patent framework is described in the "NFT-based patent framework" section, and the discussion and challenges are presented in the "Discussion" section. Lastly, conclusions are given in the "Conclusion" section.

Background

Colored Coins, designed on top of the Bitcoin network, can be considered the first step toward NFTs. Bitcoins are fungible, but it is possible to mark them so that they are distinguishable from other bitcoins. These marked coins have special properties representing real-world assets like cars and stocks, and owners can prove their ownership of physical assets through the colored coins. By utilizing Colored Coins, users can transfer their marked coins' ownership like a usual transaction and benefit from Bitcoin's decentralized network [17]. However, Colored Coins had limited functionality due to the limitations of the Bitcoin script. Rare Pepe builds on Pepe, a green frog meme created by Matt Furie: users define tokens for Pepe images and trade them through the Counterparty platform, where newly created Pepe tokens are reviewed to decide whether they are rare enough. Rare Pepe allows users to preserve scarcity, manage ownership, and transfer their purchased Pepes.

In 2017, Larva Labs developed the first Ethereum-based NFT project, named CryptoPunks. It contains 10,000 unique human-like characters generated randomly. The official ownership of each character is stored in an Ethereum smart contract, and owners can trade characters. The CryptoPunks project inspired CryptoKitties, a pioneer among blockchain games and NFTs that launched in late 2017 and drew wide attention to NFTs. CryptoKitties is a blockchain-based virtual game in which users collect and trade characters with unique features that shape kitties. The game was developed as an Ethereum smart contract, and it pioneered ERC-721, the first standard token for NFTs on the Ethereum blockchain. After the 2017 hype around NFTs, many projects started in this context. Due to the increased attention to NFT use cases and the growing market cap, different blockchains like EOS, Algorand, and Tezos started to support NFTs, and various marketplaces such as SuperRare, Rarible, and OpenSea were developed to help users trade NFTs. As mentioned, assets are generally categorized into two main classes: fungible and non-fungible assets. Fungible assets are those that another similar asset can replace. Fungible items have two main characteristics: replicability and divisibility.

Currency is a fungible item because a ten-dollar bill can be exchanged for another ten-dollar bill or divided into ten one-dollar bills. Unlike fungible items, non-fungible items are unique and distinguishable; they cannot be divided or exchanged for another identical item. The first tweet on Twitter is a non-fungible item with these characteristics: another tweet cannot replace it, and it is unique and not divisible. An NFT is a non-fungible cryptographic asset that is declared in a standard token format and has a unique set of attributes. NFTs are created using blockchain technology because of the transparency, proof of ownership, and traceable transactions that a blockchain network provides.

Blockchain-based NFTs help enthusiasts create NFTs in a standard token format on the blockchain, transfer the ownership of their NFTs to buyers, assure the uniqueness of NFTs, and manage them completely. In addition, there are semi-fungible tokens that have characteristics of both fungible and non-fungible tokens: they are fungible within the same class or specific time and non-fungible across other classes or different times. A plane ticket can be considered a semi-fungible token because a charter ticket can be exchanged for another charter ticket but not for a first-class ticket. The concept of semi-fungible tokens plays a main role in blockchain-based games and reduces NFT overhead. In Fig. 1, we illustrate fungible, non-fungible, and semi-fungible tokens. The main properties of NFTs are described as follows [15]:

Figure 1. Fungible, non-fungible, and semi-fungible tokens.

  • Ownership: Because of the blockchain layer, the owner of an NFT can easily prove the right of possession with his/her keys; other nodes can verify the user's ownership publicly.
  • Transferability: Users can freely transfer the ownership of their NFTs to others on dedicated markets.
  • Transparency: By using blockchain, all transactions are transparent, and every node in the network can confirm and trace the trades.
  • Fraud Prevention: Fraud is one of the key problems in trading assets; hence, using NFTs ensures buyers buy a non-counterfeit item.
  • Immutability: Metadata, token ID, and history of transactions of NFTs are recorded in a distributed ledger, and it is impossible to change the information of the purchased NFTs.

Non-fungible standards

The Ethereum blockchain pioneered the implementation of NFTs. ERC-721 was the first token standard accepted in the Ethereum network. With the increase in the popularity of NFTs, developers started developing and enhancing NFT standards on different blockchains like EOS, Algorand, and Tezos. This section reviews the NFT standards implemented on these blockchains.

Ethereum

ERC-721, a free and open-source standard, was the first standard for NFTs developed on Ethereum. ERC-721 is an interface that a smart contract should implement to be able to transfer and manage NFTs. Each ERC-721 token has unique properties and a different token ID. ERC-721 tokens include the owner's information, a list of approved addresses, a transfer function that implements transferring tokens from owner to buyer, and other useful functions [5].

In ERC-721, smart contracts can group tokens with the same configuration, but each token has different properties, so ERC-721 does not support fungible tokens. ERC-1155 is another Ethereum standard, developed by Enjin, with richer functionality than ERC-721: it supports fungible, non-fungible, and semi-fungible tokens. In ERC-1155, IDs define classes of assets, so different IDs represent different classes, and each ID may contain different assets of the same class. Using ERC-1155, a user can transfer different types of tokens in a single transaction and mix multiple fungible and non-fungible token types in a single smart contract [6]. Both ERC-721 and ERC-1155 support operators, whereby the owner can let an operator originate transfers of the token. A minimal sketch of these two ownership models is shown below.
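To make the ownership semantics concrete, here is a minimal, illustrative Python sketch of the two models: a unique-owner registry in the spirit of ERC-721 and a balance-per-ID ledger in the spirit of ERC-1155 with batch transfer. The class and method names are our own simplifications, not the actual Solidity interfaces.

```python
# Illustrative in-memory models of ERC-721 vs. ERC-1155 semantics.

class ERC721Like:
    """Each token ID is unique and owned by exactly one address."""
    def __init__(self):
        self.owner_of = {}   # token_id -> owner address
        self.approved = {}   # token_id -> approved operator address

    def mint(self, to, token_id):
        assert token_id not in self.owner_of, "token ID must be unique"
        self.owner_of[token_id] = to

    def transfer_from(self, sender, frm, to, token_id):
        # The owner or an approved operator may originate the transfer.
        assert self.owner_of[token_id] == frm
        assert sender == frm or self.approved.get(token_id) == sender
        self.owner_of[token_id] = to

class ERC1155Like:
    """IDs denote asset classes; balances allow fungible amounts per ID."""
    def __init__(self):
        self.balances = {}   # (owner, token_id) -> amount

    def mint(self, to, token_id, amount):
        key = (to, token_id)
        self.balances[key] = self.balances.get(key, 0) + amount

    def safe_batch_transfer(self, frm, to, ids, amounts):
        # Several token types move in a single call (one transaction).
        for token_id, amount in zip(ids, amounts):
            assert self.balances.get((frm, token_id), 0) >= amount
            self.balances[(frm, token_id)] -= amount
            key = (to, token_id)
            self.balances[key] = self.balances.get(key, 0) + amount

nft = ERC721Like()
nft.mint("alice", 1)
nft.transfer_from("alice", "alice", "bob", 1)

multi = ERC1155Like()
multi.mint("alice", 7, 100)  # fungible class (amount > 1)
multi.mint("alice", 8, 1)    # non-fungible (amount == 1)
multi.safe_batch_transfer("alice", "bob", [7, 8], [40, 1])
```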

EOSIO

EOSIO is an open-source blockchain platform released in 2018 that claims to eliminate transaction fees and increase transaction throughput. EOSIO differs from Ethereum in its wallet creation algorithm and its procedure for handling transactions. dGoods is a free standard developed on the EOS blockchain for assets, focusing on large-scale use cases. It supports a hierarchical naming structure in smart contracts: each contract has a unique symbol and a list of categories, and each category contains a list of token names. Therefore, a single dGoods contract can contain many tokens, which makes transferring a group of tokens efficient. Using this hierarchy, dGoods supports fungible, non-fungible, and semi-fungible tokens. It also supports batch transferring, whereby the owner can transfer many tokens in one operation [18].

Algorand

Algorand is a high-performance public blockchain launched in 2019. It provides scalability while maintaining security and decentralization, and it supports smart contracts and tokens for representing assets [19]. Algorand defines the Algorand Standard Assets (ASA) concept for creating and managing assets on the Algorand blockchain. Using ASA, users can define both fungible and non-fungible tokens. In Algorand, users can create NFTs or FTs without writing smart contracts; they need only run a single transaction on the Algorand blockchain. Each transaction contains some mutable and some immutable properties [20].

Each account in Algorand can create up to 1000 assets, and for every asset an account creates or receives, the minimum balance of the account increases by 0.1 Algos. Algorand also supports fractional NFTs by splitting an NFT into a group of divided FTs or NFTs, where each part can be exchanged independently [21]. In addition, Algorand uses a Clawback Address that operates like an operator in ERC-1155 and is allowed to transfer the tokens of an owner who has permitted that operator.
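As an illustration of the single-transaction minting pattern, the sketch below creates a patent NFT as an ASA with total = 1 and decimals = 0 using the py-algorand-sdk package. The node URL, API token, addresses, and key are placeholders, and the exact constructor arguments may vary across SDK versions.

```python
from algosdk.v2client import algod
from algosdk import transaction

# Placeholder connection and account details (hypothetical values).
ALGOD_TOKEN = "YOUR_API_TOKEN"
ALGOD_URL = "https://testnet-api.example.com"
CREATOR_ADDRESS = "CREATOR_ALGORAND_ADDRESS"
CREATOR_PRIVATE_KEY = "CREATOR_PRIVATE_KEY"
PATENT_OFFICE_ADDRESS = "PATENT_OFFICE_ADDRESS"

client = algod.AlgodClient(ALGOD_TOKEN, ALGOD_URL)
params = client.suggested_params()

# total=1 and decimals=0 make the asset indivisible and one of a kind: an NFT.
txn = transaction.AssetConfigTxn(
    sender=CREATOR_ADDRESS,
    sp=params,
    total=1,
    decimals=0,
    default_frozen=False,
    unit_name="PAT",
    asset_name="Patent NFT",
    url="ipfs://<metadata-CID>",       # off-chain metadata reference (CID elided)
    clawback=PATENT_OFFICE_ADDRESS,    # Clawback Address acting like an operator
    strict_empty_address_check=False,  # allow unset manager/reserve/freeze roles
)
signed = txn.sign(CREATOR_PRIVATE_KEY)
tx_id = client.send_transaction(signed)
print("asset creation submitted:", tx_id)
```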

Tezos

Tezos is another decentralized open-source blockchain. Tezos supports the meta-consensus concept: in addition to using a consensus protocol on the ledger's state like Bitcoin and Ethereum, it also attempts to reach consensus about how nodes and the protocol should change or upgrade [22]. FA2 (TZIP-12) is a standard for a unified token contract interface on the Tezos blockchain. FA2 supports different token types, including fungible, non-fungible, and fractionalized NFT contracts. In Tezos, tokens are identified by a pair of token contract address and token ID. Tezos also supports batch token transfers, which reduces the cost of transferring multiple tokens.

Flow

Flow was developed by Dapper Labs to remove the scalability limitations of the Ethereum blockchain. Flow is a fast and decentralized blockchain that focuses on games and digital collectibles. Its architecture improves throughput and scalability without sharding. Flow supports smart contracts written in Cadence, a resource-oriented programming language. In Cadence, an NFT can be described as a resource with a unique ID. Resources have important rules for ownership management: a resource has exactly one owner and cannot be copied or lost. These features give assurance to the NFT owner. In Flow, NFT metadata, including images and documents, can be stored on-chain or off-chain. In addition, Flow defines a Collection concept, in which each collection is an NFT resource that can include a list of resources: a dictionary in which the key is a resource ID and the value is the corresponding NFT.

The collection concept provides batch transferring of NFTs. Besides, users can define an NFT that owns another NFT; for instance, in CryptoKitties, a unique cat (an NFT) can own a unique hat (another NFT). Flow uses Cadence's second layer of access control to allow some operators to access some fields of an NFT [23]. In Table 1, we compare the standards described above. They are compared on support for fungible tokens, non-fungible tokens, batch transferring (the owner can transfer multiple tokens in one operation), operator support (the owner can approve an operator to originate token transfers), and fractionalized NFTs (an NFT can be divided into different tokens, each exchangeable independently).

Table 1. Comparing NFT standards.


NFT-based patent framework

In this section, we propose a framework for presenting NFT-based patents. We describe details of the proposed distributed and trustworthy framework for minting NFT-based patents, as shown in Fig. 2. The proposed framework includes five main layers: Storage Layer, Authentication Layer, Verification Layer, Blockchain Layer, and Application Layer. Details of each layer and the general concepts are presented as follows.

Figure 2. The proposed layered framework for minting NFT-based patents.

Storage layer

The continuous growth of data in blockchain technology is moving various information systems toward the use of decentralized storage networks. Decentralized storage networks were created to provide more benefits to the technological world [24]. Some of the benefits of decentralized storage systems are: (1) cost savings are achieved by making optimal use of current storage; (2) multiple copies are kept on various nodes, avoiding bottlenecks on central servers and speeding up downloads. This foundation layer provides the infrastructure required for storage. The items on NFT platforms have unique characteristics that must be included for identification.

Non-fungible token metadata provides information that describes a particular token ID. NFT metadata is stored either on-chain or off-chain. On-chain means direct incorporation of the metadata into the NFT's smart contract, which represents the tokens; off-chain storage means hosting the metadata separately [25].

Blockchains provide decentralization but are expensive for data storage and never allow data to be removed. For example, because of the Ethereum blockchain's current storage limits and high maintenance costs, many projects' metadata is maintained off-chain. Developers utilize the ERC-721 standard, which features a method known as tokenURI; this method is implemented to let applications know the location of the metadata for a specific item. Currently, there are three solutions for off-chain storage: InterPlanetary File System (IPFS), Pinata, and Filecoin.
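As a hedged sketch of this pattern, the snippet below uses the web3.py library to call tokenURI on an ERC-721 contract and learn where the metadata lives. The RPC endpoint and contract address are placeholders, and the ABI fragment is trimmed to the single method being called.

```python
import json
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://mainnet.example-rpc.com"))  # hypothetical RPC endpoint

# Minimal ABI fragment containing only the tokenURI view function.
ERC721_METADATA_ABI = json.loads("""[
  {"name": "tokenURI", "type": "function", "stateMutability": "view",
   "inputs": [{"name": "tokenId", "type": "uint256"}],
   "outputs": [{"name": "", "type": "string"}]}
]""")

contract = w3.eth.contract(
    address=Web3.to_checksum_address("0x0000000000000000000000000000000000000000"),  # placeholder
    abi=ERC721_METADATA_ABI,
)
uri = contract.functions.tokenURI(1).call()  # e.g. "ipfs://Qm..."
print("metadata for token 1 lives at:", uri)
```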

IPFS

The InterPlanetary File System (IPFS) is a peer-to-peer hypermedia protocol for decentralized media content storage. Because of the high cost of storing media files related to NFTs on the blockchain, IPFS can be the most affordable and efficient solution. IPFS combines multiple technologies inspired by Git and BitTorrent, such as a block exchange system, distributed hash tables (DHT), and a version control system [26]. On a peer-to-peer network, the DHT is used to coordinate and maintain metadata.

In other words, hash values must be mapped to the objects they represent. When storing an object like a file, IPFS generates a hash value that starts with the prefix Qm and acts as a reference to that specific item. Objects larger than 256 KB are divided into smaller blocks of up to 256 KB, and a hash tree is used to interconnect all the blocks that are part of the same object. IPFS uses the Kademlia DHT. The block exchange system, BitSwap, is a BitTorrent-inspired system used to exchange blocks. Asymmetric encryption can be used to prevent unauthorized access to content stored on IPFS [27].
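The following toy Python sketch mimics this chunk-and-hash idea: it splits data into 256 KB blocks, hashes each block, and derives a single root hash from the block hashes. Real IPFS identifiers are built from a Merkle DAG with multihash encoding, so this simplified root is not an actual CID.

```python
import hashlib

BLOCK_SIZE = 256 * 1024  # 256 KB, the block size mentioned above

def toy_content_id(data: bytes) -> str:
    # Split into blocks, hash each block, then hash the joined block hashes.
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)] or [b""]
    block_hashes = [hashlib.sha256(b).digest() for b in blocks]
    return hashlib.sha256(b"".join(block_hashes)).hexdigest()

patent_pdf = b"...binary patent document..." * 10_000  # stand-in for a real file
print("content identifier:", toy_content_id(patent_pdf))
```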

Pinata

Pinata is a popular platform for managing and uploading files on IPFS. It provides secure and verifiable files for NFTs. Most NFTs store their data off-chain, with the NFT on the blockchain merely pointing to a URL for the data. The main problem here is that the content behind a URL can change.

This means that an NFT supposed to describe a certain patent could be changed without anyone knowing, which defeats the purpose of the NFT in the first place. This is where Pinata comes in handy. Pinata uses IPFS to create content-addressable hashes of data, known as Content Identifiers (CIDs). These CIDs serve both as a way of retrieving data and as a means of ensuring data validity. Those looking to retrieve data simply ask the IPFS network for the data associated with a certain CID, and if any node on the network contains that data, it is returned to the requester. The data is automatically rehashed on the requester's computer upon retrieval to make sure it matches the original CID they asked for. This process ensures that the data received is exactly what was asked for; if a malicious node attempts to send fake data, the resulting CID on the requester's end will be different, alerting the requester that they are receiving incorrect data [28].
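A minimal sketch of this verify-on-retrieval idea is shown below, using plain SHA-256 as a stand-in for IPFS's real CID rules: the requester fetches bytes from a public gateway, rehashes them locally, and rejects the response on any mismatch.

```python
import hashlib
import requests  # third-party HTTP client

def fetch_and_verify(cid: str, expected_sha256: str) -> bytes:
    # Ask a public IPFS gateway for the data behind the CID.
    resp = requests.get(f"https://ipfs.io/ipfs/{cid}", timeout=30)
    resp.raise_for_status()
    # Rehash locally; a malicious node returning fake data will not match.
    if hashlib.sha256(resp.content).hexdigest() != expected_sha256:
        raise ValueError("returned data does not match the requested identifier")
    return resp.content
```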

Filecoin

Another decentralized storage network is Filecoin. It is built on top of IPFS and is designed to store the most important data, such as media files. Truffle Suite has also launched an NFT Development Template with Filecoin Box. NFT.Storage (free decentralized storage for NFTs) [29] allows users to easily and securely store their NFT content and metadata using IPFS and Filecoin. NFT.Storage is a service backed by Protocol Labs and Pinata specifically for storing NFT data. Through content addressing and decentralized storage, NFT.Storage allows developers to protect their NFT assets and associated metadata, ensuring that all NFTs follow best practices and stay accessible for the long term. It makes it frictionless to mint NFTs following best practices through resilient persistence on IPFS and Filecoin, and it lets developers store NFT data on decentralized networks quickly, safely, and for free. Anyone can leverage the power of IPFS and Filecoin to ensure the persistence of their NFTs. The details of this system are as follows [30]:

Content addressing

Once users upload data to NFT.Storage, they receive a CID, an IPFS hash of the content. CIDs are the data's unique fingerprints: universal addresses that can be used to refer to the data regardless of how or where it is stored. Using CIDs to reference NFT data avoids problems such as weak links and "rug pulls", since CIDs are generated from the content itself.

Provable storage

NFT.Storage uses Filecoin for long-term decentralized data storage. Filecoin uses cryptographic proofs to assure the NFT data’s durability and persistence over time.

Resilient retrieval

Data stored via IPFS and Filecoin can be fetched directly in the browser via any public IPFS gateway.

Authentication layer

The second layer is the authentication layer, whose functions we briefly highlight in this section. The Decentralized Identity (DID) approach assists users in collecting credentials from a variety of issuers, such as the government, educational institutions, or employers, and saving them in a digital wallet. A verifier then uses these credentials to verify a person's validity, using a blockchain-based ledger to follow the identity and access management (IAM) process. DID thus allows users to stay in control of their identity. A lack of NFT verifiability also causes intellectual property and copyright infringements; of course, the chain of custody may be traced back to the creator's public address to check whether a similar patent was filed using that address. However, there is no quick and foolproof way to check an NFT creator's legitimacy. Without such verification built into the NFT, an NFT proves ownership only over that NFT itself and nothing more.

Self-sovereign identity (SSI) [31] is a solution to this problem. SSI is a new series of standards that will guide a new identity architecture for the Internet. With a focus on privacy, security, and interoperability, SSI applications use public-key cryptography with public blockchains to generate persistent identities for people, with private and selective information disclosure. Blockchain technology offers a way to establish trust and transparency and to provide a secure and publicly verifiable KYC (Know Your Customer) process. The blockchain architecture allows information from various service providers to be collected into a single cryptographically secure and unchanging database that does not need a third party to verify the authenticity of the information.

The proposed platform generates patent-related smart contracts that act as programs running on the blockchain to receive and send transactions. These contracts are unalterable, and clients are privately identified through a thorough KYC process; after KYC approval, an NFT is minted on the blockchain as a certificate of verification [32]. At this layer, this article uses a decentralized authentication solution. Such solutions have been used for various blockchain applications (e.g., smart cities, the Internet of Things) [33, 34], but we use the approach here for the proposed framework (patents as NFTs). Details of this solution are presented in the following.

Decentralized authentication

This section presents an authentication layer, similar to [35], that builds validated communication in a secure and decentralized manner via blockchain technology. As shown in Fig. 3, the authentication protocol comprises two processes: registration and login.

Figure 3. Registration and login processes of the decentralized authentication protocol.
Registration

In the registration process of the suggested authentication protocol, we first initialize a user's public key as their identity key (UserName). Then, we upload this identity key to the blockchain, where transactions can later be verified by other users. Finally, the user generates an identity transaction.

Login

After registration, a user logs in to the system. The login process is described as follows:

  1. The user commits identity information and imports their secret key into the service application to log in.
  2. The user sends a login request to the network's service provider.
  3. The service provider analyzes the login request, extracts the hash, queries the blockchain, and obtains identity information from the identity list (identity transactions).
  4. When the above process is completed, the service provider responds with an authentication request containing a timestamp (to avoid replay attacks), the user's UserName, and a signature.
  5. The user creates a signature over five parameters: the timestamp, the user's UserName and PK, and the service provider's UserName and PK. This signature is used as the user authentication credential; a sketch of this step follows the list.
  6. The service provider verifies the received information. If it is valid, the authentication succeeds; otherwise, the authentication fails, and the user's login is denied.
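A minimal sketch of steps 5 and 6 follows, assuming an Ed25519 identity key (the protocol above does not prescribe a signature scheme) and mocking the on-chain identity list as a Python set. It requires the third-party cryptography package.

```python
import time
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

def raw_pk(private_key) -> str:
    # Serialize a public key to hex; it doubles as the identity key (UserName).
    return private_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw).hex()

# Registration: the user's public key is uploaded as their identity key.
user_key = Ed25519PrivateKey.generate()
provider_key = Ed25519PrivateKey.generate()
user_name = user_pk = raw_pk(user_key)              # UserName and PK coincide here
provider_name = provider_pk = raw_pk(provider_key)
chain_identity_list = {user_name}                   # mock of on-chain identity transactions

# Step 5: sign the five parameters (timestamp, both UserNames, both PKs).
timestamp = str(int(time.time()))
message = "|".join([timestamp, user_name, user_pk, provider_name, provider_pk]).encode()
credential = user_key.sign(message)

# Step 6: the provider checks the identity list and verifies the credential.
assert user_name in chain_identity_list
try:
    user_key.public_key().verify(credential, message)
    print("authentication succeeds")
except InvalidSignature:
    print("authentication fails; login denied")
```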

Under the current system, the World Intellectual Property Organization (WIPO) and multiple target patent offices in various nations or regions must assess a patent application, resulting in inefficiency, high costs, and uncertainty. This study presents a conceptual NFT-based patent framework for issuing, validating, and sharing patent certificates. The platform aims to support counterfeit protection as well as secure access to and management of certificates according to the needs of inventors, companies, educational institutions, and certification authorities.

Here, a certification authority (CA) is used to authenticate patent offices. The procedure validates a patent only if it is provided with a digital certificate that meets the X.509 standard. Certificate authorities are introduced into the system to authenticate both the nodes and the clients connected to the blockchain network.

Verification layer

In permissioned blockchains, only identified nodes can read from and write to the distributed ledger. Nodes can act in different roles and have various permissions. Therefore, a distributed system can be designed in which patent-granting offices serve as the identified nodes. Here the system is described conceptually at a high level. Figure 4 illustrates the sequence diagram of this layer. This layer includes four levels, as described below:

Figure 4. Sequence diagram of the verification layer.

Digitalization

For a patent to be published as an NFT on the blockchain, it must be in a digitized format. This level corresponds to the "filing step" in traditional patent registration. An application could be designed in the application layer to allow users to enter the different patent information online.

Recording

Patents provide valuable information and bring financial benefits to their owners. If they were publicly published in a blockchain network, miners might reject the patent and take the innovation for themselves; at the least, this can weaken consensus reliability and encourage miners to misbehave. To prevent this, the inventor should first record the innovation privately using proof of existence: the inventor generates the hash of the patent document and records it in the blockchain. As soon as it is recorded, the timestamp and the hash are publicly available to others, and the inventor can prove the existence of the patent document whenever needed.

Furthermore, using methods like Decision Thinking [36], an inventor can record each phase of patent development separately. In each stage, the user generates the hash of the finished part and publishes it together with the previous part's hash. The result is a coupled series of hashes that documents the patent's development, and the inventor can prove the existence of each phase using the original related documents. This level prevents others from abusing the patent and claiming it for themselves, and the inventor can be sure that the patent document is recorded confidentially and immutably [37].

Different hash algorithms exist, with different architectures, time complexities, and security considerations. Hash functions should satisfy two main requirements:

  • Pre-image resistance: it should be computationally hard to find the input of a hash function while the output and the hash algorithm are publicly known.
  • Collision resistance: it should be computationally hard to find two distinct inputs x and y that have the same hash output.

These requirements are vital for recording patents. First, the hash function should be pre-image resistant so that others cannot recover the patent documentation; otherwise, everybody could read the patent even before its official publication. Second, the hash function should be collision resistant to preclude users from changing their document after recording; otherwise, a user could record one document and later replace it with a different one that has the same hash.

There are various hash algorithms, of which the MD and SHA families are the most widely used. According to [38], collisions have been found for the MD2, MD4, MD5, SHA-0, and SHA-1 hash functions, so they are not a good choice for recording patents. The SHA-2 hash algorithm is secure, and no collision has been found for it. Although SHA-2 is noticeably slower than prior hash algorithms, the recording phase is not highly time-sensitive, so SHA-2 is the better choice and provides excellent security for users.
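As a sketch of the recording step under these recommendations, the snippet below hashes each development phase with SHA-256 from Python's standard library and chains every phase hash to the previous one, producing the coupled series of hashes described above. The phase documents are placeholders.

```python
import hashlib

def record_phase(document: bytes, previous_hash: str = "") -> str:
    # h_i = SHA-256(previous_hash || document_i); only h_i goes on-chain.
    return hashlib.sha256(previous_hash.encode() + document).hexdigest()

phases = [b"idea disclosure draft", b"experimental results", b"full claims text"]
chain, prev = [], ""
for doc in phases:
    prev = record_phase(doc, prev)
    chain.append(prev)

# Publishing only these hashes (with timestamps) reveals nothing about the
# documents, yet re-hashing the originals later proves each phase existed.
print(chain)
```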

Validating

In this phase, inventors first create NFTs for their patents and publish them to the miners/validators. Miners are identified nodes that validate NFTs for recording in the blockchain. Because patent validation is a specialized task, miners cannot be inexpert members of the public. In addition, patent offices are too few in number for the network to be fully decentralized on their own. Therefore, the miners can be specialists certified by the patent offices: they receive a digital certificate from a patent office attesting to their eligibility to referee a patent.

Digital certificate

Digital certificates are digital credentials used to verify the online identities of networked entities. They usually include a public key as well as the owner's identification, and they are issued by Certification Authorities (CAs), which must verify the certificate holder's identity. Certificates contain cryptographic keys for signing, encryption, and decryption. X.509 is a standard that defines the format of public-key certificates signed by a certificate authority. The X.509 standard has multiple fields, and its structure is shown in Fig. 5:

  • Version: indicates the version of the X.509 standard. X.509 has multiple versions, each with a different structure; depending on the CA, validators can choose their desired version.
  • Serial Number: distinguishes a certificate from other certificates; each certificate has a unique serial number.
  • Signature Algorithm Identifier: indicates the cryptographic algorithm used by the certificate authority to sign the certificate.
  • Issuer Name: the name of the issuer, which is generally the certificate authority.
  • Validity Period: each certificate is valid for a defined period. This limited period partly protects certificates against exposure of the CA's private key.
  • Subject Name: the name of the requester; in our proposed framework, it is the validator's name.
  • Subject Public Key Info: the public key of the entity the certificate is issued for.

These fields are identical among all versions of the X.509 standard [39].

Figure 5. Structure of an X.509 certificate.
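To illustrate the fields above, here is a hedged sketch of a patent-office CA issuing a validator certificate with the cryptography package's X.509 builder. The names, key type, and validity period are illustrative choices, not requirements of the framework.

```python
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

ca_key = ec.generate_private_key(ec.SECP256R1())         # patent office key pair
validator_key = ec.generate_private_key(ec.SECP256R1())  # validator key pair

issuer = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Patent Office CA")])
subject = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Certified Validator 42")])

cert = (
    x509.CertificateBuilder()
    .subject_name(subject)                        # Subject Name: the validator
    .issuer_name(issuer)                          # Issuer Name: the patent office
    .public_key(validator_key.public_key())       # Subject Public Key Info
    .serial_number(x509.random_serial_number())   # unique Serial Number
    .not_valid_before(datetime.datetime.utcnow())
    .not_valid_after(datetime.datetime.utcnow() + datetime.timedelta(days=365))
    .sign(ca_key, hashes.SHA256())                # Signature Algorithm: ECDSA with SHA-256
)
print(cert.serial_number, cert.not_valid_after)
```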

Certificate authority

A Certificate Authority (CA) issues digital certificates. The CA signs the certificate with its private key, which is not public, and others can verify the certificate using the CA's public key.

Here, the patent office creates a certificate for each requested patent referee. The patent office writes the validator's information into the certificate and signs it with the patent office's private key. The validator can then use the certificate to assure others of their eligibility, and other nodes can check the requesting node's information by verifying the certificate with the patent office's public key. Therefore, individuals can join the network's miners/validators using their credentials. In this phase, miners perform formal examinations, prior art research, and substantive examinations, and vote to grant or refuse the patent.

Miners reach consensus about the patent and record it in the blockchain. After that, the NFT is recorded in the blockchain with the corresponding comments on granting or needed reformations. If the miners detect the NFT as a malicious request, they do not record it in the blockchain.

Blockchain layer

This layer acts as middleware between the Verification Layer and the Application Layer in the patents-as-NFTs architecture. The main purpose of the blockchain layer in the proposed architecture is to provide IP management. We find that transitioning to a blockchain-based patents-as-NFTs record system enables many previously suggested improvements to current patent systems in a flexible, scalable, and transparent manner.

Multiple blockchain platforms can be used here, including Ethereum, EOS, Flow, and Tezos. Blockchain systems are mainly classified into two major types based on their consensus mechanism: permissionless (public) and permissioned (private) blockchains. In a public blockchain, any node can participate in the peer-to-peer network, and the blockchain is fully decentralized; a node can leave the network without any consent from the other nodes.

Bitcoin is one of the most popular examples of a public, permissionless blockchain. Proof-of-Work (PoW), Proof-of-Stake (PoS), and directed acyclic graph (DAG) schemes are examples of consensus algorithms in permissionless blockchains. Bitcoin and Ethereum, two famous and trusted blockchain networks, use the PoW consensus mechanism, while platforms like Cardano and EOS adopt PoS consensus [40].

In a private blockchain, nodes require specific access or permission to be authenticated on the network. Hyperledger is among the most popular private blockchains; it allows only permissioned members to join the network after authentication. This provides security to a group of entities that do not completely trust one another but want to achieve a common objective, such as exchanging information. The entities of a permissioned blockchain network can use Byzantine fault-tolerant (BFT) consensus, and Hyperledger Fabric has a membership identity service that manages user IDs and verifies network participants.

Therefore, members are aware of each other's identity while maintaining privacy and secrecy, because they are unaware of each other's activities [41]. Due to their more secure nature, private blockchains have sparked large interest among banking and financial organizations, which believe that these platforms can disrupt current centralized systems. Hyperledger, Quorum, Corda, and EOS are some examples of permissioned blockchains [42].

Reaching consensus in a distributed environment is a challenge. Blockchain is a decentralized network with no central node to observe and check all transactions, so protocols are needed to establish that all transactions are valid. The consensus algorithms are therefore considered the core of each blockchain [43]. In distributed systems, consensus is the problem of having all network members (nodes) agree on accepting or rejecting a block; when all network members accept the new block, it can be appended to the previous block.

As mentioned, the main concern in blockchains is how to reach consensus among network members. A wide range of consensus algorithms has been designed, each with its own pros and cons [42]. Blockchain consensus algorithms are mainly classified into the three groups shown in Table 2. The first group, proof-based consensus algorithms, requires the nodes joining the verifying network to demonstrate their qualification to do the appending task. The second group, voting-based consensus, requires validators in the network to share their results of validating a new block or transaction before making the final decision. The third group, DAG-based consensus, is a new class of consensus algorithms that allows several different blocks to be published and recorded simultaneously on the network.

Table 2. Consensus algorithms in blockchain networks.

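As a toy illustration of the voting-based family, the function below commits a block only when strictly more than two-thirds of validators approve it, the classical BFT threshold that tolerates f faulty nodes out of n ≥ 3f + 1. This is a counting rule only, not a full consensus protocol.

```python
def committed(approvals: int, validators: int) -> bool:
    # Commit requires strictly more than two-thirds of all validators.
    return 3 * approvals >= 2 * validators + 1

votes = {"office-1": True, "office-2": True, "office-3": False, "office-4": True}
print("block committed:", committed(sum(votes.values()), len(votes)))
```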

The proposed patents-as-NFTs platform builds blockchain-based intellectual property management and empowers the entire patent ecosystem. It is a solution that removes barriers by addressing fundamental issues within the traditional patent ecosystem. Blockchain can efficiently handle patents and trademarks by effectively reducing approval wait times and other required resources. The user entities involved in intellectual property management are creators, patent consumers, and copyright-managing entities. Patent creators are the users who own the original data, e.g., inventors, writers, and researchers. Patent consumers are the users who are willing to consume the content and support the creator's work. Finally, the users responsible for protecting the creators' intellectual property are the copyright-managing entities, e.g., lawyers. The patents-as-NFTs solution for IP management in the blockchain layer works by implementing the following steps [62]:

Creators sign up to the platform

Creators need to sign up on the blockchain platform to patent their creative work. Identity information is required when signing up.

Creators upload IP on the blockchain network

Next, the creator adds the intellectual property for which a patent application is required, uploading the information related to the IP and the data to the blockchain network. Blockchain ensures traceability and auditability to prevent duplication and manipulation of data. The patent becomes visible to all network members once it is uploaded to the blockchain.

Consumers generate request to use the content

Consumers who want to access the content must first register on the blockchain network. After signing up, consumers can ask creators to grant access to the patented content. Before the patent owner authorizes the request, a smart contract is created that allows customers to access information such as the owner's data. Furthermore, consumers are required to pay fees, in either fiat money or dedicated tokens, to use the creator's original information. When the creator approves the request, an NDA (non-disclosure agreement) is produced and signed by both parties. Blockchain manages the agreement and guarantees that all parties comply with the terms and conditions filed.
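The following is a minimal, contract-style Python sketch of this step under stated assumptions: both parties must sign the NDA record and the fee must be met before access is granted. The names and royalty handling are illustrative; a production system would implement this as an on-chain smart contract.

```python
from dataclasses import dataclass, field

@dataclass
class LicenseContract:
    owner: str
    fee: int                                       # in smallest token units
    nda_signatures: set = field(default_factory=set)
    licensees: set = field(default_factory=set)

    def sign_nda(self, party: str) -> None:
        self.nda_signatures.add(party)

    def request_access(self, consumer: str, payment: int) -> None:
        # Both parties must have agreed to the NDA, and the fee must be met.
        if {self.owner, consumer} - self.nda_signatures:
            raise PermissionError("NDA not signed by both parties")
        if payment < self.fee:
            raise ValueError("insufficient payment for license")
        self.licensees.add(consumer)  # royalty routing to the owner would happen here

contract = LicenseContract(owner="inventor", fee=100)
contract.sign_nda("inventor")
contract.sign_nda("consumer")
contract.request_access("consumer", payment=100)
print("consumer licensed:", "consumer" in contract.licensees)
```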

Patent management entities leverage blockchain to protect copyrights and solve related disputes

Blockchain assists the patent management entities in resolving a variety of disputes that may involve sharing confidential information, establishing proof of authorship, transferring IP rights, making defensive publications, etc. Suppose a person used an invention from a patent for their company without the inventor's consent; the inventor can report it to the patent office and claim ownership of that invention.

Application layer

The patent platform's global marketplace technology would allow many enterprises, governments, universities, and small and medium-sized enterprises (SMEs) worldwide to tokenize patents as NFTs, creating an infrastructure for storing patent records on a blockchain-based network and developing a decentralized marketplace in which patent holders could easily sell or otherwise monetize their patents. An NFT-based patent can use smart contracts to set a price for a license or purchase.

Any buyer satisfied with the conditions can pay and immediately unlock the rights to the patent without either party ever having to interact directly. While patents are currently regulated jurisdictionally around the world, a blockchain-based patent marketplace using NFTs can reduce the geographical barriers between patent systems using as simple a tool as a search query. The ease of access to patents globally can help aspiring inventors accelerate the innovative process by building upon others' patented inventions through licenses. There is a wide variety of use cases for patent NFTs, such as SMEs, patent organizations, grants and funding, and fundraising/transferring information relating to patents. These applications keep growing as time progresses, and new ways to utilize these tokens are constantly being found. Some of the most common applications are as follows.

SMEs

The aim is to move intellectual property assets onto a digital, centralized, and secure blockchain network, enabling easier commercialization of patents, especially for small or medium enterprises (SMEs). Smart contracts can be attached to NFTs so that terms of use and ownership can be outlined and agreed upon without incurring as many legal fees as traditional IP transfers. This is believed to help SMEs secure funding, as they could more easily leverage the previously undisclosed value of their patent portfolios [63].

Transfer ownership of patents

NFTs can be used to transfer ownership of patents. The blockchain can keep track of patent owners, and the tokens would include self-executing contracts that transfer the legal rights associated with the patents when the tokens are transferred. A partnership between IBM and IPwe has spearheaded the use of NFTs to secure patent ownership; these two companies have teamed up to build the infrastructure for an NFT-based patent marketplace.

Discussion

There are exciting proposals in the legal and economic literature that suggest seemingly straightforward solutions to many of the issues plaguing current patent systems. However, most solutions would constitute major administrative disruptions and place significant, continuous financial burdens on patent offices or their users. An NFT-based patent system not only makes many of these ideas administratively feasible but also lets them be examined in a step-wise, scalable, and very public manner.

Furthermore, NFT-based patents may facilitate reliable information sharing among offices and patentees worldwide, reducing the burden on examiners and perhaps even accelerating harmonization efforts. NFT-based patents also have additional transparency and archival attributes baked in. A patent should be a privilege bestowed on those who take resource-intensive risks to explore the frontier of technological capabilities, and full transparency of these rewards is very much in the public interest: it is society that pays for the administrative and economic inefficiencies of today's systems. NFT-based patents can enhance this transparency. From an organizational perspective, NFT-based patents can remove current bottlenecks in patent processes by making these processes more efficient, rapid, and convenient for applicants without compromising the quality of granted patents.

The proposed framework encounters some challenges that should be solved before a fully developed patent verification platform can be reached. First, there are technical problems. The consensus method used in the verification layer is not addressed in detail; given the permissioned structure of miners in the NFT-based patent system, consensus algorithms designed for permissioned blockchains, such as PBFT, Federated Consensus, and Round Robin Consensus, can be applied. Also, miners/validators spend time validating patents, so a protocol should be designed to compensate them. Challenges such as proving the miners' time and effort, the price inventors should pay miners, and other economic trade-offs should be considered.

Different NFT standards were discussed above. If various patent services use different NFT standards, there will be cross-platform problems. For instance, transferring an NFT from the Ethereum blockchain (an ERC-721 token) to the EOS blockchain is not straightforward and needs some consideration. Also, people usually trade NFTs in marketplaces such as Rarible and OpenSea; these marketplaces are centralized and may introduce challenges because of their centralized nature. Besides, there exist other types of challenges arising from the novelty of NFT-based patents and blockchain services.

A blockchain-based patent service has not been tested before. The patent registration procedure and the concepts of the patents-as-NFTs system may be ambiguous for people who still prefer conventional centralized patent systems over decentralized ones. There are also open problems in the mining part: miners should receive certificates from accepted organizations, and determining these organizations and how they accept referees as validators needs more consideration. Some types of inventions are prohibited in some countries, and inventors cannot register them; since inventors can register their patents publicly in an NFT-based patent system, collisions may occur between inventors and governments. Finally, there are misunderstandings about NFT ownership rights: it is not clear exactly which rights a person acquires when buying an NFT, for instance whether they receive property rights or moral rights as well.

Conclusion

Blockchain technology provides strong timestamping, the potential for smart contracts, and proof of existence. It enables the creation of a transparent, distributed, cost-effective, and resilient environment that is open to all and in which each transaction is auditable. Blockchain is thus a definite boon to the IP industry, benefitting patent owners: when blockchain technology's intrinsic characteristics are applied to the IP domain, it helps protect rights such as copyrights. This paper provided a conceptual framework for presenting NFT-based patents with a comprehensive discussion of many aspects, from background, model components, and token standards to application areas and research challenges. The proposed framework includes five main layers: Storage Layer, Authentication Layer, Verification Layer, Blockchain Layer, and Application Layer. The primary purpose of this patent framework is to provide an NFT-based concept that can be used to register patents on a decentralized, tamper-proof, and trustworthy network and to trade and exchange them around the world. Finally, we addressed several open challenges facing NFT-based patents.

References

  1. Nakamoto, S. Bitcoin: A peer-to-peer electronic cash system. Decent. Bus. Rev. 21260, https://bitcoin.org/bitcoin.pdf (2008).
  2. Buterin, V. A next-generation smart contract and decentralized application platform. White Pap. 3 (2014).
  3. Nofer, M., Gomber, P., Hinz, O. & Schiereck, D. Business & infomation system engineering. Blockchain 59, 183–187 (2017).Google Scholar 
  4. Regner, F., Urbach, N. & Schweizer, A. NFTs in practice—non-fungible tokens as core component of a blockchain-based event ticketing application. https://www.researchgate.net/publication/336057493_NFTs_in_Practice_-_Non-Fungible_Tokens_as_Core_Component_of_a_Blockchain-based_Event_Ticketing_Application (2019).
  5. Entriken, W., Shirley, D., Evans, J. & Sachs, N. EIP 721: ERC-721 non-fungible token standard. Ethereum Improv. Propos.https://eips.ethereum.org/EIPS/eip-721 (2018).
  6. Radomski, W. et al. Eip 1155: Erc-1155 multi token standard. In Ethereum, Standard (2018).
  7. Dowling, M. Is non-fungible token pricing driven by cryptocurrencies? Finance Res. Lett. 44, 102097. https://doi.org/10.1016/j.frl.2021.102097 (2021).
  8. Lesavre, L., Varin, P. & Yaga, D. Blockchain Networks: Token Design and Management Overview. (National Institute of Standards and Technology, 2020).
  9. Larva-Labs. About Cryptopunks, Retrieved 13 May, 2021, from https://www.larvalabs.com/cryptopunks (2021).
  10. Cryptokitties. About Cryptokitties, Retrieved 28 May, 2021, from https://www.cryptokitties.co/ (2021).
  11. nbatopshot. About Nba top shot, Retrieved 4 April, 2021, from https://nbatopshot.com/terms (2021).
  12. Fairfield, J. Tokenized: The law of non-fungible tokens and unique digital property. Indiana Law J. forthcoming (2021).
  13. Chevet, S. Blockchain technology and non-fungible tokens: Reshaping value chains in creative industries. Available at SSRN 3212662 (2018).
  14. Bal, M. & Ner, C. NFTracer: a Non-Fungible token tracking proof-of-concept using Hyperledger Fabric. arXiv preprint arXiv:1905.04795 (2019).
  15. Wang, Q., Li, R., Wang, Q. & Chen, S. Non-fungible token (NFT): Overview, evaluation, opportunities and challenges. arXiv preprint arXiv:2105.07447 (2021).
  16. Qu, Q., Nurgaliev, I., Muzammal, M., Jensen, C. S. & Fan, J. On spatio-temporal blockchain query processing. Future Gener. Comput. Syst. 98: 208–218 (2019).Article Google Scholar 
  17. Rosenfeld, M. Overview of colored coins. White paper, bitcoil. co. il 41, 94 (2012).
  18. Obsidian-Labs. dGoods Standard, Retrieved 29 April, 2021, from https://docs.eosstudio.io/contracts/dgoods/standard.html. (2021).
  19. Algorand. Algorand Core Technology Innovation, Retrieved 10 March, 2021, from https://www.algorand.com/technology/core-blockchain-innovation. (2021).
  20. Weathersby, J. Building NFTs on Algorand, Retrieved 15 April, 2021, from https://developer.algorand.org/articles/building-nfts-on-algorand/. (2021).
  21. Algorand. How Algorand Democratizes the Access to the NFT Market with Fractional NFTs, Retrieved 7 April, 2021, from https://www.algorand.com/resources/blog/algorand-nft-market-fractional-nfts. (2021).
  22. Tezos. Welcome to the Tezos Developer Documentation, Retrieved 16 May, 2021, from https://tezos.gitlab.io. (2021).
  23. flowdocs. Non-Fungible Tokens, Retrieved 20 May, 2021, from https://docs.onflow.org/cadence/tutorial/04-non-fungible-tokens/. (2021).
  24. Benisi, N. Z., Aminian, M. & Javadi, B. Blockchain-based decentralized storage networks: A survey. J. Netw. Comput. Appl. 162, 102656 (2020).Article Google Scholar 
  25. NFTReview. On-chain vs. Off-chain Metadata (2021).
  26. Benet, J. Ipfs-content addressed, versioned, p2p file system. arXiv preprint arXiv:1407.3561 (2014).
  27. Nizamuddin, N., Salah, K., Azad, M. A., Arshad, J. & Rehman, M. Decentralized document version control using ethereum blockchain and IPFS. Comput. Electr. Eng. 76, 183–197 (2019).Article Google Scholar 
  28. Tut, K. Who Is Responsible for NFT Data? (2020).
  29. nft.storage. Free Storage for NFTs, Retrieved 16 May, 2021, from https://nft.storage/. (2021).
  30. Psaras, Y. & Dias, D. in 2020 50th Annual IEEE-IFIP International Conference on Dependable Systems and Networks-Supplemental Volume (DSN-S). 80–80 (IEEE).
  31. Tanner, J. & Roelofs, C. NFTs and the need for Self-Sovereign Identity (2021).
  32. Martens, D., Tuyll van Serooskerken, A. V. & Steenhagen, M. Exploring the potential of blockchain for KYC. J. Digit. Bank. 2, 123–131 (2017).Google Scholar 
  33. Hammi, M. T., Bellot, P. & Serhrouchni, A. In 2018 IEEE Wireless Communications and Networking Conference (WCNC). 1–6 (IEEE).
  34. Khalid, U. et al. A decentralized lightweight blockchain-based authentication mechanism for IoT systems. Cluster Comput. 1–21 (2020).
  35. Zhong, Y. et al. Distributed blockchain-based authentication and authorization protocol for smart grid. Wirel. Commun. Mobile Comput. (2021).
  36. Schönhals, A., Hepp, T. & Gipp, B. In Proceedings of the 1st Workshop on Cryptocurrencies and Blockchains for Distributed Systems. 105–110.
  37. Verma, S. & Prajapati, G. A survey of cryptographic hash algorithms and issues. Int. J. Comput. Secur. Source Code Anal. (IJCSSCA) 1, 17–20 (2015).
  38. Verma, S. & Prajapati, G. A survey of cryptographic hash algorithms and issues. Int. J. Comput. Secur. Source Code Anal. (IJCSSCA) 1 (2015).
  39. SDK, I. X.509 Certificates (1996).
  40. Helliar, C. V., Crawford, L., Rocca, L., Teodori, C. & Veneziani, M. Permissionless and permissioned blockchain diffusion. Int. J. Inf. Manag. 54, 102136 (2020).
  41. Frizzo-Barker, J. et al. Blockchain as a disruptive technology for business: A systematic review. Int. J. Inf. Manag. 51, 102029 (2020).
  42. Bamakan, S. M. H., Motavali, A. & Bondarti, A. B. A survey of blockchain consensus algorithms performance evaluation criteria. Expert Syst. Appl. 154, 113385 (2020).
  43. Bamakan, S. M. H., Bondarti, A. B., Bondarti, P. B. & Qu, Q. Blockchain technology forecasting by patent analytics and text mining. Blockchain Res. Appl. 100019 (2021).
  44. Castro, M. & Liskov, B. Practical Byzantine fault tolerance and proactive recovery. ACM Trans. Comput. Syst. (TOCS) 20, 398–461 (2002).
  45. Muratov, F., Lebedev, A., Iushkevich, N., Nasrulin, B. & Takemiya, M. YAC: BFT consensus algorithm for blockchain. arXiv preprint arXiv:1809.00554 (2018).
  46. Bessani, A., Sousa, J. & Alchieri, E. E. In 2014 44th Annual IEEE/IFIP International Conference on Dependable Systems and Networks. 355–362 (IEEE).
  47. Todd, P. Ripple protocol consensus algorithm review. May 11th (2015).
  48. Ongaro, D. & Ousterhout, J. In 2014 {USENIX} Annual Technical Conference ({USENIX}{ATC} 14). 305–319.
  49. Larimer, D. Delegated proof-of-stake (dpos). Bitshare whitepaper, Retrieved March 31, 2019, from http://docs.bitshares.org/bitshares/dpos.html (2014).
  50. Turner, B. (October, 2007).
  51. De Angelis, S. et al. PBFT vs proof-of-authority: Applying the CAP theorem to permissioned blockchain (2018).
  52. King, S. & Nadal, S. Ppcoin: Peer-to-peer crypto-currency with proof-of-stake. self-published paper, August 19 (2012).
  53. Hyperledger. PoET 1.0 Specification (2017).
  54. Buntinx, J. What Is Proof-of-Weight? Retrieved March 31, 2019, from https://nulltx.com/what-is-proof-of-weight/# (2018).
  55. P4Titan. A Peer-to-Peer Crypto-Currency with Proof-of-Burn. Retrieved March 10, 2019, from https://github.com/slimcoin-project/slimcoin-project.github.io/raw/master/whitepaperSLM.pdf (2014).
  56. Dziembowski, S., Faust, S., Kolmogorov, V. & Pietrzak, K. In Annual Cryptology Conference. 585–605 (Springer).
  57. Bentov, I., Lee, C., Mizrahi, A. & Rosenfeld, M. Proof of Activity: Extending Bitcoin’s Proof of Work via Proof of Stake. IACR Cryptology ePrint Archive 2014, 452 (2014).
  58. NEM, T. NEM technical reference. https://nem.io/wpcontent/themes/nem/files/NEM_techRef.pdf (2018).
  59. Bramas, Q. The Stability and the Security of the Tangle (2018).
  60. Baird, L. The swirlds hashgraph consensus algorithm: Fair, fast, byzantine fault tolerance. In Swirlds Tech Reports SWIRLDS-TR-2016–01, Tech. Rep (2016).
  61. LeMahieu, C. Nano: A feeless distributed cryptocurrency network. Nano [Online resource]. https://nano.org/en/whitepaper (date of access: 24.03.2018) 16, 17 (2018).
  62. Casino, F., Dasaklis, T. K. & Patsakis, C. A systematic literature review of blockchain-based applications: Current status, classification and open issues. Telematics Inform. 36, 55–81 (2019).
  63. bigredawesomedodo. Helping Small Businesses Survive and Grow With Marketing, Retrieved 3 June, 2021, from https://bigredawesomedodo.com/nft/. (2020).


Acknowledgements

This work has been partially supported by CAS President’s International Fellowship Initiative, China [grant number 2021VTB0002, 2021] and National Natural Science Foundation of China (No. 61902385).

Author information

Affiliations

  1. Department of Industrial Management, Yazd University, Yazd City, Iran: Seyed Mojtaba Hosseini Bamakan
  2. Department of Electrical and Computer Engineering, Isfahan University of Technology, Isfahan City, Iran: Nasim Nezhadsistani
  3. School of Electrical and Computer Engineering, University of Tehran, Tehran City, Iran: Omid Bodaghi
  4. Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055, China: Seyed Mojtaba Hosseini Bamakan & Qiang Qu
  5. Huawei Blockchain Lab, Huawei Cloud Tech Co., Ltd., Shenzhen, China: Qiang Qu

Other related articles published in this Open Access Online Scientific Journal include:

NFT: Redefined Format of IP Assets

The collaboration between National Center for Advancing Translational Sciences (NCATS) at NIH and BurstIQ

2.0 LPBI is a Very Unique Organization 

 

Read Full Post »

Reporter: Stephen J. Williams, Ph.D.

From: Heidi Rehm et al. GA4GH: International policies and standards for data sharing across genomic research and healthcare. (2021): Cell Genomics, Volume 1 Issue 2.

Source: DOI:https://doi.org/10.1016/j.xgen.2021.100029

Highlights

  • Siloing genomic data in institutions/jurisdictions limits learning and knowledge
  • GA4GH policy frameworks enable responsible genomic data sharing
  • GA4GH technical standards ensure interoperability, broad access, and global benefits
  • Data sharing across research and healthcare will extend the potential of genomics

Summary

The Global Alliance for Genomics and Health (GA4GH) aims to accelerate biomedical advances by enabling the responsible sharing of clinical and genomic data through both harmonized data aggregation and federated approaches. The decreasing cost of genomic sequencing (along with other genome-wide molecular assays) and increasing evidence of its clinical utility will soon drive the generation of sequence data from tens of millions of humans, with increasing levels of diversity. In this perspective, we present the GA4GH strategies for addressing the major challenges of this data revolution. We describe the GA4GH organization, which is fueled by the development efforts of eight Work Streams and informed by the needs of 24 Driver Projects and other key stakeholders. We present the GA4GH suite of secure, interoperable technical standards and policy frameworks and review the current status of standards, their relevance to key domains of research and clinical care, and future plans of GA4GH. Broad international participation in building, adopting, and deploying GA4GH standards and frameworks will catalyze an unprecedented effort in data sharing that will be critical to advancing genomic medicine and ensuring that all populations can access its benefits.

In order for genomic and personalized medicine to come to fruition, it is imperative that data siloes around the world be broken down, allowing international collaboration on the collection, storage, transfer, access, and analysis of molecular and health-related data.

We have discussed in numerous articles on this site the problems data siloes produce. By data siloes we mean that not only DATA but intellectual thought are collected and stored behind physical, electronic, and institutional walls, inaccessible to scientists who do not belong to a particular institution or even a collaborative network.

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson and others

Standardization and harmonization of data are key to this effort to share electronic records. The EU has taken bold action in this matter. The following section is about the General Data Protection Regulation of the EU and can be found at the following link:

https://ec.europa.eu/info/law/law-topic/data-protection/data-protection-eu_en

Fundamental rights

The EU Charter of Fundamental Rights stipulates that EU citizens have the right to protection of their personal data.

Protection of personal data

Legislation

The data protection package adopted in May 2016 aims at making Europe fit for the digital age. More than 90% of Europeans say they want the same data protection rights across the EU and regardless of where their data is processed.

The General Data Protection Regulation (GDPR)

Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data. This text includes the corrigendum published in the OJEU of 23 May 2018.

The regulation is an essential step to strengthen individuals’ fundamental rights in the digital age and facilitate business by clarifying rules for companies and public bodies in the digital single market. A single law will also do away with the current fragmentation in different national systems and unnecessary administrative burdens.

The regulation entered into force on 24 May 2016 and applies since 25 May 2018. More information for companies and individuals.

Information about the incorporation of the General Data Protection Regulation (GDPR) into the EEA Agreement.

EU Member States notifications to the European Commission under the GDPR

The Data Protection Law Enforcement Directive

Directive (EU) 2016/680 on the protection of natural persons regarding processing of personal data connected with criminal offences or the execution of criminal penalties, and on the free movement of such data.

The directive protects citizens’ fundamental right to data protection whenever personal data is used by criminal law enforcement authorities for law enforcement purposes. It will in particular ensure that the personal data of victims, witnesses, and suspects of crime are duly protected and will facilitate cross-border cooperation in the fight against crime and terrorism.

The directive entered into force on 5 May 2016 and EU countries had to transpose it into their national law by 6 May 2018.

The following paper by the organization The Global Alliance for Genomics and Health discusses these types of collaborative efforts to break down data silos in personalized medicine. This organization has over 2,000 subscribers in over 90 countries, encompassing over 60 organizations.

Enabling responsible genomic data sharing for the benefit of human health

The Global Alliance for Genomics and Health (GA4GH) is a policy-framing and technical standards-setting organization, seeking to enable responsible genomic data sharing within a human rights framework.

The Global Alliance for Genomics and Health (GA4GH) is an international, nonprofit alliance formed in 2013 to accelerate the potential of research and medicine to advance human health. Bringing together 600+ leading organizations working in healthcare, research, patient advocacy, life science, and information technology, the GA4GH community is working together to create frameworks and standards to enable the responsible, voluntary, and secure sharing of genomic and health-related data. All of our work builds upon the Framework for Responsible Sharing of Genomic and Health-Related Data.

GA4GH Connect is a five-year strategic plan that aims to drive uptake of standards and frameworks for genomic data sharing within the research and healthcare communities in order to enable responsible sharing of clinical-grade genomic data by 2022. GA4GH Connect links our Work Streams with Driver Projects—real-world genomic data initiatives that help guide our development efforts and pilot our tools.

From the article on Cell Genomics GA4GH: International policies and standards for data sharing across genomic research and healthcare

Source: Open Access DOI: https://doi.org/10.1016/j.xgen.2021.100029

The Global Alliance for Genomics and Health (GA4GH) is a worldwide alliance of genomics researchers, data scientists, healthcare practitioners, and other stakeholders. We are collaborating to establish policy frameworks and technical standards for responsible, international sharing of genomic and other molecular data as well as related health data. Founded in 2013,3 the GA4GH community now consists of more than 1,000 individuals across more than 90 countries working together to enable broad sharing that transcends the boundaries of any single institution or country (see https://www.ga4gh.org). In this perspective, we present the strategic goals of GA4GH and detail current strategies and operational approaches to enable responsible sharing of clinical and genomic data, through both harmonized data aggregation and federated approaches, to advance genomic medicine and research. We describe technical and policy development activities of the eight GA4GH Work Streams and implementation activities across 24 real-world genomic data initiatives (“Driver Projects”). We review how GA4GH is addressing the major areas in which genomics is currently deployed including rare disease, common disease, cancer, and infectious disease. Finally, we describe differences between genomic sequence data that are generated for research versus healthcare purposes, and define strategies for meeting the unique challenges of responsibly enabling access to data acquired in the clinical setting.

GA4GH organization

GA4GH has partnered with 24 real-world genomic data initiatives (Driver Projects) to ensure its standards are fit for purpose and driven by real-world needs. Driver Projects make a commitment to help guide GA4GH development efforts and pilot GA4GH standards (see Table 2). Each Driver Project is expected to dedicate at least two full-time equivalents to GA4GH standards development, which takes place in the context of GA4GH Work Streams (see Figure 1). Work Streams are the key production teams of GA4GH, tackling challenges in eight distinct areas across the data life cycle (see Box 1). Work Streams consist of experts from their respective sub-disciplines and include membership from Driver Projects as well as hundreds of other organizations across the international genomics and health community.

Figure 1. Matrix structure of the Global Alliance for Genomics and Health.

Box 1. GA4GH Work Stream focus areas
The GA4GH Work Streams are the key production teams of the organization. Each tackles a specific area in the data life cycle, as described below (URLs listed in the web resources).

  • (1) Data use & researcher identities: Develops ontologies and data models to streamline global access to datasets generated in any country9,10
  • (2) Genomic knowledge standards: Develops specifications and data models for exchanging genomic variant observations and knowledge18
  • (3) Cloud: Develops federated analysis approaches to support the statistical rigor needed to learn from large datasets
  • (4) Data privacy & security: Develops guidelines and recommendations to ensure identifiable genomic and phenotypic data remain appropriately secure without sacrificing their analytic potential
  • (5) Regulatory & ethics: Develops policies and recommendations for ensuring individual-level data are interoperable with existing norms and follow core ethical principles
  • (6) Discovery: Develops data models and APIs to make data findable, accessible, interoperable, and reusable (FAIR); a query sketch follows this list
  • (7) Clinical & phenotypic data capture & exchange: Develops data models to ensure genomic data is most impactful through rich metadata collected in a standardized way
  • (8) Large-scale genomics: Develops APIs and file formats to ensure harmonized technological platforms can support large-scale computing
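The Discovery Work Stream's APIs are easiest to picture with a concrete request. Below is a minimal sketch, in Python, of a Beacon-style variant-discovery query; the endpoint URL is hypothetical and the parameter names follow the GA4GH Beacon v1 convention, so treat it as an illustration rather than a reference implementation.

```python
import requests

# Hypothetical Beacon-style endpoint; real GA4GH Beacon deployments
# publish their own base URLs.
BEACON_URL = "https://beacon.example.org/query"

params = {
    "assemblyId": "GRCh38",
    "referenceName": "1",
    "start": 55039974,        # illustrative 0-based coordinate
    "referenceBases": "G",
    "alternateBases": "T",
}

resp = requests.get(BEACON_URL, params=params, timeout=30)
resp.raise_for_status()

# A Beacon answers a yes/no discovery question ("has this variant been
# observed?") without exposing record-level data, which is what makes
# such queries shareable across institutional silos.
print("Variant observed:", resp.json().get("exists"))
```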

For more articles on Open Access, Science 2.0, and Data Networks for Genomics on this Open Access Scientific Journal see:

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson and others

Icelandic Population Genomic Study Results by deCODE Genetics come to Fruition: Curation of Current genomic studies

eScientific Publishing a Case in Point: Evolution of Platform Architecture Methodologies and of Intellectual Property Development (Content Creation by Curation) Business Model 

UK Biobank Makes Available 200,000 whole genomes Open Access

Systems Biology Analysis of Transcription Networks, Artificial Intelligence, and High-End Computing Coming to Fruition in Personalized Oncology

Read Full Post »

#TUBiol5227: Biomarkers & Biotargets: Genetic Testing and Bioethics

Curator: Stephen J. Williams, Ph.D.

The advent of direct-to-consumer (DTC) genetic testing, and the resultant rapid increase in its popularity as well as in the number of companies offering such services, has created some urgent and unique bioethical challenges surrounding this niche in the marketplace. At first, most DTC companies like 23andMe and Ancestry.com offered non-clinical or non-FDA-approved genetic testing as a way for consumers to draw casual inferences from their DNA sequence, such as the presence of known genes linked to disease risk, or to get a glimpse of their familial background. However, many issues arose, including legal, privacy, medical, and bioethical issues. Below are some articles which explain and discuss many of the problems associated with the DTC genetic testing market, as well as some alternatives which may exist.

‘Direct-to-Consumer (DTC) Genetic Testing Market to hit USD 2.5 Bn by 2024’ by Global Market Insights

This post has the following link to the market analysis of the DTC market (https://www.gminsights.com/pressrelease/direct-to-consumer-dtc-genetic-testing-market). Below are the highlights of the report.

As you can see, this market segment appears poised to expand into the nutritional consulting business as well as targeted biomarkers for specific diseases.

Rising incidence of genetic disorders across the globe will augment the market growth

Increasing prevalence of genetic disorders will propel the demand for direct-to-consumer genetic testing and will augment industry growth over the projected timeline. Increasing cases of genetic diseases such as breast cancer, achondroplasia, colorectal cancer and other diseases have elevated the need for cost-effective and efficient genetic testing avenues in the healthcare market.
 

For instance, according to the World Cancer Research Fund (WCRF), in 2018, over 2 million new cases of cancer were diagnosed across the globe. Also, breast cancer is stated as the second most commonly occurring cancer. Availability of superior-quality and advanced direct-to-consumer genetic testing has drastically reduced mortality rates in people suffering from cancer by providing vigilant surveillance data even before the onset of the disease. Hence, the aforementioned factors will propel the direct-to-consumer genetic testing market over the forecast timeline.
 

Chart: DTC Genetic Testing Market By Technology
 

Nutrigenomic Testing will provide robust market growth

The nutrigenomic testing segment was valued at over USD 220 million in 2019, and it will witness tremendous growth over 2020–2028. The growth of the market segment is attributed to increasing research activities related to nutritional aspects. Moreover, obesity is another major factor that will boost demand for the direct-to-consumer genetic testing market.
 

Nutrigenomics testing enables professionals to recommend nutritional guidance and personalized diets to obese people and help them keep their weight under control while maintaining a healthy lifestyle. Hence, the above-mentioned factors are anticipated to augment the demand and adoption rate of direct-to-consumer genetic testing through 2028.
 

Browse key industry insights spread across 161 pages with 126 market data tables & 10 figures & charts from the report, “Direct-To-Consumer Genetic Testing Market Size By Test Type (Carrier Testing, Predictive Testing, Ancestry & Relationship Testing, Nutrigenomics Testing), By Distribution Channel (Online Platforms, Over-the-Counter), By Technology (Targeted Analysis, Single Nucleotide Polymorphism (SNP) Chips, Whole Genome Sequencing (WGS)), Industry Analysis Report, Regional Outlook, Application Potential, Price Trends, Competitive Market Share & Forecast, 2020 – 2028” in detail along with the table of contents:
https://www.gminsights.com/industry-analysis/direct-to-consumer-dtc-genetic-testing-market
 

Targeted analysis techniques will drive the market growth over the foreseeable future

Based on technology, the DTC genetic testing market is segmented into whole genome sequencing (WGS), targeted analysis, and single nucleotide polymorphism (SNP) chips. The targeted analysis market segment is projected to witness around 12% CAGR over the forecast period. The segmental growth is attributed to recent advancements in genetic testing methods that have revolutionized the detection and characterization of genetic codes.
 

Targeted analysis is mainly utilized to determine any defects in genes that are responsible for a disorder or a disease. Growing demand for personalized medicine among the population suffering from genetic diseases will also boost demand for targeted analysis technology. As the technology is relatively cheap, it is a highly preferred method in direct-to-consumer genetic testing procedures. These advantages of targeted analysis are expected to enhance market growth over the foreseeable future.
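To make the cited growth rate concrete, the short sketch below compounds a segment value forward at the roughly 12% CAGR quoted above. The 2020 base value is hypothetical, since the press release does not state the segment's absolute size, so only the compounding arithmetic is meaningful.

```python
# Compound a hypothetical base value forward at the ~12% CAGR cited
# for the targeted analysis segment. The base figure is invented for
# illustration; the report does not disclose the segment's size.
base_musd = 100.0  # hypothetical 2020 segment value, USD millions
cagr = 0.12

for year in range(2020, 2029):
    value = base_musd * (1 + cagr) ** (year - 2020)
    print(f"{year}: {value:,.1f} USD millions")
```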
 

Over-the-counter segment will experience a notable growth over the forecast period

The over-the-counter distribution channel is projected to witness around 11% CAGR through 2028. The segmental growth is attributed to the ease of purchasing a test kit for consumers living in rural areas of developing countries. Consumers prefer the over-the-counter distribution channel because the kits are directly examined by regulatory agencies, making them safer to use, thereby driving market growth over the forecast timeline.
 

Favorable regulations provide lucrative growth opportunities for direct-to-consumer genetic testing

The Europe direct-to-consumer genetic testing market held around a 26% share in 2019 and was valued at around USD 290 million. The regional growth is due to elevated government spending on healthcare to provide easy access to genetic testing avenues. Furthermore, European regulatory bodies are working on improving the regulations governing direct-to-consumer genetic testing methods. Hence, the above-mentioned factors will play a significant role in the market’s growth.
 

Focus of market players on introducing innovative direct-to-consumer genetic testing devices will offer several growth opportunities

A few of the eminent players operating in the direct-to-consumer genetic testing market include Ancestry, Color Genomics, Living DNA, Mapmygenome, Easy DNA, FamilytreeDNA (Gene By Gene), Full Genome Corporation, Helix OpCo LLC, Identigene, Karmagenes, MyHeritage, Pathway genomics, Genesis Healthcare, and 23andMe. These market players have undertaken various business strategies to enhance their financial stability and help them evolve as leading companies in the direct-to-consumer genetic testing industry.
 

For example, in November 2018, Helix launched a new genetic testing product, the DNA discovery kit, that allows customers to delve into their ancestry. This development expanded the firm’s product portfolio, thereby propelling industry growth.

The following posts discuss bioethical issues related to genetic testing and personalized medicine from a clinician’s and scientist’s perspective:

Question: Each of these articles discusses certain bioethical issues although focuses on personalized medicine and treatment. Given your understanding of the robust process involved in validating clinical biomarkers and the current state of the DTC market, how could DTC testing results misinform patients and create mistrust in the physician-patient relationship?

Personalized Medicine, Omics, and Health Disparities in Cancer:  Can Personalized Medicine Help Reduce the Disparity Problem?

Diversity and Health Disparity Issues Need to be Addressed for GWAS and Precision Medicine Studies

Genomics & Ethics: DNA Fragments are Products of Nature or Patentable Genes?

The following posts discuss the bioethical concerns of genetic testing from a patient’s perspective:

Ethics Behind Genetic Testing in Breast Cancer: A Webinar by Laura Carfang of survivingbreastcancer.org

Ethical Concerns in Personalized Medicine: BRCA1/2 Testing in Minors and Communication of Breast Cancer Risk

23andMe Product can be obtained for Free from a new app called Genes for Good: UMich’s Facebook-based Genomics Project

Question: If you are developing a targeted treatment with a companion diagnostic, what bioethical concerns would you address during the drug development process to ensure fair, equitable and ethical treatment of all patients, in trials as well as post market?

Articles on Genetic Testing, Companion Diagnostics and Regulatory Mechanisms

Centers for Medicare & Medicaid Services announced that the federal healthcare program will cover the costs of cancer gene tests that have been approved by the Food and Drug Administration

Real Time Coverage @BIOConvention #BIO2019: Genome Editing and Regulatory Harmonization: Progress and Challenges

New York Times vs. Personalized Medicine? PMC President: Times’ Critique of Streamlined Regulatory Approval for Personalized Treatments ‘Ignores Promising Implications’ of Field

Live Conference Coverage @Medcitynews Converge 2018 Philadelphia: Early Diagnosis Through Predictive Biomarkers, NonInvasive Testing

Protecting Your Biotech IP and Market Strategy: Notes from Life Sciences Collaborative 2015 Meeting

Question: What type of regulatory concerns should one have during the drug development process in regards to use of biomarker testing? From the last article on Protecting Your IP how important is it, as a drug developer, to involve all payers during the drug development process?

Read Full Post »

Can the Public Benefit Company Structure Save US Healthcare?

Curator: Stephen J. Williams, Ph.D.

UPDATED 3/15/2023

According to the Centers for Medicare & Medicaid Services (CMS.gov), healthcare spending has reached 17.7 percent of GDP. According to CMS data:

From 1960 through 2013, health spending rose from $147 per person to $9,255 per person, an average annual increase of 8.1 percent.

the National Health Expenditure Accounts (NHEA) are the official estimates of total health care spending in the United States. Dating back to 1960, the NHEA measures annual U.S. expenditures for health care goods and services, public health activities, government administration, the net cost of health insurance, and investment related to health care. The data are presented by type of service, sources of funding, and type of sponsor.

Graph: US National Healthcare Expenditures as a percent of Gross Domestic Product from 1960 to the present. Recession periods are shown as bars. Note that the general trend has been increasing healthcare expenditures, with only brief periods of decrease (for example, 2020, the year of the COVID-19 pandemic). In addition, most years have been inflationary, with almost no deflationary periods, whether measured by CPI or by healthcare costs specifically.

U.S. health care spending grew 4.6 percent in 2019, reaching $3.8 trillion or $11,582 per person.  As a share of the nation’s Gross Domestic Product, health spending accounted for 17.7 percent.

And as this spending (demand for health care services) grew, associated costs also rose; but as the statistical analyses show, there was little improvement in many health outcome metrics during the same time.
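The 8.1 percent figure in the CMS quote above is a compound annual growth rate, and it can be checked directly from the quoted per-person endpoints; here is a quick sketch of that arithmetic.

```python
# Check the CMS figure quoted above: $147 per person in 1960 growing
# to $9,255 per person by 2013, i.e., over 53 years.
start_value, end_value, years = 147.0, 9255.0, 53

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Average annual increase: {cagr:.1%}")  # -> 8.1%, matching CMS
```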

Graph of the growth of National Health Expenditures (NHE) versus the growth of GDP. Note that in most years since 1960 the growth rate of NHE has been higher than that of GDP, resulting in a seemingly hyperinflationary effect in healthcare. Also note the years when this disconnect is even greater: NHE grew even while the general economy was in recession.

It appears that US healthcare may be on the precipice of a transformational shift, but what will this shift look like? The following post examines if the corporate structure of US healthcare needs to be changed and what role does a Public Benefit Company have in this much needed transformation.

Hippocratic Oath

I swear by Apollo the physician, and Asclepius, and Hygieia and Panacea and all the gods and goddesses as my witnesses, that, according to my ability and judgement, I will keep this Oath and this contract:

To hold him who taught me this art equally dear to me as my parents, to be a partner in life with him, and to fulfill his needs when required; to look upon his offspring as equals to my own siblings, and to teach them this art, if they shall wish to learn it, without fee or contract; and that by the set rules, lectures, and every other mode of instruction, I will impart a knowledge of the art to my own sons, and those of my teachers, and to students bound by this contract and having sworn this Oath to the law of medicine, but to no others.

I will use those dietary regimens which will benefit my patients according to my greatest ability and judgement, and I will do no harm or injustice to them.

I will not give a lethal drug to anyone if I am asked, nor will I advise such a plan; and similarly I will not give a woman a pessary to cause an abortion.

In purity and according to divine law will I carry out my life and my art.

I will not use the knife, even upon those suffering from stones, but I will leave this to those who are trained in this craft.

Into whatever homes I go, I will enter them for the benefit of the sick, avoiding any voluntary act of impropriety or corruption, including the seduction of women or men, whether they are free men or slaves.

Whatever I see or hear in the lives of my patients, whether in connection with my professional practice or not, which ought not to be spoken of outside, I will keep secret, as considering all such things to be private.

So long as I maintain this Oath faithfully and without corruption, may it be granted to me to partake of life fully and the practice of my art, gaining the respect of all men for all time. However, should I transgress this Oath and violate it, may the opposite be my fate.

Translated by Michael North, National Library of Medicine, 2002.

Much of the following information can be found on the Health Affairs Blog in a post entitled

Public Benefit Corporations: A Third Option For Health Care Delivery?

By Soleil Shah, Jimmy J. Qian, Amol S. Navathe, Nirav R. Shah

Limitations of For Profit and Non-Profit Hospitals

For profit represent ~ 25% of US hospitals and are owned and governed by shareholders, and can raise equity through stock and bond markets.

Judging by most annual reports, CEOs incorrectly assume they are legally bound as fiduciaries to maximize shareholder value. This paradigm shift in corporate priorities started around the mid-1980s, a phenomenon discussed below.

A by-product of this business goal of maximizing shareholder value is that CEO pay and compensation are naturally tied to equity markets. One means to this end is promoting cost efficiencies, even in the midst of financial hardship.

A clear example of the failure of this system can be seen during the 2020–current COVID-19 pandemic in the US. According to the Medicare Payment Advisory Commission, four large US hospitals were able to decrease their operating expenses by $2.3 billion just in Q2 2020. This amounted to 65% of their revenue; in comparison, three large NONPROFIT hospitals reduced their operating expenses by an aggregate $13 million (only 1% of their revenue), evidence that in lean times for-profits will resort to drastic cost cutting at the expense of service, even in times of critical demand for healthcare.

Because of their tax structure and perceived fiduciary responsibilities, for-profit organizations (unlike non-profit and public benefit corporations) are not legally required to conduct community health needs assessments, establish financial assistance policies, or limit hospital charges for those eligible for financial assistance. In addition to the difference in tax liability, for-profit hospitals, unlike their non-profit counterparts, are not funded in part by state or local government. As we will see, a large part of operating revenue for non-profit university-based hospitals is state and city funding.

Therefore financial risk is usually assumed by the patient and, in the worst case, shifted from marginalized patient populations onto the public sector.

Tax Structure Considerations of for-profit healthcare

Financials of major for-profit healthcare entities (2020 annual)

Non-profit Healthcare systems

Nonprofits represent about half of all hospitals in the US. Most of these exist within a university structure, so they retain the benefits of being private health systems along with the funding and tax benefits attributed to most institutions of higher education. And these nonprofits can be very profitable. After taking into consideration the state, local, and federal tax exemptions these nonprofits enjoy, as well as tax-free donations from contributors (including large personal trust funds), a nonprofit can accumulate a large amount of revenue after expenses. In fact, 82 nonprofit hospitals had $33 billion of net asset increase year-over-year (a 20% increase) from 2016 to 2017. The caveat is that this revenue over expenses is usually spent on research or increased patient services (this may mean expanding the physical infrastructure of the hospital or disseminating internal grant money to clinical investigators, expanding the hospital/university research assets, which could result in securing even larger amounts of external funding from government sources).

And although this model may work well for inner-city university healthcare systems, it is usually a struggle for rural nonprofit hospitals. In 2020, ten out of 17 rural hospitals that went under were nonprofits. And this is not just true in a tough pandemic year: over the past two decades a multitude of nonprofit rural hospitals have had to sell and be taken over by larger for-profit entities.

Hospital consolidation has led to a worse patient experience and no real significant changes in readmission or mortality data.  (The article below is how over 130 rural hospitals have closed since 2010, creating a medical emergency in rural US healthcare)

https://www.nationalgeographic.com/history/article/appalachian-hospitals-are-disappearing

 

And according to the article below it is only to get worse

The authors of the Health Affairs blog feel a major disadvantage of both the for-profit and non-profit healthcare systems is “that both face limited accountability with respect to anticompetitive mergers and acquisitions.”

More hospital consolidation is expected post-pandemic

Aug 10, 2020

By Rich Daly, HFMA Senior Writer and Editor

News | Coronavirus

More hospital consolidation is expected post-pandemic

  • Hospital deal volume is likely to accelerate due to the financial damage inflicted by the coronavirus pandemic.
  • The anticipated increase in volume did not show up in the latest quarter, when deals were sharply down.
  • The pandemic may have given hospitals leverage in coming policy fights over billing and the creation of “public option” health plans.

Hospital consolidation is likely to increase after the COVID-19 pandemic, say both critics and supporters of the merger-and-acquisition (M&A) trend.

The financial effects of the coronavirus pandemic are expected to drive more consolidation between and among hospitals and physician practices, a group of policy professionals told a recent Washington, D.C.-based web briefing sponsored by the Alliance for Health Policy.

“There is a real danger that this could lead to more consolidation, which if we’re not careful could lead to higher prices,” said Karyn Schwartz, a senior fellow at the Kaiser Family Foundation (KFF).

Schwartz cited a recent KFF analysis of available research that concluded “provider consolidation leads to higher health care prices for private insurance; this is true for both horizontal and vertical consolidation.”

Kenneth Kaufman, managing director and chair of Kaufman Hall, noted that crises tend to push financially struggling organizations “further behind.”

“I wouldn’t be surprised at all if that happens,” Kaufman said. “That will lead to further consolidation in the provider market.”

The initial rounds of federal assistance from the CARES Act, which were based first on Medicare revenue and then on net patient revenue, may fuel consolidation, said Mark Miller, PhD, executive vice president of healthcare for Arnold Ventures. That’s because the funding formulas favored organizations that already had higher revenues, he said, and provided less assistance to low-revenue organizations.

HHS has distributed $116.2 billion from the $175 billion in provider funding available through the CARES Act and the Paycheck Protection Program and Health Care Enhancement Act. The largest distributions used the two revenue formulas cited by Miller.

No surge in M&A yet

The expected burst in hospital M&A activity has yet to occur. Kaufman Hall identified 14 transactions in the second quarter of 2020, far fewer than in the same quarter in any of the four preceding years, when second-quarter transactions totaled between 19 and 31. The latest deals were not focused on small hospitals, with average seller revenue of more than $800 million — far larger than the previous second-quarter high of $409 million in 2018.

Six of the 14 announced transactions were divestitures by major for-profit health systems, including Community Health Systems, Quorum and HCA.

Kaufman Hall’s analysis of the recent deals identified another pandemic-related factor that may fuel hospital M&A: closer ties between hospitals. The analysis cited the example of Lifespan and Care New England, which had suspended merger talks in 2019. More recently, in a joint announcement, the CEOs of the two systems noted that because of the COVID-19 crisis, the two systems “have been working together in unprecedented ways” and “have agreed to enter into an exploration process to understand the pros and cons of what a formal continuation of this collaboration could look like in the future.”

The M&A outlook for rural hospitals

The pandemic has had less of a negative effect on the finances of rural hospitals that previously joined larger health systems, said Suzie Desai, senior director of not-for-profit healthcare for S&P Global.

A CEO of a health system with a large rural network told Kaufman the federal grants that the system received for its rural hospitals were much larger than the grants paid through the general provider fund.

“If that was true across the board, then the federal government recognized that many rural hospitals could be at risk of not being able to make payroll; actually running out of money,” Kaufman said. “And they seem to have bent over backwards to make sure that didn’t happen.”  

Other CARES Act funding distributed to providers included:

  • $12.8 billion for 959 safety net hospitals
  • $11 billion to almost 4,000 rural healthcare providers and hospitals in urban areas that have certain special rural designations in Medicare

Telehealth has helped rural hospitals but has not been sufficient to address the financial losses inflicted by the pandemic, Desai said.

Other coming trends include a sharper cost focus

Desai expects an increasing focus “over the next couple years” on hospital costs because of the rising share of revenue received from Medicare and Medicaid. She expects increased efforts to use technology and data to lower costs.

Billy Wynne, JD, chairman of Wynne Health Group, expects telehealth restrictions to remain relaxed after the pandemic.

Also, the perceptions of the public and politicians about the financial health of hospitals are likely to give those organizations leverage in coming policy fights over changes such as banning surprise billing and creating so-called public-option health plans, Wynne said. As an example, he cited the Colorado legislature’s suspension of the launch of a public option “in part because of sensitivities around hospital finances in the COVID pandemic.”

“Once the dust settles, it’ll be interesting to see if their leverage has increased or decreased due to what we’ve been through,” Wynne said.

About the Author

Rich Daly, HFMA Senior Writer and Editor,

is based in the Washington, D.C., office. Follow Rich on Twitter: @rdalyhealthcare

Source: https://www.hfma.org/topics/news/2020/08/more-hospital-consolidation-is-expected-post-pandemic.html

From Harvard Medical School

Hospital Mergers and Quality of Care

A new study looks at the quality of care at hospitals acquired in a recent wave of consolidations

By JAKE MILLER January 16, 2020 Research

Image: Two train tracks merge in a blurry sunset (NirutiStock / iStock / Getty Images Plus).

The quality of care at hospitals acquired during a recent wave of consolidations has gotten worse or stayed the same, according to a study led by Harvard Medical School scientists published Jan. 2 in NEJM.

The findings deal a blow to the often-cited arguments that hospital consolidation would improve care. A flurry of earlier studies showed that mergers increase prices. Now after analyzing patient outcomes after hundreds of hospital mergers, the new research also dashes the hopes that this more expensive care might be of higher quality.


“Our findings call into question claims that hospital mergers are good for patients—and beg the question of what we are getting from higher hospital prices,” said study senior author J. Michael McWilliams, the Warren Alpert Foundation Professor of Health Care Policy in the Blavatnik Institute at HMS and an HMS professor of medicine and a practicing general internist at Brigham and Women’s Hospital.

McWilliams noted that rising hospital prices have been one of the leading drivers of unsustainable growth in U.S. health spending.   

To examine the impact of hospital mergers on quality of care, researchers from HMS and Harvard Business School examined patient outcomes from nearly 250 hospital mergers that took place between 2009 and 2013. Using data collected by the Centers for Medicare and Medicaid Services, they analyzed variables such as 30-day readmission and mortality rates among patients discharged from a hospital, as well as clinical measures such as timely antibiotic treatment of patients with bacterial pneumonia. The researchers also factored in patient experiences, such as whether those who received care at a given hospital would recommend it to others. For their analysis, the team compared trends in these indicators between 246 hospitals acquired in merger transactions and unaffected hospitals.
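The comparison the researchers describe, tracking trends in outcomes for acquired versus unaffected hospitals before and after a deal, has the shape of a difference-in-differences design. The sketch below illustrates that design on invented toy numbers; it is not the authors' code or data, just a picture of the estimator.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Toy data: readmission rates for acquired (1) vs. control (0) hospitals,
# before (post=0) and after (post=1) a merger. All values are invented.
df = pd.DataFrame({
    "readmit_rate": [15.2, 15.0, 15.3, 15.1, 15.6, 15.5, 15.2, 15.0],
    "acquired":     [1, 1, 0, 0, 1, 1, 0, 0],
    "post":         [0, 0, 0, 0, 1, 1, 1, 1],
})

# The interaction coefficient estimates the post-merger change among
# acquired hospitals net of the trend in unaffected hospitals.
model = smf.ols("readmit_rate ~ acquired * post", data=df).fit()
print(model.params["acquired:post"])
```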

The verdict? Consolidation did not improve hospital performance, and patient-experience scores deteriorated somewhat after the mergers.

The study was not designed to examine the reasons behind the worsening in patient experience. Weakening of competition due to hospital mergers could have contributed, the researchers said, but deeper exploration suggested other potential mechanisms. Notably, the analysis found the decline in patient-experience scores occurred mainly in hospitals acquired by hospitals that already had a poor patient-experience score—a finding that suggests acquisitions facilitate the spread of low quality care but not of high quality care.

The researchers caution that isolated, individual mergers may have still yielded positive results—something that an aggregate analysis is not powered to capture. And the researchers could only examine measurable aspects of quality. The trend in hospital performance on these standard measures, however, appears to point to a net effect of overall decline, the team said.

“Since our study estimated the average effects of mergers, we can’t rule out the possibility that some mergers are good for patient care,” said first author Nancy Beaulieu, research associate in health care policy at HMS. “But this evidence should give us pause when considering arguments for hospitals mergers.”

The work was supported by the Agency for Healthcare Research and Quality (grant no. U19HS024072).

Co-investigators included Bruce Landon and Jesse Dalton from HMS, Ifedayo Kuye, from the University of California, San Francisco, and Leemore Dafny from Harvard Business School and the National Bureau of Economic Research.

Source: https://hms.harvard.edu/news/hospital-mergers-quality-care

Public Benefit Corporations (PBC)

     Public benefit corporations (versus B Corp certification, which is more of a pledge) are separate legal entities that exist as a hybrid for-profit/nonprofit company but are mandated to:

  1. Pursue a general or specific public benefit
  2. Consider the non-financial interests of its shareholders and other STAKEHOLDERS when making decisions
  3. Report how well it is achieving its overall public benefit objectives
  4. Have limited fiduciary responsibility to investors that remains IN SCOPE of the public benefit goal

In essence, the executives of a public benefit corporation are mandated to run the company for the benefit of STAKEHOLDERS first, where those STAKEHOLDERS are the public beneficiaries of the company’s goals. This in essence moves the needle away from the traditional C-corp’s overvaluing of the needs of shareholders and brings back the mission of the company and, in the case of healthcare, the needs of its stakeholders: the consumers of healthcare.

     PBCs are legal entities recognized by states rather than by the federal government. As of 2020, about 37 states allow companies to incorporate as a PBC. Stipulations of the charter include semiannual reporting of the public benefits bestowed by the company and how well it is achieving its public benefit mandate. There are about 3,000 US PBCs. Some companies have felt it was in their mission and financial interest to reincorporate as a PBC.

Some well known PBCs include

  1. Ben and Jerry’s Ice Cream
  2. American Red Cross
  3. Susan G. Komen Foundation
  4. Allbirds (a shoe startup valued at $1.7 billion when made switch)
  5. Bombas (the sock company that donates extra socks when you buy a pair)
  6. Lemonade (a publicly traded insurance PBC that has beneficiaries select a nonprofit that the company will donate to)

Although the number of PBCs in the healthcare arena is increasing

  1. Not many PBCs are in the area of healthcare delivery 
  2. No one is quite sure what the economic model would look like for a healthcare delivery PBC

Some example of hospital PBC include NYC Health + Hospitals and Community First Medical Center in Chicago.

Benefits of moving a hospital to PBC Status

  1. PBCs are held legally accountable to a predefined public benefit. For hospitals this could be delivering cost-effective, quality care that is affordable to a local citizenry or an economically disadvantaged population. PBCs must produce at least an annual report on the public benefits they have achieved, contrasted against a third-party standard (a toy benchmark comparison follows this list). For example, a hospital could include data on Medicaid-related mortality risks, data neither the C-corp nor the nonprofit 501(c) would have to report. Most nonprofits and charities report their taxes on a Schedule H or Form 990, which only has to report officers’ compensation as well as monies given to charitable organizations or other 501 organizations; the nonprofit would show a balance of zero, as the donated money for that year would be allocated out for various purposes. Hospitals, even as nonprofits, are not required to submit all this data. Right now in the US, the ACA just requires any hospital that receives government or ACA insurance payments to report certain outcome statistics. Although requirements vary state by state, a PBC should have a “benefit officer” to make sure the mandate is being met. In some cases a PBC benefit officer could sue the board for putting shareholder interest over the public benefit mandate.
  2. A PBC can include community stakeholders in the articles of incorporation thus giving a voice to local community members.  This would be especially beneficial for a hospital serving, say, a rural community.
  3. PBCs do have the advantages of for-profit companies, as they are not limited to non-equity forms of investment. A PBC can raise money in the equity markets or take on debt and finance it. These financial instruments are unavailable to the non-profit. Yet one interesting aspect is that PBCs require a HIGHER voting threshold by shareholders than a traditional for-profit company in order to change their public benefit or convert the PBC back to a for-profit.
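As a concrete picture of the reporting obligation in point 1, here is a small sketch that compares a hospital's reported metrics against third-party benchmark values. Every metric name and number is hypothetical; a real PBC report would use whichever third-party standard its charter names.

```python
# Hypothetical annual benefit-report check: compare reported hospital
# metrics against a third-party benchmark. All names and values are
# invented for illustration.
reported = {
    "medicaid_30day_mortality_pct": 3.1,
    "charity_care_share_pct": 4.2,
}
benchmark = {
    "medicaid_30day_mortality_pct": 3.5,  # lower is better
    "charity_care_share_pct": 3.0,        # higher is better
}
better_if_lower = {"medicaid_30day_mortality_pct"}

for metric, value in reported.items():
    target = benchmark[metric]
    met = value <= target if metric in better_if_lower else value >= target
    print(f"{metric}: reported {value} vs. benchmark {target} -> "
          f"{'meets' if met else 'misses'} the standard")
```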

Limitations of the PBC

  1. There is little financial incentive for current and future hospitals to incorporate as a PBC. Herein lies a huge roadblock, given the state of our reimbursement structure in this country, although there may be an incentive with regard to hiring and retention of staff drawn to the organization’s social purpose. There have been suggestions in the past to allow hospitals that incorporate as a PBC to receive some tax benefit, but this legislation has not gone through at either the state or federal level.
  2. For there to be value to constituents (patients), there must be strong accountability measures. This will require the utmost in ethical behavior by a board and executives. We have witnessed, through M&A by large health groups, anticompetitive and near-monopoly behavior.
  3. There are no federal guidelines, only varying guidelines from state to state. There must be some federal recognition of PBC status when it comes to healthcare, especially given that the government is one of the biggest payers for US healthcare.

This is a great interview with ArcHealth, a PBC healthcare system.

Source: https://www.archealthjustice.com/arc-health-as-public-benefit-company-and-social-enterprise-what-is-the-difference/

Arc Health as a Public Benefit Company and Social Enterprise – What is the difference?

Mar 3, 2021 | Healthcare

Arc Health PBC is a public benefit corporation, a mission-driven for-profit company that utilizes a market-driven approach to achieving our short and long-term social goals. As a public benefit corporation, Arc Health is also a social enterprise working to further our mission of providing healthcare to rural, underserved, and indigenous communities through business practices that improve the recruitment and retention of quality healthcare providers.

What is a Social Enterprise?

While there is no one exact definition, according to the Social Enterprise Alliance, a social enterprise is an “organization that addresses a basic unmet need or solves a social or environmental problem through a market-driven approach.” A social enterprise is not a distinct legal entity, but instead an “ideological spectrum marrying commercial approaches with social good.” Social enterprises foster a dual bottom line, simultaneously seeking profits and social impact. Arc Health, like many social enterprises, seeks to be self-sustainable.

Two primary structures fall under the social enterprise umbrella: nonprofits and for-profit organizations. There are also related entities within both structures that could be considered social enterprises. Any of these listed structures can be regarded as a social enterprise depending on if and how involved they are with socially beneficial programs.

What is a Public Benefit Corporation?

Public Benefit Corporations (PBCs), also known as benefit corporations, are “for-profit companies that balance maximizing value to stakeholders with a legally binding commitment to a social or environmental mission.” PBCs operate as for-profit entities with no tax advantages or exemptions. Still, they must have a “purpose of creating general public benefit,” such as promoting the arts or science, preserving the environment, or providing benefits to underserved communities. PBCs must attain a higher degree of corporate purpose, expanded accountability, and expected transparency. 

There are now over 3,000 registered PBCs, comprising approximately 0.1% of American businesses.

As a PBC, Arc Health expects to access capital through individual investors who seek financial returns, rather than through donations. Arc Health’s investors make investments with a clear understanding of the balance the company must strike between financial returns (i.e., profitability) and social purpose. Therefore, investors expect the company to be operationally profitable to ensure a financial return on their investments, while also making clear to all stakeholders and the public that generating social impact is the priority.

What is the difference between a Social Enterprise and PBC?

Social enterprises and PBCs emulate similar ideals that value the importance and need to invoke social change vis-a-vis working in a market-driven industry. Public benefit corporations fall under the social enterprise umbrella. An organization may choose to use a social enterprise model and incorporate itself as either a not-for-profit, C-Corp, PBC, or other corporate structure.  

How did Arc Health Become a Public Benefit Corporation?

Arc Health was initially formed as a C-Corp. In 2019, Arc Health’s CEO and Co-Founder, Dave Shaffer, guided the conversion from a C-Corp to a PBC, incorporated in Delaware. Today, Arc Health follows guidelines and expectations for PBCs, including adhering to the State of Delaware’s requirements for PBCs. 

Why is Arc Health a Social Enterprise and Public Benefit Corporation?

Arc Health believes it is essential to commit ourselves to our mission and demonstrate our dedication through our actions. We work to adhere to the core values of accountability, transparency, and purpose. As a registered public benefit company and a social enterprise, we execute our drive to achieve health equity in tangible and effective ways that the communities we work with, our stakeholders, and our providers expect of us.  

90% of Americans say that companies must not only say a product or service is beneficial, but they also need to prove its benefit.

When we partner with health clinics and hospitals, we aim to provide services that enact lasting change. For example, we work with healthcare providers who desire to contribute both clinical and non-clinical skills. In 2020, Arc Health clinicians developed COVID-19 response protocols and educational materials about the vaccines. They participated in pain management working groups. They identified and followed up with kids in the community who were overdue for a well-child check. Arc Health providers should be driven by a desire to develop a long-term relationship with a healthcare service provider and participate in its successes and challenges.   

Paradigm Shift in the 1980’s: Companies Start to Emphasize Shareholders Over Stakeholders

So earlier in this post we mentioned a shift in philosophy in the corporate boardroom that affected corporate thought, values, and responsibility: companies in the 1980s started to shift their focus and value only the needs of corporate SHAREHOLDERS at the expense of their traditional STAKEHOLDERS (customers, clients). Many movies and books have been written on this, and it is debatable whether it was deliberate or a by-product of M&A, hostile takeovers, and the stock market in general, but the effect was that the consumer was relegated to having less value, even though marketing budgets remained very high. The fiduciary responsibility of the executive was now defined in terms of satisfying shareholders, who were now big and powerful brokerage houses, private equity, and hedge funds. A good explanation by Medium.com’s Tyler Lasicki is given below.

From the Medium.com

Source: https://medium.com/swlh/the-shareholder-v-stakeholder-contrast-a-brief-history-c5a6cfcaa111

The Shareholder V. Stakeholder Contrast, a Brief History

Tyler Lasicki


May 26, 2020 · 14 min read

Introduction

In a famous 1970 New York Times Article, Milton Friedman postulated that the CEO, as an employee of the shareholder, must strive to provide the highest possible return for all shareholders. Since that article, the United States has embraced this idea as the fundamental philosophy supporting the ultimate purpose of businesses — The Shareholders Come First.

In August of 2019, the Business Roundtable, a group made up of the most influential U.S. CEOs, published a letter shifting their stance on the purpose of a corporation. Whether this piece of paper will actually result in any systematic changes remains to be seen; however, this newly stated purpose of business is a dramatic shift from the position Milton Friedman took in 1970. According to the statement, these corporations will no longer prioritize maximizing profits for shareholders, but will instead turn their focus to benefiting all stakeholders, including citizens, customers, suppliers, and employees, on par with shareholders.

Under the earlier Friedman doctrine, by contrast, the social responsibility of a company and its CEO was to maximize profits, even at the expense of any broader social responsibility the company once had.

Small sample of the 181 Signatures attached to the Business Roundtable’s letter

What has happened over the past 50 years that has led to such a fundamental change in ideology? What has happened to make the CEO’s of America’s largest corporations suddenly change their stance on such a foundational principle of what it means to be an American business?

Since diving into this subject, I have come to find that the “American fundamental principle” of putting shareholders first is one that is actually not all that fundamental. In fact, for a large portion of our nation’s history this ideology was actually seen as the unpopular position.

Key ideological shifts in U.S. history

This post dives into a brief history of these two contrasting ideological viewpoints in an attempt to contextualize the forces behind both sides — specifically, the most recent shift (1970–2019). This basic idea of what is most important, the stakeholder or the shareholder, is the underlying reason why many things are the way they are today. A corporation’s prioritization of shareholder or stakeholder ultimately impacts employee salaries, benefits, quality of life within communities, environmental conditions, and even the access to education children can receive. It affects our lives in a breadth and depth of ways, and now that corporations may be changing positions (yet again) to focus on a model that prioritizes the stakeholder, it is important to understand why.

Looking forward, if stakeholder priority ends up being the popular position among American businesses, how long will it last? What could lead to its downfall? And what will managers do to ensure a long-term stakeholder-friendly business model?

It is clear to me that the reasons behind these shifts in ideology are rather nuanced; however, I want to highlight a few trends that have had a major impact on businesses changing their priorities, while also providing context as to why things have shifted.

The Ascendancy of Shareholder Value

Following the 1929 stock market crash and the Great Depression, stakeholder primacy became the popular perspective within corporate America. Stakeholder primacy is the idea that corporations are to consider a wider group of interested parties (not just shareholders) whose positions need to be taken into consideration by corporate governance. According to this point of view, rather than solely being an agent for shareholders, management’s responsibilities were to be dispersed among all of its constituencies, even if it meant a reduction in shareholder value. This ideology lasted as the dominant position for roughly 40 years, in part due to public opinion and strong views on corporate responsibility, but also through state adoption of stakeholder laws.

By the mid-1970s, falling corporate profitability and stagnant share prices had been the norm for a decade. This poor economic performance fueled a growing concern in the U.S. regarding the perceived divergence between manager and shareholder interests. Many held the position that profits and share prices were suffering as a result of corporations’ increased attention to stakeholder groups.

This noticeable divergence in interests sparked many academics to focus their research on corporate management’s motivations in decision making regarding their allocation of resources. This branch of research would later be known as agency theory, which focused on the relationship between principals (shareholders) and their agents (management). Research at the time outlined how over the previous decades corporate management had pursued strategies that were not likely to optimize resources from a shareholder’s perspective. These findings were part of a seismic shift of corporate philosophy, changing priority from the stakeholders of a business to the shareholders.

By 1982, the U.S. economy started to recover from a prolonged period of high inflation and low economic growth. This recovery acted as a catalyst for change in many industries, leaving many corporate management teams to struggle in response to these changes. Their business performance suffered as a result. These distressed businesses became targets for a group of new investors…private equity firms.

Now the paradigm shift had its biggest backer: private equity! And private equity cares about ONE thing: ITS OWN SHARE VALUE, and by extension corporate profit, which became the most important directive for the CEO.

So is it all hopeless now? Can there be a shift back to the good ol’ days?

Well, some changes are taking place at top corporate levels that may help stakeholders have a voice at the table, as the following IR Magazine article states.

And once again this is being led by the Business Roundtable, the same Business Roundtable that championed the shareholder-first shift decades ago.


Shift from shareholder value to stakeholder-focused model for top US firms

AUG 23, 2019

Business Roundtable reveals corporations to drop idea they function to serve shareholders only

Source: https://www.irmagazine.com/esg/shift-shareholder-value-stakeholder-focused-model-top-us-firms

Andrew Holt, Reporter

In a major corporate shift, shareholder value is no longer the main objective of the US’ top company CEOs, according to the Business Roundtable, which instead emphasizes the ‘purpose of a corporation’ and a stakeholder-focused model.

The influential body – a group of chief executive officers from major US corporations – has stressed the idea of a corporation dropping the age-old notion that corporations function first and foremost to serve their shareholders and maximize profits.

Rather, the focus should be on investing in employees, delivering value to customers, dealing ethically with suppliers and supporting outside communities as the vanguard of American business, according to a Business Roundtable statement.

‘While each of our individual companies serves its own corporate purpose, we share a fundamental commitment to all of our stakeholders,’ reads the statement, signed by 181 CEOs. ‘We commit to deliver value to all of them, for the future success of our companies, our communities and our country.’

Gary LaBranche, president and CEO of NIRI, tells IR Magazine that this is part of a wider trend: ‘The redefinition of purpose from shareholder-focused to stakeholder-focused is not new to NIRI members. For example, a 2014 IR Update article by the late Professor Lynn Stout urges a more inclusive way of thinking about corporate purpose.’ 

NIRI has also addressed this concept at many venues, including the senior roundtable annual meeting and the NIRI Annual Conference, adds LaBranche. This trend was further seen in the NIRI policy statement on ESG disclosure, released in January this year. 

Analyzing the meaning of this change in more detail, LaBranche adds: ‘The statement is a revolutionary break with the Business Roundtable’s previous position that the purpose of the corporation is to create value for shareholders, which was a long-held position championed by Milton Friedman.

‘The challenge is that Friedman’s thought leadership helped to inspire the legal and regulatory regime that places wealth creation for shareholders as the ‘prime directive’ for corporate executives.

‘Thus, commentators like Mike Allen of Axios are quick to point out that some shareholders may actually use the new statement to accuse CEOs of worrying about things beyond increasing the value of their shares, which, Allen reminds us, is the CEOs’ fiduciary responsibility.

‘So while the new Business Roundtable statement reflects a much-needed rebalancing and modernization that speaks to the comprehensive responsibilities of corporate citizens, we can expect that some shareholders will push back on this more inclusive view of who should benefit from corporate efforts and the capital that makes it happen. The new statement may not mark the dawn of a new day, but it perhaps signals the twilight of the Friedman era.’

In a similarly reflective way, Jamie Dimon, chairman and CEO of JPMorgan Chase & Co and chairman of the Business Roundtable, says: ‘The American dream is alive, but fraying. Major employers are investing in their workers and communities because they know it is the only way to be successful over the long term. These modernized principles reflect the business community’s unwavering commitment to continue to push for an economy that serves all Americans.’

Note:  Mr Dimon has been very vocal for many years on corporate social responsibility, especially since the financial troubles of 2009.

Impact of New Regulatory Trends in M&A Deals

The following podcast from the PricewaterhouseCoopers Health Research Institute (called Next in Health) discusses some of the trends in healthcare M&A and is a great listen. From 6:30 on, however, the podcast discusses a new trend occurring in healthcare company boardrooms: a focus on acquiring companies that have proven ESG (environmental, social, and governance) functions within their organizations. As stated, doing an M&A deal with a company with strong ESG is now looked upon favorably by regulators.

Please click on the following link to hear a Google Podcast Next in Health episode

https://podcasts.google.com/feed/aHR0cHM6Ly9mZWVkcy5idXp6c3Byb3V0LmNvbS8xMjgyNjQ2LnJzcw?sa=X&ved=2ahUKEwil9sua2cf5AhUErXIEHaoTBQoQ9sEGegQIARAC

 

UPDATED 3/15/2023

Should There Be More Public Benefit Corporations in Health Care?

In a post by Heather Landi in Fierce Healthcare:

 

Health tech unicorn Aledade recently announced that it made the strategic decision to become a public benefit corporation (PBC).

 
 

The company joins just a handful of others in healthcare that are structured this way.

So what exactly is a PBC, and why does it matter?

PBCs are a type of for-profit corporate entity that has also adopted a public benefit purpose and is currently authorized by 35 states and the District of Columbia. A PBC must consider the nonfinancial interests of its shareholders and other stakeholders when making decisions. As public benefit corporations, companies have to weigh their social and environmental objectives alongside maximizing value for shareholders.

 

While PBC and B Corp. are often used interchangeably, they are not the same. A B Corp. is a certification provided to eligible companies by the nonprofit, B Lab. A PBC is an actual legal entity that bakes into its certificate of incorporation a “public benefit,” according to Rubicon Law Group.

“I don’t think that there is a trade-off between either you do things that are good for society or you make profits in your business.” —Farzad Mostashari, M.D.

PBCs also are required to provide a report to shareholders every two years that details how well the company is achieving its overall public benefit objectives. In some states, the report must be assessed against a third-party standard and be made publicly available. Delaware PBCs are not required to report publicly or against a third-party standard.

Aledade launched in 2014 and uses data analytics to help independent doctors’ offices transition to value-based care models. The company currently partners with more than 1,000 independent primary care practices comprising over 11,000 physicians and has nearly 150 contracts covering more than 1.7 million patients and $17 billion in total healthcare spending. Last June, the company raised $123 million in a series E round, boosting its valuation to $3.1 billion.

 

In a blog post, Aledade CEO and co-founder Farzad Mostashari, M.D., explained the company’s reasoning behind the move and said the corporate structure of a PBC is “well suited to mission-oriented companies where alignment with stakeholders is a key driver of the business model.”

“Aledade’s public benefit purpose means that we must weigh the interests of our primary care practice partners, their patients, our employees, and those who bear the burden of rising health care costs, alongside those of our shareholders, when we make decisions,” Mostashari said in an interview. This duty extends to all significant board decisions, including decisions on whether to go public, to make acquisitions or to sell the company, he noted.

The PBC structure helps create alignment among stakeholders and build trust, he said. “I don’t think that there is a trade-off between either you do things that are good for society or you make profits in your business. That might be true for fee-for-service businesses. It’s not true for Aledade,” he said.

He added, “For businesses that are built on trust and alignment, not considering stakeholder benefits gets you neither social good nor profits. If you’re in a business like our business where it’s actually really important that everybody have faith and belief that you are doing what’s best for patients, that you are actually in it for the long-term for practices, that’s what makes us successful as a business.”

Mark Cuban Cost Plus Drugs, which launched in January 2022 to offer low-cost rivals to overpriced generic drugs, also is structured as a public benefit corporation. The company’s founder and CEO Alexander Oshmyansky started the company in 2015 as a nonprofit, according to a feature story in D Magazine. Through Y Combinator, investors told Oshmyansky that the nonprofit model wouldn’t be able to raise the needed funds. He then reworked the business model to a PBC and launched Osh’s Affordable Pharmaceuticals in 2018.

Other healthcare and biotech companies that operate under the PBC model include

rural healthcare startup Homeward Health,

Perlara, the first biotech PBC,

Rarebase, also a biotech company,

Sage Health At-Home,

Savvy Cooperative, which is described as “the first and only patient-owned public benefit co-op,”

OWP Pharmaceuticals,

Medicaid-focused company Waymark and

Trial Library, a cancer precision medicine company.

The pros and cons
 

Even a traditional for-profit C corporation can work toward a public mission without becoming a PBC. But, in an industry like healthcare, too often the duty to maximize financial returns for shareholders or investors can be in conflict with what is best for patients, executives say.

“With a startup, it might limit the ability to sell their business to a larger company in the future because there might be some limitations on what the larger company could do with the organization.”—Jodi Daniel, a partner in Crowell & Moring’s Health Care Group

According to some healthcare experts, PBCs offer a promising alternative as a business model for healthcare companies by providing a “North Star” by which a company can navigate critical business decisions.

“I think it really helps to drive accountability,” Lucia Huang, Osmind’s chief executive, said. “I think that’s important, especially in healthcare where it’s easy sometimes to get misaligned with all the different stakeholders that are involved in the industry. We wanted to make sure we had something to be accountable to. Second, it’s ingrained in the culture. The third element of why it was so helpful for us from the beginning is just on focus and alignment. I think we can be much more clear and transparent about what we’re focused on, our values, how we try to use that transparently to influence our decisions and how we can build a business that really ties all of that together.”

In a Health Affairs article, medical researchers at Stanford, including Jimmy Qian, a co-founder of Osmind, laid out the case for why PBCs may simultaneously improve individual patient outcomes and collective benefit without sacrificing institutions’ financial stability.

PBCs are held legally accountable to a predefined public benefit, which, for hospitals, could involve delivering high-quality, affordable care to local populations. PBCs are required to produce annual benefits reports that are assessed against a third-party standard. “These reports could be used by regulatory agencies such as the Centers for Medicare and Medicaid Services (CMS) or local health authorities to evaluate whether the PBC is making progress toward its stated mission and respond accordingly,” the researchers wrote.

But are there any trade-offs?

Having a public benefit obligation could potentially “tie the hands” of board members who can’t just focus on profits and must focus on those dual responsibilities, noted Jodi Daniel, a partner in Crowell & Moring’s Health Care Group.

“Companies that transition to being a public benefit corporation are intentionally trying to ensure that the company’s mission doesn’t get diminished over time because it’s in their charter. So it helps [the mission] to endure. But there are pros and cons to that. It is somewhat binding the future board members and executives to follow that mission,” she said.

Daniel said she has spoken with several healthcare companies recently that are weighing the possibility of transitioning to a PBC. “Companies often don’t want to necessarily limit their options in their decision-making in the future. With a startup, it might limit the ability to sell their business to a larger company in the future because there might be some limitations on what the larger company could do with the organization,” she said in an interview. 

By making decisions based on interests outside of financial ones, organizations may put themselves at a margin disadvantage as compared to pure for-profit players in the space, wrote Hospitalogy founder Blake Madden.

Faddis with Veeva said the company hasn’t seen any financial or performance trade-off as a result of operating as a PBC. He noted that the move has been good for recruiting, spurred more long-term conversations with customers and has been a source of new ideas.

“Prior to the conversion, you had employees who were thinking of new products or new functionality with the mindset of getting to be commercially successful,” Faddis said. “Now, you also have people thinking about it from the angle of, ‘Does it further one of our PBC purposes and then maybe it’s also going to be commercially successful?'”

Converting to a PBC also can be a tactic to build trust, Daniel noted, especially in healthcare, and that holds the potential to drive business. 

One factor that isn’t clear is whether there is sufficient oversight to hold these companies accountable to their stated public mission. Who checks to make sure companies are making progress toward their objectives to improve healthcare?

Osmind publishes its benefit corporation report on its website to make it available to the public even though it is not required to do so. “I think that really highlights the accountability piece of you need to tell the world or at least tell your shareholders how you’re really trying to uphold your public benefit,” Huang said.

Other related articles published on this Open Access Online Scientific Journal on Healthcare Issues include the following:

Opportunity Mapping of the E-Health Sector prior to COVID19 Outbreak
mHealth market growth in America, Europe, & APAC
Ethics Behind Genetic Testing in Breast Cancer: A Webinar by Laura Carfang of survivingbreastcancer.org
The Inequality and Health Disparity seen with the COVID-19 Pandemic Is Similar to Past Pandemics
Live Notes from @HarvardMed Bioethics: Authors Jerome Groopman, MD & Pamela Hartzband, MD, discuss Your Medical Mind
COVID-related financial losses at Mass General Brigham
Personalized Medicine, Omics, and Health Disparities in Cancer:  Can Personalized Medicine Help Reduce the Disparity Problem?

Read Full Post »

Science Policy Forum: Should we trust healthcare explanations from AI predictive systems?

Some in industry voice their concerns

Curator: Stephen J. Williams, PhD

Post on AI healthcare and explainable AI

In a Policy Forum article in Science, “Beware explanations from AI in health care”, Boris Babic, Sara Gerke, Theodoros Evgeniou, and Glenn Cohen discuss the caveats of relying on explainable versus interpretable artificial intelligence (AI) and machine learning (ML) algorithms to make complex health decisions.  The FDA has already approved some AI/ML algorithms for analysis of medical images for diagnostic purposes.  These have been discussed in prior posts on this site, as well as issues arising from multi-center trials.  The authors of this perspective article argue that the choice between explainable and interpretable algorithms may have far-reaching consequences in health care.

Summary

Artificial intelligence and machine learning (AI/ML) algorithms are increasingly developed in health care for diagnosis and treatment of a variety of medical conditions (1). However, despite the technical prowess of such systems, their adoption has been challenging, and whether and how much they will actually improve health care remains to be seen. A central reason for this is that the effectiveness of AI/ML-based medical devices depends largely on the behavioral characteristics of its users, who, for example, are often vulnerable to well-documented biases or algorithmic aversion (2). Many stakeholders increasingly identify the so-called black-box nature of predictive algorithms as the core source of users’ skepticism, lack of trust, and slow uptake (3, 4). As a result, lawmakers have been moving in the direction of requiring the availability of explanations for black-box algorithmic decisions (5). Indeed, a near-consensus is emerging in favor of explainable AI/ML among academics, governments, and civil society groups. Many are drawn to this approach to harness the accuracy benefits of noninterpretable AI/ML such as deep learning or neural nets while also supporting transparency, trust, and adoption. We argue that this consensus, at least as applied to health care, both overstates the benefits and undercounts the drawbacks of requiring black-box algorithms to be explainable.

Source: https://science.sciencemag.org/content/373/6552/284?_ga=2.166262518.995809660.1627762475-1953442883.1627762475

Types of AI/ML Algorithms: Explainable and Interpretable algorithms

  1. Interpretable AI: A typical AI/ML task requires constructing an algorithm that takes vector inputs and generates an output related to an outcome (like diagnosing a cardiac event from an image).  Generally the algorithm has to be trained on past data with known parameters.  When an algorithm is called interpretable, this means that it uses a transparent or “white box” function that is easily understandable.  An example might be a linear function for determining relationships, where the parameters are simple rather than complex.  Although interpretable algorithms may not be as accurate as the more complex explainable AI/ML algorithms, they are open, transparent, and easily understood by their operators.
  2. Explainable AI/ML:  This type of algorithm depends upon multiple complex parameters: a first round of predictions is taken from a “black box” model, and a second algorithm based on an interpretable function is then used to better approximate the outputs of the first model.  The second algorithm is trained not on the original data but on the predictions of the first model, resembling multiple iterations of computing.  This method is therefore more accurate, or deemed more reliable in prediction, but it is very complex and not easily understandable.  Many medical devices that use an AI/ML algorithm use this type; deep learning and neural networks are examples.  (A code sketch contrasting the two approaches follows below.)

The purpose of both these methodologies is to deal with the problem of opacity: AI predictions coming from a black box can undermine trust in the AI.
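To make the contrast concrete, below is a minimal, hypothetical sketch in Python using scikit-learn; the dataset, model choices, and parameters are illustrative assumptions and are not drawn from the Science article. The first model is a transparent linear function whose coefficients can be read directly (interpretable); the second pairs an opaque random forest with an interpretable surrogate trained on the forest’s predictions rather than on the original labels (explainable).

# A minimal, illustrative sketch of interpretable vs. explainable AI/ML.
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 1. Interpretable ("white box"): a linear function whose parameters are
#    simple and directly readable by the operator.
white_box = LinearRegression().fit(X_train, y_train)
print("Readable coefficients:", white_box.coef_)

# 2. Explainable: an opaque "black box" model makes the predictions...
black_box = RandomForestRegressor(n_estimators=200, random_state=0)
black_box.fit(X_train, y_train)

# ...and a second, interpretable surrogate is then trained not on the
# original labels but on the black box's predictions, approximating it.
surrogate = LinearRegression().fit(X_train, black_box.predict(X_train))
print("Surrogate (post-hoc explanation) coefficients:", surrogate.coef_)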

For a deeper understanding of these two types of algorithms see here:

https://www.kdnuggets.com/2018/12/machine-learning-explainability-interpretability-ai.html

or https://www.bmc.com/blogs/machine-learning-interpretability-vs-explainability/

(a longer read but great explanation)

From the above blog post by Jonathan Johnson

  • How interpretability is different from explainability
  • Why a model might need to be interpretable and/or explainable
  • Who is working to solve the black box problem—and how

What is interpretability?

Does Chipotle make your stomach hurt? Does loud noise accelerate hearing loss? Are women less aggressive than men? If a machine learning model can create a definition around these relationships, it is interpretable.

All models must start with a hypothesis. Human curiosity propels a being to intuit that one thing relates to another. “Hmm…multiple black people shot by policemen…seemingly out of proportion to other races…something might be systemic?” Explore.

People create internal models to interpret their surroundings. In the field of machine learning, these models can be tested and verified as either accurate or inaccurate representations of the world.

Interpretability means that the cause and effect can be determined.

What is explainability?

ML models are often called black-box models because they allow a pre-set number of empty parameters, or nodes, to be assigned values by the machine learning algorithm. Specifically, the back-propagation step is responsible for updating the weights based on its error function.

To predict when a person might die—the fun gamble one might play when calculating a life insurance premium, and the strange bet a person makes against their own life when purchasing a life insurance package—a model will take in its inputs, and output a percent chance the given person has at living to age 80.

Below is an image of a neural network. The inputs are the yellow; the outputs are the orange. Like a rubric to an overall grade, explainability shows how significant each of the parameters, all the blue nodes, contribute to the final decision.

In this neural network, the hidden layers (the two columns of blue dots) would be the black box.

For example, we have these data inputs:

  • Age
  • BMI score
  • Number of years spent smoking
  • Career category

If this model had high explainability, we’d be able to say, for instance:

  • The career category is about 40% important
  • The number of years spent smoking weighs in at 35% important
  • The age is 15% important
  • The BMI score is 10% important
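As a hypothetical illustration of how such percentages might be produced (the feature names, data, and model here are invented for the example and are not from the original post), a tree-based model in scikit-learn exposes normalized feature importances that read directly as “percent important”:

# Hypothetical sketch: deriving "percent important" weights per feature.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
features = ["age", "bmi_score", "years_smoking", "career_category"]
X = rng.normal(size=(500, 4))
y = (X[:, 2] + 0.5 * X[:, 3] > 0).astype(int)  # toy "lives to 80" label

model = RandomForestClassifier(random_state=0).fit(X, y)

# feature_importances_ is non-negative and sums to 1, so each value can be
# read as a percentage contribution to the model's decisions.
for name, weight in sorted(zip(features, model.feature_importances_),
                           key=lambda pair: -pair[1]):
    print(f"{name}: {weight:.0%} important")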

Explainability: important, not always necessary

Explainability becomes significant in the field of machine learning because, often, it is not apparent. Explainability is often unnecessary. A machine learning engineer can build a model without ever having considered the model’s explainability. It is an extra step in the building process—like wearing a seat belt while driving a car. It is unnecessary for the car to perform, but offers insurance when things crash.

The benefit a deep neural net offers to engineers is it creates a black box of parameters, like fake additional data points, that allow a model to base its decisions against. These fake data points go unknown to the engineer. The black box, or hidden layers, allow a model to make associations among the given data points to predict better results. For example, if we are deciding how long someone might have to live, and we use career data as an input, it is possible the model sorts the careers into high- and low-risk career options all on its own.

Perhaps we inspect a node and see it relates oil rig workers, underwater welders, and boat cooks to each other. It is possible the neural net makes connections between the lifespan of these individuals and puts a placeholder in the deep net to associate these. If we were to examine the individual nodes in the black box, we could note this clustering interprets water careers to be a high-risk job.

In the previous chart, each one of the lines connecting from the yellow dot to the blue dot can represent a signal, weighing the importance of that node in determining the overall score of the output.

  • If that signal is high, that node is significant to the model’s overall performance.
  • If that signal is low, the node is insignificant.

With this understanding, we can define explainability as:

Knowledge of what one node represents and how important it is to the model’s performance.
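Under that definition, one can at least inspect the raw connection weights of a small network. The sketch below is an assumption for illustration, using a simple scikit-learn network rather than the large deep nets discussed above; it reads the signal strength from each input node to each node of the first hidden layer:

# Hypothetical sketch: reading node-to-node signal weights in a small network.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))            # the yellow input nodes
y = (X[:, 0] - X[:, 1] > 0).astype(int)  # toy target (the orange output)

net = MLPClassifier(hidden_layer_sizes=(5, 5), max_iter=2000,
                    random_state=0).fit(X, y)

# net.coefs_[0] holds the weights from each input to each node in the first
# hidden layer; a large absolute weight means a strong signal from that input.
signal = np.abs(net.coefs_[0])
print("Strongest input feeding each hidden node:", signal.argmax(axis=0))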

So how does the choice between these two types of algorithms make a difference with respect to health care and medical decision making?

The authors argue: 

“Regulators like the FDA should focus on those aspects of the AI/ML system that directly bear on its safety and effectiveness – in particular, how does it perform in the hands of its intended users?”

The authors suggest:

  • Enhanced more involved clinical trials
  • Provide individuals added flexibility when interacting with a model, for example inputting their own test data
  • More interaction between user and model generators
  • Determining which situations call for interpretable AI versus explainable AI (for instance, predicting which patients will require dialysis after kidney damage)

Other articles on AI/ML in medicine and healthcare on this Open Access Journal include

Applying AI to Improve Interpretation of Medical Imaging

Real Time Coverage @BIOConvention #BIO2019: Machine Learning and Artificial Intelligence #AI: Realizing Precision Medicine One Patient at a Time

LIVE Day Three – World Medical Innovation Forum ARTIFICIAL INTELLIGENCE, Boston, MA USA, Monday, April 10, 2019

Cardiac MRI Imaging Breakthrough: The First AI-assisted Cardiac MRI Scan Solution, HeartVista Receives FDA 510(k) Clearance for One Click™ Cardiac MRI Package

 

Read Full Post »

Reporter: Stephen J. Williams, PhD

In an announcement televised on C-SPAN, President-elect Joseph Biden announced his new science team to advise on science policy matters, including leadership of the Office of Science and Technology Policy and the President’s Council of Advisors on Science and Technology. Below is a video clip and the transcript, also available at

https://www.c-span.org/video/?508044-1/president-elect-biden-introduces-white-house-science-team

 

 

COMING UP TONIGHT ON C-SPAN, NEXT, PRESIDENT-ELECT JOE BIDEN AND VICE PRESIDENT-ELECT KAMALA HARRIS ANNOUNCE SEVERAL MEMBERS OF THEIR WHITE HOUSE SCIENCE TEAM. AND THEN SENATE MINORITY LEADER CHUCK SCHUMER TALKS ABOUT THE IMPEACHMENT OF PRESIDENT TRUMP IN THE WEEKLY DEMOCRATIC ADDRESS. AND AFTER THAT, TODAY’S SPEECH BY VICE PRESIDENT MIKE PENCE TO SAILORS AT NAVAL AIR STATION LAMORE IN CALIFORNIA. NEXT, PRESIDENT-ELECT JOE BIDEN AND VICE PRESIDENT-ELECT KAMALA HARRIS ANNOUNCE SEVERAL MEMBERS OF THEIR WHITE HOUSE SCIENCE TEAM. FROM WILMINGTON, DELAWARE, THIS IS ABOUT 40 MINUTES. PRESIDENT-ELECT BIDEN: GOOD AFTERNOON, FOLKS. I WAS TELLING THESE FOUR BRILLIANT SCIENTISTS AS I STOOD IN THE BACK, IN A WAY, THEY — THIS IS THE MOST EXCITING ANNOUNCEMENT THAT I’VE GOTTEN TO MAKE IN THE ENTIRE CABINET RAISED TO A CABINET LEVEL POSITION IN ONE CASE. THESE ARE AMONG THE BRIGHTEST MOST DEDICATED PEOPLE NOT ONLY IN THE COUNTRY BUT THE WORLD. THEY’RE COMPOSED OF SOME OF THE MOST SCIENTIFIC BRILLIANT MINDS IN THE WORLD. WHEN I WAS VICE PRESIDENT AS — I I HAD INTENSE INTEREST IN EVERYTHING THEY WERE DOING AND I PAID ENORMOUS ATTENTION. AND I WOULD — LIKE A KID GOING BACK TO SCHOOL. SIT DOWN AND CAN YOU EXPLAIN TO ME AND THEY WERE — VERY PATIENT WITH ME. AND — BUT AS PRESIDENT, I WANTED YOU TO KNOW I’M GOING TO PAY A GREAT DEAL OF ATTENTION. WHEN I TRAVEL THE WORLD AS VICE PRESIDENT, I WAS OFTEN ASKED TO EXPLAIN TO WORLD LEADERS, THEY ASKED ME THINGS LIKE DEFINE AMERICA. TELL ME HOW CAN YOU DEFINE AMERICA? WHAT’S AMERICA? AND I WAS ON A TIBETAN PLATEAU WITH AT THE TIME WITH XI ZIN PING AND WE HAD AN INTERPRETER CAN I DEFINE AMERICA FOR HIM? I SAID YES, I CAN. IN ONE WORD. POSSIBILITIES. POSSIBILITIES. I THINK IT’S ONE OF THE REASONS WHY WE’VE OCCASIONALLY BEEN REFERRED TO AS UGLY AMERICANS. WE THINK ANYTHING’S POSSIBLE GIVEN THE CHANCE, WE CAN DO ANYTHING. AND THAT’S PART OF I THINK THE AMERICAN SPIRIT. AND WHAT THE PEOPLE ON THIS STAGE AND THE DEPARTMENTS THEY WILL LEAD REPRESENT ENORMOUS POSSIBILITIES. THEY’RE THE ONES ASKING THE MOST AMERICAN OF QUESTIONS, WHAT NEXT? WHAT NEXT? NEVER SATISFIED, WHAT’S NEXT? AND WHAT’S NEXT IS BIG AND BREATHTAKING. HOW CAN — HOW CAN WE MAKE THE IMPOSSIBLE POSSIBLE? AND THEY WERE JUST ASKING QUESTIONS FOR THE SAKE OF QUESTIONS, THEY’RE ASKING THESE QUESTIONS AS CALL TO ACTION. , TO INSPIRE, TO HELP US IMAGINE THE FUTURE AND FIGURE OUT HOW TO MAKE IT REAL AND IMPROVE THE LIVES OF THE AMERICAN PEOPLE AND PEOPLE AROUND THE WORLD. THIS IS A TEAM THAT ASKED US TO IMAGINE EVERY HOME IN AMERICA BEING POWERED BY RENEWABLE ENERGY WITHIN THE NEXT 10 YEARS. OR 3-D IMAGE PRINTERS RESTORING TISSUE AFTER TRAUMATIC INJURIES AND HOSPITALS PRINTING ORGANS FOR ORGAN TRANSPLANTS. IMAGINE, IMAGINE. AND THEY REALLY — AND, YOU KNOW, THEN RALLY, THE SCIENTIFIC COMMUNITY TO GO ABOUT DOING WHAT WE’RE IMAGINING. YOU NEED SCIENCE, DATA AND DISCOVERY WAS A GOVERNING PHILOSOPHY IN THE OBAMA-BIDEN ADMINISTRATION. AND EVERYTHING FROM THE ECONOMY TO THE ENVIRONMENT TO CRIMINAL JUSTICE REFORM AND TO NATIONAL SECURITY. AND ON HEALTH CARE. FOR EXAMPLE, A BELIEF IN SCIENCE LED OUR EFFORTS TO MAP THE HUMAN BRAIN AND TO DEVELOP MORE PRECISE INDIVIDUALIZED MEDICINES. IT LED TO OUR ONGOING MISSION TO END CANCER AS WE KNOW IT, SOMETHING THAT IS DEEPLY PERSONAL TO BOTH MY FAMILY AND KAMALA’S FAMILY AND COUNTLESS FAMILIES IN AMERICA. WHEN PRESIDENT OBAMA ASKED ME TO LEAD THE CANCER MOON SHOT, I KNEW WE HAD TO INJECT A SENSE OF URGENCY INTO THE FIGHT. 
WE BELIEVED WE COULD DOUBLE THE RATE OF PROGRESS AND DO IN FIVE YEARS WHAT OTHERWISE WOULD TAKE 10. MY WIFE, JILL, AND I TRAVELED AROUND THE COUNTRY AND THE WORLD MEETING WITH THOUSANDS OF CANCER PATIENTS AND THEIR FAMILIES, PHYSICIANS, RESEARCHERS, PHILANTHROPISTS, TECHNOLOGY LEADERS AND HEADS OF STATE. WE SOUGHT TO BETTER UNDERSTAND AND BREAK DOWN THE SILOS AND STOVE PIPES THAT PREVENT THE SHARING OF INFORMATION AND IMPEDE ADVANCES IN CANCER RESEARCH AND TREATMENT WHILE BUILDING A FOCUSED AND COORDINATED EFFORT HERE AT HOME AND ABROAD. WE MADE PROGRESS. BUT THERE’S SO MUCH MORE THAT WE CAN DO. WHEN I ANNOUNCED THAT I WOULD NOT RUN IN 2015 AT THE TIME, I SAID I ONLY HAD ONE REGRET IN THE ROSE GARDEN AND IF I HAD ANY REGRETS THAT I HAD WON, THAT I WOULDN’T GET TO BE THE PRESIDENT TO PRESIDE OVER CANCER AS WE KNOW IT. WELL, AS GOD WILLING, AND ON THE 20TH OF THIS MONTH IN A COUPLE OF DAYS AS PRESIDENT I’M GOING TO DO EVERYTHING I CAN TO GET THAT DONE. I’M GOING TO — GOING TO BE A PRIORITY FOR ME AND FOR KAMALA AND IT’S A SIGNATURE ISSUE FOR JILL AS FIRST LADY. WE KNOW THE SCIENCE IS DISCOVERY AND NOT FICTION. AND IT’S ALSO ABOUT HOPE. AND THAT’S AMERICA. IT’S IN THE D.N.A. OF THIS COUNTRY, HOPE. WE’RE ON THE CUSP OF SOME OF THE MOST REMARKABLE BREAKTHROUGHS THAT WILL FUNDAMENTALLY CHANGE THE WAY OF LIFE FOR ALL LIFE ON THIS PLANET. WE CAN MAKE MORE PROGRESS IN THE NEXT 10 YEARS, I PREDICT, THAN WE’VE MADE IN THE LAST 50 YEARS. AND EXPONENTIAL MOVEMENT. WE CAN ALSO FACE SOME OF THE MOST DIRE CRISES IN A GENERATION WHERE SCIENCE IS CRITICAL TO WHETHER OR NOT WE MEET THE MOMENT OF PERIL AND PROMISE THAT WE KNOW IS WITHIN OUR REACH. IN 1944, FRANKLIN ROOSEVELT ASKED HIS SCIENCE ADVISOR HOW COULD THE UNITED STATES FURTHER ADVANCE SCIENTIFIC RESEARCH IN THE CRITICAL YEARS FOLLOWING THE SECOND WORLD WAR? THE RESPONSE LED TO SOME OF THE MOST GROUND BREAKING DISCOVERIES IN THE LAST 75 YEARS. AND WE CAN DO THAT AGAIN. AND WE CAN DO MORE. SO TODAY, I’M PROUD TO ANNOUNCE A TEAM OF SOME OF THE COUNTRY’S MOST BRILLIANT AND ACCOMPLISHED SCIENTISTS TO LEAD THE WAY. AND I’M ASKING THEM TO FOCUS ON FIVE KEY AREAS. FIRST THE PANDEMIC AND WHAT WE CAN LEARN ABOUT WHAT IS POSSIBLE OR WHAT SHOULD BE POSSIBLE TO ADDRESS THE WIDEST RANGE OF PUBLIC HEALTH NEEDS. SECONDLY, THE ECONOMY, HOW CAN WE BUILD BACK BETTER TO ENSURE PROSPERITY IS FULLY SHARED ALL ACROSS AMERICA? AMONG ALL AMERICANS? AND THIRDLY, HOW SCIENCE HELPS US CONFRONT THIS CLIMATE CRISIS WE FACE IN AMERICA AND THE WORLD BUT IN AMERICA HOW IT HELPS US CONFRONT THE CLIMATE CRISIS WITH AMERICAN JOBS AND INGENUITY. AND FOURTH, HOW CAN WE ENSURE THE UNITED STATES LEADS THE WORLD IN TECHNOLOGIES AND THE INDUSTRIES THAT THE FUTURE THAT WILL BE CRITICAL FOR OUR ECONOMIC PROSPERITY AND NATIONAL SECURITY? ESPECIALLY WITH THE INTENSE INCREASED COMPETITION AROUND THE WORLD FROM CHINA ON? AND FIFTH, HOW CAN WE ASSURE THE LONG-TERM HEALTH AND TRUST IN SCIENCE AND TECHNOLOGY IN OUR NATION? YOU KNOW, THESE ARE EACH QUESTIONS THAT CALL FOR ACTION. AND I’M HONORED TO ANNOUNCE A TEAM THAT IS ANSWERING THE CALL TO SERVE. AS THE PRESIDENTIAL SCIENCE ADVISOR AND DIRECTOR OF THE OFFICE OF SCIENCE AND TECHNOLOGY POLICY, I NOMINATE ONE OF THE MOST BRILLIANT GUYS I KNOW, PERSONS I KNOW, DR. ERIC LANDER. AND THANK YOU, DOC, FOR COMING BACK. THE PIONEER — HE’S A PIONEER IN THE STIFFING COMMUNITY. PRINCIPAL LEADER IN THE HUMAN GENOME PROJECT. AND NOT HYPERBOLE TO SUGGEST THAT DR. LANDER’S WORK HAS CHANGED THE COURSE OF HUMAN HISTORY. 
HIS ROLE IN HELPING US MAP THE GENOME PULLED BACK THE CURTAIN ON HUMAN DISEASE, ALLOWING SCIENTISTS, EVER SINCE, AND FOR GENERATIONS TO COME TO EXPLORE THE MOLECULAR BASIS FOR SOME OF THE MOST DEVASTATING ILLNESSES AFFECTING OUR WORLD. AND THE APPLICATION OF HIS PIONEERING WORK AS — ARE POISED TO LEAD TO INCREDIBLE CURES AND BREAKTHROUGHS IN THE YEARS TO COME. DR. LANDER NOW SERVES AS THE PRESIDENT AND FOUNDING DIRECTOR OF THE BRODE INSTITUTE AT M.I.T. AND HARVARD, THE WORLD’S FOREMOST NONPROFIT GENETIC RESEARCH ORGANIZATION. AND I CAME TO APPRECIATE DR. LANDER’S EXTRAORDINARY MIND WHEN HE SERVED AS THE CO-CHAIR OF THE PRESIDENT’S COUNCIL ON ADVISORS AND SCIENCE AND TECHNOLOGY DURING THE OBAMA-BIDEN ADMINISTRATION. AND I’M GRATEFUL, I’M GRATEFUL THAT WE CAN WORK TOGETHER AGAIN. I’VE ALWAYS SAID THAT BIDEN-HARRIS ADMINISTRATION WILL ALSO LEAD AND WE’RE GOING TO LEAD WITH SCIENCE AND TRUTH. WE BELIEVE IN BOTH. [LAUGHTER] GOD WILLING OVERCOME THE PANDEMIC AND BUILD OUR COUNTRY BETTER THAN IT WAS BEFORE. AND THAT’S WHY FOR THE FIRST TIME IN HISTORY, I’M GOING TO BE ELEVATING THE PRESIDENTIAL SCIENCE ADVISOR TO A CABINET RANK BECAUSE WE THINK IT’S THAT IMPORTANT. AS DEPUTY DIRECTOR OF THE OFFICE OF SCIENCE AND TECHNOLOGY POLICY AND SCIENCE AND — SCIENCE AND SOCIETY, I APPOINT DR. NELSON. SHE’S A PROFESSOR AT THE INSTITUTE OF ADVANCED STUDIES AT PRINCETON UNIVERSITY. THE PRESIDENT OF THE SOCIAL SCIENCE RESEARCH COUNCIL. AND ONE OF AMERICA’S LEADING SCHOLARS IN THE — AN AWARD-WINNING AUTHOR AND RESEARCHER AND EXPLORING THE CONNECTIONS BETWEEN SCIENCE AND OUR SOCIETY. THE DAUGHTER OF A MILITARY FAMILY, HER DAD SERVED IN THE UNITED STATES NAVY AND HER MOM WAS AN ARMY CRIPPING TO RAFFER. DR. NELSON DEVELOPED A LOVE OF TECHNOLOGY AT A VERY YOUNG AGE PARTICULARLY WITH THE EARLY COMPUTER PRODUCTS. COMPUTING PRODUCTS AND CODE-BREAKING EQUIPMENT THAT EVERY KID HAS AROUND THEIR HOUSE. AND SHE GREW UP WITHIN HER HOME. WHEN I WROTE THAT DOWN, I THOUGHT TO MYSELF, I MEAN, HOW MANY KIDS — ANY WAY, THAT PASSION WAS A PASSION FORGED A LIFELONG CURIOSITY ABOUT THE INEQUITIES AND THE POWER DIAMONDICS THAT SIT BENEATH THE SURFACE OF SCIENTIFIC RESEARCH AND THE TECHNOLOGY WE BUILD. DR. NELSON IS FOCUSED ON THOSE INSIGHTS. AND THE SCIENCE, TECHNOLOGY AND SOCIETY, LIKE FEW BEFORE HER EVER HAVE IN AMERICAN HISTORY. BREAKING NEW GROUND ON OUR UNDERSTANDING OF THE ROLE SCIENCE PLAYS IN AMERICAN LIFE AND OPENING THE DOOR TO — TO A FUTURE WHICH SCIENCE BETTER SERVES ALL PEOPLE. AS CO-CHAIR OF THE PRESIDENT’S COUNCIL ON ADVISORS OF SCIENCE AND TECHNOLOGY,APPOINT DR. FRANCIS ARNOLD, DIRECTOR OF THE ROSE BIOENGINEERING CENTER AT CALTECH AND ONE OF THE WORLD’S LEADING EXPERTS IN PROTEIN ENGINEERING, A LIFE-LONG CHAMPION OF RENEWABLE ENERGY SOLUTIONS WHO HAS BEEN INDUCTED INTO THE NATIONAL INVENTORS’ HALL OF FAME. THAT AIN’T A BAD PLACE TO BE. NOT ONLY IS SHE THE FIRST WOMAN TO BE ELECTED TO ALL THREE NATIONAL ACADEMIES OF SCIENCE, MEDICINE AND ENGINEERING AND ALSO THE FIRST WOMAN, AMERICAN WOMAN, TO WIN A NOBEL PRIZE IN CHEMISTRY. A VERY SLOW LEARNER, SLOW STARTER, THE DAUGHTER OF PITTSBURGH, SHE WORKED AS A CAB DRIVER, A JAZZ CLUB SERVER, BEFORE MAKING HER WAY TO BERKELEY AND A CAREER ON THE LEADING EDGE OF HUMAN DISCOVERY. AND I WANT TO MAKE THAT POINT AGAIN. I WANT — IF ANY OF YOUR CHILDREN ARE WATCHING, LET THEM KNOW YOU CAN DO ANYTHING. THIS COUNTRY CAN DO ANYTHING. ANYTHING AT ALL. 
AND SO SHE SURVIVED BREAST CANCER, OVERCAME A TRAGIC LOSS IN HER FAMILY WHILE RISING TO THE TOP OF HER FIELD, STILL OVERWHELMINGLY DOMINATED BY MEN. HER PASSION HAS BEEN A STEADFAST COMMITMENT TO RENEWABLE ENERGY FOR THE BETTERMENT OF OUR PLANET AND HUMANKIND. SHE IS AN INSPIRING FIGURE TO SCIENTISTS ACROSS THE FIELD AND ACROSS NATIONS. AND I WANT TO THANK DR. ARNOLD FOR AGREEING TO CO-CHAIR A FIRST ALL WOMAN TEAM TO LEAD THE PRESIDENT’S COUNCIL OF ADVISORS ON SCIENCE AND TECHNOLOGY WHICH LEADS ME TO THE NEXT MEMBER OF THE TEAM. AS CO-CHAIR, THE PRESIDENT’S COUNCIL OF ADVISORS ON SCIENCE AND TECHNOLOGY, I APPOINT DR. MARIE ZUBER. A TRAIL BLAZER BRAISING GEO PHYSICIST AND PLANETARY SCIENTIST A. FORMER CHAIR OF THE NATIONAL SCIENCE BOARD. FIRST WOMAN TO LEAD THE SCIENCE DEPARTMENT AT M.I.T. AND THE FIRST WOMAN TO LEAD NASA’S ROBOTIC PLANETARY MISSION. GROWING UP IN COLE COUNTRY NOT FAR FROM HEAVEN, SCRANTON, PENNSYLVANIA, IN CARBON COUNTY, PENNSYLVANIA, ABOUT 50 MILES SOUTH OF WHERE I WAS A KID, SHE DREAMED OF EXPLORING OUTER SPACE. COULD HAVE TOLD HER SHE WOULD JUST GO TO GREEN REACH IN SCRANTON AND FIND WHERE IT WAS. AND I SHOULDN’T BE SO FLIPPANT. BUT I’M SO EXCITED ABOUT THESE FOLKS. YOU KNOW, READING EVERY BOOK SHE COULD FIND AND LISTENING TO HER MOM’S STORIES ABOUT WATCHING THE EARLIEST ROCKET LAUNCH ON TELEVISION, MARIE BECAME THE FIRST PERSON IN HER FAMILY TO GO TO COLLEGE AND NEVER LET GO OF HER DREAM. TODAY SHE OVERSEES THE LINCOLN LABORATORY AT M.I.T. AND LEADS THE INSTITUTION’S CLIMATE ACTION PLAN. GROWING UP IN COLD COUNTRY, NOT AND FINALLY, COULD NOT BE HERE TODAY, BUT I’M PLEASED TO ANNOUNCE THAT I’VE HAD A LONG CONVERSATION WITH DR. FRANCIS COLLINS AND COULD NOT BE HERE TODAY. AND I’VE ASKED THEM TO STAY ON AS DIRECTOR OF THE INSTITUTE OF HEALTH AND — AT THIS CRITICAL MOMENT. I’VE KNOWN DR. COLLINS FOR MANY YEARS. I WORKED WITH HIM CLOSELY. HE’S BRILLIANT. A PIONEER. A TRUE LEADER. AND ABOVE ALL, HE’S A MODEL OF PUBLIC SERVICE AND I’M HONORED TO BE WORKING WITH HIM AGAIN. AND IT IS — IN HIS ABSENCE I WANT TO THANK HIM AGAIN FOR BEING WILLING TO STAY ON. I KNOW THAT WASN’T HIS ORIGINAL PLAN. BUT WE WORKED AN AWFUL LOT ON THE MOON SHOT AND DEALING WITH CANCER AND I JUST WANT TO THANK HIM AGAIN. AND TO EACH OF YOU AND YOUR FAMILIES, AND I SAY YOUR FAMILIES, THANK YOU FOR THE WILLINGNESS TO SERVE. AND NOT THAT YOU HAVEN’T BEEN SERVING ALREADY BUT TO SERVE IN THE ADMINISTRATION. AND THE AMERICAN PEOPLE, TO ALL THE AMERICAN PEOPLE, THIS IS A TEAM THAT’S GOING TO HELP RESTORE YOUR FAITH IN AMERICA’S PLACE IN THE FRONTIER OF SCIENCE AND DISCOVER AND HOPE. I’M NOW GOING TO TURN THIS OVER STARTING WITH DR. LANDER, TO EACH OF OUR NOMINEES AND THEN WITH — HEAR FROM THE VICE PRESIDENT. BUT AGAIN, JUST CAN’T THANK YOU ENOUGH AND I REALLY MEAN IT. THANK YOU, THANK YOU, THANK YOU FOR WILLING TO DO THIS. DOCTOR, IT’S ALL YOURS. I BETTER PUT MY MASK ON OR I’M GOING TO GET IN TROUBLE.

 


Read Full Post »

US Responses to Coronavirus Outbreak Expose Many Flaws in Our Medical System

Curator: Stephen J. Williams, Ph.D.

The coronavirus pandemic has affected almost every country on every continent; however, after months of novel COVID-19 cases, it has become apparent that the varied clinical responses (and outcomes) in this epidemic have laid bare some of the strong and weak aspects of both our worldwide capability to mount a globally coordinated response to infectious outbreaks and individual countries’ responses to their localized epidemics.

 

Some nations, like Israel, have initiated a coordinated, system-wide action plan spanning government and private health sectors and have shown success in limiting both new cases and COVID-19-related deaths.  After the initial outbreak in Wuhan, China, the Chinese government closed borders and initiated health-related procedures, including the building of new hospitals. As of this writing, Wuhan has experienced no new cases of COVID-19 for two straight days.

 

However, the response in the US has been perplexing and has highlighted some glaring problems that have been magnified in this crisis.  In my view, formulated after discussions with members in the field, these issues center on three major areas of deficiency in the United States that have hindered a rapid and successful response to this current crisis and to potential future crises of this nature.

 

 

  1. The mistrust or misunderstanding of science in the United States
  2. Lack of communication and connection between patients and those involved in the healthcare industry
  3. Socio-geographical inequalities within the US healthcare system

 

1. The mistrust or misunderstanding of science in the United States

 

For the past decade, anyone involved in science, whether directly as an active bench scientist, a regulatory scientist, a scientist involved in science and health policy, or an environmental scientist, can attest to the constant pressure not only to defend their profession but also to defend the entire scientific process and community from an onslaught of misinformation, mistrust, and anxiety toward the field of science.  This can be seen in many editorials in scientific publications, including the journal Science and Scientific American (as shown below).

 

Stepping Away from Microscopes, Thousands Protest War on Science

Boston rally coincides with annual American Association for the Advancement of Science (AAAS) conference and is a precursor to the March for Science in Washington, D.C.

by Lauren McCauley, staff writer

Responding to the troubling suppression of science under the Trump administration, thousands of scientists, allies, and frontline communities are holding a rally in Boston’s Copley Square on Sunday.


 

“Science serves the common good,” reads the call to action. “It protects the health of our communities, the safety of our families, the education of our children, the foundation of our economy and jobs, and the future we all want to live in and preserve for coming generations.”

It continues: 

But it’s under attack—both science itself, and the unalienable rights that scientists help uphold and protect. 

From the muzzling of scientists and government agencies, to the immigration ban, the deletion of scientific data, and the de-funding of public science, the erosion of our institutions of science is a dangerous direction for our country. Real people and communities bear the brunt of these actions.

The rally was planned to coincide with the annual American Association for the Advancement of Science (AAAS) conference, which draws thousands of science professionals, and is a precursor to the March for Science in Washington, D.C. and in cities around the world on April 22.

 

Source: https://www.commondreams.org/news/2017/02/19/stepping-away-microscopes-thousands-protest-war-science

https://images.app.goo.gl/UXizCsX4g5wZjVtz9

 

https://www.washingtonpost.com/video/c/embed/85438fbe-278d-11e7-928e-3624539060e8

 

 

The American Association for Cancer Research (AACR) also had marches for public awareness of science and meaningful science policy at their annual conference in Washington, D.C. in 2017 (see here for free recordings of some talks including Joe Biden’s announcement of the Cancer Moonshot program) and also sponsored events such as the Rally for Medical Research.  This patient advocacy effort is led by the cancer clinicians and scientific researchers to rally public support for cancer research for the benefit of those affected by the disease.

Source: https://leadingdiscoveries.aacr.org/cancer-patients-front-and-center/

 

 

However, some feel that scientists are being too sensitive and that science policy and science-based decision making may not be under that much of a threat in this country. Yet even those who think there is no actual war on science and scientists acknowledge that the public is not engaged in science and may not be sympathetic to the scientific process or trust scientists’ opinions. 

 

   

From Scientific American: Is There Really a War on Science? People who oppose vaccines, GMOs and climate change evidence may be more anxious than antagonistic

 

Certainly, opponents of genetically modified crops, vaccinations that are required for children and climate science have become louder and more organized in recent times. But opponents typically live in separate camps and protest single issues, not science as a whole, said science historian and philosopher Roberta Millstein of the University of California, Davis. She spoke at a standing-room only panel session at the American Association for the Advancement of Science’s annual meeting, held in Washington, D.C. All the speakers advocated for a scientifically informed citizenry and public policy, and most discouraged broadly applied battle-themed rhetoric.

 

Source: https://www.scientificamerican.com/article/is-there-really-a-war-on-science/

 

In general, there appears to be a major public misunderstanding of the scientific process and the principles of scientific discovery, which may be the fault of miscommunication by scientists or of agendas that aim to subvert or misdirect public policy decisions away from scientific discourse and investigation.

 

This can lead to an information vacuum which, in this age of rapid social media communication, can quickly perpetuate misinformation.

 

This perpetuation of misinformation was very evident in a Twitter discussion with Dr. Eric Topol, M.D. (cardiologist and Founder and Director of the Scripps Research Translational Institute) on the US President’s tweet promoting the antimalarial drug hydroxychloroquine, in which President Trump referenced a single study in the International Journal of Antimicrobial Agents.  The Twitter thread became a sort of “scientific journal club,” with input from international scientists discussing and critiquing the results in the paper.  

 

Please note that when we scientists CRITIQUE a paper it does not mean we CRITICIZE it.  A critique is merely an in-depth analysis of the results and conclusions with an open discussion on the paper.  This is part of the normal peer review process.

 

Below is the original Tweet by Dr. Eric Topol as well as the ensuing tweet thread

 

https://twitter.com/EricTopol/status/1241442247133900801?s=20

 

Within the tweet thread, some of the limitations and study-design flaws of the referenced paper were discussed, leading the scientists in this impromptu discussion to conclude that the study could not reasonably establish hydroxychloroquine as a reliable therapeutic for this coronavirus strain.

 

The lesson: The public has to realize CRITIQUE does not mean CRITICISM.

 

Scientific discourse has to occur to allow for the proper critique of results.  When this is allowed, science becomes better and more robust, and we protect ourselves from heading down an incorrect path, which in this case could have major impacts on clinical outcomes.

 

 

2.  Lack of communication and connection between patients and those involved in the healthcare industry

 

In normal times, it is imperative for the patient-physician relationship to be intact in order for the physician to be able to communicate proper information to their patient during and after therapy/care.  In these critical times, this relationship and good communication skills become even more important.

 

Recently, I have had multiple communications, through Twitter, Facebook, and other social media outlets, with cancer patients, cancer advocacy groups, and cancer survivorship forums concerning their risks of getting infected with the coronavirus and how they should handle various aspects of their therapy, whether they were currently undergoing therapy or just about to start chemotherapy.  This made me realize that there was a huge subset of patients who were not receiving all the information and support they needed, namely patients who are immunocompromised.

 

These patients include

  1. cancer patients undergoing or about to start chemotherapy
  2. patients taking immunosuppressive drugs: organ transplant recipients, patients with autoimmune diseases, multiple sclerosis patients
  3. patients with immunodeficiency disorders

 

These concerns prompted me to write a posting curating the guidance from National Cancer Institute (NCI) designated cancer centers to cancer patients concerning their risk to COVID19 (which can be found here).

 

Surprisingly, only 14 of the 51 US NCI-designated Cancer Centers had posted guidance (either their own or from organizations like the NCI or the National Comprehensive Cancer Network (NCCN)).  Most of the guidance to patients had stemmed from a paper written by Dr. Markham of the Fred Hutchinson Cancer Center in Seattle, Washington, the first major US city impacted by COVID-19.

 

I was also surprised at the reactions to this posting, with patients and oncologists enthusiastic to discuss concerns around the coronavirus problem.  This led to additional contact with patients and oncologists who, to my surprise, are not having these conversations with each other or are totally confused on courses of action during this pandemic.  There was a true need for both parties, patients/caregivers and physicians/oncologists, to be able to communicate with each other and disseminate good information.

 

Last night there was a Twitter conversation (#OTChat) sponsored by @OncologyTimes.  A few tweets are included below.

https://twitter.com/OncologyTimes/status/1242611841613864960?s=20

https://twitter.com/OncologyTimes/status/1242616756658753538?s=20

https://twitter.com/OncologyTimes/status/1242615906846547978?s=20

 

The lesson:  Rapid communication of vital information in times of stress is crucial to maintaining a good patient/physician relationship and preventing misinformation.

 

3.  Socio-geographical Inequalities in the US Healthcare System

It has become very clear that the US healthcare system is fragmented and that multiple inequalities (based on race, sex, geography, socio-economic status, and age) exist across the whole healthcare system.  These inequalities are exacerbated in times of stress, especially when access to care is limited.

 

An example:

 

On May 12, 2015, an Amtrak Northeast Regional train from Washington, D.C. bound for New York City derailed and wrecked on the Northeast Corridor in the Port Richmond neighborhood of Philadelphia, Pennsylvania. Of 238 passengers and 5 crew on board, 8 were killed and over 200 injured, 11 critically. The train was traveling at 102 mph (164 km/h) in a 50 mph (80 km/h) zone of curved tracks when it derailed.[3]

Some of the passengers had to be extricated from the wrecked cars. Many of the passengers and local residents helped first responders during the rescue operation. Five local hospitals treated the injured. The derailment disrupted train service for several days. 

(Source Wikipedia https://en.wikipedia.org/wiki/2015_Philadelphia_train_derailment)

What was not reported was the difficulty that first responders, namely paramedics, had in finding emergency rooms capable of taking on the massive load of patients.  In the years prior to this accident, several hospitals had, for monetary reasons, closed their emergency rooms or reduced them in size, and only two hospitals in Philadelphia were capable of accepting gunshot victims.  Temple University Hospital was the closest to the derailment and one of the emergency rooms that would accept gunshot victims; this was important because Temple’s ER, being in North Philadelphia, is usually very busy on any given night.  The stress to the local health system revealed how one disaster could easily overburden many hospitals.

 

Over the past decade many hospitals, especially rural hospitals, have been shuttered or consolidated into bigger health systems.  The graphic below shows this trend.

From Bloomberg: US Hospital Closings Leave Patients with Nowhere to Go

 

 

https://images.app.goo.gl/JdZ6UtaG3Ra3EA3J8

 

Note the huge swath of hospital closures in the Midwest, especially in rural areas.  This has become an ongoing problem as the healthcare system deals with rising costs.

 

Lesson:  An epidemic stresses an already stressed US healthcare system.

 

Please see our Coronavirus Portal at

https://pharmaceuticalintelligence.com/coronavirus-portal/

 

for more up-to-date scientific and clinical information, as well as personal stories, videos, interviews, and economic impact analyses,

and follow @pharma_BI

Read Full Post »

Real Time Coverage @BIOConvention #BIO2019: Genome Editing and Regulatory Harmonization: Progress and Challenges

Reporter: Stephen J Williams, PhD @StephenJWillia2

 

Genome editing offers the potential of new and effective treatments for genetic diseases. As companies work to develop these treatments, regulators are focused on ensuring that any such products meet applicable safety and efficacy requirements. This panel will discuss how European Union and United States regulators are approaching therapeutic use of genome editing, issues in harmonization between these two – and other – jurisdictions, challenges faced by industry as regulatory positions evolve, and steps that organizations and companies can take to facilitate approval and continued efforts at harmonization.

 

CBER:  Because of the nature of these gene therapies, which mainly target orphan diseases, there is expedited review.  Since the division was started in 2015, it has received over 1,500 applications.

Spark: Most of the issues were with the primary disease, not the gene therapy, so they had to develop new endpoint tests and held talks with the FDA before entering phase III.  There has been great collaboration with the FDA, and they have now partnered with Novartis to get approval outside the US.  Companies should be willing to partner with EU pharmas to expedite the regulatory process outside the US.  In China the process is new, and Brazil is behind on its gene therapy guidance.  There is also the new issue of repeat testing of the manufacturing process, as manufacturing of gene therapies had previously been small scale. He notes that expedited review is tough because you don’t have a lot of time to assemble data.  They were lucky that they had already done a randomized trial.

Sidley Austin:  In EU regulation, an application for an advanced therapy has no national option; the regulatory body convenes a committee to assess applicability, and the application then goes to a safety committee.  The EU has been quicker to approve these advanced therapies; twenty-five percent of its applications are gene therapies.  Companies are having issues with manufacturing, and problems can arise between discussions, preliminary applications, and the formalized final application.

Sarepta: They have a robust gene therapy program.  Their lead candidate is a therapy for DMD (Duchenne muscular dystrophy), in which affected males die by age 25. Japan and the EU have different regulatory applications; although they are similar and data can be transferred, the EU requires more paperwork.  The US uses an IND for application. Global feedback is very challenging: they have had multiple meetings around the world, and preparing a briefing package takes a long time, putting a strain on small biotechs.  No company wants to be just EU-centric or US-centric; they want to get to market as fast as possible.

 

Please follow LIVE on TWITTER using the following @ handles and # hashtags:

@Handles

@pharma_BI

@AVIVA1950

@BIOConvention

# Hashtags

#BIO2019 (official meeting hashtag)

 

 

 

Read Full Post »

Real Time Coverage @BIOConvention #BIO2019: After Trump’s Drug Pricing Blueprint: What Happens Next? A View from Washington; June 3 2019 1:00 PM Philadelphia PA

Reporter: Stephen J. Williams, PhD @StephenJWillia2

 

Speaker: Dan Todd, JD

Dan Todd is the Principal of Todd Strategy, LLC, a consulting firm founded in 2014 and based in Washington, DC. He provides legislative and regulatory strategic guidance and advocacy for healthcare stakeholders impacted by federal healthcare programs.

Prior to Todd Strategy, Mr. Todd was a Senior Healthcare Counsel for the Republican staff of the Senate Finance Committee, the Committee of jurisdiction for the Medicare and Medicaid programs. His areas of responsibility for the committee included the Medicare Part B and Part D programs, which includes physician, medical device, diagnostic and biopharmaceutical issues.

Before joining the Finance Committee, Mr. Todd spent several years in the biotechnology industry, where he led policy development and government affairs strategy. He also represented his companies’ interests with major trade associations such as PhRMA and BIO before federal and state representatives, as well as with key stakeholders such as physician and patient advocacy organizations.

Dan also served as a Special Assistant in the Office of the Administrator at the Centers for Medicare & Medicaid Services (CMS), the federal agency charged with the operation of the Medicare and Medicaid programs. While at CMS, Dan worked on Medicare Part B and Part D issues during the implementation of the Medicare Modernization Act from 2003 to 2005.

Cost efficiencies were never measured.

Removing drug rebates would cost $180 billion over 10 years; the CBO came up with a similar estimate.  It is not clear what Congress will do, but it appears they will keep the rebates in.

  • House Democrats are really going after PBMs; anytime the Administration makes a proposal, it goes right into CBO baseline estimates. Negotiations appear to be in very early stages, and estimates are up in the air.
  • The White House came close to a budget-cap deal, but it broke down the next day; there is total confusion in DC on the budget. Healthcare is now held up, especially the rebate rule, which is a shame, as the panel agrees the cost savings would be huge.
  • A study was initiated to tie Part B costs to international drug prices, meant to get at the disparity in international drug prices; currently only the international price index is being mulled. The other option is to reform Part B. The proposed models were brought out near the 2016 elections, so not much was done (the item remains on the unified agenda).
  • Most of Congress’s response has been relatively muted in public. A flat-fee program on biologics would have a big effect on how physicians and health systems are paid; it is a cat-and-mouse game in DC around drug pricing.
  • The Administration is thinking of a Part B “inflation cap” (not a rebate), and committees are looking at it seriously; there is also discussion of tiering physician payments.
  • Ways and Means Committee: proposing in the budget to alleviate some stresses on Part B deductible amounts.
  • Part D: looking at ways to shore it up, with insurers 80% and taxpayers 20% responsible. Insurers think this will increase premiums, but others think it will reduce catastrophic costs; a big part of the shift in Part D spending has been the increase in catastrophic costs.
  • This week the issue may actually move through committees; the Administration is trying to use the budgetary process to drive this bargain. However, there will have to be offsets, so the process may be delayed.

Follow or tweet on Twitter using the following @ handles and # hashtags:

@pharma_BI

@AVIVA1950

@BIOConvention

@PCPCC

#BIO2019

#patientcost

#PrimaryCare

 

Other articles on this Open Access Journal on Healthcare Costs, Payers, and Patient Care Include:

The Arnold Relman Challenge: US HealthCare Costs vs US HealthCare Outcomes

Centers for Medicare & Medicaid Services announced that the federal healthcare program will cover the costs of cancer gene tests that have been approved by the Food and Drug Administration

Trends in HealthCare Economics: Average Out-of-Pocket Costs, non-Generics and Value-Based Pricing, Amgen’s Repatha and AstraZeneca’s Access to Healthcare Policies

Can Blockchain Technology and Artificial Intelligence Cure What Ails Biomedical Research and Healthcare

Live Conference Coverage @Medcity Converge 2018 Philadelphia: Oncology Value Based Care and Patient Management

Read Full Post »

LIVE 2019 Petrie-Flom Center Annual Conference: Consuming Genetics: Ethical and Legal Considerations of New Technologies, Friday, May 17, 2019 from 8:00 AM to 5:00 PM EDT

 

Wasserstein Hall, Milstein West (2019)

Petrie-Flom Center

23 Everett St., Rm. 327

Cambridge, MA 02138

https://petrieflom.law.harvard.edu/events/details/2019-petrie-flom-center-annual-conference

This year’s conference is organized in collaboration with Nita A. Farahany, Duke Law School, and Henry T. Greely, Stanford Law School.

REAL TIME Press Coverage for http://pharmaceuticalintelligence.com 

by Aviva Lev-Ari, PhD, RN

Director & Founder, Leaders in Pharmaceutical Business Intelligence (LPBI) Group, Boston

Editor-in-Chief, Open Access Online Scientific Journal, http://pharmaceuticalintelligence.com

Editor-in-Chief, BioMed e-Series, 16 Volumes in Medicine, https://pharmaceuticalintelligence.com/biomed-e-books/

 

@pharma_BI

@AVIVA1950

 

Logo, Leaders in Pharmaceutical Business Intelligence (LPBI) Group, Boston

Our BioMed e-series

WE ARE ON AMAZON.COM

https://lnkd.in/ekWGNqA

  • Cardiovascular Diseases, Volume Three: Etiologies of Cardiovascular Diseases: Epigenetics, Genetics and Genomics. On Amazon.com since 11/29/2015

http://www.amazon.com/dp/B018PNHJ84

  • VOLUME 1: Genomics Orientations for Personalized Medicine. On Amazon.com since 11/23/2015

http://www.amazon.com/dp/B018DHBUO6

  • VOLUME 2: Latest in Genomics Methodologies for Therapeutics: Gene Editing, NGS & BioInformatics, Simulations and the Genome Ontology – Work-in-Progress

https://pharmaceuticalintelligence.com/biomed-e-books/genomics-orientations-for-personalized-medicine/volume-two-genomics-methodologies-ngs-bioinformatics-simulations-and-the-genome-ontology/

 

 

2019 Petrie-Flom Center Annual Conference: Consuming Genetics:

Ethical and Legal Considerations of New Technologies

AGENDA NOW AVAILABLE!

 May 17, 2019 8:30 AM – 5:15 PM
Harvard Law School, Wasserstein Hall, Milstein West (2019)
1585 Massachusetts Ave., Cambridge, MA

Register for this event

The Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School is pleased to announce plans for our 2019 annual conference: “Consuming Genetics: The Ethical and Legal Considerations of Consumer Genetic Technologies.” This year’s conference is organized in collaboration with Nita A. Farahany, Duke Law School, and Henry T. Greely, Stanford Law School.

 

Description

Breakthroughs in genetics have often raised complex ethical and legal questions, which loom ever larger as genetic testing becomes more commonplace, affordable, and comprehensive and genetic editing is poised to become a consumer technology. As genetic technologies become more accessible to individuals, the ethical and legal questions around their consumer use become more pressing.

The global genetic testing and consumer/wellness genomics market was already valued at $2.24 billion in 2015 and is expected to double by 2025 to nearly $5 billion. The rise of direct-to-consumer genetic testing and DIY kits raises questions about the appropriate setting for these activities, including a concern that delivering health-related results directly to consumers might cause individuals to draw the wrong medical conclusions. At the same time, advances in CRISPR and other related technologies raise anxieties about the implications of editing our own DNA, especially as access to these technologies explodes in the coming years.

In an age where serial killers are caught because their relatives chose to submit DNA to a consumer genealogy database, is genetic privacy for individuals possible? Does the aggregation of data from genetic testing turn people into products by commercializing their data? How might this data reduce or exacerbate already significant health care disparities? How can we prepare for widespread access to genetic editing tools?

As these questions become more pressing, now is the time to re-consider what ethical and regulatory safeguards should be implemented and discuss the many questions raised by advancements in consumer genetics.

This event is free and open to the public, but space is limited and registration is required. Register now!

#DTCgenome

@PetrieFlom

@pharma_BI

@AVIVA1950

Agenda

8:30 – 9:00am, Registration

A continental breakfast will be available.

9:00 – 9:10am, Welcome Remarks

9:10 – 10:10am, Consumer Genetic Technologies: Rights, Liabilities, and Other Obligations

  • Gary Marchant, Regent’s Professor, Sandra Day O’Connor College of Law and Director, Center for Law, Science, and Innovation, Arizona State University (with Mark Barnes, Ellen W. Clayton, and Susan M. Wolf) – Liability Implications of Direct-to-Consumer Genetic Testing
  1. Insurance may not cover BRCA genetic testing, even for patients with a diagnosis of breast cancer
  • Anya Prince, Associate Professor of Law, University of Iowa College of Law and Member of the University of Iowa Genetics Cluster – Consuming Genetics as an Insurance Consumer
  1. Life-insurance-company-initiated genetic testing: (a) gatekeeping – policy underwriting of new applicants; (b) wellness – employer wellness programs that incentivize healthy behavior and incorporate genetic testing for preventable genetic conditions like BRCA and Lynch syndrome, a win/win proposition yielding healthier employees. Studies, however, show a shift of cost from employer to employee, and employers gain access to employees’ genetic information.
  • Life insurance – the John Hancock Vitality program: meet goals and you get an Apple Watch and a lower premium – an incentive
  • DTC companies are beginning to market to insurers
  • Employment legal landscape:
  1. legal regulations
  • Jessica Roberts, Professor, Alumnae College Professor in Law, and Director of the Health Law & Policy Institute, University of Houston Law Center – In Favor of Genetic Conversion: An Argument for Genetic Property Rights
  1. Ownership rights in genetic property – rights in the information can be transferred or abandoned by consent
  2. Conversion – informed consent
  3. Family not in a treatment relationship with the researcher – the court rejected the family’s claim that the hospital unfairly benefited from the data and tissue they donated to research
  4. Claim of conversion – common law
  5. Gene by Gene / Family Tree DNA
  6. Courts show a newfound openness to claims for genetic conversion
  7. Claims for genetic conversion will not stifle research or create moral harms
  8. In consumer genetics, claims for genetic conversion are actually necessary to adequately protect people’s interests in their DNA
  • Moderator: I. Glenn Cohen, Faculty Director and James A. Attwood and Leslie Williams Professor of Law

10:10 – 10:20am, Break

10:20 – 11:40am, Privacy in the Age of Consumer Genetics

  • Jorge Contreras, Professor, College Of Law and Adjunct Professor, Human Genetics, University of Utah – Direct to Consumer Genetics and Data Ownership
  • Seema Mohapatra, Associate Professor of Law, Indiana University Robert H. McKinney School of Law – Abolishing the Myth of “Anonymous” Gamete Donation in the Age of Direct-to-Consumer Genetic Testing
  • Kayte Spector-Bagdady, Assistant Professor, Department of Obstetrics and Gynecology and Chief, Research Ethics Service, Center for Bioethics and Social Sciences in Medicine (CBSSM), University of Michigan Medical School – Improving Commercial Health Data Sharing Policy: Transparency, Accountability, and Ethics for Academic Use of Private Health Data Resources
  • Liza Vertinsky, Associate Professor of Law, Emory University School of Law and Emory Global Health Institute Faculty Fellow (with Yaniv Heled) – Genetic Privacy and Public Figures
  • Moderator: Nita Farahany, Professor of Law and Professor of Philosophy, Duke Law School

11:40am – 12:40pm, Tinkering with Ourselves: The Law and Ethics of DIY Genomics

  • Barbara J. Evans, Mary Ann & Lawrence E. Faust Professor of Law and Director, Center on Biotechnology & Law, University of Houston Law Center; Professor, Electrical and Computer Engineering, Cullen College of Engineering, University of Houston – Programming Our Genomes, Programming Ourselves: The Moral and Regulatory Limits of Self-Harm When Consumers Wield Genomic Technologies
  • Maxwell J. Mehlman, Distinguished University Professor, Arthur E. Petersilge Professor of Law, and Director of the Law-Medicine Center, Case Western Reserve University School of Law, and Professor of Biomedical Ethics, Case Western Reserve University School of Medicine (with Ronald A. Conlon) – Governing Non-Traditional Biology
  • Patricia J. Zettler, Associate Professor, Center for Law Health and Society, Georgia State University College of Law (with Christi Guerrini and Jacob S. Sherkow) – Finding a Regulatory Balance for Genetic Biohacking
  • Moderator: Henry T. Greely, Director, Center for Law and the Biosciences; Professor (by courtesy) of Genetics, Stanford School of Medicine; Chair, Steering Committee of the Center for Biomedical Ethics; and Director, Stanford Program in Neuroscience and Society, Stanford University

12:40 – 1:20pm, Lunch

Lunch will be provided.

1:20 – 2:20pm, Regulating Consumer Genetic Technologies

  • James Hazel, Postdoctoral Fellow, Center for Genetic Privacy and Identity in Community Settings (GetPreCiSe), Vanderbilt University Medical Center – Privacy Best Practices for Consumer Genetic Testing Services: Are Industry Efforts at Self-Regulation Sufficient?
  • Scott Schweikart, Senior Research Associate, Council on Ethical and Judicial Affairs, American Medical Association and Legal Editor, AMA Journal of Ethics – Human Gene Editing: An Ethical Analysis and Arguments for Regulatory Guidance at Both the National and Global Levels
  • Catherine M. Sharkey, Crystal Eastman Professor of Law, NYU School of Law (with Kenneth Offit) – Regulatory Aspects of Direct-to-Consumer Genetic Testing: The Emerging Role of the FDA
  1. Genetic predisposition – BRCA I & II – approved testing
  2. Pharmacogenetic test authorization – risk of incorrect interpretation and incorrect action based on results
  3. Regulatory model and pathway
  4. False positives and false negatives for BRCA I & II
  5. 23andMe – a huge database; big data – who controls the data?
  6. Across regulators – liability issues over who owns big data
  • Moderator: Rina Spence, President of SpenceCare International LLC

2:20 – 2:30pm, Break

2:30 – 3:50pm, Consumer Genetics and Identity

  • Kif Augustine-Adams, Ivan Meitus Chair and Professor of Law, BYU Law School – Generational Failures of Law and Ethics: Rape, Mormon Orthodoxy, and the Revelatory Power of Ancestry DNA
  1. Complex sorrows: AncestryDNA – 20 million records; complete anonymity and privacy have collapsed
  • Jonathan Kahn, James E. Kelley Chair in Tort Law and Professor of Law, Mitchell Hamline School of Law – Precision Medicine and the Resurgence of Race in Genomic Medicine
  1. Precision medicine – classification of individuals into subpopulations that differ in their susceptibility to a particular disease
  2. Blurring of diversity and genetic variation; empirical and normative inclusion
  3. NHGRI – underrepresentation of diversity in the community of genomics research professionals is a socioeconomic issue, not a genetic one – underrepresentation in databases
  4. What does diversity mean?
  5. Underrepresentation, not race: scientific workforce, recruitment sites, recruitment cohorts, ancestry, genetic variation, responsibility for disparities
  6. Genetic diversity: rare alleles → actionable alleles
  • Emily Largent, Assistant Professor, Department of Medical Ethics and Health Policy and Senior Fellow, Leonard Davis Institute of Health Economics, University of Pennsylvania – Losing Our Minds? Direct-to-Consumer Genetic Testing and Alzheimer’s Disease
  1. Protecting people and their knowledge about their own disease
  2. AD and the APOE gene (ε2, ε3, ε4 alleles) – carriers’ risk of AD increases to 40%
  • Natalie Ram, Assistant Professor of Law, University of Baltimore School of Law – Genetic Genealogy and the Problem of Familial Forensic Identification
  1. Opting in to share genetic data on these platforms amounts to opting in to a national database
  2. Genetic relatedness is stickier than social relations
  3. Voluntary sharing of genetic information – no other party can protect a person’s genetic information once it is shared voluntarily
  4. Genealogy is involuntary disclosure of genetic information
  5. Familial forensic identification – compare privacy rules for information held by telephone companies
  6. Involuntary identification via genomic and genetic data and genetic markers
  • Moderator: Carmel Shachar, Executive Director, the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics and Lecturer at Law, Harvard Law School
  1. Genetic relatedness

3:50 – 4:00pm, Break

4:00 – 5:00pm, The Impact of Genetic Information

  • Leila Jamal, Genetic Counselor, Division of Intramural Research and Co-Investigator, Centralized Sequencing Initiative, National Institute of Allergy and Infectious Diseases and Affiliated Scholar, Department of Bioethics, National Institutes of Health (with Benjamin Berkman and Will Schupmann) – An Ethical Framework for Genetic Counseling Practice in the Genomic Era
  1. Genetic counseling – to benefit the patient: positive autonomy, beneficence – how potentially impactful is the test information?
  2. Nondirectiveness – why?
  3. Distance from eugenics and abortion politics
  4. Personal and patient autonomy – non-interference
  5. Genetic and genomic testing: prenatal, pediatric, cancer, and other (cardiology, neurology, hematology, infectious diseases, pharmacogenomics, DTC, ancestry)
  6. Pre-test genetic counseling – information and testing needs, indication for testing
  7. Post-test counseling
  8. Informational burden, low vs. high: likely pathogenic and pathogenic vs. benign – natural history data
  9. Potentially high impact – testing that can reveal an action to be taken
  10. Relationship with the patient, close vs. distant – recommendations based on the best evidence and guidelines available
  11. Institutional role of the counselor
  • Emily Qian, Genetic Counselor, Veritas Genetics (with Magalie Leduc, Rebecca Hodges, Bryan Cosca, Ryan Durigan, Laurie McCright, Doug Flood, and Birgit Funke) – Physician-Mediated Elective Whole Genome Sequencing Tests: Impacts on Informed Consent
  1. DTC testing
  2. Physician-initiated genetic testing
  3. Physician-initiated DTC testing
  4. Informed consent is a process: topics covered – possible results and consequences
  5. Health care provider (HCP) demographics: neurology
  6. Analysis: family name
  7. Informed consent – who is responsible?
  8. Consumers
  • Vardit Ravitsky, @VarditRavitsky, Associate Professor, Bioethics Programs, Department of Social and Preventive Medicine, School of Public Health, University of Montreal; Director, Ethics and Health Branch, Center for Research on Ethics – Non-Invasive Prenatal Whole Genome Sequencing: Ethical and Regulatory Implications for Post-Birth Access to Information
  • Moderator: Melissa Uveges, Postdoctoral Fellow, Harvard Medical School Center for Bioethics
  1. A clear conceptual approach is needed
  2. Prioritizing privacy/an open future (banning NIPW) vs. the right to know (unrestricted NIPW) vs. prioritizing parental autonomy → allowing restrictions to be built in

5:00 – 5:15pm, Closing Remarks

 

Sponsored by the Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics at Harvard Law School with support from the Center for Bioethics at Harvard Medical School and the Oswald DeN. Cammann Fund at Harvard University.

SOURCE

http://petrieflom.law.harvard.edu/events/details/2019-petrie-flom-center-annual-conference

Read Full Post »
