
Archive for the ‘Electronic Health Record’ Category


Artificial Intelligence and Cardiovascular Disease

Reporter and Curator: Dr. Sudipta Saha, Ph.D.

 

Cardiology is a vast field focused on a large number of diseases of the heart and the circulatory system. Because many of these conditions present with similar symptoms and diagnostic features, it can be difficult for a doctor to isolate the actual heart-related problem. Artificial intelligence aims to relieve doctors of this hurdle and to deliver better-quality care to patients. It has long been proposed that the results of screening tests such as echocardiograms, MRIs, or CT scans be analyzed with more advanced computational techniques. While artificial intelligence is not yet widely used in clinical practice, it is seen as the future of healthcare.

 

The continuous development of the technology sector has enabled the industry to merge with medicine in order to create new integrated, reliable, and efficient methods of providing quality health care. One ongoing trend in cardiology is the proposed use of artificial intelligence (AI) to augment and extend the effectiveness of the cardiologist, because AI or machine learning would allow accurate measurement of patient functioning and diagnosis from the beginning to the end of the therapeutic process. In particular, the use of artificial intelligence in cardiology focuses on research and development, clinical practice, and population health. Conceived as an all-in-one mechanism in cardiac healthcare, AI technologies incorporate complex algorithms to determine the steps needed for a successful diagnosis and treatment. The role of artificial intelligence extends to the identification of novel drug therapies, disease stratification and statistics, continuous remote monitoring and diagnostics, integration of multi-omic data, and extension of physician effectiveness and efficiency.

 

Artificial intelligence – specifically a branch of it called machine learning – is being used in medicine to help with diagnosis; computers, for example, may be better at interpreting heart scans. Computers can be ‘trained’ to make these predictions by feeding them information from hundreds or thousands of patients, together with instructions (an algorithm) on how to use that information. This information includes heart scans, genetic and other test results, and how long each patient survived. The scans are in exquisite detail, and the computer may be able to spot differences that are beyond human perception. It can also combine information from many different tests to give as accurate a picture as possible. The computer works out which factors affected each patient’s outlook, so it can make predictions about other patients.
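
To make the training step above concrete, the short sketch below fits a classifier on a table of per-patient features (imaging measurements, test results) to predict an outcome label. It is an illustrative toy only – the synthetic data, the feature layout, and the choice of a logistic regression model are assumptions for the example, not a description of any specific published system.

```python
# Illustrative sketch: "training" a model on per-patient features
# (imaging measurements, test results) to predict an outcome label.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_patients = 1000

# Hypothetical feature matrix: scan-derived measurements plus other test results.
X = rng.normal(size=(n_patients, 20))
# Hypothetical label: 1 = adverse outcome during follow-up, 0 = no event.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_patients) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
# A trained model of this kind can then be used to score new, unseen patients.
```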

 

In current medical practice, doctors use risk scores to make treatment decisions for their cardiac patients. These scores are based on a series of variables such as weight, age and lifestyle, but they do not always achieve the desired level of accuracy. A particular example of the use of artificial intelligence in cardiology is an experimental study on heart disease patients published in 2017. The researchers used cardiac MRI-based algorithms coupled with a 3D systolic cardiac motion pattern to accurately predict the health outcomes of patients with pulmonary hypertension. The experiment proved successful, with the technology able to pick up 30,000 points of heart activity across 250 patients. With the success of this study, as well as the promise of other research on artificial intelligence, cardiology appears to be moving towards a more technological practice.

 

One study conducted in Finland enrolled 950 patients complaining of chest pain, who underwent the centre’s usual scanning protocol to check for coronary artery disease. Their outcomes were tracked for six years following their initial scans, over the course of which 24 of the patients had heart attacks and 49 died from all causes. The patients first underwent a coronary computed tomography angiography (CCTA) scan, which yielded 58 pieces of data on the presence of coronary plaque, vessel narrowing and calcification. Patients whose scans were suggestive of disease underwent a positron emission tomography (PET) scan, which produced 17 variables on blood flow. Ten clinical variables were also obtained from medical records, including sex, age, smoking status and diabetes. These 85 variables were then entered into an artificial intelligence (AI) programme called LogitBoost. The AI repeatedly analysed the imaging variables and was able to learn how the imaging data interacted and to identify the patterns preceding death and heart attack with over 90% accuracy. Predictive performance using the ten clinical variables alone was modest, with an accuracy of 90%. When PET scan data was added, accuracy increased to 92.5%. Predictive performance increased significantly when CCTA scan data was added to the clinical and PET data, reaching an accuracy of 95.4%.
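
The incremental gains reported above (clinical variables alone, then clinical plus PET, then clinical plus PET plus CCTA) amount to evaluating nested feature sets. The sketch below illustrates that pattern with a generic gradient-boosting classifier standing in for the LogitBoost programme used in the study; the data here are synthetic, and only the feature counts (10 clinical, 17 PET, 58 CCTA) echo the numbers quoted in the text.

```python
# Sketch of evaluating nested feature sets, loosely mirroring the study design.
# A generic gradient-boosting model is used as a stand-in for LogitBoost.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 950                               # number of patients, as quoted above

clinical = rng.normal(size=(n, 10))   # e.g. age, sex, smoking status, diabetes
pet = rng.normal(size=(n, 17))        # blood-flow variables
ccta = rng.normal(size=(n, 58))       # plaque, narrowing, calcification measures
outcome = (clinical[:, 0] + ccta[:, 0] + rng.normal(size=n) > 1.5).astype(int)

feature_sets = {
    "clinical only": clinical,
    "clinical + PET": np.hstack([clinical, pet]),
    "clinical + PET + CCTA": np.hstack([clinical, pet, ccta]),
}

for name, X in feature_sets.items():
    acc = cross_val_score(GradientBoostingClassifier(), X, outcome, cv=5).mean()
    print(f"{name}: cross-validated accuracy = {acc:.3f}")
```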

 

Findings from another study showed that applying artificial intelligence (AI) to the electrocardiogram (ECG) enables early detection of left ventricular dysfunction and can identify individuals at increased risk of developing it in the future. Asymptomatic left ventricular dysfunction (ALVD) is characterised by a weakened heart pump and carries a risk of overt heart failure. It is present in three to six percent of the general population and is associated with reduced quality of life and longevity; however, it is treatable when found. Currently, there is no inexpensive, noninvasive, painless screening tool for ALVD available for diagnostic use. When tested on an independent set of 52,870 patients, the network model yielded an area under the curve, sensitivity, specificity, and accuracy of 0.93, 86.3 percent, 85.7 percent, and 85.7 percent, respectively. Furthermore, among patients without ventricular dysfunction, those with a positive AI screen were at four times the risk of developing future ventricular dysfunction compared with those with a negative screen.
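
The figures quoted for the ECG model (area under the curve 0.93, sensitivity 86.3%, specificity 85.7%, accuracy 85.7%) are standard classification metrics computed on a held-out test set. The minimal sketch below shows how such numbers are derived from a model’s predicted probabilities and the true labels; the predictions here are simulated purely for illustration.

```python
# Minimal sketch: deriving AUC, sensitivity, specificity and accuracy
# from predicted probabilities and true labels on a held-out set.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix, accuracy_score

rng = np.random.default_rng(2)
y_true = rng.integers(0, 2, size=1000)                  # 1 = condition present
y_prob = np.clip(0.7 * y_true + rng.normal(0.2, 0.2, size=1000), 0, 1)
y_pred = (y_prob >= 0.5).astype(int)                    # decision threshold 0.5

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("AUC:        ", round(roc_auc_score(y_true, y_prob), 3))
print("Sensitivity:", round(tp / (tp + fn), 3))
print("Specificity:", round(tn / (tn + fp), 3))
print("Accuracy:   ", round(accuracy_score(y_true, y_pred), 3))
```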

 

In recent years, the analysis of big databases combined with deep learning has gradually come to play an important role in biomedical technology, with extensive research on the development and application of artificial intelligence to medical record analysis, image analysis, single nucleotide polymorphism analysis, and more. Clinically, patients may undergo a variety of routine cardiovascular examinations and treatments, such as cardiac ultrasound, multi-path ECG, cardiovascular and peripheral angiography, intravascular ultrasound and optical coherence tomography, and electrophysiology studies. By using deep learning systems, investigators hope not only to improve diagnostic rates but also to predict patient recovery more accurately and to improve the quality of medical care in the near future.

 

The primary issue with using artificial intelligence in cardiology, or in any field of medicine, is the ethical questions it raises. Before entering practice, physicians and healthcare professionals swear the Hippocratic Oath—a promise to act for the welfare and betterment of their patients. Many physicians have argued that the use of artificial intelligence in medicine breaks that oath, since patients are effectively left under the care of machines rather than doctors. Furthermore, because machines can malfunction, patient safety is also at stake. As such, while medical practitioners see the promise of artificial intelligence, they remain deeply cautious about its use, safety, and appropriateness in medical practice.

 

The issues and challenges facing technological innovation in cardiology are outweighed by current research aiming to make artificial intelligence easily accessible and available to all. With that in mind, various projects are under study. For example, wearable AI technology aims to provide a mechanism by which patients and doctors can easily access and monitor cardiac activity remotely, offering real-time updates, monitoring, and evaluation. Another direction for AI in cardiology is the use of technology to record and validate empirical data in order to further analyze symptomatology, biomarkers, and treatment effectiveness. With AI technology, researchers in cardiology aim to simplify and expand the field’s knowledge base for better patient care and treatment outcomes.

 

References:

 

https://www.news-medical.net/health/Artificial-Intelligence-in-Cardiology.aspx

 

https://www.bhf.org.uk/informationsupport/heart-matters-magazine/research/artificial-intelligence

 

https://www.medicaldevice-network.com/news/heart-attack-artificial-intelligence/

 

https://www.nature.com/articles/s41569-019-0158-5

 

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5711980/

 

www.j-pcs.org/article.asp

http://www.onlinejacc.org/content/71/23/2668

http://www.scielo.br/pdf/ijcs/v30n3/2359-4802-ijcs-30-03-0187.pdf

 

https://www.escardio.org/The-ESC/Press-Office/Press-releases/How-artificial-intelligence-is-tackling-heart-disease-Find-out-at-ICNC-2019

 

https://clinicaltrials.gov/ct2/show/NCT03877614

 

https://www.europeanpharmaceuticalreview.com/news/82870/artificial-intelligence-ai-heart-disease/

 

https://www.frontiersin.org/research-topics/10067/current-and-future-role-of-artificial-intelligence-in-cardiac-imaging

 

https://www.sciencedaily.com/releases/2019/05/190513104505.htm

 


Read Full Post »


The Journey of Antibiotic Discovery

Reporter and Curator: Dr. Sudipta Saha, Ph.D.

 

The term ‘antibiotic’ was introduced by Selman Waksman to describe any small molecule, produced by a microbe, with antagonistic properties on the growth of other microbes. An antibiotic interferes with bacterial survival via a specific mode of action but, more importantly, at therapeutic concentrations it is sufficiently potent to be effective against infection while presenting minimal toxicity. Infectious diseases have been a challenge throughout the ages. From 1347 to 1350, approximately one-third of Europe’s population perished from bubonic plague. Advances in sanitary and hygienic conditions sufficed to control further plague outbreaks, yet infectious diseases persisted as a recurrent public health issue and remained the leading cause of death up to the early 1900s. The mortality rate shrank after the commercialization of antibiotics, which, given their impact on the fate of mankind, were regarded as a ‘medical miracle’. Moreover, the non-therapeutic application of antibiotics has also greatly affected humanity, for instance their use as livestock growth promoters to increase food production after World War II.

 

Currently, more than 2 million North Americans acquire infections associated with antibiotic resistance every year, resulting in 23,000 deaths. In Europe, nearly 700 thousand cases of antibiotic-resistant infections lead to over 33,000 deaths yearly, with an estimated cost of over €1.5 billion. Despite a 36% increase in human use of antibiotics from 2000 to 2010, approximately 20% of deaths worldwide are still related to infectious diseases today. Future perspectives are no brighter: a government-commissioned study in the United Kingdom estimated 10 million deaths per year from antibiotic-resistant infections by 2050.

 

Given the increase in antibiotic-resistant bacteria and the alarmingly low rate of newly approved antibiotics for clinical use, we are on the verge of not having effective treatments for many common infectious diseases. Historically, antibiotic discovery has been crucial in outpacing resistance, and success is closely related to systematic procedures – platforms – that catalyzed the antibiotic golden age, namely the Waksman platform, followed by the platforms of semi-synthesis and fully synthetic antibiotics. These platforms resulted in the major antibiotic classes: aminoglycosides, amphenicols, ansamycins, beta-lactams, lipopeptides, diaminopyrimidines, fosfomycins, imidazoles, macrolides, oxazolidinones, streptogramins, polymyxins, sulphonamides, glycopeptides, quinolones and tetracyclines.

 

The increase in drug-resistant pathogens is a consequence of multiple factors, including but not limited to high rates of antimicrobial prescribing, antibiotic mismanagement in the form of self-medication or interruption of therapy, and large-scale antibiotic use as growth promoters in livestock farming. For example, 60% of the antibiotics sold to the US food industry are also used as therapeutics in humans. To further complicate matters, it is estimated that $200 million is required to bring a molecule to commercialization, with the risk either that antimicrobial resistance develops rapidly and cripples its clinical application or, at the opposite end, that a new antibiotic is so effective it is reserved as a last-resort therapeutic and thus never widely commercialized.

 

Besides more efficient management of antibiotic use, there is a pressing need for new platforms capable of consistently and efficiently delivering new lead substances and of improving upon the impressively low success rates of their precursors in today’s scenario of increasing drug resistance. Antibiotic discovery platforms aim to screen large libraries, for instance the reservoir of untapped natural products, which is likely the next antibiotic ‘gold mine’. There is a void between phenotypic screening (high-throughput) and omics-centered assays (high-information), where some mechanistic and molecular information could complement antimicrobial activity data without the laborious and extensive application of multiple omics assays. The increasing need for antibiotics drives relentless and continuous research at the forefront of antibiotic discovery. This is likely to expand our knowledge of the biological events underlying infectious diseases and, hopefully, result in better therapeutics that can swing the war on infectious diseases back in our favor.

 

During the genomics era came the target-based platform, mostly considered a failure due to limitations in translating drugs to the clinic. Cell-based platforms were therefore re-instituted and remain of the utmost importance in the fight against infectious diseases. Although the antibiotic pipeline is still lackluster, especially in new classes and novel mechanisms of action, the post-genomic era offers an increasingly large body of information on microbial metabolism. The translation of such knowledge into novel platforms will hopefully result in the discovery of new and better therapeutics, which can sway the war on infectious diseases back in our favor.

 

References:

 

https://www.mdpi.com/2079-6382/8/2/45/htm

 

https://www.ncbi.nlm.nih.gov/pubmed/19515346

 

https://www.ajicjournal.org/article/S0196-6553(11)00184-2/fulltext

 

https://www.ncbi.nlm.nih.gov/pubmed/21700626

 

http://www.med.or.jp/english/journal/pdf/2009_02/103_108.pdf

 

Read Full Post »


Digital Therapeutics: A Threat or Opportunity to Pharmaceuticals

Reporter and Curator: Dr. Sudipta Saha, Ph.D.

 

Digital Therapeutics (DTx) have been defined by the Digital Therapeutics Alliance (DTA) as “delivering evidence based therapeutic interventions to patients, that are driven by software to prevent, manage or treat a medical disorder or disease”. They might come in the form of a smartphone or tablet app, or some form of cloud-based service connected to a wearable device. DTx tend to fall into three groups. Firstly, developers and mental health researchers have built digital solutions that typically provide a form of software-delivered Cognitive Behavioural Therapy (CBT), helping patients change behaviours and develop coping strategies around their condition. Secondly, there is a group of Digital Therapeutics that target lifestyle issues, such as diet, exercise and stress, associated with chronic conditions, and work by offering personalized support for goal setting and target achievement. Lastly, DTx can be designed to work in combination with existing medication or treatments, helping patients manage their therapies and ensuring the therapy delivers the best possible outcomes.

 

Pharmaceutical companies are clearly trying to understand what DTx will mean for them and whether it represents a threat or an opportunity to their business. For a long time, they have provided additional support services to patients who take relatively expensive drugs for chronic conditions; a nurse-led service might provide visits and telephone support to diabetics who self-inject insulin, for example. DTx will broaden the scope of such support services because they can be delivered cost-effectively and, importantly, can capture real-world evidence on patient outcomes. They will no longer be reserved for the most expensive drugs or therapies but could apply to a whole range of common treatments to boost their efficacy. Faced with the arrival of Digital Therapeutics either replacing drugs or playing an important role alongside therapies, pharmaceutical firms have three options. They can ignore DTx and focus on developing drug therapies as they have done; they can partner with the growing number of DTx companies to develop software and services complementing their drugs; or they can start to build their own Digital Therapeutics to work with their products.

 

Digital Therapeutics will have knock-on effects across the health industries that may be as great as the introduction of therapeutic apps and services themselves. Together with connected health-monitoring devices, DTx will offer a near-constant stream of data about an individual’s behavior, the real-world context of factors affecting their treatment in everyday life, and emotional and physiological measurements such as blood pressure and blood sugar levels. Analysis of the resulting data will help create support services tailored to each patient. But who stores and analyses this data is an important question. Strong data governance will be paramount to maintaining trust, and the highly regulated pharmaceutical industry may not be best placed to handle individual patient data. Meanwhile, the health sector (payers and healthcare providers) is becoming more focused on patient outcomes and on payment for value rather than volume. Time will tell whether pharmaceutical firms enhance the effectiveness of drugs with DTx or, in some cases, replace drugs with DTx.

 

Digital Therapeutics have the potential to change what the pharmaceutical industry sells: rather than a drug, it will sell a package of drugs and digital services. But they will also alter who the industry sells to. Pharmaceutical firms have traditionally marketed drugs to doctors, pharmacists and other health professionals based on the efficacy of a specific product. Soon they could be paid on the outcome of a bundle of digital therapies, medicines and services, with a closer connection to both providers and patients. Apart from a notable few, most pharmaceutical firms have taken a cautious approach towards Digital Therapeutics. It remains to be seen how pharmaceutical companies will use DTx to their own benefit as well as for the benefit of the general population.

 

References:

 

https://eloqua.eyeforpharma.com/LP=23674?utm_campaign=EFP%2007MAR19%20EFP%20Database&utm_medium=email&utm_source=Eloqua&elqTrackId=73e21ae550de49ccabbf65fce72faea0&elq=818d76a54d894491b031fa8d1cc8d05c&elqaid=43259&elqat=1&elqCampaignId=24564

 

https://www.s3connectedhealth.com/resources/white-papers/digital-therapeutics-pharmas-threat-or-opportunity/

 

http://www.pharmatimes.com/web_exclusives/digital_therapeutics_will_transform_pharma_and_healthcare_industries_in_2019._heres_how._1273671

 

https://www.mckinsey.com/industries/pharmaceuticals-and-medical-products/our-insights/exploring-the-potential-of-digital-therapeutics

 

https://player.fm/series/digital-health-today-2404448/s9-081-scaling-digital-therapeutics-the-opportunities-and-challenges

 

Read Full Post »


Healthcare conglomeration to access Big Data and lower costs

Curator: Larry H. Bernstein, MD, FCAP 

 

 

UPDATED on 3/17/2019

https://www.medpagetoday.com/cardiology/prevention/78202?xid=nl_mpt_SRCardiology_2019-02-25&eun=g99985d0r&utm_source=Sailthru&utm_medium=email&utm_campaign=CardioUpdate_022519&utm_term=NL_Spec_Cardiology_Update_Active

Medicare Advantage plans may be driving up quality of care in terms of preventive treatment for coronary artery disease patients, but that has had little impact on outcomes compared with fee-for-service Medicare, researchers reported in JAMA Cardiology.

 

The expected benefits are not as easily realized as anticipated. The problem of access to data sources is less difficult than assembling the content needed for evaluation.

 

Healthcare Big Data Drives a New Round of Collaborations between Hospitals, Health Systems, and Care Management Companies

DARK DAILY, info@darkreport.com

 

January 13, 2016

Recently-announced partnerships want to use big data to improve patient outcomes and lower costs; clinical laboratory test data will have a major role in these efforts

In the race to use healthcare big data to improve patient outcomes, several companies are using acquisitions and joint ventures to beef up and gain access to bigger pools of data. Pathologists and clinical laboratory managers have an interest in this trend, because medical laboratory test data will be a large proportion of the information that resides in these huge healthcare databases.

For health systems that want to be players in the healthcare big data market, one strategy is to enter a risk-sharing venture with third-party care-management companies. This allows the health systems to leverage their extensive patient data while benefiting from the expertise of their venture partners.

Cardinal Health Acquires 71% Interest in naviHealth

One company that wants to work with hospitals and health systems in these types of arrangements is Cardinal Health. It recently acquired a 71% interest in Nashville-based naviHealth. This company partners with health plans, health systems, physicians, and post-acute providers to manage the entire continuum of post-acute care (PAC), according to a news release on the naviHealth website. NaviHealth’s business model involves sharing the financial risk with its clients and leveraging big data to predict best outcomes and lower costs.

“We created an economic model to take on the entire post-acute-care episode,” declared naviHealth CEO and President Clay Richards in a company news release. “It’s leveraging the technology and analytics to create individual care protocols.”


“The most basic, and the most important, thing is … they [Cardinal Health] share the same core values as we do, which is to be on the right side of healthcare,” naviHealth CEO Clay Richards told The Tennessean. “It’s about how you deliver better outcomes for patients with lower costs: How do you solve the problems [with growing costs]? That’s what we and Cardinal define as being on the right side of healthcare.” (Caption and image copyright: The Tennessean.)

Provider Investments Signal Continuation of Trend

Cardinal Health intends to combine its ability to reduce costs while providing effective care with naviHealth’s evidence-based, personalized post-acute-care plans. This is one approach to harnessing the power of big data to improve patient care. One goal is to focus this expertise on post-acute care, which is one of Medicare’s quality measures.

Patients and their families often are unsure of what to expect after being discharged. And, according to an article published in Kaiser Health News, a 2013 Institute of Medicine (IOM) report noted a link between the quality of post-acute care and healthcare spending following the discharge of Medicare patients.

However, maximizing the use of healthcare big data requires the participation of multiple stakeholders. Information scientists, hospital administrators, software developers, insurers, clinicians, and patients themselves must all perform a role in order for big data to reach its full potential. No single sector will be able to bring the benefits of big data to fruition; rather collaboration and partnerships will be necessary.

Other Collaborations and Alliances Target Healthcare Big Data

Two other organizations engaged in a similar collaboration are the Mayo Clinic and Optum360, a revenue management services company that focuses on simplifying and streamlining the revenue cycle process. In a press release, the companies announced that they were partnering to “develop new revenue management services capabilities aimed at improving patient experiences and satisfaction while reducing administrative costs for healthcare providers.” (See Dark Daily, “When It Comes to Mining Healthcare Big Data, Including Medical Laboratory Test Results, Optum Labs Is the Company to Watch,” December 14, 2015.)

In order to accomplish this, Mayo will have to share its revenue cycle management (RCM) data with Optum360, which will use the data to devise improved revenue cycle processes and systems.

“What we’re trying to find out, if we can, is what does healthcare cost, and what of that spend really adds value to a patient’s outcome over time, especially with these high-impact diseases,” stated Mayo Clinic President and CEO John Noseworthy, MD, in a story published by the Star Tribune. He was referencing another big data project Mayo is engaged in with UnitedHealth Group. “Ultimately, we as a country have to figure this out, so people can have access to high-quality care and it doesn’t bankrupt them or the country.”


Mayo Clinic President and CEO John Noseworthy, MD, believes big data may be the key to transforming healthcare costs by informing clinical decision-making and altering patient outcomes. (Photo copyright: Mayo Clinic.)

Another interesting healthcare big data partnership is the Pittsburgh Health Data Alliance (The Alliance). It involves a collaboration between Carnegie Mellon University (CMU), the University of Pittsburgh (PITT), and the University of Pittsburgh Medical Center (UPMC). The aim of The Alliance is to take raw data from wearable devices, insurance records, medical appointments, as well as other common sources, and develop ways to improve the health of individuals and the wider community.

The common thread among all these collaborative efforts is a desire to improve outcomes while reducing costs. This is the promise of healthcare big data. And no matter which direction the effort takes, clinical laboratories, which generate a vast amount of critical health data, are in a good position to play important roles involving the contribution of lab test data and identifying ways to use healthcare big data projects to improve patient care.
—Dava Stewart

Read Full Post »


N3xt generation carbon nanotubes

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Skyscraper-style carbon-nanotube chip design ‘boosts electronic performance by factor of a thousand’

http://www.kurzweilai.net/skyscraper-style-carbon-nanotube-chip-design-boosts-electronic-performance-by-factor-of-a-thousand

 

 

http://www.kurzweilai.net/images/computing-high-rise-architecture.jpg

A new revolutionary high-rise architecture for computing (credit: Stanford University)

 

Researchers at Stanford and three other universities are creating a revolutionary new skyscraper-like high-rise architecture for computing based on carbon nanotube materials instead of silicon.

In Rebooting Computing, a special issue (in press) of the IEEE Computer journal, the team describes its new approach as “Nano-Engineered Computing Systems Technology,” or N3XT.

Suburban-style chip layouts create long commutes and regular traffic jams in electronic circuits, wasting time and energy, they note.

N3XT will break data bottlenecks by integrating processors and memory like floors in a skyscraper and by connecting these components with millions of “vias,” which play the role of tiny electronic elevators.

The N3XT high-rise approach will move more data, much faster, using far less energy, than would be possible using low-rise circuits, according to the researchers.

Stanford researchers including Associate Professor Subhasish Mitra and Professor H.-S. Philip Wong have “assembled a group of top thinkers and advanced technologies to create a platform that can meet the computing demands of the future,” Mitra says.

“When you combine higher speed with lower energy use, N3XT systems outperform conventional approaches by a factor of a thousand,” Wong claims.

Carbon nanotube transistors

Engineers have previously tried to stack silicon chips but with limited success, the researchers suggest. Fabricating a silicon chip requires temperatures close to 1,800 degrees Fahrenheit, making it extremely challenging to build a silicon chip atop another without damaging the first layer. The current approach to what are called 3-D, or stacked, chips is to construct two silicon chips separately, then stack them and connect them with a few thousand wires.

But conventional 3-D silicon chips are still prone to traffic jams and it takes a lot of energy to push data through what are a relatively few connecting wires.

The N3XT team is taking a radically different approach: building layers of processors and memory directly atop one another, connected by millions of vias that can move more data over shorter distances than traditional wires, using less energy, and immersing computation and memory storage into an electronic super-device.

The key is the use of non-silicon materials that can be fabricated at much lower temperatures than silicon, so that processors can be built on top of memory without the new layer damaging the layer below. As in IBM’s recent chip breakthrough (see “Method to replace silicon with carbon nanotubes developed by IBM Research“), N3XT chips are based on carbon nanotube transistors.

Transistors are fundamental units of a computer processor, the tiny on-off switches that create digital zeroes and ones. CNTs are faster and more energy-efficient than silicon processors, and much thinner. Moreover, in the N3XT architecture, they can be fabricated and placed over and below other layers of memory.

Among the N3XT scholars working at this nexus of computation and memory are Christos Kozyrakis and Eric Pop of Stanford, Jeffrey Bokor and Jan Rabaey of the University of California, Berkeley, Igor Markov of the University of Michigan, and Franz Franchetti and Larry Pileggi of Carnegie Mellon University.

New storage technologies 

Team members also envision using data storage technologies that rely on materials other than silicon. This would allow for the new materials to be manufactured on top of CNTs, using low-temperature fabrication processes.

One such data storage technology is called resistive random-access memory, or RRAM (see “‘Memristors’ based on transparent electronics offer technology of the future“). Resistance slows down electrons, creating a zero, while conductivity allows electrons to flow, creating a one. Tiny jolts of electricity switch RRAM memory cells between these two digital states. N3XT team members are also experimenting with a variety of nanoscale magnetic storage materials.

Just as skyscrapers have ventilation systems, N3XT high-rise chip designs incorporate thermal cooling layers. This work, led by Stanford mechanical engineers Kenneth Goodson and Mehdi Asheghi, ensures that the heat rising from the stacked layers of electronics does not degrade overall system performance.

Mitra and Wong have already demonstrated a working prototype of a high-rise chip. At the International Electron Devices Meeting in December 2014 they unveiled a four-layered chip made up of two layers of RRAM memory sandwiched between two layers of CNTs (see “Stanford engineers invent radical ‘high-rise’ 3D chips“).

In their N3XT paper they ran simulations showing how their high-rise approach was a thousand times more efficient in carrying out many important and highly demanding industrial software applications.

 

 

References:

  • M. Aly, M. Gao, G. Hills, C-S Lee, G. Pitner, M. Shulaker, T. Wu, M. Asheghi, J. Bokor, F. Franchetti, K. Goodson, C. Kozyrakis, I. Markov, K. Olukotun, L. Pileggi, E. Pop, J. Rabaey, C. Ré, H.-S.P. Wong and S. Mitra, Energy-Efficient Abundant-Data Computing: The N3XT 1,000X. IEEE Computer, Special Issue on Rebooting Computing, Dec. 2015 (in press)


 

Stanford engineers invent radical ‘high-rise’ 3D chips

http://www.kurzweilai.net/stanford-engineers-invent-radical-high-rise-3d-chips

http://www.kurzweilai.net/images/high-rise-chip.jpg

A four-layer prototype high-rise chip built by Stanford engineers. The bottom and top layers are logic transistors. Sandwiched between them are two layers of memory. The vertical tubes are nanoscale electronic “elevators” that connect logic and memory, allowing them to work together efficiently. (Credit: Max Shulaker, Stanford)

 

Stanford engineers have built 3D “high-rise” chips that could leapfrog the performance of the single-story logic and memory chips on today’s circuit cards, which are subject to frequent traffic jams between logic and memory.

The Stanford approach would attempt to end these jams by building layers of logic atop layers of memory to create a tightly interconnected high-rise chip. Many thousands of nanoscale electronic “elevators” would move data between the layers much faster, using less electricity, than the bottleneck-prone wires connecting single-story logic and memory chips today.

The work is led by Subhasish Mitra, a Stanford associate professor of electrical engineering and of computer science, and H.-S. Philip Wong, the Williard R. and Inez Kerr Bell Professor in Stanford’s School of Engineering. They describe their new high-rise chip architecture in a paper being presented at the IEEE International Electron Devices Meeting Dec. 15–17.

The researchers’ innovation leverages three breakthroughs: a new technology for creating transistors using nanotubes, a new type of computer memory that lends itself to multi-story fabrication, and a technique to build these new logic and memory technologies into high-rise structures in a radically different way than previous efforts to stack chips.

“This research is at an early stage, but our design and fabrication techniques are scalable,” Mitra said. “With further development this architecture could lead to computing performance that is much, much greater than anything available today.” Wong said the prototype chip unveiled at IEDM shows how to put logic and memory together into three-dimensional structures that can be mass-produced.

“Paradigm shift is an overused concept, but here it is appropriate,” Wong said. “With this new architecture, electronics manufacturers could put the power of a supercomputer in your hand.”

Overcoming silicon heat

Researchers have been trying to solve a major problem with chip-generated heat by creating carbon nanotube (CNT) transistors. Mitra and Wong are presenting a second paper at the conference showing how their team made some of the highest-performance CNT transistors ever built.

 

Image of a CNT-based field-effect transistor (FET) using a new high-density process (credit: Mitra/Wong Lab, Stanford)

http://www.kurzweilai.net/images/CNTFET.jpg

Until now, the standard process used to grow CNTs did not create sufficient density. The Stanford engineers solved this problem with an ingenious technique. They started by growing CNTs the standard way, on round quartz wafers. Then they created a metal film that acts like a tape. Using this adhesive process, they lifted an entire crop of CNTs off the quartz growth medium and placed it onto a silicon wafer that would become the foundation of their high-rise chip.

They repeated this process 13 times, achieving some of the highest density, highest performance CNTs ever made. Moreover, the Stanford team showed that they could perform this technique on more than one layer of logic as they created their high-rise chip.

 

RRAM memory

Left: Today’s single-story electronic circuit cards, where logic and memory chips exist as separate structures connected by wires, can get jammed with digital traffic between logic and memory. Right: layers of logic and memory create skyscraper chips where data would move up and down on nanoscale “elevators” to avoid traffic jams. (Credit: Mitra/Wong Lab, Stanford)

Wong is a world leader in a new memory technology called “resistive random access memory” (RRAM) which he unveiled at last year’s IEDM conference.

Unlike today’s memory chips, this new storage technology is not based on silicon but on titanium nitride, hafnium oxide and platinum, which form a metal/oxide/metal sandwich. Applying electricity across this sandwich in one direction causes it to resist the flow of electricity. Reversing the polarity causes the structure to conduct electricity again.

The change from resistive to conductive states is how this new memory technology creates digital zeroes and ones.

RRAM uses less energy than current memory, leading to prolonged battery life in mobile devices. Inventing this new memory technology was also the key to creating the high-rise chip because RRAM can be made at much lower temperatures than silicon memory.
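
As a conceptual aid to the switching behaviour described above, the toy model below treats a single RRAM cell as a state machine whose resistance state is flipped by the polarity of the applied voltage and read back as a digital 0 or 1. This is a didactic sketch only – the threshold voltage and class names are invented for the example, and it is not a physical device model.

```python
# Toy model of a bipolar resistive-switching (RRAM) cell: voltage polarity
# sets the cell to a low- or high-resistance state, read back as 1 or 0.
# Purely illustrative; not a physical or device-accurate model.
class RRAMCell:
    SET_THRESHOLD = 1.0  # hypothetical switching voltage (volts)

    def __init__(self):
        self.low_resistance = False  # start in the high-resistance state (0)

    def apply_voltage(self, volts):
        """Positive pulses above threshold SET the cell; negative pulses RESET it."""
        if volts >= self.SET_THRESHOLD:
            self.low_resistance = True    # conductive state -> stores a 1
        elif volts <= -self.SET_THRESHOLD:
            self.low_resistance = False   # resistive state -> stores a 0

    def read(self):
        """Non-destructive read: low resistance is interpreted as 1."""
        return 1 if self.low_resistance else 0


cell = RRAMCell()
cell.apply_voltage(+1.5)   # write a 1
assert cell.read() == 1
cell.apply_voltage(-1.5)   # reverse polarity to erase (write a 0)
assert cell.read() == 0
```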

Interconnected layers

Max Shulaker and Tony Wu, Stanford graduate students in electrical engineering, created the techniques behind the four-story high-rise chip unveiled at the conference.

The low-heat process for making RRAM and CNTs enabled them to fabricate each layer of memory directly atop each layer of CNT logic. While making each memory layer, they were able to drill thousands of interconnections into the logic layer below. This multiplicity of connections is what enables the high-rise chip to avoid the traffic jams on conventional circuit cards.

There is no way to tightly interconnect layers using today’s conventional silicon-based logic and memory. That’s because it takes so much heat to build a layer of silicon memory — about 1,000 degrees Celsius — that any attempt to do so would melt the logic below.

Previous efforts to stack silicon chips could save space but not avoid the digital traffic jams. That’s because each layer would have to be built separately and connected by wires — which would still be prone to traffic jams, unlike the nanoscale elevators in the Stanford design.

 

 

‘Memristors’ based on transparent electronics offer technology of the future

Memristors are faster, smaller, and use less power than non-volatile flash memory

Transparent electronics (pioneered at Oregon State University) may find one of their newest applications as a next-generation replacement for some uses of non-volatile flash memory, a multi-billion dollar technology nearing its limit of small size and information storage capacity.

Researchers at OSU have confirmed that zinc tin oxide, an inexpensive and environmentally benign compound,  could provide a new, transparent technology where computer memory is based on resistance, instead of an electron charge.

This resistive random access memory, or RRAM, is referred to by some researchers as a “memristor.” Products using this approach could become even smaller, faster and cheaper than the silicon transistors that have revolutionized modern electronics — and transparent as well.

Transparent electronics offer potential for innovative products that don’t yet exist, like information displayed on an automobile windshield, or surfing the web on the glass top of a coffee table.

“Flash memory has taken us a long way with its very small size and low price,” said John Conley, a professor in the OSU School of Electrical Engineering and Computer Science. “But it’s nearing the end of its potential, and memristors are a leading candidate to continue performance improvements.”

Memristors: faster than flash  

Memristors have a simple structure, are able to program and erase information rapidly, and consume little power. They accomplish a function similar to transistor-based flash memory, but with a different approach. Whereas traditional flash memory stores information with an electrical charge, RRAM accomplishes this with electrical resistance. Like flash, it can store information as long as it’s needed.

Flash memory computer chips are ubiquitous in almost all modern electronic products, ranging from cell phones and computers to video games and flat panel televisions.

Thin-film transistors that control liquid crystal displays

Some of the best opportunities for these new amorphous oxide semiconductors are not so much for memory chips as for thin-film, flat-panel displays, researchers say. Private industry has already shown considerable interest in using them for the thin-film transistors that control liquid crystal displays, and one compound approaching commercialization is indium gallium zinc oxide.

But indium and gallium are getting increasingly expensive, and zinc tin oxide — also a transparent compound — appears to offer good performance with lower cost materials. The new research also shows that zinc tin oxide can be used not only for thin-film transistors, but also for memristive memory, Conley said, an important factor in its commercial application.

More work is needed to understand the basic physics and electrical properties of the new compounds, researchers said.

This research was supported by the U.S. Office of Naval Research, the National Science Foundation and the Oregon Nanoscience and Microtechnologies Institute.

 

Resistive switching in zinc–tin-oxide

Santosh Murali, Jaana S. Rajachidambaram, Seung-Yeol Han, Chih-Hung Chang, Gregory S. Herman, John F. Conley Jr.

Bipolar resistive switching is demonstrated in the amorphous oxide semiconductor zinc–tin-oxide (ZTO). A gradual forming process produces improved switching uniformity. Al/ZTO/Pt crossbar devices show switching ratios greater than 10³, long retention times, and good endurance. The resistive switching in these devices is consistent with a combined filamentary/interfacial mechanism. Overall, ZTO shows great potential as a low-cost material for embedding memristive memory with thin-film transistor logic for large-area electronics.


Highlights

• We present the first report of resistive switching in zinc–tin-oxide (ZTO).
• ZTO is the leading alternative material to IGZO for TFTs for LCDs.
• ZTO has an advantage over IGZO of lower cost due to the absence of In and Ga.
• Al/ZTO/Pt crossbar RRAM devices show switching ratios greater than 10³.
• ZTO shows promise for embedding RRAM with TFT logic for large-area electronics.

Keywords: Memristor; RRAM; Resistive switching; Amorphous oxide semiconductors; Transparent electronics; Zinc–tin-oxide

Plot of log|current| vs. top electrode voltage for a 50μm×50μm device with an ...

http://ars.els-cdn.com/content/thumbimage/1-s2.0-S0038110112002304-gr1.sml

 

Method to replace silicon with carbon nanotubes developed by IBM Research

Could work down to the 1.8 nanometer node in the future
Schematic of a set of molybdenum (Mo) end-contacted nanotube transistors (credit: Qing Cao et al./Science)

IBM Research has announced a “major engineering breakthrough” that could lead to carbon nanotubes replacing silicon transistors in future computing technologies.

As transistors shrink in size, electrical resistance increases within the contacts, which impedes performance. So IBM researchers invented a metallurgical process similar to microscopic welding that chemically binds the contact’s metal (molybdenum) atoms to the carbon atoms at the ends of nanotubes.

The new method promises to shrink transistor contacts without reducing performance of carbon-nanotube devices, opening a pathway to dramatically faster, smaller, and more powerful computer chips beyond the capabilities of traditional silicon semiconductors.

“This is the kind of breakthrough that we’re committed to making at IBM Research via our $3 billion investment over 5 years in research and development programs aimed at pushing the limits of chip technology,” said Dario Gil, VP, Science & Technology, IBM Research. “Our aim is to help IBM produce high-performance systems capable of handling the extreme demands of new data analytics and cognitive computing applications.”

The development was reported today in the October 2 issue of the journal Science.

 

Overcoming contact resistance

http://www.kurzweilai.net/images/SWNT-transistor-contacts.jpg

Schematic of carbon nanotube transistor contacts. Left: High-resistance side-bonded contact, where the single-wall nanotube (SWNT) (black tube) is partially covered by the metal molybdenum (Mo) (purple dots). Right: low-resistance end-bonded contact, where the SWNT is attached to the molybdenum electrode through carbide bonds, while the carbon atoms (black dots) from the originally covered portion of the SWNT uniformly diffuse out into the Mo electrode (credit: Qing Cao et al./Science)

 

The new “end-bonded contact scheme” allows carbon-nanotube contacts to be shrunken down to below 10 nanometers without deteriorating performance. IBM says the scheme could overcome contact resistance challenges all the way to the 1.8 nanometer node and replace silicon with carbon nanotubes.

Silicon transistors have been made smaller year after year, but they are approaching a point of physical limitation. With Moore’s Law running out of steam, shrinking the size of the transistor — including the channels and contacts — without compromising performance has been a challenge for researchers for decades.

 

http://www.kurzweilai.net/images/SWCNT.jpg

Single wall carbon nanotube (credit: IBM)

 

IBM has previously shown that carbon nanotube transistors can operate as excellent switches at channel dimensions of less than ten nanometers, which is less than half the size of today’s leading silicon technology. Electrons in carbon transistors can move more easily than in silicon-based devices and use less power.

Carbon nanotubes are also flexible and transparent, making them useful for flexible and stretchable electronics or sensors embedded in wearables.

IBM acknowledges that several major manufacturing challenges still stand in the way of commercial devices based on nanotube transistors.

Earlier this summer, IBM unveiled the first 7 nanometer node silicon test chip, pushing the limits of silicon technologies.

End-bonded contacts for carbon nanotube transistors with low, size-independent resistance

Science 2 Oct 2015; 350(6256):68–72.      http://dx.doi.org/10.1126/science.aac8006

Moving beyond the limits of silicon transistors requires both a high-performance channel and high-quality electrical contacts. Carbon nanotubes provide high-performance channels below 10 nanometers, but as with silicon, the increase in contact resistance with decreasing size becomes a major performance roadblock. We report a single-walled carbon nanotube (SWNT) transistor technology with an end-bonded contact scheme that leads to size-independent contact resistance to overcome the scaling limits of conventional side-bonded or planar contact schemes. A high-performance SWNT transistor was fabricated with a sub–10-nanometer contact length, showing a device resistance below 36 kilohms and on-current above 15 microampere per tube. The p-type end-bonded contact, formed through the reaction of molybdenum with the SWNT to form carbide, also exhibited no Schottky barrier. This strategy promises high-performance SWNT transistors, enabling future ultimately scaled device technologies.

 

 

 

 

Read Full Post »


Biomarker Development

Curator: Larry H. Bernstein, MD, FCAP

 

 

NBDA’s Biomarker R&D Modules

http://nbdabiomarkers.org/

“collaboratively creating the NBDA Standards* required for end-to-end, evidence – based biomarker development to advance precision (personalized) medicine”

http://nbdabiomarkers.org/sites/all/themes/nbda/images/nbda_logo.jpg

http://nbdabiomarkers.org/about/what-we-do/pipeline-overview/assay-development

 

Successful biomarkers should move systematically and seamlessly through specific R&D “modules” – from early discovery to clinical validation. NBDA’s end-to-end systems approach is based on working with experts from all affected multi-sector stakeholder communities to build an in-depth understanding of the existing barriers in each of these “modules” to support decision making at each juncture.  Following extensive “due diligence” the NBDA works with all stakeholders to assemble and/or create the enabling standards (guidelines, best practices, SOPs) needed to support clinically relevant and robust biomarker development.

Mission: Collaboratively creating the NBDA Standards* required for end-to-end, evidence – based biomarker development to advance precision (personalized) medicine.
NBDA Standards include but are not limited to: “official existing standards”, guidelines, principles, standard operating procedures (SOP), and best practices.

https://vimeo.com/83266065

 

“The NBDA’s vision is not to just relegate the current biomarker development processes to history, but also to serve as a working example of what convergence of purpose, scientific knowledge and collaboration can accomplish.”

NBDA Workshop VII – “COLLABORATIVELY BUILDING A FOUNDATION FOR FDA BIOMARKER QUALIFICATION”
NBDA Workshop VII   December 14-15, 2015   Washington Court Hotel, Washington, DC

The upcoming meeting was preceded by an NBDA workshop held on December 1-2, 2014, “The Promising but Elusive Surrogate Endpoint: What Will It Take?”, where we explored in depth with FDA leadership and experts in the field the current status and future vision for achieving success in surrogate endpoint development. Through panels and workgroups, the attendees extended their efforts to pursue the FDA’s biomarker qualification pathway through the creation of sequential contexts-of-use models to support qualification of drug development tools – and ultimately surrogate endpoints.

Although the biomarker (drug development tools) qualification pathway (http://www.fda.gov/Drugs/DevelopmentApprovalProcess/DrugDevelopmentTools…) represents an opportunity to increase the value of predictive biomarkers, animal models, and clinical outcomes across the drug (and biologics) development continuum, there are myriad challenges.  In that regard, the lack of evidentiary standards to support contexts of use-specific biomarkers emerged from the prior NBDA workshop as the major barrier to achieving the promise of biomarker qualification.  It also became clear that overall, the communities do not understand the biomarker qualification process; nor do they fully appreciate that it is up to the stakeholders in the field (academia, non-profit foundations, pharmaceutical and biotechnology companies, and patient advocate organizations) to develop these evidentiary standards.

This NBDA workshop will feature a unique approach to addressing these problems. Over the past two years, the NBDA has worked with experts in selected disease areas to develop specific case studies that take a systematic approach to identifying the evidentiary standards needed for sequential contexts of use for specific biomarkers to drive biomarker qualification. These constructs and their accompanying whitepapers are now the focus of collaborative discussions with FDA experts.

The upcoming meeting will feature in-depth panel discussions of 3-4 of these cases, including the case leader, additional technical contributors, and a number of FDA experts.  Each of the panels will analyze their respective case for strengths and weaknesses – including suggestions for making the biomarker qualification path for the specific biomarker more transparent and efficient. In addition, the discussions will highlight the problem of poor reproducibility of biomarker discovery results, and its impact on the qualification process.

 

Health Care in the Digital Age

Mobile, big data, the Internet of Things and social media are leading a revolution that is transforming opportunities in health care and research. Extraordinary advancements in mobile technology and connectivity have provided the foundation needed to dramatically change the way health care is practiced today and research is done tomorrow. While we are still in the early innings of using mobile technology in the delivery of health care, evidence supporting its potential to impact the delivery of better health care, lower costs and improve patient outcomes is apparent. Mobile technology for health care, or mHealth, can empower doctors to more effectively engage their patients and provide secure information on demand, anytime and anywhere. Patients demand safety, speed and security from their providers. What are the technologies that are allowing this transformation to take place?

 

https://youtu.be/WeXEa2cL3oA    Monday, April 27, 2015  Milken Institute

Moderator


Michael Milken, Chairman, Milken Institute

 

Speakers


Anna Barker, Fellow, FasterCures, a Center of the Milken Institute; Professor and Director, Transformative Healthcare Networks, and Co-Director, Complex Adaptive Systems Network, Arizona State University
Atul Butte, Director, Institute of Computational Health Sciences, University of California, San Francisco
John Chen, Executive Chairman and CEO, BlackBerry
Victor Dzau, President, Institute of Medicine, National Academy of Sciences; Chancellor Emeritus, Duke University
Patrick Soon-Shiong, Chairman and CEO, NantWorks, LLC

 


Read Full Post »


Better bioinformatics

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Big data in biomedicine: 4 big questions

Eric Bender. Nature 2015; 527:S19.     http://dx.doi.org/10.1038/527S19a

http://www.nature.com/nature/journal/v527/n7576_supp/full/527S19a.html

Gathering and understanding the deluge of biomedical research and health data poses huge challenges. But this work is rapidly changing the face of medicine.

 

http://www.nature.com/nature/journal/v527/n7576_supp/images/527S19a-g1.jpg

 

1. How can long-term access to biomedical data that are vital for research be improved?

Why it matters
Data storage may be getting cheaper, particularly in cloud computing, but the total costs of maintaining biomedical data are too high and climbing rapidly. Current models for handling these tasks are only stopgaps.

Next steps
Researchers, funders and others need to analyse data usage and look at alternative models, such as ‘data commons’, for providing access to curated data in the long term. Funders also need to incorporate resources for doing this.

Quote
“Our mission is to use data science to foster an open digital ecosystem that will accelerate efficient, cost-effective biomedical research to enhance health, lengthen life and reduce illness and disability.” Philip Bourne, US National Institutes of Health.

 

2. How can the barriers to using clinical trial results and patients’ health records for research be lowered?

Why it matters
‘De-identified’ data from clinical trials and patients’ medical records offer opportunities for research, but the legal and technical obstacles are immense. Clinical study data are rarely shared, and medical records are walled off by privacy and security regulations and by legal concerns.

Next steps
Patient advocates are lobbying for access to their own health data, including genomic information. The European Medicines Agency is publishing clinical reports submitted as part of drug applications. And initiatives such as CancerLinQ are gathering de-identified patient data.

Quote
“There’s a lot of genetic information that no one understands yet, so is it okay or safe or right to put that in the hands of a patient? The flip side is: it’s my information — if I want it, I should get it.” Megan O’Boyle, Phelan-McDermid Syndrome Foundation.

 

3. How can knowledge from big data be brought into point-of-care health-care delivery?

Why it matters
Delivering precision medicine will immensely broaden the scope of electronic health records. This massive shift in health care will be complicated by the introduction of new therapies, requiring ongoing education for clinicians who need detailed information to make clinical decisions.

Next steps
Health systems are trying to bring up-to-date treatments to clinics and build ‘health-care learning systems’ that integrate with electronic health records. For instance, the CancerLinQ project provides recommendations for patients with cancer whose treatment is hard to optimize.

Quote
“Developing a standard interface for innovators to access the information in electronic health records will connect the point of care to big data and the full power of the web, spawning an ‘app store’ for health.” Kenneth Mandl, Harvard Medical School.

 

4. Can academia create better career tracks for bioinformaticians?

Why it matters
The lack of attractive career paths in bioinformatics has led to a shortage of scientists that have both strong statistical skills and biological understanding. The loss of data scientists to other fields is slowing the pace of medical advances.

Next steps
Research institutions will take steps, including setting up formal career tracks, to reward bioinformaticians who take on multidisciplinary collaborations. Funders will find ways to better evaluate contributions from bioinformaticians.

Quote
“Perhaps the most promising product of big data, that labs will be able to explore countless and unimagined hypotheses, will be stymied if we lack the bioinformaticians that can make this happen.” Jeffrey Chang, University of Texas.

 

Eric Bender is a freelance science writer based in Newton, Massachusetts.

Read Full Post »

Older Posts »