Archive for the ‘Statistical Methods for Research Evaluation’ Category

Scientific Curation Fostering Expert Networks and Open Innovation: Lessons from Clive Thompson

Curators and Writer: Stephen J. Williams, Ph.D. with input from Curators Larry H. Bernstein, MD, FCAP, Dr. Justin D. Pearlman, MD, PhD, FACC and Dr. Aviva Lev-Ari, PhD, RN

(This discussion is part of a three-part series including:

Using Scientific Content Curation as a Method for Validation and Biocuration

Using Scientific Content Curation as a Method for Open Innovation)

 

Every month I get my Wired magazine (yes, in hard print; I still like to turn pages manually, plus I don’t mind getting grease or wing sauce on my magazine rather than on my e-reader), and I always love reading articles written by Clive Thompson. He has a certain flair for understanding the techno world we live in and the human/technology interaction, writing about interesting ways in which we almost inadvertently integrate new technologies into our day-to-day living, generating new entrepreneurship and new value.

An October 2013 Wired article by Clive Thompson, entitled “How Successful Networks Nurture Good Ideas: Thinking Out Loud”, describes how the voluminous writings, postings, tweets, and sharing on social media are fostering connections between people and ideas which, previously, had not existed. The article was drawn from Clive Thompson’s book Smarter Than You Think: How Technology Is Changing Our Minds for the Better. Tom Peters also commented about the article in his blog (see here).

Clive gives a wonderful example of Ory Okolloh, a young Kenyan-born law student who, after becoming frustrated with the lack of coverage of problems back home, started a blog about Kenyan politics. Her blog not only got interest from movie producers who were documenting female bloggers but also gained the interest of fellow Kenyans who, during the upheaval after the 2007 Kenyan elections, helped Ory to develop a Google map for reporting violence (http://www.ushahidi.com/), which eventually became a global organization using open-source technology for crisis management. There are a multitude of examples of how networks and the conversations within these circles are fostering new ideas. As Clive states in the article:

 

Our ideas are PRODUCTS OF OUR ENVIRONMENT.

They are influenced by the conversations around us.

However, the article got me thinking about how Science 2.0 and the internet are changing how scientists contribute, share, and make connections to produce new and transformative ideas.

But HOW MUCH Knowledge is OUT THERE?

 

Clive’s article listed some amazing facts about the mountains of posts, tweets, words, etc. out on the internet EVERY DAY, all of which exemplify the problem:

  • 154.6 billion EMAILS per DAY
  • 400 million TWEETS per DAY
  • 1 million BLOG POSTS (including this one) per DAY
  • 2 million COMMENTS on WordPress per DAY
  • 16 million WORDS on Facebook per DAY
  • TOTAL 52 TRILLION WORDS per DAY

As he estimates, this would be the equivalent of 520 million books per DAY (assuming an average book of 100,000 words).
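
As a quick back-of-the-envelope check of that arithmetic (a trivial illustrative calculation, not taken from the article):

```python
# 52 trillion words per day, at roughly 100,000 words per book
words_per_day = 52e12
words_per_book = 100_000
print(words_per_day / words_per_book)  # 520000000.0 -> about 520 million books per day
```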

A LOT of INFO. But as he suggests, it is not the volume but how we create and share this information that is critical; as the science fiction writer Theodore Sturgeon noted, “Ninety percent of everything is crap” (AKA Sturgeon’s Law).

 

Internet Live Stats shows how congested the internet is each day (http://www.internetlivestats.com/). Needless to say, Clive’s numbers are a bit off. As of the writing of this article:

 

  • 2.9 billion internet users
  • 981 million websites (only 25,000 hacked today)
  • 128 billion emails
  • 385 million Tweets
  • > 2.7 million BLOG posts today (including this one)

 

The Good, The Bad, and the Ugly of the Scientific Internet (The Wild West?)

 

So how many science blogs are out there? Well, back in 2008 “grrlscientist” asked this question and turned up a total of 19,881 blogs; however, most were “pseudoscience” blogs, not written by Ph.D.- or M.D.-level scientists. A deeper search on Technorati using the search term “scientist PhD” turned up about 2,000 written by trained scientists.

So granted, there is a lot of good, bad, and ugly ….. when it comes to scientific information on the internet!

I had recently re-posted, on this site, a great example of how bad science and medicine can get propagated throughout the internet:

http://pharmaceuticalintelligence.com/2014/06/17/the-gonzalez-protocol-worse-than-useless-for-pancreatic-cancer/

 

and in a Nature report: Stem cells: Taking a stand against pseudoscience

http://www.nature.com/news/stem-cells-taking-a-stand-against-pseudoscience-1.15408

Drs. Elena Cattaneo and Gilberto Corbellini document their long, hard fight against false and invalidated medical claims made by some “clinicians” about the utility and medical benefits of certain stem-cell therapies, sacrificing their time to debunk medical pseudoscience.

 

Using Curation and Science 2.0 to build Trusted, Expert Networks of Scientists and Clinicians

 

Establishing networks of trusted colleagues has been a cornerstone of the scientific discourse for centuries. For example, in the mid-1640s, the Royal Society began as:

 

“a meeting of natural philosophers to discuss promoting knowledge of the natural world through observation and experiment”, i.e. science. The Society met weekly to witness experiments and discuss what we would now call scientific topics. The first Curator of Experiments was Robert Hooke.”

 

from The History of the Royal Society

 

Royal Society Coat of Arms

The Royal Society of London for Improving Natural Knowledge.

(photo credit: Royal Society)

(Although one wonders why they met “incognito”)

Indeed, as discussed in “Science 2.0/Brainstorming” by the originators of OpenWetWare, an open-source science-notebook software designed to foster open innovation, the new search and aggregation tools are making it easier to find, contribute, and share information with interested individuals. This paradigm is the basis for the shift from Science 1.0 to Science 2.0. Science 2.0 is attempting to remedy current drawbacks that hinder rapid and open scientific collaboration and discourse, including:

  • Slow time frame of current publishing methods: reviews can take years to fashion, leading to outdated material
  • Level of information dissemination is currently one-dimensional: peer review, highly polished work, conferences
  • Current publishing does not encourage open feedback and review
  • Published articles edited for print do not take advantage of new web-based features, including tagging, search-engine optimization, interactive multimedia, and hyperlinks
  • Published data and methodology are often incomplete
  • Published data are not available in formats readily accessible across platforms: gene lists are now mandated to be supplied as files; however, other data do not have to be supplied in file format

In short, the sheer volume of scientific information, the slow and one-dimensional nature of traditional publishing, and the limited accessibility of published data and methods all hinder rapid, open scientific discourse; as discussed below, scientific content curation can help address each of these problems.

 

Curation in the Sciences: View from Scientific Content Curators Larry H. Bernstein, MD, FCAP, Dr. Justin D. Pearlman, MD, PhD, FACC and Dr. Aviva Lev-Ari, PhD, RN

Curation is an active filtering of the immense amount of relevant and irrelevant content found on the web and in the peer-reviewed literature. As a result, content may be disruptive. However, in doing good curation, one does more than simply assign value by presentation of creative work in any category. Great curators comment and share experience across content, authors and themes. Great curators may see patterns others don’t, or may challenge or debate complex and apparently conflicting points of view. Answers to specifically focused questions come from the hard work of many in laboratory settings creatively establishing answers to definitive questions, each a part of the larger knowledge base of reference. There are those rare “Einsteins” who imagine a whole universe, unlike the three blind men of the Sufi tale: one held the tail, the other the trunk, the other the ear, and they all said “this is an elephant!”
In my reading, I learn that the optimal ratio of curation to creation may be as high as 90% curation to 10% creation. Creating content is expensive. Curation, by comparison, is much less expensive.

– Larry H. Bernstein, MD, FCAP

Curation is Uniquely Distinguished by the Historical Exploratory Ties that Bind – Larry H. Bernstein, MD, FCAP

The explosion of information by numerous media, hardcopy and electronic, written and video, has created difficulties tracking topics and tying together relevant but separated discoveries, ideas, and potential applications. Some methods to help assimilate diverse sources of knowledge include a content expert preparing a textbook summary, a panel of experts leading a discussion or think tank, and conventions moderating presentations by researchers. Each of those methods has value and an audience, but they also have limitations, particularly with respect to timeliness and pushing the edge. In the electronic data age, there is a need for further innovation, to make synthesis, stimulating associations, synergy and contrasts available to audiences in a more timely and less formal manner. Hence the birth of curation. Key components of curation include expert identification of data, ideas and innovations of interest, expert interpretation of the original research results, integration with context, digesting, highlighting, correlating and presenting in novel light.

– Justin D. Pearlman, MD, PhD, FACC, from The Voice of Content Consultant on The Methodology of Curation in Cardiovascular Original Research: Cases in Methodology Design for Content Co-Curation The Art of Scientific & Medical Curation

 

In Power of Analogy: Curation in Music, Music Critique as a Curation and Curation of Medical Research Findings – A Comparison, Drs. Larry Bernstein and Aviva Lev-Ari liken the medical and scientific curation process to the curation of musical works into a thematic program:

 

  • Work of Original Music Curation and Performance
  • Music Review and Critique as a Curation
  • Work of Original Expression – what is the methodology of Curation in the context of Medical Research Findings: Exposition of Synthesis and Interpretation of the significance of the results to Clinical Care

… leading to new, curated, and collaborative works by networks of experts to generate (in this case) ebooks on the most significant trends and interpretations of scientific knowledge as they relate to medical practice.

 

In Summary: How Scientific Content Curation Can Help

 

Given the aforementioned problems of:

I. the complex and rapid deluge of scientific information

II. the need for a collaborative, open environment to produce transformative innovation

III. the need for alternative ways to disseminate scientific findings

CURATION MAY OFFER SOLUTIONS

I. Curation exists beyond the review: curation decreases the time needed to assess current trends, adding multiple insights and analyses WITH an underlying METHODOLOGY (discussed below) while NOT acting as mere reiteration or regurgitation

II. Curation provides insights from the WHOLE scientific community on multiple WEB 2.0 platforms

III. Curation makes use of new computational and Web-based tools to provide interoperability of data and reporting of findings (shown in Examples below)

 

Therefore a discussion is given on methodologies, definitions of best practices, and tools developed to assist the content curation community in this endeavor.

Methodology in Scientific Content Curation as Envisioned by Aviva Lev-Ari, PhD, RN

 

At Leaders in Pharmaceutical Business Intelligence, site owner and chief editor Aviva Lev-Ari, PhD, RN, has been developing a strategy “for the facilitation of Global access to Biomedical knowledge rather than the access to sheer search results on Scientific subject matters in the Life Sciences and Medicine”. According to Aviva, “for the methodology to attain this complex goal it is to be dealing with popularization of ORIGINAL Scientific Research via Content Curation of Scientific Research Results by Experts, Authors, Writers using the critical thinking process of expert interpretation of the original research results.” The following post:

Cardiovascular Original Research: Cases in Methodology Design for Content Curation and Co-Curation

 

http://pharmaceuticalintelligence.com/2013/07/29/cardiovascular-original-research-cases-in-methodology-design-for-content-curation-and-co-curation/

demonstrates two examples of how content co-curation attempts to achieve this aim and develop networks of scientist and clinician curators to aid in the active discussion of scientific and medical findings, and how scientific content curation can be used as a means for critique, offering a “new architecture for knowledge”. Indeed, popular search engines such as Google and Yahoo, and even scientific search engines such as NCBI’s PubMed and the OVID search engine, rely on keywords and Boolean algorithms, which has created a need for more context-driven scientific search and discourse.

In Science and Curation: the New Practice of Web 2.0, Célya Gruson-Daniel (@HackYourPhd) states:

To address this need, human intermediaries, empowered by the participatory wave of web 2.0, naturally started narrowing down the information and providing an angle of analysis and some context. They are bloggers, regular Internet users or community managers – a new type of profession dedicated to the web 2.0. A new use of the web has emerged, through which the information, once produced, is collectively spread and filtered by Internet users who create hierarchies of information.

… where Célya considers curation an essential practice to manage open science and this new style of research.

As mentioned above, in her article Dr. Lev-Ari presents two examples of how content curation expanded thought and discussion, and eventually generated new ideas.

  1. Curator edifies content through an analytic process = NEW forms of writing and organization leading to new interconnections of ideas = NEW INSIGHTS

i) Evidence: curation methodology leading to new insights for biomarkers

  2. Same as #1 but with multiple players (experts), each bringing unique insights, perspectives, and skills, yielding new research = NEW LINE of CRITICAL THINKING

ii) Evidence: co-curation methodology among cardiovascular experts leading to the cardiovascular series ebooks


The Life Cycle of Science 2.0. Due to Web 2.0, new paradigms of scientific collaboration are rapidly emerging. Originally, scientific discovery was performed by individual laboratories or “scientific silos”, where the main methods of communication were peer-reviewed publication, meeting presentations, and ultimately news outlets and multimedia. In the digital era, data were organized for literature search and biocurated databases. In the era of social media and Web 2.0, a group of scientifically and medically trained “curators” organize the piles of digitally generated data and fit them into an organizational structure which can be shared, communicated, and analyzed in a holistic approach, launching new ideas due to changes in the organizational structure of data and in data analytics.

 

The result, in this case, is a collaborative written work beyond the scope of a review. Currently, review articles are written by experts in the field and summarize the state of a research area. However, using collaborative, trusted networks of experts, the result is a real-time synopsis and analysis of the field with the goal in mind to

INCREASE THE SCIENTIFIC CURRENCY.

For a detailed description of the methodology, please see Cardiovascular Original Research: Cases in Methodology Design for Content Co-Curation The Art of Scientific & Medical Curation

 

In her paper, Curating e-Science Data, Maureen Pennock, from the British Library, emphasized the importance of using a diligent, validated, reproducible, and cost-effective methodology for curation by e-science communities over the ‘Grid’:

“The digital data deluge will have profound repercussions for the infrastructure of research and beyond. Data from a wide variety of new and existing sources will need to be annotated with metadata, then archived and curated so that both the data and the programmes used to transform the data can be reproduced for use in the future. The data represent a new foundation for new research, science, knowledge and discovery”

— JISC Senior Management Briefing Paper, The Data Deluge (2004)

 

As she states, proper data and content curation is important for:

  • Post-analysis
  • Data and research result reuse for new research
  • Validation
  • Preservation of data in newer formats to prolong life-cycle of research results

However she laments the lack of

  • Funding for such efforts
  • Training
  • Organizational support
  • Monitoring
  • Established procedures

 

Tatiana Aders wrote a nice article based on an interview with Microsoft’s Robert Scoble, in which he emphasized the need for curation in a world where “Twitter is the replacement of the Associated Press Wire Machine” and new technology platforms are knocking out old platforms at a rapid pace. In addition, he notes that curation is also a social art form whose primary concerns are to understand an audience and a niche.

Indeed, part of the reason the need for curation is unmet, as Mark Carrigan writes, is the lack of appreciation by academics of the utility of tools such as Pinterest, Storify, and Pearltrees to effectively communicate and build collaborative networks.

And teacher Nancy White, in her article Understanding Content Curation on her blog Innovations in Education, shows examples of how curation is an educational tool for students and teachers, demonstrating that students need to CONTEXTUALIZE what they collect to add enhanced value, using higher mental processes such as:

  • Knowledge
  • Comprehension
  • Application
  • Analysis
  • Synthesis
  • Evaluation

A GREAT table about the differences between Collecting and Curating by Nancy White at http://d20innovation.d20blogs.org/2012/07/07/understanding-content-curation/

University of Massachusetts Medical School has aggregated some useful curation tools at http://esciencelibrary.umassmed.edu/data_curation

Many of these tools are related to biocuration and building databases, but the common idea is curating data with indexing, analyses, and contextual value to provide an audience with the means to generate NETWORKS OF NEW IDEAS.

See here for a curation of how networks foster knowledge, by Erika Harrison on Scoop.it

(http://www.scoop.it/t/mobilizing-knowledge-through-complex-networks)

 

“Nowadays, any organization should employ network scientists/analysts who are able to map and analyze complex systems that are of importance to the organization (e.g. the organization itself, its activities, a country’s economic activities, transportation networks, research networks).”

– Andrea Carafa, insight from the World Economic Forum New Champions 2012, “Power of Networks”

 

Creating Content Curation Communities: Breaking Down the Silos!

 

An article by Dr. Dana Rotman, “Facilitating Scientific Collaborations Through Content Curation Communities”, highlights how scientific information resources, traditionally created and maintained by paid professionals, are being crowdsourced to professionals and nonprofessionals in what she terms “content curation communities”, consisting of professional and nonprofessional volunteers who create, curate, and maintain the various scientific database tools we use, such as Encyclopedia of Life, ChemSpider (for Slideshare see here), Biowikipedia, etc. Although very useful and openly available, these projects create their own challenges, such as:

  • information integration (various types of data and formats)
  • social integration (marginalized by scientific communities, no funding, no recognition)

The authors set forth some ways to overcome these challenges of the content curation community including:

  1. standardization in practices
  2. visualization to document contributions
  3. emphasizing role of information professionals in content curation communities
  4. maintaining quality control to increase respectability
  5. recognizing participation to professional communities
  6. proposing funding/national meeting – Data Intensive Collaboration in Science and Engineering Workshop

A few great presentations and papers from the 2012 DICOSE meeting are found below

Judith M. Brown, Robert Biddle, Stevenson Gossage, Jeff Wilson & Steven Greenspan. Collaboratively Analyzing Large Data Sets using Multitouch Surfaces. (PDF) NotesForBrown

 

Bill Howe, Cecilia Aragon, David Beck, Jeffrey P. Gardner, Ed Lazowska, Tanya McEwen. Supporting Data-Intensive Collaboration via Campus eScience Centers. (PDF) NotesForHowe

 

Kerk F. Kee & Larry D. Browning. Challenges of Scientist-Developers and Adopters of Existing Cyberinfrastructure Tools for Data-Intensive Collaboration, Computational Simulation, and Interdisciplinary Projects in Early e-Science in the U.S.. (PDF) NotesForKee

 

Ben Li. The mirages of big data. (PDF) NotesForLiReflectionsByBen

 

Betsy Rolland & Charlotte P. Lee. Post-Doctoral Researchers’ Use of Preexisting Data in Cancer Epidemiology Research. (PDF) NoteForRolland

 

Dana Rotman, Jennifer Preece, Derek Hansen & Kezia Procita. Facilitating scientific collaboration through content curation communities. (PDF) NotesForRotman

 

Nicholas M. Weber & Karen S. Baker. System Slack in Cyberinfrastructure Development: Mind the Gaps. (PDF) NotesForWeber

Indeed, the movement from Science 1.0 to Science 2.0 originated because these “silos” had frustrated many scientists, resulting in changes not only in publishing (Open Access) but also in the communication of protocols (online protocol sites and notebooks such as OpenWetWare and BioProtocols Online) and in data and material registries (CGAP and tumor banks). Some examples are given below.

Open Science Case Studies in Curation

1. Open Science Project from Digital Curation Center

This project looked at what motivates researchers to work in an open manner with regard to their data, results and protocols, and whether advantages are delivered by working in this way.

The case studies consider the benefits and barriers to using ‘open science’ methods, and were carried out between November 2009 and April 2010 and published in the report Open to All? Case studies of openness in research. The Appendices to the main report (pdf) include a literature review, a framework for characterizing openness, a list of examples, and the interview schedule and topics. Some of the case study participants kindly agreed to us publishing the transcripts. This zip archive contains transcripts of interviews with researchers in astronomy, bioinformatics, chemistry, and language technology.

 

see: Pennock, M. (2006). “Curating e-Science Data”. DCC Briefing Papers: Introduction to Curation. Edinburgh: Digital Curation Centre. Handle: 1842/3330. Available online: http://www.dcc.ac.uk/resources/briefing-papers/introduction-curation/curating-e-science-data

 

2. cBIO – cBio’s biological data curation group developed and operates using a methodology called CIMS, the Curation Information Management System. CIMS is a comprehensive curation and quality control process that efficiently extracts information from publications.

 

3. NIH Topic Maps – This website provides a database and web-based interface for searching and discovering the types of research awarded by the NIH. The database uses automated, computer generated categories from a statistical analysis known as topic modeling.

 

4. SciKnowMine (USC) – We propose to create a framework to support biocuration called SciKnowMine (after ‘Scientific Knowledge Mine’), cyberinfrastructure that supports biocuration through the automated mining of text, images, and other amenable media at the scale of the entire literature.

 

5. OpenWetWare – OpenWetWare is an effort to promote the sharing of information, know-how, and wisdom among researchers and groups who are working in biology & biological engineering. If you would like edit access, would be interested in helping out, or want your lab website hosted on OpenWetWare, please join us. OpenWetWare is managed by the BioBricks Foundation. They also have a wiki about Science 2.0.

6. LabTrove – a lightweight, web-based laboratory “blog” as a route towards a marked-up record of work in a bioscience research laboratory. The authors of a PLOS ONE article, from the University of Southampton, report the development of an open scientific lab notebook using a blogging strategy to share information.

7. OpenScience Project – The OpenScience project is dedicated to writing and releasing free and Open Source scientific software. We are a group of scientists, mathematicians and engineers who want to encourage a collaborative environment in which science can be pursued by anyone who is inspired to discover something new about the natural world.

8. Open Science Grid is a multi-disciplinary partnership to federate local, regional, community and national cyberinfrastructures to meet the needs of research and academic communities at all scales.

 

9. Some ongoing biomedical knowledge (curation) projects at ISI

IICurate
This project is concerned with developing a curation and documentation system for information integration in collaboration with the II Group at ISI as part of the BIRN.

BioScholar
Its primary purpose is to provide software for experimental biomedical scientists that would permit a single scientific worker (at the level of a graduate student or postdoctoral worker) to design, construct and manage a shared knowledge repository for a research group derived from a local store of PDF files. This project is funded by NIGMS from 2008-2012 (R01-GM083871).

10. Tools useful for scientific content curation

 

Research Analytic and Curation Tools from University of Queensland

 

Thomson Reuters information curation services for pharma industry

 

Microblogs as a way to communicate information about HPV infection among clinicians and patients; use of Chinese microblog SinaWeibo as a communication tool

 

VIVO for scientific communities – In order to connect this information about research activities across institutions and make it available to others, taking into account smaller players in the research landscape and addressing their need for specific information (for example, by providing non-conventional research objects), the open-source software VIVO, which provides research information as linked open data (LOD), is used in many countries. So-called VIVO harvesters collect research information that is freely available on the web and convert the collected data in conformity with LOD standards. The VIVO ontology builds on prevalent LOD namespaces and, depending on the needs of the specialist community concerned, can be expanded.

 

 

11. Examples of scientific curation in different areas of Science/Pharma/Biotech/Education

 

From Science 2.0 to Pharma 3.0 Q&A with Hervé Basset

http://digimind.com/blog/experts/pharma-3-0/

A Q&A with Hervé Basset, specialist librarian in the pharmaceutical industry and owner of the blog “Science Intelligence“, about the inspiration behind his recent book entitled “From Science 2.0 to Pharma 3.0″, published by Chandos Publishing and available on Amazon, and about how health-care companies need a social media strategy to communicate with and convince the health-care consumer, not just the practitioner.

 

Thomson Reuters and NuMedii Launch Ground-Breaking Initiative to Identify Drugs for Repurposing. Companies leverage content, Big Data analytics and expertise to improve success of drug discovery

 

Content Curation as a Context for Teaching and Learning in Science

 

#OZeLIVE Feb2014

http://www.youtube.com/watch?v=Ty-ugUA4az0

Creative Commons license

 

DigCCur: a graduate-level program initiated by the University of North Carolina to train future digital curators in science and other subjects

 

Syracuse University offering a program in eScience and digital curation

 

Curation Tips from TED talks and tech experts

Steven Rosenbaum from Curation Nation

http://www.youtube.com/watch?v=HpncJd1v1k4

 

Pawan Deshpande from Curata on how content curation communities evolve and what makes good content curation:

http://www.youtube.com/watch?v=QENhIU9YZyA

 

How the Internet of Things is Promoting the Curation Effort

Update by Stephen J. Williams, PhD 3/01/19

Up till now, curation efforts like wikis (Wikipedia, Wikimedicine, WormBase, GenBank, etc.) have been supported by a largely voluntary army of citizens, scientists, and data enthusiasts. I am sure all have seen the requests for donations to help keep Wikipedia and its related projects up and running. One of the more obscure sister projects of Wikipedia, Wikidata, wants to curate and represent all information in such a way that machines, computers, and humans can all converse in it. An army of about 4 million contributors hold Wiki accounts and maintain these databases.

Enter the Age of the Personal Digital Assistants (Hellooo Alexa!)

In a March 2019 WIRED article, “Encyclopedia Automata: Where Alexa Gets Its Information”, senior WIRED writer Tom Simonite reports on the need for new types of data structures and on how curated databases are essential for the new fields of AI, as well as for enabling personal digital assistants like Alexa or Google Assistant to decipher the meaning of a user’s request.

As Mr. Simonite noted, many of our libraries of knowledge are encoded in an “ancient technology largely opaque to machines – prose.” Search engines like Google do not have a problem with a question asked in prose, as they just have to find relevant links to pages. Yet this is a problem for Google Assistant, for instance, as machines can’t quickly extract meaning from the internet’s mess of “predicates, complements, sentences, and paragraphs. It requires a guide.”

Enter Wikidata.  According to founder Denny Vrandecic,

Language depends on knowing a lot of common sense, which computers don’t have access to

A Wikidata entry (of which there are about 60 million) codes every concept and item with a numeric identifier, the QID. These identifiers are integrated with tags (like the hashtags you use on Twitter or the tags used in WordPress for search engine optimization) so computers can identify patterns of recognition between these codes.
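
To make the idea of QIDs concrete, here is a minimal, illustrative Python sketch that looks up a single Wikidata item through Wikidata’s public EntityData endpoint (the use of the requests library, the example item Q42 – the entry for the writer Douglas Adams – and the printed fields are illustrative choices, not anything prescribed by the WIRED article):

```python
# Minimal sketch: fetch one Wikidata item by its QID.
# Q42 (Douglas Adams) is used purely as an example identifier.
import requests

def fetch_item(qid: str) -> dict:
    url = f"https://www.wikidata.org/wiki/Special:EntityData/{qid}.json"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return resp.json()["entities"][qid]

item = fetch_item("Q42")
print(item["labels"]["en"]["value"])        # human-readable label of the concept
print(item["descriptions"]["en"]["value"])  # short description added by contributors
# "Claims" link this QID to other QIDs via numeric property codes (P-numbers),
# which is how software can traverse relationships between concepts.
print(list(item["claims"])[:5])
```

Because each statement is expressed as QID-to-QID links rather than prose, an assistant can follow those links directly instead of trying to parse sentences.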

Now human entry into these databases is critical, as we add new facts and, in particular, meaning to each of these items. Otherwise, machines have problems deciphering our meaning, as with Apple’s Siri, where users have complained of dumb algorithms interpreting their requests.

The knowledge of future machines could be shaped by you and me, not just tech companies and PhDs.

But this effort needs money

Wikimedia’s executive director, Katherine Maher, has prodded and cajoled these megacorporations for tapping the free resources of the Wikis. In response, Amazon and Facebook have donated millions to the Wikimedia projects. Google recently gave US$3.1 million in donations.

 

Future postings on the relevance and application of scientific curation will include:

Using Scientific Content Curation as a Method for Validation and Biocuration

 

Using Scientific Content Curation as a Method for Open Innovation

 

Other posts on this site related to Content Curation and Methodology include:

The growing importance of content curation

Data Curation is for Big Data what Data Integration is for Small Data

6 Steps to More Effective Content Curation

Stem Cells and Cardiac Repair: Content Curation & Scientific Reporting

Cancer Research: Curations and Reporting

Cardiovascular Diseases and Pharmacological Therapy: Curations

Cardiovascular Original Research: Cases in Methodology Design for Content Co-Curation The Art of Scientific & Medical Curation

Exploring the Impact of Content Curation on Business Goals in 2013

Power of Analogy: Curation in Music, Music Critique as a Curation and Curation of Medical Research Findings – A Comparison

conceived: NEW Definition for Co-Curation in Medical Research

The Young Surgeon and The Retired Pathologist: On Science, Medicine and HealthCare Policy – The Best Writers Among the WRITERS

Reconstructed Science Communication for Open Access Online Scientific Curation

 

 


USPTO Guidance On Patentable Subject Matter

Curator and Reporter: Larry H Bernstein, MD, FCAP


Revised 4 July, 2014

http://pharmaceuticalintelligence.com/2014/07/03/uspto-guidance-on-patentable-subject-matter

 

I came across a few recent articles on the subject of US Patent Office guidance on patentability, as well as on Supreme Court rulings on claims. I filed several patents on clinical laboratory methods early in my career upon the recommendation of my brother-in-law, now deceased. Years later, with both my brother-in-law and my patent attorney no longer alive, I look back and ask what I have learned, over $100,000 later, with many trips to the USPTO, opportunities not taken, and a one-year provisional patent behind me.

My conclusion is

(1) Patents are for the protection of the innovator, who might realize legal protection, but the cost and the time investment can well exceed the cost of starting up and building a small enterprise, which would be the next step.

(2) The other thing to consider is the capability of the lawyer or firm that represents you.  A patent that is well done can be expected to take 5-7 years to go through with due diligence.   I would not expect it to be done well by a university with many other competing demands. I might be wrong in this respect, as the climate has changed, and research universities have sprouted engines for change.  Experienced and productive faculty are encouraged or allowed to form their own such entities.

(3) The emergence of Big Data, computational biology, and very large data warehouses for data use and integration has changed the landscape. The resources required to pursue research along these lines are quite beyond an individual’s sole capacity without outside funding. In addition, the changed designated requirement of first-to-publish has muddied the water.

Of course, one can propose without anything published in the public domain. That makes it possible for corporate entities to file thousands of patents, whether or not there is actual validation at the time of filing. It would be quite a trying experience for anyone to pursue in the USPTO without some litigation over ownership of patent rights. At this stage of technology development, I have come to realize that the organization of research, peer review, and archiving of data is still at a stage where some of the best systems available for storing and accessing data still come considerably short of what is needed for the most complex tasks, even though improvements have come at an exponential pace.

I shall not comment on the contested views held by physicists, chemists, biologists, and economists over the completeness of guiding theories strongly held.  Only history will tell.  Beliefs can hold a strong sway, and have many times held us back.

I am not an expert on legal matters, but it is incomprehensible to me that issues concerning technology innovation can be adjudicated in the Supreme Court, as has occurred in recent years. I have postgraduate degrees in Medicine and Developmental Anatomy, post-medical training in pathology and laboratory medicine, and experience in analytical and research biochemistry. It is beyond the competencies expected of the Supreme Court, or even the Federal District Courts, to handle these types of cases, which we see with increasing frequency, as has occurred with respect to the development and application of the human genome.

I’m not sure that these developments can be resolved for the public good without a fuller development of an open-access system of publishing. Now I present some recent publications about, or published by, the USPTO.


USPTO Guidance On Patentable Subject Matter: Impediment to Biotech Innovation

Joanna T. Brougher, David A. Fazzolare. Journal of Commercial Biotechnology 2014; 20(3).


Abstract: In June 2013, the U.S. Supreme Court issued a unanimous decision upending more than three decades’ worth of established patent practice when it ruled that isolated gene sequences are no longer patentable subject matter under 35 U.S.C. Section 101. While many practitioners in the field believed that the USPTO would interpret the decision narrowly, the USPTO actually expanded the scope of the decision when it issued its guidelines for determining whether an invention satisfies Section 101.

The guidelines were met with intense backlash, with many arguing that they unnecessarily expanded the scope of the Supreme Court cases in a way that could unduly restrict the scope of patentable subject matter, weaken the U.S. patent system, and create a disincentive to innovation. By undermining patentable subject matter in this way, the guidelines may end up harming not only the companies that patent medical innovations, but also the patients who need medical care. This article examines the guidelines and their impact on various technologies.

Keywords:   patent, patentable subject matter, Myriad, Mayo, USPTO guidelines

Full Text: PDF

References

35 U.S.C. Section 101 states “Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.”

Prometheus Laboratories, Inc. v. Mayo Collaborative Services, 566 U.S. ___ (2012)

Association for Molecular Pathology et al., v. Myriad Genetics, Inc., 569 U.S. ___ (2013).

Parke-Davis & Co. v. H.K. Mulford Co., 189 F. 95, 103 (C.C.S.D.N.Y. 1911)

USPTO. Guidance For Determining Subject Matter Eligibility Of Claims Reciting Or Involving Laws of Nature, Natural Phenomena, & Natural Products.

http://www.uspto.gov/patents/law/exam/myriad-mayo_guidance.pdf

Funk Brothers Seed Co. v. Kalo Inoculant Co., 333 U.S. 127, 131 (1948)

USPTO. Guidance For Determining Subject Matter Eligibility Of Claims Reciting Or Involving Laws of Nature, Natural Phenomena, & Natural Products.

http://www.uspto.gov/patents/law/exam/myriad-mayo_guidance.pdf

Courtney C. Brinckerhoff, “The New USPTO Patent Eligibility Rejections Under Section 101.” PharmaPatentsBlog, published May 6, 2014, accessed http://www.pharmapatentsblog.com/2014/05/06/the-new-patent-eligibility-rejections-section-101/

Courtney C. Brinckerhoff, “The New USPTO Patent Eligibility Rejections Under Section 101.” PharmaPatentsBlog, published May 6, 2014, accessed http://www.pharmapatentsblog.com/2014/05/06/the-new-patent-eligibility-rejections-section-101/

DOI: http://dx.doi.org/10.5912/jcb664

 

Science 4 July 2014; 345 (6192): pp. 14-15  DOI: http://dx.doi.org/10.1126/science.345.6192.14
  • IN DEPTH

INTELLECTUAL PROPERTY

Biotech feels a chill from changing U.S. patent rules

A 2013 Supreme Court decision that barred human gene patents is scrambling patenting policies.


A year after the U.S. Supreme Court issued a landmark ruling that human genes cannot be patented, the biotech industry is struggling to adapt to a landscape in which inventions derived from nature are increasingly hard to patent. It is also pushing back against follow-on policies proposed by the U.S. Patent and Trademark Office (USPTO) to guide examiners deciding whether an invention is too close to a natural product to deserve patent protection. Those policies reach far beyond what the high court intended, biotech representatives say.

“Everything we took for granted a few years ago is now changing, and it’s generating a bit of a scramble,” says patent attorney Damian Kotsis of Harness Dickey in Troy, Michigan, one of more than 15,000 people who gathered here last week for the Biotechnology Industry Organization’s (BIO’s) International Convention.

At the meeting, attorneys and executives fretted over the fate of patent applications for inventions involving naturally occurring products—including chemical compounds, antibodies, seeds, and vaccines—and traded stories of recent, unexpected rejections by USPTO. Industry leaders warned that the uncertainty could chill efforts to commercialize scientific discoveries made at universities and companies. Some plan to appeal the rejections in federal court.

USPTO officials, meanwhile, implored attendees to send them suggestions on how to clarify and improve its new policies on patenting natural products, and even announced that they were extending the deadline for public comment by a month. “Each and every one of you in this room has a moral duty … to provide written comments to the PTO,” patent lawyer and former USPTO Deputy Director Teresa Stanek Rea told one audience.

At the heart of the shake-up are two Supreme Court decisions: the ruling last year in Association for Molecular Pathology v. Myriad Genetics Inc. that human genes cannot be patented because they occur naturally (Science, 21 June 2013, p. 1387); and the 2012 Mayo v. Prometheus decision, which invalidated a patent on a method of measuring blood metabolites to determine drug doses because it relied on a “law of nature” (Science, 12 July 2013, p. 137).

Myriad and Mayo are already having a noticeable impact on patent decisions, according to a study released here. It examined about 1000 patent applications that included claims linked to natural products or laws of nature that USPTO reviewed between April 2011 and March 2014. Overall, examiners rejected about 40%; Myriad was the basis for rejecting about 23% of the applications, and Mayo about 35%, with some overlap, the authors concluded. That rejection rate would have been in the single digits just 5 years ago, asserted Hans Sauer, BIO’s intellectual property counsel, at a press conference. (There are no historical numbers for comparison.) The study was conducted by the news service Bloomberg BNA and the law firm Robins, Kaplan, Miller & Ciseri in Minneapolis, Minnesota.
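
To see roughly how those percentages fit together, here is a small, purely illustrative inclusion-exclusion calculation (it assumes, only for the sake of the arithmetic, that every rejection in the study cited Myriad, Mayo, or both – an assumption the article does not state):

```python
# Hypothetical back-of-the-envelope estimate of the Myriad/Mayo overlap
total_apps = 1000
rejected   = 0.40 * total_apps   # ~400 applications rejected overall
myriad     = 0.23 * total_apps   # ~230 rejections citing Myriad
mayo       = 0.35 * total_apps   # ~350 rejections citing Mayo

overlap = myriad + mayo - rejected
print(overlap)  # ~180 applications, i.e. roughly 18% citing both decisions
```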

Is USPTO extending the decisions far beyond diagnostics and DNA?

The numbers suggest USPTO is extending the decisions far beyond diagnostics and DNA, attorneys say. Harness Dickey’s Kotsis, for example, says a client recently tried to patent a plant extract with therapeutic properties; it was different from anything in nature, Kotsis argued, because the inventor had altered the relative concentrations of key compounds to enhance its effect. Nope, decided USPTO, too close to nature.

In March, USPTO released draft guidance designed to help its examiners decide such questions, setting out 12 factors for them to weigh. For example, if an examiner deems a product “markedly different in structure” from anything in nature, that counts in its favor. But if it has a “high level of generality,” it gets dinged.

The draft has drawn extensive criticism. “I don’t think I’ve ever seen anything as complicated as this,” says Kevin Bastian, a patent attorney at Kilpatrick Townsend & Stockton in San Francisco, California. “I just can’t believe that this will be the standard.”

USPTO officials appear eager to fine-tune the draft guidance, but patent experts fear the Supreme Court decisions have made it hard to draw clear lines. “The Myriad decision is hopelessly contradictory and completely incoherent,” says Dan Burk, a law professor at the University of California, Irvine. “We know you can’t patent genetic sequences,” he adds, but “we don’t really know why.”

Get creative in using Draft Guidelines!

For now, Kotsis says, applicants will have to get creative to reduce the chance of rejection. Rather than claim protection for a plant extract itself, for instance, an inventor could instead patent the steps for using it to treat patients. Other biotech attorneys may try to narrow their patent claims. But there’s a downside to that strategy, they note: Narrower patents can be harder to protect from infringement, making them less attractive to investors. Others plan to wait out the storm, predicting USPTO will ultimately rethink its guidance and ease the way for new patents.

 

Public comment period extended

USPTO has extended the deadline for public comment to 31 July, with no schedule for issuing final language. Regardless of the outcome, however, Stanek Rea warned a crowd of riled-up attorneys that, in the world of biopatents, “the easy days are gone.”

 

United States Patent and Trademark Office

Today we published and made electronically available a new edition of the Manual of Patent Examining Procedure (MPEP): http://www.uspto.gov/web/offices/pac/mpep/index.html (see the Summary of Changes).

PDF Title Page
PDF Foreword
PDF Introduction
PDF Table of Contents
PDF Chapter 600 – Parts, Form, and Content of Application
PDF Chapter 700 – Examination of Applications
PDF Chapter 800 – Restriction in Applications Filed Under 35 U.S.C. 111; Double Patenting
PDF Chapter 900 – Prior Art, Classification, and Search
PDF Chapter 1000 – Matters Decided by Various U.S. Patent and Trademark Office Officials
PDF Chapter 1100 – Statutory Invention Registration (SIR); Pre-Grant Publication (PGPub) and Preissuance Submissions
PDF Chapter 1200 – Appeal
PDF Chapter 1300 – Allowance and Issue
PDF Appendix L – Patent Laws
PDF Appendix R – Patent Rules
PDF Appendix P – Paris Convention
PDF Subject Matter Index
PDF Zipped version of the MPEP current revision in PDF format

Manual of Patent Examining Procedure (MPEP), Ninth Edition, March 2014

The USPTO continues to offer an online discussion tool for commenting on selected chapters of the Manual. To participate in the discussion and to contribute your ideas, go to: http://uspto-mpep.ideascale.com.

Note: For current fees, refer to the Current USPTO Fee Schedule.
Consolidated Laws – The patent laws in effect as of May 15, 2014.
Consolidated Rules – The patent rules in effect as of May 15, 2014.
MPEP Archives (1948 – 2012)
Current MPEP: Searchable MPEP

The documents updated in the Ninth Edition of the MPEP, dated March 2014, include changes that became effective in November 2013 or earlier.
All of the documents have been updated for the Ninth Edition except Chapters 800, 900, 1000, 1300, 1700, 1800, 1900, 2000, 2300, 2400, 2500, and Appendix P.
More information about the changes and updates is available from the “Blue Page – Introduction” of the Searchable MPEP or from the “Summary of Changes” link to the HTML and PDF versions provided below.

Discuss the Manual of Patent Examining Procedure (MPEP)

Welcome to the MPEP discussion tool!

We have received many thoughtful ideas on Chapters 100-600 and 1800 of the MPEP as well as on how to improve the discussion site. Each and every idea submitted by you, the participants in this conversation, has been carefully reviewed by the Office, and many of these ideas have been implemented in the August 2012 revision of the MPEP and many will be implemented in future revisions of the MPEP. The August 2012 revision is the first version provided to the public in a web based searchable format. The new search tool is available at http://mpep.uspto.gov. We would like to thank everyone for participating in the discussion of the MPEP.

We have some great news! Chapters 1300, 1500, 1600 and 2400 of the MPEP are now available for discussion. Please submit any ideas and comments you may have on these chapters. Also, don’t forget to vote on ideas and comments submitted by other users. As before, our editorial staff will periodically be posting proposed new material for you to respond to, and in some cases will post responses to some of the submitted ideas and comments. Recently, we have received several comments concerning the Leahy-Smith America Invents Act (AIA). Please note that comments regarding the implementation of the AIA should be submitted to the USPTO via email to aia_implementation@uspto.gov or via postal mail, as indicated at the America Invents Act Web site. Additional information regarding the AIA is available at www.uspto.gov/americainventsact. We have also received several comments suggesting policy changes, which have been routed to the appropriate offices for consideration. We really appreciate your thinking and recommendations!

FDA Guidance for Industry: Electronic Source Data in Clinical Investigations


The FDA published its new Guidance for Industry (GfI) – “Electronic Source Data in Clinical Investigations” – in September 2013. The Guidance defines the expectations of the FDA concerning electronic source data generated in the context of clinical trials. Find out more about this Guidance: http://www.gmp-compliance.org/enews_4288_FDA%20Guidance%20for%20Industry%3A%20Electronic%20Source%20Data%20in%20Clinical%20Investigations_8534,8457,8366,8308,Z-COVM_n.html

After more than 5 years and two draft versions, the final version of the Guidance for Industry (GfI) – “Electronic Source Data in Clinical Investigations” – was published in September 2013. This new FDA Guidance defines the FDA’s expectations for sponsors, CROs, investigators and other persons involved in the capture, review and retention of electronic source data generated in the context of FDA-regulated clinical trials. In an effort to encourage the modernization and increased efficiency of processes in clinical trials, the FDA clearly supports the capture of electronic source data and emphasizes the agency’s intention to support activities aimed at ensuring the reliability, quality, integrity and traceability of this source data, from its electronic source to the electronic submission of the data in the context of an authorization procedure. The Guidance addresses aspects such as data capture, data review and record retention. When the computerized systems used in clinical trials are described, the FDA recommends that the description not only focus on the intended use of the system, but also on data protection measures and the flow of data across system components and interfaces. In practice, the pharmaceutical industry needs to meet significant requirements regarding organisation, planning, specification and verification of computerized systems in the field of clinical trials. The FDA also mentions in the Guidance that it does not intend to apply 21 CFR Part 11 to electronic health records (EHR).

Author: Oliver Herrmann, Q-Infiity
Source: http://www.fda.gov/downloads/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/UCM328691.pdf
Webinar: https://collaboration.fda.gov/p89r92dh8wc

 


@Stanford University – Rapid Language Evolution in 19th-century Brazil: Data Mining, Literary Analysis and Evolutionary Biology – A Study of Six Centuries of Portuguese-language Texts

Reporter: Aviva Lev-Ari, PhD, RN

Article ID #142: @Stanford University – Rapid Language Evolution in 19th-century Brazil: Data Mining, Literary Analysis and Evolutionary Biology – A Study of Six Centuries of Portuguese-language Texts. Published on 6/5/2014

WordCloud Image Produced by Adam Tubman

Stanford collaboration offers new perspectives on evolution of Brazilian language

Using a novel combination of data mining, literary analysis and evolutionary biology to study six centuries of Portuguese-language texts, Stanford scholars discover the literary roots of rapid language evolution in 19th-century Brazil.

(Photo: L.A. Cicero) Stanford biology Professor Marcus Feldman, left, and Cuauhtémoc García-García, a graduate student in Iberian and Latin American Cultures, combined forces to investigate the evolution of Portuguese as spoken in Brazil.

 

Literature and biology may not seem to overlap in their endeavors, but a Stanford project exploring the evolution of written language in Brazil is bringing the two disciplines together.

Over the last 18 months, Iberian and Latin American Cultures graduate student Cuauhtémoc García-García and biology Professor Marcus Feldman have been working together to trace the evolution of the  Brazilian Portuguese language through literature.

By combining Feldman’s expertise in mathematical analysis of cultural evolution with García-García’s knowledge of Latin American culture and computer programming, they have produced quantifiable evidence of rapid historical changes in written Brazilian Portuguese in the 19th and 20th centuries.

Specifically, Feldman and García-García are studying the changing use of words in tens of thousands of texts, with a focus on the personal pronouns that Brazilians used to address one another.

Their digital analysis of linguistics development in literary texts reflects Brazil’s complex colonial history.

The change in the use of personal pronouns, a daily part of social and cultural interaction, formed part of an evolving linguistic identity that was specific to Brazil, and not its Portuguese colonizers.

“We believe that this fast transition in the written language was due primarily to the approximately 300-year prohibition of both the introduction of the printing press and the foundation of universities in Brazil under Portuguese rule,” García-García said.

What Feldman and García-García found was that spoken language did in fact evolve during those 300 years, but little written evidence of that process exists because colonial restrictions on printing and literacy prevented language development in the written form.

A national sentiment of “write as we speak” arose in Brazil after Portuguese rule ended. García-García said their data shows an abrupt introduction in written texts of the spoken pronouns that were developed during the 300-year colonization period.

Drawing on Feldman’s experience with theoretical and statistical evolutionary models, García-García developed computer programs that count certain words to see how often they appear and how their use has changed over hundreds of years.

In Brazilian literary works produced in the post-colonial period, Feldman said, they have “found examples of written linguistic evolution over short time periods, contrary to the longer periods that are typical for changes in language.”

The findings will figure prominently in García-García’s dissertation, which addresses the transmission of written language across time and space.

The project’s source materials include about 70,000 digitized works in Portuguese from the 13th to the 21st century, ranging from literature and newspapers to technical manuals and pamphlets.

García-García, a member of The Digital Humanities Focal Group at Stanford, said their research “shows how written language changed, and through these changes in pronoun use, we now have a better understanding of how Brazilian writing evolved following the introduction of the printing press.”

Feldman, a population geneticist and one of the founders of the quantitative theory of cultural evolution, said he sees their project as a natural approach to linguistic evolution.

“I believe that evolutionary science and the humanities have a lot to offer each other in both theoretical and empirical explorations,” Feldman said.

Language by the numbers

García-García became interested in language evolution while studying Brazilian Portuguese under the instruction of Stanford lecturer Lyris Wiedemann. He approached Feldman, proposing an evolutionary study of Brazilian Portuguese, and Feldman agreed to help him analyze the data. García-García then enlisted Stanford lecturer Agripino Silveira, who provided linguistic expertise.

García-García worked with Stanford Library curators Glen Worthey, Adan Griego and Everardo Rodriguez for more than a year to develop the technical infrastructure and copyright clearance he needed to access Stanford’s entire digitized corpus of Portuguese language texts. After incorporating even more source material from the HathiTrust digital archive, García-García began the time-consuming task of “cleaning” the corpus, so data could be effectively mined from it.

“Sometimes there were duplicates, issues with the digitization, and works with multiple editions that created ‘noise’ in the corpus,” he said.
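
One simple way to flag the exact duplicates described above is sketched below: hash a normalized version of each text and group files whose fingerprints collide. This is only an illustration of the idea, not the project’s actual cleaning pipeline, and near-duplicates from OCR noise or variant editions would need fuzzier matching.

```python
# Illustrative sketch: flag likely duplicate works in a corpus by hashing a
# normalized version of each text. Exact-collision matching only; noisy OCR
# or variant editions would need fuzzier (e.g., shingling-based) comparison.
import hashlib
import os
import re
from collections import defaultdict

def fingerprint(text):
    normalized = re.sub(r"\W+", " ", text.lower()).strip()
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()

def find_exact_duplicates(corpus_dir):
    groups = defaultdict(list)
    for name in os.listdir(corpus_dir):
        if not name.endswith(".txt"):
            continue
        with open(os.path.join(corpus_dir, name), encoding="utf-8") as fh:
            groups[fingerprint(fh.read())].append(name)
    return {h: files for h, files in groups.items() if len(files) > 1}
```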

Following months of preparation, Feldman and García-García were able to begin data mining. Specifically, they counted the incidences of two pronouns, tu and você, which both mean the singular “you,” and how their incidence in literature changed over time.

“After running various searches, I could correlate results and see how and when certain words were used to build up a comprehensive image of this evolution,” he said.

Tu was – and still is – used in Portugal as the typical way to say ‘you.’ But, in Brazil, você is the more normal way to say it, particularly in major cities like Rio de Janeiro and São Paulo where the majority of the population lives,” García-García explained.

However, that was not always the case. When Brazil was a Portuguese colony, and up until the arrival of the printing press in 1808, tu was the canonical form in written language.

In the run-up to independence in 1822, printing presses and universities were established in Brazil for the first time in 1808; both had previously been prohibited by the Portuguese colonizers in what García-García calls “cultural repression.”

By the late 19th century, você emerged as the way to address people, shedding part of the colonial legacy, and tu quickly became less prominent in written Brazilian Portuguese.

“Our findings quantifiably show how pronoun use developed. We have found that around 1840, você was used about 10-15 percent of the time by authors to say ‘you.’ By the turn of the century, this had increased to about 70 percent,” García-García said.

“Our data suggest that você was rarely used in the late 17th and 18th centuries, but really appears and takes hold in the middle of the 19th century, a few decades after 1808. Thus, the late arrival of the printing press marks a critical point for understanding the evolution of written Portuguese in Brazil,” he said.

From Romanticism to realism

Their research revealed an intriguing literary coincidence – the period of transition from tu to você correlated with the broad change in the dominant literary genre in Brazilian literature from European Romanticism to Latin American realism.

Interestingly, the researchers noticed that the rapid change was most evident several decades after Brazil’s independence in the 1820s because it took that long for Brazilian writers to develop their own voice and style.

For centuries Brazilian writers were forced to write in the style of the Portuguese, but as García-García said, “with their new freedom they wanted to write stories that reflected their national identity.”

“Machado de Assis, arguably Brazil’s greatest author, is a fine example. His early novels are archetypally Romanticist, and then his later novels are deeply Realist, and the use of the pronouns shift from one to the other,” García-García said.

Nonetheless, in Machado’s work there is sometimes a purposeful switch back to the tu form if, for example, the author wanted to evoke a certain sentiment or change the narrative voice.

“The data-mining project cannot ascertain subtle uses of words and how, in some works, the pronouns are ‘interchangeable,'” he added.

Computational expertise was no substitute for literary expertise, and García-García used the two disciplines in tandem to get a clearer picture of his data.

“I had to stop using the computer and go back to a close reading of a large sample of books, and the literary genre change reflects this period of post-colonial social and historical change,” he said.

Feldman and García-García hope to use their methodology to explore different languages.

“Next we hope to study the digitized Spanish language corpus, which currently comprises close to a quarter of a million works from the last 900 years,” García-García said.

Tom Winterbottom is a doctoral candidate in Iberian and Latin American Cultures at Stanford. For more news about the humanities at Stanford, visit the Human Experience.

Media Contact

Corrie Goldman, director of humanities communication: (650) 724-8156, corrieg@stanford.edu

 

SOURCE

http://news.stanford.edu/news/2014/june/evolution-language-brazil-060414.html

Read Full Post »

Pfizer Abandons Bid for AstraZeneca Regardless of Valuation of its Future Financials: The Destiny of Drug R&D Pipelines – The Case of AstraZeneca

Reporter: Aviva Lev-Ari, PhD, RN

UPDATED on 9/3/2014

‘Back to normal’ for AstraZeneca CEO, despite Pfizer rumors

BARCELONA Tue Sep 2, 2014 8:59am EDT

 

(Reuters) – Though speculation is rife of a new Pfizer bid, AstraZeneca’s chief executive is not holed up with advisers in London or New York. Instead, he has spent the last three days immersed in heart science in Barcelona.

“The only thing I can tell you is I am here – and imagine where I would be if something was happening!” Pascal Soriot told Reuters on the sidelines of the European Society of Cardiology congress, the world’s largest heart meeting.

Strict British takeover rules limit what Soriot and other players can say about Pfizer’s abortive attempt to buy AstraZeneca and the possibility of a resumption of talks.

But chatter among investors that Pfizer will come back has boosted shares in Britain’s second biggest drugmaker more than 10 percent since the middle of last month, and the ending of the first of a two-stage cooling-off period on Aug. 26.

While Pfizer cannot launch another public bid until late November, AstraZeneca can now invite it back for talks and Pfizer also has one shot at making a private approach.

Although the saga may not be over, Soriot remains adamant AstraZeneca has a strong independent future.

“We are making good progress with the pipeline and everything so far – touch wood – is going in the right direction,” he said. “We’re back to normal.”

Soriot has spent his time in Barcelona focused on the heart drug Brilinta, which AstraZeneca flagged as worth a potential $3.5 billion-a-year in a strategy update that formed a central plank of its defense against Pfizer.

Sales of the drug have disappointed investors so far, totaling only $216 million in the first half of 2014, but they should pick up following the closure of a U.S. probe into a big clinical trial that had worried some doctors, Soriot said.

“It’s very clear that Brilinta will, actually, in the end make it,” Soriot said, adding he had been encouraged by feedback from key opinion leaders in Barcelona, as well a new study showing the drug was safe to use in ambulances.

SOURCE

http://www.reuters.com/article/2014/09/02/us-astrazeneca-pfizer-exclusive-idUSKBN0GX1C720140902

UPDATED on 5/27/2014

Pfizer Abandons Bid for AstraZeneca

 

Pfizer’s final offer valued AstraZeneca at nearly 70 billion pounds; AstraZeneca demanded an offer of more than £74 billion. (Photo: Phil Noble/Reuters)

Updated, 9:36 p.m. | On the final day for Pfizer to decide whether to abandon the plan, it said it did not intend to make an offer for AstraZeneca. Last week, the British company rejected what Pfizer had called its final offer. The cash-and-stock bid, which valued AstraZeneca at about $119 billion, would have created the world’s largest drug company.

Pfizer had indicated that it would not pursue a hostile bid, which would have allowed AstraZeneca’s shareholders to vote on the deal without the approval of AstraZeneca’s board. Under British takeover rules, Pfizer is not permitted to make another offer for AstraZeneca for six months. If AstraZeneca’s board were to agree to talks, the earliest Pfizer could offer a higher price would be in three months.

“We continue to believe that our final proposal was compelling and represented full value for AstraZeneca based on the information that was available to us,” Ian C. Read, Pfizer’s chairman and chief, said in a statement. “As we said from the start, the pursuit of this transaction was a potential enhancement to our existing strategy.”

SOURCE

 

http://dealbook.nytimes.com/2014/05/26/pfizers-says-its-bid-for-astrazeneca-is-dead/?_php=true&_type=blogs&emc=edit_dlbkam_20140527&nl=business&nlid=40094405&_r=0

The Destiny of Drug R&D Pipelines: The Case of AstraZeneca – Valuation of its Future Financials 

Reporter: Aviva Lev-Ari, PhD, RN

May 21 (Reuters) – AstraZeneca’s shareholders remain split over the UK pharmaceuticals giant’s decision to reject a $118 billion offer from U.S. rival Pfizer, with AXA coming out against it while Threadneedle supported it.

“It is the view of AXA IM UK that the board of AstraZeneca should not prevent an offer from Pfizer of 55 pounds ($92.67) per share from being put to the shareholders of the company,” Jim Stride, head of UK equities at AXA Investment Managers (AXAF.PA), said on Wednesday.

AXA is the third biggest shareholder in AstraZeneca, with a 4.51 percent stake.

“Many shareholders – but not necessarily all – will find this an attractive offer. Accordingly we believe that the board was arguably wrong and acted too hastily to dismiss the latest proposal from Pfizer,” Stride added.

Meanwhile Threadneedle, which is the fifteenth biggest shareholder with a 1.39 percent stake, said it supported AstraZeneca’s decision to reject Pfizer’s proposal.

“As long-term investors in AstraZeneca, we continue to support the board’s stance on the Pfizer offer. We feel the full implications of the proposed acquisition have not been sufficiently understood and addressed by Pfizer,” a spokeswoman for Threadneedle said.

“The company has made notable progress under (Chief Executive) Pascal Soriot and is a strong, stand-alone UK business with a good product pipeline.” ($1 = 0.5935 British Pounds) (Reporting By Jemima Kelly; Editing by Chris Vellacott)

 

SOURCE 

http://www.reuters.com/article/2014/05/21/idUSL6N0O73S520140521

 

5/19 3:23AM – The Chairman of AstraZeneca REJECTS the FINAL Offer of Pfizer, explaining that the progress in its pipeline of cancer immunotherapy, the NEW SCIENCE central to its defense, will bring good value to shareholders.

VIEW VIDEO

http://www.reuters.com/video/2014/05/19/so-why-did-astrazeneca-reject-pfizer?videoId=312996516

On Sunday, Pfizer made what it called its “final” $119 billion offer for AstraZeneca, which is based in London. Pfizer also stated that while it wanted a deal, it was only making a soft “nonbinding” offer at this time.

Pfizer said it would not start a full-fledged hostile offer for AstraZeneca. A hostile bid would have involved offering terms that AstraZeneca’s shareholders could accept without the approval of AstraZeneca’s board. Given the price Pfizer offered, such a maneuver had a real chance of success.

SOURCE

http://dealbook.nytimes.com/2014/05/19/the-curious-incident-of-pfizers-final-offer-for-astrazeneca/

Schroders, Fidelity reveal investor split as Astra rejects Pfizer

LONDON Tue May 20, 2014 3:50pm EDT

(Reuters) – Some leading AstraZeneca Plc shareholders were at odds over whether the British drugmaker made the right decision in rejecting Pfizer Inc’s final $118 billion bid to buy the company.

Schroder Investment Management Ltd (SDR.L), AstraZeneca’s (AZN.L) 12th-biggest shareholder, urged the drugmaker on Tuesday to restart takeover talks with Pfizer (PFE.N) while Fidelity Worldwide Investment (UK) Ltd, holder of the 18th largest stake in Astra, backed the British company’s stance.

The division highlighted a split among investors following the collapse of a potential transaction, leaving many shareholders frustrated at missing out on a big windfall.

Schroders said it was disappointed with “the quick rejection by the AstraZeneca board” of an improved 55 pounds-a-share offer and the decision by Pfizer to “draw a premature end to these negotiations by calling their latest proposal final.”

“Given the increase in the offer we would encourage the AstraZeneca management to recommence their engagement with Pfizer, and subsequently their shareholders,” the fund manager, which owns 2 percent of AstraZeneca, said.

SOURCE

http://www.reuters.com/article/2014/05/20/us-astrazeneca-pfizer-shareholders-idUSBREA4J09520140520

Pfizer failed in its takeover bid for AstraZeneca because of overconfidence

AstraZeneca’s directors played their hand well, while Pfizer’s bid was largely driven by cost savings and tax minimisation

, financial editor

SOURCE

http://www.theguardian.com/business/2014/may/19/pfizer-failed-takeover-bid-astrazeneca

AstraZeneca Snubs Pfizer Once More

Updated, 10:36 p.m. | LONDON — AstraZeneca has to hope it can deliver on its vaunted drug pipeline.

On Monday, AstraZeneca, the Anglo-Swedish drug maker, rejected Pfizer’s latest — and what it described as its “final” — bid to buy AstraZeneca, which would create the world’s largest pharmaceutical company.

Barring a last-minute change of heart by AstraZeneca’s board or another sweetened bid by Pfizer later this week, the likelihood of a potential deal looks bleak. Under British takeover rules, Pfizer has until May 26 — a holiday this year in both Britain and the United States — to decide whether to walk away.

The latest offer, made Sunday evening, was worth about $119 billion. On Monday, AstraZeneca responded that the increased bid “undervalues the company and its attractive prospects.”

In making its final offer on Sunday, Pfizer said that it did not believe that AstraZeneca’s board was prepared to recommend a deal “at a reasonable price” and it encouraged AstraZeneca’s shareholders to urge the company to engage in “meaningful dialogue” about a potential combination.

Pfizer, which began its pursuit of a merger last year, has said it will not make a hostile bid.

The news sent AstraZeneca’s shares down 11.1 percent, to 42.875 pounds, in trading on Monday in London. The company’s stock had been trading higher in recent weeks as shareholders anticipated an increased offer from Pfizer.

A Pfizer spokesman said on Monday that the company was evaluating its options after the latest rejection.

Leif Johansson, the AstraZeneca chairman, said in a statement that Pfizer’s pursuit all along “appears to have been fundamentally driven by the corporate financial benefits to its shareholders of cost savings and tax minimization.” He was referring to Pfizer’s plan to reincorporate in Britain through the transaction to substantially reduce its United States tax bill.

“From our first meeting in January to our latest discussion yesterday, and in the numerous phone calls in between, Pfizer has failed to make a compelling strategic, business or value case,” Mr. Johansson said. “The board is firm in its conviction as to the appropriate terms to recommend to shareholders.”

AstraZeneca has raised a number of concerns about the potential tie-up, but the main sticking point appears to be what constitutes an appropriate price for AstraZeneca and its stable of drugs in development.

The latest proposal would have given AstraZeneca shareholders 1.747 shares of the combined company, and £24.76 in cash for each of their shares. The offer valued each share of AstraZeneca at about £55, or about $92.50.

Pfizer’s bid would have combined the makers of Crestor and Viagra. (Photo: Christopher Furlong/Getty Images)

Analysts had predicted that AstraZeneca’s board would be willing to engage in meaningful discussions about a merger at that price. Yet AstraZeneca said on Monday that it was looking, at a minimum, for a price closer to £58 a share.

Sunday’s offer represented a 45 percent premium over AstraZeneca’s share price before news of Pfizer’s interest became public in April and came after a previous offer by Pfizer late on Friday.

Is AstraZeneca “realistic in what it believes ‘fair value’ is?” Timothy Anderson, a pharmaceutical analyst at Sanford C. Bernstein, said in a research note Monday. “Projecting the worth of new drug pipelines is notoriously difficult, and drug companies and financial analysts alike are often wrong to the tune of billions of dollars, especially when going out five to 10 years. Drug development is just not that predictable.”

AstraZeneca said this month that it expected to achieve annual revenue of $45 billion by 2023 as an independent company.

The company, which has an attractive portfolio of cancer drugs, has repeatedly trumpeted the strength of its drugs in development, saying it is projecting peak annual sales potential of about $23 billion for those drugs by the end of 2023.

The company’s pipeline includes potential treatments for cancer, cardiovascular disease and asthma. One promising area is its immuno-oncology drugs, which use the body’s immune system to attack tumors. MEDI4736, a cancer treatment in development, is one drug that the company is pointing to as a reason to be optimistic about its pipeline. The company expects the drug could earn $6.5 billion in peak annual sales.

Pfizer, the maker of best-selling drugs like Lipitor and Viagra, has raised questions about AstraZeneca’s prospects as a stand-alone concern and vowed to keep jobs in Britain in a bid to persuade British politicians to support the transaction.

If completed, the deal would be one of the largest acquisitions in the pharmaceutical industry, surpassing Pfizer’s takeover of Warner-Lambert 14 years ago, which was valued at $90 billion at the time. Adjusted for inflation, however, that takeover would now be valued at about $124 billion.

Leif Johansson, the chairman of AstraZeneca, said the board was firm in declining Pfizer’s bid. (Photo: Bob Strong/Reuters)

Savvas Neophytou, an analyst at Panmure Gordon & Company in London, said a majority of AstraZeneca’s large shareholders would have preferred that the company engage in a “more robust discussion” with Pfizer and that £55 a share would have been an attractive price for many stockholders.

Now, AstraZeneca must deliver on its lofty outlook, Mr. Neophytou said.

“From AstraZeneca’s point of view, the challenge is not the next six months,” Mr. Neophytou said. “The challenge is what happens in the medium to long term, the next three to five years.”

Pfizer’s pursuit of AstraZeneca has been a contentious one, with both companies seemingly talking past each other and AstraZeneca’s board showing little desire to engage Pfizer.

On Friday, Pfizer approached AstraZeneca with another increased offer of cash and stock worth about £53.50 a share. AstraZeneca’s board again felt the offer was too low.

On a conference call between the companies on Sunday, Mr. Johansson said, even if other crucial aspects of the deal were satisfactory, the board would be prepared to recommend only an offer price that was more than 10 percent above the Friday bid, or about £58.85, according to AstraZeneca.

Pfizer reiterated that its Friday proposal was final and would not be amended, according to AstraZeneca, but then announced an increased — and again final — proposal on Sunday, without previous notice.

The potential combination has also raised concerns among British and United States lawmakers.

In Britain, the concern has centered on whether Pfizer would eliminate jobs after a merger and hurt Britain’s standing in life sciences research. AstraZeneca employs 51,500 people worldwide, including about 6,700 in Britain.

Pfizer had committed to keep at least 20 percent of its research and development work force in Britain after a deal and to complete a research-and-development center being built by AstraZeneca in Cambridge, England.

But British politicians have pointed to Pfizer’s decision in 2011 to close a facility in Sandwich, England, as a reason to be concerned about its commitment to keeping jobs in Britain. The closing led to the loss of more than 1,500 jobs.

United States politicians also have raised questions about the company’s plans to reincorporate in Britain. The strategy, known as an inversion, is increasingly popular among United States companies, especially in the pharmaceutical industry.

The opposition Labour Party, which had accused Prime Minister David Cameron of cheerleading for the Pfizer bid, welcomed the decision by AstraZeneca.

“The AstraZeneca board has remained resolute and sought to act in the best long-term interests of the company and its vital work in developing new lifesaving drugs,” said Chuka Umunna, who speaks for the Labour Party on business issues.

“Pfizer has said today that it will not seek to launch a hostile bid and must not renege on this promise,” Mr. Umunna said.

Stephen Castle contributed reporting.

A version of this article appears in print on 05/20/2014, on page B1 of the New York edition with the headline: AstraZeneca Snubs Pfizer Once More

SOURCE

http://dealbook.nytimes.com/2014/05/19/astrazeneca-rejects-final-offer-by-pfizer/?_php=true&_type=blogs&_r=0

  1. AstraZeneca, Innovative Medicines and Early Development, Alderley Park, Macclesfield SK10 4TG, UK.

    • David Cook,
    • Dearg Brown,
    • Ruth March,
    • Paul Morgan,
    • Gemma Satterthwaite &
    • Menelas N. Pangalos
  2. AstraZeneca, Innovative Medicines & Early Development, 141 Portland Street, 10th Floor Cambridge, Massachusetts 02139, USA.

    • Robert Alexander

Competing interests statement

All authors are employees and shareholders of AstraZeneca.

Corresponding author

Correspondence to:

  • David Cook is a senior scientist and safety expert in AstraZeneca’s clinical Patient Safety group.

  • Dearg Brown was a Director of Chemistry in AstraZeneca’s oncology group and is now a project manager at Manchester University, UK.

  • Robert Alexander is a Clinical Vice President in AstraZeneca’s neuroscience group.

  • Ruth March is the Vice President of the Personalized Healthcare and Biomarkers group at AstraZeneca.

  • Paul Morgan is Head of Translation Safety in Drug Safety and Metabolism at AstraZeneca.

  • Gemma Satterthwaite is a Director within AstraZeneca’s global product and portfolio strategy group.

  • Menelas N. Pangalos is Executive Vice President and Global Head of Innovative Medicines and Early Development at AstraZeneca.

     

Nature Reviews Drug Discovery (2014) doi:10.1038/nrd4309
Published online 16 May 2014
The portfolio review

In 2011, a comprehensive review was undertaken of 142 drug discovery and development projects at AstraZeneca. The review covered projects from all therapeutic areas that had been active during the 2005–2010 period, from the phases following the completion of preclinical research through to the end of clinical testing in Phase II. The key aims of the review were to understand the major reasons for project closure and to identify the features of projects that correlated with successful outcomes.

We did not expand the review to look at Phase III for two reasons. First, successful transition through proof of concept (Phase II) remains the area where the industry overall has the highest rate of attrition and which must be improved. Second, the number of projects in Phase III for a single company is too small to be able to draw valid conclusions, and this number becomes even smaller if looking at successful transitions to launch of a medicine.

It should be noted that this is a single assessment from a single company, based on a limited number of projects, and over a limited timeframe. Nevertheless, we think that insights from this work could help to guide future teams and improve R&D productivity.

Methods. The aim of the review was to identify key ‘lessons learned’ in projects that could be used to improve the R&D productivity of the company. As such, the review was performed by a cross-functional group of scientists and clinicians drawn from the project team community, and conducted in a peer-to-peer manner. To be as objective as possible, structured questionnaires were used when interviewing teams and, where possible, contemporaneous documents were analysed to provide supporting evidence for assessments. In addition, to further avoid any potential bias, senior leaders who had been associated with the governance decisions over the assessment period were not involved in the review.

For this analysis, the drug development process was divided into four distinct phases: preclinical, Phase I, Phase IIa and Phase IIb. The preclinical phase was defined as the phase from the first good laboratory practice (GLP) toxicology dose of a candidate drug through to an investigational new drug (IND) application or first clinical trial application (CTA) before first-in-human testing. Phase I was defined as the phase that included the first-in-human trials within a small trial population (typically <50 patients) and included safety, tolerability and dose-ranging studies. These studies were often conducted in healthy volunteers, but in some indications (for example, oncology) they could include patients. Phase II trials were defined as trials that were aimed at evaluating the candidate drug’s efficacy in a patient population, leading up to clinical proof of concept. Within our analysis, we subdivided Phase II into Phase IIa and Phase IIb. Phase IIa studies were generally smaller (typically <200 patients) and designed to mainly address early evidence of drug activity, whereas Phase IIb studies included larger numbers of patients (typically <400 patients) and were designed to demonstrate clinical proof of concept and an understanding of dose response.

For each of these phases, projects were classified as being ‘active’ (still in that phase), ‘closed’ (shut during that phase) or ‘successful’ (transitioned from this phase to the next one). Every project was analysed separately in each phase of its development path; so, for example, a project that had reached Phase III trials was analysed four times across the entire development process. Data were collected for each project, for each of the development phases that it had completed, using comprehensive surveys and questionnaires with over 200 questions covering all aspects of the project (for example, the scientific rationale, target validation and physicochemical properties of the candidate drug). Questionnaires were adapted so that they were specific to each phase of the review to allow for the retrospective understanding of the data that were available for a project at that stage, and to analyse how the project knowledge and data developed as the project passed through different phases. Written surveys were supplemented with in-depth peer-to-peer interviews with project teams. Responses to the questionnaires and interviews were subjected to rigorous peer review by a team of experienced scientists and clinicians to ensure consistent evaluation across all projects. In-depth ‘root-cause’ analysis was used to reach conclusions as to why projects had failed. Analyses that were based on answers to specific questions in the questionnaires were only performed on projects that had provided a complete set of answers to the relevant questions.

Results. Overall, we gathered data from more than 80% of the 142 AstraZeneca projects within the scope of the review, and for 95% of projects in clinical phases. Of the projects analysed, 94 closed during the period assessed; 33 closed before clinical testing and a further 61 closed during clinical testing. The remaining projects were still active at the time of this review.

We compared the success rates for our projects to pharmaceutical industry benchmarks, obtained from the Pharmaceutical Benchmarking Forum (Fig. 1a). Our success rate in the preclinical phase (defined as the percentage of projects completing this phase and moving to the next phase of development) was comparable with industry benchmarks (66% versus 63%; see the KMR Group website for further information on the Pharmaceutical Benchmarking Forum).

Our data suggested that we had a higher success rate in completing Phase I (59% versus 48%) but a markedly lower success rate in completing Phase II (15% versus 29%), compared to industry benchmarks. In addition, Phase III success rates were lower than the industry overall (60% versus 67%), although the number of projects in this phase was very low. Therefore, AstraZeneca was allowing more projects to enter later-stage development, only to have them subsequently fail.

Overall, AstraZeneca’s success in bringing candidate drugs to market during the 2005–2010 period was significantly lower than the industry median (2% versus 6%).
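
For readers who want to reproduce this kind of arithmetic, the sketch below tallies per-phase success rates from project records and compounds them into an approximate candidate-to-launch probability; the records shown are hypothetical, not the AstraZeneca dataset, and the treatment of still-active projects is an assumption.

```python
# Illustrative sketch: compute per-phase success rates from project records
# and compound them into an approximate candidate-to-launch probability.
# The records below are hypothetical, not AstraZeneca's actual portfolio data.
PHASES = ["preclinical", "phase1", "phase2", "phase3"]

# Each record: {phase: "successful" | "closed" | "active"} for phases reached.
projects = [
    {"preclinical": "successful", "phase1": "successful", "phase2": "closed"},
    {"preclinical": "closed"},
    {"preclinical": "successful", "phase1": "active"},
    # ... one entry per project in the review
]

def phase_success_rate(records, phase):
    # Success rate = successful transitions / (successful + closed);
    # projects still active in the phase are excluded from the denominator
    # (an assumption about how the benchmark is defined).
    successful = sum(1 for r in records if r.get(phase) == "successful")
    closed = sum(1 for r in records if r.get(phase) == "closed")
    return successful / (successful + closed) if (successful + closed) else None

rates = {p: phase_success_rate(projects, p) for p in PHASES}

# Compounded probability that a candidate entering preclinical reaches launch,
# assuming independence between phases.
overall = 1.0
for p in PHASES:
    if rates[p] is not None:
        overall *= rates[p]
print(rates, overall)
```

Compounding the per-phase rates in this way shows how a 15% Phase II success rate, even alongside respectable preclinical and Phase I figures, pulls the overall candidate-to-market probability down into the low single digits.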

 SOURCE

 

At AstraZeneca, Occasion for Introspection

By Aaron Krol

May 19, 2014 | On Friday, a frank assessment of AstraZeneca’s R&D practices appeared in Nature Reviews Drug Discovery. The review, written by members of the pharmaceutical company’s Innovative Medicines and Early Development division, examined projects in preclinical, Phase I and Phase II trials from 2005-2010, and reached some stark conclusions about the corporate culture’s influence on which compounds advanced through successive stages of testing. At the same time, the review offers new reasons for optimism, and may be seen to bolster AstraZeneca in its ongoing battle to resist acquisition by Pfizer.

While the top-line figures are glaring – in particular, the observation that AstraZeneca was letting an exceptional number of compounds through Phase I trials, compared to the industry as a whole, only to see disproportionate failures in Phase II – the authors’ substantive look at the causes of failure is enlightening. A lack of confidence in the underlying biology of therapies seems to have been a frequent reason for failure, even in later stages of a project. In 40% of cases where lack of efficacy led to a project’s shutdown, teams reported that they “lacked data demonstrating a clear linkage of the target to the disease or access to a well-validated animal model of the disease,” and as late as Phase IIb trials, over 40% of teams reported “low confidence” in their molecular targets. In these cases, weak evidence of efficacy in preclinical models may have been enough to move projects forward, despite no strong hypothesis linking a lead drug to the disease pathology.

The authors also noted that a more well-rounded view of the biology was a powerful predictor of success. At the time of the review, only 27% of Phase II trials had failed where teams reported that their drug target had a known genetic link to the disease in humans, and only 18% of Phase IIb trials had failed if teams reported that they began testing with a biomarker for efficacy in mind. “Overall,” the authors write, “these data highlight that, throughout every phase of early R&D, it is crucial for scientists and clinicians to gain an understanding of, and confidence in, the disease biology, the relationship of the target to the disease indication, and the proposed mechanism of action of a potential drug.”

While the problems identified in this review will look familiar to many working in the pharmaceutical industry, they seem to have grown particularly entrenched at AstraZeneca. The authors note that projects were frequently advanced because teams were evaluated by volume of compounds in active testing, creating a perverse incentive to promote any candidate drug that could meet minimal standards of evidence. One indicator of this culture is the lack of diversity in back-up molecules, meant as potential replacements for a lead candidate in certain key projects. The authors found that back-up molecules were often only minor tweaks of the lead candidate, advanced with no reason to believe they would differ in activity or outperform the lead drug on any metric. “In one extreme case,” they write, “we identified a project with seven back-up molecules in the family… [which] all failed owing to the same preclinical toxicology finding.”

Although the review was completed in 2011, it has been published at a fortuitous time for AstraZeneca, which has fought off repeated takeover attempts by Pfizer, including an offer of $117 billion over the weekend. There has been frequent speculation that AstraZeneca’s shareholders may rebel against the wishes of the board and accept a high bid by Pfizer, which is offering a high premium over the market value of AstraZeneca shares. While AstraZeneca’s board members insist that Pfizer is still undervaluing their drug portfolio, skeptics note that AstraZeneca has struggled to bring new drugs to market in recent years.

This report offers reasons to believe that the company’s R&D practices may have changed since 2011, with a new emphasis on biological understanding over volume. The authors note that the company’s early development portfolio is significantly smaller than it was three years ago, and that teams are now expected to articulate their hypotheses relating drugs to disease targets, and more encouraged to seek out relevant biomarkers, companion diagnostics, and patient subpopulations. While acknowledging that “it is too early to see tangible benefits to the AstraZeneca pipeline,” they also observe that both preclinical and Phase II success rates have risen since the new R&D philosophy was implemented, while inflated Phase I success rates have fallen.

While the timing may be coincidence, for AstraZeneca boosters looking for evidence to point to when arguing that Pfizer’s takeover bids undervalue the flagging company, this review will be a welcome addition to the public record. For everyone else, it is an instructive glimpse into misguided incentives at the root level of the drug industry.

SOURCE

http://www.bio-itworld.com/2014/5/19/astrazeneca-occasion-introspection.html

 

Read Full Post »

Summary of Translational Medicine – e-Series A: Cardiovascular Diseases, Volume Four – Part 1


Author and Curator: Larry H Bernstein, MD, FCAP

and

Curator: Aviva Lev-Ari, PhD, RN

Article ID #135: Summary of Translational Medicine – e-Series A: Cardiovascular Diseases, Volume Four – Part 1. Published on 4/28/2014

WordCloud Image Produced by Adam Tubman

 

Part 1 of Volume 4 in the e-series A: Cardiovascular Diseases and Translational Medicine provides a foundation for grasping a rapidly developing scientific endeavor that is transcending laboratory hypothesis testing and providing guidelines to:

  • Target genomes and multiple nucleotide sequences involved in either coding or in regulation that might have an impact on complex diseases, not necessarily genetic in nature.
  • Target signaling pathways that are demonstrably maladjusted, activated or suppressed in many common and complex diseases, or in their progression.
  • Enable a reduction in failure due to toxicities in the later stages of clinical drug trials as a result of this science-based understanding.
  • Enable a reduction in complications from the improvement of mechanical devices that have already had an impact on the practice of interventional procedures in cardiology, cardiac surgery, and radiological imaging, as well as improving laboratory diagnostics at the molecular level.
  • Enable the discovery of new drugs in the continuing emergence of drug resistance.
  • Enable the construction of critical pathways and better guidelines for patient management based on population outcomes data, that will be critically dependent on computational methods and large data-bases.

What has been presented can be essentially viewed in the following Table:

 

Summary Table for TM – Part 1

 

 

 

There are some developments that deserve additional development:

1. The importance of mitochondrial function in cellular work (combustion) is understood, and impairments of that function are identified in diseases of muscle, cardiac contraction, nerve conduction, ion transport, water balance, and the cytoskeleton – beyond the disordered metabolism in cancer. A more detailed explanation of the energetics elucidated from the electron transport chain might also be in order.

2. The processes that are enabling a fuller application of technology to a host of problems in the environment we live in and in disease modification are growing rapidly, and will change the face of medicine and its allied health sciences.

 

Electron Transport and Bioenergetics

Deferred for metabolomics topic

Synthetic Biology

Introduction to Synthetic Biology and Metabolic Engineering

Kristala L. J. Prather: Part 1 (iBiology > iBioSeminars > Biophysics & Chemical Biology)

http://www.ibiology.org Lecturers generously donate their time to prepare these lectures. The project is funded by NSF and NIGMS, and is supported by the ASCB and HHMI.
Prather 1: Synthetic Biology and Metabolic Engineering (2/6/14)

Lecture Overview: In the first part of her lecture, Dr. Prather explains that synthetic biology involves applying engineering principles to biological systems to build “biological machines”. The key material in building these machines is synthetic DNA. Synthetic DNA can be added in different combinations to biological hosts, such as bacteria, turning them into chemical factories that can produce small molecules of choice. In Part 2, Prather describes how her lab used design principles to engineer E. coli that produce glucaric acid from glucose. Glucaric acid is not naturally produced in bacteria, so Prather and her colleagues “bioprospected” enzymes from other organisms and expressed them in E. coli to build the needed enzymatic pathway. Prather walks us through the many steps of optimizing the timing, localization and levels of enzyme expression to produce the greatest yield.

Speaker Bio: Kristala Jones Prather received her S.B. degree from the Massachusetts Institute of Technology and her PhD at the University of California, Berkeley, both in chemical engineering. Upon graduation, Prather joined the Merck Research Labs for 4 years before returning to academia. Prather is now an Associate Professor of Chemical Engineering at MIT and an investigator with the multi-university Synthetic Biology Engineering Research Center (SynBERC). Her lab designs and constructs novel synthetic pathways in microorganisms, converting them into tiny factories for the production of small molecules. Dr. Prather has received numerous awards both for her innovative research and for excellence in teaching.

VIEW VIDEOS

https://www.youtube.com/watch?feature=player_embedded&v=ndThuqVumAk#t=0

https://www.youtube.com/watch?feature=player_embedded&v=ndThuqVumAk#t=12

https://www.youtube.com/watch?feature=player_embedded&v=ndThuqVumAk#t=74

https://www.youtube.com/watch?feature=player_embedded&v=ndThuqVumAk#t=129

https://www.youtube.com/watch?feature=player_embedded&v=ndThuqVumAk#t=168

https://www.youtube.com/watch?feature=player_embedded&v=ndThuqVumAk

 

II. Regulatory Effects of Mammalian microRNAs

Calcium Cycling in Synthetic and Contractile Phasic or Tonic Vascular Smooth Muscle Cells

in INTECH
Current Basic and Pathological Approaches to the Function of Muscle Cells and Tissues – From Molecules to Humans
Larissa Lipskaia, Isabelle Limon, Regis Bobe and Roger Hajjar
Additional information is available at the end of the chapter
http://dx.doi.org/10.5772/48240
1. Introduction
Calcium ions (Ca2+) are present in low concentrations in the cytosol (~100 nM) and in high concentrations (in the mM range) in both the extracellular medium and intracellular stores (mainly the sarco/endoplasmic reticulum, SR). This differential allows the calcium ion to serve as a messenger that carries information as diverse as contraction, metabolism, apoptosis, proliferation and/or hypertrophic growth. The mechanisms responsible for generating a Ca2+ signal differ greatly from one cell type to another.

In the different types of vascular smooth muscle cells (VSMC), enormous variations exist with regard to the mechanisms responsible for generating the Ca2+ signal. In each VSMC phenotype (synthetic/proliferating or contractile [1], tonic or phasic), the Ca2+ signaling system is adapted to its particular function and reflects specific patterns of expression and regulation of Ca2+ handling proteins.

For instance, in contractile VSMCs, the initiation of contractile events is driven by membrane depolarization, and the principal entry point for extracellular Ca2+ is the voltage-operated L-type calcium channel (LTCC). In contrast, in synthetic/proliferating VSMCs, the principal way in for extracellular Ca2+ is the store-operated calcium (SOC) channel.

Whatever the cell type, the calcium signal consists of limited elevations of cytosolic free calcium ions in time and space. The calcium pump, sarco/endoplasmic reticulum Ca2+ ATPase (SERCA), has a critical role in determining the frequency of SR Ca2+ release: by pumping Ca2+ back into the SR, it sets the SR load and the sensitivity of the SR calcium channels, the Ryanodine Receptor (RyR) and the Inositol tri-Phosphate Receptor (IP3R).
Synthetic VSMCs have a fibroblast appearance, proliferate readily, and synthesize increased levels of various extracellular matrix components, particularly fibronectin, collagen types I and III, and tropoelastin [1].
Contractile VSMCs have a muscle-like or spindle-shaped appearance and well-developed contractile apparatus resulting from the expression and intracellular accumulation of thick and thin muscle filaments [1].

Figure 1. Schematic representation of Calcium Cycling in Contractile and Proliferating VSMCs.

Left panel: schematic representation of calcium cycling in quiescent/contractile VSMCs. The contractile response is initiated by extracellular Ca2+ influx due to activation of receptor-operated Ca2+ channels (through a phosphoinositol-coupled receptor) or to activation of L-type calcium channels (through an increase in luminal pressure). A small increase of cytosolic Ca2+ due to IP3 binding to IP3R (a “puff”) or to RyR activation by LTCC- or ROC-dependent Ca2+ influx leads to large SR Ca2+ release through IP3R or RyR clusters (“Ca2+-induced Ca2+ release”). Cytosolic Ca2+ is then returned to the SR by the SR calcium pumps (both SERCA2a and SERCA2b are expressed in quiescent VSMCs), maintaining a high concentration of SR luminal Ca2+ and setting the sensitivity of RyR or IP3R for the next spike. Contraction of VSMCs occurs during the oscillatory Ca2+ transient.

Middle panel: schematic representation of the atherosclerotic vessel wall. Contractile VSMCs are located in the media layer; synthetic VSMCs are located in the sub-endothelial intima.

Right panel: schematic representation of calcium cycling in synthetic/proliferating VSMCs. Agonist binding to a phosphoinositol-coupled receptor leads to activation of IP3R, resulting in a large increase in cytosolic Ca2+. Because only SERCA2b, which has low turnover and low affinity for Ca2+, is expressed, SR Ca2+ depletion leads to translocation of the SR Ca2+ sensor STIM1 towards the PM, resulting in extracellular Ca2+ influx through opening of the store-operated channel (CRAC). The resulting steady-state Ca2+ transient is critical for activation of proliferation-related transcription factors (e.g., NFAT).

Abbreviations: PLC – phospholipase C; PM – plasma membrane; PP2B – Ca2+/calmodulin-activated protein phosphatase 2B (calcineurin); ROC – receptor-operated channel; IP3 – inositol-1,4,5-trisphosphate; IP3R – inositol-1,4,5-trisphosphate receptor; RyR – ryanodine receptor; NFAT – nuclear factor of activated T-lymphocytes; VSMC – vascular smooth muscle cells; SERCA – sarco(endo)plasmic reticulum Ca2+ ATPase; SR – sarcoplasmic reticulum.

 

Time for New DNA Synthesis and Sequencing Cost Curves

By Rob Carlson

I’ll start with the productivity plot, as this one isn’t new. For a discussion of the substantial performance increase in sequencing compared to Moore’s Law, as well as the difficulty of finding this data, please see this post. If nothing else, keep two features of the plot in mind: 1) the consistency of the pace of Moore’s Law and 2) the inconsistency and pace of sequencing productivity. Illumina appears to be the primary driver, and beneficiary, of improvements in productivity at the moment, especially if you are looking at share prices. It looks like the recently announced NextSeq and Hiseq instruments will provide substantially higher productivities (hand waving, I would say the next datum will come in another order of magnitude higher), but I think I need a bit more data before officially putting another point on the plot.

 

[Figure: cost of oligo and gene synthesis]

Illumina’s instruments are now responsible for such a high percentage of sequencing output that the company is effectively setting prices for the entire industry. Illumina is being pushed by competition to increase performance, but this does not necessarily translate into lower prices. It doesn’t behoove Illumina to drop prices at this point, and we won’t see any substantial decrease until a serious competitor shows up and starts threatening Illumina’s market share. The absence of real competition is the primary reason sequencing prices have flattened out over the last couple of data points.

Note that the oligo prices above are for column-based synthesis, and that oligos synthesized on arrays are much less expensive. However, array synthesis comes with the usual caveat that the quality is generally lower, unless you are getting your DNA from Agilent, which probably means you are getting your dsDNA from Gen9.

Note also that the distinction between the price of oligos and the price of double-stranded sDNA is becoming less useful. Whether you are ordering from Life/Thermo or from your local academic facility, the cost of producing oligos is now, in most cases, independent of their length. That’s because the cost of capital (including rent, insurance, labor, etc) is now more significant than the cost of goods. Consequently, the price reflects the cost of capital rather than the cost of goods. Moreover, the cost of the columns, reagents, and shipping tubes is certainly more than the cost of the atoms in the sDNA you are ostensibly paying for. Once you get into longer oligos (substantially larger than 50-mers) this relationship breaks down and the sDNA is more expensive. But, at this point in time, most people aren’t going to use longer oligos to assemble genes unless they have a tricky job that doesn’t work using short oligos.
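
A back-of-the-envelope sketch of that pricing structure is below; the dollar figures are made-up assumptions used only to show why, when fixed costs dominate, price is nearly independent of oligo length.

```python
# Illustrative sketch of why short-oligo prices look length-independent when
# fixed costs (labor, rent, instruments) dominate per-base reagent costs.
# All dollar figures are made-up assumptions for illustration only.
FIXED_COST_PER_OLIGO = 5.00    # assumed cost of capital/labor per order
REAGENT_COST_PER_BASE = 0.01   # assumed cost of goods per base

def price(length_nt, margin=1.5):
    return margin * (FIXED_COST_PER_OLIGO + REAGENT_COST_PER_BASE * length_nt)

for n in (20, 40, 60, 100, 200):
    print(f"{n:>4}-mer: ${price(n):.2f}")
# With these assumptions the 20-mer and 100-mer prices differ by only ~15%,
# because the fixed component swamps the per-base component.
```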

Looking forward, I suspect oligos aren’t going to get much cheaper unless someone sorts out how to either 1) replace the requisite human labor and thereby reduce the cost of capital, or 2) finally replace the phosphoramidite chemistry that the industry relies upon.

IDT’s gBlocks come at prices that are constant across quite substantial ranges in length. Moreover, part of the decrease in price for these products is embedded in the fact that you are buying smaller chunks of DNA that you then must assemble and integrate into your organism of choice.

Someone who has purchased and assembled an absolutely enormous amount of sDNA over the last decade, suggested that if prices fell by another order of magnitude, he could switch completely to outsourced assembly. This is a potentially interesting “tipping point”. However, what this person really needs is sDNA integrated in a particular way into a particular genome operating in a particular host. The integration and testing of the new genome in the host organism is where most of the cost is. Given the wide variety of emerging applications, and the growing array of hosts/chassis, it isn’t clear that any given technology or firm will be able to provide arbitrary synthetic sequences incorporated into arbitrary hosts.


 

Startup to Strengthen Synthetic Biology and Regenerative Medicine Industries with Cutting Edge Cell Products

28 Nov 2013 | PR Web

Dr. Jon Rowley and Dr. Uplaksh Kumar, Co-Founders of RoosterBio, Inc., a newly formed biotech startup located in Frederick, are paving the way for even more innovation in the rapidly growing fields of Synthetic Biology and Regenerative Medicine. Synthetic Biology combines engineering principles with basic science to build biological products, including regenerative medicines and cellular therapies. Regenerative medicine is a broad definition for innovative medical therapies that will enable the body to repair, replace, restore and regenerate damaged or diseased cells, tissues and organs. Regenerative therapies that are in clinical trials today may enable repair of damaged heart muscle following heart attack, replacement of skin for burn victims, restoration of movement after spinal cord injury, regeneration of pancreatic tissue for insulin production in diabetics and provide new treatments for Parkinson’s and Alzheimer’s diseases, to name just a few applications.

While the potential of the field is promising, the pace of development has been slow. One main reason for this is that the living cells required for these therapies are cost-prohibitive and not supplied at volumes that support many research and product development efforts. RoosterBio will manufacture large quantities of standardized primary cells at high quality and low cost, which will quicken the pace of scientific discovery and translation to the clinic. “Our goal is to accelerate the development of products that incorporate living cells by providing abundant, affordable and high quality materials to researchers that are developing and commercializing these regenerative technologies,” says Dr. Rowley.

 

Life at the Speed of Light

http://kcpw.org/?powerpress_pinw=92027-podcast

NHMU Lecture featuring – J. Craig Venter, Ph.D.
Founder, Chairman, and CEO – J. Craig Venter Institute; Co-Founder and CEO, Synthetic Genomics Inc.

J. Craig Venter, Ph.D., is Founder, Chairman, and CEO of the J. Craig Venter Institute (JCVI), a not-for-profit research organization dedicated to human, microbial, plant, synthetic and environmental research. He is also Co-Founder and CEO of Synthetic Genomics Inc. (SGI), a privately-held company dedicated to commercializing genomic-driven solutions to address global needs.

In 1998, Dr. Venter founded Celera Genomics to sequence the human genome using new tools and techniques he and his team developed. This research culminated with the February 2001 publication of the human genome in the journal Science. Dr. Venter and his team at JCVI continue to blaze new trails in genomics. They have sequenced and created a bacterial cell constructed with synthetic DNA, putting humankind at the threshold of a new phase of biological research. Whereas we could previously only read the genetic code (sequencing genomes), we can now write the genetic code for designing new species.

The science of synthetic genomics will have a profound impact on society, including new methods for chemical and energy production, human health and medical advances, clean water, and new food and nutritional products. One of the most prolific scientists of the 21st century for his numerous pioneering advances in genomics,  he  guides us through this emerging field, detailing its origins, current challenges, and the potential positive advances.

His work on synthetic biology truly embodies the theme of “pushing the boundaries of life.”  Essentially, Venter is seeking to “write the software of life” to create microbes designed by humans rather than only through evolution. The potential benefits and risks of this new technology are enormous. It also requires us to examine, both scientifically and philosophically, the question of “What is life?”

J Craig Venter wants to digitize DNA and transmit the signal to teleport organisms

http://pharmaceuticalintelligence.com/2013/11/01/j-craig-venter-wants-to-digitize-dna-and-transmit-the-signal-to-teleport-organisms/

2013 Genomics: The Era Beyond the Sequencing of the Human Genome: Francis Collins, Craig Venter, Eric Lander, et al.

http://pharmaceuticalintelligence.com/2013/02/11/2013-genomics-the-era-beyond-the-sequencing-human-genome-francis-collins-craig-venter-eric-lander-et-al/

Human Longevity Inc (HLI) – $70M in Financing of Venter’s New Integrative Omics and Clinical Bioinformatics

http://pharmaceuticalintelligence.com/2014/03/05/human-longevity-inc-hli-70m-in-financing-of-venters-new-integrative-omics-and-clinical-bioinformatics/

 

 

Where Will the Century of Biology Lead Us?

By Randall Mayes

A technology trend analyst offers an overview of synthetic biology, its potential applications, obstacles to its development, and prospects for public approval.

  • In addition to boosting the economy, synthetic biology projects currently in development could have profound implications for the future of manufacturing, sustainability, and medicine.
  • Before society can fully reap the benefits of synthetic biology, however, the field requires development and faces a series of hurdles in the process. Do researchers have the scientific know-how and technical capabilities to develop the field?

Biology + Engineering = Synthetic Biology

Bioengineers aim to build synthetic biological systems using compatible standardized parts that behave predictably. Bioengineers synthesize DNA parts—oligonucleotides composed of 50–100 base pairs—which make specialized components that ultimately make a biological system. As biology becomes a true engineering discipline, bioengineers will create genomes using mass-produced modular units similar to the microelectronics and computer industries.

Currently, bioengineering projects cost millions of dollars and take years to develop products. For synthetic biology to become a Schumpeterian revolution, smaller companies will need to be able to afford to use bioengineering concepts for industrial applications. This will require standardized and automated processes.

A major challenge to developing synthetic biology is the complexity of biological systems. When bioengineers assemble synthetic parts, they must prevent cross talk between signals in other biological pathways. Until researchers better understand these undesired interactions that nature has already worked out, applications such as gene therapy will have unwanted side effects. Scientists do not fully understand the effects of environmental and developmental interaction on gene expression. Currently, bioengineers must repeatedly use trial and error to create predictable systems.

Similar to physics, synthetic biology requires the ability to model systems and quantify relationships between variables in biological systems at the molecular level.

The second major challenge to ensuring the success of synthetic biology is the development of enabling technologies. With genomes having billions of nucleotides, this requires fast, powerful, and cost-efficient computers. Moore’s law, named for Intel co-founder Gordon Moore, posits that computing power progresses at a predictable rate and that the number of components in integrated circuits doubles each year until its limits are reached. Since Moore’s prediction, computer power has increased at an exponential rate while pricing has declined.

DNA sequencers and synthesizers are necessary to identify genes and make synthetic DNA sequences. Bioengineer Robert Carlson calculated that the capabilities of DNA sequencers and synthesizers have followed a pattern similar to computing. This pattern, referred to as the Carlson Curve, projects that scientists are approaching the ability to sequence a human genome for $1,000, perhaps in 2020. Carlson calculated that the costs of reading and writing new genes and genomes are falling by a factor of two every 18–24 months (see the recent Carlson comment above on the requirements for reading and writing DNA under a variety of limiting conditions).
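
As a worked illustration of that halving rate, the following sketch projects a cost that falls by a factor of two every 18 to 24 months; the starting cost and time horizon are placeholders rather than Carlson’s actual data.

```python
# Illustrative sketch of an exponential cost-decline ("Carlson Curve"-style)
# projection: cost(t) = cost0 * 2 ** (-months / halving_period_months).
# The starting cost and time points are placeholder assumptions.
def projected_cost(cost0, months_elapsed, halving_months=21):
    # halving_months=21 sits in the middle of the 18-24 month range quoted above
    return cost0 * 2 ** (-months_elapsed / halving_months)

start_cost = 10_000.0  # hypothetical cost of some read/write task today
for years in range(0, 11, 2):
    print(f"year {years:>2}: ${projected_cost(start_cost, years * 12):,.2f}")
```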

Startup to Strengthen Synthetic Biology and Regenerative Medicine Industries with Cutting Edge Cell Products

http://pharmaceuticalintelligence.com/2013/11/28/startup-to-strengthen-synthetic-biology-and-regenerative-medicine-industries-with-cutting-edge-cell-products/

Synthetic Biology: On Advanced Genome Interpretation for Gene Variants and Pathways: What is the Genetic Base of Atherosclerosis and Loss of Arterial Elasticity with Aging

http://pharmaceuticalintelligence.com/2013/05/17/synthetic-biology-on-advanced-genome-interpretation-for-gene-variants-and-pathways-what-is-the-genetic-base-of-atherosclerosis-and-loss-of-arterial-elasticity-with-aging/

Synthesizing Synthetic Biology: PLOS Collections

http://pharmaceuticalintelligence.com/2012/08/17/synthesizing-synthetic-biology-plos-collections/

Capturing ten-color ultrasharp images of synthetic DNA structures resembling numerals 0 to 9

http://pharmaceuticalintelligence.com/2014/02/05/capturing-ten-color-ultrasharp-images-of-synthetic-dna-structures-resembling-numerals-0-to-9/

Silencing Cancers with Synthetic siRNAs

http://pharmaceuticalintelligence.com/2013/12/09/silencing-cancers-with-synthetic-sirnas/

Genomics Now—and Beyond the Bubble

Futurists have touted the twenty-first century as the century of biology based primarily on the promise of genomics. Medical researchers aim to use variations within genes as biomarkers for diseases, personalized treatments, and drug responses. Currently, we are experiencing a genomics bubble, but with advances in understanding biological complexity and the development of enabling technologies, synthetic biology is reviving optimism in many fields, particularly medicine.

BY MICHAEL BROOKS    17 APR, 2014     http://www.newstatesman.com/

Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is The Secret Anarchy of Science.

The basic idea is that we take an organism – a bacterium, say – and re-engineer its genome so that it does something different. You might, for instance, make it ingest carbon dioxide from the atmosphere, process it and excrete crude oil.

That project is still under construction, but others, such as using synthesised DNA for data storage, have already been achieved. As evolution has proved, DNA is an extraordinarily stable medium that can preserve information for millions of years. In 2012, the Harvard geneticist George Church proved its potential by taking a book he had written, encoding it in a synthesised strand of DNA, and then making DNA sequencing machines read it back to him.
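The principle can be illustrated with a toy codec in Python; this simplified two-bits-per-base scheme is for illustration only and is not the error-tolerant encoding Church's group actually used:

# Toy text-to-DNA codec: 2 bits per base (00->A, 01->C, 10->G, 11->T).
# This ignores the error correction and addressing a real scheme needs.

B2D = {"00": "A", "01": "C", "10": "G", "11": "T"}
D2B = {v: k for k, v in B2D.items()}

def encode(text):
    bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
    return "".join(B2D[bits[i:i+2]] for i in range(0, len(bits), 2))

def decode(dna):
    bits = "".join(D2B[base] for base in dna)
    data = bytes(int(bits[i:i+8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

dna = encode("synthetic biology")
assert decode(dna) == "synthetic biology"
print(dna)

At two bits per base, even a full-length book is only a few million bases, which is why a single synthesized DNA pool can hold it.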

When we first started achieving such things it was costly and time-consuming and demanded extraordinary resources, such as those available to the millionaire biologist Craig Venter. Venter’s team spent most of the past two decades and tens of millions of dollars creating the first artificial organism, nicknamed “Synthia”. Using computer programs and robots that process the necessary chemicals, the team rebuilt the genome of the bacterium Mycoplasma mycoides from scratch. They also inserted a few watermarks and puzzles into the DNA sequence, partly as an identifying measure for safety’s sake, but mostly as a publicity stunt.

What they didn’t do was redesign the genome to do anything interesting. When the synthetic genome was inserted into an eviscerated bacterial cell, the new organism behaved exactly the same as its natural counterpart. Nevertheless, the fact that Synthia was, as Venter put it at the press conference announcing the research in 2010, “the first self-replicating species we’ve had on the planet whose parent is a computer,” made it a standout achievement.

Today, however, we have entered another era in synthetic biology and Venter faces stiff competition. The Steve Jobs to Venter’s Bill Gates is Jef Boeke, who researches yeast genetics at New York University.

Boeke wanted to redesign the yeast genome so that he could strip out various parts to see what they did. Because it took a private company a year to complete just a small part of the task, at a cost of $50,000, he realised he should go open-source. By teaching an undergraduate course on how to build a genome and teaming up with institutions all over the world, he has assembled a skilled workforce that, tinkering together, has made a synthetic chromosome for baker’s yeast.

 

Stepping into DIYbio and Synthetic Biology at ScienceHack

Posted April 22, 2014 by Heather McGaw and Kyrie Vala-Webb

We got a crash course on genetics and protein pathways, and then set out to design and build our own pathways using both the “Genomikon: Violacein Factory” kit and Synbiota platform. With Synbiota’s software, we dragged and dropped the enzymes to create the sequence that we were then going to build out. After a process of sketching ideas, mocking up pathways, and writing hypotheses, we were ready to start building!

The night stretched long, and at midnight we were forced to vacate the school. Not quite finished, we loaded our delicate bacteria, incubator, and boxes of gloves onto the bus and headed back to complete our bacterial transformation in one of our hotel rooms. Jammed in between the beds and the mini-fridge, we heat-shocked our bacteria in the hotel ice bucket. It was a surreal moment.

While waiting for our bacteria, we held an “unconference” where we explored bioethics, security and risk related to synthetic biology, 3D printing on Mars, patterns in juggling (with live demonstration!), and even did a Google Hangout with Rob Carlson. Every few hours, we would excitedly check in on our bacteria, looking for bacterial colonies and the purple hue characteristic of violacein.

Most impressive was the wildly successful and seamless integration of a diverse set of people: in a matter of hours, we were transformed from individual experts and practitioners in assorted fields into cohesive and passionate teams of DIY biologists and science hackers. The ability of everyone to connect and learn was a powerful experience, and over the course of just one weekend we were able to challenge each other and grow.

Returning to work on Monday, we were hungry for more. We wanted to find a way to bring the excitement and energy from the weekend into the studio and into the projects we’re working on. It struck us that there are strong parallels between design and DIYbio, and we knew there was an opportunity to bring some of the scientific approaches and curiosity into our studio.

 

 


Prologue to Cancer – e-book Volume One – Where are we in this journey?

Author and Curator: Larry H. Bernstein, MD, FCAP

Article ID #128: Prologue to Cancer – e-book Volume One – Where are we in this journey? Published on 4/13/2014

WordCloud Image Produced by Adam Tubman

Consulting Reviewer and Contributor:  Jose Eduardo de Salles Roselino, MD

 

LH Bernstein

Jose Eduardo de Salles Roselino

 

 

This is a preface to the fourth in the e-book series of Leaders in Pharmaceutical Intelligence, a collaboration of experienced doctorate-level medical and pharmaceutical professionals. The topic is of great current interest: it accounts for a significant part of current medical expenditure, and it concerns a group of neoplastic diseases that may develop at different periods in life and have come to supersede infections, or even to eventuate in infectious disease, as an end-of-life event. The articles presented are a collection of the most up-to-date accounts of a rapidly emerging field of medical research that has benefitted enormously from progress in immunodiagnostics, radiodiagnostics, imaging, predictive analytics, and genomic and proteomic discovery subsequent to the completion of the Human Genome Project, as well as from advances in qPCR, gene sequencing, genome mapping, signaling pathways, exome identification, and the identification of therapeutic targets among the inhibitors, activators, and initiators of cell metabolism, carcinogenesis, cell movement, and metastatic potential. The story is complicated because clinical events we would like to treat as similar prove dissimilar in their expression and classification, whether within the same anatomic class or across different ones. Thus we are faced with constructing an objective, evidence-based understanding that requires the integration of several disciplinary approaches to see a clear picture. The failure to do so creates a high risk of failure in biopharmaceutical development.

The chapters that follow cover novel and important research and development in cancer-related research, diagnostics, and treatment, and together they present a substantial part of the tumor landscape, with some exceptions. Will there ever be a unifying concept, as might be hoped for? I certainly cannot see any such prediction on the horizon. Part of the problem is that disease classification is a human construct made to guide us, and so are treatments, which have existed and been re-examined for over 2,000 years. In that time we have changed, our afflictions have been modified, and our environment has changed with respect to the microorganisms within and around us, viruses, the soil, radiation exposure, the impacts of war and starvation, and access to food. The outline has been given. Organic and inorganic chemistry combined with physics has given us a new enterprise in biosynthetics that is changing, and will continue to change, our world. But let us keep in mind that this too is a human construct, just as drug target development is such a construct: workable, with limitations.

What Molecular Biology Gained from Physics

We need greater clarity and completeness in defining the carcinogenic process. It is the beginning, but not the end. We must first examine the evolution of the scientific structure that leads to our present understanding. Studies of anatomy, physiology, and embryology had to occur as a first step, followed by research into bacteria, fungi, sea urchins, and other evolutionarily earlier organisms that could be studied at simpler scales of development. They remain major objects of study, with the expectation that we can derive lessons about comparative mechanisms that have been passed on through the ages and share common features with man. This became the serious intent of molecular biology, the discipline that set out to explain genetics and to carry out controlled experiments modelled on disciplines that already had enormous success: physics, mathematics, and chemistry. In 1900, when Max Planck hypothesized that the frequency of light emitted by a black body depended on the frequency of the oscillator that emitted it, the idea had important ramifications for chemistry and biology (see Appendix II and Footnote 1 on the Planck equation, energy, and oscillation). The leading idea is to search below the large-scale observations of classical biology.
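In symbols, Planck's relation links the energy of one quantum to the oscillation frequency of the emitter (a standard physics identity, added here for reference):

E = h\nu, \qquad h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s}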

The central dogma of molecular biology, whereby genetic material is transcribed into RNA and then translated into protein, provides a starting point, but the construct is undergoing revision in light of emerging novel roles for RNA and signaling pathways. The term “molecular biology” was coined by Warren Weaver (director of Natural Sciences for the Rockefeller Foundation), who observed the emergence of significant change given recent advances in fields such as X-ray crystallography. Molecular biology also plays an important role in understanding the formation, action, and regulation of the various parts of cells, knowledge that can be used for targeting new drugs, diagnosing disease, and describing the physiology of the cell. The Nobel Prize in Physiology or Medicine in 1969 was shared by Max Delbrück, Alfred D. Hershey, and Salvador E. Luria, whose work on viral replication gave impetus to the field. Delbrück was a physicist who trained in Copenhagen under Bohr and specifically committed himself to bringing to biology the rigor he knew from physics.

Dorothy Hodgkin, protein crystallography

Rosalind Franklin, crystallographer, double helix

Max Delbrück, molecular biology

Max Planck, quantum physics

We then stepped back from classical (descriptive) physiology, with its endless complexity, to molecular biology. This led us to the genetic code and the double-helix model. That model has recently been found insufficiently explanatory, with the construction of triplex and quadruplex models. These have the potential to account for otherwise unaccounted-for building blocks, such as inosine, and we do not know whether more than one model holds validity under different conditions. The other major field of development, simply unaccounted for by the dogma, has been proteomics, especially protein–protein interactions and the energetics of protein conformation, first called to our attention by the work of Jacob, Monod, and Changeux (see Footnote 2). Proteins are not just rigid structures stamped out by the monotonously simple DNA-to-RNA-to-protein concept. Nothing is ever quite so simple: just as there are epigenetic events, there are posttranslational events, and yet more.

JP Changeux

The Emergence of Molecular Biology

I now return the discussion to the topic of medicine, the emergence of molecular biology, and the need for convergence with biochemistry in the mid-20th century. Jose Eduardo de Salles Roselino recalls: “I was previously allowed to make use of the conformational-energy description presented by R. Marcus in his revised Nobel lecture (J. of Electroanalytical Chemistry 438 (1997): 251–259; see Footnote 1). His description of the energetic coordinates of the landscape of a chemical reaction is only a two-dimensional cut of what is in fact a volcano crater (in three dimensions), with nuclear coordinates on the abscissa and, on the ordinate, two components that each vary while their sum remains constant (solvational + vibrational = 100%). If our research methods could discriminate pairs of energies degree by degree, we would most likely have 360 similar representations of the same phenomenon, and the real representation would take all 360 together. If our methodology were coarser, discriminating only differences of 10 degrees out of 360, we would have 36 partial representations of something that, to be perfectly represented, requires all 36 taken together. Can you reconcile that with ATGC? Yet when complete genome sequences were presented, they were described as though we would know everything about the living being. The most important problems in biology will always be viewed with limited vision, and awareness of this limitation is something we should acknowledge and teach. Our knowledge is therefore made up of partial representations. Even if we had the entire genome data, the most intricate biological problems would still not be amenable to this level of reductionism. But going from a general view of signals and symptoms, we can get to the most detailed molecular view, and in this case the genome provides an anchor.”

The “Warburg effect” describes the preference of cancer cells for glycolysis and lactic acid fermentation rather than oxidative phosphorylation for energy production. Mitochondrial metabolism is an important and necessary component in the functioning and maintenance of the cell, and accumulating evidence suggests that dysfunction of mitochondrial metabolism plays a role in cancer. Progress has been made in understanding the mechanisms of the mitochondrial-metabolism-to-glycolysis switch in cancer development and how to target this metabolic switch.

 

 

Glycolysis

Otto Heinrich Warburg (1883–1970)

Louis Pasteur

The expression “Pasteur effect” was coined by Warburg, inspired by Pasteur’s findings in yeast cells, when he investigated this metabolic behavior in cancer cells. In yeast cells, Pasteur had found that the rate of sugar consumption was greatly reduced in the presence of oxygen. Not to be confused with this, in the “Crabtree effect” the rate of sugar metabolism was greatly increased, a reversal, when yeast cells were transferred from an aerobic to an anaerobic condition. Thus, the rate of sugar metabolism in yeast cells was shown to be under metabolic regulatory control, responding to changes in environmental oxygen during growth. Warburg set out to verify whether cancer cells, and the normal mammalian cells of the related tissues, also have a similar control mechanism. He found that this control was present in the normal cells studied but absent in cancer cells. Strikingly, cancer cells continue to show high anaerobic glycolysis despite the presence of oxygen in their culture media (see Footnote 3).

Taking this a step further: in vertebrates, food is digested and supplied to cells mainly in the form of glucose, which is metabolized to produce adenosine triphosphate (ATP) by two pathways. Glycolysis occurs via anaerobic metabolism in the cytoplasm and is of major significance for making ATP quickly, but in a minuscule amount (2 molecules per glucose). In the presence of oxygen, the breakdown process continues in the mitochondria via the Krebs cycle coupled with oxidative phosphorylation, which is far more efficient for ATP production (about 36 molecules per glucose). Cancer cells seem to depend on glycolysis. In the 1920s, Otto Warburg first proposed that cancer cells show increased levels of glucose consumption and lactate fermentation even in the presence of ample oxygen (known as the “Warburg effect”). On this view, oxidative phosphorylation switches to glycolysis, which promotes the proliferation of cancer cells. Many studies have demonstrated glycolysis to be the main metabolic pathway in cancer cells.
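A minimal arithmetic sketch of the energetic difference just described, using the rounded textbook yields quoted in this paragraph (2 ATP per glucose from glycolysis, about 36 from complete oxidation):

# Rounded textbook ATP yields per glucose, as quoted above.
ATP_GLYCOLYSIS = 2
ATP_OXIDATIVE = 36

demand = 1_000_000  # hypothetical ATP demand of a cell, arbitrary units

glucose_if_glycolytic = demand / ATP_GLYCOLYSIS
glucose_if_oxidative = demand / ATP_OXIDATIVE

# A cell relying on glycolysis alone must consume ~18x more glucose.
print(glucose_if_glycolytic / glucose_if_oxidative)  # -> 18.0

This is one way to read the Warburg observation: glycolytic tumor cells pay for fast, oxygen-independent ATP with a much higher glucose appetite.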

Albert Szent-Györgyi (Warburg’s student) and Otto Meyerhof both studied striated skeletal muscle metabolism in vertebrates, and they found the changes Pasteur had observed in yeast. The description of the anaerobic pathway is largely credited to Embden and Meyerhof. Whenever muscle work increases and the energy need rises above what the blood supply can provide, cell metabolism changes from aerobic metabolism (in which acetyl-CoA provides the chemical energy for aerobic production of ATP) to anaerobic metabolism of glucose. In this condition, glucose is obtained directly from the muscle’s glycogen stores (not from hepatic glycogenolysis). This is the sole source of chemical energy that is independent of the oxygen supplied to the cell. It is a physiological change in muscle metabolism that favors autonomy: it does not depend upon blood oxygen for aerobic metabolism, on blood-borne carbon metabolites from adipose tissue (free fatty acids) or muscle proteins (branched-chain amino acids), or on vascular delivery of glucose. Under that condition, the muscle can perform contraction on its internal source of ATP and uses the conversion of pyruvate to lactate to regenerate the much-needed NAD+ (by hydride transfer from NADH to pyruvate) as a replacement for this mitochondrial function. This regulatory change keeps glycolysis running at a fast rate in order to meet the ATP needs of the cell under low-yield conditions (only two or three ATP for each glucose converted into two lactate molecules); therefore, it cannot last for long periods of time. This regulatory metabolic change occurs in seconds to minutes and therefore relies on the proteins already present in the cell. It does not require the effect of transcription factors and/or changes in gene expression (see Footnotes 1, 2).

Other types of mammalian cells also show permanently elevated anaerobic metabolism: cells of the lens of the eye (about 86% glycolysis plus the pentose shunt) and red blood cells (RBCs), both of which lack mitochondria, and the cells of the deep medullary layer of the kidney, where blood perfusion is normally reduced (a condition required for the counter-current mechanism and our ability to concentrate urine). In the case of RBCs, this includes the ability to produce, in a shunt of the glycolytic pathway, 2,3-diphosphoglycerate, which is required to hold the hemoglobin macromolecule in an unstable equilibrium between its two forms (R and T, presented here in the simplified form of the Monod, Wyman and Changeux model; the full model is considerably more complex; see, for instance, the H-W and K review, Nature 2007, vol. 450, pp. 964–972).
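For reference, the simplified two-state (R/T) Monod–Wyman–Changeux description mentioned above is usually written as a fractional saturation function. In the standard textbook notation (not taken from this article), with \alpha = [S]/K_R, c = K_R/K_T, L the allosteric constant, and n the number of binding sites:

\bar{Y} = \frac{\alpha(1+\alpha)^{n-1} + Lc\alpha(1+c\alpha)^{n-1}}{(1+\alpha)^{n} + L(1+c\alpha)^{n}}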

Any tissue under a condition of ischemia that is required for some medical procedures (open heart surgery, organ transplants, etc) displays this fast regulatory mechanism (See Footnote 1, 2). A display of these regulatory metabolic changes can be seen in: Cardioplegia: the protection of the myocardium during open heart surgery: a review. D. J. Hearse J. Physiol., Paris, 1980, 76, 751-756 (Fig 1).  The following points are made:

1-       It is a fast regulatory response; therefore, no genetic mechanism can be invoked to account for it.

2-       It moves from a reversible to an irreversible condition while the cells are still alive (death can be seen at the bottom end of the arrow). Therefore, it cannot be reconciled with some of the assumptions of molecular biology:

A-       The genes reside inside the heart muscle cells, but in order to preserve intact the source of coded genetic information that the cell reads and transcribes, DNA must be kept at a minimum of chemical reactivity.

B-       If sequence alone determined conformation, activity and function, elevated blood potassium levels could not cause cardiac arrest.

In comparison with the conditions presented here, cancer cells keep both metabolic options for glucose metabolism available at the same time. These cells can use the glucose that our body provides to them, or adopt temporarily an independent metabolic form without the usual requirement for oxygen (one or the other form for ATP generation). ATP generation is, here, an over-simplification of the metabolic status, since the carbon flow toward building blocks must also be considered; in this sense the oxidative metabolism of glucose in cancer cells may be viewed as a rich source of the organic molecules, or building blocks, that dividing cells always need.

JES Roselino has conjectured that “most of the Krebs cycle reactions work as ideal reversible thermodynamic systems that can supply any organic molecule whose absence would prevent cell duplication.” In Warburg’s view, cancer cells have a defect in the Pasteur-effect metabolic control. Were it functioning normally, it would indicate which metabolic form of glucose metabolism is adequate for each condition. What is more, cancer cells lack differentiated cell function. Any role for transcription factors must be considered among the factors that led to the stable phenotypic change of cancer cells. The failure of the Pasteur effect must be sought among the fast regulatory mechanisms that are not dependent on gene expression (see Footnote 3).

Extending the thoughts of JES Roselino (Hepatology 1992; 16: 1055–1060): reduced blood flow, caused by increased hydrostatic pressure in extrahepatic cholestasis, decreases mitochondrial function (quoted in Hepatology) and, as part of the normal Pasteur-effect response, increases glycolysis under partial and/or functional anaerobiosis; this in turn blocks the gluconeogenic activity of hepatocytes, which requires that glycolysis be inhibited. In this case a clear energetic link can be perceived between the reduced energy supply and the ability to perform a differentiated hepatic function (gluconeogenesis). In cancer cells, the actions of transcription factors, which can be viewed as different ensembles of kaleidoscopic pieces (with activities that change as cell conditions change), are clearly linked to the new stable phenotype. Regarding the extrahepatic cholestasis mentioned above, it must be reckoned that when a persistent chronic condition is studied, a secondary cirrhosis sets in, an example of a persistent stable condition that is difficult to reverse and that requires no genetic mutation (see Footnote 4).

 The Rejection of Complexity

Most of our reasoning about genes was derived from scientific work in microorganisms. These works have provided great advances in biochemistry.

DNA diagram showing base pairing (double helix)

Genome cartoon

Formation of a triplex (triple helix) DNA structure

1-      The “Gelehrter idea”: no matter what you are doing, you will always be better off if you have a gene (chapter 7, Principles of Medical Genetics, Gelehrter and Collins, Williams & Wilkins, 1990).

2-      The idea that everything can be explained by the one gene–one enzyme relationship, which works fine for our understanding of metabolism, in all biological problems.

3-      The idea that whatever explains biochemistry in microorganisms also explains it in every living being (M. Nirenberg).

4-      The idea that biochemistry need not take time into account; time must be considered only for genetic and biological evolution studies (S. Luria, in Life: The Unfinished Experiment, 1977, C. Scribner’s Sons, NY).

5-      Finally, the idea that everything in biology can be found in the genome, since all information in biology flows from DNA through RNA to proteins (or, if the strict line through RNA is not included, resides in the DNA alone).

This last point can be accepted only if it is understood that all GENETIC information is in our DNA: genetics as part of life, not as its total expression.

For example, when our body is informed that the ambient temperature is too low, or alternatively too high, it is receiving information that arrives from the environment. This external information will affect our proteins and eventually, if the new condition persists for longer periods, will cause an adaptive response that may include conformational changes in transcription factors (proteins), which in turn produce new readings of the DNA. However, this information moves from the outside to proteins, not from DNA to proteins. Only the last step of the pathway, in which transcription factors change their conformation and change the reading of DNA, follows the dogmatic view, as an adaptive response (see Footnotes 1–3).

However, when time is taken into account, the first reactions against cold or warmth are the ones that happen through changes in protein conformation, activity and function, before any change in gene expression can be noticed at the protein level. These fast changes, over seconds and minutes, cannot be explained by changes in gene expression and are strongly linked to what is needed for the maintenance of life.

“It is possible,” says Roselino, “and desirable, to explain all these fast biochemical responses to changes in a living being’s condition as the sound foundation of medical practice without a single mention of DNA. In case a failure in any mechanism necessary to life is found to be genetic in its origin, the genome, in context with this huge set of transcription factors, must be taken into account. This is the biochemical line of reasoning that I learned from Houssay and Leloir. It would be an honor to see it restored in modern terms.”

More on the Mechanism of Metabolic Control

It was important that genomics played such a large role in medical research over the last 70 years. There is also good reason to rethink the objections raised in the past year by the Nobelists James Watson and Randy Schekman, whatever discomfort that brings. Molecular biology has become a tautology and, as a result, has undermined scientific rigor within biology.

Crick & Watson with their DNA model, 1953

Randy Schekman, Berkeley

 

 

According to JES Roselino, “Consider that glycolysis is oscillatory thanks to the kinetic behavior of phosphofructokinase. Further, through its effect upon pyruvate kinase via oscillatory levels of fructose 1,6-diphosphate, the inhibition of gluconeogenesis is also oscillatory. When the carbon flow through glycolysis is driven to a maximal level, gluconeogenesis will be almost completely blocked. The reversal of the pyruvate kinase step in liver requires two enzymes (pyruvate carboxylase, for maintenance of oxaloacetate levels, plus phosphoenolpyruvate carboxykinase, E.C. 4.1.1.32) and energy-requiring reactions that, as an ensemble, most likely could not respond fast enough to the short periods of pyruvate kinase inhibition during high-frequency oscillation of the glycolytic flow. Only when glycolysis oscillates at low frequency could the opposite reactions enable gluconeogenic carbon flow.”

If this can be shown in a convincing way, the same reasoning could be applied to understand how simple replicative signals inducing the G0-to-G1 transition in cells could easily overcome the more complex signals required for cell differentiation and differentiated function.
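As an illustration of how a single glycolytic step can generate sustained oscillations, here is a minimal numerical sketch of a Sel'kov-type model, a classic two-variable caricature of phosphofructokinase kinetics; the parameter values are arbitrary, and this is offered as a generic illustration rather than the specific mechanism argued above:

# Sel'kov-type glycolytic oscillator (x, y ~ metabolite pools), Euler integration.
# Parameters are illustrative; for a = 0.08, b = 0.6 the system settles on a limit cycle.

def selkov(x, y, a=0.08, b=0.6):
    dx = -x + a * y + x**2 * y
    dy = b - a * y - x**2 * y
    return dx, dy

x, y, dt = 1.0, 1.0, 0.01
trajectory = []
for step in range(60000):
    dx, dy = selkov(x, y)
    x, y = x + dx * dt, y + dy * dt
    if step % 5000 == 0:
        trajectory.append((round(x, 3), round(y, 3)))
print(trajectory)  # the sampled values keep cycling rather than settling to a fixed point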

Perhaps the problem of over-extending the equivalence between the DNA and what happens to the organism is also related to the initial reliance on a single-cell model to relieve the complexity (which it does not fully do).

For instance, consider this fragment:
“Until only recently it was assumed that all proteins take on a clearly defined three-dimensional structure – i.e. they fold in order to be able to assume these functions.”
J.C. Seidel and J. Gergely, Investigation of conformational changes in spin-labeled myosin: model for muscle contraction. Cold Spring Harbor Symp. Quant. Biol. 1973, pp. 187–193.
Huxley, A.F. 1971. Proc. Roy. Soc. (London) B 178: 1.
Huxley, A.F. and R.M. Simmons, 1971. Nature 233: 633.
J.C. Haselgrove, X-ray evidence for a conformational change in the actin-containing filaments. Cold Spring Harbor Symp. Quant. Biol. 1972, vol. 37, pp. 341–352.

This is only a very small sample indicating otherwise. Proteins were held to be interacting macromolecules, changing their conformation in regulatory response to changes in the microenvironment (see Footnote 2). DNA was the opposite: a non-interacting macromolecule, required to be as stable as a library must be.

The dogma held that the properties of proteins could be read from DNA alone. Consequently, the few examples quoted above had to be ignored, and everyone was to believe that DNA alone, without any role for environmental factors, controls protein amino acid sequence (true), conformation (not true), activity (not true) and function (not true).

From the dogma it appeared, naively, correct to conclude from interpreting your genome: “You have a 50% increased risk of developing the following disease” (a deterministic statement). The correct form must be: “You belong to a population that has a 50% increase in the risk of…”, followed by what you should do to avoid increasing your personal risk and the care you should take if you want a longer healthy life. Thus genetic and non-genetic diseases were treated as the same, and medical foundations were reinforced by magical considerations (dogmas) in a way very profitable for those involved, other than the patient.

 Footnotes:

  1. There is a link between electricity, ions in biology, and the oscillatory behavior of some electrical discharges. In addition, the oscillatory form of electrical discharge may have allowed Planck to relate high energy content with higher frequencies and, conversely, low energy content with low-frequency oscillatory events. One may think of high density as an indication of a great amount of matter inside a volume of space; this helps in understanding Planck’s idea of a high density of energy in time for a high-frequency phenomenon.
  2. Take into account a protein whose conformation may be restricted by an S–S bridge. This protein may also move to another, more flexible conformation when it is in the HS HS condition, after the S–S bridge is broken. Consider also that it takes some time for a protein to move from one conformation, for instance the restricted (S–S) conformation, to others, and a few seconds or minutes to return to the S–S conformation (this is Daniel Koshland’s concept of induced fit and relaxation time, used by him to explain allosteric behavior in monomeric proteins; the Monod, Wyman and Changeux mechanism requires tetrameric, or at least dimeric, proteins).
  3. If glycolysis oscillates at a frequency much higher than the relaxation time, the high-NADH effect would prevail, leading to a high HS/HS condition; at low glycolytic frequency, the S–S condition would predominate, affecting protein conformation. If instead the NAD effect upon the protein S–S predominated, you would get the opposite results. The enormous effort to display the effect of citrate and ATP upon phosphofructokinase conformation was made by others. Note that the action of ATP as an inhibitor is unusual here, since it is a substrate of the reaction, as is its action together with the activator F1,6-P (or its equivalent F2,6-P); however, this explains the oscillatory behaviour of glycolysis (Goldhammer, A.R. and Paradies: PFK structure and function, Curr. Top. Cell Reg. 1979; 15: 109–141).
  4. The results presented in our Hepatology work must be viewed in the following way: if the hepatic (oxygenated) arterial blood flow is preserved, the bile-secretory cells of the liver receive well-oxygenated blood (arterial branches bathe the secretory cells, while branches originating from the portal vein irrigate the hepatocytes). During extrahepatic cholestasis, the low-pressure portal blood flow is reduced, and the hepatocytes do not receive enough oxygen to produce the ATP that gluconeogenesis demands. The hepatic artery does not replace this flow, since its branches join the portal flow only after the arterial pressure has fallen to a low venous pressure, at the point where the hepatic vein forms; otherwise, the flow in the portal vein would be reversed, from liver to intestine. Possible valves do not alter this reasoning, since minimal arterial pressure is well above maximal venous pressure, and such a valve would be kept permanently closed. Under low portal blood flow, the hepatocyte increases pyruvate kinase activity, and with increased pyruvate kinase activity gluconeogenesis is forbidden (see the Walsh & Cooper review quoted in the Hepatology paper as ref. 23). For the hemodynamic considerations and the roles of arteries and veins in the hepatic portal system, see references 44 and 45, Rappaport and Schneiderman, and Rappaport.

 

 Appendix I.

metabolic pathways

Signals Upstream and Targets Downstream of Lin28 in the Lin28 Pathway

1.  Functional Proteomics Adds to Our Understanding

Ben Schuler’s research group at the Institute of Biochemistry of the University of Zurich has now established that an increase in temperature leads to unfolded proteins collapsing and becoming more compact. Other environmental factors can trigger the same effect; the crowded environment inside cells likewise leads these proteins to shrink. As such proteins interact with other molecules in the body and bring other proteins together, understanding these processes is essential, “as they play a major role in many processes in our body, for instance in the onset of cancer,” comments study coordinator Ben Schuler.

Measurements using the “molecular ruler”

“The fact that unfolded proteins shrink at higher temperatures is an indication that cell water does indeed play an important role as to the spatial organisation eventually adopted by the molecules”, comments Schuler with regard to the impact of temperature on protein structure. For their studies the biophysicists use what is known as single-molecule spectroscopy. Small colour probes in the protein enable the observation of changes with an accuracy of more than one millionth of a millimetre. With this “molecular yardstick” it is possible to measure how molecular forces impact protein structure.
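The “molecular ruler” described here is single-molecule fluorescence spectroscopy with two colour probes; assuming the readout is Förster resonance energy transfer between those probes (an assumption consistent with, but not stated in, this excerpt), the distance dependence follows the standard Förster relation:

E = \frac{1}{1 + (r/R_0)^{6}}

where r is the inter-probe distance and R_0 (typically a few nanometres) is the probe-pair-specific distance at which transfer efficiency is 50%.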

With computer simulations the researchers have mimicked the behaviour of disordered proteins. They want to use them in future for more accurate predictions of their properties and functions.

Correcting test tube results

That’s why it’s important, according to Schuler, to monitor the proteins not only in the test tube but also in the organism. “This takes into account the fact that it is very crowded on the molecular level in our body as enormous numbers of biomolecules are crammed into a very small space in our cells”, says Schuler. The biochemists have mimicked this “molecular crowding” and observed that in this environment disordered proteins shrink, too.

Given these results many experiments may have to be revisited as the spatial organisation of the molecules in the organism could differ considerably from that in the test tube according to the biochemist from the University of Zurich. “We have, therefore, developed a theoretical analytical method to predict the effects of molecular crowding.” In a next step the researchers plan to apply these findings to measurements taken directly in living cells.

More information: Andrea Soranno, Iwo Koenig, Madeleine B. Borgia, Hagen Hofmann, Franziska Zosel, Daniel Nettels, and Benjamin Schuler. Single-molecule spectroscopy reveals polymer effects of disordered proteins in crowded environments. PNAS, March 2014. DOI: 10.1073/pnas.1322611111

 

Effects of Hypoxia on Metabolic Flux

  2. Glucose-6-phosphate dehydrogenase regulation in the hepatopancreas of the anoxia-tolerant marine mollusc, Littorina littorea

JL Lama, RAV Bell and KB Storey

Glucose-6-phosphate dehydrogenase (G6PDH) gates flux through the pentose phosphate pathway and is key to cellular antioxidant defense due to its role in producing NADPH. Good antioxidant defenses are crucial for anoxia-tolerant organisms that experience wide variations in oxygen availability. The marine mollusc, Littorina littorea, is an intertidal snail that experiences daily bouts of anoxia/hypoxia with the tide cycle and shows multiple metabolic and enzymatic adaptations that support anaerobiosis. This study investigated the kinetic, physical and regulatory properties of G6PDH from hepatopancreas of L. littorea to determine if the enzyme is differentially regulated in response to anoxia, thereby providing altered pentose phosphate pathway functionality under oxygen stress conditions.

Several kinetic properties of G6PDH differed significantly between aerobic and 24 h anoxic conditions; compared with the aerobic state, anoxic G6PDH (assayed at pH 8) showed a 38% decrease in the Km for G6P and enhanced inhibition by urea, whereas in pH 6 assays the Km for NADP and the maximal activity changed significantly.

All these data indicated that the aerobic and anoxic forms of G6PDH were the high and low phosphate forms, respectively, and that phosphorylation state was modulated in response to selected endogenous protein kinases (PKA or PKG) and protein phosphatases (PP1 or PP2C). Anoxia-induced changes in the phosphorylation state of G6PDH may facilitate sustained or increased production of NADPH to enhance antioxidant defense during long term anaerobiosis and/or during the transition back to aerobic conditions when the reintroduction of oxygen causes a rapid increase in oxidative stress.

Lama et al.  Peer J 2013.   http://dx.doi.org/10.7717/peerj.21
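To see why a lower Km matters at subsaturating substrate, here is a small Michaelis–Menten sketch in Python; the absolute values are hypothetical, and only the 38% decrease in Km is taken from the summary above:

# Michaelis-Menten rate v = Vmax * S / (Km + S).
# Values below are hypothetical; only the 38% Km drop mirrors the summary above.

def mm_rate(s, vmax, km):
    return vmax * s / (km + s)

vmax = 1.0          # arbitrary units
km_aerobic = 0.50   # hypothetical Km for G6P (mM)
km_anoxic = km_aerobic * (1 - 0.38)  # 38% lower Km under anoxia

s = 0.2  # hypothetical subsaturating [G6P] (mM)
print(mm_rate(s, vmax, km_aerobic))  # slower
print(mm_rate(s, vmax, km_anoxic))   # faster at the same substrate level

The lower Km form reaches a higher fraction of Vmax at the same substrate concentration, which is one way an enzyme can sustain flux when supply is limited.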

 

  3. Structural Basis for Isoform-Selective Inhibition in Nitric Oxide Synthase

    TL Poulos and H Li

In the cardiovascular system, the enzyme nitric oxide synthase (NOS) converts L-arginine into L-citrulline and releases the important signaling molecule nitric oxide (NO). NO produced by endothelial NOS (eNOS) relaxes smooth muscle, which controls vascular tone and blood pressure. Neuronal NOS (nNOS) produces NO in the brain, where it influences a variety of neural functions such as neurotransmitter release. NO can also support the immune system, serving as a cytotoxic agent during infections. Even with all of these important functions, NO is a free radical and, when overproduced, it can cause tissue damage. This mechanism can operate in many neurodegenerative diseases, and as a result the development of drugs targeting nNOS is a desirable therapeutic goal.

However, the active sites of all three human isoforms are very similar, and designing inhibitors specific for nNOS is a challenging problem. It is critically important, for example, not to inhibit eNOS owing to its central role in controlling blood pressure. In this Account, we summarize our efforts in collaboration with Rick Silverman at Northwestern University to develop drug candidates that specifically target NOS using crystallography, computational chemistry, and organic synthesis. As a result, we have developed aminopyridine compounds that are 3800-fold more selective for nNOS than eNOS, some of which show excellent neuroprotective effects in animal models. Our group has solved approximately 130 NOS-inhibitor crystal structures, which have provided the structural basis for our design efforts. Initial crystal structures of nNOS and eNOS bound to selective dipeptide inhibitors showed that a single amino acid difference (Asp in nNOS and Asn in eNOS) results in much tighter binding to nNOS. The NOS active site is open and rigid, which produces few large structural changes when inhibitors bind. However, we have found that relatively small changes in the active site and inhibitor chirality can account for large differences in isoform-selectivity. For example, we expected that the aminopyridine group on our inhibitors would form a hydrogen bond with a conserved Glu inside the NOS active site. Instead, in one group of inhibitors, the aminopyridine group extends outside of the active site, where it interacts with a heme propionate. For this orientation to occur, a conserved Tyr side chain must swing out of the way. This unanticipated observation taught us about the importance of inhibitor chirality and active site dynamics. We also successfully used computational methods to gain insights into the contribution of the state of protonation of the inhibitors to their selectivity. Employing the lessons learned from the aminopyridine inhibitors, the Silverman lab designed and synthesized symmetric double-headed inhibitors with an aminopyridine at each end, taking advantage of their ability to make contacts both inside and outside of the active site. Crystal structures provided yet another unexpected surprise: two of the double-headed inhibitor molecules bound to each enzyme subunit, and one molecule participated in the generation of a novel Zn site that required some side chains to adopt alternate conformations. Therefore, in addition to achieving our specific goal, the development of nNOS-selective compounds, we have learned how subtle differences in chirality and structure can control protein–ligand interactions, often in unexpected ways.

 

Nitric oxide synthase

Arginine–NO–citrulline cycle

Active site of eNOS (PDB: 1P6L) and nNOS (PDB: 1P6H)

 

 

NO – muscle, vasculature, mitochondria

Figure: (A) Structure of one of the early dipeptide lead compounds, 1, that exhibits excellent isoform selectivity. (B, C) Crystal structures of the dipeptide inhibitor 1 in the active site of eNOS (PDB: 1P6L) and nNOS (PDB: 1P6H). In nNOS, the inhibitor “curls,” which enables the inhibitor R-amino group to interact with both Glu592 and Asp597. In eNOS, Asn368 is the homologue of nNOS Asp597.

Acc. Chem. Res. 2013; 46(2): 390–398.

  4. Jamming a Protein Signal

Interfering with a single cancer-promoting protein and its receptor can open this resistance mechanism by initiating autophagy of the affected cells, according to researchers at The University of Texas MD Anderson Cancer Center writing in the journal Cell Reports. According to Dr. Anil Sood and Yunfei Wen, lead and first authors, blocking prolactin, a potent growth factor for ovarian cancer, sets off downstream events that result in cell death by autophagy, the process that recycles damaged organelles and proteins for new use by the cell through the phagolysosome. This, in turn, provides a clinical rationale for blocking prolactin and its receptor to initiate sustained autophagy as an alternative strategy for treating cancers.

Steep reductions in tumor weight

Prolactin (PRL) is a hormone previously implicated in ovarian, endometrial and other cancer development and progression. When PRL binds to its cell-membrane receptor, PRLR, activation of cancer-promoting cell signaling pathways follows. A variant of normal prolactin called G129R blocks the reaction between prolactin and its receptor. Sood and colleagues treated mice bearing two different lines of human ovarian cancer, both expressing the prolactin receptor, with G129R. Tumor weights fell by 50 percent for mice with either type of ovarian cancer after 28 days of treatment with G129R, and adding the taxane-based chemotherapy agent paclitaxel cut tumor weight by 90 percent. They surmise that higher doses of G129R may result in even greater therapeutic benefit.

 

3D experiments show death by autophagy

 


 

Next the team used the prolactin-mimicking peptide to treat cultures of cancer spheroids, which sharply reduced their numbers and blocked the activation of the JAK2 and STAT signaling pathways.

Protein analysis of the treated spheroids showed increased presence of autophagy factors and genomic analysis revealed increased expression of a number of genes involved in autophagy progression and cell death.  Then a series of experiments using fluorescence and electron microscopy showed that the cytosol of treated cells had large numbers of cavities caused by autophagy.

The team also connected the G129R-induced autophagy to the activity of PEA-15, a known cancer inhibitor. Analysis of tumor samples from 32 ovarian cancer patients showed that tumors express higher levels of the prolactin receptor and lower levels of phosphorylated PEA-15 than normal ovarian tissue. However, patients with low levels of the prolactin receptor and higher PEA-15 had longer overall survival than those with high PRLR and low PEA-15.

Source: MD Anderson Cancer Center

 

  5. Chemists’ Work with Small Peptide Chains of Enzymes

Korendovych and his team designed seven simple peptides, each containing seven amino acids. They then allowed the molecules of each peptide to self-assemble, or spontaneously clump together, to form amyloids. (Zinc, a metal with catalytic properties, was introduced to speed up the reaction.) What they found was that four of the seven peptides catalyzed the hydrolysis of molecules known as esters, compounds that react with water to yield alcohols and acids—a feat not uncommon among certain enzymes.

“It was the first time that a peptide this small self-assembled to produce an enzyme-like catalyst,” says Korendovych. “Each enzyme has to be an exact fit for its respective substrate,” he says, referring to the molecule with which an enzyme reacts. “Even after millions of years, nature is still testing all the possible combinations of enzymes to determine which ones can catalyze metabolic reactions. Our results make an argument for the design of self-assembling nanostructured catalysts.”

Source: Syracuse University

Here are three articles emphasizing the value of combinatorial analysis, which can be performed on genomic, clinical, and proteomic data sets.

 

  6. Comparative analysis of differential network modularity in tissue-specific normal and cancer protein interaction networks

    F Islam, M Hoque, RS Banik, S Roy, SS Sumi, et al.

As most biological networks show modular properties, the analysis of differential modularity between normal and cancer protein interaction networks can be a good way to understand cancer more significantly. Two aspects of biological network modularity e.g. detection of molecular complexes (potential modules or clusters) and identification of crucial nodes forming the overlapping modules have been considered in this regard.

The computational analysis of previously published protein interaction networks (PINs) has been conducted to identify the molecular complexes and crucial nodes of the networks. Protein molecules involved in ten major cancer signal transduction pathways were used to construct the networks based on expression data of five tissues e.g. bone, breast, colon, kidney and liver in both normal and cancer conditions.

Cancer PINs show a higher level of clustering (formation of molecular complexes) than the normal ones. In contrast, a lower level of modular overlap is found in cancer PINs than in normal ones. Thus a proposition can be made that some giant nodes of very high degree form in the cancer networks, resulting in reduced overlap among the network modules even though the predicted numbers of molecular complexes are higher in cancer conditions.

Islam et al. Journal of Clinical Bioinformatics 2013, 3:19-32
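A minimal sketch of the kind of comparison described, module detection and clustering on protein interaction networks, using NetworkX on two tiny invented graphs rather than the published tissue-specific PINs:

# Compare clustering and community structure of two toy interaction networks.
# Edges are invented placeholders, not the published tissue-specific PINs.
import networkx as nx

normal = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "F")])
cancer = nx.Graph([("A", "B"), ("A", "C"), ("A", "D"), ("A", "E"),
                   ("B", "C"), ("D", "E"), ("A", "F")])  # "A" acts as a giant hub

for name, g in [("normal", normal), ("cancer", cancer)]:
    communities = nx.algorithms.community.greedy_modularity_communities(g)
    print(name,
          "avg clustering:", round(nx.average_clustering(g), 3),
          "modules:", [sorted(c) for c in communities])

The same two measures (clustering and module membership), computed on real tissue-specific networks, underlie the comparison summarized above.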

  7. A new 12-gene diagnostic biomarker signature of melanoma revealed by integrated microarray analysis

    Wanting Liu, Yonghong Peng and Desmond J. Tobin
    PeerJ 1:e49;  http://dx.doi.org/10.7717/peerj.49

Here we present an integrated microarray analysis framework, based on a genome-wide relative significance (GWRS) and genome-wide global significance (GWGS) model. When applied to five microarray datasets on melanoma published between 2000 and 2011, this method revealed a new signature of 200 genes. When these were linked to so-called ‘melanoma driver’ genes involved in MAPK, Ca2+, and WNT signaling pathways, we were able to produce a new 12-gene diagnostic biomarker signature for melanoma (i.e., EGFR, FGFR2, FGFR3, IL8, PTPRF, TNC, CXCL13, COL11A1, CHP2, SHC4, PPP2R2C, and WNT4). We have begun to experimentally validate a subset of these genes involved in MAPK signaling at the protein level, including CXCL13, COL11A1, PTPRF and SHC4, and found these to be overexpressed in metastatic and primary melanoma cells in vitro and in situ compared to melanocytes cultured from healthy skin epidermis and normal healthy human skin.

 

catalytic amyloid-forming particle

  8. PanelomiX: A threshold-based algorithm to create panels of biomarkers

    X Robin, N Turck, A Hainard, N Tiberti, et al.
    Translational Proteomics 2013.  http://dx.doi.org/10.1016/j.trprot.2013.04.003

The PanelomiX toolbox combines biomarkers and evaluates the performance of panels to classify patients better than single markers or other classifiers. The ICBT algorithm proved to be an efficient classifier, the results of which can easily be interpreted.
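The general idea of a threshold-based biomarker panel can be sketched as follows; this is a generic illustration with hypothetical markers, thresholds, and a simple voting rule, not the published ICBT algorithm:

# Generic threshold-based biomarker panel: a sample is called positive when at
# least `min_hits` markers exceed their thresholds. Values are hypothetical.

PANEL = {          # marker -> decision threshold (arbitrary units)
    "marker_1": 2.5,
    "marker_2": 110.0,
    "marker_3": 0.8,
}

def classify(sample, panel=PANEL, min_hits=2):
    hits = sum(sample[m] > t for m, t in panel.items())
    return "positive" if hits >= min_hits else "negative"

patient = {"marker_1": 3.1, "marker_2": 95.0, "marker_3": 1.2}
print(classify(patient))  # two of three thresholds exceeded -> "positive"

The appeal of such panels, as the summary notes, is interpretability: each decision can be traced back to which thresholds were crossed.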

Here are two current examples of the immense role played by signaling pathways in carcinogenic mechanisms and in treatment targeting, which is also confounded by acquired resistance.

 

  9. Triple-Negative Breast Cancer

  1. epidermal growth factor receptor (EGFR or ErbB1) and
  2. high activity of the phosphatidylinositol 3-kinase (PI3K)–Akt pathway

are both targeted in triple-negative breast cancer (TNBC).

  • activation of another EGFR family member [human epidermal growth factor receptor 3 (HER3) (or ErbB3)] may limit the antitumor effects of these drugs.

This study found that TNBC cell lines cultured with the EGFR or HER3 ligand EGF or heregulin, respectively, and treated with either an Akt inhibitor (GDC-0068) or a PI3K inhibitor (GDC-0941) had increased abundance and phosphorylation of HER3.

The phosphorylation of HER3 and EGFR in response to these treatments

  1. was reduced by the addition of a dual EGFR and HER3 inhibitor (MEHD7945A).
  2. MEHD7945A also decreased the phosphorylation (and activation) of EGFR and HER3 and
  3. the phosphorylation of downstream targets that occurred in response to the combination of EGFR ligands and PI3K-Akt pathway inhibitors.

In culture, inhibition of the PI3K-Akt pathway combined with either MEHD7945A or knockdown of HER3

  1. decreased cell proliferation compared with inhibition of the PI3K-Akt pathway alone.
  2. Combining either GDC-0068 or GDC-0941 with MEHD7945A inhibited the growth of xenografts derived from TNBC cell lines or from TNBC patient tumors, and
  3. this combination treatment was also more effective than combining either GDC-0068 or GDC-0941 with cetuximab, an EGFR-targeted antibody.
  4. After therapy with EGFR-targeted antibodies, some patients had residual tumors with increased HER3 abundance and EGFR/HER3 dimerization (an activating interaction).

Thus, we propose that concomitant blockade of EGFR, HER3, and the PI3K-Akt pathway in TNBC should be investigated in the clinical setting.

Reference: Antagonism of EGFR and HER3 Enhances the Response to Inhibitors of the PI3K-Akt Pathway in Triple-Negative Breast Cancer. JJ Tao, P Castel, N Radosevic-Robin, M Elkabets, et al. Sci. Signal., 25 March 2014; 7(318), p. ra29. http://dx.doi.org/10.1126/scisignal.2005125

 

10. Metastasis in RAS Mutant or Inhibitor-Resistant Melanoma Cells

The protein kinase BRAF is mutated in about 40% of melanomas, and BRAF inhibitors improve progression-free and overall survival in these patients. However, after a relatively short period of disease control, most patients develop resistance because of reactivation of the RAF–ERK (extracellular signal–regulated kinase) pathway, mediated in many cases by mutations in RAS. We found that BRAF inhibition induces invasion and metastasis in RAS mutant melanoma cells through a mechanism mediated by the reactivation of the MEK (mitogen-activated protein kinase kinase)–ERK pathway.

Reference: BRAF Inhibitors Induce Metastasis in RAS Mutant or Inhibitor-Resistant Melanoma Cells by Reactivating MEK and ERK Signaling. B Sanchez-Laorden, A Viros, MR Girotti, M Pedersen, G Saturno, et al., Sci. Signal., 25 March 2014;  7(318), p. ra30  http://dx.doi.org/10.1126/scisignal.2004815

Appendix II.

The world of physics in the twentieth century saw the end of the determinism established by Newton, characterized by discrete laws that describe natural observations, in gravity and in electricity. In an early phase of investigation, the era of galvanic or voltaic electricity represented a revolutionary break from the historical focus on frictional electricity. Alessandro Volta discovered that chemical reactions could be used to create positively charged anodes and negatively charged cathodes. In 1790, Prof. Luigi Aloisio Galvani of Bologna, while conducting experiments on “animal electricity,” noticed the twitching of a frog’s legs in the presence of an electric machine. He observed that a frog’s muscle, suspended on an iron balustrade by a copper hook passing through its dorsal column, underwent lively convulsions without any extraneous cause, the electric machine being at this time absent. Volta communicated a description of his pile to the Royal Society of London, and shortly thereafter Nicholson and Carlisle (1800) produced the decomposition of water by means of the electric current, using Volta’s pile as the source of electromotive force.

Siméon Denis Poisson attacked the difficult problem of induced magnetization, and his results provided  a first approximation. His innovation required the application of mathematics to physics.  His memoirs on the theory of electricity and magnetism created a new branch of mathematical physics.  The discovery of electromagnetic induction was made almost simultaneously and independently by Michael Faraday and Joseph Henry. Michael Faraday, the successor of Humphry Davy, began his epoch-making research relating to electric and electromagnetic induction in 1831. In his investigations of the peculiar manner in which iron filings arrange themselves on a cardboard or glass in proximity to the poles of a magnet, Faraday conceived the idea of magnetic “lines of force” extending from pole to pole of the magnet and along which the filings tend to place themselves. On the discovery being made that magnetic effects accompany the passage of an electric current in a wire, it was also assumed that similar magnetic lines of force whirled around the wire. He also posited that iron, nickel, cobalt, manganese, chromium, etc., are paramagnetic (attracted by magnetism), whilst other substances, such as bismuth, phosphorus, antimony, zinc, etc., are repelled by magnetism or are diamagnetic.

In the latter half of the 19th century, Fleeming Jenkin’s work on ‘Electricity and Magnetism’ and Clerk Maxwell’s ‘Treatise on Electricity and Magnetism’ were published. About 1850 Kirchhoff published his laws relating to branched or divided circuits. He also showed mathematically that, according to the then prevailing electrodynamic theory, electricity would be propagated along a perfectly conducting wire with the velocity of light. Hermann Helmholtz investigated the effects of induction on the strength of a current and deduced mathematical equations, which experiment confirmed. In 1853 Sir William Thomson (later Lord Kelvin) predicted, as a result of mathematical calculations, the oscillatory nature of the electric discharge of a condenser circuit; Joseph Henry had already discerned the oscillatory nature of the Leyden jar discharge in 1842.

In 1864 James Clerk Maxwell announced his electromagnetic theory of light, perhaps the greatest single step in the world’s knowledge of electricity. Maxwell had studied and commented on the field of electricity and magnetism as early as 1855–56, when On Faraday’s Lines of Force was read to the Cambridge Philosophical Society. The paper presented a simplified model of Faraday’s work and showed how the two phenomena were related. He reduced all of the current knowledge into a linked set of differential equations, with 20 equations in 20 variables; this work was later published as On Physical Lines of Force in 1861. In order to determine the force acting on any part of the machine, one must find its momentum and then calculate the rate at which this momentum is being changed; this rate of change gives the force. The method of calculation required was first given by Lagrange and afterwards developed, with some modifications, by Hamilton. Maxwell showed how these methods of calculation could be applied to the electromagnetic field. The energy of a dynamical system is partly kinetic, partly potential; Maxwell supposed that the magnetic energy of the field is kinetic energy and the electric energy potential. Around 1862, while lecturing at King’s College, Maxwell calculated that the speed of propagation of an electromagnetic field is approximately that of the speed of light. Maxwell’s electromagnetic theory of light obviously involved the existence of electric waves in free space, and his followers set themselves the task of experimentally demonstrating the truth of the theory. In 1871, he presented the Remarks on the Mathematical Classification of Physical Quantities.
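To make that closing calculation concrete, here is the relation in modern vector notation (not Maxwell’s original formulation of 20 equations in 20 variables): in source-free space the field equations combine into a wave equation whose propagation speed is fixed by the electric and magnetic constants, and that speed comes out at the measured speed of light.

\[
\nabla^{2}\mathbf{E} \;=\; \mu_{0}\,\varepsilon_{0}\,\frac{\partial^{2}\mathbf{E}}{\partial t^{2}},
\qquad
c \;=\; \frac{1}{\sqrt{\mu_{0}\,\varepsilon_{0}}} \;\approx\; 3\times10^{8}\ \mathrm{m/s}.
\]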

A Wave-Particle Dilemma at the Century End

In 1897 J.J. Thomson performed experiments indicating that cathode rays really were particles, found an accurate value for their charge-to-mass ratio e/m, and found that e/m was independent of cathode material. He made good estimates of both the charge e and the mass m, finding that cathode ray particles, which he called “corpuscles”, had perhaps one thousandth of the mass of the least massive ion known (hydrogen). He further showed that the negatively charged particles produced by radioactive materials, by heated materials, and by illuminated materials were universal. In the late 19th century, the Michelson–Morley experiment was performed by Albert Michelson and Edward Morley at what is now Case Western Reserve University. It is generally considered the strongest evidence against the theory of a luminiferous aether, and it has also been called “the kicking-off point for the theoretical aspects of the Second Scientific Revolution.” Primarily for this work, Albert Michelson was awarded the Nobel Prize in 1907.

Wave–particle duality is a theory that proposes that all matter exhibits the properties of not only particles, which have mass, but also waves, which transfer energy. A central concept of quantum mechanics, this duality addresses the inability of classical concepts like “particle” and “wave” to fully describe the behavior of quantum-scale objects. Standard interpretations of quantum mechanics explain this paradox as a fundamental property of the universe, while alternative interpretations explain the duality as an emergent, second-order consequence of various limitations of the observer. This treatment focuses on explaining the behavior from the perspective of the widely used Copenhagen interpretation, in which wave–particle duality serves as one aspect of the concept of complementarity, that one can view phenomena in one way or in another, but not both simultaneously. Through the work of Max Planck, Albert Einstein, Louis de Broglie, Arthur Compton, Niels Bohr, and many others, current scientific theory holds that all particles also have a wave nature (and vice versa).
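The quantitative content of that wave nature is de Broglie’s relation, added here for concreteness (the numerical example is illustrative, not drawn from the text above): a particle of momentum p has an associated wavelength

\[
\lambda \;=\; \frac{h}{p}.
\]

For instance, an electron accelerated through about 100 V has a wavelength of roughly 0.12 nm, comparable to atomic spacings, which is why electron diffraction by crystals is observable.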

Beginning in 1670 and progressing over three decades, Isaac Newton argued that the perfectly straight lines of reflection demonstrated light’s particle nature, but Newton’s contemporaries Robert Hooke and Christiaan Huygens, and later Augustin-Jean Fresnel, mathematically refined the wave viewpoint, showing that if light traveled at different speeds in different media, refraction could be easily explained. The resulting Huygens–Fresnel principle was supported by Thomas Young‘s discovery of double-slit interference, the beginning of the end for the particle-light camp. The final blow against corpuscular theory came when James Clerk Maxwell discovered that he could combine four simple equations, along with a slight modification, to describe self-propagating waves of oscillating electric and magnetic fields. When the propagation speed of these electromagnetic waves was calculated, the speed of light fell out. While the 19th century had seen the success of the wave theory at describing light, it had also witnessed the rise of the atomic theory at describing matter.
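The refraction argument can be stated in one line. In the wave picture, the change of speed at an interface forces the wavefronts to bend, giving Snell’s law (a standard result, included here only for clarity):

\[
\frac{\sin\theta_{1}}{\sin\theta_{2}} \;=\; \frac{v_{1}}{v_{2}} \;=\; \frac{n_{2}}{n_{1}},
\]

where θ₁ and θ₂ are the angles of incidence and refraction, v₁ and v₂ the wave speeds in the two media, and n₁ and n₂ their refractive indices.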

Matter and Light

In 1789, Antoine Lavoisier put chemistry on a firm footing by introducing rigor and precision into laboratory technique. By recognizing that the common gases are diatomic, Avogadro completed the basic atomic theory, allowing the correct molecular formulae of most known compounds, as well as the correct weights of atoms, to be deduced and categorized in a consistent manner. The final stroke in classical atomic theory came when Dmitri Mendeleev saw an order in recurring chemical properties and created a table presenting the elements in unprecedented order and symmetry. Chemistry was now an atomic science.

Black-body radiation, the emission of electromagnetic energy due to an object’s heat, could not be explained from classical arguments alone. The equipartition theorem of classical mechanics, the basis of all classical thermodynamic theories, stated that an object’s energy is partitioned equally among the object’s vibrational modes. This worked well when describing thermal objects whose vibrational modes were defined as the speeds of their constituent atoms, and the speed distribution derived from equal partitioning of these vibrational modes closely matched experimental results. Speeds much higher than the average speed were suppressed by the fact that kinetic energy is quadratic: doubling the speed requires four times the energy, so the number of atoms occupying high-energy modes (high speeds) drops off quickly. Since light was known to be waves of electromagnetism, physicists hoped to describe this emission via classical laws; this became known as the black-body problem. The classical result is the Rayleigh–Jeans law, which, while correctly predicting the intensity of long-wavelength emission, predicts infinite total energy, because the predicted intensity diverges at short wavelengths.
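For reference, the classical Rayleigh–Jeans prediction for the spectral radiance of a black body at temperature T, written in wavelength form, is

\[
B_{\lambda}(T) \;=\; \frac{2\,c\,k_{B}\,T}{\lambda^{4}},
\]

which grows without bound as λ → 0, so the energy integrated over all wavelengths is infinite; this divergence is the “ultraviolet catastrophe” that the next paragraph resolves.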

The solution arrived in 1900 when Max Planck hypothesized that the frequency of light emitted by the black body depended on the frequency of the oscillator that emitted it, and that the energy of these oscillators increased linearly with frequency (according to his constant h, where E = hν). By demanding that high-frequency light must be emitted by an oscillator of equal frequency, and further requiring that this oscillator occupy higher energy than one of a lesser frequency, Planck avoided any catastrophe: giving an equal partition to high-frequency oscillators produced successively fewer oscillators and less emitted light. And, as in the Maxwell–Boltzmann distribution, the low-frequency, low-energy oscillators were suppressed by the onslaught of thermal jiggling from higher-energy oscillators, which necessarily increased their energy and frequency. Planck had intentionally created an atomic theory of the black body, but had unintentionally generated an atomic theory of light, in which the black body never generates quanta of light at a given frequency with energy less than hν.
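Written out, Planck’s quantization E = nhν leads, with Boltzmann weighting of the allowed energies, to the spectral radiance

\[
B_{\lambda}(T) \;=\; \frac{2 h c^{2}}{\lambda^{5}}\,\frac{1}{e^{\,hc/\lambda k_{B}T}-1},
\]

which agrees with the Rayleigh–Jeans form at long wavelengths but is exponentially suppressed at short wavelengths, removing the divergence.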

In 1905 Albert Einstein took Planck’s black-body model and saw in it a wonderful solution to another outstanding problem of the day: the photoelectric effect, the phenomenon whereby electrons are emitted from a material when it absorbs energy from light. Only by increasing the frequency of the light, and thus increasing the energy of the photons, can one eject electrons with higher energy. Thus, using Planck’s constant h to determine the energy of the photons based upon their frequency, the energy of ejected electrons should also increase linearly with frequency, the gradient of the line being Planck’s constant. These results were not confirmed until 1915, when Robert Andrews Millikan produced experimental results in perfect accord with Einstein’s predictions. While the energy of ejected electrons reflected Planck’s constant, the existence of photons was not explicitly proven until the discovery of the photon antibunching effect. When Einstein received his Nobel Prize in 1921, it was for the photoelectric effect, the suggestion of quantized light. Einstein’s “light quanta” represented the quintessential example of wave–particle duality: electromagnetic radiation propagates following linear wave equations, but can only be emitted or absorbed as discrete elements, thus acting as a wave and a particle simultaneously.
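The linear relationship described above is Einstein’s photoelectric equation (φ, the work function of the material, is the standard symbol and is not defined in the text above):

\[
E_{\max} \;=\; h\nu \;-\; \phi,
\]

so a plot of the maximum kinetic energy of the ejected electrons against the light’s frequency is a straight line whose slope is Planck’s constant h, which is the relationship Millikan’s measurements confirmed.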

Radioactivity Changes the Scientific Landscape

The turn of the century also featured radioactivity, which later came to the forefront with the discovery of the chain reaction, the Manhattan Project of World War II, and, later, Hiroshima and Nagasaki.

Marie Curie


Marie Skłodowska-Curie was a Polish and naturalized-French physicist and chemist who conducted pioneering research on radioactivity. She was the first woman to win a Nobel Prize, the only woman to win in two fields, and the only person to win in multiple sciences. She was also the first woman to become a professor at the University of Paris, and in 1995 she became the first woman to be entombed on her own merits in the Panthéon in Paris. She shared the 1903 Nobel Prize in Physics with her husband Pierre Curie and with physicist Henri Becquerel, and she won the 1911 Nobel Prize in Chemistry. Her achievements included a theory of radioactivity (a term that she coined), techniques for isolating radioactive isotopes, and the discovery of polonium and radium. She named the first chemical element that she discovered, polonium, which she first isolated in 1898, after her native country. Under her direction, the world’s first studies were conducted into the treatment of neoplasms using radioactive isotopes. She founded the Curie Institutes in Paris and in Warsaw, which remain major centres of medical research today. During World War I, she established the first military field radiological centres. Curie died in 1934 of aplastic anemia brought on by exposure to radiation, mainly, it seems, during her World War I service in the mobile X-ray units she created.

 

Read Full Post »

Forensic Science Research Opportunity

Reporter: Aviva Lev-Ari, PhD, RN

From: Pierson, Steve [mailto:spierson@amstat.org]
Sent: Wednesday, March 12, 2014 9:16 AM
To: stat-acad-reps
Subject: Forensic Science opportunity

Dear Department Chairs/Representatives,

I write to request names of faculty who might be willing to serve on a standing review panel for the National Institute of Justice. The ASA forensic science committee cannot meet current demand for statisticians to be involved in forensic science and so suggested this email as an opportunity for younger faculty to learn about this funding agency and to be exposed to forensic science research challenges.

One of the standing review committees is an established one in Impression & Pattern Evidence (fingerprint, firearms, blood pattern, etc.) and the other is a new one in Trace Evidence (hair, fibers, paint, particles, etc.). They ask reviewers for a 3-year commitment and hold an in-person review panel meeting annually in June in the DC area. Some of the ASA forensic science members have served on the established panel and describe it as very interesting and less work than an NSF panel.

Please send any suggestions to me by Monday morning (and preferably sooner).

Here’s additional information on the standing review panels that we’ve learned:
-Establishing scientifically valid foundations for assigning values to forensic comparisons is one of the top concerns in forensic science today. We certainly feel that our funding decisions benefit when projects get adequate scrutiny of their statistical methods on the front end, and we hope to ensure that by including the right expertise on our review panels.

– As to the question that your member had about subject matter expertise:  we’re not necessarily just looking for people with significant experience with a forensic science field.  If you’re able to refer qualified statisticians with that background, that would be great—but as you’ve said that list is rather short.  Primarily, we’re looking for statisticians who would be able to act as second reviewers of projects that rely to a large extent on statistical methods or that would only be successful if grounded in a statistically valid methodology.  For some particularly stats-heavy proposals, they may serve as lead reviewer, and would be appropriately paired with a technical expert. They would not be expected to weigh in on matters beyond the scope of their expertise (e.g., whether a specific FS technique is appropriate or practical).  Our panels will have 12-18 members, so there is plenty of room for a diversity of experience and background.

-Recusals from the panel are handled year by year. If they plan to apply this year to the solicitation “Research and Development in Forensic Science for Criminal Justice Purposes” (link below), it would be best if they didn’t offer to serve, as they would immediately be recused for COI. But once on the panel, there is nothing to prevent them from applying to the solicitation in a subsequent cycle, as long as they let us know that they will need to be recused. (FYI, here is the current solicitation: https://www.ncjrs.gov/pdffiles1/nij/sl001082.pdf.)

I should also give you a heads-up that we’ll be encouraging statisticians to apply for positions on the various committees of the newly established NIST forensic science oversight body called Organization of Scientific Area Committees (OSAC), to replace what has been known as the Scientific Working Groups. The OSAC and the ways to serve are described at http://www.nist.gov/forensics/upload/osac-021814.pdf. We expect the application process to open up within a month. If you open up those slides, you’ll see that “statistician” is mentioned 7 times.

Best Wishes,
Steve

Steve Pierson, Ph.D.
Director of Science Policy

American Statistical Association
Promoting the Practice and Profession of Statistics™
732 North Washington Street
Alexandria, VA 22314-1943
(703) 302-1841
http://www.amstat.org/policy
For ASA science policy updates, follow us on Twitter: @ASA_SciPol

http://www.amstat.org/asa175/index.cfm 

SOURCE

From: Tom Lane <Tom.Lane@mathworks.com>
Date: Thu, 13 Mar 2014 21:49:05 +0000
To: “tlane@alum.mit.edu” <tlane@alum.mit.edu>
Conversation: [BCASA] Forensic Science opportunity
Subject: [BCASA] Forensic Science opportunity

Read Full Post »

Human Longevity Inc (HLI) – $70M in Financing of Venter’s New Integrative Omics and Clinical Bioinformatics

Reporter: Aviva Lev-Ari, PhD, RN

Article ID #121: Human Longevity Inc (HLI) – $70M in Financing of Venter’s New Integrative Omics and Clinical Bioinformatics. Published on 3/5/14

WordCloud Image Produced by Adam Tubman

Venter’s New Integrative Omics and Clinical Data Analysis Firm Lands $70M in Financing

March 04, 2014

NEW YORK (GenomeWeb News) – J. Craig Venter today unveiled a new company called Human Longevity Inc. that will combine human genome, microbiome, and metabolome data coupled with clinical information to fuel development of new diagnostics, therapeutics, and stem cell treatments for diseases related to aging.

In a media briefing today, Venter said the company will “change the way medicine is practiced,” and will spearhead “a shift to a more preventive, genomic-based medicine model” that can lead to longer, healthier lives and lower healthcare costs.

Using $70 million in Series A financing, HLI initially plans to conduct genome, microbiome, and tumor sequencing on patients from the University of California, San Diego Moores Cancer Center and use their clinical phenotype and metabolomics data to create a massive database, Venter explained in a media briefing. HLI said the financing came from a small group of private investors. Though it didn’t disclose the names of those investors, The New York Times reported today that Illumina was among the backers.

Venter said the initial financing should keep the company going for about 18 months. HLI is building a long-term facility in San Diego that will be completed in about a year, Venter said, and it is currently in temporary facilities.

The firm plans to license data and knowledge to pharmaceutical and biotechnology firms and universities for their own research programs, while developing new therapeutics and diagnostics and providing sequencing services.

The company has already bought two Illumina HiSeq X Ten Sequencing Systems, and has inked an option to buy three more. It plans to sequence up to 40,000 human genomes per year initially and ramp up to 100,000 per year. HLI said it will conduct the first clinical project to include germ line, human genome, and tumor genome sequencing, along with a range of other types of information from each patient.

As part of its efforts, HLI has struck an agreement with Metabolon, under which the NC-based firm will provide biochemical profiling of the genomic samples that HLI collects.

Venter is co-founder, executive chairman and CEO of HLI, which also has agreed to a research services collaboration with the J. Craig Venter Institute, of which he is founder and CEO. That alliance will cover proteomics, infectious disease diagnostics, and the human microbiome.

The company said that it will tackle cancer first. Every patient at the UCSD Moores Cancer Center will have the opportunity to have their genome, microbiome, and tumors sequenced and analyzed as part of their treatment, said Venter. Other diseases of interest include diabetes, obesity, heart and liver diseases, and dementia.

Venter noted that 13 years ago it cost around $100 million and took nine months to sequence his genome, but now that cost has dropped to around $1,000 per genome.

“We are scaling up to do tens of thousands of genomes in the same time frame that it took to do one,” he said.

Through its agreement with HLI, Metabolon will characterize 2,400 chemicals in the bloodstream of 10,000 of the initial patients.

Venter said HLI plans to try to layer “the chemical data with the microbiome data, the human genome data, and most importantly the human phenotype data. We will be importing clinical records of every individual we are sequencing, so this will be one of the largest data studies in the history of science and medicine.”

“Hopefully,” Venter said, within 10 years HLI will “have data from half a million to a million human genomes, and the phenotype data, clinical data, and outcome data associated with that.”

“I view this as just the beginning, a starting point of this new field that some of us have been waiting for for a very long time, following on the first human genome 13 years ago,” he said.

Among Venter’s ventures is Synthetic Genomics, a genomics and synthetic biology firm of which he is a co-founder, chairman, CEO, and co-CSO. Though HLI didn’t say specifically that it would collaborate with Synthetic Genomics, according to a FAQ sheet on its website, it plans to use “synthetic biology advances to repair and repopulate a patient’s depleted and degraded stem cell population, returning those cells to a more healthy and youthful state.”

In addition to Venter, HLI’s two other co-founders are Peter Diamandis, chairman and CEO of the X Prize Foundation and co-founder and executive chairman of Singularity University, and stem cell biology researcher and entrepreneur Robert Hariri, who also will serve as company vice chairman.

J Craig Venter wants to digitize DNA and transmit the signal to teleport organisms

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2013/11/01/j-craig-venter-wants-to-digitize-dna-and-transmit-the-signal-to-teleport-organisms/

Life Sciences Circle Event: Next omics – Personalized Medicine beyond Genomics, December 11, 2013 5:30-8:30PM, The Broad Institute, Cambridge

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2013/11/18/life-sciences-circle-event-next-omics-personalized-medicine-beyond-genomics-december-11-2013-530-830pm-the-broad-institute-cambridge/

2013 Genomics: The Era Beyond the Sequencing of the Human Genome: Francis Collins, Craig Venter, Eric Lander, et al.

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2013/02/11/2013-genomics-the-era-beyond-the-sequencing-human-genome-francis-collins-craig-venter-eric-lander-et-al/

Synthetic Biology: On Advanced Genome Interpretation for Gene Variants and Pathways: What is the Genetic Base of Atherosclerosis and Loss of Arterial Elasticity with Aging

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2013/05/17/synthetic-biology-on-advanced-genome-interpretation-for-gene-variants-and-pathways-what-is-the-genetic-base-of-atherosclerosis-and-loss-of-arterial-elasticity-with-aging/

Scientific Innovation: as Influenced by Academia, Publishing Requirements and the Academic Publishing Industry

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2014/03/05/scientific-innovation-as-influenced-by-academia-publishing-requirements-and-the-academic-publishing-industry/

Fourth Annual QPrize Competition to Fund the World’s Next Groundbreaking Startups by Qualcomm Ventures

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2014/02/09/fourth-annual-qprize-competition-to-fund-the-worlds-next-groundbreaking-startups-by-qualcomm-ventures/

Cancer Genomics – Leading the Way by Cancer Genomics Program at UC Santa Cruz

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2012/10/29/cancer-genomics-leading-the-way-by-cancer-genomics-program-at-uc-santa-cruz/

Research Paradigm Shift in Human Genomics – Predictive Biomarkers and Personalized Medicine

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2013/01/13/paradigm-shift-in-human-genomics-predictive-biomarkers-and-personalized-medicine-part-1/

LEADERS in the Competitive Space of Genome Sequencing of Genetic Mutations for Therapeutic Drug Selection in Cancer Personalized Treatment

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2013/01/13/leaders-in-genome-sequencing-of-genetic-mutations-for-therapeutic-drug-selection-in-cancer-personalized-treatment-part-2/

Personalized Medicine: An Institute Profile – Coriell Institute for Medical Research

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2013/01/13/personalized-medicine-an-institute-profile-coriell-institute-for-medical-research-part-3/

The Consumer Market for Personal DNA Sequencing

Aviva Lev-Ari, PhD, RN

http://pharmaceuticalintelligence.com/2013/01/13/consumer-market-for-personal-dna-sequencing-part-4/

Read Full Post »

Introduction to Genomics and Epigenomics Roles in Cardiovascular Diseases


Author and Curator: Larry H Bernstein, MD, FCAP

This introduction opens a thorough evaluation of a rich body of research literature on genomic influences that may have variable strength in the biological causation of atherosclerosis, microvascular disease, and plaque formation, influences that are not necessarily expressed except in a multivariable context that includes the environment, dietary factors, level of emotional stress, sleep habits, and the daily activities of living of affected individuals. The potential of genomics is carried in the DNA and copied to RNA, and it is most thoroughly studied in the microRNAs (miRNAs). Circulating miRNAs have been explored as markers of myocyte or endothelial cell injury, and they are also being used as targets for therapeutics through the creation of silencing RNAs (siRNAs). The evidence of success in these studies is still limited, but it is being translated from animal studies to human disease. There is also a long history of measuring circulating enzymes and isoenzymes (alanine aminotransferase, creatine kinase, and lactate dehydrogenase, not to leave out the adenylate kinase species specific to myocardium), and more recently the release of troponins I and T, the still not fully explored ischemia-modified albumin, and miRNAs for the diagnosis of myocardial infarction.

There is also significant disagreement about the value of measuring high-sensitivity C-reactive protein (hs-CRP), which has long been a marker for systemic inflammation in both chronic rheumatic and infectious diseases and has a broad range, so that procalcitonin has appeared better suited for that situation and for the early diagnosis of sepsis. The hs-CRP has been too easily ignored because of

1. the ubiquitous elevations in the population, and
2. the expressed concern that one might not be inclined to treat a mild elevation without other risk factors, such as elevated LDL cholesterol or low HDL, and in the absence of diabetes or obesity.

Nevertheless, hs-CRP raises a reasonable argument for preventive measures, and perhaps the use of a statin.

There has been a substantial amount of work on the relationship of obesity to both type 2 diabetes mellitus (T2DM) and to coronary vascular disease and stroke. Here we bring in the relationship of the vascular endothelium, adipose tissue secretion of adiponectin, and platelet activation. A whole generation of antiplatelet drugs addresses the mechanism of platelet activation, adhesion, and interaction with the endothelium. Very interesting work has appeared on RESISTIN, which could bear fruit in the treatment of both obesity and T2DM.

It is important to keep in mind that epigenomic gene rearrangements or substitutions occur throughout life, and they may be expressed late in life. Some of the known epigenetic events occur with some frequency, but the associations, and their strength, are extremely difficult to pin down. In a population that is not diverse, epigenetic changes are passed on within the population during the childbearing years, whereas in a diverse population the establishment of an epigenetic change is diluted. There have been a number of studies, with differing findings, of associations between cardiovascular disease and genetic mutations in the Han and in the Uyghur Chinese populations, which are distinctly different populations; that comparison is not part of this discussion.

This should be sufficient to elicit broad interest in this volume on cardiovascular diseases, and perhaps the entire series. Below is an outline of this volume in the series.

PART 1 – Genomics and Medicine
Introduction to Genomics and Medicine (Vol 3)
Genomics and Medicine: The Physician’s View
Ribozymes and RNA Machines
Genomics and Medicine: Genomics to CVD Diagnoses
Establishing a Patient-Centric View of Genomic Data
VIDEO: Implementing Biomarker Programs – P Ridker

PART 2 – Epigenetics – Modifiable Factors Causing CVD
Diseases Etiology
   Environmental Contributors Implicated as Causing CVD
   Diet: Solids and Fluid Intake and Nutraceuticals
   Physical Activity and Prevention of CVD
   Psychological Stress and Mental Health: Risk for CVD
   Correlation between Cancer and CVD

PART 3 – Determinants of CVD – Genetics, Heredity and Genomics Discoveries
Introduction
   Why cancer cells contain abnormal numbers of chromosomes (Aneuploidy)
   Functional Characterization of CV Genomics: Disease Case Studies @ 2013 ASHG
   Leading DIAGNOSES of CVD covered in Circulation: CV Genetics, 3/2010 – 3/2013
   Commentary on Biomarkers for Genetics and Genomics of CVD

PART 4 – Individualized Medicine Guided by Genetics and Genomics Discoveries
Preventive Medicine: Cardiovascular Diseases
   Walking and Running: Similar Risk Reductions for Hypertension, Hypercholesterolemia, DM, and possibly CAD
   http://pharmaceuticalintelligence.com/2013/04/04/walking-and-running-similar-risk-reductions-for-hypertension-hypercholesterolemia-dm-and-possibly-cad/
   Prevention of Type 2 Diabetes: Is Bariatric Surgery the Solution?
   http://pharmaceuticalintelligence.com/2012/08/23/prevention-of-type-2-diabetes-is-bariatric-surgery-the-solution/
Gene-Therapy for CVD
Congenital Heart Disease/Defects
Medical Etiologies: EBM – LEADING DIAGNOSES, Risks
Pharmacogenomics for Cardiovascular Diseases
   Signaling Pathways – Response to Rosuvastatin in Patients With Acute Myocardial Infarction: Hepatic Metabolism and Transporter Gene Variants Effect
   http://pharmaceuticalintelligence.com/2014/01/02/response-to-rosuvastatin-in-patients-with-acute-myocardial-infarction-hepatic-metabolism-and-transporter-gene-variants-effect/
   Proteomics and Metabolomics – Voltage-Gated Calcium Channel and Pharmacogenetic Association with Adverse Cardiovascular Outcomes: Hypertension Treatment with Verapamil SR (CCB) vs Atenolol (BB) or Trandolapril (ACE)
   http://pharmaceuticalintelligence.com/2014/01/02/voltage-gated-calcium-channel-and-pharmacogenetic-association-with-adverse-cardiovascular-outcomes-hypertension-treatment-with-verapamil-sr-ccb-vs-atenolol-bb-or-trandolapril-ace/
   SNPs in apoE are found to influence statin response significantly. Less frequent variants in PCSK9 and smaller effect sizes in SNPs in HMGCR
   http://pharmaceuticalintelligence.com/2014/01/02/snps-in-apoe-are-found-to-influence-statin-response-significantly-less-frequent-variants-in-pcsk9-and-smaller-effect-sizes-in-snps-in-hmgcr/

Read Full Post »
