Hematological Malignancy Diagnostics
Author and Curator: Larry H. Bernstein, MD, FCAP
Computer-aided diagnostics
Decision-making in the clinical setting
Didner, R Mar 1999 Amer Clin Lab
Mr. Didner is an Independent Consultant in Systems Analysis, Information Architecture (Informatics), Operations Research, and Human Factors Engineering (Cognitive Psychology), Decision Information Designs, 29 Skyline Dr., Morristown, NJ 07960, U.S.A.; tel.: 973-455-0489; fax/e-mail: email@example.com
A common problem in the medical profession is the level of effort dedicated to administration and paperwork necessitated by various agencies, which contributes to the high cost of medical care. Costs would be reduced and accuracy improved if clinical data could be captured directly at the point they are generated, in a form suitable for transmission to insurers or machine-transformable into other formats. Such a capability could also be used to improve the form and structure of information presented to physicians, and to support a more comprehensive database linking clinical protocols to outcomes, with the prospect of improving clinical outcomes. Although the problem centers on the physician's process of determining the diagnosis and treatment of patients and the timely and accurate recording of that process in the medical system, it substantially involves the pathologist and laboratorian, who interact significantly throughout the information-gathering process. Each of the currently predominant ways of collecting information from diagnostic protocols has drawbacks. Using blank paper to collect free-form notes from the physician is not amenable to computerization; such free-form data are also poorly formulated, formatted, and organized for the clinical decision-making they support. The alternative of preprinted forms listing the possible tests, results, and other information gathered during the diagnostic process facilitates the desired computerization, but the fixed sequence of tests and questions they present impedes the physician from using an optimal decision-making sequence. This follows because:
- People tend to make decisions and consider information in a step-by-step manner in which intermediate decisions are intermixed with data acquisition steps.
- The sequence in which components of decisions are made may alter the decision outcome.
- People tend to consider information in the sequence it is requested or displayed.
- Since there is a separate optimum sequence of tests and questions for each cluster of history and presenting symptoms, there is no one sequence of tests and questions that can be optimal for all presenting clusters.
- As additional data and test results are acquired, the optimal sequence of further testing and data acquisition changes, depending on the already acquired information.
Therefore, promoting an arbitrary sequence of information requests with preprinted forms may detract from outcomes by contributing to a non-optimal decision-making sequence. Unlike the decisions resulting from theoretical or normative processes, decisions made by humans are path dependent; that is, the outcome of a decision process may be different if the same components are considered in a different sequence.
This paper proposes a general approach to gathering data at their source in computer-based form so as to improve the expected outcomes. Such a means must be interactive and dynamic, so that at any point in the clinical process the patient's presenting symptoms, history, and the data already collected are used to determine the next data or tests requested. That determination must derive from a decision-making strategy designed to produce outcomes with the greatest value, supported by appropriate data collection and display techniques. The strategy must be based on knowledge of the possible outcomes at any given stage of testing and information gathering, coupled with a metric, or hierarchy of values, for assessing the relative desirability of the possible outcomes.
A value hierarchy
The numbered list below illustrates a value hierarchy. In any particular instance, the higher-numbered values should only be considered once the lower-numbered values have been satisfied. Thus, a diagnostic sequence that is very time or cost efficient should only be considered if it does not increase the likelihood (relative to some other diagnostic sequence) that a life-threatening disorder may be missed, or that one of the diagnostic procedures may cause discomfort.
1. Minimize the likelihood that a treatable, life-threatening disorder is not treated.
2. Minimize the likelihood that a treatable, discomfort-causing disorder is not treated.
3. Minimize the likelihood that a risky procedure (treatment or diagnostic procedure) is inappropriately administered.
4. Minimize the likelihood that a discomfort-causing procedure is inappropriately administered.
5. Minimize the likelihood that a costly procedure is inappropriately administered.
6. Minimize the time of diagnosing and treating the patient.
7. Minimize the cost of diagnosing and treating the patient.
The above hierarchy is relative, not absolute; for many patients, a little bit of testing discomfort may be worth a lot of time. Some factors and gradations are also intentionally left out for expository simplicity (e.g., acute versus chronic disorders). This value hierarchy is based on a hypothetical patient. Clearly, the hierarchy of a health insurance carrier might be different, as might that of another patient (e.g., a geriatric patient). If the approach outlined herein were to be followed, a value hierarchy agreed to by a majority of stakeholders should be adopted.
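The "satisfy lower-numbered values first" rule is a lexicographic ordering, which maps naturally onto software. The sketch below (plain Python; the candidate sequences and all probabilities, times, and costs are invented purely for illustration) compares candidate diagnostic sequences level by level:

```python
# Each candidate diagnostic sequence is scored as a tuple ordered by
# the value hierarchy: (P(missed life-threatening disorder),
# P(missed discomfort-causing disorder), P(risky procedure misused),
# ..., time in weeks, cost in dollars). Python compares tuples
# lexicographically, which matches the rule that a higher-numbered
# value matters only when all lower-numbered values are tied.

# Hypothetical scores for three candidate sequences (invented numbers):
candidates = {
    "seq_fast_cheap": (0.02, 0.05, 0.01, 0.01, 0.01, 1.0, 100.0),
    "seq_thorough":   (0.001, 0.05, 0.01, 0.02, 0.05, 4.0, 900.0),
    "seq_balanced":   (0.001, 0.05, 0.01, 0.02, 0.05, 3.0, 700.0),
}

best = min(candidates, key=candidates.get)
print(best)  # seq_balanced: ties seq_thorough on safety, wins on time
```

Note that the cheap, fast sequence loses despite its last two components because it does worse on the first (most important) value, exactly as the hierarchy prescribes.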
Once the higher values are satisfied, the time and cost of diagnosis and treatment should be minimized. One way to do so would be to optimize the sequence in which tests are performed, so as to minimize the number, cost, and time of the tests that need to be performed to reach a definitive decision regarding treatment. Such an optimum sequence could be constructed using Claude Shannon's information theory.
According to this theory, the best next question to ask in any given situation (assuming the question has two possible outcomes) is the question that divides the possible outcomes into two equally likely sets. In the real world, not all tests or questions are equally valuable, costly, or time consuming; therefore, value (risk factors), cost, and time should be used as weighting factors to optimize the test sequence, but this is a complicating detail at this point.
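A rough sketch of this idea (ignoring the value, cost, and time weights): the best next binary test is the one whose positive-outcome set carries total prior probability closest to one-half, i.e., whose yes/no answer carries the most information. The diagnoses, priors, and tests below are invented:

```python
from math import log2

def entropy(probs):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def best_next_test(priors, tests):
    """Pick the binary test that most nearly splits the remaining
    diagnostic possibilities into two equally likely sets.

    priors: dict diagnosis -> prior probability (sums to 1)
    tests:  dict test name -> set of diagnoses a positive result implies
    """
    best, best_gain = None, -1.0
    for name, positive_set in tests.items():
        p_pos = sum(p for d, p in priors.items() if d in positive_set)
        if p_pos in (0.0, 1.0):
            continue  # test cannot discriminate among these diagnoses
        # For a noiseless partition test, the expected information
        # gained equals the entropy of the yes/no answer itself,
        # which peaks at 1 bit when p_pos = 0.5.
        gain = entropy([p_pos, 1.0 - p_pos])
        if gain > best_gain:
            best, best_gain = name, gain
    return best, best_gain

# Hypothetical worked example (invented diagnoses and numbers):
priors = {"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}
tests = {
    "test1": {"A"},       # splits 0.4 / 0.6
    "test2": {"A", "B"},  # splits 0.7 / 0.3
    "test3": {"A", "D"},  # splits 0.5 / 0.5  <- ideal
}
name, gain = best_next_test(priors, tests)
print(name)  # test3 yields the full 1 bit
```

The weighting factors mentioned above could be folded in by dividing each test's information gain by its (value-adjusted) cost before taking the maximum.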
A value scale
For dynamic computation of outcome values, the hierarchy could be converted into a weighted value scale so differing outcomes at more than one level of the hierarchy could be readily compared. An example of such a weighted value scale is Quality Adjusted Life Years (QALY).
Although QALY does not incorporate all of the factors in this example, it is a good conceptual starting place.
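A minimal sketch of how a QALY-style scale collapses time and health state into a single comparable number; the utility weights and durations below are invented for illustration:

```python
def qalys(intervals):
    """QALYs = sum of (years lived) x (utility weight in [0, 1]),
    where 1.0 is full health and 0.0 is death."""
    return sum(years * utility for years, utility in intervals)

# Hypothetical comparison of two treatment outcomes (invented numbers):
surgery = qalys([(2.0, 0.6), (8.0, 0.9)])  # rough recovery, then well
medical = qalys([(10.0, 0.7)])             # steady but lower utility
print(surgery, medical)  # surgery ~8.4 QALYs vs. medical 7.0
```

A fuller scale along the lines proposed here would add further weights (risk, discomfort, cost) on top of the basic duration-times-utility product.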
The display, request, decision-making relationship
For each clinical determination, the pertinent information should be gathered, organized, formatted, and formulated in a way that facilitates the accuracy, reliability, and efficiency with which that determination is made. A physician treating a patient with high cholesterol and blood pressure (BP), for example, may need to know whether or not the patient's cholesterol and BP respond to weight changes in order to determine an appropriate treatment (e.g., weight control versus medication). This requires searching records from several sources for BP, certain blood chemistries (e.g., HDL, LDL, and triglycerides), and weight, then attempting to track them against each other over time. Manually reorganizing this clinical information each time it is used is extremely inefficient. More important, the current organization and formatting defy principles of human factors for optimally displaying information to enhance human information-processing characteristics, particularly for decision support.
While a discussion of human factors and cognitive psychology principles is beyond the scope of this paper, following are a few of the system design principles of concern:
- Minimize the load on short-term memory.
- Provide information pertinent to a given decision or component of a decision in a compact, contiguous space.
- Take advantage of basic human perceptual and pattern recognition facilities.
- Design the form of an information display to complement the decision-making task it supports.
Figure 1 shows fictitious, quasi-random data from a hypothetical patient with moderately elevated cholesterol. This one-page display pulls together all the pertinent data from six years of blood tests and related clinical measurements. At a glance, the physician's innate pattern recognition, color, and shape perception facilities recognize the patient's steadily increasing weight, cholesterol, BP, and triglycerides as well as the declining high-density lipoproteins. It would have taken considerably more time and effort to grasp this information from the raw data collection and blood test reports as they are currently presented in independent, tabular time slices.
Design the formulation of an information display to complement the decision-making task.
The physician may wish to know only the relationship between weight and cardiac risk factors rather than whether these measures are increasing or decreasing, or are within acceptable or marginal ranges. If so, Table 1 shows the correlations between weight and the other factors in a much more direct and simple way, using the same data as in Figure 1. One can readily see the same conclusions about relations that were drawn from Figure 1. This type of abstract, symbolic display of derived information also makes it easier to spot relationships when the individual variables are bouncing up and down, unlike the more or less steady rise of most values in Figure 1. This increase in the precision of relationship information is gained at the expense of other types of information (e.g., trends). To display information in an optimum form, then, the system designer must know what the information demands of the task are at the point in the task when the display is to be used.
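The derived, symbolic information behind a display like Table 1 is simple to compute. A minimal sketch (plain Python; the six-year series below are invented for a hypothetical patient) of the weight-versus-risk-factor correlations:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented annual measurements for one hypothetical patient:
weight      = [78, 80, 83, 85, 88, 91]        # kg
cholesterol = [195, 201, 210, 214, 225, 233]  # mg/dL
hdl         = [52, 50, 49, 47, 45, 44]        # mg/dL

print(round(pearson(weight, cholesterol), 2))  # strongly positive
print(round(pearson(weight, hdl), 2))          # strongly negative
```

A correlation near +1 or -1 conveys the weight relationship directly, even when the underlying series bounce up and down, which is exactly the trade of trend information for relationship precision described above.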
Present the sequence of information display clusters to complement an optimum decision-making strategy.
Just as a fixed sequence of gathering clinical, diagnostic information may lead to a far-from-optimum outcome, there exists an optimum sequence of testing, considering information, and gathering data that will lead to an optimum outcome (as defined by the value hierarchy) with a minimum of time and expense. The task of the information system designer, then, is to provide or request the right information, in the best form, at each stage of the procedure. For example, Figure 1 is suitable for the diagnostic phase, since it shows the current state of the risk factors and their trends. Table 1, on the other hand, might be more appropriate in determining treatment, where there may be a choice between first trying a strict dietary treatment or going straight to a combination of diet plus medication. The fact that Figure 1 and Table 1 contain somewhat redundant information is not a problem, since they are intended to optimally provide information for different decision-making tasks. The critical need, at this point, is for a model of how to determine what tests to order, what information to request and display, and in what form, at each step of the decision-making process. Commitment to a collaborative relationship between physicians, laboratorians, and other information providers would be an essential requirement for such an undertaking. The ideal diagnostic data-collection instrument is a flexible, computer-based device, such as a notebook computer or a Personal Digital Assistant (PDA)-sized device.
Barriers to interactive, computer-driven data collection at the source
As with any major change, it may be difficult to induce many physicians to change their behavior by interacting directly with a computer instead of with paper and pen. Unlike office workers, who have had to make this transition over the past three decades, most physicians’ livelihoods will not depend on converting to computer interaction. Therefore, the transition must be made attractive and the changes less onerous. Some suggestions follow:
- Make the data collection a natural part of the clinical process.
- Ensure that the user interface is extremely friendly, easy to learn, and easy to use.
- Use a small, portable device.
- Use the same device for collection and display of existing information (e.g., test results and history).
- Minimize the need for free-form written data entry (use check boxes, forms, etc.).
- Allow the entry of notes in pen-based free-form (with the option of automated conversion of numeric data to machine-manipulable form).
- Give the physicians a more direct benefit for collecting data, not just a means of helping a clerk at an HMO second-guess the physician’s judgment.
- Improve administrative efficiency in the office.
- Make the data collection complement the clinical decision-making process.
- Improve information displays, leading to better outcomes.
- Make better use of the physician’s time and mental effort.
The medical profession is facing a crisis of information. Gathering information is costing a typical practice more and more while fees are being restricted by third parties, and the process of gathering this information may be detrimental to current outcomes. Gathered properly, in machine-manipulable form, these data could be reformatted so as to greatly improve their value immediately in the clinical setting by leading to decisions with better outcomes and, in the long run, by contributing to a clinical data warehouse that could greatly improve medical knowledge. The challenge is to create a mechanism for data collection that facilitates, hastens, and improves the outcomes of clinical activity while minimizing the inconvenience and resistance to change on the part of clinical practitioners. This paper is intended to provide a high-level overview of how this may be accomplished, and start a dialogue along these lines.
- Tversky A. Elimination by aspects: a theory of choice. Psych Rev 1972; 79:281–99.
- Didner RS. Back-to-front design: a guns and butter approach. Ergonomics 1982; 25(6):2564–5.
- Shannon CE. A mathematical theory of communication. Bell System Technical J 1948; 27:379–423 (July), 623–56 (Oct).
- Feeny DH, Torrance GW. Incorporating utility-based quality-of-life assessment measures in clinical trials: two examples. Med Care 1989; 27:S190–204.
- Smith S, Mosier J. Guidelines for designing user interface software. ESD-TR-86-278, Aug 1986.
- Miller GA. The magical number seven plus or minus two. Psych Rev 1956; 65(2):81–97.
- Sternberg S. High-speed scanning in human memory. Science 1966; 153: 652–4.
Table 1 Correlation of weight with other cardiac risk factors.
Figure 1 Hypothetical patient data.
Realtime Clinical Expert Support
Regression: A richly textured method for comparison and classification of predictor variables
Converting Hematology Based Data into an Inferential Interpretation
Larry H. Bernstein, Gil David, James Rucinski and Ronald R. Coifman
In Hematology – Science and Practice
Lawrie CH (Ed.), Ch 22, pp 541-552.
InTech Feb 2012, ISBN 978-953-51-0174-1
A model for Thalassemia Screening using Hematology Measurements
A model for automated screening of thalassemia in hematology (math study).
Kneifati-Hayek J, Fleischman W, Bernstein LH, Riccioli A, Bellevue R.
Lab Hematol. 2007; 13(4):119-23. doi:10.1532/LH96.07003.
The results of 398 patient screens were collected. Data from the set were divided into training and validation subsets. The Mentzer ratio was determined through a receiver operating characteristic (ROC) curve on the first subset, and screened for thalassemia using the second subset. HgbA2 levels were used to confirm beta-thalassemia.
RESULTS: We determined the correct decision point of the Mentzer index to be a ratio of 20. Physicians can screen patients using this index before further evaluation for beta-thalassemia (P < .05).
CONCLUSION: The proposed method can be implemented by hospitals and laboratories to flag positive matches for further definitive evaluation, and will enable beta-thalassemia screening of a much larger population at little to no additional cost.
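A minimal sketch of the resulting screening rule, assuming (as in standard use of the Mentzer index, i.e., MCV divided by RBC count) that ratios below the decision point of 20 flag a possible beta-thalassemia for confirmatory HgbA2 testing; the CBC values below are invented:

```python
def mentzer_index(mcv_fl, rbc_millions_per_ul):
    """Mentzer index = MCV (fL) / RBC count (10^6 cells/uL)."""
    return mcv_fl / rbc_millions_per_ul

def flag_for_thalassemia(mcv_fl, rbc, cutoff=20.0):
    """Flag a screen as a positive match for further definitive
    evaluation (HgbA2 confirmation) when the index falls below
    the ROC-derived decision point. The below-cutoff direction is
    an assumption following conventional Mentzer-index usage."""
    return mentzer_index(mcv_fl, rbc) < cutoff

# Hypothetical CBC values (invented):
print(flag_for_thalassemia(62.0, 5.8))  # microcytic, high RBC -> flagged
print(flag_for_thalassemia(88.0, 4.4))  # unremarkable -> not flagged
```

Because the rule uses only two values already present on every CBC, it can run automatically in the laboratory information system, which is what makes the "little to no additional cost" claim plausible.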
Bernstein LH, Rucinski J. Clin Chem Lab Med. 2011 Sep 21; 49(12):2089-95.
David G, Bernstein LH, Coifman RR. Nutrition. 2013 Jan; 29(1):113-21.
Molecular Diagnostics
Genomic Analysis of Hematological Malignancies
Acute lymphoblastic leukemia (ALL) is the most common hematologic malignancy that occurs in children. Although more than 90% of children with ALL now survive to adulthood, those with the rarest and high-risk forms of the disease continue to have poor prognoses. Through the Pediatric Cancer Genome Project (PCGP), investigators in the Hematological Malignancies Program are identifying the genetic aberrations that cause these aggressive forms of leukemias. Here we present two studies on the genetic bases of early T-cell precursor ALL and acute megakaryoblastic leukemia.
- Early T-Cell Precursor ALL Is Characterized by Activating Mutations
- The CBFA2T3-GLIS2 Fusion Gene Defines an Aggressive Subtype of Acute Megakaryoblastic Leukemia in Children
Early T-cell precursor ALL (ETP-ALL), which comprises 15% of all pediatric T-cell leukemias, is an aggressive disease that is typically resistant to contemporary therapies. Children with ETP-ALL have a high rate of relapse and an extremely poor prognosis (i.e., 5-year survival is approximately 20%). The genetic basis of ETP-ALL has remained elusive. Although ETP-ALL is associated with a high burden of DNA copy number aberrations, none are consistently found or suggest a unifying genetic alteration that drives this disease.
Through the efforts of the PCGP, Jinghui Zhang, PhD (Computational Biology), James R. Downing, MD (Pathology), Charles G. Mullighan, MBBS(Hons), MSc, MD (Pathology), and colleagues analyzed the whole-genome sequences of leukemic cells and matched normal DNA from 12 pediatric patients with ETP-ALL. The identified genetic mutations were confirmed in a validation cohort of 52 ETP-ALL specimens and 42 non-ETP T-lineage ALLs (T-ALL).
In the journal Nature, the investigators reported that each ETP-ALL sample carried an average of 1140 sequence mutations and 12 structural variations. Of the structural variations, 51% were breakpoints in genes with well-established roles in hematopoiesis or leukemogenesis (e.g., MLH2, SUZ12, and RUNX1). Eighty-four percent of the structural variations either caused loss of function of the gene in question or resulted in the formation of a fusion gene such as ETV6-INO80D. The ETV6 gene, which encodes a protein that is essential for hematopoiesis, is frequently mutated in leukemia. Among the DNA samples sequenced in this study, ETV6 was altered in 33% of ETP-ALL but only 10% of T-ALL cases.
Next-generation sequencing in hematologic malignancies: what will be the dividends?
The application of high-throughput, massively parallel sequencing technologies to hematologic malignancies over the past several years has provided novel insights into disease initiation, progression, and response to therapy. Here, we describe how these new DNA sequencing technologies have been applied to hematolymphoid malignancies. With further improvements in the sequencing and analysis methods as well as integration of the resulting data with clinical information, we expect these technologies will facilitate more precise and tailored treatment for patients with hematologic neoplasms.
Leveraging cancer genome information in hematologic malignancies.
The use of candidate gene and genome-wide discovery studies in the last several years has led to an expansion of our knowledge of the spectrum of recurrent, somatic disease alleles, which contribute to the pathogenesis of hematologic malignancies. Notably, these studies have also begun to fundamentally change our ability to develop informative prognostic schema that inform outcome and therapeutic response, yielding substantive insights into mechanisms of hematopoietic transformation in different tissue compartments. Although these studies have already had important biologic and translational impact, significant challenges remain in systematically applying these findings to clinical decision making and in implementing new technologies for genetic analysis into clinical practice to inform real-time decision making. Here, we review recent major genetic advances in myeloid and lymphoid malignancies, the impact of these findings on prognostic models, our understanding of disease initiation and evolution, and the implication of genomic discoveries on clinical decision making. Finally, we discuss general concepts in genetic modeling and the current state-of-the-art technology used in genetic investigation.
p53 mutations are associated with resistance to chemotherapy and short survival in hematologic malignancies
E Wattel, C Preudhomme, B Hecquet, M Vanrumbeke, et al.
Blood. 1994 Nov 1; 84(9):3148-3157.
We analyzed the prognostic value of p53 mutations for response to chemotherapy and survival in acute myeloid leukemia (AML), myelodysplastic syndrome (MDS), and chronic lymphocytic leukemia (CLL). Mutations were detected by single-stranded conformation polymorphism (SSCP) analysis of exons 4 to 10 of the P53 gene, and confirmed by direct sequencing. A p53 mutation was found in 16 of 107 (15%) AML, 20 of 182 (11%) MDS, and 9 of 81 (11%) CLL tested. In AML, three of nine (33%) mutated cases and 66 of 81 (81%) nonmutated cases treated with intensive chemotherapy achieved complete remission (CR) (P = .005), and none of five mutated cases and three of six nonmutated cases treated with low-dose Ara-C achieved CR or partial remission (PR) (P = .06). Median actuarial survival was 2.5 months in mutated cases and 15 months in nonmutated cases (P < 10^-5). In the MDS patients who received chemotherapy (intensive chemotherapy or low-dose Ara-C), 1 of 13 (8%) mutated cases and 23 of 38 (60%) nonmutated cases achieved CR or PR (P = .004), and median actuarial survival was 2.5 and 13.5 months, respectively (P < 10^-5). In all MDS cases (treated and untreated), the survival difference between mutated and nonmutated cases was also highly significant. In CLL, 1 of 8 (12.5%) mutated cases treated by chemotherapy (chlorambucil and/or CHOP and/or fludarabine) responded, as compared with 29 of 36 (80%) nonmutated cases (P = .02). In all CLL cases, survival from p53 analysis was significantly shorter in mutated cases (median 7 months) than in nonmutated cases (median not reached) (P < 10^-5). In 35 of the 45 mutated cases of AML, MDS, and CLL, cytogenetic analysis or SSCP and sequence findings showed loss of the nonmutated P53 allele. Our findings show that p53 mutations are a strong prognostic indicator of response to chemotherapy and survival in AML, MDS, and CLL.
The usual association of p53 mutations to loss of the nonmutated P53 allele, in those disorders, ie, to absence of normal p53 in tumor cells, suggests that p53 mutations could induce drug resistance, at least in part, by interfering with normal apoptotic pathways in tumor cells.
Genomic approaches to hematologic malignancies
Benjamin L. Ebert and Todd R. Golub
Blood. 2004; 104:923-932
In the past several years, experiments using DNA microarrays have contributed to an increasingly refined molecular taxonomy of hematologic malignancies. In addition to the characterization of molecular profiles for known diagnostic classifications, studies have defined patterns of gene expression corresponding to specific molecular abnormalities, oncologic phenotypes, and clinical outcomes. Furthermore, novel subclasses with distinct molecular profiles and clinical behaviors have been identified. In some cases, specific cellular pathways have been highlighted that can be therapeutically targeted. The findings of microarray studies are beginning to enter clinical practice as novel diagnostic tests, and clinical trials are ongoing in which therapeutic agents are being used to target pathways that were identified by gene expression profiling. While the technology of DNA microarrays is becoming well established, genome-wide surveys of gene expression generate large data sets that can easily lead to spurious conclusions. Many challenges remain in the statistical interpretation of gene expression data and the biologic validation of findings. As data accumulate and analyses become more sophisticated, genomic technologies offer the potential to generate increasingly sophisticated insights into the complex molecular circuitry of hematologic malignancies. This review summarizes the current state of discovery and addresses key areas for future research.
Flow cytometry
Introduction to Flow Cytometry: Blood Cell Identification
Dana L. Van Laeys
No other laboratory method provides as rapid and detailed an analysis of cellular populations as flow cytometry, making it a valuable tool for the diagnosis and management of several hematologic and immunologic diseases. Understanding this methodology is important for any medical laboratory scientist.
Whether you have no previous experience with flow cytometry or just need a refresher, this course will help you to understand the basic principles, with the help of video tutorials and interactive case studies.
Basic principles include:
- Immunophenotypic features of various types of hematologic cells
- Labeling cellular elements with fluorochromes
- Blood cell identification, specifically B and T lymphocyte identification and analysis
- Cell sorting to isolate select cell populations for further analysis
- Analyzing and interpreting result reports and printouts
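As a conceptual illustration of the B- and T-lymphocyte identification listed above, the toy gate below classifies single events by CD3 and CD19 fluorescence intensity. The threshold and intensity values are invented, and real flow analysis involves compensation, scatter gating, and doublet exclusion first; this sketch conveys only the idea:

```python
def classify_lymphocyte(cd3_intensity, cd19_intensity, threshold=1000.0):
    """Toy two-marker gate: events bright for CD3 alone are called
    T cells, events bright for CD19 alone are called B cells, and
    everything else is left ungated. The threshold is a hypothetical
    fluorescence cutoff, not a calibrated value."""
    cd3_pos = cd3_intensity > threshold
    cd19_pos = cd19_intensity > threshold
    if cd3_pos and not cd19_pos:
        return "T cell"
    if cd19_pos and not cd3_pos:
        return "B cell"
    return "other/ungated"

# Invented event intensities:
print(classify_lymphocyte(5200.0, 150.0))   # T cell
print(classify_lymphocyte(120.0, 4800.0))   # B cell
print(classify_lymphocyte(80.0, 90.0))      # other/ungated
```

Running such a rule over thousands of events per second, and plotting the two intensities against each other, is essentially what the analysis printouts discussed in the course represent.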