

Highlighted Progress in Science – 2017

Reporter: Sudipta Saha, PhD

 

1. Lungs can supply blood stem cells and also produce platelets: Lungs, known primarily for breathing, play a previously unrecognized role in blood production, with more than half of the platelets in a mouse’s circulation produced there. Furthermore, a previously unknown pool of blood stem cells has been identified that is capable of restoring blood production when bone marrow stem cells are depleted.

2. A new drug for multiple sclerosis: A new multiple sclerosis (MS) drug, which grew out of the work of a UCSF (University of California, San Francisco) neurologist, was approved by the FDA. Ocrelizumab, the first drug to reflect current scientific understanding of MS, was approved to treat both relapsing-remitting MS and primary progressive MS.

3. Marijuana legalized – research needed on therapeutic possibilities and negative effects: Recreational marijuana will be legal in California starting in January, and that has brought a renewed urgency to seek out more information on the drug’s health effects, both positive and negative. UCSF scientists recognize marijuana’s contradictory status: the drug has proven therapeutic uses, but it can also lead to tremendous public health problems.

4. A source of autism discovered: In a finding that could help unlock fundamental mysteries about how events early in brain development lead to autism, researchers traced how distinct sets of genetic defects in a single neuronal protein can lead either to epilepsy in infancy or to autism spectrum disorders in predictable ways.

5. Protein linking diet to brain inflammation identified: Ketogenic diets, characterized by extremely low-carbohydrate, high-fat regimens, are known to benefit people with epilepsy and other neurological illnesses by lowering inflammation in the brain. UCSF researchers discovered a previously unknown mechanism by which a low-carbohydrate diet reduces inflammation in the brain. Importantly, the team identified a pivotal protein that links the diet to inflammatory genes; blocking this protein could mirror the anti-inflammatory effects of ketogenic diets.

6. Learning and memory failure due to brain injury now reversible by drug: In a finding that holds promise for treating people with traumatic brain injury, an experimental drug, ISRIB (integrated stress response inhibitor), completely reversed severe learning and memory impairments caused by traumatic brain injury in mice. The groundbreaking finding revealed that the drug fully restored the ability to learn and remember in the brain-injured mice even when the animals were first treated as long as a month after injury.

7. Regulatory T cells induce stem cells to promote hair growth: In a finding that could have implications for baldness, researchers found that regulatory T cells, a type of immune cell generally associated with controlling inflammation, directly trigger stem cells in the skin to promote healthy hair growth. An experiment with mice revealed that without these immune cells as partners, stem cells cannot regenerate hair follicles, leading to baldness.

8. More intake of good fat is also bad: Liberal consumption of good fat (monounsaturated fat) – found in olive oil and avocados – may lead to fatty liver disease, a risk factor for metabolic disorders like type 2 diabetes and hypertension. Eating this fat in combination with a high-starch diet was found to cause the most severe fatty liver disease in mice.

9. Chemical toxicity in almost every daily-use product: Unregulated chemicals are increasingly prevalent in products people use every day, and that rise matches a concurrent rise in health conditions like cancers and childhood diseases. Thus, researchers at UCSF are working to understand the environment’s role – including exposure to chemicals – in health conditions.

10. Cytomegalovirus found as a common factor for diabetes and heart disease in young women: Cytomegalovirus is associated with risk factors for type 2 diabetes and heart disease in women younger than 50. Women of normal weight who were infected with the typically asymptomatic cytomegalovirus, or CMV, were more likely to have metabolic syndrome. Surprisingly, the reverse was found in those with extreme obesity.

 

References:

https://www.ucsf.edu/news/2017/12/409241/most-popular-science-stories-2017

https://www.ucsf.edu/news/2017/03/406111/surprising-new-role-lungs-making-blood

https://www.ucsf.edu/news/2017/03/406296/new-multiple-sclerosis-drug-ocrelizumab-could-halt-disease

https://www.ucsf.edu/news/2017/06/407351/dazed-and-confused-marijuana-legalization-raises-need-more-research

https://www.ucsf.edu/news/2017/01/405631/autism-researchers-discover-genetic-rosetta-stone

https://www.ucsf.edu/news/2017/09/408366/how-ketogenic-diets-curb-inflammation-brain

https://www.ucsf.edu/news/2017/07/407656/drug-reverses-memory-failure-caused-traumatic-brain-injury

https://www.ucsf.edu/news/2017/05/407121/new-hair-growth-mechanism-discovered

https://www.ucsf.edu/news/2017/06/407536/go-easy-avocado-toast-good-fat-can-still-be-bad-you-research-shows

https://www.ucsf.edu/news/2017/06/407416/toxic-exposure-chemicals-are-our-water-food-air-and-furniture

https://www.ucsf.edu/news/2017/02/405871/common-virus-tied-diabetes-heart-disease-women-under-50


Metformin and vitamin B12 deficiency?

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 

Years of taking popular diabetes drug tied to risk of B12 deficiency

 

Long-term Metformin Use and Vitamin B12 Deficiency in the Diabetes Prevention Program Outcomes Study

 

Metformin linked to vitamin B12 deficiency

David Holmes. Nature Reviews Endocrinology (2016). http://dx.doi.org/10.1038/nrendo.2016.39

Secondary analysis of data from the Diabetes Prevention Program Outcomes Study (DPPOS), one of the largest and longest studies of metformin treatment in patients at high risk of developing type 2 diabetes mellitus, shows that long-term use of metformin is associated with vitamin B12 deficiency.

Aroda, V. R. et al. Long-term metformin use and vitamin B12 deficiency in the Diabetes Prevention Program Outcomes Study. J. Clin. Endocrinol. Metab. http://dx.doi.org/10.1210/jc.2015-3754 (2016)

 

Long-term Follow-up of Diabetes Prevention Program Shows Continued Reduction in Diabetes Development

http://www.diabetes.org/newsroom/press-releases/2014/long-term-follow-up-of-diabetes-prevention-program-shows-reduction-in-diabetes-development.html

San Francisco, California
June 16, 2014

Treatments used to decrease the development of type 2 diabetes continue to be effective an average of 15 years later, according to the latest findings of the Diabetes Prevention Program Outcomes Study, a landmark study funded by the National Institutes of Health (NIH).

The results, presented at the American Diabetes Association’s 74th Scientific Sessions®, come more than a decade after the Diabetes Prevention Program, or DPP, reported its original findings. In 2001, after an average of three years of study, the DPP announced that the study’s two interventions, a lifestyle program designed to reduce weight and increase activity levels and the diabetes medicine metformin, decreased the development of type 2 diabetes in a diverse group of people, all of whom were at high risk for the disease, by 58 and 31 percent, respectively, compared with a group taking placebo.

The Diabetes Prevention Program Outcomes Study, or DPPOS, was conducted as an extension of the DPP to determine the longer-term effects of the two interventions, including further reduction in diabetes development and whether delaying diabetes would reduce the development of the diabetes complications that can lead to blindness, kidney failure, amputations and heart disease. Funded largely by the NIH’s National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK), the new findings show that the lifestyle intervention and metformin treatment have beneficial effects, even years later, but did not reduce microvascular complications.

Delaying Type 2 Diabetes

Participants in the study who were originally assigned to the lifestyle intervention and metformin during DPP continued to have lower rates of type 2 diabetes development than those assigned to placebo, with 27 percent and 17 percent reductions, respectively, after 15 years.

“What we’re finding is that we can prevent or delay the onset of type 2 diabetes, a chronic disease, through lifestyle intervention or with metformin, over a very long period of time,” said David M. Nathan, MD, Chairman of the DPP/DPPOS and Professor of Medicine at Harvard Medical School. “After the initial randomized treatment phase in DPP, all participants were offered lifestyle intervention and the rates of diabetes development fell in the metformin and former placebo groups, leading to a reduction in the treatment group differences over time. However, the lifestyle intervention and metformin are still quite effective at delaying, if not preventing, type 2 diabetes,” Dr. Nathan said. Currently, an estimated 79 million American adults are at high risk for developing type 2 diabetes.

Microvascular Complications
The DPPOS investigators followed participants for an additional 12 years after the end of the DPP to determine both the extent of diabetes prevention over time and whether the study treatments would also decrease the small-vessel (or microvascular) complications, such as eye, nerve and kidney disease. These long-term results did not demonstrate significant differences among the lifestyle intervention, metformin or placebo groups on the microvascular complications, reported Kieren Mather, MD, Professor of Medicine at Indiana University School of Medicine and a study investigator.

“However, regardless of type of initial treatment, participants who didn’t develop diabetes had a 28 percent lower occurrence of the microvascular complications than those participants who did develop diabetes. These findings show that intervening in the prediabetes phase is important in reducing early stage complications,” Dr. Mather noted. The absence of differences in microvascular complications among the intervention groups may be explained by the small differences in average glucose levels among the groups at this stage of follow-up.

Risk for Cardiovascular Disease

The DPP population was relatively young and healthy at the beginning of the study, and few participants had experienced any severe cardiovascular events, such as heart attack or stroke, 15 years later. The relatively small number of events meant that the DPPOS researchers could not test the effects of interventions on cardiovascular disease. However, the research team did examine whether the study interventions, or a delay in the onset of type 2 diabetes, improved cardiovascular risk factors.

“We found that cardiovascular risk factors, such as hypertension, are generally improved by the lifestyle intervention and somewhat less by metformin,” said Ronald Goldberg, MD, Professor of Medicine at the University of Miami and one of the DPPOS investigators. “We know that people with type 2 diabetes are at much higher risk for heart disease and stroke than those who do not have diabetes, so a delay in risk factor development or improvement in risk factors may prove to be beneficial.”

Long-term Results with Metformin

The DPP/DPPOS is the largest and longest duration study to examine the effects of metformin, an inexpensive, well-known and generally safe diabetes medicine, in people who have not been diagnosed with diabetes. For DPPOS participants, metformin treatment was associated with a modest degree of long-term weight loss. “Other than a small increase in vitamin B-12 deficiency, which is a recognized consequence of metformin therapy, it has been extremely safe and well-tolerated over the 15 years of our study,” said Jill Crandall, MD, Professor of Medicine at Albert Einstein College of Medicine and a DPPOS investigator. “Further study will help show whether metformin has beneficial effects on heart disease and cancer, which are both increased in people with type 2 diabetes.”

Looking to the Future

In addition to the current findings, the DPPOS includes a uniquely valuable population that can help researchers understand the clinical course of type 2 diabetes.  Since the participants did not have diabetes at the beginning of the DPP, for those who have developed diabetes, the data show precisely when they developed the disease, which is rare in previous studies. “The DPP and DPPOS have given us an incredible wealth of information by following a very diverse group of people with regard to race and age as they have progressed from prediabetes to diabetes,” said Judith Fradkin, MD, Director of the NIDDK Division of Diabetes, Endocrinology and Metabolic Diseases. “The study provides us with an opportunity to make crucial discoveries about the clinical course of type 2 diabetes.”

Dr. Fradkin noted that the study population held promise for further analyses because researchers would now be able to examine how developing diabetes at different periods of life may cause the disease to progress differently. “We can look at whether diabetes behaves differently if you develop it before the age of 50 or after the age of 60,” she said. “Thanks to the large and diverse population of DPPOS that has remained very loyal to the study, we will be able to see how and when complications first develop and understand how to intervene most effectively.”

She added that NIDDK had invited the researchers to submit an application for a grant to follow the study population for an additional 10 years.

The Diabetes Prevention Program Outcomes Study was funded under NIH grant U01DK048489 by the NIDDK; National Institute on Aging; National Cancer Institute; National Heart, Lung, and Blood Institute; National Eye Institute; National Center on Minority Health and Health Disparities; and the Office of the NIH Director; Eunice Kennedy Shriver National Institute of Child Health and Human Development; Office of Research on Women’s Health; and Office of Dietary Supplements, all part of the NIH, as well as the Indian Health Service, Centers for Disease Control and Prevention and American Diabetes Association. Funding in the form of supplies was provided by Merck Sante, Merck KGaA and LifeScan.

The American Diabetes Association is leading the fight to Stop Diabetes® and its deadly consequences and fighting for those affected by diabetes. The Association funds research to prevent, cure and manage diabetes; delivers services to hundreds of communities; provides objective and credible information; and gives voice to those denied their rights because of diabetes. Founded in 1940, our mission is to prevent and cure diabetes and to improve the lives of all people affected by diabetes. For more information please call the American Diabetes Association at 1-800-DIABETES (1-800-342-2383) or visit http://www.diabetes.org. Information from both these sources is available in English and Spanish.

Association of Biochemical B12 Deficiency With Metformin Therapy and Vitamin B12 Supplements

The National Health and Nutrition Examination Survey, 1999–2006

Lael Reinstatler, Yan Ping Qi, Rebecca S. Williamson, Joshua V. Garn, and Godfrey P. Oakley Jr.
Diabetes Care February 2012, vol. 35, no. 2, 327–333. http://dx.doi.org/10.2337/dc11-1582

OBJECTIVE To describe the prevalence of biochemical B12 deficiency in adults with type 2 diabetes taking metformin compared with those not taking metformin and those without diabetes, and explore whether this relationship is modified by vitamin B12 supplements.

RESEARCH DESIGN AND METHODS Analysis of data on U.S. adults ≥50 years of age with (n = 1,621) or without type 2 diabetes (n = 6,867) from the National Health and Nutrition Examination Survey (NHANES), 1999–2006. Type 2 diabetes was defined as clinical diagnosis after age 30 without initiation of insulin therapy within 1 year. Those with diabetes were classified according to their current metformin use. Biochemical B12 deficiency was defined as serum B12 concentrations ≤148 pmol/L and borderline deficiency was defined as >148 to ≤221 pmol/L.

RESULTS Biochemical B12 deficiency was present in 5.8% of those with diabetes using metformin compared with 2.4% of those not using metformin (P = 0.0026) and 3.3% of those without diabetes (P = 0.0002). Among those with diabetes, metformin use was associated with biochemical B12 deficiency (adjusted odds ratio 2.92; 95% CI 1.26–6.78). Consumption of any supplement containing B12 was not associated with a reduction in the prevalence of biochemical B12 deficiency among those with diabetes, whereas consumption of any supplement containing B12 was associated with a two-thirds reduction among those without diabetes.

CONCLUSIONS Metformin therapy is associated with a higher prevalence of biochemical B12 deficiency. The amount of B12 recommended by the Institute of Medicine (IOM) (2.4 μg/day) and the amount available in general multivitamins (6 μg) may not be enough to correct this deficiency among those with diabetes.

It is well known that the risks of both type 2 diabetes and B12 deficiency increase with age (1,2). Recent national data estimate a 21.2% prevalence of diagnosed diabetes among adults ≥65 years of age and a 6 and 20% prevalence of biochemical B12 deficiency (serum B12 <148 pmol/L) and borderline deficiency (serum B12 ≥148–221 pmol/L) among adults ≥60 years of age (3,4).

The diabetes drug metformin has been reported to cause a decrease in serum B12 concentrations. In the first efficacy trial, DeFronzo and Goodman (5) demonstrated that although metformin offers superior control of glycosylated hemoglobin levels and fasting plasma glucose levels compared with glyburide, serum B12 concentrations were lowered by 22% compared with placebo, and 29% compared with glyburide therapy after 29 weeks of treatment. A recent randomized controlled trial designed to examine the temporal relationship between metformin and serum B12 found a 19% reduction in serum B12 levels compared with placebo after 4 years (6). Several other randomized controlled trials and cross-sectional surveys reported reductions in B12 ranging from 9 to 52% (7–16). Although classical B12 deficiency presents with clinical symptoms such as anemia, peripheral neuropathy, depression, and cognitive impairment, these symptoms are usually absent in those with biochemical B12 deficiency (17).

Several researchers have made recommendations to screen those with type 2 diabetes on metformin for serum B12 levels (6,7,14–16,18–21). However, no formal recommendations have been provided by the medical community or the U.S. Preventive Services Task Force. High-dose B12 injection therapy has been successfully used to correct the metformin-induced decline in serum B12 (15,21,22). The use of B12 supplements among those with type 2 diabetes on metformin in a nationally representative sample and their potentially protective effect against biochemical B12 deficiency has not been reported. It is therefore the aim of the current study to use the nationally representative National Health and Nutrition Examination Survey (NHANES) population to determine the prevalence of biochemical B12 deficiency among those with type 2 diabetes ≥50 years of age taking metformin compared with those with type 2 diabetes not taking metformin and those without diabetes, and to explore how these relationships are modified by B12 supplement consumption.

Design overview

NHANES is a nationally representative sample of the noninstitutionalized U.S. population with targeted oversampling of U.S. adults ≥60 years of age, African Americans, and Hispanics. Details of these surveys have been described elsewhere (23). All participants gave written informed consent, and the survey protocol was approved by a human subjects review board.

Setting and participants

Our study included adults ≥50 years of age from NHANES 1999–2006. Participants with positive HIV antibody test results, high creatinine levels (>1.7 mg/dL for men and >1.5 mg/dL for women), and prescription B12 injections were excluded from the analysis. Participants who reported having prediabetes or borderline diabetes (n = 226) were removed because they could not be definitively grouped as having or not having type 2 diabetes. We also excluded pregnant women, those with type 1 diabetes, and those without diabetes taking metformin. Based on clinical aspects described by the American Diabetes Association and previous work in NHANES, those who were diagnosed before the age of 30 and began insulin therapy within 1 year of diagnosis were classified as having type 1 diabetes (24,25). Type 2 diabetes status in adults was dichotomized as yes/no. Participants who reported receiving a physician’s diagnosis after age 30 (excluding gestational diabetes) and did not initiate insulin therapy within 1 year of diagnosis were classified as having type 2 diabetes.

Outcomes and follow-up

The primary outcome was biochemical B12 deficiency determined by serum B12 concentrations. Serum B12 levels were quantified using the Quantaphase II folate/vitamin B12 radioassay kit from Bio-Rad Laboratories (Hercules, CA). We defined biochemical B12 deficiency as serum levels ≤148 pmol/L, borderline deficiency as serum B12 >148 to ≤221 pmol/L, and normal as >221 pmol/L (26).
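These serum cutoffs amount to a three-way threshold rule. A minimal sketch (the cutoffs are taken from the definition above; the function name is ours):

```python
def classify_b12(serum_b12_pmol_l: float) -> str:
    """Classify serum B12 status using the study's cutoffs (pmol/L)."""
    if serum_b12_pmol_l <= 148:
        return "deficient"    # biochemical B12 deficiency: <=148 pmol/L
    if serum_b12_pmol_l <= 221:
        return "borderline"   # borderline deficiency: >148 to <=221 pmol/L
    return "normal"           # normal: >221 pmol/L
```

For example, a serum value of 200 pmol/L falls in the borderline band, while 148 pmol/L exactly is counted as deficient.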

The main exposure of interest was metformin use. Using data collected in the prescription medicine questionnaire, those with type 2 diabetes were classified as currently using metformin therapy (alone or in combination therapy) versus those not currently using metformin. Length of metformin therapy was used to assess the relationship between duration of metformin therapy and biochemical B12 deficiency. In the final analysis, two control groups were used to allow the comparison of those with type 2 diabetes taking metformin with those with type 2 diabetes not taking metformin and those without diabetes.

To determine whether the association between metformin and biochemical B12 deficiency is modified by supplemental B12 intake, data from the dietary supplement questionnaire were used. Information regarding the dose and frequency was used to calculate average daily supplemental B12 intake. We categorized supplemental B12 intake as 0 μg (no B12 containing supplement), >0–6 μg, >6–25 μg, and >25 μg. The lower intake group, >0–6 μg, includes 6 μg, the amount of vitamin B12 typically found in over-the-counter multivitamins, and 2.4 μg, the daily amount the IOM recommends for all adults ≥50 years of age to consume through supplements or fortified food (1). The next group, >6–25 μg, includes 25 μg, the amount available in many multivitamins marketed toward senior adults. The highest group contains the amount found in high-dose B-vitamin supplements.
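The four intake categories above can be expressed as a simple binning function (a sketch; the thresholds come from the text, the function name is ours):

```python
def b12_supplement_category(daily_intake_ug: float) -> str:
    """Bin average daily supplemental B12 intake (micrograms) into the study's categories."""
    if daily_intake_ug == 0:
        return "0 ug"      # no B12-containing supplement
    if daily_intake_ug <= 6:
        return ">0-6 ug"   # covers the IOM recommendation (2.4 ug) and typical multivitamins (6 ug)
    if daily_intake_ug <= 25:
        return ">6-25 ug"  # covers senior-marketed multivitamins (25 ug)
    return ">25 ug"        # high-dose B-vitamin supplements
```

So a participant averaging 2.4 μg/day and one averaging 6 μg/day land in the same ">0–6 μg" group.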

 

In the final analysis, there were 575 U.S. adults ≥50 years of age with type 2 diabetes using metformin, 1,046 with type 2 diabetes not using metformin, and 6,867 without diabetes. The demographic and biological characteristics of the groups are shown in Table 1. Among metformin users, mean age was 63.4 ± 0.5 years, 50.3% were male, 66.7% were non-Hispanic white, and 40.7% used a supplement containing B12. The median duration of metformin use was 5 years. Compared with those with type 2 diabetes not taking metformin, metformin users were younger (P < 0.0001), reported a lower prevalence of insulin use (P < 0.001), and had a shorter duration of diabetes (P = 0.0207). Compared with those without diabetes, metformin users had a higher proportion of nonwhite racial groups (P < 0.0001), a higher proportion of obesity (P < 0.0001), a lower prevalence of macrocytosis (P = 0.0017), a lower prevalence of supplemental folic acid use (P = 0.0069), a lower prevalence of supplemental vitamin B12 use (P = 0.0180), and a lower prevalence of calcium supplement use (P = 0.0002). There was a twofold difference in the prevalence of anemia among those with type 2 diabetes versus those without, and no difference between the two groups with diabetes.

Table 1. Demographic and biological characteristics of U.S. adults ≥50 years of age: NHANES 1999–2006.
The geometric mean serum B12 concentration among those with type 2 diabetes taking metformin was 317.5 pmol/L. This was significantly lower than the geometric mean concentration in those with type 2 diabetes not taking metformin (386.7 pmol/L; P = 0.0116) and those without diabetes (350.8 pmol/L; P = 0.0011). As seen in Fig. 1, the weighted prevalence of biochemical B12 deficiency adjusted for age, race, and sex was 5.8% for those with type 2 diabetes taking metformin, 2.2% for those with type 2 diabetes not taking metformin (P = 0.0002), and 3.3% for those without diabetes (P = 0.0026). Among the three aforementioned groups, borderline deficiency was present in 16.2, 5.5, and 8.8%, respectively (P < 0.0001). Applying the Fleiss formula for calculating attributable risk from cross-sectional data (27), among all of the cases of biochemical B12 deficiency, 3.5% of the cases were attributable to metformin use; and among those with diabetes, 41% of the deficient cases were attributable to metformin use. When the prevalence of biochemical B12 deficiency among those with diabetes taking metformin was analyzed by duration of metformin therapy, there was no notable increase in the prevalence of biochemical B12 deficiency as the duration of metformin use increased. The prevalence of biochemical B12 deficiency was 4.1% among those taking metformin <1 year, 6.3% among those taking metformin ≥1–3 years, 4.1% among those taking metformin >3–10 years, and 8.1% among those taking metformin >10 years (P = 0.3219 for <1 year vs. >10 years). Similarly, there was no clear increase in the prevalence of borderline deficiency as the duration of metformin use increased (15.9% among those taking metformin >10 years vs. 11.4% among those taking metformin <1 year; P = 0.4365).
Figure 1. Weighted prevalence of biochemical B12 deficiency and borderline deficiency adjusted for age, race, and sex in U.S. adults ≥50 years of age: NHANES 1999–2006. Black bars are those with type 2 diabetes on metformin, gray bars are those with type 2 diabetes not on metformin, and the white bars are those without diabetes. *P = 0.0002 vs. type 2 diabetes on metformin. †P < 0.0001 vs. type 2 diabetes on metformin. ‡P = 0.0026 vs. type 2 diabetes on metformin.
Table 2 presents a stratified analysis of the weighted prevalence of biochemical B12 deficiency and borderline deficiency by B12 supplement use. For those without diabetes, B12 supplement use was associated with an ∼66.7% lower prevalence of both biochemical B12 deficiency (4.8 vs. 1.6%; P < 0.0001) and borderline deficiency (16.6 vs. 5.5%; P < 0.0001). A decrease in the prevalence of biochemical B12 deficiency was seen at all levels of supplemental B12 intake compared with nonusers of supplements. Among those with type 2 diabetes taking metformin, supplement use was not associated with a decrease in the prevalence of either biochemical B12 deficiency (5.6 vs. 5.3%; P = 0.9137) or borderline deficiency (15.5 vs. 8.8%; P = 0.0826). Among the metformin users who also used supplements, those who consumed >0–6 μg of B12 had a prevalence of biochemical B12 deficiency of 14.1%. However, consumption of a supplement containing >6 μg of B12 was associated with a prevalence of biochemical B12 deficiency of 1.8% (P = 0.0273 for linear trend). Similar trends were seen in the association of supplemental B12 intake and the prevalence of borderline deficiency. For those with type 2 diabetes not taking metformin, supplement use was also not associated with a decrease in the prevalence of biochemical B12 deficiency (2.1 vs. 2.0%; P = 0.9568) but was associated with a 54% reduction in the prevalence of borderline deficiency (7.8 vs. 3.4%; P = 0.0057 for linear trend).
Table 2. Comparison of average daily B12 supplement intake by weighted prevalence of biochemical B12 deficiency (serum B12 ≤148 pmol/L) and borderline deficiency (serum B12 >148 to ≤221 pmol/L) among U.S. adults ≥50 years of age: NHANES 1999–2006.
Table 3 demonstrates the association of various risk factors with biochemical B12 deficiency. Metformin therapy was associated with biochemical B12 deficiency (odds ratio [OR] 2.89; 95% CI 1.33–6.28) and borderline deficiency (OR 2.32; 95% CI 1.31–4.12) in a crude model (results not shown). After adjusting for age, BMI, and insulin and supplement use, metformin maintained a significant association with biochemical B12 deficiency (OR 2.92; 95% CI 1.28–6.66) and borderline deficiency (OR 2.16; 95% CI 1.22–3.85). Similar to Table 2, B12 supplements were protective against borderline (OR 0.43; 95% CI 0.23–0.81), but not biochemical, B12 deficiency (OR 0.76; 95% CI 0.34–1.70) among those with type 2 diabetes. Among those without diabetes, B12 supplement use was ∼70% protective against biochemical B12 deficiency (OR 0.26; 95% CI 0.17–0.38) and borderline deficiency (OR 0.27; 95% CI 0.21–0.35).
Table 3. Polytomous logistic regression for potential risk factors of biochemical B12 deficiency and borderline deficiency among U.S. adults ≥50 years of age: NHANES 1999–2006, OR (95% CI).
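For readers unfamiliar with the odds-ratio arithmetic underlying results like those in Table 3, a crude OR from a 2×2 exposure-outcome table is the cross-product ratio. The sketch below uses made-up counts for illustration only, not the NHANES data (the ORs reported above are survey-weighted and covariate-adjusted):

```python
def crude_odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Crude odds ratio for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    OR = (a/b) / (c/d) = (a*d) / (b*c).
    """
    return (a * d) / (b * c)

# Hypothetical counts for illustration only (not the study's data):
print(crude_odds_ratio(30, 470, 25, 975))  # odds of deficiency roughly 2.5x higher in the exposed group
```

A ratio of 1.0 would mean the odds of deficiency are the same in both groups; the adjusted models in Table 3 additionally control for age, BMI, and insulin and supplement use.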

The IOM has highlighted the detection and diagnosis of B12 deficiency as a high-priority topic for research (1). Our results suggest several findings that add to the complexity and importance of B12 research and its relation to diabetes, and offer new insight into the benefits of B12 supplements. Our data confirm the relationship between metformin and reduced serum B12 levels beyond the background prevalence of biochemical B12 deficiency. Our data demonstrate that an intake of >0–6 μg of B12, which includes the dose most commonly found in over-the-counter multivitamins, was associated with a two-thirds reduction of biochemical B12 deficiency and borderline deficiency among adults without diabetes. This relationship has been previously reported with NHANES and Framingham population data (4,29). In contrast, we did not find that >0–6 μg of B12 was associated with a decrease in the prevalence of biochemical B12 deficiency or borderline deficiency among adults with type 2 diabetes taking metformin. This observation suggests that metformin reduces serum B12 by a mechanism that is additive to or different from the mechanism in older adults. It is also possible that metformin may exacerbate the deficiency among older adults with low serum B12. Our sample size was too small to determine which amount >6 μg was associated with maximum protection, but we did find a dose-response trend.

We were surprised to find that those with type 2 diabetes not using metformin had the lowest prevalence of biochemical B12 deficiency. It is possible that these individuals may seek medical care more frequently than the general population and therefore are being treated for their biochemical B12 deficiency. Or perhaps, because this population had a longer duration of diabetes and a higher proportion of insulin users compared with metformin users, they have been switched from metformin to other diabetic treatments due to low serum B12 concentrations or uncontrolled glucose levels and these new treatments may increase serum B12 concentrations. Despite the observed effects of metformin on serum B12 levels, it remains unclear whether or not this reduction is a public health concern. With lifetime risks of diabetes estimated to be one in three and with metformin being a first-line intervention, it is important to increase our understanding of the effects of oral vitamin B12 on metformin-associated biochemical deficiency (20,21).

The strengths of this study include its nationally representative, population-based sample, its detailed information on supplement usage, and its relevant biochemical markers. This is the first study to use a nationally representative sample to examine the association between serum B12 concentration, diabetes status, and metformin use as well as examine how this relationship may be modified by vitamin B12 supplementation. The data available regarding supplement usage provided specific information regarding dose and frequency. This aspect of NHANES allowed us to observe the dose-response relationship in Table 2 and to compare it within our three study groups.

This study is also subject to limitations. First, NHANES is a cross-sectional survey that cannot assess time as a factor; the results are therefore associations, not causal relationships. A second limitation arises in our definition of biochemical B12 deficiency. There is no general consensus on how to define normal versus low serum B12 levels. Some researchers include the functional biomarker methylmalonic acid (MMA) in the definition, but this has yet to be agreed upon (30–34). Recently, an NHANES roundtable discussion suggested that definitions of biochemical B12 deficiency should incorporate one biomarker (serum B12 or holotranscobalamin) and one functional biomarker (MMA or total homocysteine) to address problems with the sensitivity and specificity of the individual biomarkers. However, the panel also cited a need for more research on how the biomarkers are related in the general population to prevent misclassification (34). MMA was measured for only six of our survey years; one-third of participants in our final analysis were missing serum MMA levels. Moreover, it has recently been reported that MMA values are significantly greater among the elderly with diabetes than among the elderly without diabetes, even when controlling for serum B12 concentrations and age, suggesting that having diabetes may independently increase MMA levels (35). This property of MMA in elderly adults with diabetes makes it unsuitable as part of a definition of biochemical B12 deficiency in our specific population groups. Our study may also be subject to misclassification bias. NHANES does not differentiate between type 1 and type 2 diabetes in the surveys, so our definition may not capture adults with type 2 diabetes exclusively. Additionally, we used responses to the question “Have you received a physician’s diagnosis of diabetes” to categorize participants as having or not having diabetes; we therefore failed to capture undiagnosed diabetes.
Finally, we could only assess current metformin use; we cannot determine whether nonmetformin users had used metformin in the past.
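The roundtable's suggestion of pairing one circulating biomarker with one functional biomarker can be sketched as a simple classification rule. The cutoffs below (serum B12 < 148 pmol/L for deficiency, 148–221 pmol/L for borderline, MMA > 271 nmol/L for elevated) are illustrative assumptions for the sketch, not the definitions used in this study.

```python
def classify_b12_status(serum_b12_pmol_l, mma_nmol_l):
    """Illustrative two-biomarker rule combining serum B12 (circulating
    biomarker) with MMA (functional biomarker). All cutoffs are assumed
    examples, not the study's own definitions."""
    low_b12 = serum_b12_pmol_l < 148                  # assumed deficiency cutoff
    borderline_b12 = 148 <= serum_b12_pmol_l <= 221   # assumed borderline band
    elevated_mma = mma_nmol_l > 271                   # assumed functional cutoff

    if low_b12 and elevated_mma:
        return "biochemical deficiency"
    if (low_b12 or borderline_b12) and elevated_mma:
        return "borderline deficiency"
    return "no biochemical deficiency"
```

Requiring both biomarkers to be abnormal, as in this sketch, is exactly the move the roundtable proposed to improve the poor sensitivity and specificity of either marker alone.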

Our data support several important conclusions. First, there is a clear association between metformin and biochemical B12 deficiency among adults with type 2 diabetes. Second, this analysis shows that the 6 μg of B12 offered in most multivitamins is associated with a two-thirds reduction in biochemical B12 deficiency in the general population, and that this same dose is not associated with protection against biochemical B12 deficiency among those with type 2 diabetes taking metformin. Our results have public health and clinical implications, suggesting that neither 2.4 μg, the current IOM recommendation for daily B12 intake, nor 6 μg, the amount found in most multivitamins, is sufficient for those with type 2 diabetes taking metformin.

This analysis suggests a need for further research. One research design would be to identify those with biochemical B12 deficiency and randomize them to receive various doses of supplemental B12 chronically, then evaluate any improvement in serum B12 concentrations and/or clinical outcomes. Another design would use existing cohorts to determine clinical outcomes associated with biochemical B12 deficiency and how they are affected by B12 supplements at various doses. Given that a significant proportion of the population ≥50 years of age has biochemical B12 deficiency, and that those with diabetes taking metformin have an even higher prevalence, we suggest that support for further research is a reasonable priority.

 

Discussion:
One research design would be to identify those with biochemical B12 deficiency and randomize them to receive various doses of supplemental B12 chronically and then evaluate any improvement in serum B12 concentrations and/or clinical outcomes. Another design would use existing cohorts to determine clinical outcomes associated with biochemical B12 deficiency and how they are affected by B12 supplements at various doses.
This is of considerable interest. As far as I can see, insufficient data are presented to disentangle all of the variables involved. In a study of 8,000 hemograms several years ago, it was of some interest that a large percentage of patients over age 75 had an MCV of 94–100, not considered indicative of macrocytic anemia. It would have been interesting to explore that subset of the data further.
UPDATED 3/17/2020
Nutrients. 2019 May 7;11(5):1020. doi: 10.3390/nu11051020.

Monitoring Vitamin B12 in Women Treated with Metformin for Primary Prevention of Breast Cancer and Age-Related Chronic Diseases.

Abstract

Metformin (MET) is currently being used in several trials for cancer prevention or treatment in non-diabetics. However, long-term MET use in diabetics is associated with lower serum levels of total vitamin B12. In a pilot randomized controlled trial of the Mediterranean diet (MedDiet) and MET, whose participants were characterized by different components of metabolic syndrome, we tested the effect of MET on serum levels of B12, holo-transcobalamin II (holo-TC-II), and methylmalonic acid (MMA). The study was conducted on 165 women receiving MET or placebo for three years. Results of the study indicate a significant overall reduction in both serum total B12 and holo-TC-II levels associated with MET treatment. In particular, 26 of 81 patients in the MET group and 10 of 84 placebo-treated subjects had B12 below the normal threshold (<221 pmol/L) at the end of the study. Considering B12, holo-TC-II, and MMA jointly, 13 of the 165 subjects (10 MET- and 3 placebo-treated) had deficits in at least two of the biochemical parameters at the end of the study, without reporting clinical signs. Although our results do not affect whether women remain in the trial, B12 monitoring should be implemented for MET-treated individuals.
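As a rough check on the abstract's figures, the deficiency proportions in the two arms (26/81 MET-treated vs. 10/84 placebo) can be compared with a two-proportion z-test. This is a plain-Python sketch of the arithmetic; the study itself does not report this particular statistic.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-statistic for x1/n1 vs x2/n2 using the pooled variance."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2)) # pooled standard error
    return (p1 - p2) / se

# Abstract: 26/81 MET-treated vs 10/84 placebo below 221 pmol/L
z = two_proportion_z(26, 81, 10, 84)
print(round(26 / 81, 3), round(10 / 84, 3), round(z, 2))
```

A z near 3 corresponds to a two-sided p well below 0.01, which is consistent with the abstract's description of the reduction as significant.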

Introduction

Metformin (MET) is the first-line treatment for type-2 diabetes and has been used for decades to treat this chronic condition [1]. Given its favorable effects on glycemic control, weight patterns, insulin requirements, and cardiovascular outcomes, MET has been recently proposed in addition to lifestyle interventions to reduce metabolic syndrome (MS) and age-related chronic diseases [2]. Observational studies have also suggested that diabetic patients treated with MET had a significantly lower risk of developing cancer or lower cancer mortality than those untreated or treated with other drugs [3,4]. For this reason, a number of clinical trials are in progress in different solid cancers.
One of the limitations in implementing long-term use of MET to prevent chronic conditions in healthy subjects relates to its potential lowering effect on vitamin B12 (B12). The aim of the present study was to assess the effect of three years of MET treatment in a randomized, controlled trial considering both B12 levels and biomarkers of its metabolism and biological effectiveness.
Cobalamin, also known as B12, is a water-soluble, cobalt-containing vitamin. All forms of B12 are converted intracellularly into adenosyl-Cbl and methylcobalamin, the biologically active forms at the cellular level [5]. Vitamin B12 is a vital cofactor of two enzymes, methionine synthase and L-methylmalonyl-coenzyme A mutase, in intracellular enzymatic reactions related to DNA synthesis as well as to amino and fatty acid metabolism. Under the catalysis of L-methylmalonyl-CoA mutase, vitamin B12 supports the synthesis of succinyl-CoA from methylmalonyl-CoA in the mitochondria. Deficiency of B12 thus results in elevated methylmalonic acid (MMA) levels.
Dietary B12 is normally bound to proteins. Food-bound B12 is released in the stomach under the effect of gastric acid and pepsin. The free vitamin is then bound to an R-binder, a glycoprotein in gastric fluid and saliva that protects B12 from the highly acidic stomach environment. Pancreatic proteases degrade R-binder in the duodenum and liberate B12; finally, the free vitamin is then bound by the intrinsic factor (IF)—a glycosylated protein secreted by gastric parietal cells—forming an IF-B12 complex [6]. The IF resists proteolysis and serves as a carrier for B12 to the terminal ileum where the IF-B12 complex undergoes receptor (cubilin)-mediated endocytosis [7]. The vitamin then appears in circulation bound to holo-transcobalamin-I (holo-TC-I), holo-transcobalamin-II (holo-TC-II), and holo-transcobalamin-III (holo-TC-III). It is estimated that 20–30% of the total circulating B12 is bound to holo-TC-II and only this form is available to the cells [7]. Holo-TC-I binds 70–80% of circulating B12, preventing the loss of the free unneeded portion [6]. Vitamin B12 is stored mainly in the liver and kidneys.
Many mechanisms have been proposed to explain how MET interferes with the absorption of B12: diminished absorption due to changes in bacterial flora, interference with intestinal absorption of the IF–B12 complex, and/or alterations in IF levels. The most widely accepted current mechanism suggests that MET antagonizes the calcium cation and interferes with the calcium-dependent binding of the IF–B12 complex to the ileal cubilin receptor [8,9]. The recognition and treatment of B12 deficiency is important because it is a cause of bone marrow failure, macrocytic anemia, and irreversible neuropathy [10].
In general, previous studies on diabetics have observed a reduction in serum levels of B12 after both short- and long-term MET treatment [1]. A recent review on observational studies showed significantly lower levels of B12 and an increased risk of borderline or frank B12 deficiency in patients on MET than not on MET [1]. The meta-analysis of four trials (only one double-blind) found a significant overall mean B12 reducing effect of MET after six weeks to three months of use [1]. A secondary analysis (13 years after randomization) of the Diabetes Prevention Program Outcomes Study, which randomized over 3000 persons at high risk for type 2 diabetes to MET or placebo, showed a 13% increase in the risk of B12 deficiency per year of total MET use [3]. In this study, B12 levels were measured from samples obtained in years 1 and 9. Stored serum samples from other time points, including baseline, were not available, and potentially informative red blood cell indices that might have demonstrated the macrocytic anemia, typical of B12 deficiency, were not recorded [3]. The HOME (Hyperinsulinaemia: the Outcome of its Metabolic Effects) study, a large randomized controlled trial investigating the long-term effects of MET versus placebo in patients with type 2 diabetes treated with insulin, showed that the addition of MET improved glycemic control, reduced insulin requirements, prevented weight gain but lowered serum B12 over time, and raised serum homocysteine, suggesting tissue B12 deficiency [4]. A recent analysis of 277 diabetics from the same trial showed that serum levels of MMA, the specific biomarker for tissue B12 deficiency [5], were significantly higher in people treated with MET than those receiving placebo after four years (on average) [4].
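The DPPOS finding of a 13% increase in deficiency risk per year of metformin use compounds quickly over long-term treatment. The sketch below assumes the per-year effect multiplies across years, an illustrative assumption rather than the DPPOS model itself.

```python
# Illustrative compounding of a 13%-per-year relative risk increase,
# assuming (for illustration only) that the per-year effect is multiplicative.
def relative_risk(years, per_year_increase=0.13):
    return (1 + per_year_increase) ** years

for n in (1, 5, 10):
    print(n, "years:", round(relative_risk(n), 2))
```

Under this assumption, five years of use would roughly double the relative risk and ten years would more than triple it, which helps explain why long-duration metformin users are the group of greatest concern.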
The risk of MET-associated B12 deficiency may be higher in older individuals and those with poor dietary habits. Prospective studies have found negative associations between obesity and B12 in numerous ethnicities [11,12]. An energy-dense but micronutrient-insufficient diet consumed by individuals who are overweight or obese might explain this [12]. Furthermore, obesity is associated with low-grade inflammation and these physiological changes have been shown to be associated, in several studies, with elevated C-reactive protein and homocysteine and with low concentrations of B12 and other vitamins [13,14].
As part of a pilot randomized controlled trial of the Mediterranean diet (MedDiet) and MET for primary prevention of breast cancer and other chronic age-related diseases in healthy women with traits of MS [15], we tested the effect of MET on serum levels of B12, holo-TC-II, and MMA.

Other articles of note on the Mediterranean Diet in this Online Open Access Scientific Journal Include

Read Full Post »

What about Theranos?

Curator: Larry H. Bernstein, MD, FCAP

Is Theranos Situation False Crowdfunding Claims at Scale or ‘Outsider’ Naivety?

http://www.mdtmag.com/blog/2015/11/theranos-situation-false-crowdfunding-claims-scale-or-outsider-naivety

If you’ve been following the Theranos situation that involves several damning articles from the Wall Street Journal on the company (see sidebar below video), you know that “something is rotten in the state of Denmark.” That is to say, regardless of whether or not you believe the WSJ articles 100%, believe Theranos 100%, or land somewhere in between, it’s hard not to see that something at the company is definitely creating questions about their original claims. In fact, the company has apparently even tempered some language with regard to its capabilities while “debating” the accuracy of the WSJ articles. It’s really a big mess for a company that was supposedly making significant changes in the way we’d conduct blood testing and the way patients controlled and accessed their own health data (although, I think the idea behind that specific aspect is a very good one).

Due to FDA inspections and findings of concern with Theranos practices, the company is currently collecting blood for only one test using its revolutionary proprietary technology. While the company’s CEO Elizabeth Holmes continues to assure the public that the problems are tied to FDA-related procedures and not to an issue with the technology itself, stakeholders such as Walgreens have put any further interactions with the company on hold.

In the following video from Fortune’s Global Forum, you can see Ms. Holmes discussing the situation over the FDA inspections and the changes that are currently in place with regard to the testing that’s happening at the company.

https://youtu.be/A8qgmGtRMsY

So what’s the story behind this story? Is this a deliberate attempt to deceive on the part of Theranos or is it an example of what can happen when an “outsider” gets involved in the highly regulated medical device industry and faces off with the FDA without the proper experience in place to address potential areas of concern?

In a recent blog, I looked at the crowdfunding of medical devices and what can happen when claims made don’t live up to the reality of the product that’s actually developed. Once enthusiastic investors can quickly (and loudly) turn on a company or project, venting their frustration even directly on the crowdfunding page for all to see. Unfortunately, with the way these sites seem to be set-up, the money is still provided to the company that produces a product, albeit one that does not live up to the initial concept.

Is that what Theranos ultimately is? Were the technology claims taken at face value by significant investment backers? It would seem very unlikely, but given some of the accusations of former Theranos employees in the WSJ articles, it wouldn’t be the only instance of Theranos trying to manipulate testing protocols for the sake of appearing more impressive. Theranos counters those claims by saying the former employees were actually unfamiliar with the actual testing the company performs. Whether or not you believe that is entirely up to you.

Another alternative to blatant deceit on the part of Theranos is the possibility that the company was simply playing in an industry it wasn’t truly experienced enough to handle. In other words, how many FDA-savvy employees work for Theranos? Did they seek consultants to help with the regulatory processes? Or were they simply naïve to the ways of the regulated industry they were entering?

Again, this scenario too seems unlikely, but it also brings in the debate over lab-developed tests (LDTs) and the FDA’s regulation of them. If Theranos testing protocols fall under the realm of LDTs, then they aren’t necessarily under the oversight of the FDA. Sure, the blood collection device is (and that’s why changes are currently occurring at the company), but does the FDA have the authority to inspect the company’s tests if they are LDTs?

Ultimately, I think everyone (with the exception of competitors to Theranos perhaps) wants the company to be successful. The ideas and hope embedded within the original claims the company made will only enhance the quality of care that we are able to achieve within our healthcare system. Further, empowering patients to make decisions and get involved with their own healthcare management would likely improve their overall health.

Unfortunately, before any of that will be possible, Theranos is going to have an uphill battle in defending itself, its technology, and its CEO in this very public debate over the realistic capabilities it can provide. Hopefully, it learns from this experience and if the technology truly functions the way they’ve claimed, they’ll bring on the necessary regulatory experts and better navigate the troubled waters in which they currently find themselves.

Single Blood Drop Diagnostics Key to Resolving Healthcare Challenges

At TEDMED 2014, President and CEO of Theranos, Elizabeth Holmes, talked about the importance of enabling early detection of disease through new diagnostic tools and empowering individuals to make educated decisions about their healthcare.

Read Full Post »

Protein Energy Malnutrition and Early Child Development

Curator: Larry H. Bernstein, MD, FCAP

 

 

In the preceding articles we have seen that poverty and low social class combined with cultural strictures or dependence on a sulfur-poor diet results in childhood stunting and impaired brain development. This is a global health issue.

Protein-Energy Malnutrition

  • Author: Noah S Scheinfeld, JD, MD, FAAD; Chief Editor: Romesh Khardori, MD, PhD, FACP

http://emedicine.medscape.com/article/1104623-overview

The World Health Organization (WHO)[1] defines malnutrition as “the cellular imbalance between the supply of nutrients and energy and the body’s demand for them to ensure growth, maintenance, and specific functions.” The term protein-energy malnutrition (PEM) applies to a group of related disorders that include marasmus, kwashiorkor (see the images below), and intermediate states of marasmus-kwashiorkor. The term marasmus is derived from the Greek word marasmos, which means withering or wasting. Marasmus involves inadequate intake of protein and calories and is characterized by emaciation. The term kwashiorkor is taken from the Ga language of Ghana and means “the sickness of the weaning.” Williams first used the term in 1933, and it refers to an inadequate protein intake with reasonable caloric (energy) intake. Edema is characteristic of kwashiorkor but is absent in marasmus.

Studies suggest that marasmus represents an adaptive response to starvation, whereas kwashiorkor represents a maladaptive response to starvation. Children may present with a mixed picture of marasmus and kwashiorkor, and children may present with milder forms of malnutrition. For this reason, Jelliffe suggested the term protein-calorie (energy) malnutrition to include both entities.
Although protein-energy malnutrition affects virtually every organ system, this article primarily focuses on its cutaneous manifestations. Patients with protein-energy malnutrition may also have deficiencies of vitamins, essential fatty acids, and trace elements, all of which may contribute to their dermatosis.

In general, marasmus is an insufficient energy intake to match the body’s requirements. As a result, the body draws on its own stores, resulting in emaciation. In kwashiorkor, adequate carbohydrate consumption and decreased protein intake lead to decreased synthesis of visceral proteins. The resulting hypoalbuminemia contributes to extravascular fluid accumulation. Impaired synthesis of B-lipoprotein produces a fatty liver.

Protein-energy malnutrition also involves an inadequate intake of many essential nutrients. Low serum levels of zinc have been implicated as the cause of skin ulceration in many patients. In a 1979 study of 42 children with marasmus, investigators found that only those children with low serum levels of zinc developed skin ulceration. Serum levels of zinc correlated closely with the presence of edema, stunting of growth, and severe wasting. The classic “mosaic skin” and “flaky paint” dermatosis of kwashiorkor bears considerable resemblance to the skin changes of acrodermatitis enteropathica, the dermatosis of zinc deficiency.

In 2007, Lin et al[2] stated that “a prospective assessment of food and nutrient intake in a population of Malawian children at risk for kwashiorkor” found “no association between the development of kwashiorkor and the consumption of any food or nutrient.”

Marasmus and kwashiorkor can both be associated with impaired glucose clearance that relates to dysfunction of pancreatic beta-cells.[3] In utero, plastic mechanisms appear to operate, adjusting metabolic physiology to postnatal undernutrition and malnutrition and helping to define whether marasmus or kwashiorkor will develop.[4]

In 2012, a report from Texas noted an 18-month-old infant with type 1 glutaric acidemia who had extensive desquamative plaques, generalized nonpitting edema, and red-tinged sparse hair, with low levels of zinc, alkaline phosphatase, albumin, and iron. This patient has a variation on kwashiorkor, and the authors suggest that it be termed acrodermatitis dysmetabolica.[5] On the same note, a boy aged 18 months with type 1 glutaric acidemia suffered from zinc deficiency and acquired protein energy malnutrition.[6]

For complex reasons, sickle cell anemia can predispose sufferers to protein malnutrition.[7]

Protein energy malnutrition ramps up arginase activity in macrophages and monocytes.[8]

Protein energy malnutrition (PEM), brain and various facets of child development.

Protein energy malnutrition (PEM) is a global problem. Nearly 150 million children under 5 years of age in the world, and 70–80 million in India, suffer from PEM; nearly 20 million in the world and 4 million in India suffer from severe forms of PEM, viz., marasmus, kwashiorkor, and marasmic kwashiorkor. Studies in experimental animals in the West and in children in developing countries have revealed the adverse effects of PEM on the biochemistry of the developing brain, leading to tissue damage, growth arrest, disordered developmental differentiation and myelination, reduction of synapses and synaptic transmitters, and reduced overall development of dendritic activity. Many of these adverse effects have been described in children through clinical data, biochemical studies, reduction in brain size, histology of the spinal cord, quantitative studies and electron microscopy of the sural nerve, neuro-CT scan, magnetic resonance imaging (MRI), and morphological changes in the cerebellar cells. The longer the duration of PEM, the younger the child, and the poorer the maternal health and literacy, the more adverse are the effects of PEM on the nervous system. Just as nutrients are important to the developing brain, so lack of environmental stimulation, emotional support, and love and affection adversely affect the child's development; when both adverse factors are combined, the impact is severe. Hence prevention of PEM in pregnant and lactating mothers, breast feeding, adequate home-based supplements, family support, and love will improve the physical growth, mental development, social competence, and academic performance of the child. Nutritional rehabilitation and the psychosocial and psychomotor development of the child should therefore begin in infancy and continue throughout childhood, at all levels, most importantly in the family, school, community, and various intervention programmes, local, regional, and national.
Moreover, medical students, health personnel, all medical disciplines concerned with total health care, and school teachers should learn about and concentrate on the developmental stimulation and enrichment of the child.

Cognitive development in children with chronic protein energy malnutrition

Behav Brain Funct. 2008; 4: 31. http://dx.doi.org/10.1186/1744-9081-4-31
Background: Malnutrition is associated with both structural and functional pathology of the brain. A wide range of cognitive deficits has been reported in malnourished children. Effect of chronic protein energy malnutrition (PEM) causing stunting and wasting in children could also affect the ongoing development of higher cognitive processes during childhood (>5 years of age). The present study examined the effect of stunted growth on the rate of development of cognitive processes using neuropsychological measures.
Methods: Twenty children identified as malnourished and twenty as adequately nourished, in the age groups of 5–7 years and 8–10 years, were examined. The NIMHANS neuropsychological battery for children, sensitive to the effects of brain dysfunction and to age-related improvement, was employed. The battery consisted of tests of motor speed, attention, visuospatial ability, executive functions, comprehension, and learning and memory.
Results: Development of cognitive processes appeared to be governed by both age and nutritional status. Malnourished children performed poor on tests of attention, working memory, learning and memory and visuospatial ability except on the test of motor speed and coordination. Age related improvement was not observed on tests of design fluency, working memory, visual construction, learning and memory in malnourished children. However, age related improvement was observed on tests of attention, visual perception, and verbal comprehension in malnourished children even though the performance was deficient as compared to the performance level of adequately nourished children.
Conclusion: Chronic protein energy malnutrition (stunting) affects the ongoing development of higher cognitive processes during childhood years rather than merely showing a generalized cognitive impairment. Stunting could result in slowing in the age related improvement in certain and not all higher order cognitive processes and may also result in long lasting cognitive impairments.
Malnutrition is the consequence of a combination of inadequate intake of protein, carbohydrates, micronutrients and frequent infections [1]. In India malnutrition is rampant. WHO report states that for the years 1990–1997 52% of Indian children less than 5 years of age suffer from severe to moderate under nutrition [2]. About 35% of preschool children in sub-Saharan Africa are reported to be stunted [3]. Malnutrition is associated with both structural and functional pathology of the brain. Structurally malnutrition results in tissue damage, growth retardation, disorderly differentiation, reduction in synapses and synaptic neurotransmitters, delayed myelination and reduced overall development of dendritic arborization of the developing brain. There are deviations in the temporal sequences of brain maturation, which in turn disturb the formation of neuronal circuits [1]. Long term alterations in brain function have been reported which could be related to long lasting cognitive impairments associated with malnutrition [4]. A wide range of cognitive deficits has been observed in malnourished children in India. In a study, malnourished children were assessed on the Gessell’s developmental schedule from 4 to 52 weeks of age. Children with grades II and III malnutrition had poor development in all areas of behaviour i.e., motor, adaptive, language and personal social [5]. Rural children studying in primary school between the ages of 6–8 years were assessed on measures of social maturity (Vineland social maturity scale), visuomotor co-ordination (Bender gestalt test), and memory (free recall of words, pictures and objects). Malnutrition was associated with deficits of social competence, visuomotor coordination and memory. Malnutrition had a greater effect on the immediate memory of boys as compared with those of girls. 
Malnourished boys had greater impairment of immediate memory for words, pictures and objects, while malnourished girls had greater impairment of immediate memory for only pictures. Delayed recall of words and pictures of malnourished boys was impaired. Malnourished girls had an impairment of delayed recall of only words. The same authors measured the intelligence of malnourished children using Malin’s Indian adaptation of the Wechsler’s intelligence scale for children. IQ scores decreased with the severity of malnutrition. Significant decreases were observed in performance IQ, as well as on the subtests of information and digit span among the verbal subtests [6]. The above study has shown that though there is decrease in full scale IQ, yet performance on all the subtests was not affected. This suggests that malnutrition may affect different neuropsychological functions to different degrees. Studies done in Africa and South America have focused on the effect of stunted growth on cognitive abilities using verbal intelligence tests based on assessment of reasoning [7]. Such an assessment does not provide a comprehensive and specific assessment of cognitive processes like attention, memory, executive functions, visuo-spatial functions, comprehension as conducted in the present study. Information about the functional status of specific cognitive processes has implications for developing a cognitive rehabilitation program for malnourished children. A neuropsychological assessment would throw light on functional status of brain behaviour relationships affected by malnutrition. Deficits of cognitive, emotional and behavioural functioning are linked to structural abnormalities of different regions of the brain. Brain structures and brain circuits compute different components of cognitive processes [8]. Malnutrition has long lasting effects in the realm of cognition and behaviour, although the cognitive processes like executive functions have not been fully assessed [9]. 
The differential nature of cognitive deficits associated with malnutrition suggests that different areas of the brain are compromised to different degrees. A neuropsychological assessment would be able to delineate the pattern of brain dysfunction. Malnutrition is a grave problem in our country as 52% of our children are malnourished. Effects of protein-calorie malnutrition are inextricably blended with the effects of social cultural disadvantage; even within the disadvantaged class, literacy environment at home and parental expectation regarding children’s education are powerful variables. Perhaps membership in a higher caste confers some advantage in regard to home literacy, and parental expectation. Short and tall children do differ in some cognitive tests, but not in all as demonstrated in a study done in Orissa, India [10]. But whether or not stunted growth alone is the causative variable for cognitive weakness is not determined as yet. Moreover, the functional integrity of specific cognitive processes is less clear. Chronic PEM resulting in stunting and wasting could result in delay in the development of cognitive processes or in permanent cognitive impairments. Neuropsychological measures can demonstrate delay in normally developing cognitive processes as well as permanent cognitive deficits.
Children in the age range of 5–10 years attending a corporation school in the city of Bangalore participated in the study. Corporation schools in India are government schools with minimal fees, attended by children from low- to middle-income families. There were 20 children in the adequately nourished group and 20 in the malnourished group, with equal gender distribution. Children in both groups were from the same ethnic and language background: natives of Karnataka living in Bangalore.
After identifying the malnourished and adequately nourished children, the coloured progressive matrices (CPM) test [12] was administered to rule out mental retardation. Children falling at or below the fifth percentile were excluded from the sample, as such scores are suggestive of the intellectually defective range. Percentile points were calculated from the raw scores using Indian norms [13]. Mental retardation was ruled out because otherwise scores on the neuropsychological tests would be uniformly depressed and a differentiation of deficits might not emerge. Intelligence was not treated as a covariate in the study; the groups did not differ significantly in their scores on the CPM, which served as a screening instrument to rule out intellectual impairment in both groups.
Table 1: Demographic details of the participants
|                                   | Adequately nourished (N = 20) | Malnourished (N = 20) |
|-----------------------------------|-------------------------------|-----------------------|
| Mean age, 5–7 years group         | 5.8 years                     | 6.3 years             |
| Mean age, 8–10 years group        | 8.8 years                     | 9.3 years             |
| Gender                            | 10 girls, 10 boys             | 10 girls, 10 boys     |
| Stunted (height for age below −2 SD from the median) | —              | 70%                   |
| Stunted and wasted (height for age and weight for height below −2 SD from the median) | — | 30% |
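The −2 SD cut-offs quoted in Table 1 can be sketched as a simple z-score classification. This is an illustrative sketch only: the reference medians and SDs below are made-up placeholders (actual classification uses the WHO growth-standard reference tables), and the function names are ours.

```python
# Illustrative sketch (not the WHO reference method): classify stunting and
# wasting from anthropometric z-scores, using the -2 SD cut-offs in Table 1.
# The reference medians/SDs below are placeholder values, not WHO data.

def z_score(observed, ref_median, ref_sd):
    """Number of standard deviations the observation lies from the reference median."""
    return (observed - ref_median) / ref_sd

def classify(height_cm, weight_kg, refs):
    """Return nutritional-status flags for one child.

    refs: placeholder reference (median, SD) pairs for the child's age and sex,
    e.g. {"haz": (112.0, 4.7), "whz": (19.0, 1.9)}.
    """
    haz = z_score(height_cm, *refs["haz"])   # height-for-age z-score
    whz = z_score(weight_kg, *refs["whz"])   # weight-for-height z-score
    return {
        "stunted": haz < -2,                             # chronic undernutrition
        "wasted": whz < -2,                              # acute undernutrition
        "stunted_and_wasted": haz < -2 and whz < -2,
    }

refs = {"haz": (112.0, 4.7), "whz": (19.0, 1.9)}
print(classify(100.0, 14.0, refs))  # both flags raised under these placeholder refs
```

Under these placeholder references, a child measuring 100 cm and 14 kg falls below −2 SD on both indices and would be counted in the "stunted and wasted" 30% of Table 1.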
Exclusion of behaviour problems and history of neurological disorders: The children's behaviour questionnaire, form B [14], was administered to the class teachers of the identified children. Children who scored above the cut-off score of 9 were excluded from the sample. A personal data sheet was completed in consultation with parents and teachers to rule out any history of neurological or psychiatric disorders, including head injury and epilepsy; one child with epilepsy was excluded on this basis.
The tests were grouped under specific cognitive domains on the basis of theoretical rationale and factor analysis. Factor analysis was done for the battery, and the grouping of tests under cognitive functions such as executive functions, visuospatial functions, comprehension, and learning and memory was based on the clustering observed in the factor analysis as well as on theoretical grounds.
The neuropsychological battery consisted of the following tests:
1. Motor speed  Finger tapping test [15]
2. Expressive speech  Expressive speech test was administered to rule out speech related deficits
3. Attention  Color trails test [18] is a measure of focused attention and conceptual tracking.
4. Color cancellation test [21] is a measure of visual scanning/selective attention
5. Executive functions FAS phonemic fluency test is a measure of verbal fluency.
6. Design fluency test [24] is a measure of design fluency, cognitive flexibility and imaginative capacity.
7. Visuo-spatial working memory span task [23]: This test is a measure of visuo-spatial working memory (VSWM) span.
8. Visuospatial functions Motor-free visual perception test [29] is a measure of visuoperceptual ability, having 36 items for visual discrimination, visual closure, figure-ground, perceptual matching and visual memory. Since this test has been originally developed for children between 5–8 years of age, it was modified and items in increasing difficulty level were added by the authors to make it applicable for the children above 8 years. Number of correct responses comprises the score.
9. Picture completion test [30] is a measure of visuoconceptual ability, visual organization and visuo-conceptual reasoning.
10. Block design test [30] is a measure of visuoconstructive ability.
11. Comprehension, learning and memory Token test [31] is a measure of verbal comprehension of commands of increasing complexity.
12. Rey’s auditory verbal learning test (RAVLT) [32] is a measure of verbal learning and memory.
13. Memory for designs test [34] is a measure of visual learning and memory.
Comparison between the performance of adequately nourished and malnourished children: Table 2 shows that the malnourished group performed significantly worse than the adequately nourished group on tests of phonemic fluency, design fluency, selective attention, visuospatial working memory, visuospatial functions, verbal comprehension, and verbal learning and memory. The two groups did not differ on the finger tapping test. Expressive speech was assessed in a question-and-answer format covering repetitive, nominative and narrative speech, essentially an initial screen for aphasia-like symptoms; because it yields no quantitative score, it was not included in the analysis. Descriptively, malnourished children showed no difficulty with expressive speech.
Comparison of age related differences in cognitive functions between adequately nourished and malnourished children: Data were further subjected to post hoc analysis to compare the two groups across the two age groups and to study the rate of improvement with age (Table 2). In both age groups (5–7 years and 8–10 years) the adequately nourished children performed better than the malnourished children. Figures 1, 2, 3, 4, 5 and 6 indicate age related improvement in performance across different cognitive functions in adequately nourished as compared with malnourished children. Motor speed and coordination was not significantly affected in malnourished children (figure 1). The rate of age related improvement across the two age groups was rapid in malnourished children for certain functions, such as selective attention (figure 2) and verbal fluency (figure 3). However, working memory, design fluency, visuospatial functions, comprehension, learning and memory showed slowed age related improvement in malnourished children. Most cognitive functions, including design fluency and working memory (figure 3), visual perception, visuoconceptual reasoning and visual construction (figure 4), verbal comprehension (figure 5), and verbal and visual memory (figure 6), showed a very slow rate of improvement between the 5–7 and 8–10 year age groups. In contrast, verbal fluency (figure 3), motor speed (figure 1) and selective attention (figure 2) improved at rates similar to those of adequately nourished children.
Table 2: Mean comparisons for the cognitive functions across the two age groups of adequately nourished and malnourished children (not shown)
Table 3: Post-hoc comparisons between adequately nourished and malnourished groups across the two age groups (not shown)
Figure 1 Age related comparisons between adequately nourished and malnourished children on motor speed (right and left hand). (not shown)
Figure 2 Age related comparisons between adequately nourished and malnourished children on selective attention (color cancellation test). (not shown)
Post-hoc comparisons were computed with Tukey's post-hoc tests to compare means across age groups between malnourished and adequately nourished children for those test scores that showed significant effects; accordingly, post hoc tests were not computed for the finger tapping scores assessing motor speed. Table 3 presents the post-hoc results with the significance (probability) levels of the differences across age groups and between adequately nourished and malnourished children. The post hoc analysis was conducted to support our claims about the lack of age related improvement in certain cognitive functions on the one hand, and the nature of cognitive impairments on the other, in malnourished children. Four comparisons were interpreted: performance between the two age groups within the adequately nourished group and within the malnourished group, and performance between the adequately nourished and malnourished children within the 5–7 years age group and within the 8–10 years age group. Results indicate age related differences within each group as well as between the two groups. Age related differences between 5–7 and 8–10 year old children were significant for some of the test scores in the adequately nourished group, but not for most of the test scores in the malnourished group, indicative of a delay in the development of certain cognitive functions. Differences between the adequately nourished and malnourished children within the same age group were significant for most of the test scores, indicative of deficits in particular cognitive functions. On a few tests, performance did not differ significantly between the two age groups in either group.
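The four interpreted contrasts can be sketched as follows, using hypothetical cell means. This only illustrates the comparison scheme and the direction of the mean differences; the study itself applied Tukey's HSD to the raw scores to attach significance levels.

```python
# Sketch of the four planned post-hoc contrasts described above, applied to
# hypothetical cell means (one mean score per group x age cell). The numbers
# are invented to mirror the reported pattern: a large age effect in the
# adequately nourished (AN) group, a small one in the malnourished (MN) group.

cell_means = {             # hypothetical mean test scores
    ("AN", "5-7"): 10.0,   # adequately nourished, younger
    ("AN", "8-10"): 16.0,  # adequately nourished, older
    ("MN", "5-7"): 7.0,    # malnourished, younger
    ("MN", "8-10"): 8.5,   # malnourished, older
}

comparisons = [
    (("AN", "5-7"), ("AN", "8-10")),   # age effect within adequately nourished
    (("MN", "5-7"), ("MN", "8-10")),   # age effect within malnourished
    (("AN", "5-7"), ("MN", "5-7")),    # group effect at 5-7 years
    (("AN", "8-10"), ("MN", "8-10")),  # group effect at 8-10 years
]

for a, b in comparisons:
    diff = cell_means[b] - cell_means[a]
    print(f"{a} vs {b}: mean difference = {diff:+.1f}")
```

With these invented values, the age-related gain is +6.0 for the adequately nourished group but only +1.5 for the malnourished group, and the group gap widens from 3.0 points at 5–7 years to 7.5 points at 8–10 years, the pattern the text describes as delayed development.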
Discussion The findings of the present study could be discussed in terms of the effect of chronic malnutrition on neuropsychological performance and with respect to the rate of development of cognitive processes.
Effect of malnutrition on neuropsychological performance: Our study indicates that malnourished children perform poorly, compared with adequately nourished children, on most of the neuropsychological tests except motor speed. Malnourished children showed poor performance on tests of higher cognitive functions such as cognitive flexibility, attention, working memory, visual perception, verbal comprehension, and memory. These findings are supported by another study of Indian malnourished children, which reported memory impairments with spared fine motor coordination [36]. Malnourished children performed poorly on novel tasks such as tests of executive functions, including working memory for spatial locations. Poor performance on the fluency and working memory tests also coincided with a very slow rate of improvement between the 5–7 and 8–10 year age groups. Poor performance on most of the neuropsychological tests indicated a diffuse impairment spanning attention, executive functions, visuospatial functions, comprehension and memory.
Effect of malnutrition on cognitive development Both the groups were tested on a neuropsychological battery, which has been found to be sensitive to age related differences in cognitive functions in children (5–15 years). The age trends reported in the present study are based on the assessment that employed the NIMHANS neuropsychological battery for children [13]. The test battery has been standardized based on the growth curve modeling approach for empirical validation of age-related differences in performance on neuropsychological tests. The tests in the battery were found sensitive to show age related differences.
Malnourished children showed poorer performance for their age than adequately nourished children. The performance of malnourished children in the 5–7 years age group was much lower than that of adequately nourished children and showed little improvement by the 8–10 years age group. The rate of cognitive development differed across cognitive functions. For some functions, such as design fluency, working memory, visual construction, verbal comprehension, and learning and memory for verbal and visual material, the rate of development was affected, with minimal age related improvement between the 5–7 and 8–10 year groups. In contrast, for certain other functions age related improvement was observed in malnourished children: the level of performance was low in both age groups, but the rate of improvement between them was similar to that of adequately nourished children.
Not shown
Figure 3 Age related comparisons between adequately nourished and malnourished children on executive functions.
Note: VF: verbal fluency; DF: design fluency; WM: working memory; AN: adequately nourished; MN: malnourished.

(Excerpt of post-hoc significance levels from Table 3: MN 5–7 vs 8–10, p > .05; 5–7 years AN vs MN, p > .05; 8–10 years AN vs MN, p < .05. Visual memory (memory for designs test): AN 5–7 vs 8–10, p > .05; MN 5–7 vs 8–10, p > .05; 5–7 years AN vs MN, p < .05; 8–10 years AN vs MN, p < .05.)

Figure 4 Age related comparisons between adequately nourished and malnourished children on visuospatial functions.
Figure 5 Age related comparisons between adequately nourished and malnourished children on verbal comprehension and verbal learning.
Motor speed (right and left hand) was not found impaired in malnourished children and the rate of development was also found similar to adequately nourished children.
Executive functions such as design fluency, selective attention and working memory were deficient in malnourished children and also showed a poor rate of improvement between the two age groups. All three tests of executive functions (fluency, selective attention and working memory for spatial locations) involved novel stimuli, and performance required cognitive flexibility as well as faster information processing, which was affected in malnourished children. The results also indicate that malnourished children showed a very slow rate of improvement on these functions.
Visuospatial functions such as visual perception, visual construction and visuoconceptual reasoning showed significantly poorer performance compared with adequately nourished children. Performance on visual perception (visual discrimination, perceptual matching, visual closure and visuospatial relationships) and visual construction was severely affected in malnourished children and also showed a poor rate of improvement with age.
Verbal comprehension and learning and memory for verbal and visual material were poorer than in adequately nourished children, but for comprehension the rate of improvement between the 5–7 and 8–10 year age groups was similar to that of adequately nourished children, suggesting that the development of comprehension with age might not be affected in malnourished children. However, in addition to poor performance on the AVLT test of verbal learning, malnourished children showed minimal improvement between the two age groups, in contrast to the greater difference between the two age groups in adequately nourished children. Visual memory was the most severely affected, both in terms of poor delayed recall on the design learning test and in terms of the small difference between the two age groups.
Malnutrition affects brain growth and development and hence future behavioural outcomes [37]. School-age children who suffered from early childhood malnutrition have generally been found to have poorer IQ, cognitive function and school achievement, and greater behavioural problems, than matched controls and, to a lesser extent, siblings. The disadvantages last at least until adolescence, although there is no consistent evidence of a specific cognitive deficit [38], and the functional integrity of specific cognitive processes is less clear. Stunting in early childhood is common in developing countries and is associated with poorer cognition and school achievement in later childhood [39]. In a longitudinal study of stunted, malnourished children, deficits in test scores were smaller at age 11 years than at age 8 years, suggesting that adverse effects may decline over time [7]. In our study, all the children in the malnourished group were stunted, and the cross-sectional assessment showed a rate of age related improvement from the 5–7 to the 8–10 years age group similar to that of adequately nourished children for some functions, although baseline performance was lower in malnourished children. These results indicate that the adverse effects of malnutrition (stunting in particular) may decline with age only for certain cognitive functions, while the rate of development of most cognitive processes, particularly higher processes including executive functions and visuospatial perception, can be severely affected during the childhood years. The decline in the effects of malnutrition over time has been reported to be independent of differences in educational, socioeconomic and psychosocial resources [7]. Hence, malnutrition (particularly stunting) may result in delayed development of cognitive processes during the childhood years rather than a permanent generalized cognitive impairment.
The neuropsychological interpretation of the cognitive processes more severely affected in malnourished children suggests a diffuse cortical involvement. This is with reference to deficits pertaining to functions mediated by dorsolateral prefrontal cortex (poor performance on tests of attention, fluency and working memory), right parietal (poor performance on tests of visuospatial functions) and bilateral temporal cortex (poor performance on tests of comprehension, verbal learning, and memory for verbal and visual material). The prefrontal cortex may be particularly vulnerable to malnutrition [4]. The adverse effects of malnutrition (PEM-stunting) on cognitive development could be related to the delay in certain processes of structural and functional maturation like delayed myelination and reduced overall development of dendritic arborization of the developing brain [1].
The present study highlights two ways in which malnutrition, particularly stunting, can affect cognitive functions: age related improvement in cognitive performance may be compromised, and there may also be long-lasting cognitive impairments. The effect is not specific to a particular cognitive domain but is rather diffuse. The results also indicate that certain cognitive functions may be vulnerable to malnutrition in terms of showing impairment while their rate of development is unaffected, whereas for other functions both the rate of development and the level of performance are affected relative to adequately nourished children.
Conclusion: Chronic protein energy malnutrition (stunting) results in cognitive impairments as well as a slowed rate of development of cognitive processes, and the rate of development may follow different patterns across cognitive functions. Chronic protein energy malnutrition thus affects the development of cognitive processes differentially during the childhood years, rather than producing a uniform overall cognitive dysfunction relative to adequately nourished children. Stunting can result both in delayed development of cognitive functions and in permanent cognitive impairments that show minimal improvement with age. The rate of development of attention, of executive functions such as cognitive flexibility and working memory, and of visuospatial functions such as visual construction is most severely affected by protein energy malnutrition during the childhood years, a period marked by rapid ongoing development of cognitive functions.
The effects of protein energy malnutrition in early childhood on intellectual and motor abilities in later childhood and adolescence.
Dev Med Child Neurol. 1976 Jun;18(3):330-50.

Three groups of Ugandan children (20 in each group) and one comparison group of 20 children were examined between 11 and 17 years of age. The first three groups had been admitted to hospital for treatment of protein energy malnutrition between the ages of eight to 15, 16 to 21 and 22 to 27 months, respectively. The comparison group had not been clinically malnourished throughout the whole period up to 27 months of age. All the children came from one tribe and were individually matched for sex, age, education and home environment. It was found that the three malnourished groups fell significantly below the comparison group in anthropometric measurements and in tests of intellectual and motor abilities. No evidence was found for a relationship between the deficit and age at admission. Further analysis among the 60 malnourished children revealed that anthropometry and intellectual and motor abilities are the more affected the greater the degree of ‘chronic undernutrition’ at admission, but no correlation was found with the severity of the ‘acute malnutrition’. The results show a general impairment of intellectual abilities, with reasoning and spatial abilities most affected, memory and rote learning intermediately and language ability least, if at all, affected. These findings are discussed in the context of a comprehensive and critical appraisal of the existing literature.

Quake-Hit Nepal Gears up to Tackle Stunting in Children

By Gopal Sharma  July 08, 2015  http://www.medscape.com/viewarticle/847572

HECHO, Nepal (Thomson Reuters Foundation) – Shanti Maharjan, who gave birth to a baby girl 10 days ago, has spent the last two months living under corrugated iron sheets with her husband and five others after two major earthquakes reduced her mud-and-brick home to rubble.

Adequate food, drinking water and aid such as tents and blankets have been hard to come by, she says, though scores of aid agencies rushed to the Himalayan nation to help survivors.
What worries the 26-year-old mother most is her inability to produce breastmilk for her newborn daughter, who she fears is at serious risk of malnutrition in the aftermath of the magnitude 7.8 and 7.3 quakes in April and May.

“The earthquake destroyed everything, including our food reserves,” said Maharjan, sitting under the iron sheeting on farmland on the outskirts of the capital, Kathmandu.

“There is not enough food. Getting meat, oil and fruits to eat is difficult in this situation. I am worried about my daughter’s nourishment,” she said as the baby, wrapped in a green cloth, lay sleeping on a wooden bed.

The government, aware that disruption caused by the quakes could worsen the country's already high rate of child malnutrition, is sending out teams of community nurses to give advice and food supplements to women and children in the affected areas.

A 2011 government study showed that more than 40% of Nepal's under-five-year-olds were stunted, making the country's child malnutrition rate one of the world's highest.
Experts say the two quakes, which killed 8,895 people and destroyed half a million houses, could make things worse as survivors have inadequate food, water, shelter, healthcare and sanitation.

United Nations officials warn that the rate of stunting among children in the South Asian nation could return to the 2001 level of 57%, if authorities and aid agencies do not respond effectively.

“The risk of malnutrition is high and requires the nutrition and other sectors like agriculture, health, water, sanitation, education and social protection to respond adequately,” said Stanley Chitekwe, UNICEF’s nutrition chief in Nepal.

DRIVE TO NOURISH

Child malnutrition is an underlying cause of death for 3 million children annually around the world – nearly half of all child deaths – most of whom die from preventable illnesses such as diarrhoea due to weak immune systems.

Those lucky enough to survive grow up without enough energy, protein, vitamins and minerals, causing their brains and bodies to be stunted, and they are often unable to fulfill their potential.

Government officials admit the challenges, citing data showing that almost 70% of Nepali children under the age of two suffer from anaemia caused by iron deficiency.

“This shows that (poor) nutrition is a very big problem. The earthquake will further worsen the situation because people simply don’t have enough to eat, let alone have a nutritious diet,” said Health Ministry official Krishna Prasad Paudel.

Supported by UNICEF, authorities have now launched a drive to reach out to more than 500,000 women and children who need supplementary food and medicines.

More than 10,000 female community volunteers will be fanning out across 14 districts affected by the earthquakes, visiting devastated towns and villages and speaking to new and expectant mothers about breast-feeding their infants.

The volunteers will also advise families on eating locally available nutritious foods such as green vegetables and meat and will distribute vitamin A, iron and folic acid, and other micronutrient supplements to pregnant and breastfeeding women.

In Imadole, a prosperous district on the outskirts of the ancient town of Patan, health volunteer Urmila Sharma Dahal found an extremely thin two-year-old boy weighing 7.5 kg (16.5 pounds) last week, suffering from severe acute malnutrition.

Dahal said she provided his family with sachets of ready-to-use therapeutic food – a paste of peanut, sugar, milk powder, vitamin and oil – and the child gained nearly a kilo (2.2 pounds) in weight in just seven days.

“It does not take much. It can be done with small but right interventions,” said Dahal as she sat next to the child in the family’s brick-and-cement home.

Protein-energy malnutrition occurs due to inadequate intake of food and is a major cause of morbidity and mortality in children in developing countries (Grover and Ee 2009).

http://www.wcs-heal.org/global-challenges/public-health-issues-and-costs/malnutrition/protein-energy-malnutrition

http://www.wcs-heal.org/uploads/images/Chris_Golden-malnourished_children_692x513_scaled_cropp.jpg

Protein energy malnutrition (PEM) has significant negative impacts on children’s growth and development (Grover and Ee 2009). Chronic PEM causes children to have stunted growth (low height for age) and to be underweight (low weight for age); it is estimated that among children under age five, one in every four is stunted and one in every six is underweight. PEM also causes two specific conditions in children: marasmus, which is characterized by an emaciated appearance, and kwashiorkor, in which children develop swollen bellies due to edema (abnormal accumulation of fluid) and discoloration of the hair because of pigment loss among other symptoms (UNWFP 2013b, Ahmed et al. 2012). Countries in sub-Saharan Africa and south Asia have the highest proportions of children suffering from PEM (UNWFP 2013a).

PEM causes direct mortality in children and also increases vulnerability to other serious diseases including diarrhea, pneumonia, and malaria. Children suffering from PEM have compromised immune systems, making them particularly susceptible to infectious diseases.  Furthermore, PEM has negative impacts on children’s brain development, resulting in issues with memory and delayed motor function; these children have decreased ability to learn and have lower productivity as adults. PEM also has serious and potentially long-term impacts on other organ systems including the cardiovascular, respiratory, and gastrointestinal systems (Grover and Ee 2009).

Many adults in developing countries also suffer from PEM, with women disproportionately impacted compared with men, particularly in south Asian countries (UNWFP 2013a). Pregnant women who are undernourished can fall even further behind in their nutritional status due to the increased demand for nutrients by the developing fetus. Women who don’t gain sufficient weight during pregnancy are at increased risk for complications including maternal morbidity and mortality, low birth weight, and neonatal mortality. These women can also have difficulty providing sufficient quantities of breast milk, leading to malnutrition among neonates (Ahmed et al. 2012).

Read Full Post »

Blood Transfusions

Larry H Bernstein, MD, FCAP, Curator

LPBI

What Is a Blood Transfusion?  

A blood transfusion is a safe, common procedure in which blood is given to you through an intravenous (IV) line in one of your blood vessels.

Blood transfusions are done to replace blood lost during surgery or due to a serious injury. A transfusion also may be done if your body can’t make blood properly because of an illness.

During a blood transfusion, a small needle is used to insert an IV line into one of your blood vessels. Through this line, you receive healthy blood. The procedure usually takes 1 to 4 hours, depending on how much blood you need.

Blood transfusions are very common. Each year, almost 5 million Americans need a blood transfusion. Most blood transfusions go well. Mild complications can occur. Very rarely, serious problems develop.

Blood is made up of various parts, including red blood cells, white blood cells, platelets (PLATE-lets), and plasma. Blood is transfused either as whole blood (with all its parts) or, more often, as individual parts.

Blood Types

Every person has one of the following blood types: A, B, AB, or O. Also, every person’s blood is either Rh-positive or Rh-negative. So, if you have type A blood, it’s either A positive or A negative.

The blood used in a transfusion must work with your blood type. If it doesn’t, antibodies (proteins) in your blood attack the new blood and make you sick.

Type O blood is safe for almost everyone. About 40 percent of the population has type O blood. People who have this blood type are called universal donors. Type O blood is used for emergencies when there’s no time to test a person’s blood type.

People who have type AB blood are called universal recipients. This means they can get any type of blood.

If you have Rh-positive blood, you can get Rh-positive or Rh-negative blood. But if you have Rh-negative blood, you should only get Rh-negative blood. Rh-negative blood is used for emergencies when there’s no time to test a person’s Rh type.
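The ABO/Rh rules above can be summarized in a short compatibility check. This is a minimal sketch of the red-cell antigen logic only (real transfusion practice also involves antibody screening and crossmatching), and the function and table names are ours.

```python
# Minimal sketch of the ABO/Rh red-cell compatibility rules described above.
# A transfusion is compatible when the donor's red cells carry no ABO antigen
# the recipient lacks, and the donor is not Rh-positive when the recipient is
# Rh-negative. Real crossmatching checks far more than this.

ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def compatible(donor, recipient):
    """donor and recipient are blood-type strings like 'O-' or 'AB+'."""
    d_abo, d_rh = donor[:-1], donor[-1]
    r_abo, r_rh = recipient[:-1], recipient[-1]
    abo_ok = ANTIGENS[d_abo] <= ANTIGENS[r_abo]  # donor antigens are a subset
    rh_ok = d_rh == "-" or r_rh == "+"           # Rh-negative can go to anyone
    return abo_ok and rh_ok

print(compatible("O-", "AB+"))  # True: O-negative red cells suit any recipient
print(compatible("A+", "O+"))   # False: the A antigen would be attacked
print(compatible("B+", "B-"))   # False: Rh-positive into an Rh-negative recipient
```

The subset rule reproduces the statements in the text: type O donors (no ABO antigens) are compatible with almost everyone, while AB-positive recipients (both antigens, Rh-positive) can receive any type.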

Blood Banks

Blood banks collect, test, and store blood. They carefully screen all donated blood for possible infectious agents, such as viruses, that could make you sick. (For more information, see “What Are the Risks of a Blood Transfusion?”)

Blood bank staff also screen each blood donation to find out whether it’s type A, B, AB, or O and whether it’s Rh-positive or Rh-negative. Getting a blood type that doesn’t work with your own blood type will make you very sick. That’s why blood banks are very careful when they test the blood.

To prepare blood for a transfusion, some blood banks remove white blood cells. This process is called white cell or leukocyte (LU-ko-site) reduction. Although rare, some people are allergic to white blood cells in donated blood. Removing these cells makes allergic reactions less likely.

Not all transfusions use blood donated from a stranger. If you’re going to have surgery, you may need a blood transfusion because of blood loss during the operation. If it’s surgery that you’re able to schedule months in advance, your doctor may ask whether you would like to use your own blood, rather than donated blood.

Alternatives to Blood Transfusions 

Researchers are trying to find ways to make blood. There’s currently no man-made alternative to human blood. However, researchers have developed medicines that may help do the job of some blood parts.

For example, some people who have kidney problems can now take a medicine called erythropoietin that helps their bodies make more red blood cells. This means they may need fewer blood transfusions.

Surgeons try to reduce the amount of blood lost during surgery so that fewer patients need blood transfusions. Sometimes they can collect and reuse the blood for the patient.

https://www.nhlbi.nih.gov/health/health-topics/topics/bt

Your options may be limited by time and health factors, so it is important to begin carrying out your decision as soon as possible. For example, if friends or family members are donating blood for a patient (directed donors), their blood should be drawn several days prior to the anticipated need to allow adequate time for testing and labeling. The exact protocols are hospital and donor site specific.

The safest blood product is your own, so if a transfusion is likely, this is your lowest-risk choice. Unfortunately, this option is usually practical only when preparing for elective surgery; in most other instances the acute need for blood means the patient cannot donate in advance. Although you have the right to refuse a blood transfusion, this decision may have life-threatening consequences. If you are deciding for your child, you as the parent or guardian must understand that in a life-threatening situation your doctors will act in your child’s best interest, in accordance with standards of medical care, to ensure your child’s health and wellbeing regardless of religious beliefs. Please carefully review this material and decide with your doctor which option(s) you prefer, understanding that your doctor will always act in the best interest of his or her patient.

To ensure a safe transfusion, make sure the healthcare provider who starts the transfusion verifies your name and matches it to the blood that is going to be transfused. Besides your name, a second personal identifier, usually your birthday, is checked. This assures the blood is given to the correct patient.

If during the transfusion you have symptoms such as shortness of breath, itching, fever or chills, or you just do not feel well, alert the person transfusing the blood immediately.

Blood can be provided from two sources: autologous blood (using your own blood) or donor blood (using someone else’s blood).

Autologous blood (using your own blood)

Pre-operative donation: donating your own blood before surgery. The blood bank draws your blood and stores it until you need it during or after surgery. This option is only for non-emergency (elective) surgery. It has the advantage of eliminating or minimizing the need for someone else’s blood during and after surgery. The disadvantage is that it requires advance planning, which may delay surgery. Some medical conditions may prevent the pre-operative donation of blood products.

Intra-operative autologous transfusion: recycling your blood during surgery. Blood lost during the operation is filtered and put back into your body. This can be done in emergency and elective surgeries. It has the advantage of eliminating or minimizing the need for someone else’s blood during surgery. Large amounts of blood can be recycled. This process cannot be used if cancer or infection is present.

Post-operative autologous transfusion: recycling your blood after surgery. Blood lost after surgery is collected, filtered, and returned to your body. This can be done in emergency and elective surgeries. It has the advantage of eliminating or minimizing the need for someone else’s blood after surgery. This process cannot be used in patients where cancer or infection is present.

Hemodilution: donating your own blood during surgery. Immediately before surgery, some of your blood is taken and replaced with IV fluids. After surgery, your blood is filtered and returned to you. This is done only for elective surgeries. This process dilutes your own blood so you lose less concentrated blood during surgery. It has the advantage of eliminating or minimizing the need for someone else’s blood during surgery. The disadvantage of this process is that only a limited amount of blood can be removed, and certain medical conditions may prevent the use of this technique.

Apheresis: donating your own platelets and plasma. Before surgery, your platelets and plasma, which help stop bleeding, are withdrawn, filtered and returned to you when you need it later. This can be done only for elective surgeries. This process may eliminate the need for donor platelets and plasma, especially in high blood-loss procedures. The disadvantage of this process is that some medical conditions may prevent apheresis, and in actual practice it has limited applications. 

http://www.medicinenet.com/blood_transfusion/article.htm

Diseases Requiring Blood Transfusion

Cancer

Cancer may decrease your body’s production of red blood cells, white blood cells and platelets by impacting the organs that influence blood count, such as the kidneys, bone marrow and the spleen. Radiation and chemotherapy drugs also can decrease components of the blood. Blood transfusions may be used to counter such effects.

Other illness

Some illnesses cause your body to make too few platelets or clotting factors. You may need transfusions of just those blood components to make up for low levels.

Infection, liver failure or severe burns

If you experience an infection, liver failure or severe burns, you may need a transfusion of plasma. Plasma is the liquid part of blood.

Blood disorders

People with blood diseases may receive transfusions of red blood cells, platelets or clotting factors.

Severe liver malfunction

If you have severe liver problems, you may receive a transfusion of albumin, a blood protein.

Risks

By Mayo Clinic Staff

Blood transfusions are generally considered safe, but they do carry some risk of complications. Complications may happen during the transfusion or not until weeks, months, or even years afterward. They include the following:

Allergic reaction and hives

If you have an allergic reaction to the transfusion, you may experience hives and itching during the procedure or very soon after. This type of reaction is usually treated with antihistamines. Rarely, a more serious allergic reaction causes difficulty breathing, low blood pressure and nausea.

Fever

If you quickly develop a fever during the transfusion, you may be having a febrile transfusion reaction. Your doctor will stop the transfusion to do further tests before deciding whether to continue. A febrile reaction can also occur shortly after the transfusion. Fever may be accompanied by chills and shaking.

Acute immune hemolytic reaction

This is a very rare but serious transfusion reaction in which your immune system attacks the transfused red blood cells because the donor blood type is not a good match; the mismatched cells are viewed as foreign and destroyed. The destroyed cells release a substance into your blood that harms your kidneys. This usually occurs during or right after a transfusion. Signs and symptoms include fever, nausea, chills, lower back or chest pain, and dark urine.

Lung injury

Transfusion-related acute lung injury (TRALI) is thought to occur due to antibodies or other biologic substances in the blood components. With TRALI, the lungs become damaged, making it difficult to breathe. Usually, TRALI occurs within one to six hours of the transfusion. People usually recover, especially when treated quickly. Most people who die after TRALI were very sick before the transfusion.

Bloodborne infections

Blood banks screen donors for risk factors and test donated blood to reduce the risk of transfusion-related infections, but such infections may still rarely occur. It can take weeks or months after a blood transfusion to determine that you’ve been infected with a virus, bacterium, or parasite.

The National Institutes of Health offers the following estimates for the risk of a blood donation carrying an infectious disease:

  • HIV — 1 in 2 million donations, which is lower than the risk of being killed by lightning
  • Hepatitis B — 1 in 205,000 donations
  • Hepatitis C — 1 in 2 million donations
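To put these per-donation figures in perspective, one can estimate the cumulative risk for a patient who receives several units, assuming (as a simplification) that each unit is an independent exposure. A minimal sketch:

```python
# Per-unit risk estimates quoted above (NIH figures).
PER_UNIT_RISK = {
    "HIV": 1 / 2_000_000,
    "Hepatitis B": 1 / 205_000,
    "Hepatitis C": 1 / 2_000_000,
}

def cumulative_risk(per_unit, units):
    """P(at least one infected unit) = 1 - (1 - p)^n, assuming independence."""
    return 1 - (1 - per_unit) ** units

# Example: approximate risk over a 10-unit transfusion course.
for agent, p in PER_UNIT_RISK.items():
    print(f"{agent}: about 1 in {round(1 / cumulative_risk(p, 10)):,}")
```

Because the per-unit probabilities are tiny, the 10-unit risk is very close to, but slightly less than, ten times the single-unit risk.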

Delayed hemolytic reaction

This type of reaction is similar to an acute immune hemolytic reaction, but it occurs much more slowly. Your body gradually attacks the donor red blood cells. It could take one to four weeks to notice a decrease in red blood cell levels.

Iron overload  

If you receive multiple blood transfusions, you may end up with too much iron in your blood. Iron overload (hemochromatosis) can damage parts of your body, including the liver and the heart. You may receive iron chelation therapy, which uses medication to remove excess iron.

Graft-versus-host disease

Transfusion-associated graft-versus-host disease is a very rare condition in which transfused white blood cells attack the recipient’s bone marrow. This disease is usually fatal. It is more likely to affect people with severely weakened immune systems, such as those being treated for leukemia or lymphoma. Signs and symptoms include fever, rash, diarrhea and abnormal liver function test results. Irradiating the blood before transfusing it reduces the risk.

Most of the donated blood collected by the American Red Cross is used for direct blood transfusions. Common types of blood transfusions include platelet, plasma, and red blood cell transfusions.

A patient suffering from an iron deficiency or anemia, a condition where the body does not have enough red blood cells, may receive a red blood cell transfusion. This type of transfusion increases a patient’s hemoglobin and iron levels while improving the body’s oxygen supply.

Platelets are a component of blood that helps stop bleeding. Patients suffering from leukemia or other types of cancer often have lower platelet counts as a side effect of their chemotherapy treatments. Patients whose illnesses prevent the body from making enough platelets have to get regular transfusions to stay healthy.

Plasma is the liquid part of the body’s blood. It contains important proteins and other substances crucial to one’s overall health. Plasma transfusions are used for patients with liver failure, severe infections, and serious burns.



Erythropoietin

Larry H. Bernstein, MD, FCAP, Curator

LPBI

 Erythropoietin test

The erythropoietin test measures the amount of a hormone called erythropoietin (EPO) in blood.

The hormone tells stem cells in the bone marrow to make more red blood cells. EPO is made by cells in the kidney. These cells release more EPO when blood oxygen levels are low.

https://www.nlm.nih.gov/medlineplus/ency/article/003683.htm

The Story of Erythropoietin

Millions of patients worldwide have benefited from research on erythropoietin spanning many decades. In the last 15 years, epoetin alfa (Epo) has become one of the most widely used drugs created through recombinant DNA technology, in which a nearly identical form of a substance that naturally occurs in the body – in this case, erythropoietin – is created by replicating human DNA in a laboratory. Epo is used to treat anemia, a shortage of red blood cells. Since red blood cells carry oxygen to the tissues and organs, anemia causes symptoms such as weakness, fatigue, and shortness of breath. Epo treats this condition by imitating the action of the hormone erythropoietin, stimulating the body to produce more red blood cells. Patients who may benefit from Epo therapy include those with chronic kidney disease, those who are anemic from AIDS or from a wide variety of hematologic disorders (including multiple myeloma and myelodysplastic syndromes), and some cancer patients who are anemic from receiving chemotherapy. In selected patients, Epo may be used to reduce the need for blood transfusions in surgery.

A century ago, two French investigators reported that small amounts of plasma from anemic rabbits injected into normal animals caused an increase in red blood cell production (erythropoiesis) within a few hours. They referred to this activity as hemopoietine. Over time, as investigators became more convinced that this red-blood-cell stimulating activity was caused by a single protein in the blood plasma, they gave it a variety of names – erythropoietic-stimulating activity, erythropoietic-stimulating factor, and, ultimately, “erythropoietin.”

It wasn’t until the 1950s and ’60s that several American investigators again took up the concept that a hormone regulated red cell production. Refining the work of the French scientists, the American investigators conclusively showed that a hormone stimulated red cell production, that the kidneys were the primary source of erythropoietin, and that low oxygen was the main driver of erythropoietin production. Soon, researchers found that patients with anemia responded by increasing their levels of erythropoietin to stimulate increased red blood cell production. Patients who required an increase in red blood cells in order to make up for low oxygen levels in the blood (such as patients with lung disease or patients living at high altitudes) also had elevated erythropoietin levels.

At the same time, other technologies were being developed that set the stage for a remarkable breakthrough involving a combination of medical and molecular engineering. In the early 1960s came the development of hemodialysis, a method of removing waste products from the blood when the kidneys are unable to perform this function, to sustain the lives of patients with end-stage kidney disease. As a result of this treatment advance, these patients were able to survive the underlying disease, but their damaged kidneys could no longer make erythropoietin, leaving them severely anemic and in desperate need of Epo therapy.

In 1983, scientists discovered a method for mass producing a synthetic version of the hormone. Experiments were conducted to test the safety and effectiveness of the new drug, Epo, for treating anemia in patients with kidney failure. The results of these early clinical trials were dramatic. Patients who had been dependent on frequent blood transfusions were able to increase their red blood cell levels to near-normal within just a few weeks of starting therapy. Patients’ appetites returned, and they resumed their active lives. It was the convergence of two technologies – long-term dialysis and molecular biology – that set the stage for anemia management in this group of patients. Since then, millions of patients worldwide have benefited from Epo therapy.

This article was published in December 2008 as part of the special ASH anniversary brochure, 50 Years in Hematology: Research That Revolutionized Patient Care.

http://www.hematology.org/About/History/50-Years/1532.aspx

Erythropoietin Stimulating Agents

https://my.clevelandclinic.org/health/diseases_conditions/hic_Anemia/hic_erythropoietin-stimulating_agents

What is erythropoietin?

Red blood cells are produced in the bone marrow (the spongy tissue inside the bone). In order to make red blood cells, the body maintains an adequate supply of erythropoietin (EPO), a hormone that is produced by the kidney.

EPO helps make red blood cells. Having more red blood cells raises your hemoglobin levels. Hemoglobin is the protein in red blood cells that helps blood carry oxygen throughout the body.

Anemia is a disorder that occurs when there is not enough hemoglobin in a person’s blood. There are several different causes of anemia. For instance, anemia can be caused by the body’s inability to produce enough EPO to make red blood cells. If this is the case, the person may have to have a blood transfusion to treat this type of anemia. If you have anemia, your physician can determine the cause.

What is recombinant erythropoietin?

In cases where transfusions are not an option—for example, when the patient cannot have, or refuses, a transfusion—it may be necessary to give the patient recombinant erythropoietin. Recombinant erythropoietin is a man-made version of natural erythropoietin. It is produced by cloning the gene for erythropoietin.

Recombinant erythropoietin drugs are known as erythropoietin-stimulating agents (ESAs). These drugs are given by injection (shot) and work by stimulating the production of more red blood cells. These cells are then released from the bone marrow into the bloodstream.

There are two ESAs on the U.S. market: epoetin alfa (Procrit®, Epogen®) and darbepoetin alfa (Aranesp®).

Who receives ESAs?

ESAs are usually given to patients who have chronic (long-lasting) kidney disease or end-stage renal (kidney) disease. These patients usually have lower hemoglobin levels because they can’t produce enough erythropoietin.

ESAs are also prescribed for patients who have cancer. These patients often have anemia, which can be caused by chemotherapy.

What are the side effects of ESAs?

The side effects that occur most often with ESA use include:

  • High blood pressure
  • Swelling
  • Fever
  • Dizziness
  • Nausea
  • Pain at the site of the injection.

What should the patient consider before using ESAs?

There are several safety issues with ESAs:

  • ESAs increase the risk of venous thromboembolism (blood clots in the veins). A blood clot can break away from one location and travel to the lung (pulmonary embolism), where it can block circulation. Symptoms of blood clots include chest pain, shortness of breath, pain in the legs, and sudden numbness or weakness in the face, arm, or leg.
  • ESAs can cause hemoglobin to rise too high, which puts the patient at higher risk for heart attack, stroke, heart failure, and death.
  • In patients who have cancer, ESAs may cause the tumor to grow. If ESAs are used for these patients, they are usually stopped after the patient’s chemotherapy is finished.
  • The health care provider will keep an eye on the patient’s blood cell counts to make sure they do not put him or her at a higher risk. The dosing may change, depending on the patient’s needs.

Patients who have the following conditions need to consult with their health care provider if an ESA is being considered as part of the treatment plan:

  • Heart disease
  • High blood pressure
  • Porphyria (a group of diseases that are caused by enzyme deficiencies)
  • Seizures
  • An allergy to epoetin alfa or any other part of this medicine
  • Uncontrolled high blood pressure

In addition, women who are pregnant, planning to become pregnant, or breastfeeding should consult with their health care provider before taking an ESA.

Other issues to consider:

  • Transfusions may improve symptoms of anemia right away. ESAs may take from weeks to months to provide noticeable relief of the symptoms of anemia.
  • If a patient has several transfusions, he or she can develop an “iron overload,” or high iron levels. This is a serious medical problem.
  • Iron supplements are often needed for patients who are on ESAs.
  • Keep your health care provider informed about any change in your condition.
  • Check your blood pressure and heart rate as recommended by your health care provider.
  • Remain informed about the results from any blood work that is done.
  • The body may develop antibodies to an ESA. If this happens, the antibodies will block or lessen the body’s ability to make red blood cells, which could result in anemia. It is important that the patient keep the health care provider informed of any unusual tiredness, lack of energy, dizziness, or fainting.


Metabolic Genomics and Pharmaceutics, Vol. 1 of BioMed Series D available on Amazon Kindle

Reporter: Stephen S Williams, PhD

 

Leaders in Pharmaceutical Business Intelligence would like to announce the First volume of their BioMedical E-Book Series D:

Metabolic Genomics & Pharmaceutics, Vol. I

which is now available on Amazon Kindle at

http://www.amazon.com/dp/B012BB0ZF0.

This e-Book is a comprehensive review of recent Original Research on METABOLOMICS and related opportunities for Targeted Therapy, written by expert authors and curators. It is the first volume of Series D: e-Books on BioMedicine – Metabolomics, Immunology, Infectious Diseases. It is written for comprehension at the third-year medical student level, or as a reference for licensing board exams, but also for the education of a first-time baccalaureate-degree reader in the biological sciences. Hopefully, it can be read with great interest by an undergraduate student who is undecided in the choice of a career. The Methodology of Curation adds value to the results of the Original Research for the e-Reader. The e-Book’s articles have been published in the Open Access Online Scientific Journal since April 2012, and new articles on this subject will continue to be incorporated as published, with periodic updates.

We invite e-Readers to write Article Reviews on Amazon for this e-Book.

All forthcoming BioMed e-Book Titles can be viewed at:

http://pharmaceuticalintelligence.com/biomed-e-books/

Leaders in Pharmaceutical Business Intelligence launched its Open Access Online Scientific Journal in April 2012 as a scientific, medical, and business multi-expert authoring environment in several domains of the life sciences, pharmaceutical, healthcare & medicine industries. The venture operates as an online scientific intellectual exchange at its website http://pharmaceuticalintelligence.com, curating and reporting on frontiers in the biomedical and biological sciences, healthcare economics, pharmacology, pharmaceuticals & medicine. In addition, the venture publishes a Medical E-book Series available on Amazon’s Kindle platform.

Analyzing and sharing the vast and rapidly expanding volume of scientific knowledge has never been so crucial to innovation in the medical field. We are addressing the need to overcome this scientific information overload by:

  • delivering curation and summary interpretations of the latest findings and innovations on an open-access, Web 2.0 platform, with the future goal of providing primarily concept-driven search
  • providing a social platform for scientists and clinicians to enter into discussion using social media
  • compiling recent discoveries and issues in yearly-updated Medical E-book Series on Amazon’s mobile Kindle platform

By providing these hybrid networks, this curation offers better organization of, and visibility into, the critical information that supports the next innovations in academic, clinical, and industrial research.

Table of Contents for Metabolic Genomics & Pharmaceutics, Vol. I

Chapter 1: Metabolic Pathways

Chapter 2: Lipid Metabolism

Chapter 3: Cell Signaling

Chapter 4: Protein Synthesis and Degradation

Chapter 5: Sub-cellular Structure

Chapter 6: Proteomics

Chapter 7: Metabolomics

Chapter 8: Impairments in Pathological States: Endocrine Disorders; Stress Hypermetabolism and Cancer

Chapter 9: Genomic Expression in Health and Disease 

 

Summary 

Epilogue

 

 


Treatment for Chronic Leukemias [2.4.4B]

Larry H. Bernstein, MD, FCAP, Author, Curator, Editor

http://pharmaceuticalintelligence.com/2015/8/11/larryhbern/Treatment-for-Chronic-Leukemias-[2.4.4B]

2.4.4B1 Treatment for CML

Chronic Myelogenous Leukemia Treatment (PDQ®)

http://www.cancer.gov/cancertopics/pdq/treatment/CML/Patient/page4

Treatment Option Overview

Key Points for This Section

There are different types of treatment for patients with chronic myelogenous leukemia.

Six types of standard treatment are used:

  1. Targeted therapy
  2. Chemotherapy
  3. Biologic therapy
  4. High-dose chemotherapy with stem cell transplant
  5. Donor lymphocyte infusion (DLI)
  6. Surgery

New types of treatment are being tested in clinical trials.

Patients may want to think about taking part in a clinical trial.

Patients can enter clinical trials before, during, or after starting their cancer treatment.

Follow-up tests may be needed.

There are different types of treatment for patients with chronic myelogenous leukemia.

Different types of treatment are available for patients with chronic myelogenous leukemia (CML). Some treatments are standard (the currently used treatment), and some are being tested in clinical trials. A treatment clinical trial is a research study meant to help improve current treatments or obtain information about new treatments for patients with cancer. When clinical trials show that a new treatment is better than the standard treatment, the new treatment may become the standard treatment. Patients may want to think about taking part in a clinical trial. Some clinical trials are open only to patients who have not started treatment.

Six types of standard treatment are used:

Targeted therapy

Targeted therapy is a type of treatment that uses drugs or other substances to identify and attack specific cancer cells without harming normal cells. Tyrosine kinase inhibitors are targeted therapy drugs used to treat chronic myelogenous leukemia.

Imatinib mesylate, nilotinib, dasatinib, and ponatinib are tyrosine kinase inhibitors that are used to treat CML.

See Drugs Approved for Chronic Myelogenous Leukemia for more information.

Chemotherapy

Chemotherapy is a cancer treatment that uses drugs to stop the growth of cancer cells, either by killing the cells or by stopping them from dividing. When chemotherapy is taken by mouth or injected into a vein or muscle, the drugs enter the bloodstream and can reach cancer cells throughout the body (systemic chemotherapy). When chemotherapy is placed directly into the cerebrospinal fluid, an organ, or a body cavity such as the abdomen, the drugs mainly affect cancer cells in those areas (regional chemotherapy). The way the chemotherapy is given depends on the type and stage of the cancer being treated.

See Drugs Approved for Chronic Myelogenous Leukemia for more information.

Biologic therapy

Biologic therapy is a treatment that uses the patient’s immune system to fight cancer. Substances made by the body or made in a laboratory are used to boost, direct, or restore the body’s natural defenses against cancer. This type of cancer treatment is also called biotherapy or immunotherapy.

See Drugs Approved for Chronic Myelogenous Leukemia for more information.

High-dose chemotherapy with stem cell transplant

High-dose chemotherapy with stem cell transplant is a method of giving high doses of chemotherapy and replacing blood-forming cells destroyed by the cancer treatment. Stem cells (immature blood cells) are removed from the blood or bone marrow of the patient or a donor and are frozen and stored. After the chemotherapy is completed, the stored stem cells are thawed and given back to the patient through an infusion. These reinfused stem cells grow into (and restore) the body’s blood cells.

See Drugs Approved for Chronic Myelogenous Leukemia for more information.

Donor lymphocyte infusion (DLI)

Donor lymphocyte infusion (DLI) is a cancer treatment that may be used after stem cell transplant. Lymphocytes (a type of white blood cell) from the stem cell transplant donor are removed from the donor’s blood and may be frozen for storage. The donor’s lymphocytes are thawed if they were frozen and then given to the patient through one or more infusions. The lymphocytes see the patient’s cancer cells as not belonging to the body and attack them.

Surgery

Splenectomy

What’s new in chronic myeloid leukemia research and treatment?

http://www.cancer.org/cancer/leukemia-chronicmyeloidcml/detailedguide/leukemia-chronic-myeloid-myelogenous-new-research

Combining the targeted drugs with other treatments

Imatinib and other drugs that target the BCR-ABL protein have proven to be very effective, but by themselves these drugs don’t help everyone. Studies are now in progress to see if combining these drugs with other treatments, such as chemotherapy, interferon, or cancer vaccines (see below), might be better than either one alone. One study showed that giving interferon with imatinib worked better than giving imatinib alone, though the 2 drugs together had more side effects. It is also not clear if this combination is better than treatment with other tyrosine kinase inhibitors (TKIs), such as dasatinib and nilotinib. A study going on now is looking at combining interferon with nilotinib.

Other studies are looking at combining other drugs, such as cyclosporine or hydroxychloroquine, with a TKI.

New drugs for CML

Because researchers now know the main cause of CML (the BCR-ABL gene and its protein), they have been able to develop many new drugs that might work against it.

In some cases, CML cells develop a change in the BCR-ABL oncogene known as a T315I mutation, which makes them resistant to many of the current targeted therapies (imatinib, dasatinib, and nilotinib). Ponatinib is the only TKI that can work against T315I mutant cells. More drugs aimed at this mutation are now being tested.

Other drugs called farnesyl transferase inhibitors, such as lonafarnib and tipifarnib, seem to have some activity against CML and patients may respond when these drugs are combined with imatinib. These drugs are being studied further.

Other drugs being studied in CML include the histone deacetylase inhibitor panobinostat and the proteasome inhibitor bortezomib (Velcade).

Several vaccines are now being studied for use against CML.

2.4.4B2 Chronic Lymphocytic Leukemia

Chronic Lymphocytic Leukemia Treatment (PDQ®)

General Information About Chronic Lymphocytic Leukemia

Key Points for This Section

  1. Chronic lymphocytic leukemia is a type of cancer in which the bone marrow makes too many lymphocytes (a type of white blood cell).
  2. Leukemia may affect red blood cells, white blood cells, and platelets.
  3. Older age can affect the risk of developing chronic lymphocytic leukemia.
  4. Signs and symptoms of chronic lymphocytic leukemia include swollen lymph nodes and tiredness.
  5. Tests that examine the blood, bone marrow, and lymph nodes are used to detect (find) and diagnose chronic lymphocytic leukemia.
  6. Certain factors affect treatment options and prognosis (chance of recovery).
Chronic lymphocytic leukemia is a type of cancer in which the bone marrow makes too many lymphocytes (a type of white blood cell).

Chronic lymphocytic leukemia (also called CLL) is a blood and bone marrow disease that usually gets worse slowly. CLL is one of the most common types of leukemia in adults. It often occurs during or after middle age; it rarely occurs in children.

http://www.cancer.gov/images/cdr/live/CDR755927-750.jpg

Anatomy of the bone; drawing shows spongy bone, red marrow, and yellow marrow. A cross section of the bone shows compact bone and blood vessels in the bone marrow. Also shown are red blood cells, white blood cells, platelets, and a blood stem cell.

Anatomy of the bone. The bone is made up of compact bone, spongy bone, and bone marrow. Compact bone makes up the outer layer of the bone. Spongy bone is found mostly at the ends of bones and contains red marrow. Bone marrow is found in the center of most bones and has many blood vessels. There are two types of bone marrow: red and yellow. Red marrow contains blood stem cells that can become red blood cells, white blood cells, or platelets. Yellow marrow is made mostly of fat.

Leukemia may affect red blood cells, white blood cells, and platelets.

Normally, the body makes blood stem cells (immature cells) that become mature blood cells over time. A blood stem cell may become a myeloid stem cell or a lymphoid stem cell.

A myeloid stem cell becomes one of three types of mature blood cells:

  1. Red blood cells that carry oxygen and other substances to all tissues of the body.
  2. White blood cells that fight infection and disease.
  3. Platelets that form blood clots to stop bleeding.

A lymphoid stem cell becomes a lymphoblast cell and then one of three types of lymphocytes (white blood cells):

  1. B lymphocytes that make antibodies to help fight infection.
  2. T lymphocytes that help B lymphocytes make antibodies to fight infection.
  3. Natural killer cells that attack cancer cells and viruses.

http://www.cancer.gov/images/cdr/live/CDR526538-750.jpg

Blood cell development; drawing shows the steps a blood stem cell goes through to become a red blood cell, platelet, or white blood cell. A myeloid stem cell becomes a red blood cell, a platelet, or a myeloblast, which then becomes a granulocyte (the types of granulocytes are eosinophils, basophils, and neutrophils). A lymphoid stem cell becomes a lymphoblast and then becomes a B-lymphocyte, T-lymphocyte, or natural killer cell.

Blood cell development. A blood stem cell goes through several steps to become a red blood cell, platelet, or white blood cell.

In CLL, too many blood stem cells become abnormal lymphocytes and do not become healthy white blood cells. The abnormal lymphocytes may also be called leukemia cells. The lymphocytes are not able to fight infection very well. Also, as the number of lymphocytes increases in the blood and bone marrow, there is less room for healthy white blood cells, red blood cells, and platelets. This may cause infection, anemia, and easy bleeding.

This summary is about chronic lymphocytic leukemia. See the following PDQ summaries for more information about leukemia:

  • Adult Acute Lymphoblastic Leukemia Treatment.
  • Childhood Acute Lymphoblastic Leukemia Treatment.
  • Adult Acute Myeloid Leukemia Treatment.
  • Childhood Acute Myeloid Leukemia/Other Myeloid Malignancies Treatment.
  • Chronic Myelogenous Leukemia Treatment.
  • Hairy Cell Leukemia Treatment.

Older age can affect the risk of developing chronic lymphocytic leukemia.

Anything that increases your risk of getting a disease is called a risk factor. Having a risk factor does not mean that you will get cancer; not having risk factors doesn’t mean that you will not get cancer. Talk with your doctor if you think you may be at risk. Risk factors for CLL include the following:

  • Being middle-aged or older, male, or white.
  • A family history of CLL or cancer of the lymph system.
  • Having relatives who are Russian Jews or Eastern European Jews.

Signs and symptoms of chronic lymphocytic leukemia include swollen lymph nodes and tiredness.

Usually CLL does not cause any signs or symptoms and is found during a routine blood test. Signs and symptoms may be caused by CLL or by other conditions. Check with your doctor if you have any of the following:

  • Painless swelling of the lymph nodes in the neck, underarm, stomach, or groin.
  • Feeling very tired.
  • Pain or fullness below the ribs.
  • Fever and infection.
  • Weight loss for no known reason.

Tests that examine the blood, bone marrow, and lymph nodes are used to detect (find) and diagnose chronic lymphocytic leukemia.

The following tests and procedures may be used:

Physical exam and history : An exam of the body to check general signs of health, including checking for signs of disease, such as lumps or anything else that seems unusual. A history of the patient’s health habits and past illnesses and treatments will also be taken.

Complete blood count (CBC) with differential : A procedure in which a sample of blood is drawn and checked for the following:

  • The number of red blood cells and platelets.
  • The number and type of white blood cells.
  • The amount of hemoglobin (the protein that carries oxygen) in the red blood cells.
  • The portion of the blood sample made up of red blood cells.

Results from the Phase 3 Resonate™ Trial

Significantly improved progression free survival (PFS) vs ofatumumab in patients with previously treated CLL

  • Patients taking IMBRUVICA® had a statistically significant 78% reduction in the risk of disease progression or death compared with patients who received ofatumumab1
  • In patients with previously treated del 17p CLL, median PFS was not yet reached with IMBRUVICA® vs 5.8 months with ofatumumab (HR 0.25; 95% CI: 0.14, 0.45)1

Significantly prolonged overall survival (OS) with IMBRUVICA® vs ofatumumab in patients with previously treated CLL

  • In patients with previously treated CLL, those taking IMBRUVICA® had a statistically significant 57% reduction in the risk of death compared with those who received ofatumumab (HR 0.43; 95% CI: 0.24, 0.79; P<0.05)1
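The percentage reductions quoted above can be read directly off the reported hazard ratios, since relative risk reduction ≈ 1 − HR. A minimal sketch (the helper name is hypothetical, not from the trial publication):

```python
def relative_risk_reduction(hazard_ratio: float) -> float:
    """Relative reduction in event risk implied by a hazard ratio (1 - HR)."""
    return 1.0 - hazard_ratio

# HR 0.43 for overall survival -> the 57% reduction in risk of death quoted above
print(f"{relative_risk_reduction(0.43):.0%}")  # 57%
# HR 0.25 for PFS in del 17p CLL -> a 75% reduction
print(f"{relative_risk_reduction(0.25):.0%}")  # 75%
```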

Typical treatment of chronic lymphocytic leukemia

http://www.cancer.org/cancer/leukemia-chroniclymphocyticcll/detailedguide/leukemia-chronic-lymphocytic-treating-treatment-by-risk-group

Treatment options for chronic lymphocytic leukemia (CLL) vary greatly, depending on the person’s age, the disease risk group, and the reason for treating (for example, which symptoms it is causing). Many people live a long time with CLL, but in general it is very difficult to cure, and early treatment hasn’t been shown to help people live longer. Because of this and because treatment can cause side effects, doctors often advise waiting until the disease is progressing or bothersome symptoms appear, before starting treatment.

If treatment is needed, factors that should be taken into account include the patient’s age, general health, and prognostic factors such as the presence of chromosome 17 or chromosome 11 deletions or high levels of ZAP-70 and CD38.

Initial treatment

Patients who might not be able to tolerate the side effects of strong chemotherapy (chemo) are often treated with chlorambucil alone or with a monoclonal antibody targeting CD20, such as rituximab (Rituxan) or obinutuzumab (Gazyva). Other options include rituximab alone or a corticosteroid such as prednisone.

In stronger and healthier patients, there are many options for treatment. Commonly used treatments include:

  • FCR: fludarabine (Fludara), cyclophosphamide (Cytoxan), and rituximab
  • Bendamustine (sometimes with rituximab)
  • FR: fludarabine and rituximab
  • CVP: cyclophosphamide, vincristine, and prednisone (sometimes with rituximab)
  • CHOP: cyclophosphamide, doxorubicin, vincristine (Oncovin), and prednisone
  • Chlorambucil combined with prednisone, rituximab, obinutuzumab, or ofatumumab
  • PCR: pentostatin (Nipent), cyclophosphamide, and rituximab
  • Alemtuzumab (Campath)
  • Fludarabine (alone)

Other drugs or combinations of drugs may also be used.

If the only problem is an enlarged spleen or swollen lymph nodes in one region of the body, localized treatment with low-dose radiation therapy may be used. Splenectomy (surgery to remove the spleen) is another option if the enlarged spleen is causing symptoms.

Sometimes very high numbers of leukemia cells in the blood cause problems with normal circulation. This is called leukostasis. Chemo may not lower the number of cells until a few days after the first dose, so before the chemo is given, some of the cells may be removed from the blood with a procedure called leukapheresis. This treatment lowers blood counts right away. The effect lasts only for a short time, but it may help until the chemo has a chance to work. Leukapheresis is also sometimes used before chemo if there are very high numbers of leukemia cells (even when they aren’t causing problems) to prevent tumor lysis syndrome (this was discussed in the chemotherapy section).

Some people who have very high-risk disease (based on prognostic factors) may be referred for possible stem cell transplant (SCT) early in treatment.

Second-line treatment of CLL

If the initial treatment is no longer working or the disease comes back, another type of treatment may help. If the initial response to the treatment lasted a long time (usually at least a few years), the same treatment can often be used again. If the initial response wasn’t long-lasting, using the same treatment again isn’t as likely to be helpful. The options will depend on what the first-line treatment was and how well it worked, as well as the person’s health.

Many of the drugs and combinations listed above may be options as second-line treatments. For many people who have already had fludarabine, alemtuzumab seems to be helpful as second-line treatment, but it carries an increased risk of infections. Other purine analog drugs, such as pentostatin or cladribine (2-CdA), may also be tried. Newer drugs such as ofatumumab, ibrutinib (Imbruvica), and idelalisib (Zydelig) may be other options.

If the leukemia responds, stem cell transplant may be an option for some patients.

Some people may have a good response to first-line treatment (such as fludarabine) but may still have some evidence of a small number of leukemia cells in the blood, bone marrow, or lymph nodes. This is known as minimal residual disease. CLL can’t be cured, so doctors aren’t sure if further treatment right away will be helpful. Some small studies have shown that alemtuzumab can sometimes help get rid of these remaining cells, but it’s not yet clear if this improves survival.

Treating complications of CLL

One of the most serious complications of CLL is a change (transformation) of the leukemia to a high-grade or aggressive type of non-Hodgkin lymphoma called diffuse large cell lymphoma. This happens in about 5% of CLL cases, and is known as Richter syndrome. Treatment is often the same as it would be for lymphoma (see our document called Non-Hodgkin Lymphoma for more information), and may include stem cell transplant, as these cases are often hard to treat.

Less often, CLL may transform to prolymphocytic leukemia. As with Richter syndrome, these cases can be hard to treat. Some studies have suggested that certain drugs such as cladribine (2-CdA) and alemtuzumab may be helpful.

In rare cases, patients with CLL may have their leukemia transform into acute lymphocytic leukemia (ALL). If this happens, treatment is likely to be similar to that used for patients with ALL (see our document called Leukemia: Acute Lymphocytic).

Acute myeloid leukemia (AML) is another rare complication in patients who have been treated for CLL. Drugs such as chlorambucil and cyclophosphamide can damage the DNA of blood-forming cells. These damaged cells may go on to become cancerous, leading to AML, which is very aggressive and often hard to treat (see our document called Leukemia: Acute Myeloid).

CLL can cause problems with low blood counts and infections. Treatment of these problems was discussed in the section “Supportive care in chronic lymphocytic leukemia.”


Treatments other than Chemotherapy for Leukemias and Lymphomas

Author, Curator, Editor: Larry H. Bernstein, MD, FCAP

2.5.1 Radiation Therapy 

http://www.lls.org/treatment/types-of-treatment/radiation-therapy

Radiation therapy, also called radiotherapy or irradiation, can be used to treat leukemia, lymphoma, myeloma and myelodysplastic syndromes. The type of radiation used for radiotherapy (ionizing radiation) is the same that’s used for diagnostic x-rays. Radiotherapy, however, is given in higher doses.

Radiotherapy works by damaging the genetic material (DNA) within cells, which prevents them from growing and reproducing. Although the radiotherapy is directed at cancer cells, it can also damage nearby healthy cells. However, modern radiotherapy techniques minimize “scatter” to nearby tissues, so the benefit of destroying the cancer cells outweighs the risk of harming healthy cells.

When radiotherapy is used for blood cancer treatment, it’s usually part of a treatment plan that includes drug therapy. Radiotherapy can also be used to relieve pain or discomfort caused by an enlarged liver, lymph node(s) or spleen.

Radiotherapy, either alone or with chemotherapy, is sometimes given as conditioning treatment to prepare a patient for a blood or marrow stem cell transplant. The most common types used to treat blood cancer are external beam radiation (see below) and radioimmunotherapy.
External Beam Radiation

External beam radiation is the type of radiotherapy used most often for people with blood cancers. A focused radiation beam is delivered outside the body by a machine called a linear accelerator, or linac for short. The linear accelerator moves around the body to deliver radiation from various angles. Linear accelerators make it possible to decrease or avoid skin reactions and deliver targeted radiation to lessen “scatter” of radiation to nearby tissues.

The dose (total amount) of radiation used during treatment depends on various factors regarding the patient, disease and reason for treatment, and is established by a radiation oncologist. You may receive radiotherapy during a series of visits, spread over several weeks (from two to 10 weeks, on average). This approach, called dose fractionation, lessens side effects. External beam radiation does not make you radioactive.
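The fractionation arithmetic is straightforward: total dose divided by dose per fraction gives the number of visits, and five fractions per week is a common scheduling assumption. A toy sketch with illustrative numbers only (not a clinical prescription; the function name is hypothetical):

```python
import math

def fractionation_schedule(total_dose_gy: float, dose_per_fraction_gy: float,
                           fractions_per_week: int = 5) -> tuple:
    """Number of fractions and calendar weeks needed to deliver a total dose."""
    n_fractions = math.ceil(total_dose_gy / dose_per_fraction_gy)
    weeks = n_fractions / fractions_per_week
    return n_fractions, weeks

# e.g. 30 Gy delivered in 2 Gy fractions, 5 visits per week
print(fractionation_schedule(30, 2))  # (15, 3.0) -> 15 fractions over 3 weeks
```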

2.5.2  Bone marrow (BM) transplantation

http://www.nlm.nih.gov/medlineplus/ency/article/003009.htm

There are three kinds of bone marrow transplants:

Autologous bone marrow transplant: The term auto means self. Stem cells are removed from you before you receive high-dose chemotherapy or radiation treatment. The stem cells are stored in a freezer (cryopreservation). After high-dose chemotherapy or radiation treatments, your stem cells are put back in your body to make (regenerate) normal blood cells. This is called a rescue transplant.

Allogeneic bone marrow transplant: The term allo means other. Stem cells are removed from another person, called a donor. Most times, the donor’s genes must at least partly match your genes. Special blood tests are done to see if a donor is a good match for you. A brother or sister is most likely to be a good match. Sometimes parents, children, and other relatives are good matches. Donors who are not related to you may be found through national bone marrow registries.

Umbilical cord blood transplant: This is a type of allogeneic transplant. Stem cells are removed from a newborn baby’s umbilical cord right after birth. The stem cells are frozen and stored until they are needed for a transplant. Umbilical cord blood cells are very immature so there is less of a need for matching. But blood counts take much longer to recover.

Before the transplant, chemotherapy, radiation, or both may be given. This may be done in two ways:

Ablative (myeloablative) treatment: High-dose chemotherapy, radiation, or both are given to kill any cancer cells. This also kills all healthy bone marrow that remains, and allows new stem cells to grow in the bone marrow.

Reduced-intensity treatment, also called a mini transplant: Patients receive lower doses of chemotherapy and radiation before a transplant. This allows older patients and those with other health problems to have a transplant.

A stem cell transplant is usually done after chemotherapy and radiation are complete. The stem cells are delivered into your bloodstream, usually through a tube called a central venous catheter. The process is similar to getting a blood transfusion. The stem cells travel through the blood into the bone marrow. Most times, no surgery is needed.

Donor stem cells can be collected in two ways:

  • Bone marrow harvest. This minor surgery is done under general anesthesia. This means the donor will be asleep and pain-free during the procedure. The bone marrow is removed from the back of both hip bones. The amount of marrow removed depends on the weight of the person who is receiving it.
  • Leukapheresis. First, the donor is given 5 days of shots to help stem cells move from the bone marrow into the blood. During leukapheresis, blood is removed from the donor through an IV line in a vein. The part of white blood cells that contains stem cells is then separated in a machine and removed to be later given to the recipient. The red blood cells are returned to the donor.

Why the Procedure is Performed

A bone marrow transplant replaces bone marrow that either is not working properly or has been destroyed (ablated) by chemotherapy or radiation. Doctors believe that for many cancers, the donor’s white blood cells can attach to any remaining cancer cells, similar to when white cells attach to bacteria or viruses when fighting an infection.

Your doctor may recommend a bone marrow transplant if you have:

Certain cancers, such as leukemia, lymphoma, and multiple myeloma

A disease that affects the production of bone marrow cells, such as aplastic anemia, congenital neutropenia, severe immunodeficiency syndromes, sickle cell anemia, thalassemia

Had chemotherapy that destroyed your bone marrow

2.5.3 Autologous stem cell transplantation

Phase II trial of 131I-B1 (anti-CD20) antibody therapy with autologous stem cell transplantation for relapsed B cell lymphomas

O.W Press,  F Appelbaum,  P.J Martin, et al.
http://www.thelancet.com/journals/lancet/article/PIIS0140-6736(95)92225-3/abstract

25 patients with relapsed B-cell lymphomas were evaluated with trace-labelled doses (2·5 mg/kg, 185-370 MBq [5-10 mCi]) of 131I-labelled anti-CD20 (B1) antibody in a phase II trial. 22 patients achieved 131I-B1 biodistributions delivering higher doses of radiation to tumor sites than to normal organs and 21 of these were treated with therapeutic infusions of 131I-B1 (12·765-29·045 GBq) followed by autologous hemopoietic stem cell reinfusion. 18 of the 21 treated patients had objective responses, including 16 complete remissions. One patient died of progressive lymphoma and one died of sepsis. Analysis of our phase I and II trials with 131I-labelled B1 reveals a progression-free survival of 62% and an overall survival of 93% with a median follow-up of 2 years. 131I-anti-CD20 (B1) antibody therapy produces complete responses of long duration in most patients with relapsed B-cell lymphomas when given at maximally tolerated doses with autologous stem cell rescue.

Autologous (Self) Transplants

http://www.leukaemia.org.au/treatments/stem-cell-transplants/autologous-self-transplants

An autologous transplant (or rescue) is a type of transplant that uses the person’s own stem cells. These cells are collected in advance and returned at a later stage. They are used to replace stem cells that have been damaged by high doses of chemotherapy, used to treat the person’s underlying disease.

In most cases, stem cells are collected directly from the bloodstream. While stem cells normally live in your marrow, a combination of chemotherapy and a growth factor (a drug that stimulates stem cells) called Granulocyte Colony Stimulating Factor (G-CSF) is used to expand the number of stem cells in the marrow and cause them to spill out into the circulating blood. From here they can be collected from a vein by passing the blood through a special machine called a cell separator, in a process similar to dialysis.

Most of the side effects of an autologous transplant are caused by the conditioning therapy used. Although they can be very unpleasant at times it is important to remember that most of them are temporary and reversible.

Procedure of Hematopoietic Stem Cell Transplantation

Hematopoietic stem cell transplantation (HSCT) is the transplantation of multipotent hematopoietic stem cells, usually derived from bone marrow, peripheral blood, or umbilical cord blood. It may be autologous (the patient’s own stem cells are used) or allogeneic (the stem cells come from a donor).

Hematopoietic Stem Cell Transplantation

Author: Ajay Perumbeti, MD, FAAP; Chief Editor: Emmanuel C Besa, MD
http://emedicine.medscape.com/article/208954-overview

Hematopoietic stem cell transplantation (HSCT) involves the intravenous (IV) infusion of autologous or allogeneic stem cells to reestablish hematopoietic function in patients whose bone marrow or immune system is damaged or defective.

The image below illustrates an algorithm for typically preferred hematopoietic stem cell transplantation cell source for treatment of malignancy.

An algorithm for typically preferred hematopoietic stem cell transplantation cell source for treatment of malignancy: If a matched sibling donor is not available, then a MUD is selected; if a MUD is not available, then choices include a mismatched unrelated donor, umbilical cord donor(s), and a haploidentical donor.

Supportive Therapies

2.5.4  Blood transfusions – risks and complications of a blood transfusion

  • Allogeneic transfusion reaction (acute or delayed hemolytic reaction)
  • Allergic reaction
  • Viruses and infectious diseases

The risk of catching a virus from a blood transfusion is very low.

HIV. Your risk of getting HIV from a blood transfusion is lower than your risk of being killed by lightning. Only about 1 in 2 million donations might carry and transmit HIV if given to a patient.

Hepatitis B and C. The risk of a donation carrying hepatitis B is about 1 in 205,000; the risk for hepatitis C is about 1 in 2 million. If you receive blood that contains a hepatitis virus, you’ll likely develop the infection.

Variant Creutzfeldt-Jakob disease (vCJD). This disease is the human version of Mad Cow Disease. It’s a very rare, yet fatal brain disorder. There is a possible risk of getting vCJD from a blood transfusion, although the risk is very low. Because of this, people who may have been exposed to vCJD aren’t eligible blood donors.
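Assuming donations are independent, the per-unit risks quoted above compound over multiple transfused units as 1 − (1 − p)^n. A small sketch (the function name is hypothetical):

```python
def cumulative_risk(per_unit_risk: float, n_units: int) -> float:
    """Probability that at least one of n independent units is infectious."""
    return 1.0 - (1.0 - per_unit_risk) ** n_units

# Using the figures above: ~1 in 2 million for HIV, ~1 in 205,000 for hepatitis B
print(cumulative_risk(1 / 2_000_000, 10))  # still only about 5 in a million
print(cumulative_risk(1 / 205_000, 10))    # roughly 1 in 20,500
```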

  • Fever
  • Iron Overload
  • Lung Injury
  • Graft-Versus-Host Disease

Graft-versus-host disease (GVHD) is a condition in which white blood cells in the new blood attack your tissues.

2.5.5 Erythropoietin

Erythropoietin, also known as EPO, is a glycoprotein hormone that controls erythropoiesis, or red blood cell production. It is a cytokine (protein signaling molecule) for erythrocyte (red blood cell) precursors in the bone marrow. Human EPO has a molecular weight of 34 kDa.

Also called hematopoietin or hemopoietin, it is produced by interstitial fibroblasts in the kidney in close association with the peritubular capillaries and proximal convoluted tubule. It is also produced in perisinusoidal cells in the liver. While liver production predominates in the fetal and perinatal period, renal production is predominant during adulthood. In addition to erythropoiesis, erythropoietin also has other known biological functions. For example, it plays an important role in the brain’s response to neuronal injury.[1] EPO is also involved in the wound healing process.[2]

Exogenous erythropoietin is produced by recombinant DNA technology in cell culture. Several different pharmaceutical agents are available with a variety of glycosylation patterns, and are collectively called erythropoiesis-stimulating agents (ESAs). The specific details for labelled use vary between the package inserts, but ESAs have been used in the treatment of anemia in chronic kidney disease, anemia in myelodysplasia, and in anemia from cancer chemotherapy. Boxed warnings include a risk of death, myocardial infarction, stroke, venous thromboembolism, and tumor recurrence.[3]

2.5.6  G-CSF (granulocyte-colony stimulating factor)

Granulocyte-colony stimulating factor (G-CSF or GCSF), also known as colony-stimulating factor 3 (CSF 3), is a glycoprotein that stimulates the bone marrow to produce granulocytes and stem cells and release them into the bloodstream.

There are different types, including

  • Lenograstim (Granocyte)
  • Filgrastim (Neupogen, Zarzio, Nivestim, Ratiograstim)
  • Long-acting (pegylated) filgrastim (pegfilgrastim, Neulasta) and lipegfilgrastim (Lonquex)

Pegylated G-CSF stays in the body for longer so you have treatment less often than with the other types of G-CSF.

2.5.7  Plasma Exchange (plasmapheresis)

http://emedicine.medscape.com/article/1895577-overview

Plasmapheresis is a term used to refer to a broad range of procedures in which extracorporeal separation of blood components results in a filtered plasma product.[1, 2] The filtering of plasma from whole blood can be accomplished via centrifugation or semipermeable membranes.[3] Centrifugation takes advantage of the different specific gravities inherent to various blood products such as red cells, white cells, platelets, and plasma.[4] Membrane plasma separation uses differences in particle size to filter plasma from the cellular components of blood.[3]

Traditionally, in the United States, most plasmapheresis takes place using automated centrifuge-based technology.[5] In certain instances, in particular in patients already undergoing hemodialysis, plasmapheresis can be carried out using semipermeable membranes to filter plasma.[4]

In therapeutic plasma exchange, using an automated centrifuge, filtered plasma is discarded and red blood cells along with replacement colloid such as donor plasma or albumin is returned to the patient. In membrane plasma filtration, secondary membrane plasma fractionation can selectively remove undesired macromolecules, which then allows for return of the processed plasma to the patient instead of donor plasma or albumin. Examples of secondary membrane plasma fractionation include cascade filtration,[6] thermofiltration, cryofiltration,[7] and low-density lipoprotein pheresis.

The Apheresis Applications Committee of the American Society for Apheresis periodically evaluates potential indications for apheresis and categorizes them from I to IV based on the available medical literature. The following are some of the indications, and their categorization, from the society’s 2010 guidelines.[2]

  • The only Category I indication for hemopoietic malignancy is hyperviscosity in monoclonal gammopathies.

2.5.8  Platelet Transfusions

Indications for platelet transfusion in children with acute leukemia

Scott Murphy, Samuel Litwin, Leonard M. Herring, Penelope Koch, et al.
Am J Hematol Jun 1982; 12(4): 347–356
http://onlinelibrary.wiley.com/doi/10.1002/ajh.2830120406/abstract;jsessionid=A6001D9D865EA1EBC667EF98382EF20C.f03t01
http://dx.doi.org:/10.1002/ajh.2830120406

In an attempt to determine the indications for platelet transfusion in thrombocytopenic patients, we randomized 56 children with acute leukemia to one of two regimens of platelet transfusion. The prophylactic group received platelets when the platelet count fell below 20,000 per mm³ irrespective of clinical events. The therapeutic group was transfused only when significant bleeding occurred and not for thrombocytopenia alone. The time to first bleeding episode was significantly longer and the number of bleeding episodes was significantly reduced in the prophylactic group. The survival curves of the two groups could not be distinguished from each other. Prior to the last month of life, the total number of days on which bleeding was present was significantly reduced by prophylactic therapy. However, in the terminal phase (last month of life), the duration of bleeding episodes was significantly longer in the prophylactic group. This may have been due to a higher incidence of immunologic refractoriness to platelet transfusion. Because of this terminal bleeding, comparison of the two groups for total number of days on which bleeding was present did not show a significant difference over the entire study period.

Clinical and Laboratory Aspects of Platelet Transfusion Therapy
Yuan S, Goldfinger D
http://www.uptodate.com/contents/clinical-and-laboratory-aspects-of-platelet-transfusion-therapy

INTRODUCTION — Hemostasis depends on an adequate number of functional platelets, together with an intact coagulation (clotting factor) system. This topic covers the logistics of platelet use and the indications for platelet transfusion in adults. The approach to the bleeding patient, refractoriness to platelet transfusion, and platelet transfusion in neonates are discussed elsewhere.

Pooled Platelets – A single unit of platelets can be isolated from every unit of donated blood, by centrifuging the blood within the closed collection system to separate the platelets from the red blood cells (RBC). The number of platelets per unit varies according to the platelet count of the donor; a yield of 7 × 10^10 platelets is typical [1]. Since this number is inadequate to raise the platelet count in an adult recipient, four to six units are pooled to allow transfusion of 3 to 4 × 10^11 platelets per transfusion [2]. These are called whole blood-derived or random donor pooled platelets.
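The pooling arithmetic checks out directly: at a typical yield of 7 × 10^10 platelets per whole-blood unit, a pool of four to six units delivers roughly the 3 to 4 × 10^11 platelet dose quoted. A quick sketch (the constant and function name are illustrative, not from the source):

```python
TYPICAL_YIELD = 7e10  # platelets per whole-blood-derived unit (typical figure above)

def pooled_dose(n_units: int, yield_per_unit: float = TYPICAL_YIELD) -> float:
    """Total platelets delivered by a pool of n whole-blood-derived units."""
    return n_units * yield_per_unit

for n in (4, 5, 6):
    print(f"{n} units -> {pooled_dose(n):.1e} platelets")
# 4 units -> 2.8e+11; 6 units -> 4.2e+11: bracketing the quoted dose range
```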

Advantages of pooled platelets include lower cost and ease of collection and processing (a separate donation procedure and pheresis equipment are not required). The major disadvantage is recipient exposure to multiple donors in a single transfusion and logistic issues related to bacterial testing.

Apheresis (single donor) Platelets – Platelets can also be collected from volunteer donors in the blood bank, in a one- to two-hour pheresis procedure. Platelets and some white blood cells are removed, and red blood cells and plasma are returned to the donor. A typical apheresis platelet unit provides the equivalent of six or more units of platelets from whole blood (ie, 3 to 6 × 10^11 platelets) [2]. In larger donors with high platelet counts, up to three units can be collected in one session. These are called apheresis or single donor platelets.

Advantages of single donor platelets are exposure of the recipient to a single donor rather than multiple donors, and the ability to match donor and recipient characteristics such as HLA type, cytomegalovirus (CMV) status, and blood type for certain recipients.

Both pooled and apheresis platelets contain some white blood cells (WBC) that were collected along with the platelets. These WBC can cause febrile non-hemolytic transfusion reactions (FNHTR), alloimmunization, and transfusion-associated graft-versus-host disease (ta-GVHD) in some patients.

Platelet products also contain plasma, which can be implicated in adverse reactions including transfusion-related acute lung injury (TRALI) and anaphylaxis. (See ‘Complications of platelet transfusion’.)


Hematological Malignancy Diagnostics

Author and Curator: Larry H. Bernstein, MD, FCAP

 

2.4.3 Diagnostics

2.4.3.1 Computer-aided diagnostics

Back-to-Front Design

Robert Didner
Bell Laboratories

Decision-making in the clinical setting
Didner, R  Mar 1999  Amer Clin Lab

Mr. Didner is an Independent Consultant in Systems Analysis, Information Architecture (Informatics), Operations Research, and Human Factors Engineering (Cognitive Psychology), Decision Information Designs, 29 Skyline Dr., Morristown, NJ 07960, U.S.A.; tel.: 973-455-0489; fax/e-mail: bdidner@hotmail.com

A common problem in the medical profession is the level of effort dedicated to administration and paperwork necessitated by various agencies, which contributes to the high cost of medical care. Costs would be reduced and accuracy improved if the clinical data could be captured directly at the point they are generated in a form suitable for transmission to insurers or machine transformable into other formats. Such a capability could also be used to improve the form and the structure of information presented to physicians and support a more comprehensive database linking clinical protocols to outcomes, with the prospect of improving clinical outcomes. Although the problem centers on the physician’s process of determining the diagnosis and treatment of patients and the timely and accurate recording of that process in the medical system, it substantially involves the pathologist and laboratorian, who interact significantly throughout the information-gathering process.

Each of the currently predominant ways of collecting information from diagnostic protocols has drawbacks. Using blank paper to collect free-form notes from the physician is not amenable to computerization; such free-form data are also poorly formulated, formatted, and organized for the clinical decision-making they support. The alternative of preprinted forms listing the possible tests, results, and other information gathered during the diagnostic process facilitates the desired computerization, but the fixed sequence of tests and questions they present impedes the physician from using an optimal decision-making sequence. This follows because:

  • People tend to make decisions and consider information in a step-by-step manner in which intermediate decisions are intermixed with data acquisition steps.
  • The sequence in which components of decisions are made may alter the decision outcome.
  • People tend to consider information in the sequence it is requested or displayed.
  • Since there is a separate optimum sequence of tests and questions for each cluster of history and presenting symptoms, there is no one sequence of tests and questions that can be optimal for all presenting clusters.
  • As additional data and test results are acquired, the optimal sequence of further testing and data acquisition changes, depending on the already acquired information.

Therefore, promoting an arbitrary sequence of information requests with preprinted forms may detract from outcomes by contributing to a non-optimal decision-making sequence. Unlike the decisions resulting from theoretical or normative processes, decisions made by humans are path dependent; that is, the outcome of a decision process may be different if the same components are considered in a different sequence.

Proposed solution

This paper proposes a general approach to gathering data at their source in computer-based form so as to improve the expected outcomes. Such a means must be interactive and dynamic, so that at any point in the clinical process the patient’s presenting symptoms, history, and the data already collected are used to determine the next data or tests requested. That determination must derive from a decision-making strategy designed to produce outcomes with the greatest value and supported by appropriate data collection and display techniques. The strategy must be based on the knowledge of the possible outcomes at any given stage of testing and information gathering, coupled with a metric, or hierarchy of values, for assessing the relative desirability of the possible outcomes.

A value hierarchy

The numbered list below illustrates a value hierarchy. In any particular instance, the higher-numbered values should only be considered once the lower-numbered values have been satisfied. Thus, a diagnostic sequence that is very time or cost efficient should only be considered if it does not increase the likelihood (relative to some other diagnostic sequence) that a life-threatening disorder may be missed, or that one of the diagnostic procedures may cause discomfort.

  1. Minimize the likelihood that a treatable, life-threatening disorder is not treated.
  2. Minimize the likelihood that a treatable, discomfort-causing disorder is not treated.
  3. Minimize the likelihood that a risky procedure (treatment or diagnostic procedure) is inappropriately administered.
  4. Minimize the likelihood that a discomfort-causing procedure is inappropriately administered.
  5. Minimize the likelihood that a costly procedure is inappropriately administered.
  6. Minimize the time of diagnosing and treating the patient.
  7. Minimize the cost of diagnosing and treating the patient.

The above hierarchy is relative, not absolute; for many patients, a little bit of testing discomfort may be worth a lot of time. There are also some factors and gradations intentionally left out for expository simplicity (e.g., acute versus chronic disorders). This value hierarchy is based on a hypothetical patient. Clearly, the hierarchy of a health insurance carrier might be different, as might that of another patient (e.g., a geriatric patient). If the approach outlined herein were to be followed, a value hierarchy agreed to by a majority of stakeholders should be adopted.
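As a sketch of how such a hierarchy could be operationalized, the comparison below orders candidate diagnostic sequences lexicographically: a cheaper or faster sequence wins only if it ties on every higher-priority criterion. All names and probability values are hypothetical, not from the paper.

```python
# Criteria ordered from most to least important; lower values are better.
CRITERIA = [
    "p_missed_life_threatening",
    "p_missed_discomfort_disorder",
    "p_inappropriate_risky_procedure",
    "p_inappropriate_discomfort_procedure",
    "p_inappropriate_costly_procedure",
    "expected_time_days",
    "expected_cost_usd",
]

def hierarchy_key(candidate):
    """Tuple comparison in Python is lexicographic, which matches the
    'satisfy higher-priority values first' rule of the hierarchy."""
    return tuple(candidate[c] for c in CRITERIA)

def best_sequence(candidates):
    return min(candidates, key=hierarchy_key)

# Hypothetical candidates: B is cheaper and faster, but slightly raises
# the chance of missing a life-threatening disorder, so A is preferred.
seq_a = {"name": "A",
         "p_missed_life_threatening": 0.001,
         "p_missed_discomfort_disorder": 0.010,
         "p_inappropriate_risky_procedure": 0.020,
         "p_inappropriate_discomfort_procedure": 0.050,
         "p_inappropriate_costly_procedure": 0.050,
         "expected_time_days": 10,
         "expected_cost_usd": 900}
seq_b = dict(seq_a, name="B", p_missed_life_threatening=0.005,
             expected_time_days=3, expected_cost_usd=400)

assert best_sequence([seq_a, seq_b])["name"] == "A"
```

A weighted trade-off scale (see the QALY discussion below) would relax the strict ordering; the lexicographic form simply makes the "satisfied first" rule explicit.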

Efficiency

Once the higher values are satisfied, the time and cost of diagnosis and treatment should be minimized. One way to do so would be to optimize the sequence in which tests are performed, so as to minimize the number, cost, and time of tests that need to be performed to reach a definitive decision regarding treatment. Such an optimum sequence could be constructed using Claude Shannon’s information theory.

According to this theory, the best next question to ask in any given situation (assuming the question has two possible outcomes) is the one that divides the possible outcomes into two equally likely sets. In the real world, not all tests or questions are equally valuable, costly, or time consuming; therefore, value (risk factors), cost, and time should be used as weighting factors to optimize the test sequence, but this is a complicating detail at this point.
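A minimal illustration of this idea, under the simplifying assumptions of binary tests with known outcome sets and no cost weighting: choose the next test whose positive/negative split of the remaining diagnostic probability mass carries the most information, i.e., is closest to 50/50. The diagnoses, tests, and priors below are invented for the example.

```python
import math

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def best_next_test(diagnoses, tests):
    """Pick the binary test whose positive/negative split of the current
    diagnostic probability mass has maximal entropy (1 bit at 50/50).

    diagnoses: {diagnosis: prior probability, summing to 1}
    tests:     {test name: set of diagnoses for which the test is positive}
    """
    def split_entropy(positives):
        p = sum(diagnoses.get(d, 0.0) for d in positives)
        return entropy([p, 1.0 - p])
    return max(tests, key=lambda t: split_entropy(tests[t]))

# Invented example: with these priors, the fever check splits the mass
# 0.5 / 0.5 and is therefore more informative than the rapid strep test,
# which splits it 0.2 / 0.8.
diagnoses = {"flu": 0.4, "cold": 0.3, "strep": 0.2, "covid": 0.1}
tests = {
    "rapid_strep": {"strep"},
    "fever_check": {"flu", "covid"},
}
assert best_next_test(diagnoses, tests) == "fever_check"
```

Weighting each candidate test's information gain by its cost, time, and risk would yield the value-adjusted sequence the author alludes to.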

A value scale

For dynamic computation of outcome values, the hierarchy could be converted into a weighted value scale so that differing outcomes at more than one level of the hierarchy could be readily compared. An example of such a weighted value scale is Quality-Adjusted Life Years (QALY).

Although QALY does not incorporate all of the factors in this example, it is a good conceptual starting place.
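A toy QALY calculation, with purely illustrative utilities and durations (not clinical values), shows how a weighted scale lets outcomes that trade discomfort against longevity be compared on a single axis:

```python
# QALYs = sum over health states of (years in state) * (utility of state),
# where utility runs from 0 (death) to 1 (full health).

def qalys(trajectory):
    return sum(years * utility for years, utility in trajectory)

# Hypothetical treatment X: six months of uncomfortable recovery
# (utility 0.6), then 9.5 years in good health (utility 0.9).
treatment_x = [(0.5, 0.6), (9.5, 0.9)]   # 0.3 + 8.55 = 8.85 QALYs
# Hypothetical treatment Y: no recovery burden, but shorter,
# lower-quality survival.
treatment_y = [(8.0, 0.8)]               # 6.4 QALYs

assert qalys(treatment_x) > qalys(treatment_y)
```

The single number hides which hierarchy level drove the difference, which is exactly the trade-off between a strict lexicographic hierarchy and a weighted scale.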

The display, request, decision-making relationship

For each clinical determination, the pertinent information should be gathered, organized, formatted, and formulated in a way that facilitates the accuracy, reliability, and efficiency with which that determination is made. A physician treating a patient with high cholesterol and blood pressure (BP), for example, may need to know whether or not the patient’s cholesterol and BP respond to weight changes to determine an appropriate treatment (e.g., weight control versus medication). This requires searching records for BP, certain blood chemicals (e.g., HDLs, LDLs, and triglycerides), and weight from several sources, then attempting to track them against each other over time. Manually reorganizing this clinical information each time it is used is extremely inefficient. More important, the current organization and formatting defies principles of human factors for optimally displaying information to enhance human information-processing characteristics, particularly for decision support.

While a discussion of human factors and cognitive psychology principles is beyond the scope of this paper, following are a few of the system design principles of concern:

  • Minimize the load on short-term memory.
  • Provide information pertinent to a given decision or component of a decision in a compact, contiguous space.
  • Take advantage of basic human perceptual and pattern recognition facilities.
  • Design the form of an information display to complement the decision-making task it supports.

Figure 1 shows fictitious, quasi-random data from a hypothetical patient with moderately elevated cholesterol. This one-page display pulls together all the pertinent data from six years of blood tests and related clinical measurements. At a glance, the physician’s innate pattern recognition, color, and shape perception facilities recognize the patient’s steadily increasing weight, cholesterol, BP, and triglycerides as well as the declining high-density lipoproteins. It would have taken considerably more time and effort to grasp this information from the raw data collection and blood test reports as they are currently presented in independent, tabular time slices.

Design the formulation of an information display to complement the decision-making task.

The physician may wish to know only the relationship between weight and cardiac risk factors rather than whether these measures are increasing or decreasing, or are within acceptable or marginal ranges. If so, Table 1 shows the correlations between weight and the other factors in a much more direct and simple way, using the same data as in Figure 1. One can readily see the same conclusions about relations that were drawn from Figure 1. This type of abstract, symbolic display of derived information also makes it easier to spot relationships when the individual variables are bouncing up and down, unlike the more or less steady rise of most values in Figure 1. This increase in precision of relationship information is gained at the expense of other types of information (e.g., trends). To display information in an optimum form, then, the system designer must know what the information demands of the task are at the point in the task when the display is to be used.
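The correlations in Table 1 are ordinary Pearson coefficients, which can be computed directly from the paired time series. The sketch below uses invented yearly values, not the actual data behind Figure 1 or Table 1.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical yearly measurements (illustrative values only).
weight      = [180, 184, 189, 195, 199, 204]   # lb
cholesterol = [195, 205, 200, 220, 228, 235]   # mg/dL

r = pearson_r(weight, cholesterol)
assert 0.9 < r < 1.0   # weight tracks cholesterol strongly in this toy data
```

A single coefficient per risk factor compresses six years of paired observations into the one number the treatment decision needs, which is the point of the Table 1 display.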

Present the sequence of information display clusters to complement an optimum decision-making strategy.

Just as a fixed sequence of gathering clinical, diagnostic information may lead to a far from optimum outcome, there exists an optimum sequence of testing, considering information, and gathering data that will lead to an optimum outcome (as defined by the value hierarchy) with a minimum of time and expense. The task of the information system designer, then, is to provide or request the right information, in the best form, at each stage of the procedure. For example, Figure 1 is suitable for the diagnostic phase, since it shows the current state of the risk factors and their trends. Table 1, on the other hand, might be more appropriate in determining treatment, where there may be a choice between first trying a strict dietary treatment or going straight to a combination of diet plus medication. The fact that Figure 1 and Table 1 contain somewhat redundant information is not a problem, since they are intended to optimally provide information for different decision-making tasks. The critical need, at this point, is for a model of how to determine what tests to order, what information to request and display, and in what form, at each step of the decision-making process. Commitment to a collaborative relationship between physicians and laboratorians and other information providers would be an essential requirement for such an undertaking. The ideal diagnostic data-collection instrument is a flexible, computer-based device, such as a notebook computer or Personal Digital Assistant (PDA)-sized device.

Barriers to interactive, computer-driven data collection at the source

As with any major change, it may be difficult to induce many physicians to change their behavior by interacting directly with a computer instead of with paper and pen. Unlike office workers, who have had to make this transition over the past three decades, most physicians’ livelihoods will not depend on converting to computer interaction. Therefore, the transition must be made attractive and the changes less onerous. Some suggestions follow:

  1. Make the data collection a natural part of the clinical process.
  2. Ensure that the user interface is extremely friendly, easy to learn, and easy to use.
  3. Use a small, portable device.
  4. Use the same device for collection and display of existing information (e.g., test results and history).
  5. Minimize the need for free-form written data entry (use check boxes, forms, etc.).
  6. Allow the entry of notes in pen-based free-form (with the option of automated conversion of numeric data to machine-manipulable form).
  7. Give the physicians a more direct benefit for collecting data, not just a means of helping a clerk at an HMO second-guess the physician’s judgment.
  8. Improve administrative efficiency in the office.
  9. Make the data collection complement the clinical decision-making process.
  10. Improve information displays, leading to better outcomes.
  11. Make better use of the physician’s time and mental effort.

Conclusion

The medical profession is facing a crisis of information. Gathering information is costing a typical practice more and more while fees are being restricted by third parties, and the process of gathering this information may be detrimental to current outcomes. Gathered properly, in machine-manipulable form, these data could be reformatted so as to greatly improve their value immediately in the clinical setting by leading to decisions with better outcomes and, in the long run, by contributing to a clinical data warehouse that could greatly improve medical knowledge. The challenge is to create a mechanism for data collection that facilitates, hastens, and improves the outcomes of clinical activity while minimizing the inconvenience and resistance to change on the part of clinical practitioners. This paper is intended to provide a high-level overview of how this may be accomplished, and to start a dialogue along these lines.

References

  1. Tversky A. Elimination by aspects: a theory of choice. Psych Rev 1972; 79:281–99.
  2. Didner RS. Back-to-front design: a guns and butter approach. Ergonomics 1982; 25(6):2564–5.
  3. Shannon CE. A mathematical theory of communication. Bell System Technical J 1948; 27:379–423 (July), 623–56 (Oct).
  4. Feeny DH, Torrance GW. Incorporating utility-based quality-of-life assessment measures in clinical trials: two examples. Med Care 1989; 27:S190–204.
  5. Smith S, Mosier J. Guidelines for designing user interface software. ESD-TR-86-278, Aug 1986.
  6. Miller GA. The magical number seven, plus or minus two. Psych Rev 1956; 63(2):81–97.
  7. Sternberg S. High-speed scanning in human memory. Science 1966; 153: 652–4.

Table 1

Correlation of weight with other cardiac risk factors

Cholesterol 0.759384
HDL 0.53908
LDL 0.177297
BP-syst. 0.424728
BP-dia. 0.516167
Triglycerides 0.637817

Figure 1  Hypothetical patient data.

(not shown)

Realtime Clinical Expert Support

http://pharmaceuticalintelligence.com/2015/05/10/realtime-clinical-expert-support/

Regression: A richly textured method for comparison and classification of predictor variables

http://pharmaceuticalintelligence.com/2012/08/14/regression-a-richly-textured-method-for-comparison-and-classification-of-predictor-variables/

Converting Hematology Based Data into an Inferential Interpretation

Larry H. Bernstein, Gil David, James Rucinski and Ronald R. Coifman
In Hematology – Science and Practice
Lawrie CH, Ch 22. Pp541-552.
InTech Feb 2012, ISBN 978-953-51-0174-1
https://www.researchgate.net/profile/Larry_Bernstein/publication/221927033_Converting_Hematology_Based_Data_into_an_Inferential_Interpretation/links/0fcfd507f28c14c8a2000000.pdf

A model for Thalassemia Screening using Hematology Measurements

https://www.researchgate.net/profile/Larry_Bernstein/publication/258848064_A_model_for_Thalassemia_Screening_using_Hematology_Measurements/links/0c9605293c3048060b000000.pdf

2.4.3.2 A model for automated screening of thalassemia in hematology (math study).

Kneifati-Hayek J, Fleischman W, Bernstein LH, Riccioli A, Bellevue R.
Lab Hematol. 2007; 13(4):119-23. http://dx.doi.org/10.1532/LH96.07003

The results of 398 patient screens were collected. Data from the set were divided into training and validation subsets. The Mentzer ratio decision point was determined through a receiver operating characteristic (ROC) curve on the first subset and then used to screen for thalassemia in the second subset. HgbA2 levels were used to confirm beta-thalassemia.

RESULTS: We determined the correct decision point of the Mentzer index to be a ratio of 20. Physicians can screen patients using this index before further evaluation for beta-thalassemia (P < .05).

CONCLUSION: The proposed method can be implemented by hospitals and laboratories to flag positive matches for further definitive evaluation, and will enable beta-thalassemia screening of a much larger population at little to no additional cost.
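A sketch of the resulting screening rule, assuming (as in the classical Mentzer rule) that index values below the decision point suggest thalassemia trait and warrant definitive HgbA2 work-up; the field names and sample values are hypothetical, not from the study.

```python
# Mentzer index = MCV (fL) / RBC count (millions/uL). The study above
# reports a decision point of 20 for flagging samples for further
# beta-thalassemia evaluation.

MENTZER_CUTOFF = 20.0

def mentzer_index(mcv_fl, rbc_millions_per_ul):
    return mcv_fl / rbc_millions_per_ul

def flag_for_thalassemia_workup(sample):
    """Flag when the index falls below the cutoff (microcytosis with a
    relatively high RBC count, the classical thalassemia-trait pattern)."""
    return mentzer_index(sample["mcv"], sample["rbc"]) < MENTZER_CUTOFF

# Hypothetical samples: the first (index ~10.7) is flagged, the second
# (index 20.0) is not.
assert flag_for_thalassemia_workup({"mcv": 62.0, "rbc": 5.8})
assert not flag_for_thalassemia_workup({"mcv": 90.0, "rbc": 4.5})
```

Because MCV and RBC count come free with every CBC, such a flag costs essentially nothing to compute, which is the basis for the paper's "little to no additional cost" claim.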

Measurement of granulocyte maturation may improve the early diagnosis of the septic state.

2.4.3.3 Bernstein LH, Rucinski J. Clin Chem Lab Med. 2011 Sep 21;49(12):2089-95.
http://dx.doi.org/10.1515/CCLM.2011.688

2.4.3.4 The automated malnutrition assessment.

David G, Bernstein LH, Coifman RR. Nutrition. 2013 Jan; 29(1):113-21.
http://dx.doi.org/10.1016/j.nut.2012.04.017

2.4.3.5 Molecular Diagnostics

Genomic Analysis of Hematological Malignancies

Acute lymphoblastic leukemia (ALL) is the most common hematologic malignancy that occurs in children. Although more than 90% of children with ALL now survive to adulthood, those with the rarest and high-risk forms of the disease continue to have poor prognoses. Through the Pediatric Cancer Genome Project (PCGP), investigators in the Hematological Malignancies Program are identifying the genetic aberrations that cause these aggressive forms of leukemias. Here we present two studies on the genetic bases of early T-cell precursor ALL and acute megakaryoblastic leukemia.

  • Early T-Cell Precursor ALL Is Characterized by Activating Mutations
  • The CBFA2T3-GLIS2 Fusion Gene Defines an Aggressive Subtype of Acute Megakaryoblastic Leukemia in Children

Early T-cell precursor ALL (ETP-ALL), which comprises 15% of all pediatric T-cell leukemias, is an aggressive disease that is typically resistant to contemporary therapies. Children with ETP-ALL have a high rate of relapse and an extremely poor prognosis (i.e., 5-year survival is approximately 20%). The genetic basis of ETP-ALL has remained elusive. Although ETP-ALL is associated with a high burden of DNA copy number aberrations, none are consistently found or suggest a unifying genetic alteration that drives this disease.

Through the efforts of the PCGP, Jinghui Zhang, PhD (Computational Biology), James R. Downing, MD (Pathology), Charles G. Mullighan, MBBS (Hons), MSc, MD (Pathology), and colleagues analyzed the whole-genome sequences of leukemic cells and matched normal DNA from 12 pediatric patients with ETP-ALL. The identified genetic mutations were confirmed in a validation cohort of 52 ETP-ALL specimens and 42 non-ETP T-lineage ALLs (T-ALL).

In the journal Nature, the investigators reported that each ETP-ALL sample carried an average of 1140 sequence mutations and 12 structural variations. Of the structural variations, 51% were breakpoints in genes with well-established roles in hematopoiesis or leukemogenesis (e.g., MLH2, SUZ12, and RUNX1). Eighty-four percent of the structural variations either caused loss of function of the gene in question or resulted in the formation of a fusion gene such as ETV6-INO80D. The ETV6 gene, which encodes a protein that is essential for hematopoiesis, is frequently mutated in leukemia. Among the DNA samples sequenced in this study, ETV6 was altered in 33% of ETP-ALL but only 10% of T-ALL cases.

Next-generation sequencing in hematologic malignancies: what will be the dividends?

Jason D. Merker, Anton Valouev, and Jason Gotlib
Ther Adv Hematol. 2012 Dec; 3(6): 333–339.
http://dx.doi.org/10.1177/2040620712458948

The application of high-throughput, massively parallel sequencing technologies to hematologic malignancies over the past several years has provided novel insights into disease initiation, progression, and response to therapy. Here, we describe how these new DNA sequencing technologies have been applied to hematolymphoid malignancies. With further improvements in the sequencing and analysis methods as well as integration of the resulting data with clinical information, we expect these technologies will facilitate more precise and tailored treatment for patients with hematologic neoplasms.

Leveraging cancer genome information in hematologic malignancies.

Rampal R, Levine RL.
J Clin Oncol. 2013 May 20; 31(15):1885-92.
http://dx.doi.org/10.1200/JCO.2013.48.7447

The use of candidate gene and genome-wide discovery studies in the last several years has led to an expansion of our knowledge of the spectrum of recurrent, somatic disease alleles, which contribute to the pathogenesis of hematologic malignancies. Notably, these studies have also begun to fundamentally change our ability to develop informative prognostic schema that inform outcome and therapeutic response, yielding substantive insights into mechanisms of hematopoietic transformation in different tissue compartments. Although these studies have already had important biologic and translational impact, significant challenges remain in systematically applying these findings to clinical decision making and in implementing new technologies for genetic analysis into clinical practice to inform real-time decision making. Here, we review recent major genetic advances in myeloid and lymphoid malignancies, the impact of these findings on prognostic models, our understanding of disease initiation and evolution, and the implication of genomic discoveries on clinical decision making. Finally, we discuss general concepts in genetic modeling and the current state-of-the-art technology used in genetic investigation.

p53 mutations are associated with resistance to chemotherapy and short survival in hematologic malignancies

E Wattel, C Preudhomme, B Hecquet, M Vanrumbeke, et al.
Blood, (Nov 1), 1994; 84(9): pp 3148-3157
http://www.bloodjournal.org/content/bloodjournal/84/9/3148.full.pdf

We analyzed the prognostic value of p53 mutations for response to chemotherapy and survival in acute myeloid leukemia (AML), myelodysplastic syndrome (MDS), and chronic lymphocytic leukemia (CLL). Mutations were detected by single-stranded conformation polymorphism (SSCP) analysis of exons 4 to 10 of the P53 gene, and confirmed by direct sequencing. A p53 mutation was found in 16 of 107 (15%) AML, 20 of 182 (11%) MDS, and 9 of 81 (11%) CLL tested. In AML, three of nine (33%) mutated cases and 66 of 81 (81%) nonmutated cases treated with intensive chemotherapy achieved complete remission (CR) (P = .005), and none of five mutated cases and three of six nonmutated cases treated by low-dose Ara C achieved CR or partial remission (PR) (P = .06). Median actuarial survival was 2.5 months in mutated cases and 15 months in nonmutated cases (P < 10⁻⁵). In the MDS patients who received chemotherapy (intensive chemotherapy or low-dose Ara C), 1 of 13 (8%) mutated cases and 23 of 38 (60%) nonmutated cases achieved CR or PR (P = .004), and median actuarial survival was 2.5 and 13.5 months, respectively (P < 10⁻⁵). In all MDS cases (treated and untreated), the survival difference between mutated and nonmutated cases was also highly significant. In CLL, 1 of 8 (12.5%) mutated cases treated by chemotherapy (chlorambucil and/or CHOP and/or fludarabine) responded, as compared with 29 of 36 (80%) nonmutated cases (P = .02). In all CLL cases, survival from p53 analysis was significantly shorter in mutated cases (median 7 months) than in nonmutated cases (median not reached) (P < 10⁻⁵). In 35 of the 45 mutated cases of AML, MDS, and CLL, cytogenetic analysis or SSCP and sequence findings showed loss of the nonmutated P53 allele. Our findings show that p53 mutations are a strong prognostic indicator of response to chemotherapy and survival in AML, MDS, and CLL. The usual association of p53 mutations with loss of the nonmutated P53 allele in those disorders, i.e., with absence of normal p53 in tumor cells, suggests that p53 mutations could induce drug resistance, at least in part, by interfering with normal apoptotic pathways in tumor cells.
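The association between mutation status and CR rate in AML (3/9 mutated vs. 66/81 nonmutated) can be checked with a one-sided Fisher exact (hypergeometric) test. The sketch below is illustrative; it does not reproduce the paper's exact statistical method, and the reported P = .005 may come from a different test.

```python
from math import comb

def hypergeom_left_tail(k, n_group, k_total, n_total):
    """One-sided Fisher exact test: probability of observing k or fewer
    responders in a group of n_group, given k_total responders among
    n_total patients overall (sampling without replacement)."""
    denom = comb(n_total, n_group)
    return sum(
        comb(k_total, i) * comb(n_total - k_total, n_group - i)
        for i in range(0, k + 1)
    ) / denom

# AML data from the abstract: 3/9 mutated vs. 66/81 nonmutated
# achieved complete remission with intensive chemotherapy.
p = hypergeom_left_tail(k=3, n_group=9, k_total=3 + 66, n_total=9 + 81)
assert p < 0.05   # consistent with the reported association
```

Using exact combinatorics avoids the chi-square approximation, which is unreliable for cells this small (3 responders among 9 mutated cases).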

Genomic approaches to hematologic malignancies

Benjamin L. Ebert and Todd R. Golub
Blood. 2004; 104:923-932
https://www.broadinstitute.org/mpr/publications/projects/genomics/Review%20Genomics%20of%20Heme%20Malig,%20Blood%202004.pdf

In the past several years, experiments using DNA microarrays have contributed to an increasingly refined molecular taxonomy of hematologic malignancies. In addition to the characterization of molecular profiles for known diagnostic classifications, studies have defined patterns of gene expression corresponding to specific molecular abnormalities, oncologic phenotypes, and clinical outcomes. Furthermore, novel subclasses with distinct molecular profiles and clinical behaviors have been identified. In some cases, specific cellular pathways have been highlighted that can be therapeutically targeted. The findings of microarray studies are beginning to enter clinical practice as novel diagnostic tests, and clinical trials are ongoing in which therapeutic agents are being used to target pathways that were identified by gene expression profiling. While the technology of DNA microarrays is becoming well established, genome-wide surveys of gene expression generate large data sets that can easily lead to spurious conclusions. Many challenges remain in the statistical interpretation of gene expression data and the biologic validation of findings. As data accumulate and analyses become more sophisticated, genomic technologies offer the potential to generate increasingly sophisticated insights into the complex molecular circuitry of hematologic malignancies. This review summarizes the current state of discovery and addresses key areas for future research.

2.4.3.6 Flow cytometry

Introduction to Flow Cytometry: Blood Cell Identification

Dana L. Van Laeys
https://www.labce.com/flow_cytometry.aspx

No other laboratory method provides as rapid and detailed an analysis of cellular populations as flow cytometry, making it a valuable tool for the diagnosis and management of several hematologic and immunologic diseases. Understanding this methodology is important for any medical laboratory scientist.

Whether you have no previous experience with flow cytometry or just need a refresher, this course will help you to understand the basic principles, with the help of video tutorials and interactive case studies.

Basic principles include:

  1. Immunophenotypic features of various types of hematologic cells
  2. Labeling cellular elements with fluorochromes
  3. Blood cell identification, specifically B and T lymphocyte identification and analysis
  4. Cell sorting to isolate select cell populations for further analysis
  5. Analyzing and interpreting result reports and printouts
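As a toy illustration of principle 3 above, the sketch below "gates" events by intensity thresholds on two fluorochrome-labeled marker channels to label B and T lymphocytes. Channel names, thresholds, and event values are hypothetical; real analyses use compensated data and interactively drawn gates.

```python
# Arbitrary-unit intensity thresholds for two markers: CD3 is a T-cell
# marker, CD19 a B-cell marker (threshold values are invented).
CD3_THRESHOLD = 1000
CD19_THRESHOLD = 1000

def classify_event(cd3, cd19):
    """Label a single event by which marker gate it falls in."""
    if cd3 >= CD3_THRESHOLD and cd19 < CD19_THRESHOLD:
        return "T lymphocyte"
    if cd19 >= CD19_THRESHOLD and cd3 < CD3_THRESHOLD:
        return "B lymphocyte"
    return "other/ungated"

# Three hypothetical events: CD3-bright, CD19-bright, and double-negative.
events = [(2500, 120), (80, 3100), (40, 60)]
labels = [classify_event(cd3, cd19) for cd3, cd19 in events]
assert labels == ["T lymphocyte", "B lymphocyte", "other/ungated"]
```

Counting labels across tens of thousands of events per second is what makes flow cytometry's population-level reports (percentages of B cells, T cells, etc.) possible.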
