Advances in Nutrition: An International Review Journal, Current Issue

Perspective: Limiting Dependence on Nonrandomized Studies and Improving Randomized Trials in Human Nutrition Research: Why and How

Abstract: A large majority of human nutrition research uses nonrandomized observational designs, but this has led to little reliable progress. This is mostly due to many epistemologic problems, the most important of which are as follows: difficulty detecting small (or even tiny) effect sizes reliably for nutritional risk factors and nutrition-related interventions; difficulty properly accounting for massive confounding among many nutrients, clinical outcomes, and other variables; difficulty measuring diet accurately; and suboptimal research reporting. Tiny effect sizes and massive confounding are largely unfixable problems that narrowly confine the scenarios in which nonrandomized observational research is useful. Although nonrandomized studies and randomized trials have different priorities (assessment of long-term causality compared with assessment of treatment effects), the odds of obtaining reliable information with the former are limited. Randomized study designs should therefore largely replace nonrandomized studies in human nutrition research going forward. To achieve this, many of the limitations that have traditionally plagued most randomized trials in nutrition, such as small sample size, short length of follow-up, high cost, and selective reporting, among others, must be overcome. Pivotal megatrials with tens of thousands of participants and lifelong follow-up are possible in nutrition science with proper streamlining of operational costs. Fixable problems that have undermined observational research, such as dietary measurement error and selective reporting, need to be addressed in randomized trials. For focused questions in which dietary adherence is important to maximize, trials with direct observation of participants in experimental in-house settings may offer clean answers on short-term metabolic outcomes. Other randomized trial designs to consider in nutrition include registry-based designs and “N-of-1” designs. Mendelian randomization designs may also offer more reliable leads for testing interventions in trials. Collectively, an improved randomized agenda may clarify many things in nutrition science that might never be answered credibly with nonrandomized observational designs.

Perspective: Are Large, Simple Trials the Solution for Nutrition Research?

Abstract: Nutritional research and policies have been criticized for relying on observational evidence, using self-report diet assessment methods, and supposedly being unable to present a consensus on what constitutes a healthy diet. In particular, it is often asserted that for progress to occur in nutrition science, large, simple trials, which have worked well in evaluating the efficacy of drugs, need to replace most observational research and small trials in nutrition. However, this idea is infeasible, and is unlikely to advance nutritional sciences or improve policies. This article addresses some commonly held and unfounded “myths” surrounding dietary assessments, effect sizes, and confounding, demonstrating how carefully conducted observational studies can provide reliable and reproducible evidence on diet and health. Also, there is a strong consensus among nutritional researchers and practitioners about the basic elements of a healthy diet. To move forward, we should continue to improve study design and diet assessment methodologies, reduce measurement errors, and leverage new technologies. Advances in the field lie in coalescing evidence from multiple study designs, methodologies, and technologies, and translating what we already know into policy and practice, so we can improve diet quality and enhance health in an equitable and sustainable manner across the world.

Chromium

Abstract: Two oxidation states of chromium are considered to be biologically and environmentally relevant based on their stability in the presence of water and oxygen. Compounds containing chromium(6+) are mutagenic and carcinogenic when inhaled and potentially when ingested orally in large quantity as well. Trivalent chromium will be the focus of this work, as it was proposed to be an essential element for mammals ∼60 y ago; however, in the last 2 decades its status has been questioned. Chromium has been postulated to be involved in regulating carbohydrate and lipid (and potentially also protein) metabolism by enhancing insulin's efficacy (1). However, in 2014, the European Food Safety Authority found no convincing evidence that chromium is an essential element (2). Dietary chromium apparently is absorbed via passive diffusion, and the extent of absorption is low (∼1%). Chromium is maintained in the bloodstream bound to the protein transferrin and is generally believed to be delivered to tissues by transferrin via endocytosis (1). No unambiguous animal model of chromium deficiency has been established (2). One limitation in characterizing chromium deficiency in humans is the lack of an accepted biomarker of chromium nutritional status. Attempts to identify a glucose tolerance factor have not provided a chemically defined functional compound that conforms with the proposed physiologic role of chromium as a facilitator of insulin action in vivo.

Micronutrient Status: Potential Modifiers—Drugs and Chronic Disease

An individual's micronutrient status can be affected by a wide range of factors, including habitual diet, medications, dietary supplements, metabolic disorders, and surgery, particularly surgery involving the gastrointestinal system. Challenges related to assessing micronutrient sufficiency or excess hinge, in large part, on the difficulty of accurately determining micronutrient status. This challenge is of particular concern for the shortfall nutrients identified in the 2015–2020 US Dietary Guidelines for Americans (vitamins D and K and calcium) and overconsumed nutrients (sodium). In addition to intakes below the Estimated Average Requirement or above the Tolerable Upper Intake Level of nutrients by some in the United States, certain individuals have chronic conditions or illnesses or are taking medications or botanical supplements that may alter micronutrient bioavailability and thus alter requirements.

Proton Pump Inhibitors, H2-Receptor Antagonists, Metformin, and Vitamin B-12 Deficiency: Clinical Implications

Abstract: There is clear evidence that proton-pump inhibitors (PPIs), H2-receptor antagonists (H2RAs), and metformin can reduce serum vitamin B-12 concentrations by inhibiting the absorption of the vitamin. However, it is unclear if the effects of these drugs on serum vitamin B-12 are associated with increased risk of biochemical or functional deficiency (as is indicated by elevated blood concentrations of homocysteine and methylmalonic acid) or clinical deficiency (including megaloblastic anemia and neurologic disorders such as peripheral neuropathy and cognitive dysfunction). This review provides an overview of vitamin B-12 absorption and biochemistry and the mechanisms by which PPIs, H2RAs, and metformin affect these functions. It also summarizes the literature relating the use of these drugs to the risk of vitamin B-12 deficiency. The review also discusses how strategies for assessing vitamin B-12 status and diagnosing vitamin B-12 deficiency have evolved in recent years beyond solely measuring serum total vitamin B-12. Multiple analyte testing, a strategy in which ≥2 of 4 biomarkers of vitamin B-12 status—serum total vitamin B-12, holotranscobalamin, homocysteine, and methylmalonic acid—are measured, increases sensitivity and specificity for diagnosing vitamin B-12 deficiency. It is concluded that randomized controlled trials are now needed that use the strategy of multiple analyte testing to determine if PPIs, H2RAs, and metformin do indeed increase the risk of vitamin B-12 deficiency. Until these studies are conducted, a reasonable recommendation for physicians and their patients who are taking these drugs is to monitor vitamin B-12 status and to provide vitamin B-12 supplements if altered blood biomarkers or clinical signs consistent with low or deficient vitamin B-12 status develop.

Coenzyme Q10 as Treatment for Statin-Associated Muscle Symptoms—A Good Idea, but…

Abstract: 3-Hydroxy-3-methylglutaryl coenzyme A reductase inhibitors (statins) are extremely well tolerated but are associated with a range of mild-to-moderate statin-associated muscle symptoms (SAMS). Estimates of SAMS incidence vary from <1% in industry-funded clinical trials to 10–25% in nonindustry-funded clinical trials and ∼60% in some observational studies. SAMS are important because they result in dose reduction or discontinuation of these life-saving medications, accompanied by higher healthcare costs and cardiac events. The mechanisms that produce SAMS are not clearly defined. Statins block the production of farnesyl pyrophosphate, an intermediate in the mevalonate pathway, which is responsible for the production of coenzyme Q10 (CoQ10). This knowledge has prompted the hypothesis that reductions in plasma CoQ10 concentrations contribute to SAMS. Consequently, CoQ10 is popular as a form of adjuvant therapy for the treatment of SAMS. However, the data evaluating the efficacy of CoQ10 supplementation have been equivocal, with some, but not all, studies suggesting that CoQ10 supplementation mitigates muscular complaints. This review discusses the rationale for using CoQ10 in SAMS, the results of CoQ10 clinical trials, the suggested management of SAMS, and the lessons learned about CoQ10 treatment of this problem.

Clinically Relevant Herb-Micronutrient Interactions: When Botanicals, Minerals, and Vitamins Collide

Abstract: The ability of certain foods to impair or augment the absorption of various vitamins and minerals has been recognized for many years. However, the contribution of botanical dietary supplements (BDSs) to altered micronutrient disposition has received little attention. Almost half of the US population uses some type of dietary supplement on a regular basis, with vitamin and mineral supplements constituting the majority of these products. BDS usage has also risen considerably over the last 2 decades, and a number of clinically relevant herb-drug interactions have been identified during this time. BDSs are formulated as concentrated plant extracts containing a plethora of unique phytochemicals not commonly found in the normal diet. Many of these uncommon phytochemicals can modulate various xenobiotic enzymes and transporters present in both the intestine and liver. Therefore, it is likely that the mechanisms underlying many herb-drug interactions can also affect micronutrient absorption, distribution, metabolism, and excretion. To date, very few prospective studies have attempted to characterize the prevalence and clinical relevance of herb-micronutrient interactions. Current research indicates that certain BDSs can reduce iron, folate, and ascorbate absorption, and others contribute to heavy metal intoxication. Researchers in the field of nutrition may not appreciate many of the idiosyncrasies of BDSs regarding product quality and dosage form performance. Failure to account for these eccentricities can adversely affect the outcome and interpretation of any prospective herb-micronutrient interaction study. This review highlights several clinically relevant herb-micronutrient interactions and describes several common pitfalls that often beset clinical research with BDSs.

Dietary Inflammatory Index and Site-Specific Cancer Risk: A Systematic Review and Dose-Response Meta-Analysis

Abstract: Existing evidence suggests a link between the inflammatory potential of diet and risk of cancer. This study aimed to test the linear and potential nonlinear dose-response associations of the Dietary Inflammatory Index (DII), as a representative measure of the inflammatory features of the diet, and site-specific cancer risk. A systematic search was conducted with the use of PubMed and Scopus from 2014 to November 2017. Prospective cohort or case-control studies reporting the risk estimates of any cancer type for ≥3 categories of the DII were selected. Studies that reported the association between continuous DII score and cancer risk were also included. Pooled RRs were calculated by using a random-effects model. Eleven prospective cohort studies (total n = 1,187,474) with 28,614 incident cases and 29 case-control studies with 19,718 cases and 33,229 controls were identified. The pooled RRs for a 1-unit increment in the DII were as follows: colorectal cancer, 1.06 (95% CI: 1.04, 1.08; I2 = 72.5%; n = 9); breast cancer, 1.03 (95% CI: 1.00, 1.07; I2 = 84.0%; n = 7); prostate cancer, 1.06 (95% CI: 0.97, 1.15; I2 = 56.2%; n = 6); pancreatic cancer, 1.16 (95% CI: 1.05, 1.28; I2 = 61.6%; n = 2); ovarian cancer, 1.08 (95% CI: 1.03, 1.13; I2 = 0%; n = 2); esophageal squamous cell carcinoma, 1.24 (95% CI: 1.10, 1.38; I2 = 64.3%; n = 2); renal cell carcinoma, 1.08 (95% CI: 1.02, 1.13; I2 = 0%; n = 2); and esophageal adenocarcinoma, 1.26 (95% CI: 1.13, 1.39; I2 = 0%; n = 2). A nonlinear dose-response meta-analysis showed that, after a somewhat unchanged risk within initial scores of the DII, the risk of colorectal cancer increased linearly with increasing DII score. For breast and prostate cancers, the risk increased only slightly with increasing DII score. In conclusion, the results showed that dietary habits with high inflammatory features might increase the risk of site-specific cancers.
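The random-effects pooling used for RRs like those above is commonly done on the log scale with a DerSimonian-Laird estimate of between-study variance. A minimal sketch, using purely illustrative study estimates (not the studies pooled in the review):

```python
import math

# Hypothetical per-study inputs: (RR per 1-unit DII increment, SE of log RR).
studies = [(1.08, 0.02), (1.04, 0.03), (1.10, 0.04), (1.02, 0.05)]

log_rr = [math.log(rr) for rr, se in studies]
var = [se ** 2 for rr, se in studies]

# Fixed-effect (inverse-variance) pooling, needed for the Q heterogeneity statistic.
w_fixed = [1 / v for v in var]
pooled_fixed = sum(w * y for w, y in zip(w_fixed, log_rr)) / sum(w_fixed)
q = sum(w * (y - pooled_fixed) ** 2 for w, y in zip(w_fixed, log_rr))

# DerSimonian-Laird estimate of the between-study variance tau^2 (floored at 0).
df = len(studies) - 1
c = sum(w_fixed) - sum(w ** 2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights add tau^2 to each within-study variance, then pool.
w_re = [1 / (v + tau2) for v in var]
pooled_log = sum(w * y for w, y in zip(w_re, log_rr)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))

pooled_rr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * se_pooled),
      math.exp(pooled_log + 1.96 * se_pooled))
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # I^2 heterogeneity, %
print(f"pooled RR = {pooled_rr:.3f}, 95% CI ({ci[0]:.3f}, {ci[1]:.3f}), I2 = {i2:.1f}%")
```

Pooling on the log scale and exponentiating at the end keeps the CI asymmetric around the RR, as in the figures typically reported for such meta-analyses.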

A Systematic Review of Renal Health in Healthy Individuals Associated with Protein Intake above the US Recommended Daily Allowance in Randomized Controlled Trials and Observational Studies

Abstract: A systematic review was used to identify randomized controlled trials (RCTs) and observational epidemiologic studies (OBSs) that examined protein intake consistent with either the US RDA (0.8 g/kg or 10–15% of energy) or a higher protein intake (≥20% but <35% of energy or ≥10% higher than a comparison intake) and reported measures of kidney function. Studies (n = 26) of healthy, free-living adults (>18 y old) with or without metabolic disease risk factors were included. Studies of subjects with overt disease, such as chronic kidney disease, end-stage renal disease, cancer, or organ transplant, were excluded. The most commonly reported variable was glomerular filtration rate (GFR), with 13 RCTs comparing GFRs obtained with normal and higher protein intakes. Most (n = 8), but not all (n = 5), RCTs reported significantly higher GFRs in response to increased protein intake, and all rates were consistent with normal kidney function in healthy adults. The evidence from the current review is limited and inconsistent with regard to the role of protein intake and the risk of kidney stones. Increased protein intake had little or no effect on blood markers of kidney function. Evidence reported here suggests that protein intake above the US RDA has no adverse effect on blood pressure. All included studies were of moderate to high risk of bias and, with the exception of 2 included cohorts, were limited in duration (i.e., <6 mo). Data in the current review are insufficient to determine if increased protein intake from a particular source, i.e., plant or animal, influences kidney health outcomes. These data further indicate that, at least in the short term, higher protein intake within the range of recommended intakes for protein is consistent with normal kidney function in healthy individuals.

Caffeine in Kidney Stone Disease: Risk or Benefit?

Abstract: Kidney stone disease is a global health care problem, with a high recurrence rate after stone removal. It is thus crucial to develop effective strategies to prevent the formation of new or recurrent stones. Caffeine is one of the main components in caffeinated beverages worldwide (i.e., coffee, tea, soft drinks, and energy drinks). Previous retrospective and prospective studies have reported contradictory effects of caffeine on kidney stone risk. Although it has a diuretic effect on enhancing urinary output, it may slightly increase the stone risk index. However, 3 large cohorts have suggested a preventive role of caffeine in kidney stone disease. In addition, a recent in vitro study has addressed relevant mechanisms underlying the preventive role of caffeine against stone pathogenesis. This review summarizes the relevant data from previous evidence and discusses the association between caffeine consumption and kidney stone risk reduction.

Associations between Single Nucleotide Polymorphisms and Total Energy, Carbohydrate, and Fat Intakes: A Systematic Review

Abstract: A better understanding of the genetic underpinning of total energy, carbohydrate, and fat intake is a prerequisite to develop personalized dietary recommendations. For this purpose, we systematically reviewed associations between single nucleotide polymorphisms (SNPs) and total energy, carbohydrate, and fat intakes. Four databases were searched for studies that assessed an association between SNPs and total energy, carbohydrate, and fat intakes. Screening of articles and data extraction were performed independently by 2 reviewers. Articles in English or German, published between 1994 and September 2017, on human studies in adults and without specific populations were considered for the review. In total, 39 articles, including 86 independent loci, met the inclusion criteria. The fat mass and obesity–associated (FTO) gene as well as the melanocortin 4 receptor (MC4R) locus were most frequently studied. Limited significant evidence of an association between the FTO SNP rs9939609 and lower total energy intake and between the MC4R SNP rs17782313 and higher total energy intake was reported. Most of the other identified loci showed inconsistent results. In conclusion, there is no consistent evidence that the investigated SNPs are associated with or predictive of total energy, carbohydrate, and fat intakes.

Effect of Vitamin D Supplementation, Food Fortification, or Bolus Injection on Vitamin D Status in Children Aged 2–18 Years: A Meta-Analysis

Abstract: Meta-analyses on the effect of vitamin D intake on status in children are lacking, especially those focused on vitamin D–fortified foods. The objective of this meta-analysis was to investigate the effect of vitamin D interventions (fortified foods, supplements, bolus injections) on vitamin D status in children 2–18 y of age. Following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines, literature searches were conducted up to December 2016. Randomized placebo-controlled vitamin D interventions in healthy children aged 2–18 y were included. A random-effects model was used with I2 assessing heterogeneity. We included 26 trials (5403 children) with interventions (n = 9 fortified foods, n = 15 supplements, n = 2 bolus injections) providing 100–4000 IU vitamin D/d over 4 wk to 2 y. The serum 25-hydroxyvitamin D [25(OH)D] weighted mean difference for all 26 trials (23.5 nmol/L; 95% CI: 20.7, 26.3 nmol/L; I2 = 99.9%) resulted in a mean increase of 1.0 nmol/L (95% CI: 0.3, 1.7 nmol/L) for each increase of 100 IU vitamin D/d (per 1 µg/d: 0.4 nmol/L; 95% CI: 0.1, 0.7 nmol/L). The response per 100 IU vitamin D/d was greater in trials with a mean baseline serum 25(OH)D <30 nmol/L, with the use of fortified foods, and with baseline vitamin D intakes <100 IU/d. In conclusion, the serum 25(OH)D response to vitamin D intake differs on the basis of baseline status, intakes, and delivery mode, but not age, sex, or latitude.
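The two slopes quoted above are the same pooled estimate in different units: 1 µg of vitamin D equals 40 IU, so 100 IU is 2.5 µg and the per-µg slope is the per-100-IU slope divided by 2.5. A quick sketch of the conversion, with a hypothetical dose projection at the end:

```python
IU_PER_MICROGRAM = 40  # standard conversion: 1 µg vitamin D = 40 IU

# Pooled dose-response slope reported in the abstract.
rise_per_100_iu = 1.0  # nmol/L serum 25(OH)D per 100 IU vitamin D/d

# Express the same slope per µg/d: 100 IU = 2.5 µg, so divide by 2.5.
rise_per_microgram = rise_per_100_iu / (100 / IU_PER_MICROGRAM)
print(rise_per_microgram)  # 0.4 nmol/L per µg/d, matching the abstract

# Illustrative only (hypothetical dose): expected mean rise at 600 IU/d,
# assuming the linear pooled slope applies across this range.
dose_iu = 600
expected_rise = dose_iu / 100 * rise_per_100_iu
print(expected_rise)  # 6.0 nmol/L
```

The wide CI (0.3, 1.7 nmol/L per 100 IU/d) and the subgroup findings in the abstract mean any such linear projection is a rough average, not an individual prediction.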

Intrinsic and Extrinsic Factors Impacting Absorption, Metabolism, and Health Effects of Dietary Carotenoids

Abstract: Carotenoids are orange, yellow, and red lipophilic pigments present in many fruits and vegetables, as well as other food groups. Some carotenoids contribute to vitamin A requirements. The consumption and blood concentrations of specific carotenoids have been associated with reduced risks of a number of chronic conditions. However, the interpretation of large, population-based observational and prospective clinical trials is often complicated by the many extrinsic and intrinsic factors that affect the physiologic response to carotenoids. Extrinsic factors affecting carotenoid bioavailability include food-based factors, such as co-consumed lipid, food processing, and molecular structure, as well as environmental factors, such as interactions with prescription drugs, smoking, or alcohol consumption. Intrinsic, physiologic factors associated with blood and tissue carotenoid concentrations include age, body composition, hormonal fluctuations, and variation in genes associated with carotenoid absorption and metabolism. To most effectively investigate carotenoid bioactivity and to utilize blood or tissue carotenoid concentrations as biomarkers of intake, investigators should either experimentally or statistically control for confounding variables affecting the bioavailability, tissue distribution, and metabolism of carotene and xanthophyll species. Although much remains to be investigated, recent advances have highlighted that lipid co-consumption, baseline vitamin A status, smoking, body mass and body fat distribution, and genetics are relevant covariates for interpreting serum or plasma carotenoid responses. These and other intrinsic and extrinsic factors are discussed, highlighting remaining gaps in knowledge and opportunities for future research. To provide context, we review the state of knowledge with regard to the prominent health effects of carotenoids.

Autophagy: The Last Defense against Cellular Nutritional Stress

Abstract: Homeostasis of nutrient metabolism is critical for maintenance of the normal physiologic status of the cell and the integral health of humans and mammals. In vivo, there is a highly efficient and precise process involved in nutrient recycling and organelle cleaning. This process, named autophagy, can be induced in response to dynamic changes in nutrients. When cells face nutritional stress, such as stress caused by nutrient deficiency or nutrient excess, the autophagy pathway is activated. Generally, when nutrients are withdrawn, cells sense the signs of starvation and respond. AMP-activated protein kinase and the mammalian target of rapamycin, two of the major metabolic kinases, are responsible for monitoring cellular energy and the concentration of amino acids, respectively. Nutrient excess also induces autophagy, mainly via the reactive oxygen species and endoplasmic reticulum stress pathways. When nutritional stress activates the autophagy pathway, nutrients or damaged organelles are recycled for cell survival. However, if autophagy is excessively induced, autophagic cell death may occur. The balance of autophagy induction is thus the crucial factor for cell survival or death. Herein, we summarize the current knowledge on the induction of autophagy, the autophagy response under nutritional stress, and autophagic cell death and related diseases. This summary highlights the process of nutritional stress-induced autophagy and its important physiologic and pathologic roles in cell metabolism and disease, and sheds light on research into the mechanism and clinical applications of autophagy induced by nutritional stress.

Perspective: Protein Requirements and Optimal Intakes in Aging: Are We Ready to Recommend More Than the Recommended Daily Allowance?

Abstract: The Dietary Reference Intakes set the protein RDA for persons >19 y of age at 0.8 g protein · kg body weight⁻¹ · d⁻¹. A growing body of evidence suggests, however, that the protein RDA may be inadequate for older individuals. The evidence for recommending a protein intake greater than the RDA comes from a variety of metabolic approaches. Methodologies centered on skeletal muscle are of paramount importance given the age-related decline in skeletal muscle mass and function (sarcopenia) and the degree to which dietary protein could mitigate these declines. In addition to evidence from short-term experimental trials, observational data show that higher protein intakes are associated with greater muscle mass and, more importantly, better muscle function with aging. We are in dire need of more evidence from longer-term intervention trials showing the efficacy of protein intakes that are higher than the RDA in older persons to support skeletal muscle health. We propose that it should be recommended that older individuals consume ≥1.2 g protein · kg⁻¹ · d⁻¹ and that there should be an emphasis on the intake of the amino acid leucine, which plays a central role in stimulating skeletal muscle anabolism. Critically, the often-cited potential negative effects of consuming higher protein intakes on renal and bone health are without a scientific foundation in humans.
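Because both the RDA and the proposed intake are expressed per kilogram of body weight, the absolute difference depends on body size. A minimal sketch, using a hypothetical 70-kg older adult for illustration:

```python
def daily_protein_g(body_weight_kg: float, intake_g_per_kg: float) -> float:
    """Daily protein target in grams for a given per-kg recommendation."""
    return body_weight_kg * intake_g_per_kg

weight = 70.0  # hypothetical 70-kg older adult

rda = daily_protein_g(weight, 0.8)       # current RDA: 0.8 g/kg/d
proposed = daily_protein_g(weight, 1.2)  # perspective's proposal: >=1.2 g/kg/d

# For this body weight, the proposal adds roughly 28 g protein/d over the RDA.
print(rda, proposed)  # ~56 and ~84 g/d
```

Scaling by body weight rather than quoting a single gram target is what lets the same recommendation apply across body sizes; the per-kg figures themselves come from the abstract.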

Perspective: Structure-Function Claims on Infant Formula

Abstract: In the context of a food product label, the term “claim” refers to information that attributes value to the product. The term extends to many different types of information, from product identity, descriptors of intended use, and identification of characteristic properties to the physiologic effects in the body of substances in the food, including the reduction of risk of disease. Food labeling, which includes claims, provides information that consumers want and use to improve their diets. Consumers prefer short front-label claim statements to longer, more detailed information, such as ingredient statements and a nutrition panel. Three types of claims are permitted in the United States. Nutrient content claims describe the level of the nutrient in the food relative to an established daily value, e.g., “Excellent source of choline,” and are subject to composition limits for other nutrients, such as total fat, saturated fat, and cholesterol. Health claims describe the relation between a food substance and the risk of disease, e.g., “Adequate calcium and vitamin D throughout life, as part of a well-balanced diet, may reduce the risk of osteoporosis.” They must undergo a premarket evaluation by the FDA to ensure that there is significant scientific agreement about the relation in question. The third type of claim, structure-function (SF) claims, has recently come under scrutiny, particularly regarding their use on infant formula. Such claims represent a food's effect on the structure or function of the body for maintenance of good health and nutrition. These claims must be truthful and not misleading, but are not subject to premarket approval before use. The purpose of this perspective is to describe the origins and unique niche of SF claims, and to comment on recent proposals to further regulate such claims on infant formula.

Molybdenum

Abstract: Molybdenum, a trace element essential for micro-organisms, plants, and animals, was discovered in 1778 by the Swedish chemist Carl Wilhelm Scheele. Initially mistaken for lead, molybdenum was named after the Greek word molybdos, meaning lead-like. In the 1930s, it was recognized that ingestion of forage with high amounts of molybdenum by cattle caused a debilitating condition. In the 1950s, the essentiality of molybdenum was established with the discovery of the first molybdenum-containing enzymes. In humans, only 4 enzymes requiring molybdenum have been identified to date: sulfite oxidase, xanthine oxidase, aldehyde oxidase, and mitochondrial amidoxime-reducing component (mARC). Sulfite oxidase, an enzyme found in mitochondria, catalyzes oxidation of sulfite to sulfate, the final step in oxidation of sulfur amino acids (cysteine and methionine). Xanthine oxidase converts hypoxanthine to xanthine and xanthine to uric acid; because hypoxanthine, formed from spontaneous deamination of adenine, can pair with cytosine in place of thymine, its removal helps prevent DNA mutations. Aldehyde oxidase is abundant in the liver and is an important enzyme in phase 1 drug metabolism. Finally, mARC, discovered less than a decade ago, works in concert with cytochrome b5 type B and NAD(H) cytochrome b5 reductase to reduce a variety of N-hydroxylated substrates, although the physiologic significance is still unclear. In each of the molybdenum enzymes, activity is catalyzed via a tricyclic cofactor composed of a pterin, a dithiolene, and a pyran ring, called the molybdenum cofactor (MoCo) (1).

Microbiome-Mediated Effects of the Mediterranean Diet on Inflammation

Abstract: The Mediterranean diet pattern is increasingly associated with improved metabolic health. Two mechanisms by which consuming a Mediterranean diet pattern may contribute to improved metabolic health are modulation of the gastrointestinal (GI) microbiota and reduction of metabolic endotoxemia. Metabolic endotoxemia, defined as a 2- to 3-fold increase in circulating levels of bacterial endotoxin, has been proposed as a cause of inflammation during metabolic dysfunction. As the largest source of endotoxins in the human body, the GI microbiota represents a crucial area for research on strategies for reducing endotoxemia. Diets high in saturated fat and low in fiber contribute to metabolic endotoxemia through several mechanisms, including changes in the GI microbiome and bacterial fermentation end products, intestinal physiology and barrier function, and enterohepatic circulation of bile acids. Thus, the Mediterranean diet pattern, rich in unsaturated fats and fiber, may be one dietary strategy to reduce metabolic endotoxemia. Preclinical studies have demonstrated the differential effects of dietary saturated and unsaturated fats on the microbiota and metabolic health, but human studies are lacking. The role of dietary fiber and the GI microbiome in metabolic endotoxemia is underinvestigated. Clinical research on the effects of different types of dietary fat and fiber on the GI microbiota and GI and systemic inflammation is necessary to determine efficacious dietary strategies for reducing metabolic endotoxemia, inflammation, and subsequent metabolic disease.

Impact of Double-Fortified Salt with Iron and Iodine on Hemoglobin, Anemia, and Iron Deficiency Anemia: A Systematic Review and Meta-Analysis

Abstract: Double-fortified salt (DFS) containing iron and iodine has been proposed as a feasible and cost-effective alternative for iron fortification in low- and middle-income countries (LMICs). We conducted a systematic review and meta-analysis of randomized and quasi-randomized controlled trials to 1) assess the effect of DFS on biomarkers of iron status and the risk of anemia and iron deficiency anemia (IDA) and 2) evaluate differential effects of DFS by study type (efficacy or effectiveness), population subgroups, iron formulation (ferrous sulfate, ferrous fumarate, and ferric pyrophosphate), iron concentration, duration of intervention, and study quality. A systematic search with the use of MEDLINE, EMBASE, Cochrane, Web of Science, and other sources identified 221 articles. Twelve efficacy and 2 effectiveness studies met prespecified inclusion criteria. All studies were conducted in LMICs: 10 in India, 2 in Morocco, and 1 each in Côte d'Ivoire and Ghana. In efficacy studies, DFS increased hemoglobin concentrations [standardized mean difference (SMD): 0.28; 95% CI: 0.11, 0.44; P < 0.001] and reduced the risk of anemia (RR: 0.59; 95% CI: 0.46, 0.77; P < 0.001) and IDA (RR: 0.37; 95% CI: 0.25, 0.54; P < 0.001). In effectiveness studies, the effect size for hemoglobin was smaller but significant (SMD: 0.03; 95% CI: 0.01, 0.05; P < 0.01). Stratified analyses of efficacy studies by population subgroups indicated positive effects of DFS among women and school-age children. For the latter, DFS increased hemoglobin concentrations (SMD: 0.32; 95% CI: 0.03, 0.60; P < 0.05) and reduced the risk of anemia (RR: 0.48; 95% CI: 0.34, 0.67; P < 0.001) and IDA (RR: 0.37; 95% CI: 0.25, 0.54; P < 0.001). Hemoglobin concentrations, anemia prevalence and deworming at baseline, sample size, and study duration were not associated with effect sizes. The results indicate that DFS is efficacious in increasing hemoglobin concentrations and reducing the risk of anemia and IDA in LMIC populations. More effectiveness studies are needed.
