Advances in Nutrition: An International Review Journal
Current Issue

Mathematical Optimization to Explore Tomorrow's Sustainable Diets: A Narrative Review

Abstract: A sustainable diet is, by definition, nutritionally adequate, economically affordable, culturally acceptable, and environmentally respectful. Designing such a diet requires integrating different dimensions of diet sustainability that may not be compatible with each other. Among multicriteria assessment methods, diet optimization is a whole-diet approach that simultaneously combines several metrics for dimensions of diet sustainability. This narrative review, based on 67 published studies, shows how mathematical diet optimization can help with understanding the relations between the different dimensions of diet sustainability and how it can be properly used to identify sustainable diets. Diet optimization aims to find the optimal combination of foods for a population, a subpopulation, or an individual that fulfills a set of constraints while minimizing or maximizing an objective function. In the studies reviewed, diet optimization was used to examine the links between dimensions of diet sustainability, identify the minimum cost or environmental impact of a nutritionally adequate diet, or identify food combinations able to combine ≥2 sustainability dimensions. If some constraints prove difficult to fulfill, this signals an incompatibility between nutrient recommendations and overly monotonous food-consumption patterns, an inadequate supply of nutrient-rich foods, or a conflict with other sustainability dimensions. If diet optimization proves successful, it can serve to design nutritionally adequate, culturally acceptable, economically affordable, and environmentally friendly diets. Diet optimization results can help define dietary recommendations, tackle food security issues, and promote sustainable dietary patterns. This review emphasizes the importance of carefully choosing the model parameters (variables, objective function, constraints) and input data and the need for appropriate expertise to correctly interpret and communicate the results. Future research should improve the choice of metrics used to assess each aspect of a sustainable diet, especially the cultural dimension, to increase the practicability of the results.
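
To make the optimization formulation concrete, here is a minimal linear-programming sketch in Python (using scipy.optimize.linprog). The foods, prices, nutrient contents, and bounds are hypothetical placeholders rather than data from the reviewed studies; the objective minimizes diet cost subject to lower bounds on a few nutrients, the simplest instance of the approach described above.

```python
# Minimal diet-optimization sketch: minimize cost subject to nutrient constraints.
# All foods, prices, and nutrient contents below are hypothetical illustrations.
import numpy as np
from scipy.optimize import linprog

foods = ["whole-grain bread", "lentils", "milk", "apples"]  # hypothetical items
cost = np.array([0.30, 0.25, 0.40, 0.35])                   # cost per 100-g serving

# Rows: protein (g), fiber (g), calcium (mg) per 100-g serving (illustrative values).
nutrients = np.array([
    [9.0, 9.0, 3.4, 0.3],    # protein
    [7.0, 8.0, 0.0, 2.4],    # fiber
    [100., 19., 120., 6.],   # calcium
])
minimums = np.array([50.0, 25.0, 800.0])  # daily lower bounds (illustrative)

# linprog handles "A_ub x <= b_ub", so nutrient minimums become -nutrients x <= -minimums.
result = linprog(
    c=cost,
    A_ub=-nutrients,
    b_ub=-minimums,
    bounds=[(0, 10)] * len(foods),  # at most 10 servings of any food (acceptability proxy)
    method="highs",
)

if result.success:
    for food, servings in zip(foods, result.x):
        print(f"{food}: {servings:.1f} servings")
    print(f"minimum daily cost: {result.fun:.2f}")
else:
    print("No feasible diet: the constraints are incompatible.")  # mirrors the review's point
```

A failed solve (the else branch) is itself informative, as the abstract notes: it shows that the nutritional constraints cannot be met within the chosen acceptability or supply limits.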

Validity of the Body Adiposity Index in Predicting Body Fat in Adults: A Systematic Review

Abstract: The Body Adiposity Index (BAI) is a practical anthropometric method used to measure body fat (BF) percentage (BF%). Since its recent development, the validity and precision of the BAI have been studied in adult samples of men and women, in populations from different countries and ethnicities, across varying amounts of BF, and for sensitivity to detecting change over time. However, it is still necessary to determine its potential use in clinical practice and epidemiologic studies. Thus, our objective was to verify, through a systematic review, the validity of the BAI in predicting BF% in adults. Two independent researchers performed a search using the PubMed, Web of Science, Science Direct, and Scopus databases. In order to be included, the studies had to use dual-energy X-ray absorptiometry (DXA) as a reference method. We excluded studies with samples from individuals with diseases or syndromes that alter the regional distribution of BF%. We included 19 studies with samples of individuals from different continents, varied ethnicities, both sexes, and a wide age range (18–83 y). The concordance of the BAI with DXA, assessed by Lin's concordance correlation coefficient, was classified as poor (ρc < 0.90). Bland-Altman plots showed that the BAI produced large individual errors when predicting BF% in all studies using this analysis. The studies were consistent in affirming that the BAI showed limited capacity to estimate BF% in adults. The BAI shows wide individual errors of agreement with the reference method and a lack of sensitivity in detecting change in BF% over time. The method presents a systematic error of BF% overestimation in individuals with ≤20% BF and underestimation in individuals with >30% BF, regardless of sex, age, and ethnicity. The results of this systematic review provide sufficient evidence that the BAI does not perform satisfactorily, and its use is not recommended for BF% determination in adults.
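
For readers unfamiliar with the index, the sketch below computes the BAI with its commonly cited formula (hip circumference in cm divided by height in m raised to the power 1.5, minus 18) and summarizes agreement with DXA using Bland-Altman bias and 95% limits of agreement. The paired measurements are hypothetical and serve only to illustrate the calculations, not the review's findings.

```python
# Sketch: compute BAI from anthropometry and summarize agreement with DXA
# via Bland-Altman statistics. Measurements below are hypothetical examples.
import numpy as np

def body_adiposity_index(hip_circumference_cm, height_m):
    """Commonly cited BAI formula: hip (cm) / height (m)^1.5 - 18."""
    return hip_circumference_cm / height_m ** 1.5 - 18.0

# Hypothetical paired observations: anthropometry vs. DXA-measured BF%.
hip_cm = np.array([95.0, 102.0, 110.0, 88.0, 120.0])
height_m = np.array([1.70, 1.62, 1.75, 1.80, 1.65])
dxa_bf_percent = np.array([22.0, 31.0, 28.0, 15.0, 41.0])

bai = body_adiposity_index(hip_cm, height_m)

# Bland-Altman bias and 95% limits of agreement (mean difference +/- 1.96 SD).
differences = bai - dxa_bf_percent
bias = differences.mean()
loa = 1.96 * differences.std(ddof=1)
print(f"bias = {bias:.1f} BF% points, 95% limits of agreement = +/- {loa:.1f}")
```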

Role of Probiotics in Managing Gastrointestinal Dysfunction in Children with Autism Spectrum Disorder: An Update for Practitioners

Abstract: Children with autism spectrum disorder (ASD) are 4 times as likely to experience gastrointestinal symptoms as children without ASD. The gut microbiota has increasingly been the subject of investigation as a contributing factor to these symptoms in this population because there is evidence to suggest that alterations in the intestinal microflora are correlated with gastrointestinal and ASD symptom severity. Probiotic therapy has been proposed as a treatment for augmented gastrointestinal symptom severity in children with ASD. This narrative review systematically searched the literature to provide an update for practitioners on the state of the evidence surrounding probiotic therapy in children with ASD as a treatment option for reducing gastrointestinal symptoms. A total of 186 articles were screened and 5 articles met the inclusion criteria. A collective sample of 117 children with ASD is represented, and the outcomes addressed include improvement in gastrointestinal symptoms as well as the influence of probiotic supplementation on the gut microbiota and on ASD symptoms and behavior. There is promising evidence to suggest that probiotic therapy may improve gastrointestinal dysfunction, beneficially alter fecal microbiota, and reduce the severity of ASD symptoms in children with ASD. Future research is still warranted in this area because there are methodologic flaws in the available literature and the optimal species, strains, dosages, and duration of treatment have not been identified.

Perspective: Fundamental Limitations of the Randomized Controlled Trial Method in Nutritional Research: The Example of Probiotics

Abstract: Studies on the relation between health and nutrition are often inconclusive. There are concerns about the validity of many research findings, and methods that can deliver high-quality evidence—such as the randomized controlled trial (RCT) method—have been embraced by nutritional researchers. Unfortunately, many nutritional RCTs also yield ambiguous results. It has been argued that RCTs are ill-suited for certain settings, including nutritional research. In this perspective, we investigate whether there are fundamental limitations of the RCT method in nutritional research. To this end, and to limit the scope, we use probiotic studies as an example. We use an epistemological approach and evaluate the presuppositions that underlie the RCT method. Three general presuppositions are identified and discussed. We evaluate whether these presuppositions can be considered true in probiotic RCTs, which appears not always to be the case. This perspective concludes by exploring several alternative study methods that may be considered for future probiotic or nutritional intervention trials.

Perspective: Should Vitamin E Recommendations for Older Adults Be Increased?

Abstract: Current vitamin E requirements are uniformly applied across the population for those >14 y of age. However, aging is associated with alterations in cellular and physiologic functions, which are affected by vitamin E. Therefore, it is questionable whether vitamin E requirements can be uniformly applied to all adult age categories. With aging, there is dysregulation of the immune system in which there are decreased cell-mediated and pathogen defense responses coupled with an overactive, prolonged inflammatory state. Both animal and human studies in the aged suggest that intake above currently recommended levels of vitamin E may improve immune and inflammatory responses and be associated with a reduced risk of infectious disease. We review the evidence that was considered in establishing the current requirements for vitamin E and highlight data that should be considered in determining the vitamin E requirements in older adults, particularly focusing on the evidence suggesting a benefit of increased vitamin E intake on immune function and inflammatory processes and resistance to infection. The main objective of this Perspective is to initiate the discussion of whether the current Dietary Reference Intake for vitamin E should be increased for the older population. We make this suggestion on the basis of mechanistic studies showing biological plausibility, correction of a major cellular dysfunction in older adults, and strong evidence from several animal and a few human studies indicating a reduction in risk and morbidity from infections.

Perspective: Food-Based Dietary Guidelines in Europe—Scientific Concepts, Current Status, and Perspectives

Abstract: Food-based dietary guidelines (FBDGs) are important tools for nutrition policies and public health. FBDGs provide guidelines on healthy food consumption and are based on scientific evidence. In the past, disease prevention and nutrient recommendations dominated the process of establishing FBDGs. However, scientific advances and social developments such as changing lifestyles, interest in personalized health, and concerns about sustainability require a reorientation of the creation of FBDGs to include a wider range of aspects of dietary behavior. The present review evaluates current European FBDGs with regard to the concepts and aspects used in their derivation, and summarizes the major aspects currently under discussion for consideration in the future establishment or updating of FBDGs. We identified English information on official European FBDGs through an Internet search (FAO, PubMed, Google) and analyzed the aspects used for their derivation. Furthermore, we searched literature databases (PubMed, Google Scholar) for conceptual considerations dealing with FBDGs. A total of 34 out of 53 European countries were identified as having official FBDGs, and for 15 of these, documents with information on the scientific basis could be identified and described. Subsequently, aspects underlying the derivation of current FBDGs and aspects considered in the literature as important for future FBDGs were discussed. Eight aspects were identified: diet-health relations, nutrient supply, energy supply, dietary habits, sustainability, food-borne contaminants, target group segmentation, and individualization. The first 4 have already been widely applied in existing FBDGs; the others have almost never been taken into account. It remains a future challenge to (re)conceptualize the development of FBDGs, to operationalize the aspects to be incorporated in their derivation, and to convert concepts into systematic approaches. The current review may assist national expert groups and clarify the options for future development of local FBDGs.

Perspective: Novel Commercial Packaging and Devices for Complementary Feeding

Abstract: In recent years, so-called baby food pouches and other novel packaging and devices have been marketed for complementary feeding. To date, no experimental studies have been conducted to determine the health and nutrition effects or the safety of baby food pouches and related feeding devices. Yet these products hold the potential to fundamentally change the ways in which solid foods are consumed in infancy and early childhood. In this review, a selection of complementary feeding devices and their potential effects on breastfeeding, formula-feeding, safe and appropriate complementary feeding, and the timely transition to family foods are explored. Because manufacturers have adapted older designs of traditional feeding bottles and pacifiers for complementary feeding, perspectives on potential health effects and the safety of devices are drawn from research on feeding bottles and pacifiers. Recommendations include scaling up research on the safety, nutrition, and health impacts of commercial packaging and devices. In addition, manufacturers should ensure that devices conform to Consumer Product Safety Commission specifications and that instructions for use are in line with policies protecting pediatric dental health. Marketing of commercial devices and packaging should conform to the International Code of Marketing of Breastmilk Substitutes.

Use of Stable Isotopes to Evaluate Bioefficacy of Provitamin A Carotenoids, Vitamin A Status, and Bioavailability of Iron and Zinc

Abstract: The ability of nutrition scientists to measure the status, bioavailability, and bioefficacy of micronutrients is affected by lack of access to the parts of the body through which a nutrient may travel before appearing in accessible body compartments (typically blood or urine). Stable isotope–labeled tracers function as safe, nonradioactive tools to follow micronutrients in a quantitative manner because the absorption, distribution, metabolism, and excretion of the tracer are assumed to be similar to the unlabeled vitamin or mineral. The International Atomic Energy Agency (IAEA) supports research on the safe use of stable isotopes in global health and nutrition. This review focuses on IAEA's contributions to vitamin A, iron, and zinc research. These micronutrients are specifically targeted by the WHO because of their importance in health and worldwide prevalence of deficiency. These 3 micronutrients are included in food fortification and biofortification efforts in low- and middle-income regions of the world. Vitamin A isotopic techniques can be used to evaluate the efficacy and effectiveness of interventions. For example, total body retinol stores were estimated by using ¹³C₂-retinol isotope dilution before and after feeding Zambian children maize biofortified with β-carotene to determine if vitamin A reserves were improved by the intervention. Stable isotopes of iron and zinc have been used to determine mineral bioavailability. In Thailand, ferrous sulfate was better absorbed from fish sauce than was ferrous lactate or ferric ammonium citrate, determined with the use of different iron isotopes in each compound. Comparisons of one zinc isotope injected intravenously with another isotope taken orally from a micronutrient powder proved that the powder increased total absorbed zinc from a meal in Pakistani infants. Capacity building by the IAEA with appropriate collaborations in low- and middle-income countries to use stable isotopes has resulted in many advancements in human nutrition.
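
The sketch below illustrates only the basic isotope-dilution arithmetic that underlies such tracer techniques: a known tracer dose mixes with the unlabeled body pool, and the measured enrichment is used to back-calculate the pool size. The function name, the numbers, and the omission of correction factors for absorption, mixing, and tracer catabolism (which published protocols such as ¹³C₂-retinol isotope dilution do include) are simplifying assumptions for illustration.

```python
# Simplified isotope-dilution arithmetic (illustration only; real protocols apply
# correction factors for absorption, mixing time, and tracer catabolism).

def pool_size_from_dilution(tracer_dose_umol: float, enrichment: float) -> float:
    """Estimate the unlabeled pool size from tracer dose and measured enrichment.

    enrichment = tracer / (tracer + tracee) in the sampled compartment,
    assuming the tracer has fully mixed with the pool and none has been lost.
    """
    if not 0.0 < enrichment < 1.0:
        raise ValueError("enrichment must be between 0 and 1 (exclusive)")
    total_pool = tracer_dose_umol / enrichment   # tracer + tracee
    return total_pool - tracer_dose_umol         # unlabeled (tracee) pool

# Hypothetical example: a 1.0-umol labeled dose and a measured enrichment of 0.002
# after mixing would imply roughly a 499-umol unlabeled pool under these assumptions.
print(f"{pool_size_from_dilution(1.0, 0.002):.0f} umol")
```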

Americans’ Perceptions about Fast Food and How They Associate with Its Consumption and Obesity Risk

Abstract: We aimed to systematically examine Americans’ perceptions of fast food (FF) and how these perceptions might affect fast food consumption (FFC) and obesity risk. We searched PubMed and Google for studies published in English until February 17, 2017, that reported on Americans’ perceptions (defined as their beliefs, attitudes, and knowledge) regarding FF, as well as on the associations of these perceptions with FFC and obesity risk. Thirteen articles met the inclusion criteria. Limited research has been conducted on these topics, and most studies were based on convenience samples. A 2013 nationally representative phone survey of about 2000 subjects showed that one-fifth of Americans thought FF was good for health, whereas two-thirds considered it not good for health. Even among weekly FF consumers (47% of the total population), more than two-thirds thought FF was not good for health. Americans seem to have limited knowledge of the calories in FF. Negative and positive FF perceptions were associated with FFC. Those who consumed less FF seemed more likely to view FF negatively. When Americans valued the convenience and taste of FF and preferred FF restaurants with kids' menus and play areas, they were likely to purchase more FF. Available research indicates that neither perceived availability of FF nor Geographical Information System (GIS)-based FF presence in the neighborhood has a significant association with weekly FFC. No studies examined potential links between FF perceptions and obesity risk. Americans’ perceptions of FF and how they might associate with FFC and obesity risk are understudied. Considerable variation was observed in Americans’ perceptions and FFC.

Perspective: What Will It Cost to Scale-up Breastfeeding Programs? A Comparison of Current Global Costing Methodologies

Abstract: Breastfeeding is one of the most feasible and cost-effective maternal-child health interventions. Currently, global investments needed to achieve the WHO global nutrition target for exclusive breastfeeding (EBF) do not meet the recommended standards for economic investment and implementation of policies supporting mothers to breastfeed. Estimating the implementation costs of high-quality, high-impact programs based on each country's enabling environment and specific context is essential for developing and prioritizing recommendations that can drive the successful scaling-up of breastfeeding programs globally. We provide a detailed comparison (strengths, limitations, and gaps) of the 2 most recent global cost analysis frameworks used to estimate the financial needs for scaling-up breastfeeding interventions, from the World Breastfeeding Costing Initiative (WBCi) and the World Bank. Our comparison found that the World Bank presents the more advanced costing methodology for scaling-up breastfeeding programs. However, there is a need to adapt and improve this costing framework to guide individual countries based on key contextual factors that consider the complexity of health systems.

Protein

Abstract: Proteins are polymers of amino acids linked via α-peptide bonds. They can be represented as primary, secondary, tertiary, and even quaternary structures, but from a nutritional viewpoint only the primary (amino acid) sequence is of interest. Similarly, although there are many compounds in the body that can be chemically defined as amino acids, we are only concerned with the 20 canonical amino acids encoded in DNA, plus 5 others—ornithine, citrulline, γ-aminobutyrate, β-alanine, and taurine—that play quantitatively important roles in the body. We consume proteins, which are digested in the gastrointestinal tract, absorbed as small peptides (di- and tripeptides) and free amino acids, and then used for the resynthesis of proteins in our cells. Additionally, some amino acids are also used for the synthesis of specific (nonprotein) products, such as nitric oxide, polyamines, creatine, glutathione, nucleotides, glucosamine, hormones, neurotransmitters, and other factors. Again, such functions are not quantitatively important for most amino acids, and the bulk of amino acid metabolism is directly related to protein turnover (synthesis and degradation). For an individual in nitrogen balance, an amount of protein equal to that of the daily protein (nitrogen) intake is degraded each day, with the nitrogen being excreted as urea and ammonia (with limited amounts of creatinine and uric acid). The carbon skeletons of the amino acids degraded to urea and ammonia are recovered through gluconeogenesis or ketone synthesis, or oxidized to carbon dioxide. Of the 20 amino acids present in proteins, 9 are considered nutritionally indispensable (essential) in adult humans because the body is not able to synthesize their carbon skeletons. These 9 amino acids are leucine, valine, isoleucine, histidine, lysine, methionine, threonine, tryptophan, and phenylalanine. In addition, 2 others are made from their indispensable precursors: cysteine from methionine, and tyrosine from phenylalanine. Although arginine is needed in neonates, it appears that adults, with the possible exceptions of pregnancy in females and spermatogenesis in males, can synthesize sufficient arginine to maintain nitrogen balance. The others, glutamate, glutamine, aspartate, asparagine, serine, glycine, proline, and alanine, can all be synthesized from glucose and a suitable nitrogen source. Under some conditions, glutamine, glutamate, glycine, proline, and arginine may be considered conditionally indispensable, meaning that the body is not capable of synthesizing them in sufficient quantities for a specific physiologic or pathologic condition (1). Thus, any discussion of dietary protein must consider not only quantity but also quality (ratio of indispensable amino acids).
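
As a worked example of the nitrogen-balance bookkeeping described above, the sketch below converts protein intake to nitrogen using the conventional average factor of 6.25 g protein per g nitrogen and subtracts excreted nitrogen. The intake and loss values are hypothetical, and the miscellaneous-loss term is a rough placeholder.

```python
# Nitrogen-balance arithmetic (illustrative values; 6.25 g protein per g N is the
# conventional average conversion factor, not an exact constant for every protein).

PROTEIN_PER_G_NITROGEN = 6.25

def nitrogen_balance(protein_intake_g, urinary_n_g, fecal_n_g, misc_n_g=0.5):
    """Return nitrogen balance in g N/day: intake N minus measured/estimated losses."""
    nitrogen_intake = protein_intake_g / PROTEIN_PER_G_NITROGEN
    nitrogen_losses = urinary_n_g + fecal_n_g + misc_n_g  # sweat/skin losses lumped as misc
    return nitrogen_intake - nitrogen_losses

# Hypothetical adult: 80 g protein/day gives 12.8 g N intake; losses that roughly match
# intake yield a balance near zero, the steady state described in the text.
print(f"{nitrogen_balance(80.0, urinary_n_g=10.5, fecal_n_g=1.8):.1f} g N/day")
```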

Perspective: Limiting Dependence on Nonrandomized Studies and Improving Randomized Trials in Human Nutrition Research: Why and How

Abstract: A large majority of human nutrition research uses nonrandomized observational designs, but this has led to little reliable progress. This is mostly due to many epistemologic problems, the most important of which are as follows: difficulty detecting small (or even tiny) effect sizes reliably for nutritional risk factors and nutrition-related interventions; difficulty properly accounting for massive confounding among many nutrients, clinical outcomes, and other variables; difficulty measuring diet accurately; and suboptimal research reporting. Tiny effect sizes and massive confounding are largely unfixable problems that narrowly confine the scenarios in which nonrandomized observational research is useful. Although nonrandomized studies and randomized trials have different priorities (assessment of long-term causality compared with assessment of treatment effects), the odds for obtaining reliable information with the former are limited. Randomized study designs should therefore largely replace nonrandomized studies in human nutrition research going forward. To achieve this, many of the limitations that have traditionally plagued most randomized trials in nutrition, such as small sample size, short length of follow-up, high cost, and selective reporting, among others, must be overcome. Pivotal megatrials with tens of thousands of participants and lifelong follow-up are possible in nutrition science with proper streamlining of operational costs. Fixable problems that have undermined observational research, such as dietary measurement error and selective reporting, need to be addressed in randomized trials. For focused questions in which dietary adherence is important to maximize, trials with direct observation of participants in experimental in-house settings may offer clean answers on short-term metabolic outcomes. Other study designs of randomized trials to consider in nutrition include registry-based designs and “N-of-1” designs. Mendelian randomization designs may also offer some more reliable leads for testing interventions in trials. Collectively, an improved randomized agenda may clarify many things in nutrition science that might never be answered credibly with nonrandomized observational designs.
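
To illustrate why the article argues for megatrials, the sketch below runs a standard two-proportion sample-size approximation. The assumed 5% event rate, 10% relative risk reduction, two-sided 5% alpha, and 80% power are illustrative assumptions, not figures from the article; they show how a plausibly tiny nutritional effect pushes the required enrollment into the tens of thousands per group.

```python
# Why tiny effect sizes force very large trials: standard two-group sample-size
# approximation for comparing proportions (all effect-size inputs are illustrative).
from scipy.stats import norm

def n_per_group(p_control, p_treatment, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sided two-proportion z-test."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    p_bar = (p_control + p_treatment) / 2
    effect = abs(p_control - p_treatment)
    return 2 * p_bar * (1 - p_bar) * (z_alpha + z_beta) ** 2 / effect ** 2

# A 10% relative reduction of a 5% event rate (5.0% -> 4.5%) needs roughly
# 28,000 participants per arm to detect with 80% power.
print(f"{n_per_group(0.05, 0.045):,.0f} participants per group")
```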

Perspective: Are Large, Simple Trials the Solution for Nutrition Research?

Abstract: Nutritional research and policies have been criticized for relying on observational evidence, using self-report diet assessment methods, and supposedly being unable to present a consensus on what constitutes a healthy diet. In particular, it is often asserted that for progress to occur in nutrition science, large, simple trials, which have worked well in evaluating the efficacy of drugs, need to replace most observational research and small trials in nutrition. However, this idea is infeasible, and is unlikely to advance nutritional sciences or improve policies. This article addresses some commonly held and unfounded “myths” surrounding dietary assessments, effect sizes, and confounding, demonstrating how carefully conducted observational studies can provide reliable and reproducible evidence on diet and health. Also, there is a strong consensus among nutritional researchers and practitioners about the basic elements of a healthy diet. To move forward, we should continue to improve study design and diet assessment methodologies, reduce measurement errors, and leverage new technologies. Advances in the field lie in coalescing evidence from multiple study designs, methodologies, and technologies, and translating what we already know into policy and practice, so we can improve diet quality and enhance health in an equitable and sustainable manner across the world.

Chromium

Abstract: Two oxidation states of chromium are considered to be biologically and environmentally relevant based on their stability in the presence of water and oxygen. Compounds containing chromium(6+) are mutagenic and carcinogenic when inhaled, and potentially also when ingested orally in large quantities. Trivalent chromium is the focus of this work because it was proposed to be an essential element for mammals ∼60 y ago; however, in the last 2 decades its status has been questioned. Chromium has been postulated to be involved in regulating carbohydrate and lipid (and potentially also protein) metabolism by enhancing insulin's efficacy (1). However, in 2014, the European Food Safety Authority found no convincing evidence that chromium is an essential element (2). Dietary chromium apparently is absorbed via passive diffusion, and the extent of absorption is low (∼1%). Chromium is maintained in the bloodstream bound to the protein transferrin. It is generally believed to be delivered to tissues by transferrin via endocytosis (1). No unambiguous animal model of chromium deficiency has been established (2). One limitation in characterizing chromium deficiency in humans is the lack of an accepted biomarker of chromium nutritional status. Attempts to identify a glucose tolerance factor have not provided a chemically defined functional compound that conforms with the proposed physiologic role of chromium as a facilitator of insulin action in vivo.

Micronutrient Status: Potential Modifiers—Drugs and Chronic Disease

An individual's micronutrient status can be affected by a wide range of factors, including habitual diet, medications, dietary supplements, metabolic disorders, and surgery, particularly surgery involving the gastrointestinal system. Challenges related to assessing micronutrient sufficiency or excess hinge, in large part, on the difficulty of accurately determining micronutrient status. This challenge is of particular concern for the shortfall nutrients identified in the 2015–2020 US Dietary Guidelines for Americans (vitamins D and K and calcium) and overconsumed nutrients (sodium). In addition to some individuals in the United States having nutrient intakes below the Estimated Average Requirement or above the Tolerable Upper Intake Level, certain individuals have chronic conditions or illnesses or are taking medications or botanical supplements that may alter micronutrient bioavailability and thus alter requirements.

Proton Pump Inhibitors, H2-Receptor Antagonists, Metformin, and Vitamin B-12 Deficiency: Clinical Implications

Abstract: There is clear evidence that proton-pump inhibitors (PPIs), H2-receptor antagonists (H2RAs), and metformin can reduce serum vitamin B-12 concentrations by inhibiting the absorption of the vitamin. However, it is unclear if the effects of these drugs on serum vitamin B-12 are associated with increased risk of biochemical or functional deficiency (as indicated by elevated blood concentrations of homocysteine and methylmalonic acid) or clinical deficiency (including megaloblastic anemia and neurologic disorders such as peripheral neuropathy and cognitive dysfunction). This review provides an overview of vitamin B-12 absorption and biochemistry and the mechanisms by which PPIs, H2RAs, and metformin affect these functions. It also summarizes the literature relating the use of these drugs to the risk of vitamin B-12 deficiency. The review also discusses how strategies for assessing vitamin B-12 status and diagnosing vitamin B-12 deficiency have evolved in recent years beyond solely measuring serum total vitamin B-12. Multiple analyte testing, a strategy in which ≥2 of 4 biomarkers of vitamin B-12 status—serum total vitamin B-12, holotranscobalamin, homocysteine, and methylmalonic acid—are measured, increases sensitivity and specificity for diagnosing vitamin B-12 deficiency. It is concluded that randomized controlled trials that use multiple analyte testing are now needed to determine whether PPIs, H2RAs, and metformin do indeed increase the risk of vitamin B-12 deficiency. Until such studies are conducted, a reasonable recommendation for physicians and their patients who are taking these drugs is to monitor vitamin B-12 status and to provide vitamin B-12 supplements if altered blood biomarkers or clinical signs consistent with low or deficient vitamin B-12 status develop.
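
Multiple analyte testing can be read as a simple counting rule over biomarkers. The sketch below flags possible deficiency when at least 2 of the 4 biomarkers are abnormal in the relevant direction (low total vitamin B-12 and holotranscobalamin, high homocysteine and methylmalonic acid). The cutoff values are hypothetical placeholders for illustration only, not clinical thresholds from this review.

```python
# Multiple analyte testing as a counting rule: flag when >= 2 of 4 biomarkers are
# abnormal. Cutoff values below are hypothetical placeholders, not clinical guidance.

CUTOFFS = {
    "total_b12_pmol_L": ("low", 150.0),        # deficiency suggested by LOW values
    "holotranscobalamin_pmol_L": ("low", 35.0),
    "homocysteine_umol_L": ("high", 15.0),     # deficiency suggested by HIGH values
    "methylmalonic_acid_umol_L": ("high", 0.3),
}

def abnormal_biomarker_count(results: dict) -> int:
    count = 0
    for name, value in results.items():
        direction, cutoff = CUTOFFS[name]
        if (direction == "low" and value < cutoff) or (direction == "high" and value > cutoff):
            count += 1
    return count

patient = {
    "total_b12_pmol_L": 140.0,
    "holotranscobalamin_pmol_L": 40.0,
    "homocysteine_umol_L": 18.0,
    "methylmalonic_acid_umol_L": 0.25,
}
flagged = abnormal_biomarker_count(patient) >= 2
print("possible deficiency" if flagged else "no flag")  # here: 2 abnormal values -> flagged
```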

Coenzyme Q10 as Treatment for Statin-Associated Muscle Symptoms—A Good Idea, but…

Abstract: 3-Hydroxy-3-methylglutaryl coenzyme A reductase inhibitors (statins) are extremely well tolerated but are associated with a range of mild-to-moderate statin-associated muscle symptoms (SAMS). Estimates of SAMS incidence vary from <1% in industry-funded clinical trials to 10–25% in nonindustry-funded clinical trials and ∼60% in some observational studies. SAMS are important because they result in dose reduction or discontinuation of these life-saving medications, accompanied by higher healthcare costs and cardiac events. The mechanisms that produce SAMS are not clearly defined. Statins block the production of farnesyl pyrophosphate, an intermediate in the mevalonate pathway, which is responsible for the production of coenzyme Q10 (CoQ10). This knowledge has prompted the hypothesis that reductions in plasma CoQ10 concentrations contribute to SAMS. Consequently, CoQ10 is popular as a form of adjuvant therapy for the treatment of SAMS. However, the data evaluating the efficacy of CoQ10 supplementation have been equivocal, with some, but not all, studies suggesting that CoQ10 supplementation mitigates muscular complaints. This review discusses the rationale for using CoQ10 in SAMS, the results of CoQ10 clinical trials, the suggested management of SAMS, and the lessons learned about CoQ10 treatment of this problem.

Clinically Relevant Herb-Micronutrient Interactions: When Botanicals, Minerals, and Vitamins Collide

Abstract: The ability of certain foods to impair or augment the absorption of various vitamins and minerals has been recognized for many years. However, the contribution of botanical dietary supplements (BDSs) to altered micronutrient disposition has received little attention. Almost half of the US population uses some type of dietary supplement on a regular basis, with vitamin and mineral supplements constituting the majority of these products. BDS usage has also risen considerably over the last 2 decades, and a number of clinically relevant herb-drug interactions have been identified during this time. BDSs are formulated as concentrated plant extracts containing a plethora of unique phytochemicals not commonly found in the normal diet. Many of these uncommon phytochemicals can modulate various xenobiotic enzymes and transporters present in both the intestine and liver. Therefore, it is likely that the mechanisms underlying many herb-drug interactions can also affect micronutrient absorption, distribution, metabolism, and excretion. To date, very few prospective studies have attempted to characterize the prevalence and clinical relevance of herb-micronutrient interactions. Current research indicates that certain BDSs can reduce iron, folate, and ascorbate absorption, and others contribute to heavy metal intoxication. Researchers in the field of nutrition may not appreciate many of the idiosyncrasies of BDSs regarding product quality and dosage form performance. Failure to account for these eccentricities can adversely affect the outcome and interpretation of any prospective herb-micronutrient interaction study. This review highlights several clinically relevant herb-micronutrient interactions and describes several common pitfalls that often beset clinical research with BDSs.
