The American Journal of Clinical Nutrition, 2007, Vol. 86, No. 6

Reassessing folic acid consumption patterns in the United States (1999–2004): potential effect on neural tube defects and overexposure to folate

Source: The American Journal of Clinical Nutrition


Eoin P Quinlivan1 and Jesse F Gregory, III1

1 From the Biomedical Mass Spectrometry Laboratory, General Clinical Research Center (EPQ), and the Food Science and Human Nutrition Department (JFG), University of Florida, Gainesville, FL

2 Supported by NIH grants no. DK37481 and no. DK56274 and by the Florida Agricultural Experiment Station.

3 Reprints not available. Address correspondence to JF Gregory, Food Science and Human Nutrition Department, PO Box 110370, University of Florida, Gainesville, FL 32611-0370. E-mail: jfgy{at}ufl.edu.


ABSTRACT  
Background: In the United States, folic acid fortification of cereal-grain foods has significantly increased folate status. However, blood folate concentrations have decreased from their postfortification high as a result, in part, of decreasing food fortification concentrations and the popularity of low-carbohydrate weight-loss diets.

Objectives: The objectives of the study were to quantify changes in folate intake after folic acid fortification and to estimate the effect on neural tube defect (NTD) occurrence.

Design: Expanding on an earlier model, we used data from 11 intervention studies to determine the relation between chronic folate intervention and changes in steady-state serum folate concentrations. With serum folate data from the National Health and Nutrition Examination Survey (NHANES), we used reverse prediction to calculate postfortification changes in dietary folate equivalents (DFEs). With the use of NHANES red blood cell folate data and a published equation that related NTD risk to maternal red cell folate concentrations, we calculated NTD risk.

Results: Folate intake decreased by 130 µg DFE/d from its postfortification high, primarily as a result of changes seen in women with the highest folate status. This decrease in folate intake was predicted to increase the incidence of NTD by 4–7%, relative to a predicted 43% postfortification decrease. In addition, the number of women consuming >1 mg bioavailable folate/d decreased.

Conclusions: Folate consumption by women of childbearing age in the United States has decreased. However, the decrease in those women with the lowest folate status was disproportionately small. Consequently, the effect on NTD risk should be less than would be seen if a uniform decrease in folate concentrations had occurred. These results reinforce the need to maintain monitoring of the way fortification is implemented.

Key Words: Food fortification • folic acid • folate • neural tube defects • nutrition


INTRODUCTION  
In January 1998, in an attempt to reduce the incidence of neural tube defects (NTDs), the addition of folic acid to enriched-grain products became mandatory in the United States (1). The folic acid fortification levels were originally set so as to maximize folate consumption by women of childbearing age and to minimize the number of persons consuming >1 mg folic acid/d. However, it soon became evident that folic acid intake derived from fortification was almost twice that originally envisioned (2, 3). This larger-than-expected increase in folic acid consumption resulted in part from wide-scale overfortification of enriched-grain products; initial studies suggested that fortified foods typically contained 160% (4) to 175% (5) of the mandated amount of folic acid. Recent anecdotal and empirical evidence suggests that folic acid fortification levels have decreased in recent years and that they are coming more in line with mandated levels (6). This apparent reduction in fortification levels, coupled with the popularity of low-carbohydrate diets (7), has been cited as an explanation for the decreases in serum and red blood cell (RBC) folate concentrations observed in the annual National Health and Nutrition Examination Survey (NHANES) from 1999 through 2004 (8).

We set out to quantify the change in folate consumption since the introduction of folic acid fortification and to quantify the effect this change may have had on the incidence of NTDs and on the extent of folate overconsumption.


METHODS  
Relation between changes in folate consumption and changes in serum folate concentration
Previously, we (2) identified 4 studies (9–12) in which serum or plasma folate concentrations were measured before and after oral folic acid intervention (Table 1). Another 7 studies (13–19), which were published after our original report, were also identified. In all studies, intervention periods were sufficient to achieve plateau serum folate concentrations (9, 10). Daily folate intervention from each of these studies was expressed as dietary folate equivalents (DFEs); DFEs are calculated as folic acid × 1.7 on the basis that folic acid (20) and, by extension, folate monoglutamates (13, 21), are 1.7 times more bioavailable than are food folates. From this calculation, we plotted the change in serum or plasma folate concentration due to the intervention versus daily folate dose and calculated the linear regression equation describing that relation. The regression line defined by the data of van Oort et al (19) appeared to be different from the regression defined by the other data points. We used a coincidence test, with the equation defined by Kleinbaum et al (22), calculated by using an Excel spreadsheet (Microsoft Corp, Redmond, WA), to determine whether the difference was significant. The regression linearity was determined by using SIGMASTAT for WINDOWS statistical software (version 3.00; SPSS Inc, Chicago, IL).
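The dose-response fit described above can be sketched in a few lines. The coefficients recovered below reproduce the regression line reported in Figure 1 (y = 0.0145x + 0.132), but the data points are illustrative placeholders, not the actual intervention values from Table 1:

```python
import numpy as np

def to_dfe(folic_acid_ug_per_day):
    """Convert a daily folic acid dose (ug/d) to dietary folate
    equivalents (ug DFE/d), using the 1.7 bioavailability factor (20)."""
    return 1.7 * folic_acid_ug_per_day

def fit_dose_response(dfe, delta_serum):
    """Ordinary least-squares fit of the change in plateau serum folate
    concentration (ng/mL) against daily intake (ug DFE/d).
    Returns (slope, intercept, r)."""
    slope, intercept = np.polyfit(dfe, delta_serum, 1)
    r = np.corrcoef(dfe, delta_serum)[0, 1]
    return slope, intercept, r

# Hypothetical intervention doses (ug folic acid/d); the responses are
# placed exactly on the paper's reported line for illustration only.
doses = np.array([100.0, 200.0, 400.0, 800.0])
dfe = to_dfe(doses)
delta = 0.0145 * dfe + 0.132
slope, intercept, r = fit_dose_response(dfe, delta)
```

With real Table 1 data the points would scatter around the line and r would fall below 1, as in the reported fit (r = 0.979).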


TABLE 1. Change in serum or plasma folate concentrations observed in intervention studies on the effect of oral folic acid consumption1

 
Red blood cell and serum folate concentrations
Prefortification (23) and postfortification (8) serum and RBC folate concentrations for women of childbearing age were taken from published NHANES studies.

Changes in folate consumption
By subtracting the 1988–1994 NHANES III (23) serum folate data from the data of each of the NHANES postfortification surveys (8), we calculated the change in serum folate concentration over this period. Using reverse prediction, comparing these changes in serum folate concentration against the linear regression equation derived above, we calculated the apparent change in daily folate consumption since fortification.
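Reverse prediction here is simply inversion of the fitted line. A minimal sketch, using the Figure 1 coefficients and an illustrative (not survey-derived) serum folate change:

```python
def reverse_predict_dfe(delta_serum_ng_ml, slope=0.0145, intercept=0.132):
    """Invert the Figure 1 regression (y = slope*x + intercept) to recover
    the change in daily intake (ug DFE/d) implied by an observed change
    in serum folate concentration (ng/mL)."""
    return (delta_serum_ng_ml - intercept) / slope

# Illustrative: a 7.5-ng/mL rise in median serum folate between surveys
# corresponds to roughly a 508 ug DFE/d increase in daily intake.
delta_dfe = reverse_predict_dfe(7.5)
```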

Total daily folate consumption
Because we could determine the change in daily folate consumption from the change in serum folate concentration, we then tested whether total daily folate consumption could be determined from total serum folate concentrations. Using reverse prediction, comparing total serum folate concentrations from the NHANES surveys (8, 23) against the linear regression equation derived above, we calculated the apparent total daily folate intake in DFEs.

To validate this hypothesis, we used serum folate concentrations from the Framingham Offspring Cohort (24) to calculate apparent total DFEs for that group. We compared these calculated values to published DFE values from the same cohort that were derived by using food-frequency questionnaires (3). (The food composition tables used to analyze the food-frequency questionnaires had been adjusted to reflect actual folic acid fortification.)

Neural tube defect risk
Daly et al (25) derived an equation that defines the relation between RBC folate concentrations and NTD risk. We used this equation and RBC folate concentrations from the NHANES surveys (8, 23) to calculate the NTD risk for each of the NHANES surveys. NTD risk was expressed relative to the median prefortification group.
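The shape of this calculation can be sketched without reproducing Daly et al's fitted coefficients. In the sketch below, the log-linear form and the placeholder constants A and B are assumptions for illustration, not Daly et al's published values (which are in reference 25); only the final step, expressing risk relative to the prefortification median, mirrors the method described above:

```python
import math

# Placeholder coefficients for a hypothetical log-linear risk model;
# these are NOT the parameters fitted by Daly et al (25).
A, B = 1.0, 1.5

def ntd_risk(rbc_folate_ng_ml, a=A, b=B):
    """Hypothetical NTD risk as a decreasing function of RBC folate."""
    return math.exp(a - b * math.log(rbc_folate_ng_ml))

def relative_ntd_risk(rbc_folate_ng_ml, reference_rbc_folate_ng_ml):
    """Risk expressed relative to a reference group (here, the median
    prefortification RBC folate concentration), as in Table 5."""
    return ntd_risk(rbc_folate_ng_ml) / ntd_risk(reference_rbc_folate_ng_ml)
```

With this log-linear form the constant A cancels in the ratio, so the relative risk depends only on the ratio of the two RBC folate concentrations.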


RESULTS  
Relation between changes in folate consumption and changes in serum folate concentration
Changes in serum or plasma folate concentrations were plotted against DFEs (Figure 1). Because the slope in the study of van Oort et al (19) differed significantly [P < 0.01, as determined by a coincidence test (22)] from the slope defined by the data points from the other studies, that study was excluded from the final analysis. Because van Oort et al used a competitive binding assay (Immulite 2000; Diagnostic Products Co, Los Angeles, CA) to measure serum folate concentrations, it is possible that the elevated folate concentrations observed were due to systematic error within the assay. Such folate-binding assays can show a high degree of variability (26), depending on the folate species present, the calibrants, and the assay conditions. For instance, the Immulite 2000 assay has been reported to give higher RBC folate concentrations than does the microbiological assay (27).
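The coincidence test used to justify that exclusion compares the fit of a single pooled regression against separate regressions for the two datasets. A sketch under the standard F-test formulation is given below; the data are made up for illustration, and reference (22) should be consulted for the exact form the authors used:

```python
import numpy as np

def coincidence_F(x1, y1, x2, y2):
    """F statistic for testing whether two simple linear regressions
    coincide (same slope AND intercept). Compares the residual sum of
    squares of one pooled line against separate lines;
    df = (2, n1 + n2 - 4)."""
    def sse(x, y):
        # Sum of squared residuals about an ordinary least-squares line.
        coeffs = np.polyfit(x, y, 1)
        return float(np.sum((y - np.polyval(coeffs, x)) ** 2))

    sse_sep = sse(x1, y1) + sse(x2, y2)
    sse_pooled = sse(np.concatenate([x1, x2]), np.concatenate([y1, y2]))
    df2 = len(x1) + len(x2) - 4
    return ((sse_pooled - sse_sep) / 2) / (sse_sep / df2)

# Illustrative data: two studies whose responses follow clearly
# different slopes, so the pooled line fits poorly and F is large.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_a = np.array([0.1, 2.0, 3.9, 6.1, 8.0])    # slope close to 2
y_b = np.array([0.0, 5.1, 9.9, 15.1, 19.9])  # slope close to 5
F = coincidence_F(x, y_a, x, y_b)
```

A large F (compared with the critical value of the F distribution on 2 and n1 + n2 − 4 df) rejects coincidence, as happened for the van Oort et al data.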


FIGURE 1. Relation between controlled folate intake and the resulting change in median or mean serum or plasma folate concentration. Data were derived from intervention studies of the effect of longitudinal folate supplementation or fortification with known daily amounts of folate on median or mean serum or plasma folate concentrations. Intervention was with folic acid or 5-CH3-THF. Daily folic acid intervention was converted to dietary folate equivalents (DFE) by multiplying by 1.7 (20). Likewise, daily 5-CH3-THF intervention was converted to DFE after adjustment for differences in mass between 5-CH3-THF and folic acid. Dotted line: the regression line derived in our original study (2). Dashed line: the regression line derived from van Oort et al (19); these data were excluded from the final analysis because the slope was determined by a coincidence test (22) to be significantly different (P < 0.01) from the slope defined by the other data points. Solid line: the regression line derived from all intervention studies except van Oort et al (y = 0.0145x + 0.132; r = 0.979, P < 0.001).

 
The relation defined by the data from the remaining studies was linear (r = 0.979, P < 0.001). Comparing the data points from our original study (2) with those from the current, expanded study, we found no significant (P > 0.6) difference in slope between the 2 sets of data. The age of the subjects in the intervention studies had no significant effect on serum folate response (data not shown).

Red blood cell and serum folate concentrations
Both RBC (Table 2) and serum (Table 3) folate concentrations increased between 1988–1994 and 1999–2000, and then they decreased each year from 1999–2000 to 2003–2004. Between 1988–1994 and 1999–2000, the percentage increase in serum and RBC folate concentration was smallest in the women with the highest folate status. In contrast, the percentage decline in serum and RBC folate concentrations between 1999–2000 and 2003–2004 was greatest in the women with the highest folate status.


TABLE 2. Change in red blood cell (RBC) folate concentrations between the third National Health and Nutrition Examination Survey (NHANES III) and the annual NHANES surveys from 1999 through 2004

 

TABLE 3. Change in serum folate concentrations between the third National Health and Nutrition Examination Survey (NHANES III) and the annual NHANES surveys from 1999 through 2004

 
Changes in folate consumption
Median folate consumption (Table 4) increased by 529 µg DFE/d between 1988–1994 (before fortification) and 1999–2000 (after fortification); it then decreased by 135 µg DFE/d between 1999–2000 and 2003–2004. This overall decrease in folate consumption was primarily the result of changes in subjects with the highest folate status: eg, folic acid consumption decreased by 20 and 74 µg DFE/d in women in the 10th and 25th percentile of serum folate, respectively, whereas the decrease was 215 and 417 µg DFE/d for women in the 75th and 90th percentile, respectively.


TABLE 4. Change in daily folate intake between the third National Health and Nutrition Examination Survey (NHANES III) and the annual NHANES surveys from 1999 through 2004 and total daily folate intake in each study year, stratified by percentile of serum folate concentration1

 
Total folate consumption
The following findings supported our hypothesis that total daily folate intake can be estimated by using serum folate concentrations. As shown in Figure 2, we found a strong correlation (r = 0.9761, P < 0.001) between total daily folate intakes, estimated in this manner, and published intake values (3), estimated by using food-frequency questionnaires (updated for actual folic acid concentrations in food).


FIGURE 2. Relation between total daily folate intake calculated by regression (see Figure 1) versus intakes calculated by using corrected food-frequency questionnaires (y = 1.106x + 10.387; r = 0.9761, P < 0.001).

 
As expected, total folate consumption increased in the year after mandatory fortification (1999–2000); we estimate that subjects in the 90th percentile consumed a total of 1666 µg DFE/d. However, by 2003–2004, total folate consumed by subjects in the 90th percentile had decreased to 1249 µg DFE/d.

Neural tube defect risk
Our analysis predicted a 43% decrease in NTD risk between 1988–1994 and 1999–2000. However, it also predicted that NTD risk increased by 4–7% between 1999–2000 and 2003–2004 (calculated by subtracting the relative NTD risk in 1999–2000 from that in 2003–2004) (Table 5).


TABLE 5. Relative risk of having a child with a neural tube defect (NTD) by percentile of red blood cell folate concentration during the third National Health and Nutrition Examination Survey (NHANES III) and each of the annual NHANES surveys from 1999 through 20041

 

DISCUSSION  
Changes in serum and red blood cell folate concentration
Women with the lowest percentiles of folate status had the largest percentage increase in RBC (Table 2) and serum (Table 3) folate concentrations between 1988–1994 and 1999–2000. In contrast, women in the same percentiles had the smallest percentage decrease in folate concentrations between 1999–2000 and 2003–2004. This disproportionate change in folate status cannot be attributed solely to changes in a single factor. For instance, if folic acid fortification causes a relatively large increase in folate concentrations, withdrawal of fortification cannot then cause a relatively small decrease in folate concentration in the same women. Several factors must have combined to give this disproportionate response.

Other factors that may have affected folate status include the changing use of folic acid supplements and the increasing popularity of diets low in enriched-grain products and breakfast cereal (ie, fortified products). For instance, the popularity of low-carbohydrate diets (low in grain products) has increased in recent years (7). Daily folic acid supplement use increased over this period, from 25% in 1995 to 29% in 1998 (28) and to 31% in 2003 (29). We estimate that supplement use increased average folate consumption by 164 µg DFE/d beyond the effect of fortification alone: average folate consumption increased by 529 µg DFE/d (Table 4) among all women in the NHANES cohort, but by only 365 µg DFE/d [215 µg folic acid/d (2)] among supplement nonusers in the Framingham Offspring Cohort (529 µg DFE/d − 365 µg DFE/d = 164 µg DFE/d).

Risk of folate overconsumption
The Food and Drug Administration's upper safe limit for total folate (folic acid plus natural folate) is 1 mg/d. This value is somewhat arbitrary and makes no allowance for the differences in bioavailability between folic acid and natural folates. Natural food folates, because of their polyglutamyl tails, may be less bioavailable than are folate monoglutamates (21), and folic acid is 1.7 times as bioavailable as are natural food folates. Therefore, when setting upper tolerable limits for total folate consumption, a better measure may be total bioavailable folate = DFE ÷ 1.7. Using this definition and the data from Table 4, we estimated that, soon after fortification (1999–2000), 10% of women (ie, those above the 90th percentile) were consuming >980 µg bioavailable folate/d (1666 µg DFE/d ÷ 1.7). However, by 2003–2004, the amount of bioavailable folate consumed by women in the 90th percentile had decreased by almost 25%, to 735 µg/d (1249 µg DFE/d ÷ 1.7), which suggested that the number of women consuming >1 mg bioavailable folate/d had decreased significantly.
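The bioavailable-folate conversion used in this paragraph is a single division. As a quick check of the 1999–2000 figure for the 90th percentile:

```python
def bioavailable_ug(dfe_ug_per_day):
    """Convert dietary folate equivalents (ug DFE/d) to bioavailable
    folate (ug/d of folic acid equivalents) by dividing out the
    1.7 bioavailability factor."""
    return dfe_ug_per_day / 1.7

# 90th-percentile total intake soon after fortification (1999-2000):
intake_1999 = bioavailable_ug(1666)  # about 980 ug/d, just below the 1-mg limit
```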

Neural tube defect risk
There is some confusion concerning the extent to which folic acid fortification reduced the incidence of NTDs in the United States. One widely cited study (30) suggested that NTD occurrence decreased by 19%, whereas the Centers for Disease Control and Prevention (CDC) estimated a slightly higher value of 26% (31). However, both of these studies used postpartum medical records in their analysis and may have underestimated NTD frequency by excluding cases of spontaneous or medical abortion. More thorough estimates that included prenatal diagnosis data suggested that NTD incidence decreased by 40% (32). Such a value would be in line with the 43% decrease predicted in Table 5. Thus, assuming the validity of our estimates in Table 5, we predict that NTD risk would have increased by 4–7% between 1999–2000 and 2003–2004 (relative to the 43% decrease between 1988–1994 and 1999–2000).

Moore et al (33) conducted an NTD risk assessment similar to that of Daly et al (25) but used daily folate intake, calculated from food-frequency questionnaires, rather than RBC folate concentration. An estimate of the change in NTD risk, based on our estimates of daily folate intake and the regression equation of Moore et al, is included elsewhere (see Supplemental Data, including Table S1, under "Supplemental data" in the current online issue at www.ajcn.org).

Conclusions
The recent decrease in serum and RBC folate concentrations in the United States has resulted primarily from changes in women with the highest folate concentrations. Consequently, we estimate that the effect on NTD occurrence would be less than that seen if a uniform decrease in folate concentrations had occurred. In addition, the large decrease in folate consumption in women with the highest folate status may limit the potential danger from folate overconsumption. This change in folate consumption patterns is in accordance with the FDA's aim of maximizing folate intake in women with low folate status (and thus reducing the incidence of NTDs) at the same time that the incidence of folate overconsumption is minimized. However, the manner in which this change in folate consumption occurred is entirely fortuitous, and the mechanism remains unregulated. It is quite conceivable that the events leading to this trend, particularly the decrease in folate overconsumption, could easily be reversed or could even become exacerbated.

Of concern is the effect that the decrease in serum and RBC folate concentrations may have on the monitoring of the potential risks of folic acid fortification. Some models of cancer development predict that, whereas folic acid fortification may prevent precancerous cells from turning cancerous, it may also promote the proliferation of neoplastic cells (ie, cells that are already cancerous) (34). Thus, folic acid fortification may decrease cancer rates by preventing noncancerous cells from turning neoplastic but may also increase cancer rates by increasing neoplastic cell proliferation. An extension of this model is the possibility that an increase in cancer rates due to fortification may be transitory, because fortification also reduces the transformation of normal cells to neoplastic cells. Such a model may explain the recent report by Mason et al (35) showing that rates of colorectal cancer in the United States increased between 1995 and 1998, coincident with the introduction of folic acid fortification. However, there are 2 possible explanations for the subsequent decrease in colorectal cancer rates observed by Mason et al between 1998 and 2002. First, as predicted by the model, folic acid fortification may have prevented noncancerous cells from becoming neoplastic. Second, serum and RBC folate concentrations also decreased over this period, and thus it is possible that colorectal cancer rates decreased because folate concentrations were no longer sufficient to sustain the elevated cancer proliferation rates. The possibility of this second scenario, although unlikely, is of concern, because it presents the possibility that colorectal cancer rates may increase again if folate concentrations revert to 1999 levels.

We, therefore, call for the continued monitoring of the way in which fortification is implemented—particularly with regard to the monitoring of food consumption patterns. Furthermore, a better understanding of this phenomenon may provide information on ways in which we may increase folate consumption in women with low folate status and, at the same time, limit folate overconsumption. It should be noted that the present study did not address changes in folate consumption by men. Such an analysis should be conducted before any further changes are made in fortification programs.


ACKNOWLEDGMENTS  
The authors' responsibilities were as follows—EPQ: developed the original concept and conducted the statistical analysis; JFG (principal investigator): provided input on the execution of the concept; and EPQ and JFG: wrote the report. Neither author had a personal or financial conflict of interest.


REFERENCES  

Received for publication July 30, 2007. Accepted for publication August 10, 2007.

