The assessment presented in the core paper of this debate by Schafer and Kegley does not adequately describe the computational methodology or sources of data that were used to estimate exposures. While it is difficult to determine from the article, the exposure estimates seem to be very dependent on action levels, rather than on empirically derived data. There is no adequate presentation of analytical methods, limits of detection, or the significance of non-detects in deriving estimates of exposure.
Our comments on the article by Schafer and Kegley1 fall into several broad categories:
The paper does not adequately describe the computational methodology or sources of data that were used to estimate exposures (for example, there is very little quantitative description of what was done).
The analysis does not include a sensitivity analysis or discussion of the relative importance of different foods.
There is no discussion of analytical methods, limits of detection, or the significance of non-detects, which is critical in deriving estimates of exposure.
The exposure estimates are over-reliant on action levels, rather than on actual residue levels.
The manuscript fails to discuss how data from the US Department of Agriculture (USDA) Pesticide Data Program (PDP) and the Food and Drug Administration (FDA) Total Diet Study (TDS)2 were combined. The following issues should be clarified to give readers a sense of what the authors did in their analysis:
How were data for different food types handled, as one survey looks at foods prepared as eaten while another focuses on raw agricultural commodities?
How were data reconciled when they were collected using different sampling plans (numbers of samples, collection areas)?
As the programmes use different methods of analysis that have different limits of detection, how was this accounted for?
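The last question is not cosmetic. The following sketch, using entirely invented numbers and the common (but not universal) convention of substituting half the limit of detection for non-detects, shows how the choice of limit of detection alone can shift an estimated mean concentration by an order of magnitude:

```python
# Illustrative only: how the limit of detection (LOD) used when substituting
# for non-detects shifts an estimated mean concentration. All values invented.
detects = [1.0, 2.0]   # ppb, measured residues
n_nondetects = 8       # samples reported as below the limit of detection

def mean_with_substitution(detects, n_nd, lod):
    # Substitute LOD/2 for each non-detect, one common convention.
    vals = detects + [lod / 2] * n_nd
    return sum(vals) / len(vals)

low_lod = mean_with_substitution(detects, n_nondetects, 0.1)    # sensitive programme
high_lod = mean_with_substitution(detects, n_nondetects, 10.0)  # less sensitive programme
print(low_lod, high_lod)
```

With these assumed numbers the estimated mean differs more than tenfold between the two limits of detection, even though the underlying detected residues are identical.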
In using the regional diets, the manuscript does not explain how they were developed and what data were used as the basis. A description of the different diets should at least discuss which foods are major contributors, and elaborate the differences between diets.
For the calculation of dieldrin exposure, it is not appropriate to use the 10% factor to scale exposure. If the authors had the data, they should have calculated the concentration in each food, taking into account non-detects in an appropriate way, and then multiplied each concentration by the corresponding food consumption.
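The calculation we have in mind can be sketched as follows. The foods, concentrations, consumption amounts, and the LOD/2 substitution for non-detects are all illustrative assumptions, not the authors' data:

```python
# Hypothetical per-food exposure calculation. Concentrations in ppb (ug/kg),
# consumption in kg/day; every number below is invented for illustration.
LOD = 0.5  # assumed limit of detection, ppb

# food: ([measured concentrations, None = non-detect], daily consumption in kg)
foods = {
    "milk": ([2.0, None, 1.5], 0.25),
    "fish": ([8.0, 6.5, None], 0.05),
    "beef": ([None, None, 3.0], 0.10),
}

def mean_conc(samples, lod):
    # One common convention: substitute LOD/2 for each non-detect.
    vals = [v if v is not None else lod / 2 for v in samples]
    return sum(vals) / len(vals)

# Daily exposure (ug/day): mean concentration x consumption, summed over foods.
exposure = sum(mean_conc(samples, LOD) * kg for samples, kg in foods.values())
print(exposure)
```

The point of the sketch is that the exposure estimate is built from empirically derived residue concentrations, with an explicit, stated treatment of non-detects, rather than from a scaling factor.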
In the Results and Discussion, using the number of “hits” is not a credible measure of exposure or risk, particularly if the goal of the project is to evaluate exposure and risk quantitatively. The discussion of foods with “multiple hits” does not tackle the significance of multiple residues in one food, does not consider the differing toxicity of the compounds that are found, nor does it discuss an appropriate means of taking into account those that are not detected.
The authors state1 that “ . . . POPs residues usually occur at less than 100 ppb . . . however, even at these levels of exposure . . .” We note, however, that 100 ppb does not refer to an exposure, but is a concentration in food.
Figure 1 of the article1 (chart of dieldrin exposures) and the discussion of this figure, do not present adequate information about how the numerical estimates were developed. There is no comment on which foods are major contributors or what differentiates the exposures in the different regions, nor is there an interpretation of what “maximum” exposure implies or which population might be maximally exposed.
Table 1 of the article1 (which POPs are found in foods in Western diet) provides no quantification or perspective on exposure to multiple POPs. The discussion and the table present no consideration of the fact that levels of detection for the TDS are very low—often lower than for compliance programmes.
Table 3 (DDT exposures) does not explain where the food intakes come from, and does not specify the consuming population. We note that for a number of foods the consumption amount is very high (eggs, fish), which results in an excessive total food consumption (2.3 kg). As the time frame and population basis for consumption are not specified, one should not add these food intakes together. Furthermore, the exposure(s) should be compared with a toxicological end point appropriate for the population under consideration.
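The arithmetic problem with such addition can be illustrated with invented per-food intakes: if each value reflects a different (unspecified) population, percentile, or time frame, their sum does not describe any real consumer's diet:

```python
# Invented per-food intakes (kg/day) to illustrate the addition problem:
# summing values drawn from different populations or percentiles yields
# an implausibly large total diet. These are not the article's figures.
intakes_kg_per_day = {
    "eggs": 0.30,   # e.g. a high-consumer value
    "fish": 0.40,   # e.g. a high-consumer value from a different survey
    "milk": 0.70,
    "meat": 0.45,
    "other": 0.45,
}

total = sum(intakes_kg_per_day.values())
print(total)  # well above a typical adult's total solid-food intake
```

A total built this way overstates consumption, and any exposure computed from it inherits that overstatement.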
The statement that the FDA does not regularly test for dioxins and furans is not correct. Since 1999 the FDA has tested TDS foods for dioxins. In addition, in 1996, 1998, and 1999 the FDA undertook sampling of individual fish and shellfish species and dairy products for dioxins with the intention of determining background levels in these foods. These data have been presented at various scientific meetings, including the annual international dioxin meetings, and have been made available in a recent publication.3 In 2001 the FDA increased dioxin testing overall, with nearly 1000 samples now tested per year under a programme described elsewhere (www.cfsan.fda.gov/∼lrd/dioxstra.html). The statement regarding PCB sampling is also not correct: TDS foods (including fish) have been analysed for total PCBs for many years, and the results are available on the web in the same files as the dieldrin data used by the authors.