

The Codex Committee on Food Hygiene (CCFH) requested that the expert group provide answers to the questions shown below, in Tables 6 and 7.

Table 6. Risk management questions for Salmonella in eggs

1.1 Estimate the risk from Salmonella in eggs in the general population and in the various susceptible populations (e.g. elderly, children, immunocompromised patients) at various prevalence and concentration levels of Salmonella in contaminated eggs.

1.2 Estimate the change in risk likely to occur from each of the interventions below, including their efficacy.

1.2.1 Reduce the prevalence of positive flocks (Destroy positive breeding or laying flocks, or both; Vaccinate egg laying flocks for Salmonella; Competitive exclusion).

1.2.2 Reduce the prevalence of Salmonella-positive eggs (Test and divert eggs from positive flocks to pasteurization).

1.2.3 Reduce the number of Salmonella organisms in eggs (Heat treatment of egg products; Refrigeration of eggs after lay and during distribution; Requirement for a specific shelf life for eggs stored at ambient temperatures).

Table 7. Risk management questions for Salmonella in broiler chickens

2.1 Estimate the risk from pathogenic Salmonella in broiler chickens for the general population and for various susceptible population groups (elderly, children and immunocompromised patients) consequent to a range of levels in raw poultry.

2.2 Estimate the change in risk likely to occur for each of the interventions under consideration (see below), including their efficacy.

2.2.1 Reduction in the prevalence of positive flocks (Destruction of positive breeder and chicken (broiler) flocks; Vaccination of breeding flocks; Competitive exclusion (e.g. with S. Sofia)).

2.2.2 Reduction in the prevalence of Salmonella-positive birds at the end of slaughter and processing (Use of chlorine in water chilling of chicken (broilers); Water chilling versus air chilling for chicken (broilers)).

2.2.3 Evaluation of the importance of various routes by which pathogenic Salmonella are introduced into flocks, including through feed, replacement birds, vectors and hygiene.

2.2.4 The impact on risk of change in consumer behaviour (not part of the questions asked by CCFH but addressed by the Risk Assessment).

Question 1.1 - Estimate the risk from Salmonella in eggs in the general population and in the various susceptible populations (e.g. elderly, children or immuno-compromised patients) at various prevalence and concentration levels of Salmonella in contaminated eggs

The model was used to estimate the relative effects of different prevalence and concentration levels of Salmonella in contaminated eggs. Prevalence can either be the proportion of flocks containing one or more infected hens (i.e. flock prevalence) or the proportion of infected hens within infected flocks (i.e. within-flock prevalence). The risk associated with different flock prevalence levels is illustrated in Table 4. One can also examine the risk of illness per serving for different within-flock prevalence levels, as well as for different starting concentrations of Salmonella per egg.

To model the effect of within-flock prevalence on risk, the 1st, 50th and 99th percentiles of the within-flock prevalence distribution (0.1%, 0.5% and 22.3%, respectively) were simulated (Figure 9). Flock prevalence was 25% for these simulations. In the baseline time-temperature scenario, risk of illness per serving was 6 × 10⁻⁸ (6 per 100 million), 3 × 10⁻⁷ (3 per 10 million) and 1 × 10⁻⁵ (1 per 100 000) for within-flock prevalence of 0.1%, 0.5% and 22.3%, respectively. The results show that a change in within-flock prevalence leads to a directly proportional change in the risk of illness per serving. Consequently, a flock whose within-flock prevalence is 10% (i.e. 10 in every 100 hens infected) poses 100 times the risk to humans of a flock whose within-flock prevalence is 0.1% (i.e. one in every 1000 hens infected).
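The proportionality described above can be sketched numerically. The reference values below are the simulation results quoted in the text (0.5% within-flock prevalence corresponding to roughly 3 × 10⁻⁷ risk per serving under the baseline scenario); the helper function is purely illustrative and is not part of the risk model itself.

```python
# Illustrative sketch: risk of illness per serving scales linearly with
# within-flock prevalence, holding flock prevalence and storage conditions fixed.
REF_PREVALENCE = 0.005   # 0.5% within-flock prevalence (50th percentile)
REF_RISK = 3e-7          # risk of illness per serving at that prevalence (from the text)

def risk_per_serving(within_flock_prevalence):
    """Linear scaling of per-serving risk with within-flock prevalence."""
    return REF_RISK * within_flock_prevalence / REF_PREVALENCE

# A flock at 10% within-flock prevalence poses ~100x the risk of one at 0.1%.
ratio = risk_per_serving(0.10) / risk_per_serving(0.001)
print(ratio)  # ~100
```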

Figure 9. Predicted probability of illness, assuming that within-flock prevalence is either 0.1%, 0.5% or 22.3% (1st, 50th or 99th percentiles of the lognormal distribution used in the model, respectively). Three egg storage time and temperature scenarios are considered. Flock prevalence is assumed to be 25%.

Figure 10. Predicted probability of illness per serving assuming that the number of Salmonella per contaminated egg at lay is 1, 10 or 100. Three flock prevalence levels are considered. Egg storage times and temperatures are assumed to be the baseline settings.

Figure 10 shows the impact that different initial levels of Salmonella in eggs at the time of lay have on the probability of illness per serving, assuming that all contaminated eggs start with 1, 10 or 100 organisms. The baseline egg storage time and temperature scenario was assumed, but flock prevalence was varied. For a flock prevalence of 5%, risk per serving was about 2 per 10 million, regardless of whether the initial number of Salmonella per egg was 1, 10 or 100. For flock prevalence levels of 25% and 50%, a more detectable change in risk per serving occurs between eggs initially contaminated with 1, 10 or 100 Salmonella. For example, at 25% flock prevalence, the risk per serving increases from 8 per 10 million to 10 per 10 million as the number of Salmonella in eggs at lay increases from 1 to 100. Nevertheless, for one-log changes in the initial numbers of Salmonella, the resulting change in probability of illness is much less than one log.

The dose-response function used in this risk characterization predicts that the probability of illness given an average dose of 1, 10 or 100 organisms is 0.2%, 2.2% or 13%, respectively. If all contaminated eggs were consumed raw immediately after lay, these probabilities would be the appropriate predictors of illness. The production module predicts that contaminated eggs are produced at a frequency of about 5 × 10⁻⁵ (~1 in 20 000) when flock prevalence is 25%. If all contaminated eggs contained just one organism and there was no growth or decline before consumption, the predicted risk per serving would be 1 in 10 million (10⁻⁷). Similarly, the risk per serving if all contaminated eggs contained 10 or 100 organisms would be 10⁻⁶ (1 in 1 million) and ~7 × 10⁻⁶ (7 in 1 million), respectively.
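The arithmetic in this paragraph can be reproduced directly: the no-growth, no-cooking risk per serving is the frequency of contaminated eggs multiplied by the dose-response probability. All numbers below are the values quoted in the text.

```python
# risk per serving = frequency of contaminated eggs x P(illness | dose)
freq_contaminated = 5e-5          # ~1 in 20 000 eggs at 25% flock prevalence

# Dose-response probabilities quoted in the text for doses of 1, 10 and 100 organisms.
p_ill = {1: 0.002, 10: 0.022, 100: 0.13}

for dose, p in p_ill.items():
    risk = freq_contaminated * p
    print(f"dose {dose:>3}: risk per serving ~ {risk:.1e}")
# dose   1: ~1.0e-07 (1 in 10 million)
# dose  10: ~1.1e-06 (about 1 in 1 million)
# dose 100: ~6.5e-06 (about 7 in 1 million)
```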

The dose of Salmonella ingested and the attack rates for children under five years of age were compared with those for the rest of the exposed population in order to compare susceptible and normal populations. The database did not reveal an increased risk of illness in children under five years of age compared with the rest of the population exposed to Salmonella. The database may lack sufficient power to reveal true differences that might exist.

Questions 1.2 - Estimate the change in risk likely to occur from each of the interventions, including their efficacy (1.2.1 reduce the prevalence of positive flocks; 1.2.2 reduce the prevalence of Salmonella-positive eggs; 1.2.3 reduce the number of Salmonella organisms in eggs).

As shown earlier, risk of illness per serving decreases as the percentage of infected flocks (i.e. flock prevalence) decreases. Table 8 illustrates the influence of flock prevalence on risk of illness per serving. Because the model includes uncertain inputs, risk per serving is also uncertain, and this table summarizes uncertainty as the mean, 5th and 95th percentile values (rounded to the nearest significant digit) of the predicted distribution.

Table 8. Predicted uncertainty in risk of illness per egg serving for different flock prevalence levels

Flock prevalence | Mean risk per serving | 5th percentile | 95th percentile
The results in Table 8 can be used to predict the reduction in risk for a country or region that decides to control infected flocks. For example, consider a country with 5% of its flocks containing one or more infected hens. If such a country were to institute a programme (for example destruction of positive flocks) with 98% effectiveness in reducing flock prevalence, then successful implementation of the programme would result in a flock prevalence of about 0.1%. The model predicts, in this case, that the mean risk of illness per egg serving would decrease from 2 per 10 million to 5 per 1000 million. Pre-harvest interventions such as those used in Sweden (destruction of positive flocks) and other countries might result in flock prevalence levels of 0.1% or lower.
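The worked example above can be written out as a short calculation. The 98% effectiveness figure and the mean risk at 5% flock prevalence (about 2 per 10 million) are taken from the text; the proportional-scaling step is the model's stated linearity assumption, not an independent result.

```python
# Worked example from the text: a country with 5% flock prevalence
# implements a control programme (e.g. destruction of positive flocks)
# that is 98% effective in reducing flock prevalence.
initial_prevalence = 0.05
effectiveness = 0.98

final_prevalence = initial_prevalence * (1 - effectiveness)
print(final_prevalence)  # ~0.001, i.e. about 0.1%

# Risk per serving scales roughly proportionally with flock prevalence.
# Mean risk at 5% prevalence was ~2 per 10 million (2e-7); proportional
# scaling gives ~4e-9, the same order as the model's ~5 per 1000 million.
risk_before = 2e-7
risk_after = risk_before * (1 - effectiveness)
print(risk_after)  # ~4e-9
```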

Although the model predicts that the probability of illness per serving is proportional to flock prevalence, the question remains: how can one reduce the prevalence of infected flocks? To accomplish this, one needs either to prevent uninfected flocks from becoming infected, or to treat infected flocks to render them uninfected.

Treatment of breeding flocks to render them uninfected has been used in The Netherlands (Edel, 1994). Antibiotic treatment of the flock followed by administration of a competitive exclusion culture might succeed in eliminating the organism from infected hens, but environmental reservoirs might still exist to re-infect hens once the effects of the antibiotic have worn off. Furthermore, application of this method to commercial flocks may be neither feasible nor economic.

Preventing uninfected flocks from becoming infected is where most attention is focused in control programmes. Uninfected flocks can become infected via vertical transmission (i.e. eggs infected before hatch result in exposure of a cohort via horizontal transmission following hatching), via feed contamination, or via environmental sources (i.e. carry-over infection from previously infected flocks). Control programmes may attempt to eliminate these avenues of exposure by:

(i) Testing breeding flocks to detect Salmonella infection, followed by destruction of the flock, if infected, to prevent it from infecting commercial flocks through its future offspring.

(ii) Requiring heat treatment of feed before sale (thereby eliminating Salmonella and other pathogens from chicken feed).

(iii) Intense cleaning and disinfecting of known-contaminated poultry environments after removing an infected flock. Such an approach must also eliminate potential reservoirs (e.g. rodents).

Most control programmes use all three interventions to prevent Salmonella infection of flocks. The control programme in Sweden consists of such an approach (Engvall and Anderson, 1999). The Pennsylvania Egg Quality Assurance Program in the United States of America also used such an approach (Schlosser et al., 1999). However, discerning the efficacy of each intervention is difficult. Ideally, one would like to know what percentage of newly infected flocks result from vertical transmission, or feed contamination, or from previously contaminated environments.

Giessen et al. (1994) presented a model for determining the relative contribution of risk of infection from vertical, feed-borne (or other outside environmental sources), and carry-over environmental contamination. Comparing the model to data collected in The Netherlands, it appears that carry-over infection was the dominant contributor to risk of infection. The conclusion was based on the shape of a cumulative frequency curve for flock infection, which suggests that most flocks are infected soon after placement in commercial facilities. There is also evidence that the prevalence of infected breeder flocks is very low in The Netherlands.

Data from the United States of America Salmonella Pilot Project (Schlosser et al., 1999) suggest a fairly constant prevalence by age, and that infection did not necessarily increase over time. Nevertheless, these data do not describe the age when infection was introduced. Roughly 60% of the poultry flocks tested in this project were S. Enteritidis-positive. Additional evidence presented shows that 6 of 79 pullet flocks (8%) tested were S. Enteritidis-positive. These data suggest that the risk of infection from vertical transmission might be about 8%. Furthermore, there is little suspicion that feed contamination is an important source of Salmonella for United States of America poultry flocks.

The data from The Netherlands and the United States of America suggest that the carry-over route may account for >80% of the risk of flock infection in countries where Salmonella is endemic. If true, then complete control of breeder flocks might be expected to achieve no more than a 20% reduction in the prevalence of Salmonella-infected flocks in such countries.

Results of an aggressive monitoring programme of breeder flocks in The Netherlands between 1989 and 1992 have been reported (Edel, 1994). For egg-sector breeding flocks, there is some suggestion that prevalence of infected flocks was reduced by ~50% per year. Effectiveness was less dramatic for meat-sector breeding flocks. This programme involved regular faecal testing of all breeder flocks, as well as regular testing of hatchery samples from day-old chicks. Positive flocks were depopulated until mid-1992, when treatment with enrofloxacin and a competitive exclusion culture was allowed as an alternative to the expense of prematurely depopulating a breeding flock. If a programme with 50% effectiveness in reducing prevalence of infected flocks each year were implemented for 3 years, one might predict that prevalence would be about 12% (0.5³) of the prevalence at the start of the programme.
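The compounding in the last sentence above is simple to verify: a programme that halves flock prevalence each year, run for three years, leaves 0.5³ of the starting prevalence.

```python
# Prevalence remaining after n years of a programme that reduces
# flock prevalence by a fixed fraction each year.
annual_reduction = 0.50
years = 3

remaining_fraction = (1 - annual_reduction) ** years
print(remaining_fraction)  # 0.125 -> prevalence falls to ~12% of its starting value
```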

To reduce the risk of carry-over infection for commercial flocks, it is thought that aggressive cleaning and disinfection must be completed after an infected flock is removed and before another flock is placed to begin a new production cycle. Cleaning and disinfection must also include an effective long-term rodent control programme. Analysis of efforts in Pennsylvania to reduce the prevalence of infected commercial flocks suggests a decline from 38% to 13% during three years of programme operation (White et al., 1997). This programme routinely screened flocks for evidence of Salmonella and required thorough cleaning, disinfection and rodent control once positive flocks were removed. Another study in Pennsylvania (Schlosser et al., 1999) found that 16 of 34 (47%) poultry environments that were initially S. Enteritidis-positive were negative for the pathogen following cleaning and disinfection of the environment.

The effectiveness of "test and divert" programmes depends on the specific testing used in commercial flocks. For example, Sweden collected three pooled samples, each consisting of 30 faecal droppings, during two or more examinations of egg production flocks during each production cycle (Engvall and Anderson, 1999). In their breeder-flock monitoring programme, The Netherlands' testing protocol collects two pools of 50 caecal droppings each, every 4 to 9 weeks of production (Edel, 1994). The Salmonella Pilot Project's protocol in the United States of America required collection of swabs from each manure bank and egg belt in a hen house on three occasions during each production cycle (Schlosser et al., 1999).

Regardless of the size or type of sample collected, it would seem that a testing protocol that examines commercial flocks frequently and diverts eggs soon after detection should result in a meaningful reduction in the number of contaminated shell eggs marketed each year.

To examine the effect of a "test and divert" programme using the present model, two protocols were assumed, with either one or three tests administered to the entire population of egg production flocks. The single test is administered at the beginning of egg production. Under the three-test regime, testing at the beginning of egg production is followed by a second test four months later, and the third administered just before the flock is depopulated. Each single test consists of 90 faecal samples randomly collected from each flock. A flock is considered positive if one or more samples contained S. Enteritidis.

For the within-flock prevalence distribution used in this model, a single test of 90 faecal samples was likely to detect 44% of infected flocks. This was calculated using an equation which assumed that an infected hen shed sufficient S. Enteritidis in her faeces to be detected using standard laboratory methods.
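The flock-level sensitivity quoted above can be sketched with the standard at-least-one-positive formula. This is a simplified illustration: the 44% overall figure in the text comes from averaging detection probability over the full within-flock prevalence distribution, which is not reproduced here, and the independence of samples is our assumption.

```python
# Sketch of flock-level sensitivity for a 90-sample faecal test.
# Assumption (from the text): any sample from an infected hen is detected
# by standard laboratory methods; samples are treated as independent.

def detection_probability(p, n_samples=90):
    """Probability that at least one of n faecal samples is positive,
    given within-flock prevalence p."""
    return 1 - (1 - p) ** n_samples

# Evaluate at the 1st, 50th and 99th percentiles of within-flock
# prevalence quoted earlier (0.1%, 0.5% and 22.3%).
for p in (0.001, 0.005, 0.223):
    print(f"within-flock prevalence {p:.1%}: P(detect) = {detection_probability(p):.0%}")
```

At the median within-flock prevalence of 0.5%, a single 90-sample test detects roughly a third of infected flocks, which is consistent with the averaged 44% figure once the long upper tail of the prevalence distribution (where detection is nearly certain) is included.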

If a flock tested positive (i.e. one or more samples were positive for S. Enteritidis), its entire egg production was diverted to pasteurization. It was assumed that the egg products industry normally uses 30% of all egg production (consistent with the United States of America industry). Therefore, eggs going to breaker plants from flocks other than those mandatorily diverted were adjusted to maintain an overall frequency of 30% (i.e. the percentage of eggs sent to breaker plants from test-negative infected flocks and from non-infected flocks was reduced proportionally).

The premises of flocks found test-positive were assumed to be cleaned and disinfected following flock removal. The effectiveness of cleaning and disinfection in preventing re-infection of the subsequent flock was assumed to be 50%. Furthermore, it was assumed that carry-over infection was responsible for flocks becoming infected. Consequently, houses that were not effectively cleaned and disinfected resulted in infected flocks when they were repopulated.

Assuming a starting prevalence of 25% and the baseline egg storage time and temperature scenario, the effectiveness of the two testing protocols was estimated over a four-year period. The probability of illness per shell egg serving for each year was calculated for each protocol (Figure 11). Testing three times per year for four years reduced the risk of human illness from shell eggs by more than 90%. Testing once a year for four years reduced risk by over 70%. At the end of the fourth year, the flock prevalence for the one-test and three-test protocols had fallen to 7% and 2%, respectively. Therefore, assuming the cost of testing three times per year is three times that of testing once a year (ignoring producer costs or market effects from diversion of eggs), the change in flock prevalence suggests a roughly proportional difference between the protocols (e.g. 7% ÷ 2% ≈ 3). Nevertheless, the reduction in risk per serving achieved by the one-test protocol is greater than one-third of that achieved by the three-test protocol: the one-test protocol achieves a 70% reduction in risk of human illness, while a testing protocol that is three times more costly achieves a 90% reduction. Such a result is not surprising when we consider that a single test at the beginning of the production year affects risk most substantially, as flocks detected on the first test have their eggs diverted for the entire year, while flocks detected on a second test have their eggs diverted for just over half the year. Furthermore, flocks detected on the third test are tested so late in production that diversion of their eggs does not influence the population risk at all.

While egg diversion from positive flocks reduces the public health risk from shell eggs, it might be expected that there is some increased risk from egg products. Mandatory diversion causes more contaminated eggs to be sent to pasteurization. Nevertheless, the average quality of contaminated eggs is improved by diversion in this model.

It was assumed in the model that all diverted eggs were nest run (i.e. usually stored less than 2 days). Without mandatory diversion, 97% of lots were S. Enteritidis-free post-pasteurization and the average number of surviving Salmonella in a 4500-litre bulk tank was 200 (assuming 25% flock prevalence and the baseline egg storage time and temperature scenario in the model). If a single test is used to determine which flocks are diverted, 97% of vats are still S. Enteritidis-free, and they average 140 Salmonella per lot. The decrease in the average number of Salmonella per lot is due to the increased proportion of nest run eggs that are diverted: nest run eggs are stored for a shorter period of time and, consequently, contribute fewer organisms. If two tests are used, 97% of vats are S. Enteritidis-free and the average is 130 per lot. If three tests are used, there is no additional effect on egg products beyond the second test, because the third test occurs just as the flock is going out of production.

Although not a direct measure of public health risk, these egg products results suggest that the risk from egg products decreases as flocks are detected and diverted. However, this effect is conditional on nest run eggs being substantially less contaminated than restricted or graded eggs. Alternative scenarios to the one considered here may result in some increase in risk from diversion.

Figure 11. Predicted probability of illness per serving from shell eggs per year following implementation of two testing protocols. It is assumed that all flocks in the region are tested each time. Beginning flock prevalence is assumed 25%. The baseline egg storage time and temperature scenario is used for the four years.

Vaccination for Salmonella has been examined extensively in experimental settings, but less so in field trials. Experimentally, several types of vaccines have been evaluated: killed bacterins of various strains, live bacterins of attenuated strains, and surface antigen extracts of various strains. Injected killed bacterins are thought to have limited efficacy in preventing intestinal colonization of hens with S. Enteritidis, although such bacterins may reduce internal organ (including ovary) infection via stimulation of humoral antibody. Live bacterins - or surface antigen vaccines - may be more effective at modulating intestinal colonization by Salmonella because these products may elicit the cell-mediated immune response needed to resist colonization. Nevertheless, most commercially available vaccines are currently of the killed variety.

Evidence used for this model concerning the effectiveness of Salmonella bacterins in controlling infection came from a report on some flocks in Pennsylvania in the United States of America (Schlosser et al., 1999). A group of 19 flocks from two farms used a bacterin to control their Salmonella infection, and sampling results were compared with 51 flocks that did not use a bacterin. Only a slight difference was noted in environmentally positive samples collected in vaccinated (12%) and unvaccinated (16%) flocks. However, the overall prevalence of S. Enteritidis-positive eggs was 0.37 per 10 000 in vaccinated flocks against 1.5 per 10 000 in unvaccinated flocks. These results support the hypothesis that bacterins may not influence risk of colonization, but may reduce systemic invasion of S. Enteritidis and resultant egg contamination. This analysis did not control for confounding factors (e.g. rodent control, adequacy of cleaning and disinfection) that may have influenced the differences between vaccinated and unvaccinated flocks.

To evaluate the effect of vaccination against Salmonella using the present model, it was assumed that flocks would need to be tested to determine their status prior to use of a vaccine. A single test, or two tests four months apart, with 90 faecal samples per test, was assumed. The vaccine was assumed to be capable of reducing the frequency of contaminated eggs by approximately 75% (e.g. 0.37 per 10 000 for vaccinated flocks ÷ 1.5 per 10 000 for non-vaccinated flocks).
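The assumed vaccine efficacy above follows directly from the field data quoted earlier: the ratio of egg-contamination prevalence in vaccinated to unvaccinated flocks.

```python
# Deriving the ~75% reduction assumed for the vaccination scenario.
vaccinated = 0.37    # S. Enteritidis-positive eggs per 10 000, vaccinated flocks
unvaccinated = 1.5   # S. Enteritidis-positive eggs per 10 000, unvaccinated flocks

relative_risk = vaccinated / unvaccinated
reduction = 1 - relative_risk
print(f"relative risk {relative_risk:.2f}, i.e. ~{reduction:.0%} reduction")
# prints: relative risk 0.25, i.e. ~75% reduction
```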

Assuming 25% flock prevalence and the baseline egg storage time and temperature scenario, the probability of illness per serving for a single test and vaccination protocol is about 70% of a non-vaccination protocol (Figure 12). Risk is reduced to 60% of the non-vaccination protocol if two tests are applied.

Figure 12. Comparison of predicted probability of illness per serving between scenarios when no vaccination is used; when one test is applied at the beginning of production and positive flocks are all vaccinated; and when a second test is applied four months after the first test and additional test-positive flocks are vaccinated. Flock prevalence is assumed to be 25%, and the baseline egg storage time and temperature scenario is used.

Given the efficacy of bacterin use based on the field evidence, one could assume that universal vaccination might reduce the baseline risk to 25% of the risk resulting from a non-vaccinated population. However, the cost of vaccinating the entire population of laying hens could be high. The scenarios considered here assume that some testing is done first to determine if a flock is infected before that flock is vaccinated. Nevertheless, the cost of testing all flocks must be weighed against the cost of vaccination. Also, more field research concerning the true efficacy of vaccination should be conducted before the cost of vaccination is borne by more than a few producers (i.e. if costs are to be paid by the public or shared across the entire industry).

The effects of competitive exclusion (CE) treatment are difficult to quantify from field evidence. Sweden and The Netherlands, for example, include the use of CE in their Salmonella control programmes. Nevertheless, such treatment is only one component of these programmes and its effect is not clearly separable from the other components. Competitive exclusion has been studied in experimental settings for newly hatched chicks. The intent of CE inoculation in chicks is to quickly establish an indigenous intestinal flora that resists Salmonella colonization. Efficacy in preventing infection appears to depend on the CE culture used, timing of exposure, dose of exposure, and possibly the addition of lactose (Corrier and Nisbet, 1999). Field evidence of CE efficacy in mature hens comes from the United Kingdom and The Netherlands. In both countries, antibiotic treatment was applied to flocks known to be infected and the hens were subsequently inoculated with CE cultures. The intent of CE inoculation for hens was to quickly restore the intestinal flora - destroyed by the antibiotic treatment - to assist the hens in resisting future Salmonella exposures. In the United Kingdom, 20 of 22 trials that combined antibiotic and CE treatments succeeded in preventing re-infection of flocks for a 3-month study period (Corrier and Nisbet, 1999). Infection status was determined from cloacal swab samples in treated flocks. In The Netherlands, a combined antibiotic and CE treatment prevented re-infection in 72% of flocks (n=32). Two such combined treatments prevented re-infection in 93% of flocks.

Interventions intended to minimize the dose of Salmonella in contaminated eggs focus on preventing any growth of the pathogen after the egg is laid. Most evidence suggests that naturally-contaminated eggs contain very few Salmonella organisms at lay. If eggs are consumed soon after lay, or eggs are kept refrigerated during storage, then the number of Salmonella is relatively unchanged prior to preparation of egg-containing meals.

Available predictive microbiology models suggest that Salmonella will not grow in eggs stored at 10°C for 46 days, on average. If most eggs are stored at <10°C and are consumed within 25 days, then interventions intended to improve egg handling will only influence the fraction of eggs that are time-temperature abused.

The effect of mandatory retail storage times and temperatures using slightly different baseline assumptions was evaluated. These hypothetical settings used might be typical in a country that does not have egg refrigeration requirements. The effects of time and temperature restrictions were evaluated assuming a flock prevalence of 25%.

Truncating retail storage time to a maximum of either 14 days or 7 days simulated a shelf-life restriction scenario. Truncating the retail storage temperature to less than 7.7°C simulated a refrigeration requirement. The results are summarized in Figure 13.

Figure 13. Probability of illness per serving of shell eggs given a mandatory shelf-life of <14 or <7 days at retail, or a mandatory retail storage temperature of <7.7°C. Egg storage times and temperatures are modelled as in the baseline scenario, except for changes introduced to represent a country or region that does not routinely refrigerate eggs. Flock prevalence was assumed to be 25%.

Restricting shelf-life to less than 14 days reduced the predicted risk of illness per serving by a negligible amount (~1%). However, keeping retail storage temperature at no more than 7.7°C reduced risk of illness per serving by about 60%. Were shelf-life to be reduced to 7 days, risk per serving would also be reduced by about 60%.

Figure 14 compares these predicted risks - when no growth or cooking is assumed - with the predictions shown in Figure 10 for 25% flock prevalence. When contaminated eggs contain just a single Salmonella organism, Figure 14 implies that allowing growth inside eggs elevates the risk. Yet, when contaminated eggs contain 10 or 100 organisms, Figure 14 implies that cooking of egg meals substantially reduces the risk. The explanation for these findings is that, regardless of the initial contamination, the combined effect of growth and cooking is to stabilize the risk per serving at nearly one per million. It can be concluded from Figures 10 and 14 that the model's output is relatively less sensitive to the initial numbers of Salmonella than to other inputs that influence growth and cooking.

Figure 14. Comparison of predicted risk of illness when exposure assessment model includes effects of growth and cooking, versus when no growth or cooking is modelled, for cases where the initial number of Salmonella (SE) in contaminated eggs at lay is 1, 10 or 100. Flock prevalence is assumed to be 25%, and baseline egg storage times and temperatures are assumed when growth and cooking are modelled.

Question 2.1 - Estimate the risk from Salmonella in broiler chickens for the general population and for various susceptible populations (e.g. elderly, children or immuno-compromised patients) consequent to a range of levels in raw poultry and question 2.2.1 - reduce the prevalence of positive flocks.

The questions concerning on-farm interventions could not be evaluated due to lack of representative data. It was nevertheless estimated that a reduction in the prevalence of infected birds leaving processing would reduce the risk of illness per serving at least proportionally. The expert group found the available data insufficient to evaluate the importance of the various routes for introduction of Salmonella into flocks, including feed, replacement birds, vectors and hygiene; it was therefore not possible to evaluate the importance of on-farm routes of introduction of Salmonella. The need was also identified for a better understanding of cross-contamination processes at all steps in the production chain.

A change in the prevalence of contaminated raw product affects the risk to the consumer by altering the frequency of exposure to risk events, i.e. exposure to the pathogen. The change in risk as a result of a change in the prevalence of Salmonella-contaminated broilers was estimated by simulating the model using a range of initial prevalence levels. Seven different prevalence levels were investigated: 0.05%, 1%, 5%, 10%, 20%, 50% and 90%. If the prevalence of contaminated chickens leaving processing is altered, through some management practice either at the farm level or at the processing level, the expected risk per serving is altered. The magnitude of the changes in risk per serving and risk per cross-contamination event as a result of changes in prevalence are summarized in Table 9.

Table 9. Impact on risk following a change in prevalence

Prevalence levels considered: 0.05%, 1%, 5%, 10%, 20%, 50% and 90%. For each level, the table reported the expected risk per serving*, the number of servings, the annual expected risk, the rate of illness per 100 000, and the expected risk per cross-contamination event.

Calculation of the expected number of cases in the year was based on an assumed population size of 20 000 000 and the proportion of the population that eats chicken, giving a potentially exposed population of 15 000 000.

Expected number of cases in the year: 1 097; 2 195; 4 389; 10 970; 19 741.

* 2.81E-08 can also be expressed as 2.81 cases per 100 million servings, and similarly for the other risks expressed: E-07 is per 10 million; E-06 is per million; E-05 is per 100 000; etc.

The number of cases of salmonellosis was estimated to fall by 50% if a 20% contamination rate at the retail level was reduced to 10%. The relationship between a percentage change in prevalence and the expected risk is largely linear: assuming everything else remains constant, a given percentage reduction in prevalence can be expected to reduce the expected risk by the same percentage.
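This proportional relationship can be sketched in a few lines of Python. The 20% baseline prevalence and the baseline expected risk of 1.13 per 100 000 servings come from the assessment; the helper function `scaled_risk` itself is purely illustrative.

```python
def scaled_risk(prevalence, baseline_prevalence=0.20, baseline_risk=1.13e-5):
    """Expected risk per serving, assuming risk scales linearly with the
    prevalence of contaminated broilers (all else held constant)."""
    return baseline_risk * prevalence / baseline_prevalence

# Halving prevalence from 20% to 10% halves the expected risk per serving.
print(scaled_risk(0.10))  # 5.65e-06
```

The same scaling applies to the expected number of annual cases, since the number of servings and the exposed population are held constant.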

Question 2.2 - Estimate the change in risk likely to occur from each of the interventions, including their efficacy (2.2.2 - Reduction in the prevalence of Salmonella-positive birds at the end of slaughter and processing; and 2.2.3 - Evaluation of the importance of various routes by which pathogenic Salmonella are introduced into flocks).

The effectiveness of specific mitigation interventions, either on-farm or as treatments during processing, was not evaluated in the present risk model because a lack of representative data precluded analysis of the changes in prevalence, level of contamination, or both, that might be attributable to a specific intervention. However, the influence of reducing prevalence can be interpreted, albeit with a high degree of uncertainty given the current state of knowledge, in the context of chlorine addition to the chill tanks during processing. There is little evidence that the addition of chlorine at levels of 50 ppm or less actually decreases the numbers of the pathogen attached to the skin of poultry carcasses. However, the available data suggest that chlorine prevents an increase in the prevalence of contaminated carcasses, i.e. it reduces cross-contamination (Table 10), and one study observed a substantial reduction in prevalence. In Table 10, the factor in the last column is the ratio of the prevalence after chilling to the prevalence before chilling; a ratio greater than 1 indicates an increase in the prevalence of contaminated carcasses.
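The factor used in Table 10 is a simple ratio, computed as below; the prevalence values in the example are hypothetical and are not drawn from the studies cited.

```python
def chill_factor(prev_before, prev_after):
    """Ratio of prevalence after chilling to prevalence before chilling.
    A value greater than 1 indicates net cross-contamination in the tank."""
    return prev_after / prev_before

# Hypothetical example: prevalence rises from 20% to 30% during chilling.
factor = chill_factor(0.20, 0.30)
assert factor > 1  # cross-contamination increased the prevalence
```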

Table 10. Experimental data for effects of chlorine on Salmonella prevalence after the immersion chill tank

                       Chlorine level        Prevalence          Prevalence         Factor (1)
                                             before chilling     after chilling
With chlorine          20-50 ppm (tank)
                       4-9 ppm (overflow)
                       1-5 ppm (overflow)
                       15-50 ppm (tank)
Without chlorine

NOTES: (1) Ratio of prevalence after chilling to prevalence before chilling. A ratio >1 indicates an increase in prevalence of contaminated carcasses.

DATA SOURCES: [1] Izat et al., 1989. [2] James et al., 1992a. [3] Cason et al., 1997. [4] Campbell, 1983. [5] James et al., 1992a. [6] James et al., 1992a. [7] Lillard, 1980. [8] Campbell, 1983.

Figure 15. Original and post-intervention concentration distributions.

The effect of reducing the numbers of Salmonella on poultry carcasses, without changing the prevalence of contaminated carcasses, was also assessed, although this was not specifically noted in the CCFH list of questions. The post-intervention concentration distribution was reduced by 50% relative to the baseline scenario (approximately 0.3 log MPN per carcass; Figure 15). The model was run using the reduced level of contamination while maintaining the prevalence at 20%, with no changes to any of the other parameters. Figure 16 compares the per-serving risk estimates for the modified simulation, representing the intervention, with the original data, representing the baseline situation.

Figure 16. Risk per serving distribution before and after intervention to change concentration.

Unlike a change in prevalence, a change in the concentration of the pathogen does not necessarily have a linear relationship with the risk outcome. The distribution shown in Figure 16 is the risk per serving given that the serving is contaminated. Servings were estimated to be contaminated and potentially undercooked approximately 2% of the time, a statistic that remains unchanged even if the level of contamination is reduced.

The expected risk per serving, which incorporates the prevalence of contaminated servings and the probability of undercooking, was estimated to be 11.3 illnesses per million servings in the original case, and 4.28 per million servings when the level of contamination is reduced. The expected risk per serving is therefore reduced by approximately 62%. A summary of the results is shown in Table 11.
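This nonlinearity can be illustrated with a simple exponential dose-response model. Neither the model form nor the parameter `r` below is the one used in the assessment, and the doses are hypothetical; the sketch only shows that a 50% cut in dose generally does not change risk by 50%, with the size of the effect depending on where doses fall on the curve.

```python
import math

def p_ill(dose, r=0.001):
    """Illustrative exponential dose-response model: probability of
    illness after ingesting `dose` organisms (r is a made-up parameter)."""
    return 1.0 - math.exp(-r * dose)

baseline = p_ill(2000)   # original contamination level (hypothetical)
reduced = p_ill(1000)    # after a 50% reduction in numbers (~0.3 log)

# At high doses the curve saturates, so risk falls by less than 50%;
# at low doses the curve is nearly linear, so risk falls by about 50%.
print(1 - reduced / baseline)
```

In the actual assessment the combined effect across the whole distribution of doses was a 62% reduction in expected risk per serving, underlining that the outcome depends on the shape of both the dose distribution and the dose-response curve.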

The risk from cross-contamination events is also affected when the level of contamination is reduced.

Table 11. Risk summary before and after intervention to change concentration

                                             Baseline            After intervention
Expected risk per serving                    1.13 per 100 000    4.28 per million
Number of servings in year
Annual expected risk                         2.94 per 10 000     1.11 per 10 000
Rate of illness per 100 000

Illustrative calculation of the annual expected number of illnesses for a country/region with this annual expected risk:

Assumed population size                      20 000 000          20 000 000
Proportion of population that eats chicken
Potentially exposed population               15 000 000          15 000 000
Expected number of cases in the year

The data available were inconclusive concerning the importance of the various routes by which pathogenic Salmonella are introduced into flocks - including through feed, replacement birds, vectors and poor hygiene. Interpretations of existing studies and results are confounded because of the number of different sampling protocols, specimen types and laboratory methods, as well as the nature of poultry-rearing operations (e.g. very large versus very small premises; types of waterers or feeders). For these reasons, it was not possible to evaluate the importance of on-farm routes of introduction of Salmonella, and this stage was not incorporated into the risk assessment.

Question 2.2.4 - Change in consumer behaviour and its effect on risk (not asked by CCFH)

The consumer represents the final point of intervention for mitigating risk. However, the effectiveness of strategies aimed at changing consumer behaviour is difficult to anticipate and difficult to measure. For the purposes of this assessment, the potential impact on risk of modifying food preparation practices was investigated by running the simulation under the assumption that a strategy to change consumer behaviour had been implemented. The assumed changes were:

- Probability that product is not adequately cooked:
  Baseline: Min = 5%, Most likely = 10%, Max = 15%
  After intervention: Min = 0%, Most likely = 5%, Max = 10%

- Exposure time (minutes):
  Baseline: Min = 0.5, Most likely = 1.0, Max = 1.5
  After intervention: Min = 1.0, Most likely = 1.5, Max = 2.0

The changes are thus assumed to reduce the probability that the consumer does not adequately cook the food and, for those who do tend to undercook, to lessen the degree to which they undercook.
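If the three-point (minimum, most likely, maximum) inputs are treated as triangular distributions (an assumption; a Pert distribution is another common choice for such inputs), the shift in the mean probability of undercooking is easy to compare. The parameter values come from the text above; the helper `tri_mean` is illustrative.

```python
def tri_mean(low, mode, high):
    """Mean of a triangular(min, most-likely, max) distribution."""
    return (low + mode + high) / 3

p_before = tri_mean(0.05, 0.10, 0.15)  # P(not adequately cooked), baseline
p_after = tri_mean(0.00, 0.05, 0.10)   # P(not adequately cooked), after

t_before = tri_mean(0.5, 1.0, 1.5)     # mean exposure time (min), baseline
t_after = tri_mean(1.0, 1.5, 2.0)      # mean exposure time (min), after

# Mean undercooking probability drops from about 10% to about 5%
# (exposure frequency roughly halved), while mean cooking exposure
# time rises from about 1.0 to about 1.5 minutes.
```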

When the simulation model is re-run with these assumptions, the expected risk is reduced from 11.3 per million to 2.2 per million; the changes in consumer practices thus reduce the expected risk per serving by almost 80%. They affect both the frequency with which a potentially contaminated product remains contaminated prior to consumption (the probability of undercooking) and the risk when a potentially contaminated product reaches the consumer (longer cooking time). The distribution of risk per serving before and after the intervention is shown in Figure 17.

Figure 17. Risk distribution per serving before and after intervention to alter consumer behaviour.

It is important to note that the mitigation strategy to alter cooking practices does not address the risk associated with cross-contamination. In the baseline scenario, the expected risk per cross-contamination event was shown to be much larger than the risk from consumption of undercooked chicken. As a result, any strategy to change consumers' cooking practices needs to be tempered by the fact that cross-contamination may in fact be the predominant source of risk, and that the nature of cross-contamination in the home remains highly uncertain.
