

Chapter 2 - Health risks associated with wastewater use


Types of pathogens present in wastewater
Pathogens that reach the field or crop
Pathogen survival under agricultural field conditions
Relative health risk from wastewater use
Agronomic conditions that minimize disease spread when wastewater is used for irrigation
Guidelines for public health protection during wastewater use


There are agronomic and economic benefits of wastewater use in agriculture. Irrigation with wastewater can increase the available water supply or release better quality supplies for alternative uses. In addition to these direct economic benefits that conserve natural resources, the fertilizer value of many wastewaters is important. FAO (1992) estimated that typical wastewater effluent from domestic sources could supply all of the nitrogen and much of the phosphorus and potassium that are normally required for agricultural crop production. Micronutrients and organic matter in the wastewater provide additional benefits.

There are many successful wastewater use schemes throughout the world where nutrient recycling is a major benefit to the project (Pescod and Arar, 1988; FAO, 1992). Rarely, however, is a scheme laid out or planned on the basis of nutrient recycling. The primary constraint to any wastewater use project is public health. Wastewater, especially domestic wastewater, contains pathogens which can cause disease spread when not managed properly. The primary objective of any wastewater use project must therefore be to minimize or eliminate potential health risks.

In most developing countries, direct wastewater use projects are normally centred near large metropolitan areas. These schemes often use only a small percentage of the wastewater generated. The result is that indirect use of wastewater prevails in most developing countries.

Indirect use occurs when treated, partially treated or untreated wastewater is discharged to reservoirs, rivers and canals that supply irrigation water to agriculture. Indirect use poses the same health risks as planned wastewater use projects, but may have a greater potential for health problems because the water user is unaware that wastewater is present. Indirect use is likely to expand rapidly in the future as urban population growth outstrips the financial resources to build adequate treatment works. Where indirect use occurs, the primary objective must also be to ensure that the water is used in a manner that minimizes or eliminates potential health risks.

The health hazards associated with direct and indirect wastewater use are of two kinds: the rural health and safety problem for those working on the land or living on or near the land where the water is being used, and the risk that contaminated products from the wastewater use area may subsequently infect humans or animals through consumption or handling of the foodstuff or through secondary human contamination by consuming foodstuffs from animals that used the area (WHO, 1989).

The survival of pathogens and how they infect a new host needs to be understood in developing a programme to eliminate or minimize health risks. The importance and complexity of the rural health problem for those living and working where wastewater is used is beyond the scope of this document. The focus of this document is on those who handle, prepare or eat the crop after it has been harvested. The health issues associated with wastewater use for the handlers, preparers and consumers of the crop can be broken down into a series of questions (each will be covered in more detail in subsequent sections of this document):

What types of pathogens are likely to be present in the wastewater?

How many and what types of pathogens reach the field or crop?

Are these pathogens likely to survive in sufficient numbers and for sufficient time to be infectious to the handler or consumer?

How significant is the infection route for the various pathogens?

Which crops carry the highest potential for carrying infections to the handler or consumer?

Are there guidelines or limits available to measure the potential for health risk?

Types of pathogens present in wastewater

Wastewater, like natural water supplies into which wastewater has been discharged, is likely to contain pathogenic organisms similar to those in the original human excreta. Disease prevention programmes have centred upon four groups of pathogens potentially present in such wastes: bacteria, viruses, protozoa and helminths. Extensive reviews have been published on the range of these pathogenic organisms normally found in human excreta and wastewater. The most complete reviews are Feachem et al. (1983), Rose (1986) and Shuval et al. (1986a). The following short discussion is extracted from those reviews and is presented to establish a basic understanding of the pathogens and their abundance.

Bacteria. The faeces of a healthy person contain large numbers of bacteria (>10¹⁰/g), most of which are not pathogenic. Pathogenic or potentially pathogenic bacteria are normally absent from a healthy intestine unless infection occurs. When infection occurs, large numbers of pathogenic bacteria will be passed in the faeces, thus allowing the spread of infection to others. Diarrhoea is the most prevalent type of infection, with cholera the worst form. Typhoid, paratyphoid and other Salmonella-type diseases are also caused by bacterial pathogens.

Viruses. Numerous viruses may infect humans and are passed in the faeces (>10⁹/g). Five groups of pathogenic excreted viruses are particularly important: adenoviruses, enteroviruses (including polioviruses), hepatitis A virus, reoviruses and diarrhoea-causing viruses (especially rotavirus).

Protozoa. Many species of protozoa can infect humans and cause diarrhoea and dysentery. Infective forms of these protozoa are often passed as cysts in the faeces and humans are infected when they ingest them. Only three species are considered to be pathogenic: Giardia lamblia, Balantidium coli and Entamoeba histolytica. An asymptomatic carrier state is common in all three and may be responsible for continued transmission.

Helminths. There are many species of parasitic worms or helminths that have human hosts. Some can cause serious illnesses and the ones that pass eggs or larval forms in the excreta are of importance in considering wastewater use. Most helminths do not multiply within the human host, a factor of great importance in understanding their transmission, the ways they cause disease and the effects that environmental change will have on their control. Often the developmental stages (life cycles) through which they pass before reinfecting humans are very complex. Those that have soil, water or plant life as one of their intermediate hosts are extremely important in any scheme where wastewater is used directly or indirectly.

The helminths are classified in two main groups: the roundworms (nematodes) and worms that are flat in cross-section. The flatworms, in turn, may be divided into two groups: the tapeworms, which form chains of "segments", and the flukes, which have a single, flat, unsegmented body. Most of the roundworms that infect humans, and also the schistosome flukes, have separate sexes. The result is that transmission depends upon infection with both male and female worms and upon meeting, mating and egg production within the human body.

Pathogens that reach the field or crop

All the pathogens discussed in the previous section have the potential to reach the field. From the time of excretion, the potential for all pathogens to cause infection usually declines due to their death or loss of infectivity. The ability of an excreted organism to survive outside the human body is referred to as its persistence. For all the organisms, survival is highly dependent on temperature with greatly increased persistence at lower temperatures.

The first exposure of excreted pathogenic organisms outside the body is usually in water. This blend with freshwater is often referred to as sewage. This sewage is then either subjected to treatment prior to discharge, used directly for crop production or discharged to a watercourse where indirect use then occurs downstream. There are many studies on the survival or persistence of excreted organisms in water and sewage. A summary is shown in Table 1.

Many bacterial populations decline exponentially, so that 90 to 99 percent of the bacteria are lost relatively quickly. Survival of bacteria, as of many other organisms, depends greatly on how hostile the environment is, including the presence of other micro-organisms in the water that may compete with or prey on them. Bacteria often survive longer in clean water than in dirty water, but survival in excess of 50 days is most unlikely and, at 20-30°C, 20-30 days is a more common maximum survival time.
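
As an illustration of the exponential decline described above, the short Python sketch below computes how long a first-order die-off takes to remove 90 or 99 percent of a bacterial population. The rate constant used is purely hypothetical; actual rates vary with temperature, water quality and the organism concerned.

```python
import math

def surviving_fraction(days: float, k: float) -> float:
    """Fraction of the initial population still viable under first-order die-off."""
    return math.exp(-k * days)

def days_for_reduction(reduction: float, k: float) -> float:
    """Days needed to lose a given fraction (e.g. 0.90 or 0.99) of the population."""
    return -math.log(1.0 - reduction) / k

# Hypothetical die-off rate constant (per day) for a faecal bacterium at 20-30 deg C.
k = 0.3

for target in (0.90, 0.99):
    print(f"{target:.0%} reduction reached after {days_for_reduction(target, k):.1f} days")
print(f"Fraction surviving after 30 days: {surviving_fraction(30, k):.1e}")
```

With this assumed rate, 90 percent of the population is lost in roughly 8 days and 99 percent in roughly 15 days, consistent with the maximum survival times shown in Table 1.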

Viral survival may be longer than bacterial survival and is greatly increased at lower temperatures. In the 20-30°C range, two months seems a typical survival time, whereas at around 10°C, nine months is a more realistic figure. There is evidence that virus survival is enhanced in polluted waters, presumably as a result of some protective effect that the viruses may receive when they are adsorbed onto suspended solid particles in dirty water.

TABLE 1: Survival times of excreted pathogens in freshwater and sewage at 20-30°C

Pathogen                        | Survival time (days)
Viruses (a)
  Enteroviruses (b)             | <120 but usually <50
Bacteria
  Faecal coliform (a)           | <60 but usually <30
  Salmonella spp. (a)           | <60 but usually <30
  Shigella spp. (a)             | <30 but usually <10
  Vibrio cholerae (c)           | <30 but usually <10
Protozoa
  Entamoeba histolytica cysts   | <30 but usually <15
Helminths
  Ascaris lumbricoides eggs     | Many months

a. In seawater, viral survival is less, and bacterial survival is very much less than in freshwater.
b. Includes polio-, echo-, and coxsackieviruses.
c. V. cholerae survival in aqueous environments is still uncertain.

Source: Feachem et al. (1983).

Protozoal cysts are poor survivors in any environment. A likely maximum in sewage or polluted water would not exceed that shown in Table 1 for Entamoeba histolytica. Helminth eggs vary from the very fragile to the very persistent. One of the most persistent is the Ascaris egg which may survive for a year or more. The major concern for this helminth is that the soil is its intermediate host prior to reinfecting humans.

The survival times shown in Table 1 may be altered by the type or degree of wastewater treatment given to the sewage prior to use or discharge to a water body. Different treatment processes remove pathogens to varying degrees. What is not well understood is whether the treatment process itself creates a hostile environment that accelerates the death of the organisms, or whether it has little direct effect on excreted pathogens and simply allows the time needed for natural die-off to occur.

The critical factor to consider for wastewater use is that most wastewater treatment plants are designed to reduce organic pollution of rivers and lakes and are rarely designed to remove all risks from pathogenic organisms. Therefore, regardless of the level of treatment provided, some pathogenic organisms will reach the agricultural fields when the water is used.

In instances where the sewage water has not received treatment, the level of pathogenic organisms is likely to be higher whether the use is occurring directly from raw sewage or from raw sewage that has been blended with other water supplies. In both instances, pathogenic organisms will reach the agricultural fields. These pathogenic organisms, as with treated sewage, have the potential to contaminate both the soil and the crop depending upon how the irrigation water is used. The critical element is to understand that whether treated, partially treated, or untreated water is used, pathogenic organisms are present and the use site must be managed in a manner that minimizes or eliminates the potential for disease transmission.

TABLE 2: Factors affecting survival time of enteric bacteria in soil

Soil factor                     | Effect on bacterial survival
Antagonism from soil microflora | Increased survival time in sterile soil
Moisture content                | Greater survival time in moist soils and during times of high rainfall
Moisture-holding capacity       | Survival time is less in sandy soils than in soils with greater water-holding capacity
Organic matter                  | Increased survival and possible regrowth when sufficient amounts of organic matter are present
pH                              | Shorter survival time in acid soils (pH 3-5) than in alkaline soils
Sunlight                        | Shorter survival time at soil surface
Temperature                     | Longer survival at low temperatures; longer survival in winter than in summer

Source: Shuval et al. (1986a) as adapted from Gerba et al. (1975).

Pathogen survival under agricultural field conditions

The literature on survival times of excreted pathogens in soil and on crop surfaces has been reviewed by Feachem et al. (1983) and Strauss (1985). As expected, there was wide variability in reported survival times, reflecting the influence of environmental and analytical factors. Table 2 describes several factors affecting the survival time of bacteria in soil. Many of these factors may also affect the survival of other pathogenic organisms.

Knowledge of the survival of pathogens in soil and on the crop allows an initial assessment of the risk of transmitting disease via the foodstuffs produced or through worker exposure. WHO (1989) presented a summary of the potential survival times in agricultural cropping environments (Table 3). WHO concludes that "Available evidence indicates that almost all excreted pathogens can survive in soil... for a sufficient length of time to pose potential risks to farm workers. Pathogens survive on crop surfaces for a shorter time than in the soil as they are less well protected from the harsh effects of sunlight and desiccation. Nevertheless, survival times can be long enough in some cases to pose potential risks to crop handlers and consumers, especially when survival times are longer than crop growing cycles as is often the case with vegetables". While the length of the crop growing cycle is important, equally important is the length of time since the last irrigation (the potential exposure cycle). WHO (1989) points out that excreted pathogens, if they do enter an irrigated area with the irrigation water, have the potential to remain infectious for a considerable period of time; steps must therefore be taken to interrupt this infection cycle.
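
As a minimal illustration of this point, the sketch below compares the interval since the last wastewater irrigation with the "usually less than" survival times on crop surfaces from Table 3 and flags which pathogens could still be viable at harvest or handling. Treating these figures as fixed thresholds is a simplification made purely for illustration.

```python
# "Usually less than" survival times on crop surfaces at 20-30 deg C (days), from Table 3.
SURVIVAL_ON_CROPS_DAYS = {
    "enteroviruses": 15,
    "faecal coliform": 15,
    "Salmonella spp.": 15,
    "Vibrio cholerae": 2,
    "Entamoeba histolytica cysts": 2,
    "Ascaris lumbricoides eggs": 30,
    "hookworm larvae": 10,
    "Taenia saginata eggs": 30,
    "Trichuris trichiura eggs": 30,
}

def possibly_viable(pathogen: str, days_since_last_irrigation: float) -> bool:
    """Rough screen: could the pathogen still be viable on the crop surface?"""
    return days_since_last_irrigation < SURVIVAL_ON_CROPS_DAYS[pathogen]

# Example: a vegetable crop handled 12 days after the last wastewater irrigation.
for pathogen, limit in SURVIVAL_ON_CROPS_DAYS.items():
    status = "possible risk" if possibly_viable(pathogen, 12) else "unlikely"
    print(f"{pathogen} (usually <{limit} days): {status}")
```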

Relative health risk from wastewater use

The discussion in the previous sections shows that a broad spectrum of pathogenic micro-organisms, including bacteria, viruses, helminths and protozoa, is present in wastewater and that these organisms can survive for days, weeks and at times months in the soil and on crops that come into contact with wastewater. Early approaches to measuring the health risk from these pathogenic micro-organisms centred on detection. Because these micro-organisms could survive, detection in any of these environments was considered sufficient to indicate that a public health problem existed. It was then assumed that such detection showed evidence that a real potential for disease transmission existed (Shuval et al., 1986a; Shuval, 1991). This is a "zero-risk" approach. Over the years a number of standards and guidelines have been developed on this zero-risk approach. This led to standards for wastewater use that approached those of drinking water, especially where vegetable crops were being grown.

TABLE 3: Survival times of selected excreted pathogens in soil and on crop surfaces at 20-30°C

Pathogen                        | Survival time in soil      | Survival time on crops
Viruses
  Enteroviruses (a)             | <100 but usually <20 days  | <60 but usually <15 days
Bacteria
  Faecal coliform               | <70 but usually <20 days   | <30 but usually <15 days
  Salmonella spp.               | <70 but usually <20 days   | <30 but usually <15 days
  Vibrio cholerae               | <20 but usually <10 days   | <5 but usually <2 days
Protozoa
  Entamoeba histolytica cysts   | <20 but usually <10 days   | <10 but usually <2 days
Helminths
  Ascaris lumbricoides eggs     | Many months                | <60 but usually <30 days
  Hookworm larvae               | <90 but usually <30 days   | <30 but usually <10 days
  Taenia saginata eggs          | Many months                | <60 but usually <30 days
  Trichuris trichiura eggs      | Many months                | <60 but usually <30 days

a Includes polio-, echo-, and coxsackieviruses.

Source: WHO (1989) as summarized from Feachem et al. (1983).

TABLE 4: Effectiveness of enteric pathogens in causing infection through wastewater irrigation, in relation to their epidemiological characteristics

Enteric pathogens | Persistence in environment | Minimum infective dose | Immunity | Concurrent routes of infection | Latency/soil development stage
Viruses   | Medium       | Low         | Long         | Mainly home contact and food or water     | No
Bacteria  | Short/Medium | Medium/High | Short/Medium | Mainly home contact and food or water     | No
Protozoa  | Short        | Low/Medium  | None/Little  | Mainly home contact and food or water     | No
Helminths | Long         | Low         | None/Little  | Mainly soil contact outside home and food | Yes

Source: Shuval et al. (1986b).

Whether a person becomes infected depends on a number of additional factors, each of which increases or diminishes the actual risk of infection. Feachem et al. (1983) and Shuval et al. (1986b) reviewed these factors and found several that are important for determining the relative health risk during wastewater use:

Excreted load. This refers to the concentration of pathogens passed by an infected person and hence the total number of pathogens released.

Latency. Latency refers to the interval between the time that a pathogen is excreted and the time that it can infect a new host.

Persistence. A measure of how long a pathogen remains viable in the environment after leaving the human body.

Multiplication. A measure of whether a pathogen can multiply outside the human body.

Infective dose. Number of organisms needed to cause infection (this is not easy to predict).

Host response. A measure of the response (immunity) once an individual has received a dose of an infective agent.

Non-human hosts. Some infections are confined strictly to humans while others may need an intermediate host prior to reinfection.

Shuval et al. (1986b) developed a theoretical epidemiological model based on the above factors. The model looked at their relationship to the probability that one of the four enteric pathogen groups described earlier would cause infections in humans through wastewater irrigation. The following factors were considered necessary to cause a high probability of infection:

long persistence in the environment;

low minimal infective dose;

short or no human immunity;

minimal concurrent transmission through other routes such as food, water and poor personal or domestic hygiene; and

long latent period and/or soil development stage required.

Table 4 presents the summary of how Shuval et al. (1986b) rated the five factors when considering the enteric pathogen groups.

The Shuval model shows that helminth diseases, if they are endemic, will be very effectively transmitted by irrigation with raw wastewater. On the other hand, the enteric virus diseases should be the least effectively transmitted by irrigation with raw wastewater. The bacterial and protozoan diseases rank between these two extremes. Shuval et al. (1986b) ranked the pathogens in the following descending order of risk:

1. High: Helminths (the intestinal nematodes - Ascaris, Trichuris, hookworm and Taenia)

2. Lower: Bacterial infections (e.g. cholera, typhoid and shigellosis) and protozoan infections (e.g. amoebiasis, giardiasis)

3. Least: Viral infections (viral gastroenteritis and infectious hepatitis)

This ranking is consistent with the theoretical considerations noted by Feachem et al. (1983) where the determinations were made on factors other than wastewater use. Shuval et al. (1986b) reviewed the available epidemiological evidence to determine whether the theoretical model fitted the empirical evidence. This review concluded that there is evidence of disease transmission in association with the use of raw or partially treated wastewater. This evidence points most strongly to the helminths as the number one problem, particularly in developing countries. There was limited transmission of bacterial and virus disease. The empirical evidence therefore points to the usefulness of the theoretical model and especially the priority ranking for the potential threat of disease transmission. The Shuval model (Table 4) and the rationale behind the ranking of pathogens shown above were reviewed in the World Bank/WHO-sponsored Engelberg Report (IRCWD, 1985) that obtained the endorsement of an international group of environmental experts and epidemiologists.

Agronomic conditions that minimize disease spread when wastewater is used for irrigation

The previous discussions demonstrate that a potential for disease transmission exists when wastewater is used for irrigation. Pathogens that are brought in with the wastewater can survive in the soil or on the crop. The actual risk of disease transmission, however, is related to whether this survival time is long enough to allow transmission to a susceptible host. The crop and the field are the link between the pathogen in the wastewater and the potential for infection. The factors controlling transmission of disease are agronomic, such as the crop grown, the irrigation method used to apply the wastewater, and the cultural and harvesting practices used.

The choice of crops for wastewater use areas depends upon a number of factors. The crop grown must be suited to the agronomic conditions in the area. Determining factors include climate, soils, available water, pest control, marketing and farmer skills. These and other general agronomic considerations are covered in numerous publications and will not be discussed in detail here. Another factor of importance for wastewater use areas is water quality, which can affect the soil, crop growth or the consumer of the crop. Water quality impacts on the soil and on crop growth are discussed in detail in Ayers and Westcot (FAO, 1985) and will not be covered here. The microbiological quality of the water can directly affect the consumer because of the risk of infection carried on the crop. Shuval et al. (1986a) defined three levels of risk in selecting a crop to be grown. They are presented here in increasing order of public health risk (a simple screening sketch follows the lists):

Low(est) risk to consumer but field worker protection still needed

1. Crops not for human consumption (for example cotton, sisal).

2. Crops normally processed by heat or drying before human consumption (grains, oilseeds, sugar beet).

3. Vegetables and fruit grown exclusively for canning or other processing that effectively destroys pathogens.

4. Fodder crops and other animal feed crops that are sun-dried and harvested before consumption by animals.

5. Landscape irrigation in fenced areas without public access (nurseries, forests, green belts).

Increased risk to consumer and handler

1. Pasture, green fodder crops.

2. Crops for human consumption that do not come into direct contact with wastewater, on condition that none must be picked off the ground and that spray irrigation must not be used (tree crops, vineyards, etc.).

3. Crops for human consumption normally eaten only after cooking (potatoes, eggplant, beetroot).

4. Crops for human consumption, the peel of which is not eaten (melons, citrus fruits, bananas, nuts, groundnuts).

5. Any crop not identified as high-risk if sprinkler irrigation is used.

Highest risk to consumer, field worker and handler

1. Any crops eaten uncooked and grown in close contact with wastewater effluent (fresh vegetables such as lettuce or carrots, or spray-irrigated fruit).

2. Landscape irrigation with public access (parks, lawns, golf courses).
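
As noted above, the following sketch shows one way the three risk levels could be encoded for a first screening of a proposed cropping plan. The crop names are abbreviated examples drawn from the lists above, not an authoritative classification.

```python
# Illustrative (not exhaustive) mapping of example crops to the three risk levels
# described above (after Shuval et al., 1986a). A real assessment would also have to
# consider the irrigation method, harvesting practice and local conditions.
CROP_RISK_LEVEL = {
    "cotton": "low", "sisal": "low", "sugar beet": "low", "wheat": "low",
    "potatoes": "increased", "citrus": "increased", "melons": "increased",
    "lettuce": "highest", "carrots": "highest",
}

def screen_crop(crop: str) -> str:
    level = CROP_RISK_LEVEL.get(crop)
    if level is None:
        return f"{crop}: not listed; assess against the category descriptions above"
    return f"{crop}: {level} risk"

for crop in ("cotton", "potatoes", "lettuce", "spinach"):
    print(screen_crop(crop))
```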

Another path of infection is from direct contact with the crop or soil in the area where wastewater was used. This path is directly related to the level of protection needed for field workers. The only feasible means of dealing with the worker safety problem is prevention. The following are a few of many low and high risk situations:

Low risk of infection

Mechanized cultural practices
Mechanized harvesting practices
Crop is dried prior to harvesting
Long dry periods between irrigations

High risk of infection

High dust areas
Hand cultivation
Hand harvest of food crops
Moving sprinkler equipment
Direct contact with irrigation water

Guidelines for public health protection during wastewater use

International guidelines or standards for the microbiological quality of irrigation water used on a particular crop do not exist. The reason is the lack of direct epidemiological data to show any relationship between the quality of water actually applied at the field level and disease transmission or infection. The only known guideline is from the US Environmental Protection Agency (prepared by the US National Academy of Sciences) which establishes the maximum acceptable level for irrigation with natural surface water, including river water, at 1000 faecal coliforms per 100 ml. This was based on testing of a limited number of rivers and canals used for irrigation between 1965 and 1972 and focused on the presence of pathogens not on epidemiological data (US EPA, 1973).

Because of the lack of direct epidemiological data, standards and guidelines for the quality of wastewater used for irrigation have focused on effluent quality at the wastewater treatment plant rather than on quality at the point of use. These effluent standards have generally specified both maximum concentrations of bacteria and minimum treatment levels according to the class of crop to be irrigated (consumed versus not consumed). Such standards are most often used for process control at wastewater treatment plants, and few checks have been made of the actual microbiological quality of the water at the place where it is used for irrigation.

The earliest effluent standards for wastewater treatment plants were expressed in terms of the maximum permissible number of faecal coliform bacteria. In practice, the faecal coliform count was a reasonable indicator of bacterial pathogens, since their environmental survival characteristics and rates of removal or die-off in treatment processes were similar; faecal coliforms therefore made good indicators of treatment efficiency (WHO, 1989). As wastewater treatment and disinfection technology advanced, stricter effluent standards were often adopted without regard to the risks associated with use of the water. Standards as recent as 20 years ago were based on a "zero-risk" concept, aiming to achieve a pathogen-free effluent without regard to pathogen-host relationships or to valid epidemiological evidence of disease transmission caused by the practice of wastewater use (Hespanhol and Prost, 1994). Because these technology-based standards could theoretically be met, the maximum permissible levels were set correspondingly low in countries with this advanced level of technology. For example, the California (USA) State Health Department adopted a bacterial standard for unrestricted wastewater irrigation of <2.2 total coliforms/100 ml, which was close to the existing drinking water standard. Many countries followed this lead and adopted the same criteria with little or no adaptation to local constraints or to the level of technology available to meet this standard (Shuval, 1991).

WHO, supported by a group of specialists, recognized that the extremely strict California standard for wastewater use that was being adopted by many countries was not justified by the available epidemiological evidence nor was it likely that many countries, especially developing countries, could meet this strict standard. The WHO group of experts recommended a microbial guideline for unrestricted irrigation of all crops of not more than 100 total coliforms per 100 ml (WHO, 1973). This was a significant liberalization. The WHO group of experts also recognized the lack of sound epidemiological data and recommended that future wastewater irrigation guidelines be given a sounder epidemiological basis.

Extensive epidemiological evidence has been accumulated since the initial 1973 WHO Guidelines (Feachem et al., 1983; Blum and Feachem, 1985; Rose, 1986; Shuval et al., 1986a). This evidence was reviewed at international meetings in Engelberg (IRCWD, 1985) and Adelboden (Mara and Cairncross, 1989). The consensus of health experts is that the actual risk associated with irrigation with treated wastewater is much lower than previously estimated particularly with respect to bacterial pathogens. On the other hand, they raised the level of concern for parasitic diseases which they felt were the main risk for individual and overall public health associated with the use of insufficiently treated wastewater in agriculture. This is consistent with the relative risk assessment presented in Table 4.

Based on an epidemiological review, a WHO Scientific Group on Health Guidelines for the Use of Wastewater in Agriculture and Aquaculture adopted the microbiological quality guidelines for wastewater use in agriculture shown in Table 5 (WHO, 1989). These new guidelines recommend less stringent values for faecal coliforms than were previously recommended by WHO in 1973. The new guidelines are stricter than previous standards concerning the need to reduce helminth egg concentrations in effluent. The guidelines do not refer specifically to protozoa; it was implied that if the helminth egg level could be reached, equally high removals of all protozoa would also be achieved. The purpose of applying the helminth standard throughout all cropping systems was to increase the level of protection for agricultural workers, who are at high risk from intestinal nematode infection (Mara and Cairncross, 1989). The review also concluded that no bacterial guideline was needed for protection of the agricultural worker since there was little evidence indicating a risk to such workers from bacteria; the WHO Scientific Group expected some degree of reduction in bacterial concentration associated with efforts to meet the helminth reduction levels (Figure 1) (WHO, 1989).

It is important to remember that the guidelines in Table 5 are for the microbiological quality of treated effluent from a wastewater treatment plant when that water is intended for crop irrigation. The WHO Scientific Group on Health Guidelines intended the guidelines in Table 5 as design goals in planning wastewater treatment plants; they were not intended as standards for quality surveillance or routine monitoring of irrigation water (Mara and Cairncross, 1989). In reality, however, the planning, design and construction of wastewater treatment facilities that can consistently meet the present WHO Guidelines will take decades.

With exploding urban populations, the degree of river and irrigation water supply contamination in developing countries will likely increase. Pressure will also increase to utilize partially treated wastewater for irrigation until adequate treatment facilities can be constructed. Because of this increasing level of irrigation water contamination there is an immediate need to control wastewater use in high risk cropping systems such as vegetable crop production. Adequate control, however, can only come about when guidelines or regulations are in place that define the quality of water that can be safely applied to the cropland. The present guidelines of WHO, although intended as design goals for wastewater treatment plants, could be used as interim irrigation water standards for regulating cropping practices. These guidelines could be applied in areas where wastewater is utilized directly for irrigation or where use is indirect by diversion of contaminated river water supplies.

Even though there is a lack of data to define whether the WHO Guidelines could be used as irrigation water standards, their potential use is implied in the WHO discussion of handling partially treated wastewater, which stated, "A lesser degree of removal [than needed to achieve the recommended guideline quality for unrestricted irrigation] can be accepted if other health protection measures are envisaged, or if the quality of the wastewater will be further improved after treatment, whether by dilution in naturally occurring waters, by prolonged storage or by transport over long distances in a river or canal" (WHO, 1989). Bartone (1991) in a review of effluent irrigation also implied that if the WHO Guidelines were routinely applied, no undue health risk of infectious disease transmission in effluent irrigation projects should arise. These statements recognize the importance of at least partial treatment and other steps that may occur prior to irrigation use.

TABLE 5: Recommended microbiological quality guidelines for wastewater use in agriculture (a)

Category A
Reuse condition: Irrigation of crops likely to be eaten uncooked, sports fields, public parks (d)
Exposed group: Workers, consumers, public
Intestinal nematodes (b) (arithmetic mean no. of eggs per litre (c)): ≤ 1
Faecal coliforms (geometric mean no. per 100 ml (c)): ≤ 1000 (d)
Wastewater treatment expected to achieve the required microbiological quality: A series of stabilization ponds designed to achieve the microbiological quality indicated, or equivalent treatment

Category B
Reuse condition: Irrigation of cereal crops, industrial crops, fodder crops, pasture and trees (e)
Exposed group: Workers
Intestinal nematodes (b) (arithmetic mean no. of eggs per litre (c)): ≤ 1
Faecal coliforms (geometric mean no. per 100 ml (c)): No standard recommended
Wastewater treatment expected to achieve the required microbiological quality: Retention in stabilization ponds for 8-10 days or equivalent helminth and faecal coliform removal

Category C
Reuse condition: Localized irrigation of crops in cat. B if exposure of workers and the public does not occur
Exposed group: None
Intestinal nematodes (b) (arithmetic mean no. of eggs per litre (c)): Not applicable
Faecal coliforms (geometric mean no. per 100 ml (c)): Not applicable
Wastewater treatment expected to achieve the required microbiological quality: Pretreatment as required by the irrigation technology, but not less than primary sedimentation

a In specific cases, local epidemiological, socio-cultural and environmental factors should be taken into account, and the guidelines modified accordingly.

b Ascaris and Trichuris species and hookworms.

c During the irrigation period.

d A more stringent guideline (≤ 200 faecal coliforms per 100 ml) is appropriate for public lawns, such as hotel lawns, with which the public may come into direct contact.

e In the case of fruit trees, irrigation should cease two weeks before fruit is picked, and no fruit should be picked off the ground. Sprinkler irrigation should not be used.

Source: WHO (1989).

FIGURE 1: Generalized removal curves for BOD, helminth eggs, excreted bacteria, and viruses in waste stabilization ponds at temperatures above 20°C

(Source: Shuval et al., 1986b)

In spite of the lack of experience in using the present WHO Guidelines as irrigation water standards or for routine monitoring of areas directly or indirectly using wastewater for irrigation, the present situation requires that interim irrigation water standards be established. Until sufficient epidemiological information is available, it seems prudent to utilize the 1989 WHO Guidelines for controlling the quality of water used to irrigate vegetable or other high-risk crops. These Guidelines should not be considered a level to which quality can deteriorate; rather, they should be a performance goal for those water supplies which presently exceed this level. The goal would be to control the use of wastewater in cropping areas that would present a high risk of disease spread. Using the WHO Guidelines as irrigation standards would help to:

assess the extent of contamination;

reduce the disease infection risk until suitable wastewater treatment works are in place;

improve the basic health level in the rural areas; and

provide data that can be used in long-term planning for wastewater management in the agricultural sector.

Shuval et al. (1986b) stressed that a major or total reduction in negative health effects could be achieved if the greatest emphasis is placed on helminth egg removal during wastewater treatment. The dilemma is that little or no experience is available in using helminth egg concentrations in irrigation water monitoring, nor are there well-understood monitoring techniques available. Because of this shortcoming, the initial emphasis in using the WHO Guidelines should be on the faecal coliform guideline. Monitoring and evaluation techniques for faecal coliforms are well understood.
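
For example, the faecal coliform guideline for unrestricted irrigation (category A in Table 5) is expressed as a geometric mean of not more than 1000 per 100 ml during the irrigation period. The sketch below checks a series of monitoring samples against that figure; the sample counts are invented for illustration only.

```python
import math

def geometric_mean(counts):
    """Geometric mean of faecal coliform counts (all counts must be positive)."""
    return math.exp(sum(math.log(c) for c in counts) / len(counts))

# Hypothetical counts (faecal coliforms per 100 ml) sampled during the irrigation period.
samples = [400, 1200, 800, 2500, 600, 900]

guideline = 1000  # WHO (1989) category A guideline, geometric mean per 100 ml
gm = geometric_mean(samples)
verdict = "meets" if gm <= guideline else "exceeds"
print(f"Geometric mean: {gm:.0f} per 100 ml -> {verdict} the category A guideline")
```

Because the guideline is expressed as a geometric mean, a single high count (here 2500 per 100 ml) does not necessarily push the result above it; the samples above work out to roughly 900 faecal coliforms per 100 ml.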

The faecal coliform level defined in the WHO Guidelines is already being used in the USA as a water quality guideline. The US Environmental Protection Agency (EPA), together with the National Academy of Sciences (NAS), have recommended that the acceptable guideline for irrigation with natural surface water, including river water containing wastewater discharges, be set at 1000 faecal coliforms per 100 ml (US EPA, 1973). The US EPA level is also consistent with the 1000-2000 faecal coliforms per 100 ml level used as a standard for bathing in Europe (WHO, 1989). The US EPA Guideline has been adopted in some countries as an irrigation water quality standard. For example in Chile, NCh 1333 dated 1978 establishes the US EPA faecal coliform level as an irrigation standard.

It should not be assumed that using only the faecal coliform guideline as the irrigation water standard is equivalent to applying the full WHO Guidelines. WHO has also stressed the use of the helminth egg level, but a lack of experience in applying this helminth guideline makes it difficult to implement for routine surveillance. It is unclear whether water meeting the faecal coliform level through dilution or natural bacterial die-off would pose the same or a higher risk than water of a similar coliform concentration coming from a wastewater treatment plant. WHO (1989) considers that the wastewater treatment process lowers the helminth egg level, but it is unclear whether the same reduction would occur with untreated or partially treated wastewater that is diluted in natural river flow or where bacterial die-off has occurred. This concern demonstrates a potential shortcoming of using only the faecal coliform portion of the WHO Guidelines as an irrigation water standard.

TABLE 6: Faecal coliforms in rivers (number of rivers tested in each region)

Number of faecal coliforms per 100 ml | North America | Central and South America | Europe | Asia and the Pacific
<10                    | 8  | 0  | 1  | 1
10-100                 | 4  | 1  | 3  | 2
100-1 000              | 8  | 10 | 9  | 14
1 000-10 000           | 3  | 9  | 11 | 10
10 000-100 000         | 0  | 2  | 7  | 2
>100 000               | 0  | 2  | 0  | 3
Total number of rivers | 23 | 24 | 31 | 32

Source: WHO (1989).

There are no helminth egg data available on most rivers that are carrying a percentage of partially or untreated wastewater. Considerable data are available for faecal coliform levels. Table 6 shows that in about 45 percent of the 110 rivers tested throughout the world, the faecal coliform levels exceeded the WHO Guideline, illustrating that river contamination levels are already high and not likely to improve rapidly until treatment facilities are built.

Programmes to reduce risk often focus on the most highly contaminated waters first. Table 6 shows that nearly 15 percent of the rivers tested worldwide had faecal coliform levels ten or more times greater than the WHO Guidelines. Water from such rivers is widely used for irrigation without any restrictions on its use.
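
The two percentages quoted above can be reproduced directly from Table 6. The sketch below sums the regional counts for each faecal coliform class and expresses the rivers exceeding 1 000 and 10 000 per 100 ml as fractions of the 110 rivers tested.

```python
# Number of rivers in each faecal coliform class, summed across the four regions of Table 6.
rivers_by_class = {
    "<10": 10,             # 8 + 0 + 1 + 1
    "10-100": 10,          # 4 + 1 + 3 + 2
    "100-1 000": 41,       # 8 + 10 + 9 + 14
    "1 000-10 000": 33,    # 3 + 9 + 11 + 10
    "10 000-100 000": 11,  # 0 + 2 + 7 + 2
    ">100 000": 5,         # 0 + 2 + 0 + 3
}

total = sum(rivers_by_class.values())  # 110 rivers
over_guideline = sum(rivers_by_class[k] for k in ("1 000-10 000", "10 000-100 000", ">100 000"))
over_ten_times = rivers_by_class["10 000-100 000"] + rivers_by_class[">100 000"]

print(f"Exceeding 1 000 per 100 ml: {over_guideline} of {total} rivers ({over_guideline / total:.0%})")
print(f"Exceeding 10 000 per 100 ml: {over_ten_times} of {total} rivers ({over_ten_times / total:.0%})")
```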

Dramatic initial success in disease reduction can be achieved by concentrating efforts in the worst contaminated areas. As the Chile example shows, however, disease rates may still remain high and extending crop restrictions to a wider area will be difficult. As contamination levels are expected to remain high for the foreseeable future, there needs to be an equal emphasis on defining and promoting safe production areas for high-risk crops such as vegetables. A discussion of how to utilize irrigation water quality guidelines to define these safe production areas is given in Chapter 4.

IMPACT OF WASTEWATER DISCHARGES ON RIVER WATER QUALITY

The impact of treatment works can be seen in contrasting examples. In the irrigated areas that surround Metropolitan Santiago, Chile, more than 60% of the irrigated area receives river water with faecal coliform levels in excess of the WHO Guidelines. The cause of these high levels is untreated and unrestricted discharges into the rivers (Figure 2). Chile began (1992) a vigorous programme to implement adequate treatment facilities, but this programme is likely to take 10-20 years or more to complete. In contrast, tests of the quality of river water used in North America for unrestricted irrigation show that more than 90% of the rivers had faecal coliform levels below the WHO Guidelines (Table 6). These levels, however, have only been achieved after an aggressive and costly programme over the last 30 years to upgrade wastewater treatment facilities.

FIGURE 2: Percentage of the irrigated area affected by various levels of faecal contamination in the source of irrigation supply water within the Metropolitan Region of Chile (Source: FAO, 1993)

IMPACT OF CROP RESTRICTIONS ON DISEASE LEVELS NEAR SANTIAGO, CHILE

For example, Figure 2 shows that almost 60% of the irrigated area within the Metropolitan Region of Chile (Santiago) uses water in excess of 10 000 faecal coliforms per 100 ml. In 1992, as a result of a cholera outbreak, the Government of Chile began a vigorous crop restriction programme in the areas with the worst contamination. Preliminary data show that, in addition to controlling the cholera outbreak, there has been a dramatic decline in the cases of hepatitis and typhoid (Figure 3).

FIGURE 3: Cases of hepatitis and typhoid reported in Chile (Source: FAO, 1993)

