

Chapter 4 - Drainage water re-use

Michael C. Shannon, USDA Salinity Laboratory, Riverside, California, USA
Vashek Cervinka, California Department of Food and Agriculture, California, USA; and Dick A. Daniel, CALFED Bay Delta Program, Sacramento, California, USA


Re-use for crop irrigation
Re-use for saline agriculture and forestry
Re-use in a natural wetland


Re-use is an important and natural method of managing drainage water. In order to develop the maximum benefit from a water supply and to help dispose of drainage waters, strategies for water re-use have evolved. Water re-use must be balanced against both short- and long-term needs, with consideration for both local and off-site effects. In regions where irrigation water supplies are limited, drainage water can be used to supplement them. However, the quality of the drainage water determines which crops can be irrigated. Highly saline drainage water cannot be used to irrigate salt-sensitive crops. It could, however, be re-used on tolerant forages or in a saline agriculture-forestry system, in which saline drainage water is sequentially re-used for the irrigation of salt-tolerant crops and trees. Where an irrigation project is located near a natural wetland, the drainage water can be re-used in the wetland. However, precautions will have to be taken to ensure that the quality of the drainage water does not harm fish, waterfowl or other wildlife in the wetland and that the amount of water passing through the wetland is sufficient to prevent toxic concentrations from developing.

Re-use for crop irrigation


Effects of salinity on crop growth and yield
Agricultural management practices
Managing cycling and blending strategies


The major factor degrading re-used waters is their high concentration of ions. Waters with low ionic concentrations provide plants with an adequate supply of many of the essential nutrients needed for growth. However, as salinity increases, specific ions may become toxic or interfere with the uptake of other nutrients. In soils, the accumulation of ions increases the osmotic potential against which plants extract water. It can also degrade soil structure. Drainage and leaching of salts from the root zone are key factors in the management of salinity in agriculture. Another management factor is the exploitation of the wide range of salt tolerance expressed among crop species. Water re-use for agricultural crops has distinct economic incentives and a number of crops are known to be highly tolerant to salinity. However, as salinity increases in the irrigation water, there is a greater need to monitor and manage irrigation and drainage practices and to consider the sustainability of the system.

It is possible to re-use agricultural drainage water safely if the characteristics of the water, the soil and the intended crop plants are known and can be economically managed. Poor quality water requires selection of crops with appropriate salt tolerances, improvements in water management, and maintenance of soil structure and permeability (tilth, hydraulic conductivity). When sensitive growth stages such as germination and early seedling growth are excluded, the time-weighted mean root zone salinity has been found to be a valid measure for evaluating crop response to salt: plants respond to the arithmetic mean salinity within the rooting depth, integrated over the time of exposure during a specific growth period.

Effects of salinity on crop growth and yield

Salinity in water or soil is an environmental factor that, in general, reduces crop growth. At relatively low salinity, especially among crop species such as cotton or the halophytic sugar beet, some salinity may actually improve crop production. This effect has been attributed in some instances to an improvement in the water use efficiency of the plant (Letey, 1993). However, as salinity increases beyond some threshold tolerance, yield decline is inevitable. Usually, at low to moderate salinities, plant growth is simply reduced and there is a slight darkening in leaf colour. These effects are difficult to detect without comparisons with non-saline controls. When salt concentrations in the soil water reach toxic levels, leaves or shoots may exhibit visible symptoms of tip or edge burning or scorching due to high internal concentrations of salts. Other visible symptoms may be associated with nutrient imbalances caused by competitive interactions between Na+ and Ca2+ or K+, or between Cl- and nitrate (Grattan and Grieve, 1992). Interactions of salinity with nutrients such as PO43-, Mg and micronutrients have also been reported, but these observations seem to be specific to certain crops and waters of specific ionic composition. Depending upon crop species and salinity concentration, salt in the crop root zone may also influence the rate of plant development by increasing or decreasing the time to crop maturity (Shannon et al., 1994). In most crops there are some notable differences in root/shoot ratio, a response that would not be identifiable in the field. In some crops, salinity changes plant growth habit or increases succulence (Luttge and Smith, 1984; Shannon et al., 1994).

There is a wide range in plant species response to salinity. Sugar beet, sugar cane, dates, cotton and barley are among the most salt tolerant, whereas beans, carrots, onions, strawberries and almonds are considered sensitive (Maas, 1986). In general, salinity decreases both yield and quality in crops, and previous research has led to the development of large databases on the salt tolerances of many crop species and varieties (Francois and Maas, 1978 and 1985; Maas, 1990; Maas and Hoffman, 1977). Salt tolerance can be represented most simply by two parameters: the threshold salinity (t), beyond which an initial significant reduction in the maximum expected yield (Y) occurs; and the slope (s) of the yield decline. The slope is simply the rate at which yield is expected to be reduced for each unit of added salinity beyond the threshold value. The formula to calculate relative yield is:

YR = Y - s (ECe - t), where ECe > t [1]
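To make the calculation concrete, the threshold-slope model of Equation 1 can be evaluated in a few lines of code. The sketch below (Python) is illustrative only: the (t, s) pairs are rounded figures of the kind tabulated in the salt-tolerance literature, not authoritative parameters for any particular variety.

    # Sketch: relative yield from the threshold-slope model (Equation 1).
    # The (t, s) values below are rounded, illustrative figures; consult
    # published salt-tolerance tables (e.g., Maas, 1986) for actual data.

    def relative_yield(ece, threshold, slope, y_max=100.0):
        """Relative yield (%) at root zone salinity ece (dS/m)."""
        if ece <= threshold:
            return y_max                              # no reduction below t
        return max(0.0, y_max - slope * (ece - threshold))

    # Illustrative (t, s) pairs spanning sensitive to tolerant classes.
    crops = {"bean": (1.0, 19.0), "corn": (1.7, 12.0),
             "wheat": (6.0, 7.1), "barley": (8.0, 5.0)}

    for name, (t, s) in crops.items():
        print(name, [round(relative_yield(ec, t, s)) for ec in (2, 4, 8, 12)])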

Figure 5 shows the general relationship between relative crop yield and soil salinity. The threshold value and the slope of the yield decline line for beans, corn, wheat and barley are shown to illustrate different crop responses in each of the broad tolerance classifications.

Usually, salinity is measured in units of electrical conductivity of a saturated soil paste extract (ECe) taken from the root zone of the plant as averaged over time and depth. Soil paste extracts are water taken from soil samples which are brought up to their water saturation points (Ayers and Westcot, 1985). Electrical conductivities are measured on the filtered water extracts from these samples in units of deciSiemens per metre (dS/m). For most natural waters, 1 dS/m is equivalent to about 640 mg/litre salts. New methods use electronic probes or electromagnetic pulses to estimate ECe with less time and effort (Rhoades, 1976 and 1993).
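Where a quick conversion between EC and approximate salt concentration is needed, the factor quoted above (1 dS/m equivalent to about 640 mg/litre for most natural waters) can be applied directly, as in this minimal sketch (Python):

    def ec_to_tds(ec_dsm, factor=640.0):
        # Approximate TDS (mg/litre) from EC (dS/m); ~640 for most natural waters.
        return ec_dsm * factor

    print(ec_to_tds(7.0))   # about 4 480 mg/litre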

FIGURE 5 Relative crop yield versus ECe

Where there is uncertainty in the determination of the t and s parameters in Equation 1, crops have been classified in a general range from sensitive to tolerant based upon available data (Francois and Maas, 1994; Maas, 1986). Crop salt-tolerance values and classifications are extremely useful in predicting the effects of re-using a water of a specific salinity on a particular crop.

Most salt-tolerance data are based upon the effects of saline waters dominated by sodium chloride, sometimes with varied amounts of calcium as needed to avoid the development of soil permeability problems associated with soil sodicity. Drainage waters, or waters re-used from agricultural processing or manufacturing operations, may be dominated by other chemical species and/or have high concentrations of B, Se, As or other ions that may be environmental hazards (Francois and Clark, 1979b; Clark, 1982).

Agricultural management practices

Irrigation systems and scheduling

Traditionally, the goal of agriculture has been to maximize yield and profit, defined by the net difference between inputs and outputs. To maximize yields, soil salinity is reduced by leaching. The kind of management directed toward this goal is mediated by costs of water, drainage, applied nutrients and amendments to soils and waters. Where water quality and/or quantity is limited and where there are restrictions on drainage, the hazardous effects of salinity on crop growth and the dangers of either insufficient or excessive leaching are serious concerns. However, crops can be grown with saline waters provided that suitable irrigation and cropping strategies are used (Rhoades, 1988). Management needs to be more intensive and more precise methods should be used for water application and distribution. Water requirements needed for crop use and leaching should be accurately assessed and provided in a timely manner. Most crop water use coefficients have been developed for non-saline situations (Erie et al., 1982). With saline water, growth and consumptive water use are reduced. However, preliminary evidence indicates that some crop water use coefficients may apply over a range of salinities because the water unused by the crop is needed to offset the increase in the leaching fraction requirement (Letey and Dinar, 1986). The greater the salinity of the irrigation water, the greater the need for adequate irrigation and drainage. Rates of salt accumulation in the soil are dependent upon the amount and concentration of the saline water applied and the amount remaining after plant water needs have been met. In a properly managed sustainable system, there may be a high salt content but no continuing accumulation.

Soil salinity is spatially and temporally non-uniform. Salinity often rises during the summer and then decreases with winter rains. This is a desirable condition which helps reduce permanent salt accumulation. However, under such circumstances, correlating yield responses to such variable salinity is difficult. Best management practices include land grading as a method to provide uniform surface irrigation, or the use of sprinkler or drip irrigation.

Crops irrigated with sprinklers are subject to injury not only from salts in the soil but also from salts absorbed directly through wetted leaf surfaces (Maas, 1985). In general, plants with waxy leaves are less susceptible than others. When saline sprays come into direct contact with leaf surfaces, salts can accumulate to toxic levels in leaf tissue and cause decreased yields. In tree and vine crops, the extent to which leaves are wetted can be minimized by sprinkling under the canopy. However, even with under-canopy sprinklers, severe damage to the lower leaves can occur (Harding et al., 1958). The extent of foliar injury depends on the concentration of salt in the leaves, weather conditions and water stress. For example, salt concentrations that cause severe leaf injury and necrosis after a day or two of hot, dry weather may not cause any symptoms when the weather remains cool and humid. As foliar injury is related more to frequency of sprinkling than to duration (Francois and Clark, 1979a), infrequent heavy irrigations should be applied rather than frequent light irrigations. Slowly rotating sprinklers that allow drying between cycles should be avoided, as this approach increases the wetting-drying frequency. Sprinkling should be done at night or in the early morning when evaporation is lower. In general, poorer quality water can be used for surface-applied irrigation than for sprinkler irrigation.

Drip irrigation gives the greatest advantages when saline water is used. Drip irrigation avoids wetting of the leaves with saline water and can be managed to maintain relatively high soil water potentials. As drip irrigation is normally applied frequently, there is a continuous leaching of the soil volume from which the plant extracts water. General leaching can be provided intermittently between growing seasons and it is supplemented by seasonal rainfall.

Leaching requirement

In order to calculate the leaching requirement (LR), root zone salinity estimates are typically weighted to account for water uptake at different depths. The root zone may be conceptually divided by depth into four quarters, with a typical water uptake distribution pattern of 40:30:20:10 starting with the shallowest quarter. This assumes a 'normal' rooting distribution, but modifications should be made according to the frequency or type of irrigation (e.g., drip, sprinkler, alternate furrow). Of the various methods for calculating LR, the simplest is the equation proposed by Rhoades (1974) used in conjunction with salt-tolerance yield parameters:

LR = ECi / (5 ECe - ECi) [2]

where ECi is the electrical conductivity of the irrigation water, ECe is the tolerable soil salinity, taken either at the threshold t or at the acceptable yield level below t, and LR is the fraction of the irrigation water that must be passed through the root zone to control salinity.
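A short sketch (Python) shows how Equation 2 can be combined with the 40:30:20:10 depth weighting described above. The soil profile and water quality values are hypothetical.

    # Sketch: depth-weighted root zone ECe plus the leaching requirement
    # of Equation 2 (Rhoades, 1974). All input values are hypothetical.

    def weighted_rootzone_ece(ece_by_quarter, weights=(0.4, 0.3, 0.2, 0.1)):
        """Depth-weighted mean ECe (dS/m) over four root zone quarters."""
        return sum(w * e for w, e in zip(weights, ece_by_quarter))

    def leaching_requirement(ec_i, ec_e):
        """LR = ECi / (5 ECe - ECi); ec_e is the tolerable soil salinity."""
        denominator = 5.0 * ec_e - ec_i
        if denominator <= 0:
            raise ValueError("ECi is too high relative to the tolerable ECe")
        return ec_i / denominator

    print(weighted_rootzone_ece((3.0, 4.0, 6.0, 9.0)))          # 4.5 dS/m
    print(round(leaching_requirement(ec_i=2.0, ec_e=4.0), 3))   # 0.111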

Other management techniques

There are a number of cultivation and management methods which may offset yield loss when saline water is used for crop production. Some simple methods include: using more vigorous cultivars; using screens to select seed of larger size and higher seed weight; increasing population densities to offset smaller plant size and reductions in tiller number; and placing seed on sloped or modified seed-beds so that salts drawn to the surface by evaporation accumulate away from the seed line. Another way to move salts away from the seed or plant base is to irrigate in alternate furrows. An additional method of ameliorating the effects of moderately saline drainage water is to match fertilization requirements with the chemical composition of the water and soils. As drainage waters often contain some of the essential fertilization requirements of plants, applied fertilizers can be reduced. On the other hand, as ion interaction and competition reactions affect plant uptake, additions of calcium, potassium or some other element may be required for better crop growth. An important consideration is that high sodium concentrations in soils and waters may decrease soil permeability and prevent effective leaching of salts. The suitability of soils and waters in this respect is assessed using a parameter called the sodium adsorption ratio (SAR). Soil amendments are sometimes a necessary part of salinity management schemes to reduce the SAR. On the other hand, a high EC (i.e., a high total salt concentration) in the water tends to increase soil permeability.
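The SAR mentioned above is computed from the sodium, calcium and magnesium concentrations of the water. A minimal sketch (Python) using the standard definition follows; the water sample values are hypothetical.

    from math import sqrt

    def sar(na, ca, mg):
        """Sodium adsorption ratio; na, ca and mg in meq/litre."""
        return na / sqrt((ca + mg) / 2.0)

    print(round(sar(na=12.0, ca=4.0, mg=2.0), 1))   # about 6.9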

Managing cycling and blending strategies

Cyclic strategies for using waters of different salinities have been proposed and it has been demonstrated that cyclic irrigations can be successfully applied to crops during different growth stages or can be used with crop rotations between tolerant and sensitive crops (Rhoades, 1987 and 1989). The feasibility of cyclic and blended applications of high quality water with drainage water depends on both supply and the availability of storage, mixing and delivery systems. Where non-saline waters are available for critical irrigations, growers can take advantage of the fact that many crops are most salt sensitive during the germination and seedling stages and are much more tolerant during later growth stages.

Plant salt tolerance varies with phenological growth stage, and beneficial effects can be obtained by managing irrigations to take advantage of this fact. For example, at salt-sensitive growth stages, salinity effects can be reduced by providing higher quality water or by minimizing salt accumulation through increasing the number of irrigation cycles. In some crops, higher salinities applied during the later stages of development will improve yield quality by increasing the content of sugars or soluble solids in the fruit. The timely application of saline water during fruit development has been used as a strategy to improve both the sugar content in melons and the soluble solids content in tomatoes (Shannon and Francois, 1978). In response to some moderate levels of salt and drought stress, fruit trees sometimes exhibit significant increases in fruit set and yield, but usually at a cost in subsequent years due to reduced biomass production. Another disadvantage may be decreases in shipping quality. Francois and Clark (1980) found that increasing salt stress delayed fruit maturation in 'Valencia' oranges but did not affect fruit quality. Their results also indicated that salinity had no effect on total soluble solids (TSS), but Bielorai et al. (1988) measured slight increases in TSS and sugar contents of 'Shamouti' oranges when the chloride concentration in the irrigation water was 450 mg/litre.

For cyclic use strategies, factors that should be considered include the effects of changes in salinity during the growing season, the average salinity distribution in the root zone, the interactions with climatic variables, and the effects of different soil types. After crop establishment, the salinity in the root zone averaged over time may be considered as the effective salinity exposure (Shalhevet, 1994). This means that in most cases, once crops are established, there will probably be little measurable difference in a field situation between applying 4 and 10 dS/m waters in alternate irrigations over a crop cycle and applying a 7 dS/m water in every irrigation. This is the result of compensating factors in both soils and plants. Models are being developed to predict the effects of different irrigation water salinities during the growing season on crop yield.

Blending is the mixing of poor quality drainage water with good quality irrigation water. Provided that the blended water is sufficiently low in total salinity and toxic ions, this is the most economic and environmentally acceptable means of disposing of drainage water. This strategy can potentially increase the salinity of the groundwater over time. Often, separation of waters of different qualities is the best strategy.
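The arithmetic behind blending can be sketched under the simplifying assumption that EC mixes linearly with volume, which is a reasonable first approximation for waters of broadly similar composition. All values below are hypothetical.

    # Sketch: linear-mixing approximation for blending drainage water
    # with fresh irrigation water. Assumes EC combines in proportion to
    # volume; all input values are hypothetical.

    def blended_ec(ec_fresh, ec_drain, drain_fraction):
        """EC of the blend when drain_fraction (0-1) of the volume is drainage water."""
        return (1.0 - drain_fraction) * ec_fresh + drain_fraction * ec_drain

    def max_drain_fraction(ec_fresh, ec_drain, ec_target):
        """Largest drainage-water share keeping the blend at or below ec_target."""
        if ec_drain <= ec_target:
            return 1.0
        return max(0.0, (ec_target - ec_fresh) / (ec_drain - ec_fresh))

    # 0.8 dS/m canal water, 6.5 dS/m drainage water, 2.0 dS/m acceptable.
    print(round(max_drain_fraction(0.8, 6.5, 2.0), 2))   # about 0.21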

Egypt has an extensive drainage water re-use programme. Over 4 000 million m3 of agricultural drainage water, produced in the upper Nile Delta, are re-used to supplement irrigation water requirements in the lower delta. Pumping stations lift the better quality drainage water into irrigation canals, where mixing occurs. The Ministry of Public Works and Water Resources has an extensive re-use quantity and quality monitoring programme.

Re-use for saline agriculture and forestry


Concept of agriculture-forestry systems and solar evaporators
System design and planning


Concept of agriculture-forestry systems and solar evaporators

The agriculture-forestry system and solar evaporator are still at the development stage (Tanji and Karajeh, 1992 and 1993; Jorgensen et al., 1993). The concept compares favourably with other intermediate disposal technologies. It manages drainage water as a resource instead of directly disposing of it into evaporation ponds, rivers or the ocean. Agriculture-forestry systems produce energy (biomass) while some other technologies for the treatment of drainage water have high energy demands. By significantly reducing the final volume of non-re-usable drainage water, agriculture-forestry systems provide an opportunity for seasonal discharging of drainage water into rivers during high water flows. Solar evaporators are preferable from the standpoint of wildlife safety as they provide for better bird control than do evaporation ponds. The trees create wildlife habitats, reduce air pollution, and enhance the overall environmental quality of farming regions.

FIGURE 6 Agriculture-forestry and salt management systems

The agriculture-forestry method has been developed and tested in California for salt management on irrigated farmland. It has two objectives:

i. to utilize the drainage water as a resource to produce marketable crops;
ii. to reduce the volume of discharged drainage water directly on the farm.

The agriculture-forestry system for the productive use of drainage water and its disposal into a solar evaporator has been developed in semi-arid conditions in California and other regions. A diagram of the system is shown in Figure 6. The main function of the trees is to use and evaporate large volumes of drainage water. This can be achieved not only through sequential re-use but also by uptake of the drainage water from shallow water tables, or by intercepting the flow of drainage water from upslope. The trees can be viewed as biological pumps. The tree biomass offers a number of marketing options ranging from electricity generation to the production of biofuels or biochemicals. Drainage water is further concentrated through the irrigation of halophytes that also have marketing potential as food or industrial crops. The salt concentration in the drainage water in each succeeding stage is increased, but the volume of water is reduced.

About 80-85% of the initial volume of drainage water produced while growing salt-sensitive crops is sequentially re-used to produce salt-tolerant crops. The remaining 15-20% of drainage water, with an increased concentration of salt, evaporates in solar evaporators. The solar evaporator consists of a levelled area lined with a plastic liner on which the crystallized salt is collected. The daily amount of drainage water discharged into the solar evaporator is matched to daily evaporation rates to prevent water ponding. This makes the facility unattractive to wildlife, which is important because the concentrated Se has proven to be a wildlife hazard. The agriculture-forestry system for the sequential re-use and reduction of drainage water volume can be viewed as the flow of water from salt-sensitive crops to salt-tolerant trees, to more salt-tolerant halophytes, and finally to a solar evaporator (Figure 6). The water flow is programmed and automated using a system of sensor-controlled equipment.

The agriculture-forestry system operating in California for sequential re-use of water and salt removal utilizes drainage water with an initial salt concentration of about 7 000 mg/litre, following the irrigation of salt-tolerant crops. Drainage water of this concentration has conventionally been disposed of into evaporation ponds, rivers and/or the ocean. The agriculture-forestry system concentrates the salt in a significantly reduced volume of drainage water. This technology offers the management options of salt crystallization in a relatively small area on farms (solar evaporators) or the discharge of a reduced volume of drainage water (brine) into solar ponds or natural sinks (e.g., the ocean). The crystallization of salt in solar evaporators in turn allows either salt marketing, short- or long-term storage on farms, or disposal to designated landfills for storage in perpetuity.

The system for sequential re-use of drainage water also effectively reduces the level of Se. A typical Se concentration is 0.5 mg/litre in drainage water applied to trees, and 0.9 mg/litre in the reduced volume of drainage water applied to halophytes. Consequently, the estimated quantity of Se is about 50 kg per 100 000 m3 of drainage water applied to trees, and it is reduced to about 27 kg per 30 000 m3 of drainage water sequentially re-used to irrigate halophytes. Selenium is reduced through volatilization and is also taken up by trees and halophytes. Selenium removed by trees is mainly concentrated in the leaves, and the measured values (in mg/kg) were 0.5-0.9 for eucalyptus, 0.6-1.0 for casuarina, 2.6-3.6 for athel, and 0.6-1.8 for halophytes. Successful experiments were conducted on the transfer of Se through harvesting halophytes and safely using this forage for cattle feeding (Frost, 1990).
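These Se loads follow directly from a concentration-times-volume mass balance, which the short sketch below (Python) reproduces.

    def se_load_kg(conc_mg_per_litre, volume_m3):
        # 1 mg/litre equals 1 g/m3, so dividing by 1 000 yields kilograms.
        return conc_mg_per_litre * volume_m3 / 1000.0

    print(se_load_kg(0.5, 100000))   # 50 kg applied to trees
    print(se_load_kg(0.9, 30000))    # 27 kg applied to halophytes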

System design and planning

The design of a solar evaporator is fundamental to the development of an integrated agriculture-forestry system. The size of the solar evaporator is a function of evaporation rates typical for the farming region. The area of the solar evaporator is calculated by:

Ae = (1 000 × DWh) / (EV × Ce)

where

Ae = area of the solar evaporator (m2)
DWh = drainage water discharged from halophytes into the solar evaporator (thousand m3)
EV = annual evaporation rate (m)
Ce = coefficient of evaporation, accounting for the reduced evaporation of saline water.
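A brief sketch (Python) of the area calculation follows. The form of the equation is as reconstructed above from the stated units, and the input values are hypothetical.

    def evaporator_area_m2(dwh_thousand_m3, ev_m_per_year, ce):
        # Ae (m2) = 1 000 x DWh / (EV x Ce); Ce < 1 reflects the reduced
        # evaporation of saline water, so a smaller Ce enlarges the area.
        return 1000.0 * dwh_thousand_m3 / (ev_m_per_year * ce)

    # Hypothetical: 20 000 m3/year of halophyte drainage, 2 m/year
    # evaporation, Ce = 0.7.
    print(round(evaporator_area_m2(20.0, 2.0, 0.7)))   # about 14 286 m2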

Further research data are needed to estimate the coefficient of evaporation (Ce). Experimental data indicate that the salt concentration in the drainage water discharged from halophytes into a solar evaporator ranges from 32 000 to 41 000 mg/litre. The final concentration of the crystallized salt exceeds 180 000 mg/litre. The major components of crystallized salt are SO42- (56.70 %), Na (23.50 %), and Cl- (8.40 %). The Se content ranges from 2 to 8 mg/kg.

The area of halophytes that can be grown is a function of the quantity and quality of drainage water recovered from the trees. Experimental data collected over a period of six years indicate that the salt concentration in water drained from trees is about 2.8 times that of the drainage water (from conventional farm crops) applied to the trees. If salts are conserved, this concentration factor indicates that approximately 65% of the re-used drainage water is consumed by the trees. The applied 'tree' water (drained from farm crops) is typically EC 8-10 dS/m and the applied 'halophyte' water (drained from trees) is about 28-30 dS/m.
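The 65-percent figure follows from the concentration factor: if the salts are conserved, the fraction of water consumed equals 1 - 1/2.8, as the one-function sketch below (Python) illustrates.

    def fraction_consumed(concentration_factor):
        # Assumes salts are conserved, so C_out/C_in = V_in/V_out.
        return 1.0 - 1.0 / concentration_factor

    print(round(fraction_consumed(2.8), 2))   # about 0.64, i.e. roughly 65%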

To maintain adequate control of soil quality, the leaching fraction (the percentage of the infiltrated irrigation water that percolates below the root zone) must be sufficiently high to prevent a build-up of salts, Se and B in the soil profile. The SAR should also be monitored.

An agriculture-forestry area of 16 ha has the capacity to process about 110 000 m3 of drainage water per year. The size of a farm that can be serviced by the agriculture-forestry system (including solar evaporator) is a function of several factors, such as cropping system, quality of irrigation water, irrigation water management, soil salinity, and the use of trees for controlling groundwater conditions (water uptake from high water tables or subsurface flows). It is estimated that 30 ha of trees, halophytes and solar evaporator can utilize and process drainage water from about 1 000 ha of conventional farm crops. The 30 ha area would consist of 18.75 ha of trees, 7.5 ha of halophytes, and 3.75 ha of solar evaporator. The relative size of areas is as follows: irrigated area, 1 000 ha; trees, 10 ha; halophytes, 4 ha; and solar evaporator, 2 ha.

The management of biomass from the trees and halophytes is essential for the performance of agriculture-forestry systems and salt removal. The trees and halophytes need to be selected for their salt tolerance, which should range from about 9 000 to 18 000 mg/litre for trees and from 18 000 to 41 000 mg/litre for halophytes. The halophytes should preferably be perennial plants. The other required characteristics of trees and halophytes include: high water demands, tolerance to frequent flooding, frost tolerance, and marketability of harvested biomass.

Eucalyptus camaldulensis is the species of choice for this particular salt management system because of its salt tolerance and high water requirements. To improve the quality of eucalyptus trees for agriculture-forestry sites in the San Joaquin Valley, a selective breeding programme was initiated in 1987. Selected trees have been systematically evaluated each year, and 22 trees have been chosen for tissue culture propagation. The programme is seeking to achieve a higher diversification of salt-tolerant trees. Additional experimental trees planted include casuarina, athel, acacia, mesquite and poplar.

The selection of halophytes has been based on literature review, field evaluation trials, and a survey of salt-tolerant plants in California. These plants are being selected not only for salt management purposes, but also with consideration for their biological interaction with conventional farm crops. This is to avoid introducing species that could be potential weeds or host plants for insect vectors or plant viruses. Halophytes have been selected for salt tolerances ranging from an EC of 20 to 45 dS/m. Based on current field evaluations, the most promising plants include salicornia, iodine bush (Allenrolfea occidentalis), salt grass (Distichlis spicata), and cordgrass (Spartina gracilis). Other promising halophytes include fivehook bassia (Bassia hyssopifolia), Jose tall wheatgrass (Agropyron elongatum), fat-hen (Atriplex patula), red sage (Kochia americana), Atriplex nummularia and Atriplex lentiformis.

Re-use in a natural wetland


Re-use of surface drainage water
Re-use of subsurface drainage water


Drainage water can also be used for wildlife or wetland habitat irrigation. However, prior to using agricultural drainage water for this purpose, the following questions should be addressed:

i. Is the water from a surface or subsurface drain system or both?
ii. What types of vegetation are to be grown?
iii. What constituents are in the water?
iv. Is the water available when it is needed?
v. Is an adequate volume of water available?
vi. Will it have positive impacts on wildlife and the environment?
vii. Will there be adequate runoff from the wetland?
viii. Is the wetland sustainable?

Re-use of surface drainage water

Where drainage water is derived from only surface drainage or tailwater sources, the main question is whether or not the water contains applied and persistent pesticides. In areas where strong environmental safeguards exist and pesticide container label restrictions are followed, there is little risk associated with the re-use of surface runoff or tailwater drainage water. Rice field drainage water accounts for a very large percentage of the water supply for managed natural wetlands in the Sacramento Valley in California and is generally safe for re-use.

Most of the surface derived drainage water is used to flood wetlands in the early autumn. In the rice growing regions, the fields are drained in the late summer or early autumn. This drainage pattern coincides with the autumn migration of the waterfowl to their wintering grounds.

Ideally, the winter waterfowl habitat is flooded to a depth of 20-50 cm. Depending on soil type, seepage and evaporation rates, the drainage water required for initial flooding will range from 500 to 1 500 m3/ha. Where local supplies of surface-derived drainage water are available, water is used to maintain ponds from October to March. In warm climates, the annual evaporation or consumptive use is approximately 2 500 m3/ha, which causes a further increase in salt concentration in the wetland outflow.

Typical native marsh plants grown with surface agricultural drainage waters include: smartweed (Polygonum lapathifolium), swamp timothy (Heleochloa schoenoides), river bulrush (Scirpus fluviatilis), and cattails (Typha spp.). These plants are grown under a moist-soil water management regimen. Water is applied in the autumn and held until spring soil temperatures begin to warm, which occurs in March or April in the Central Valley of California. When the soil begins to warm, the ponds are drained to mudflat conditions to stimulate seed germination. In some areas the plants require no additional water until the autumn flood-up period. However, where summers are hot and dry, they will require one or more irrigations in July or early August to provide optimal seed production as food for migrating wildlife.

Re-use of subsurface drainage water

The re-use of subsurface saline agricultural drainage water for wetland management poses substantial challenges and can generate problems which could result in wildlife losses and habitat reductions. Although subsurface saline drainage water is typically free from contamination by applied pesticides or herbicides, it may contain soil or naturally derived toxicants or trace elements such as salts, nitrates, As, B, Cd, Cr, Pb, Hg, Mo, Ni, Se, Ag, U and V. Each of these constituents is potentially toxic independently, in combination with other constituents, or through the process of biomagnification in wildlife through the food chain. Careful analysis of subsurface agricultural drainage water during several periods of a yearly cycle is required before any plans for re-use as a water supply for wetlands can be considered. The provision of an adequate volume of flow-through water is important to minimize concentration of toxic elements due to evaporation.

At present, there are no comprehensive standards establishing safe levels of trace elements in water used for wetland habitat management. However, because of the high potential for food-chain magnification, most wetland managers refuse to use subsurface agricultural drainage waters which contain levels of trace elements above background levels. The potential for, and the costs of, clean-up or remediation of a contaminated wetland dictate a conservative approach.

Where trace element contamination is not a concern, saline drainage water can be used to support a productive wetland habitat. The major consideration is the management of soil and water salinity. Maintaining a salt balance between the applied water and the soil/water interface is key to the production of brackish-water native marsh plants.

In general, water with a TDS level of 2 500 mg/litre or less is preferable for wetland management. Sometimes, water with a TDS level as high as 5 000 mg/litre can be used for short periods. Standard management practices involve an autumn flooding to a depth of about 20-50 cm, with these depths being maintained until January or February. In late winter, the ponds are drained to discharge drainage water and accumulated salts. The ponds are then refilled with new water as deeply as practicable. After approximately 14 days, the water is drained again. This fill-and-drain cycle is repeated two to three times. The process removes salt concentrated in the surface water through evaporation and allows for a rebalancing of the water/soil salt equilibrium.
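The fill-and-drain sequence is, in effect, a batch-flushing process. The simplified sketch below (Python) contrasts it with merely topping up evaporation losses; it assumes complete mixing and no salt exchange with the soil, and every number is hypothetical.

    # Simplified salt balance for the winter drain/refill cycle. Assumes
    # complete mixing and no soil exchange; all values are hypothetical.

    def flushed_tds(tds_supply, depth_m, evap_m):
        """TDS (mg/litre) just before a full drain, after one hold period."""
        return tds_supply * depth_m / (depth_m - evap_m)

    def topped_up_tds(tds_supply, depth_m, evap_m, n_holds):
        """TDS if evaporation losses are only replaced, so salts accumulate."""
        salt = tds_supply * depth_m          # salt mass per unit area
        for _ in range(n_holds):
            salt += tds_supply * evap_m      # each top-up adds more salt
        return salt / depth_m                # volume restored after each top-up

    # 2 500 mg/litre supply, 0.4 m pond, 0.05 m evaporation per hold.
    print(round(flushed_tds(2500, 0.4, 0.05)))       # ~2 857; exported each drain
    print(round(topped_up_tds(2500, 0.4, 0.05, 3)))  # ~3 438; salts ratchet upward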

The ponds are drawn down to a mudflat state in March or April to facilitate germination of desirable salt-tolerant marsh plants. Typical plants grown under this regimen include: alkali bulrush (Scirpus robustus), brass buttons (Cotula coronopifolia), salt grass (Distichlis spicata) and tules (Scirpus acutus). Depending on the local soil and climate, one or more shallow irrigations may be necessary to bring germinated plants to full maturity.

In all cases, in order to prevent excess salt accumulation, water circulation must be maintained when the ponds are filled with saline drainage water, and some constant rate of discharge or outflow from the ponds is necessary. Maintaining water circulation and a constant rate of flow also helps to avoid outbreaks of waterfowl disease. Any wetland habitat supported principally with saline agricultural drainage water must be carefully managed and monitored to remain productive. In addition, there must be an environmentally safe way (ocean, large river, salt lake) of disposing of the water as it is drained from any wetland area.

