

Chapter 3 - Water table management

Daniel Zimmer, CEMAGREF, Antony, France
Chandra A. Madramootoo, McGill University, Quebec, Canada




Several agronomic and water management practices may be implemented at the field and watershed scales to reduce the concentration of chemicals (primarily nutrients and pesticides) in agricultural drainage water. Agronomic practices include tillage, cropping methods, and improved and timely fertilizer and pesticide applications.

One water management practice which has been shown to significantly reduce NO3-N in drainage water is water table control. The technique has been tested in the humid regions of eastern Canada and the eastern and mid-western United States. It is currently promoted in these regions as a combined drainage/irrigation practice for increasing crop yields and reducing NO3-N pollution from relatively flat, artificially drained cropland. The quality of receiving waters is thus improved. The water quality benefits of water table management are well described in Gilliam (1987) and Gilliam et al. (1979). Water table management systems are currently being tested in some parts of Europe and Asia. The practice has yet to be evaluated in the arid and semi-arid regions, although there is potential for water table management in these regions. However, proper management of the water table is required to ensure that re-salinization of the crop root zone does not occur. Early attempts at sub-irrigation in arid and semi-arid regions failed because of soil salinization. More information is required on the effects of different water table levels on crop yields and salt movement in the root zone in various climatic locations and soil types. For these reasons, the focus of this chapter is on water table management in humid regions.

In the humid regions of North America, water table management has evolved on lands which were first improved with surface and subsurface drainage. Drainage improvements were initially required on these poorly drained soils to increase crop productivity and provide better soil conditions for field machine trafficability. With concerns about the quality and impacts of agricultural drainage water on marine ecosystems, as well as nitrate contamination of drinking water, the drainage systems were modified to reduce the concentration of pollutants in the drainage water. Water table management has been practised for several years on organic and sandy soils in humid regions to reduce drought stress of high value crops.

General features




There are two forms of water table management: controlled drainage and sub-irrigation. Controlled drainage restricts the discharge from a subsurface drain outlet or open main drain, resulting in a higher field water table. The water table drops naturally over time due to evaporation and deep seepage, and is only raised if there is precipitation. With sub-irrigation, water is pumped slowly and nearly continually into open ditches or a subsurface drainage system to maintain a near constant water table. When large rainfalls occur and the field water table rises above the desired level, the irrigation pump is stopped. The excess water then drains from a control structure in the ditch or drain outlet. Such systems are shown in Figure 3.

FIGURE 3 Water table management systems

As a result of higher water tables, water is provided for plant uptake through capillary rise or upward flux. This water helps to meet plant transpiration requirements and reduce drought stress. Corn and soybean yield increases of 10-15% have resulted from sub-irrigation in eastern Canada (Madramootoo et al., 1993; Madramootoo et al., 1995). An added benefit of the higher water table is that it enhances denitrification, thereby reducing nitrate leaching. Denitrification is a microbial process whereby bacteria use nitrate, rather than oxygen, as an electron acceptor, reducing NO3- to nitrogen gases (N2 and N2O). Nitrates retained in the soil profile are available for plant use. Several researchers, including Willardson et al. (1972), Bengtson et al. (1984), Gilliam and Skaggs (1986), Evans et al. (1992) and Lalonde et al. (1995), have shown that water table management can reduce NO3-N in drainage water by over 60%. Denitrification also occurs in water stored in ditches. Other benefits of water table management include reduced downstream peak flows and drainage volumes, and conservation of soil water to supplement irrigation water requirements. There are, however, some instances where water table management may increase peak flows (Konyha et al., 1992).

Water table control structures

In areas with open canal drainage systems (with or without subsurface drainage), flashboard riser control structures may be used. These are installed in the ditches, and by either inserting or removing a flashboard, the water level in the ditch is either raised or lowered. These structures can also be used to store runoff in canals, which can later be used for sub-irrigation. Surface runoff from drained fields can also be controlled with the aid of the flashboard risers.

FIGURE 4 Side view of an adjustable water table control outlet (Madramootoo, 1994)

Discharge from a subsurface drainage system can be regulated with a PVC control structure similar to that shown in Figure 4. This adjustable water table control structure, attached to the collector outlet, has a vertical riser through which water can be pumped for sub-irrigation.

With pumped drainage outlets, the pumping system can be designed and managed to maintain a desired water level in the sumps, and thus in the field. It is possible to automate the system using a water level sensor installed in a dipwell at the midpoint between subsurface drain laterals (Fouss et al., 1990). The sensor can switch the pump on or off.
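
Where the sump pump is automated, the controller is in effect a simple on/off rule with a dead band around the target water table depth. The sketch below illustrates this idea only; the target depth, switching bands and function names are assumptions for illustration and are not taken from Fouss et al. (1990).

    # Illustrative on/off control of a sub-irrigation pump, driven by the water
    # table depth measured in a dipwell midway between drain laterals.
    # All threshold values are hypothetical; actual settings depend on the soil,
    # the crop stage and the design of the pumping system.

    TARGET_DEPTH_M = 0.60   # desired water table depth below the soil surface (m)
    START_BAND_M = 0.10     # start pumping when the water table is this far below target
    STOP_BAND_M = 0.05      # stop pumping when the water table is this far above target

    def pump_command(depth_m: float, pump_on: bool) -> bool:
        """Return True to run the irrigation pump, False to stop it.

        depth_m is the water table depth below the soil surface (larger = deeper);
        pump_on is the current pump state, which gives the rule hysteresis.
        """
        if depth_m > TARGET_DEPTH_M + START_BAND_M:
            return True      # water table too deep: pump water into the drains
        if depth_m < TARGET_DEPTH_M - STOP_BAND_M:
            return False     # water table too shallow (e.g., after rain): stop and let it drain
        return pump_on       # within the dead band: keep the current state

In practice the rule would be evaluated each time the sensor is read (for example hourly), and the same dead-band idea applies whether the outlet is a pump or a manually adjusted flashboard riser.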

Water quality benefits




Drainage hydrology and water quality

Water table management systems may be designed to control drainage volumes, peak flow rates or chemical losses from agricultural fields or catchments. Once the design objective has been set, the potential hydrologic impacts and the resulting changes in water quality must be assessed. Hydrologic impacts will depend on the degree and method of drainage improvement already attained in the proposed project area.

High water tables will result in low infiltration capacities of the soils and high surface runoff rates, and will increase the loss of chemicals located at or near the soil surface. On the other hand, unrestricted drain discharge (i.e., no outlet control) will produce low water tables and higher infiltration capacities. Consequently, during periods of high drain discharge, there is increased potential for the leaching of soluble chemicals existing in the soil profile.

The reduction of drainage volumes may be a first step towards improved drainage water quality. Evans et al. (1995) have indicated that water table management may reduce total drainage outflow by over 30% compared to conventional drainage systems without outlet control.

Outflows, however, vary widely, depending on soil type, rainfall and type of drainage system. During very dry years, water table management may totally eliminate outflow. In wet years, it may have little or no effect on total outflow. During the growing season, water table management typically reduces outflow by less than 15%; much of the reduction occurs during the non-growing period, i.e., winter and early spring. The decrease in outflows is not due to a significant increase in ET. A slight increase may be observed during the growing season, but this would not explain differences during the winter, when most of the reduction occurs. It is possible that most of the water is lost by lateral seepage to remote sinks, or by deep seepage to groundwater. Depending on the topography of the watershed, this water may reappear downstream and it should be quantified.

Generally, with improved drainage, there is a tendency for the planting of higher value crops, and increased chemical applications. This may lead to increased chemical loads in surface runoff and subsurface drainage discharge. Crop production methods should, therefore, focus on more efficient water, fertilizer and pesticide applications. Water quality benefits associated with water table management will only be achieved if sound agricultural practices are followed.

Nutrients

Nutrient losses in agricultural drainage water raise many environmental concerns. Eutrophication, due to high N and P concentrations, increases algal growth and reduces oxygen in surface waters. From agronomic and economic standpoints, nutrient losses represent a decrease in the efficiency of the crop production system. There are also human and animal health concerns, because nutrients, especially NO3-N, can contaminate drinking water supplies. Concentrations exceeding 10 mg/litre can be harmful to humans.

Nitrogen (N), phosphorus (P) and potassium (K) are the three major nutrients in fertilizer and manure. Nitrogen is easily converted to NO3-N. Table 5 shows the order of magnitude of total annual losses and peak concentrations of these three nutrients which may be expected in surface runoff and subsurface drainage. It can be seen that P and K are of less concern than NO3-N.

Nutrient concentrations in drainage water are influenced by many factors, including: rate of fertilization; soil and crop type; and soil water regime during the growing season. These factors affect the efficiency of nutrient uptake by the crop roots, and residual nutrient levels in the soil during the fallow or dormant season.

Nitrogen
As discussed in Chapter 2, nitrogen is transported in different forms.

In most cases, water table management aims at reducing NO3-N leaching. When developing a water table management strategy, losses of N in its different forms should be taken into account. A decrease in one form of N may result in an increase in another form. This is particularly the case if a decrease in subsurface drainage discharge results in an increase in surface runoff. In such a case, there might be a substantial decrease in NO3-N concentration in drainage water, while a small increase in the loss of organic N and NH4 may occur.

TABLE 5 Orders of magnitude of peak concentrations and annual losses of NO3-N, P and K in drainage water at field level

Losses                                  NO3-N     P         K
Surface runoff
  Peak concentration (mg/litre)         50        10        10
  Annual (kg/ha)                        1-20      1-20      1-10
Subsurface drainage
  Peak concentration (mg/litre)         200       10        10
  Annual (kg/ha)                        1-100     0.1-1.0   1.0-10

Knowledge of the N cycle, and the impact of agronomic practices on this cycle, will help with the implementation of a water table management strategy. Temperature and the soil water regime influence nitrification and denitrification. At the field level, the potential risks must be assessed by comparing the water table regime to: (i) fertilizer rates and practices during the growing season; (ii) the periods of high mineralization rate; and (iii) the periods of possible denitrification. These factors are illustrated in Table 6.

High water tables due to water table management may significantly decrease NO3-N concentrations in drainage water. Annual losses of total N from fields with water table management could be 40-50% lower than for conventionally drained fields (Evans et al., 1991). This decrease is due to both a reduction in drainage discharge and an increase in denitrification. If controlled drainage is used during the early spring or late winter, when drainage is not needed for crop production, one can, if temperatures are warm enough, dramatically increase denitrification and reduce NO3-N losses.
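
The combined effect of a smaller drainage volume and a lower NO3-N concentration on the annual load can be shown with a simple calculation. The figures below are hypothetical and serve only to illustrate the arithmetic; the conversion rests on the fact that 1 mm of drainage over 1 ha equals 10 m3 of water and 1 mg/litre equals 1 g/m3.

    # Hypothetical comparison of annual NO3-N loads in subsurface drainage water.
    # load (kg/ha) = drainage depth (mm) x flow-weighted concentration (mg/litre) / 100

    conventional_outflow_mm = 250    # assumed annual outflow, free drainage
    conventional_conc_mg_l = 15.0    # assumed flow-weighted NO3-N concentration

    controlled_outflow_mm = 175      # e.g., 30% less outflow with outlet control
    controlled_conc_mg_l = 11.0      # e.g., lower concentration due to denitrification

    load_conventional = conventional_outflow_mm * conventional_conc_mg_l / 100  # 37.5 kg/ha
    load_controlled = controlled_outflow_mm * controlled_conc_mg_l / 100        # 19.3 kg/ha

    reduction = 1 - load_controlled / load_conventional
    print(f"NO3-N load reduction: {reduction:.0%}")   # about 49% for these assumed values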

TABLE 6 Factors influencing the nitrogen cycle in agricultural soils

Factor: Temperature
  Effect on crop: Higher ET, which enhances vegetative growth and thus nitrogen uptake.
  Effect on nitrogen cycle: Higher temperatures (in the range of 20-30°C) increase the rate of most biochemical processes, especially mineralization and denitrification.

Factor: Soil water regime
  Effect on crop: High water tables restrict crop growth and nitrogen uptake by roots.
  Effect on nitrogen cycle: Soil water contents below field capacity increase the rates of all processes, except denitrification. Saturated conditions due to higher water tables enhance denitrification.

Factor: Fertilizer and agricultural practices
  Effect on crop: Application of fertilizers increases the rate of crop growth and development.
  Effect on nitrogen cycle: Drainage improvements are generally linked to changes in cropping systems, which may lead to increased N fertilizer inputs. The conversion of grassland to cropland results in mineralization of N accumulated in the soil profile.

The design of a water table management strategy to control N losses should take the following into account:

i. The timing and amounts of drainage discharge, before, during and after the growing season.

ii. The denitrification potential. Denitrification requires low oxygen concentrations, brought about by shallower water tables, as well as the presence and activity of denitrifying bacteria. Temperature and the soluble organic carbon content of the soil also influence denitrification. Soil temperatures above 10°C are necessary to induce significant denitrification.

iii. The production of N2O, a by-product of denitrification. This greenhouse gas is of environmental concern, but may only be a problem on very large watersheds.

iv. The scale at which water table management is implemented. At the watershed scale, there may be less vegetation in and along channels, which may yield higher NO3-N levels at the field edge.

v. Continually high water levels in drainage canals and/or ditches can cause ditch bank stability problems and can also encourage beavers and other rodents.

Phosphorus

Phosphorus losses are closely tied to sediment loss. Surface runoff is, therefore, the primary factor influencing both sediment and P transport. Shallow water tables during high rainfalls could increase surface runoff, and hence P movement. However, at the watershed scale, controlled drainage has been shown to reduce P losses (Evans et al., 1991).

Pesticides

Pesticides are generally transported with soil particles in surface runoff. According to Munster et al. (1995), pesticide concentrations are generally higher in surface runoff than in subsurface drainage water. However, Kalita et al. (1992) stated that high pesticide concentrations in subsurface drainage water could be expected if preferential flow occurs.

The amount of pesticide in subsurface drainage water is typically less than 0.1% of the amount applied. The most studied pesticide is atrazine, together with compounds of the same family; atrazine is a herbicide used on maize. Results for aldicarb, alachlor and metolachlor are also available. The concentrations of these chemicals are generally very low, in the range of 1-3 µg/litre.

Although further investigation is needed, a water table management strategy that aims to reduce pesticide losses should preferably reduce the amount of surface runoff. The water table level should therefore be lowered if rainfall occurs immediately after pesticide application. However, Michaelsen (1995) showed that preferential transport of several herbicides was enhanced in drained plots in northern Germany. Most pesticides have a relatively short half-life (measured in days), and therefore do not persist in the soil or water for long periods.
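
The practical consequence of a short half-life can be seen from first-order decay, in which the fraction of the applied amount remaining after time t is exp(-ln 2 · t / half-life). The half-life used below is purely an assumed value for illustration; actual half-lives vary widely between compounds, soils and climates.

    import math

    # First-order decay of an applied pesticide; a 20-day half-life is assumed
    # here only for illustration.
    half_life_days = 20.0

    for t in (10, 30, 60, 90):
        remaining = math.exp(-math.log(2) * t / half_life_days)
        print(f"after {t:3d} days: {remaining:.1%} of the applied amount remains")

    # With this assumed half-life, roughly 70% remains after 10 days but only
    # about 4% after 90 days, which is why most pesticides do not persist in
    # the soil or in water for long periods.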

Operational aspects




Water table management systems can be operated by farmers, water boards or water users associations. However, there is also a need for an institution to monitor water quality. Sometimes a compromise may be required between agricultural and water quality benefits.

Depending on the type and severity of water quality constraints, water table management will seek an optimum for either agricultural production or environmental protection.

System constraints may be physical, technical or human. The most critical physical aspects include topography, the existing drainage network, water availability (for sub-irrigation) and soil conditions. Detailed information on the design, installation and operation of water table management systems can be found in the ASAE Standards (1994).

Water table management requires greater farmer involvement than does conventional drainage. The operation of subsurface drainage systems requires few management decisions. However, with water table management, operational indicators are difficult to see, as the response to outlet adjustments is not immediately evident in the field. Furthermore, the operational incentive may not be solely increased crop yield, but also improved water quality and increased water quantities for downstream water users. These factors must be clearly defined for system operators at the start of the project.

Farm or catchment scale

There are two approaches to the implementation of water table management: the catchment level (or large tracts of land) and the field scale. These two approaches are complementary. The scale of implementation depends on: water quantity and quality objectives; farming systems; regional hydrology; institutional arrangements; and topography.

Water quality is linked to farming practices and should therefore be controlled primarily at the field level. However, field systems may necessitate control structures at the catchment level where water quality control depends on a reduction of drainage volumes and sediment transport. The location of the control structures within the catchment will depend on the discharge and the drainage area above the project site. Where the upstream drainage area extends to several thousand hectares, it may be more economical to control the water level in field ditches with several smaller structures than to build larger structures on the main channels.

The existing drainage network is also an important consideration. The depth and width of the channels govern water storage capacity. If a dual controlled drainage/sub-irrigation system is to be implemented, channel storage capacity has to be sufficiently large to provide irrigation water.
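
Whether the channel network can actually supply the sub-irrigation demand can be checked with a rough volume balance. All dimensions and rates below are assumed values used only to illustrate the type of calculation involved.

    # Rough check of ditch storage against peak sub-irrigation demand.
    # All values are assumptions for illustration only.

    channel_length_m = 2000     # total length of ditches serving the area
    mean_width_m = 2.0          # mean water-surface width of the ditches
    usable_depth_m = 0.5        # depth of stored water that can be drawn down

    field_area_ha = 25          # area to be sub-irrigated
    peak_et_mm_per_day = 5.0    # peak crop water requirement

    storage_m3 = channel_length_m * mean_width_m * usable_depth_m   # 2000 m3
    demand_m3_per_day = field_area_ha * 10 * peak_et_mm_per_day     # 1250 m3/day (1 mm over 1 ha = 10 m3)

    print(f"storage covers about {storage_m3 / demand_m3_per_day:.1f} days of peak demand")

For these assumed values the ditches hold well under two days of peak demand, so sustained sub-irrigation would depend on an external water supply or on runoff captured behind the control structures.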

Topography and soils

Water table management is best suited to relatively flat lands. Sloping lands would require many control structures, to provide uniform water tables. Land slopes of less than 1% are recommended (Shirmohammadi et al., 1992). However, as pointed out by Evans and Skaggs (1989), while there is no absolute limit, slopes of less than 0.1% are the most practical. At the watershed scale, a slope of 0.1% would require a control structure every 300-1 000 m of channel length.
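
The 300-1 000 m figure follows directly from the head step that each structure can provide on a sloping channel: the spacing is the allowable difference in controlled water level divided by the channel slope. The head steps used below are assumed values chosen only to reproduce the quoted range.

    # Spacing of water table control structures along a channel:
    # spacing (m) = allowable water level step per structure (m) / channel slope (m/m)

    slope = 0.001                           # 0.1% channel slope
    for head_step_m in (0.3, 0.5, 1.0):     # assumed allowable step per structure
        spacing_m = head_step_m / slope
        print(f"head step {head_step_m:.1f} m -> one structure every {spacing_m:.0f} m")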

Soil permeability is a key factor governing the rate of water movement from ditches to adjacent fields, and the upward movement of water from the water table to the plant roots. Good lateral movement of water will occur in moderately to highly permeable soils. While Evans and Skaggs (1989) recommended a permeability of about 0.45 m/day, experience in eastern Canada has shown that water table management is also feasible on soils with permeabilities of 0.3 m/day. A restrictive soil layer not far below the bottom of the subsurface drain pipes will reduce deep seepage losses.

Monitoring requirements

Generally, a monitoring system needs to be implemented, both for system management and to measure water quality changes. Monitoring involves several factors and has to be performed on a regular basis, both at the field level by the farmer and at the catchment level by the water authorities.

Water quality impacts

Actual and potential detrimental impacts of drainage water quality both within the watershed and downstream should be identified. These investigations will help determine the principal water quality constraints and the most sensitive periods for control. They are best conducted by water authorities. The data may be useful in guiding fertilizer application.

Water table levels

Water table management requires periodic monitoring of the water table at the midpoint between drain laterals or ditches. It is best done by the farmer. One observation well per field is the minimum recommended. For a given soil and drain spacing, the frequency of observation and adjustment of the control structure depend mainly on the weather and crop development stage. For example, when crops are in their early stages of development, a shallow water table may impede proper root development, and this could make the crop more susceptible to drought later in the year. Research by Madramootoo et al. (1993 and 1995) indicates that water table levels between 0.50 m and 0.75 m from the soil surface are appropriate for most crops.

The response of the water table to rainfall and control structure adjustments may be slow, depending on the soil properties and drain spacing. It may take several years to fully understand the response patterns and successfully operate the system. Accurate weather forecasts and the ability of the farmer to make use of climatic data could improve system operation.

Assessing system efficiency

Water agencies should organize a medium or long-term monitoring programme of the impacts of water table management on the hydrology, water quality and crop performance in the project area. Some of the main parameters to be monitored are shown in Table 7.

TABLE 7 Monitoring requirements

Impact: Peak flows
  Parameters to monitor: Surface runoff, subsurface drain discharge, water levels in streams
  Monitoring interval: 1-hour intervals with automatic devices

Impact: Soil erosion
  Parameters to monitor: Sediment in surface runoff
  Monitoring interval: Accumulation after a rainstorm, using sediment samplers

Impact: Nutrient losses
  Parameters to monitor: Nitrate, ammonium, phosphorus
  Monitoring interval: Monthly, during the growing season

Impact: Pesticide losses
  Parameters to monitor: Pesticide concentrations
  Monitoring interval: Monthly, during the growing season

