Management guiding principles
Management for crop production
Management for environmental protection
While irrigated agriculture has greatly increased crop productivity, inappropriate and inefficient irrigation has wasted water, polluted surface water and groundwater, damaged productivity and altered the ecology of vast areas of land. Contamination of water supplies by irrigation is, in many places, posing health risks and drastically increasing the costs of treating waters for domestic and industrial uses. Surface and groundwaters in many areas are being contaminated by salts, fertilizers, herbicides and pesticides. Toxic chemicals are rendering many developed water supplies unfit for drinking and, in some cases, even for irrigation. These pollutants also degrade the recreational use and aesthetic value of surface waters. At the same time, costly limitations are being placed upon irrigation to reduce its polluting discharges or to treat its wastes before discharge. Finding a suitable, acceptable place for such discharge is increasingly becoming a major problem in some situations, especially in the developed countries. Blending saline and fresh waters reduces the potential usability of the total water supply. Use of polluted waters for irrigation limits crop production potential and poses potential health hazards to consumers of the food.
To overcome the above-described problems, new techniques need to be developed and implemented to reduce excessive water use and to conserve limited water supplies, and better ways must be found to implement existing methods more effectively. Efficiency of irrigation must be increased through the adoption of appropriate management strategies, systems and practices and through education and training. Reuse of wastewater, including the use of drainage water and shallow saline groundwater for crop production, must be made an integral component of irrigation water management, water conservation and environmental protection programmes. Effective salinity control measures must be implemented to sustain irrigated agriculture and to prevent pollution of associated water resources. Such measures must be chosen with recognition of the natural processes operative in irrigated geohydrologic systems, not just those on-farm, and with an understanding of how they affect the quality of soil and water resources, not just crop production. Some practices can be used to control salinity within the crop rootzone, while other practices can be used to control salinity within larger units of management, such as irrigation projects, river basins, etc. Additional practices can be used to protect off-site environments and ecological systems, including the associated surface and groundwater resources. The "on-farm" practices usually consist of agronomic and engineering techniques applied by the farmer on a field-by-field basis. The "district-wide" or "larger organizational basis" practices consist primarily of engineering structures for water control (both delivery and discharge) and systems for the collection, reuse, treatment and/or disposal of drainage waters.
There is usually no single way to achieve safe use of saline water in irrigation. Many different approaches and practices can be combined into satisfactory saline water irrigation systems; the appropriate combination depends upon economic, climatic, social, as well as edaphic and hydrogeologic situations. Thus, no procedures are given here for selecting "the" appropriate set of practices for different situations. Rather, some important goals, principles and strategies of water, soil and crop management practices that should be considered in the use of saline water for irrigation are presented as guidelines.
Salinity management constitutes an important aspect of safe use of saline water irrigation. This requires an understanding of how salts affect plants and soils, of how hydrogeologic processes affect salt accumulation, and also of how cropping and irrigation activities affect soil and water salinity. The basic effects of salts on soils and plants and the major causes and processes of salinization in irrigated lands and associated water resources that must be understood in order to develop and implement effective control practices were discussed in chapters 4 and 5.
To prevent the excessive accumulation of salt in the rootzone, irrigation water (or rainfall) must, over the long term, be applied in excess of that needed for ET, and this excess must pass through the rootzone in a minimum net amount. This amount, in fractional terms, is referred to as the "leaching requirement" (Lr, the fraction of infiltrated water that must pass through the rootzone to keep salinity within acceptable levels; US Salinity Laboratory Staff 1954). In fields irrigated to steady-state conditions with conventional irrigation management, the salt concentration of the soil water is essentially uniform near the soil surface regardless of the leaching fraction (LF, the fraction of infiltrated water that actually passes through the rootzone) but increases with depth as LF decreases. Likewise, average rootzone salinity increases as LF decreases; crop yield is decreased when tolerable levels of salinity are exceeded. Methods to calculate the leaching requirement and to predict crop yield losses due to salinity effects were described previously. Once the soil solution has reached the maximum salinity level compatible with the cropping system, at least as much salt as is brought in with additional irrigations must be removed from the rootzone, a process called "maintaining salt balance".
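As an illustration, the sketch below combines the conventional steady-state leaching-requirement formula, Lr = ECiw/(5 ECe(threshold) - ECiw) (after Rhoades 1974), with the widely used Maas-Hoffman linear salt-tolerance response. It is a minimal sketch: the irrigation water salinity, crop threshold and slope used here are illustrative placeholders, not values taken from Tables 13 to 21.

```
# A minimal sketch, assuming the conventional steady-state formula
# Lr = ECiw / (5*ECe_t - ECiw) (after Rhoades 1974) and the linear
# Maas-Hoffman salt-tolerance response. The threshold and slope below
# are illustrative placeholders, not values from Tables 13 to 21.

def leaching_requirement(ec_iw, ec_e_threshold):
    """Fraction of infiltrated water that must pass below the rootzone
    to hold average rootzone salinity at the crop threshold."""
    return ec_iw / (5.0 * ec_e_threshold - ec_iw)

def relative_yield(ec_e, threshold, slope):
    """Linear response: full yield up to the threshold ECe (dS/m),
    then 'slope' percent yield loss per dS/m above it."""
    return max(0.0, min(100.0, 100.0 - slope * (ec_e - threshold)))

ec_iw = 3.0                  # irrigation water salinity, dS/m (assumed)
threshold, slope = 7.7, 5.2  # e.g. a tolerant crop such as cotton (assumed)

print(f"Leaching requirement: {leaching_requirement(ec_iw, threshold):.2f}")
print(f"Relative yield at ECe = 9 dS/m: "
      f"{relative_yield(9.0, threshold, slope):.0f}%")
```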
To prevent waterlogging and secondary salination, drainage must remove the precipitation and irrigation water that infiltrates the soil in excess of crop demand, together with any other excessive water (surface or subsurface) that flows into the area. It must provide an outlet for the removal of salts that accumulate in the rootzone in order to avoid excessive soil salinization, and it must keep the water table sufficiently deep to permit adequate root development, to prevent the net flow of salt-laden groundwater up into the rootzone by capillary forces, and to permit the movement and operation of farm implements in the fields. Artificial drainage systems may be used in the absence of adequate natural drainage. They are essentially engineering structures that control the water table at a safe level according to the principles of soil physics and hydraulics. The water table depth required to prevent a net upward flow of water and salt into the rootzone is dependent on irrigation management and is not single-valued, as is commonly assumed (van Schilfgaarde 1976). Methods to calculate drainage requirements are given elsewhere (Rhoades 1974; Kruse et al. 1990; Hoffman et al. 1990).
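One standard steady-state method behind such drainage calculations is Hooghoudt's drain-spacing equation; the sketch below shows the arithmetic under assumed soil and design values. Note that in practice the equivalent depth itself depends on the spacing and the solution is iterative.

```
import math

# A sketch of Hooghoudt's steady-state drain-spacing equation, one of
# the standard methods in the drainage references cited above. All
# parameter values are assumed; in practice the equivalent depth d_eq
# depends on the spacing itself and the solution is iterative.

def hooghoudt_spacing(q, k_above, k_below, d_eq, h):
    """Spacing L (m) from q = (8*k_below*d_eq*h + 4*k_above*h**2) / L**2,
    with q the steady drainage rate (m/day), k the hydraulic
    conductivities above and below drain level (m/day), d_eq the
    equivalent depth to the impermeable layer (m) and h the permitted
    water-table head midway between drains (m)."""
    return math.sqrt((8 * k_below * d_eq * h + 4 * k_above * h ** 2) / q)

q = 0.002    # 2 mm/day drainage coefficient (assumed)
k = 0.5      # m/day, uniform profile assumed
d_eq = 3.0   # m (assumed)
h = 0.8      # m (assumed)

print(f"Drain spacing: {hooghoudt_spacing(q, k, k, d_eq, h):.0f} m")  # ~74 m
```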
As discussed earlier, the time-averaged level of rootzone salinity is affected by the degree to which the soil water is depleted between irrigations, as well as by the leaching fraction. As the time between irrigations increases, soil water content decreases as the soil dries, and the matric and osmotic potentials of the soil water decrease as salts concentrate in the reduced volume of water. Water uptake and crop yield are closely related to the time- and depth-averaged total soil water potential, i.e. matric plus osmotic. As water is removed from a soil with a non-uniform salinity distribution, the total potential of the water being absorbed by the plant tends to approach uniformity at all depths of the rootzone. Following irrigation, plant roots preferentially absorb water from rootzone depths with high water potential. Normally this means that most of the water uptake is initially from the upper, less saline soil depths until sufficient water is removed to increase the total water stress to a level equal to that in the lower depths. After that, water is removed from the deeper, more saline soil depths, and the effect of salinity per se on crop growth is magnified. This implies that:
· forms of irrigation that minimize matric stress, such as drip irrigation, can be used to minimize the harmful effects of irrigating with saline water;
· high leaching fractions can be used to minimize the buildup (hence harmful effects) of high levels of salinity in deeper regions of the rootzone.
The distribution within, and the degree to which, a soil profile becomes salinized are also functions of the manner of water application, as well as of the leaching fraction. More salt is generally removed per unit of leachate with sprinkler irrigation than with flood irrigation. Thus, the salinity of water applied by sprinkler irrigation can be somewhat higher, all else being equal, than that applied by flood or furrow irrigation with a comparable degree of cropping success, provided foliar burn is avoided. The high salt-removal efficiency of sprinkler irrigation may be explained as follows. Solute transport is governed by the combined processes of convection (movement of solutes with the bulk solution) and diffusion (independent movement of solutes as driven by a concentration gradient); convection is usually the predominant process in flood-irrigated soils. Differential velocities of water flow can occur within the soil matrix because the pore size distribution is typically non-uniform. This phenomenon is called dispersion. It can be appreciable when flow velocity is high and the pore size distribution is wide; diffusion often limits salt removal under such conditions. Soils with large cracks and well-developed structure are especially variable in their water and solute transport properties because the large "pores" (earthworm channels, old root holes, interpedal voids, etc.) are preferred pathways for water flow; most of the flow in flooded soils occurs via these "pores". Much of the water and salt in the small and intra-aggregate pores is "bypassed" in flood-irrigated soils. Flow velocity and water content are typically lower in sprinkler-irrigated soils; hence, bypass is reduced and the efficiency of salt leaching is increased. Other soil-related processes also affect salt concentration and transport during the irrigation and leaching of soils. In most arid-land soils, the clay particles carry predominantly negative charges, which can retard cation transport through adsorption and/or exchange processes. Simultaneously, anions are largely excluded from the part of the pore solution adjacent to the negatively-charged clay surfaces; this accelerates their relative transport. The borate anion, however, undergoes adsorption reactions that retard its movement. For a more quantitative description of the effects of convection, dispersion and other soil factors on solute transport in soils, see the review of Wagenet (1984).
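The convection-dispersion behaviour described above can be made concrete with a minimal one-dimensional numerical sketch; it deliberately ignores bypass flow, adsorption and anion exclusion, and all parameter values are assumed round numbers rather than measured soil properties.

```
import numpy as np

# A minimal numerical sketch of 1-D convection-dispersion,
# dC/dt = D*d2C/dz2 - v*dC/dz, for salt displacement during leaching
# (see Wagenet 1984 for the full theory). Bypass flow, adsorption and
# anion exclusion are ignored and all parameter values are assumed.

v, D = 0.10, 0.005    # pore-water velocity (m/day), dispersion (m2/day)
dz, dt = 0.01, 0.005  # grid (m) and step (day); dt < dz**2/(2*D), v*dt/dz < 1
depth, days = 1.0, 5.0

z = np.arange(0.0, depth + dz, dz)
c = np.ones_like(z)             # initially saline profile (relative conc. 1)

for _ in range(int(days / dt)):
    new = c.copy()
    # central difference for dispersion, upwind difference for convection
    new[1:-1] = (c[1:-1]
                 + dt * D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dz ** 2
                 - dt * v * (c[1:-1] - c[:-2]) / dz)
    new[0] = 0.0               # non-saline water infiltrating at the surface
    new[-1] = new[-2]          # free drainage at the bottom boundary
    c = new

print(f"Relative salinity at 0.5 m after {days:.0f} days: {c[50]:.2f}")
```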
The distribution of salts in the soil is also influenced by seedbed shape. Salts tend to accumulate to excessive levels in certain regions of the seedbed under furrow irrigation (Bernstein et al. 1955; Bernstein and Fireman 1957). Information from this early work shows that seedbed and furrow shape can be designed to minimize the problem. Seed placement and surface irrigation strategies (e.g. alternate-furrow irrigation, depth of water in furrows, etc.) that can also be used to optimize plant establishment under saline conditions are described by Kruse et al. (1990). Sprinkler irrigation can be effective in leaching excessive salinity from the topsoil and in producing the favourable low-salinity environment in the upper soil layer which is necessary for the establishment of salt-sensitive seedlings. However, other problems (such as foliar injury) are associated with sprinkling of saline water. Saline "bed-peaks" can be detopped to prevent exposure of emerging shoots to excessive salinity. Under drip irrigation, the salt content is usually lowest in the soil immediately below and adjacent to the emitters and highest at the periphery of the wetted zone. Removal of salt that has accumulated at this wetting "front" must be addressed in the long term.
Susceptible crops should not be irrigated with saline water by sprinkler irrigation since their foliage absorbs salts upon wetting. Salts can accumulate in the leaves of such crops by foliar absorption until lethal concentrations are reached. Crop sensitivity to saline sprinkling water is related more to the rate of foliar salt accumulation than to crop tolerance to soil salinity per se. Hence, when irrigating with saline waters by sprinkler methods, applications should be made at night and in a manner that achieves frequent wetting ("washing") of the leaves in order to minimize foliar absorption of salts.
The prevalent models of solute reactions and transport in irrigated soils suffer the deficiency of not appropriately representing the effects of the above-described processes that often occur under field conditions. Neither do they adequately account for the distribution uniformity of the irrigation application system itself, nor for the infiltration uniformity effects resulting from variable soil permeability across the field. Only recently has this problem been approached directly by measuring, on a large scale, solute distributions in field soil profiles. The results to date indicate that as yet no suitable method exists to quantify and integrate the effects of these processes on a field basis (Jury 1984). It is probable that alternative modelling approaches, like that proposed by Corwin and Waggoner (1990), may help in this regard.
Some unique effects of irrigation are operative at the scale of whole projects and entire geohydrologic systems; hence, some management practices for salinity control should address this larger scale. The following paragraphs provide a brief review of such information, as a basis for determining appropriate management requirements for irrigating with saline water.
As discussed earlier, some soil and water salination is inevitable with irrigation; the salt contained in the irrigation water remains in the soil as the pure water passes back to the atmosphere through the processes of evaporation and plant transpiration. Therefore, water in excess of evapotranspiration must be applied with irrigation to achieve leaching and prevent excess salt accumulation. This water must drain from the rootzone. Seepage from delivery canals also occurs in many irrigation projects. These drainage and seepage waters percolate through the underlying strata (often dissolving additional salts in the process), flow to lower elevation lands or waters and frequently cause problems there of waterlogging and salt-loading. Saline soils typically are formed in such lands through the processes of evaporation. Ground and surface waters receiving these drainage and seepage waters typically are increased in salt concentration.
The primary sources of return flow from an irrigation project are bypass water, canal seepage, deep percolation, and surface (tailwater) runoff. Bypass water is often required to maintain hydraulic head and adequate flow through a gravity-controlled canal system. It is usually returned directly to the river, and few pollutants, if any, are picked up in this route. Evaporation losses from canals commonly amount to only a small percentage of the diverted water. Seepage from unlined canals is often substantial. It may contribute to high water tables, increase groundwater salinity and phreatophyte growth, and generally increase the amount and salinity of the required drainage from irrigated areas. Law et al. (1972) estimated that 20 percent of the total water diverted for irrigation in the USA is lost by seepage from conveyance and irrigation canals. If the water passes through salt-laden substrata or displaces saline groundwater, the salt pickup from this source can be substantial. Canal lining can reduce such salt loading. Closed conduit conveyance systems can minimize seepage and evaporation losses as well as ET by phreatophytes. The closed conduit system also provides the potential to increase project irrigation efficiency and thus to lower salt loading (van Schilfgaarde and Rawlins 1980).
Reducing the volume of water applied for irrigation proportionately reduces the amount of salt added and the amount that must be removed by leaching. Minimizing the leaching fraction maximizes the precipitation of applied Ca, HCO3 and SO4 salts as carbonate and gypsum minerals in the soil, and it minimizes the "pickup" of weathered and dissolved salts from the soil. The salt load from the rootzone can be reduced by about 2 to 12 tons/ha per year by reducing LF from 0.3 to 0.1 (Rhoades et al. 1973; 1974; Rhoades and Suarez 1977; Oster and Rhoades 1975).
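The arithmetic behind such salt-load figures is simple mass balance, as the hedged sketch below shows with assumed water depth and salinity. Note that, absent mineral precipitation and salt pickup, the salt leaving the rootzone at steady state equals the salt applied regardless of LF; the loading reduction cited above comes from the precipitation and reduced pickup that low leaching fractions promote.

```
# Back-of-envelope salt balance for one hectare; the depth and salinity
# are assumed. 1 mg/l = 1 g/m3, and 1 m of water over 1 ha = 10,000 m3.

def tonnes_salt_per_ha(depth_m, tds_mg_per_l):
    return depth_m * 10_000 * tds_mg_per_l / 1.0e6   # tonnes/ha

print(f"Salt applied: {tonnes_salt_per_ha(1.0, 700):.1f} t/ha per year")

# At steady state the drainage depth is LF times the applied depth and
# its concentration is roughly the applied concentration divided by LF,
# so without precipitation or pickup the discharged mass equals the
# applied mass at any LF; low LF reduces loading via mineral
# precipitation in the soil and reduced dissolution of soil salts.
for lf in (0.3, 0.1):
    out = tonnes_salt_per_ha(1.0 * lf, 700 / lf)
    print(f"LF = {lf}: {out:.1f} t/ha discharged before precipitation/pickup")
```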
Minimizing leaching may or may not reduce salinity degradation of the receiving water where the drainage water is not intercepted and isolated but is returned to the associated surface or groundwater. A reduction of degradation will generally occur where saline groundwaters with concentrations in excess of those of the recharging rootzone drainage waters are displaced into the surface water, or where additional salts, other than those derived from the irrigation water, are encountered in the drainage flow path and brought into solution by weathering and dissolution processes.
Groundwaters receiving irrigation drainage water may not always benefit from reduced leaching. With no sources of recharge other than drainage return flow, the groundwater eventually must come to the composition of the drainage water, which will be more saline with low leaching. Reduced leaching, however, delays the arrival of the leachate. Thus, the groundwater salinity may be lower with reduced leaching for an interim period of time (Rhoades and Suarez 1977; Suarez and van Genuchten 1981). For groundwater under-saturated with CaCO3 (unlikely in arid lands) being pumped for irrigation with no recharge other than by drainage return, groundwater will be slightly less saline under low leaching; groundwater saturated with CaCO3 will show no benefit under low leaching; and groundwater saturated with CaCO3 and nearing saturation with gypsum will show substantial benefit from low leaching. Low leaching management can continuously reduce degradation of the groundwater only if other sources of high-quality recharge into the basin exist and if flow out of the basin is high relative to drainage inflow.
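A minimal box-model sketch of this interim effect is given below: the leachate must first travel through the vadose zone, so a lower leaching rate delays its arrival at the groundwater even though its eventual steady-state salinity is higher. All storages, fluxes and concentrations are assumed round numbers, not values from the cited studies.

```
# A minimal box-model sketch of the interim effect described above; all
# storages, fluxes and concentrations are assumed round numbers. The
# leachate first crosses the vadose zone, so lower leaching delays its
# arrival even though its eventual steady-state salinity is higher.

def groundwater_ec(years, lf, ec_iw=0.7, storage=100.0, applied=10.0,
                   vadose_water=5.0, ec0=1.0):
    drainage = applied * lf        # annual drainage volume
    lag = vadose_water / drainage  # years for leachate to reach groundwater
    ec_drain = ec_iw / lf          # leachate concentration, dS/m
    ec = ec0
    for t in range(years):
        if t >= lag:               # fully mixed reservoir once it arrives
            ec += (drainage / storage) * (ec_drain - ec)
    return ec

for yr in (5, 20, 100):
    print(f"after {yr:3d} yr:  LF=0.3 -> {groundwater_ec(yr, 0.3):.2f} dS/m, "
          f"LF=0.1 -> {groundwater_ec(yr, 0.1):.2f} dS/m")
```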
The extent to which leaching can be minimized is limited by the salt tolerances of the crops being grown, the irrigation system distribution uniformities and the variability in soil infiltration rates. In most irrigation projects, the currently used leaching fractions can be reduced appreciably without harming crops or soils, especially with improvements in irrigation management (van Schilfgaarde et al. 1974).
Growing suitably tolerant crops
Managing seedbeds and grading fields to minimize local accumulations of salinity
Managing soils under saline water irrigation
Operating delivery systems efficiently
Irrigating efficiently
Monitoring soil water and salinity and assessing adequacy of leaching and drainage
Management practices for the safe use of saline water for irrigation primarily consist of:
· selection of crops or crop varieties that will produce satisfactory yields under the existing or predicted conditions of salinity or sodicity;
· special planting procedures that minimize or compensate for salt accumulation in the vicinity of the seed;
· irrigation to maintain a relatively high level of soil moisture and to achieve periodic leaching of the soil;
· use of land preparation to increase the uniformity of water distribution and infiltration, leaching and removal of salinity;
· special treatments (such as tillage and additions of chemical amendments, organic matter and growing green manure crops) to maintain soil permeability and tilth.
The crop grown, the quality of water used for irrigation, the rainfall pattern and climate, and the soil properties determine to a large degree the kind and extent of management practices needed.
Where salinity cannot be kept within acceptable limits by leaching, crops should be selected that can produce satisfactory yields under the resulting saline conditions. In selecting crops for saline soils, particular attention should be given to the salt tolerance of the crop during seedling development, because poor yields frequently result from failure to obtain a satisfactory stand. Some crops that are salt tolerant during later stages of growth are quite sensitive to salinity during early growth. Tolerances of the various major crops to salinity are given in Tables 13 to 21.
Failure to obtain a satisfactory stand of furrow-irrigated row crops on moderately saline soils is a serious problem in many places. This is because the rate of germination is reduced by excessive salinity, as previously discussed. The failures are usually due to the accumulation of soluble salt in raised beds that are "wet-up" by furrow irrigation. Modifications in irrigation practice and bed shape should be used to reduce salt accumulation near the seed. The tendency of salts to accumulate near the seed during irrigation is greatest in single-row, round-topped beds (see Figure 15).
Sufficient salt to prevent germination may concentrate in the seed zone even if the average salt content of the soil is moderately low. Thus, such beds should be avoided when irrigating with saline waters using furrow methods, though "decapping" techniques may be used to advantage in this regard. With double-row, flat-topped beds, most of the salt moves into the centre of the bed, leaving the shoulders relatively free of salt; seedling establishment may therefore be enhanced by planting on the shoulders of such beds. Sloping beds are best for saline conditions because the seed can be safely planted on the slope below the zone of high salt accumulation. Such beds should be used, if possible, when furrow irrigating with saline waters. Planting in furrows or basins is satisfactory from the standpoint of salinity control but is often unfavourable for the emergence of many row crops because of problems related to crusting and poor aeration. This method is recommended only for very saline irrigation waters and for vigorous, hardy emerging plants. Pre-emergence irrigation by use of sprinklers or special furrows placed close to the seed may be used to keep the soluble salt concentration low in the seedbed during germination and seedling establishment (see Figure 16, after Bernstein and Francois 1973). After the seedlings are established, the special furrows may be abandoned and new furrows made between the rows, and sprinkling replaced by furrow irrigation.
FIGURE 15 Pattern of salt build-up as a function of seed placement, bedshape and level of soil salinity (after Bernstein, Fireman and Reeve 1955)
FIGURE 16 Influence of the irrigation system on the soil salinity pattern and yield of bell pepper at two levels of irrigation water quality
Careful grading of land makes possible a more uniform application of water and, hence, better salinity control when irrigating with saline water. Barren or poor areas in otherwise productive fields are often either high spots that do not receive enough water for good crop growth or for leaching purposes, or low spots that remain too wet for seedling establishment. Lands that have been irrigated for one or two years after initial grading usually need to be regraded to remove the surface unevenness caused by the settling of fill material. Annual crops should be grown after the first grading so that regrading can be performed before a perennial crop is planted. A detailed topographic survey made beforehand is very helpful in avoiding damage to soil properties, in particular the removal of the relatively fertile surface soil. Land levelling causes significant soil compaction because of the weight of the heavy equipment; it is advisable to follow this operation with subsoiling, chiselling and ploughing to break up the compaction and restore or improve water infiltration.
Several physical, chemical and biological soil management measures help facilitate the safe use of saline water in crop production. Some important ones in this regard are: tillage, deep ploughing, sanding, use of chemical amendments and soil conditioners, organic and green manuring and mulching.
Tillage is a mechanical operation usually carried out for seedbed preparation, to improve soil permeability, to break up surface crusts and to improve water infiltration. If improperly executed, tillage may form a plough pan or bring a salty layer closer to the surface. Sodic soils are especially subject to puddling and crusting; they should be tilled carefully and tillage of wet soil avoided. Heavy machinery traffic should also be avoided. More frequent irrigation, especially during the germination and seedling stages, tends to soften surface crusts on sodic soils and encourages better stands.
Deep ploughing refers to ploughing depths of about 40 to 150 cm. It is most beneficial on stratified soils having impermeable layers lying between permeable layers. In sodic soils, deep ploughing should be carried out only after the sodicity has been reclaimed; otherwise it will cause complete disturbance and collapse of the soil structure. Deep ploughing to 60 cm loosens the aggregates, improves the physical condition of these layers, increases soil-water storage capacity and helps control salt accumulation when using saline water for irrigation. Crop yields can be markedly improved by ploughing to this depth every three or four years. The selection of the right plough type (shape and spacing between shanks), sequence, ploughing depth and moisture content at the time of ploughing should provide good soil tilth and improve soil structure (Mashali 1989). Special equipment can even invert whole soil profiles or break up substrata as deep as 2.5 m that impede deep percolation, so that many adverse physical soil conditions associated with land irrigated with saline water can be modified to improve leachability and drainability.
Sanding is used in some cases to make a fine-textured surface soil more permeable by mixing sand into it, thus obtaining a relatively permanent change in surface soil texture. When properly done, sanding results in improved root penetration and better air and water permeability, which facilitates leaching where saline-sodic water is used and where surface infiltration limits water penetration. The method can be combined with initial deep ploughing.
Chemical amendments are used to neutralize the soil reaction, to react with calcium carbonate and to replace exchangeable sodium by calcium. This decreases the ESP and should be followed by leaching to remove the salts derived from the reaction of the amendments with sodic soils. Amendments also decrease the SAR of irrigation water if added through the irrigation system. Gypsum is by far the most common amendment for sodic soil reclamation, particularly when using saline water with a high SAR value for irrigation. Calcium chloride is highly soluble and is a satisfactory amendment, especially when added to irrigation water. Lime is not an effective amendment for improving sodic conditions when used alone, but when combined with a large amount of organic manure it has a beneficial effect. Sulphur too can be effective; it is inert until it is oxidized to sulphuric acid by soil micro-organisms. Other sulphur-containing amendments (sulphuric acid, iron sulphate, aluminium sulphate) are similarly effective because of the sulphuric acid originally present or formed upon microbial oxidation or hydrolysis.
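The usual gypsum-requirement arithmetic can be sketched as follows; the CEC, ESP values, reclamation depth and bulk density are assumed for illustration and are not recommendations for any particular soil.

```
# A hedged sketch of the usual gypsum-requirement arithmetic; the CEC,
# ESP values, reclamation depth and bulk density below are assumed for
# illustration, not recommendations for any particular soil.

def gypsum_requirement_t_ha(cec, esp_initial, esp_target,
                            depth_m=0.15, bulk_density=1400.0):
    """Na to replace (meq/100 g) = CEC * (ESPi - ESPt) / 100; gypsum
    (CaSO4.2H2O, 172 g/mol, 2 eq/mol) supplies 86 mg per meq of Ca,
    i.e. 0.86 g gypsum per kg soil per meq/100 g exchanged."""
    na_meq_100g = cec * (esp_initial - esp_target) / 100.0
    soil_kg_per_ha = 10_000 * depth_m * bulk_density
    return na_meq_100g * 0.86e-3 * soil_kg_per_ha / 1000.0   # t/ha

print(f"{gypsum_requirement_t_ha(cec=25.0, esp_initial=20.0, esp_target=5.0):.1f}"
      f" t gypsum/ha for the top 15 cm")   # ~6.8 t/ha
```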
The choice of an amendment for a particular situation will depend upon its relative effectiveness judged from its improvement of soil properties and crop growth, the availability of the amendments, relative cost involved, handling and application difficulties and time allowed and required for the amendment to react in soil and effectively replace adsorbed sodium.
Attempts have been made to coagulate soil particles by chemical treatment, so providing deeper aeration, better permeability and improved water infiltration. Treating the soil with dilute bituminous emulsions can result in effective aggregation, improved aggregate stability and reduced surface crust formation. The water percolation rate is faster in bitumen-treated soil.
Sulphate lignin conditioners can also be used to improve soil structure and permeability. Soil conditioners can have practical applications in seedling establishment when soil is irrigated with saline water of high SAR. Stabilizing soil aggregates prevents dispersion and the formation of depositional crusts; infiltration can be maintained by applying small quantities of organic polyelectrolytes to the soil surface. They can be effective when introduced in the irrigation water or when sprayed over the soil surface.
Mineral fertilizers: Salt accumulation affects nutrient content and availability for plants in one or more of the following ways: by changing the form in which the nutrients are present in the soil; by enhancing the loss of nutrients from the soil through heavy leaching, through denitrification (in the case of nitrogen) or through precipitation in the soil; through the effects of non-nutrient (complementary) ions on nutrient uptake; and through adverse interactions between the salt present in saline water and fertilizers, which decrease fertilizer use efficiency.
Crop response to fertilizer under saline or sodic conditions is complex, since it is influenced by many soil, crop and environmental factors. The benefits expected from using soil management measures to facilitate the safe use of saline water for irrigation will not be realized unless adequate, but not excessive, plant nutrients are applied as fertilizers. The level of salinity may itself be raised by excess fertilizer application, as mineral fertilizers are for the most part soluble salts. The fertilizer applied when using saline water for irrigation should preferably be acidic and contain Ca rather than Na, taking into consideration the complementary anions present. Timing and placement of mineral fertilizers are important; unless properly applied, they may contribute to or cause a salinity problem.
Organic and green manures and mulching: Incorporating organic matter into the soil has two principal beneficial effects on soils irrigated with high-SAR saline water and on saline-sodic soils: improvement of soil permeability, and release of carbon dioxide and certain organic acids during decomposition. These help lower the soil pH and release calcium by solubilizing CaCO3 and other minerals, thereby increasing the ECe and the replacement of exchangeable Na by Ca and Mg, which lowers the ESP. Growing legumes and using green manure will improve soil structure; green manure has a similar effect to organic manure. Salinization during fallowing may be severe where a shallow water table exists, since evaporation rates of about 8, 3 and 1 mm/day can occur from the dry surface of a fine sandy loam when the water table is kept at 90, 120 and 180 cm from the soil surface, respectively. Mulching to reduce evaporation losses will therefore decrease the opportunity for soil salinization. When using saline water where the concentration of soluble salts is expected to be high at the soil surface, mulching can considerably help in leaching salts and reducing the ESP, and thus facilitate the production of tolerant crops. Thus, whenever feasible, mulching to reduce the upward flux of soluble salts should be encouraged.
Water delivery and distribution systems must be operated efficiently to facilitate the timely supply of water in the right quantities and to avoid waterlogging and salinity build-up in irrigated lands, especially when saline waters are involved. The amount of water applied should be sufficient to supply the crop and satisfy the leaching requirement but not enough to overload the drainage system. Over-irrigation contributes to high water tables, increases the drainage requirement and is a major cause of salinity build-up in many irrigation projects of the world. Therefore, a proper relation between irrigation, leaching and drainage must be maintained in order to prevent irrigated lands from becoming excessively waterlogged and salt-affected.
Often irrigation water delivery and distribution systems are over-designed, in the absence of reliable data or appropriate methods to predict project water requirements. It is all the more important, when using saline waters, that excessive amounts are not diverted into irrigation schemes as this is likely to cause more damage than excessive amounts of "good quality" water. FAO has developed methods to determine project water requirements based on actual crop water needs, leaching requirements and irrigation efficiencies (FAO 1984).
A computer program called CROPWAT (FAO 1992) has been developed to calculate crop water requirements and irrigation requirements from climatic and crop data. Further, the program allows the development of irrigation schedules for different management conditions and the calculation of scheme water supply for varying cropping patterns. The program runs on any standard personal computer with a minimum of 360 Kb of memory and can be obtained from FAO on request. A complementary computerized database program called CLIMWAT (FAO 1991) is available to provide the required climatic data for CROPWAT. CLIMWAT holds data from a total of 3262 meteorological stations in 144 countries.
Excessive loss of irrigation water from canals constructed in permeable soil is a major cause of high water tables and secondary salination in many irrigation projects. Such seepage losses should be reduced by lining the canals with impermeable materials or by compacting the soil to achieve a very low permeability. Because the amount of water passing critical points in the irrigation delivery system must be known in order to provide water control and to achieve high water-use efficiency, provisions for effective flow measurement should be made. Unfortunately, many current irrigation systems do not use flow measuring devices and, thus, the farmers operate with limited control and knowledge of the amount of water actually diverted to the farms. In addition, many delivery systems encourage over-irrigation because water is supplied for fixed periods, or in fixed amounts, irrespective of seasonal variations in on-farm needs. Salinity and water table problems are often the result. The distribution system should be designed and operated so as to provide water on demand and in metered amounts as needed to achieve high efficiency and to facilitate salinity control and the use of saline waters for irrigation.
Improvements in salinity control generally come hand-in-hand with improvements in irrigation efficiency. The key to the effective use of saline irrigation waters and salinity control is to provide the proper amount of water to the plant at the proper time. The ideal irrigation scheme should provide water as nearly continuously as possible, though not in excess, as needed to keep the soil water content in the rootzone within optimum safe limits. However, carefully programmed periods of stress may be needed to obtain maximum economic yield with some crops; cultural practices also may demand occasional periods of dry soil. Thus, the timing and amount of water applied to the rootzone should be carefully controlled to obtain good water use efficiency and good crop yield, especially when irrigating with saline water. As mentioned above, this requires water delivery to the field on demand which, in turn, requires the establishment of close coordination between the farmer and the entity that distributes the water; it calls for the use of feedback devices to measure the water and salt contents and potentials in the soil and devices to measure water flow (rates and volumes) in the conveyance systems.
The method and frequency of irrigation and the amount of irrigation water applied may be managed to control salinity. The main ways to apply water are basin flooding, furrow irrigation, sprinkling, subirrigation and drip irrigation. Flood irrigation is good for salinity control when using saline waters if the land is level, though aeration and crusting problems may occur. Aeration and crusting problems are minimized by using furrow irrigation, but salts tend to accumulate in the beds. If excess salt does accumulate, a rotation of crops and periodic irrigation by sprinkler or flooding should be used as salinity-control measures. Alternatively, cultivation and irrigation depths should be modified, once the seedlings are well established, to "shallow" the furrows so that the beds will be leached by later irrigations. Irrigation by sprinkling may give better control of the amount and distribution of water; however, the tendency is to apply too little water by this method, and leaching of salts beyond the rootzone may sometimes be accomplished only with special effort. Salinity can be kept low in the seedbed during germination with sprinkler irrigation, but crusting may be a problem. Emergence problems associated with such crusting may be overcome with frequent light irrigations during this time or by use of special tillage techniques. Subirrigation with saline water is not generally advisable unless the soil is periodically leached of the accumulated salts by rainfall or by surface applications of low-salinity water. Drip irrigation, if properly designed, is recommended for use of saline irrigation water because it minimizes salinity and matric stresses in the rootzone, though salts accumulate in the periphery of the wetted area. As noted earlier, higher levels of salinity in the irrigation water can be tolerated with drip as compared with other methods of irrigation.
Because soluble salts reduce the availability of water in almost direct proportion to their total concentration in the soil solution, irrigation frequency should be increased so that the moisture content of irrigated soils is maintained as high, and the salinity as low, as is practicable, especially during seedling establishment and early vegetative growth, provided this can be done without causing excessive leaching or insufficient depth of rooting. The most practical way to accomplish this is through the use of drip irrigation.
Additional water (over that required to replenish losses by plant transpiration and evaporation) must be applied, at least occasionally, to leach out the salt that has accumulated during previous irrigations. This leaching requirement depends on the salt content of the irrigation water and on the maximum salt concentration permissible in the soil solution, which depends in turn on the salt tolerance of the crop and the manner of irrigation. If there is insignificant rainfall and irrigation is undertaken with a single water to steady-state, the leaching requirement can be estimated from the relations given in Figure 12. Fortunately, much of the needed leaching can be achieved between crops or during pre-irrigation and early growth-stage irrigations when soil permeability is generally relatively high, especially when using low-salinity waters in the cyclic use strategy. The first irrigations provided for the renewal of cropping following a fallow or uncropped period often unavoidably result in relatively high leaching. Many irrigation practices, especially with flood irrigation, inadvertently result in excess leaching, especially during pre-plant or early-season irrigations before the soil aggregates have slaked and surface soil permeability has diminished. The effects of non-uniform crop stand and cover, soil infiltration rates (permeabilities) and water application and distribution result in generally non-uniform leaching across an irrigated field. Calculation of the leaching requirement is disproportionately subject to errors related to uncertainties in knowledge of evapotranspiration, since Lr = 1 - Vet/Viw. The value, much less the distribution, of evapotranspiration is not precisely known for most field situations, especially for conditions of irrigation with saline waters and in the presence of shallow, saline water tables. Consequently, there is little documented evidence of the positive benefits of increased leaching on crop yield under actual field conditions when irrigating with saline waters (Shalhevet 1984). While, certainly, the excess salts applied with saline irrigation waters must be removed over time to sustain crop production, for both short- and long-season crops it is generally sufficient to intentionally apply extra water for leaching only if and when the levels of salinity in the active rootzone actually become excessive. Giving extra water for leaching according to traditional Lr equations with each and every irrigation is not necessary. Rainfall in sub-humid climates often provides the required leaching. The control of salinity by leaching is accomplished most easily in permeable coarse-textured soils. Medium- and fine-textured soils have the agronomic advantage of a greater water-holding capacity and ordinarily present no major problem from the standpoint of irrigating with saline water and salinity control, particularly if they have good structure and are underlain by a sand or gravel aquifer which facilitates the removal of drainage water. Prevention of excessive salt accumulation is generally more difficult in fine-textured, stratified and slowly permeable soils.
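The depth arithmetic implied by Lr = 1 - Vet/Viw is straightforward: the gross infiltrated depth is the crop ET divided by (1 - Lr). A hedged sketch with assumed values:

```
# The depth arithmetic implied by Lr = 1 - Vet/Viw: the gross
# infiltrated depth is crop ET divided by (1 - Lr). Values assumed.

def gross_irrigation_depth(et_mm, lr):
    return et_mm / (1.0 - lr)

et = 800.0   # seasonal crop ET, mm (assumed)
for lr in (0.05, 0.10, 0.20):
    d = gross_irrigation_depth(et, lr)
    print(f"Lr = {lr:.2f}: apply {d:.0f} mm ({d - et:.0f} mm for leaching)")
```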
Automated solid-set and centre-pivot sprinkler systems are conducive to good control and uniform distribution of applied water; in principle, trickle irrigation is even better. But gravity systems, if designed and operated properly, can also achieve good uniformity. Precision land grading and smaller water applications should be used to achieve high uniformity of areal water distribution over the field and of infiltration, respectively. Closed conduits, rather than open waterways, should be used for water distribution laterals if possible; they offer more effective off-on control and capture gravitational energy for pressurizing delivery systems or controls, which offers better potential for achieving high irrigation efficiencies.
The most advanced centre pivot irrigation system now used by some farmers in the USA is called the LEPA system - Low Energy Precision Application. In this system the sprinkler and spray nozzles used on the centre pivot systems are replaced with drop-down hoses and low pressure emitters that operate at only 0.3 kg/cm2 and are placed as close as 20-45 cm above the ground. Experience has shown that these systems can reduce evaporation and wind-drift losses to less than 5 percent of the emitted water.
The new LEPA technology became commercially available in 1986 and since then has been adopted by an estimated 1000 farmers. The advantage of LEPA with regard to saline water irrigation is that it can irrigate crops with the right amount of water, avoiding excess and runoff, and minimize foliar damage which is common with saline water irrigation. However, the technique is new, costly and needs to be further developed to reduce costs and make the system simpler for adoption by a wider group of farmers.
In furrow-irrigated areas, furrow length should be reduced in order to improve intake distribution and to reduce tail water runoff. Worstell's (1979) multi-set system is useful for such purposes. Surge irrigation techniques can sometimes be used to improve irrigation uniformity in graded furrows (Bishop et al. 1981). For tree crops, a low-head bubbler system can be used to provide excellent control and to minimize the pressure requirements and expensive filtration systems (Rawlins 1977). Drip systems, of course, are increasingly being used for permanent crops and high-value annual crops and are well suited for use with saline irrigation waters. All opportunities to modify existing irrigation systems to increase their effectiveness of water and salinity control should be sought and implemented. Irrigation management technology for salinity control is the subject of reviews by van Schilfgaarde (1976); van Schilfgaarde and Rawlins (1980) and Kruse et al. (1990).
A frequent constraint in improving on-farm water use is the lack of knowledge of just when an irrigation is needed and of how much capacity for storage is available in the rootzone. Ways to detect the onset of plant stress and to determine the amount of depleted soil water are prerequisites to supplying water on demand and in the amount needed. Prevalent methods of scheduling irrigation usually do not, but should, incorporate salinity effects on soil-water availability (Rhoades et al. 1981). When irrigating with saline waters, the osmotic component of the soil water potential of the rootzone must be considered in scheduling decisions.
Ideally, irrigation management should keep the available soil water near the upper limit during germination and emergence, allow it to be depleted by about 50 percent or more at harvest, and, through successive, properly-timed irrigations, maintain available water within the major rootzone during the early vegetative, flowering and yield formation growth stages at a level which produces no deleterious plant water stress (Doorenbos and Kassam, FAO 1979). Under saline conditions, some "extra" water must be given for leaching (a minimum commensurate with the salt tolerance of the crop being grown) if rainfall is inadequate in this regard, as discussed previously. Some method of assessing the water availability to the crop, with sufficient lead time to provide for a water application before significant stress occurs, should be used for irrigation scheduling purposes. In addition, the amounts of water needed for replenishment of the depleted soil moisture in the rootzone and for leaching must be determined.
Prevalent methods used to determine the onset of stress include both direct and indirect measurements. Leaf water potential can be measured with a pressure "bomb" and used to determine the onset of stress; however, the method does not give information with which to predict the stress much in advance of its occurrence, nor does it provide a measure of the amount of water to apply. Infrared thermometry can also be used to measure plant water stress indirectly: water stress results in the partial closure of leaf stomates and in reduced transpiration rates, causing leaf temperature to rise abnormally above ambient air temperature. This temperature difference can be interpreted in terms of a crop water stress index with which irrigation need can be assessed. It suffers the same limitation as the leaf water potential method.
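A minimal sketch of such a crop water stress index (CWSI) computed from the canopy-air temperature difference is given below; the non-stressed baseline coefficients and the fully-stressed limit are crop-specific empirical values and are assumed here for illustration only.

```
# A minimal sketch of a crop water stress index (CWSI) from infrared
# canopy temperature. The non-stressed baseline coefficients (a, b) and
# the fully-stressed limit dt_max are crop-specific empirical values
# and are assumed here for illustration only.

def cwsi(t_canopy, t_air, vpd_kpa, a=2.0, b=-2.0, dt_max=5.0):
    """CWSI = (dT - dT_lower) / (dT_upper - dT_lower), where
    dT_lower = a + b*VPD is the well-watered baseline and
    dT_upper = dt_max is a non-transpiring (fully stressed) crop.
    0 = no stress, 1 = maximum stress."""
    dt = t_canopy - t_air
    dt_lower = a + b * vpd_kpa
    return (dt - dt_lower) / (dt_max - dt_lower)

print(f"CWSI = {cwsi(t_canopy=32.0, t_air=30.0, vpd_kpa=2.5):.2f}")  # ~0.62
```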
Various scheduling methods can be used which are based on sensing the depletion of soil water per se or soil water potential (matric, osmotic or total), or some associated soil or water property, and on knowledge of the critical level (the set-point value). Such levels can be ascertained from salt tolerance data (see Tables 10-13) by converting threshold ECe values to osmotic potentials and assuming equivalent crop yield loss (also ET loss) would result from total water potential (i.e. assuming the effects of matric and osmotic potentials are equivalent and additive). Matric potential can be measured by any suitable means. Osmotic potential can be determined by one of the methods of salinity measurement described in Rhoades (1990b). Daily potential evapotranspiration can be calculated from measurements of air temperature, humidity, solar radiation and wind, or of pan evaporation. The actual evapotranspiration (ETa) can then be estimated from empirically determined crop coefficients as described by Doorenbos and Kassam (FAO 1979). The summation of these daily ETa values can then be used to estimate accumulative soil water depletion and total water potential. A plot of depletion or water potential versus time is then used to project the need for irrigation. This basic approach can be used regardless of whether soil water content, or a related parameter, is measured directly (using neutron meters, resistance blocks, time-domain reflectometric (TDR) sensors, four-electrode sensors, various soil matric potential sensors, etc.) or is estimated from ET methods. All of the methods suffer the limitation of needing to know the critical set-point value for irrigation, which varies with crop type, rooting characteristics, stage of plant growth, soil properties and climatic stress, etc. An estimate of this value can be obtained as described above or by the method of Doorenbos and Kassam (FAO 1979).
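A hedged sketch of this set-point logic follows: a threshold ECe is converted to an osmotic potential (using the conventional approximations that soil water at field capacity is about twice as concentrated as the saturation extract, and that osmotic potential is about -36 kPa per dS/m), an assumed matric set-point is added under the additivity assumption above, and daily ETa is accumulated to project the next irrigation. All numerical values are assumed.

```
# A hedged sketch of the set-point logic: convert a threshold ECe to an
# osmotic potential, add an assumed matric set-point (potentials treated
# as additive, as above), and accumulate daily ETa to project the next
# irrigation. The 2x concentration factor (field capacity vs. saturation
# extract) and -36 kPa per dS/m are conventional approximations; all
# numerical values are assumed.

def osmotic_potential_kpa(ec_e):
    return -36.0 * (2.0 * ec_e)    # soil water ~2x the saturation extract

threshold_ece = 4.0                          # dS/m, assumed crop threshold
setpoint = osmotic_potential_kpa(threshold_ece) + (-60.0)  # assumed matric
print(f"Irrigate before total potential falls below {setpoint:.0f} kPa")

daily_eta = [4.5, 5.0, 5.2, 4.8, 5.5, 6.0, 5.8, 5.1]  # mm/day, assumed
allowable_depletion = 40.0                            # mm, assumed
used = 0.0
for day, eta in enumerate(daily_eta, start=1):
    used += eta
    if used >= allowable_depletion:
        print(f"Projected irrigation on day {day} ({used:.1f} mm depleted)")
        break
```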
For saline water, irrigations should be scheduled before the total soil water potential (matric plus osmotic) drops below the level (as estimated above) which permits the crop to extract sufficient water to sustain its physiologic processes without loss in yield. Since the crop's root system typically extracts progressively less water with increasing soil depth (because rooting density decreases and salt concentration increases with depth, as discussed earlier), the frequency of irrigations should be determined by the level of total soil water potential in the upper half of the rootzone, where the rate of water depletion is greatest. Besides the extent of soil water depletion by ET, determination of the amount of water to apply should also be based on the stage of plant development, the salt tolerance of the crop at that stage and the status of soil water salinity deeper in the rootzone. In the early stages of plant development it is often desirable to irrigate just sufficiently to bring the soil to "field capacity" to the depth of present rooting or just beyond. Eventually, however, excess water must be applied to leach salts accumulated in the upper profile to greater depths in order to give the growing plant access to more "usable" soil water in accordance with its expanding needs. Thus, the amount of irrigation water required is dictated by the plant's need for water, the volume of the soil reservoir in need of replenishment and the level of soil salinity in the lower rootzone. The benefits of different amounts of saline irrigation water can be determined by evaluating their effects on relative crop yield using the water production function model.
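One simple form of such a water production function is the yield-response relation of Doorenbos and Kassam (FAO 1979), 1 - Ya/Ym = Ky (1 - ETa/ETm); the sketch below applies it with an assumed seasonal ETm and yield response factor Ky.

```
# A sketch using the yield-response relation of Doorenbos and Kassam
# (FAO 1979), 1 - Ya/Ym = Ky*(1 - ETa/ETm), as a simple water
# production function; the seasonal ETm and factor Ky are assumed.

def relative_yield_from_et(eta, etm, ky):
    return max(0.0, 1.0 - ky * (1.0 - eta / etm))

etm, ky = 800.0, 1.15   # mm and dimensionless (assumed)
for eta in (800.0, 720.0, 640.0):
    print(f"ETa = {eta:.0f} mm -> relative yield "
          f"{relative_yield_from_et(eta, etm, ky):.2f}")
```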
For more discussion on irrigation management for salinity control, see the reviews of van Schilfgaarde (1976), van Schilfgaarde and Rawlins (1980), Hoffman et al. (1990), Shalhevet (1984) and Kruse et al. (1990).
"Feedback" information on the status of salt and water within the crop rootzone and the extent of leaching being achieved should be obtained periodically to identify developing problem areas, to evaluate the appropriateness of model predictions and as a guide to monitor the effectiveness of the irrigation system and management strategies being used. Soil water content (or matric potential), salinity (and hence osmotic potential) and leaching fraction can, in theory, all be determined from measurements of soil electrical conductivity, ECa, since ECa is a measure of both soil water content and soil water salinity. Soil salinity in irrigated agriculture is normally low at shallow soil depths and increases through the rootzone. Thus measurements of EC, in shallow depths of the soil profile made over an irrigation cycle are relatively more indicative of changing soil water content (permitting estimation of matric potential), while measurements of ECa in deeper depths of the profile, where less water uptake occurs, are relatively more indicative of salinity. Thus, in principle, depletion of soil water to a set-point level, depth of water penetration from an irrigation or rainfall event and leaching fraction can all be determined from ECa measurements made within the rootzone over time (Rhoades 1980; Rhoades et al. 1981). However, separate measurements of soil water content and soil water salinity, from which the total water potential can be estimated (matric plus osmotic), are more ideally suited for these needs. The use of time domain reflectometric (TDR) sensors offer potential in this regard (Dalton and Poss 1990).
Proper operation of a viable, permanent irrigated agriculture, which also uses water efficiently, requires periodic information on soil salinity, especially where saline waters are used. Only with this information can the effectiveness of irrigation project operation be assessed with respect to the adequacy of leaching and drainage, salt balance and water use efficiency. Monitoring programmes should be implemented to evaluate the appropriateness of model predictions and the effectiveness of control programmes, and to assess the adequacy of the irrigation and drainage systems on a project-wide basis. Frequently used methods based on "salt-balance" calculations are inadequate in this regard, for reasons given elsewhere (Kaddah and Rhoades 1976).
The direct inventorying and monitoring of soil salinity which are appropriate and needed in this regard are complicated by salinity's spatial variability, since numerous samples are needed to characterize an area. Monitoring is also complicated by salinity's dynamic nature, due to the influence of changing weather patterns, management practices, water table depth, etc. When the need for repeated measurements is multiplied by the extensive requirements of a single sampling period, the expenditures of time and effort with conventional soil sampling procedures increase proportionately. Hence, simple, practical methods for measuring or predicting field salinity are needed. Procedures for delineating representative areas within irrigation projects, where periodic measurements can be made for monitoring, are also needed, as are procedures for rapidly producing soil salinity maps. For these reasons new instruments for measuring soil electrical conductivity should be used and coupled with mobile transport vehicles, remotely sensed imagery and computer mapping techniques into an integrated system for inventorying and monitoring soil salinity. These procedures should also be integrated with solute-transport models to develop a geographic information system for salinity assessment and management needs. A network of representative soil salinity monitoring stations should be established in irrigation projects, especially those projects where saline waters are used for irrigation. For a discussion of mobilized, automated, and instrumental methods of salinity inventorying and monitoring see Rhoades (1990b; 1991).
For more discussion of the principles and practices of irrigation soil salinity control see the reviews of Rhoades (1987a), Hoffman et al. (1990), Rhoades and Loveday (1990) and Kruse et al. (1990).
Practices to control salinity in water resources
Integrated strategy to facilitate the use of saline waters for irrigation, to minimize drainage disposal problems and to maximize the beneficial use of multiple water sources
As discussed previously, irrigated agriculture is a major contributor to the salinity of many surface- and groundwaters. The agricultural community has a responsibility to protect the quality of these waters. It must also maintain a viable, permanent irrigated agriculture. Irrigated agriculture cannot be sustained without adequate leaching and drainage to prevent excessive salinization of the soil, yet these processes are the very ones that contribute to the salt loading of surface and groundwaters. But surface and groundwater salinity could be reduced if salt loading contributions from the irrigation processes were minimized or eliminated. The protection of water resources against excessive salination, while sustaining agricultural production through irrigation, requires the implementation of comprehensive land and water use policies that incorporate the natural processes involved in the soil-plant-water and associated geohydrological systems.
Strategies to consider in coping with increasing salinity in receiving water systems resulting from irrigation include:
· eliminating irrigation of certain polluting lands;
· intercepting point sources of drainage return flow and diverting them to other uses;
· reducing the amount of water lost in seepage and deep percolation;
· isolating saline drainage water from good quality water supplies.
Only the last two strategies are discussed herein, primarily the last one.
Minimizing Deep Percolation and Intercepting Drainage
As discussed earlier, minimizing leaching always reduces the salt discharged from the rootzone. Additionally, deeply percolating water often displaces saline groundwater of higher salinity or dissolves additional salt from the subsoil. Reducing deep percolation will generally reduce the salt load returned to the river as well as reduce water loss. The "minimized leaching" concept of irrigation which reduces deep percolation should be adopted and implemented to reduce salinization of water resources associated with irrigation projects, especially in projects underlain by salt-laden sediments (van Schilfgaarde et al. 1974; Rhoades and Suarez 1977). In addition, saline drainage water should be intercepted. Intercepted saline drainage water can be desalted and reused, disposed of by pond evaporation or by injection into some isolated deep aquifer, or it can be used as a water supply where use of saline water is appropriate. Desalination of agricultural drainage waters for improving water quality is not generally economically feasible even though it is to be implemented for the return flow of the Wellton-Mohawk irrigation project of Arizona, USA. The high costs of the pretreatment, maintenance, and power are deterrents. Only in extreme cases, or for political rather than technical reasons, is desalination advocated (van Schilfgaarde 1979; 1982).
Isolating and Reusing Drainage Water for Irrigation
While there is an excellent opportunity to reduce the salt load contributed by drainage water through better irrigation management, especially through reductions in seepage and deep percolation, there are practical constraints which limit such reductions. But the ultimate goal should be to maximize the utilization of the irrigation water supply in a single application with minimum drainage. To the extent that the drainage water still has value for transpirational use by a crop of higher salt tolerance, it should be used again for irrigation.
Drainage waters are often returned by diffuse flow or intentional direct discharge to the watercourse and automatically "reused." Dilution of return flows is often advocated for controlling water salinity. This concept has serious limitations when one considers its overall effect on the volume of usable water, and it should not be advocated as a general method of salinity control.
The preferred strategy for controlling the salinity of water resources associated with irrigated lands is to intercept drainage waters before they are returned to the river and to use them for irrigation, substituting them for the low-salinity water normally used at certain periods during the irrigation season of certain crops in the rotation. When the drainage water quality is such that its potential for reuse is exhausted, the drainage should be discharged to an appropriate outlet. This strategy conserves water, sustains crop production and minimizes the salt loading of rivers that occurs through irrigation return flow (Rhoades 1984a, b, c); it also reduces the amount of water that needs to be diverted for irrigation. The strategy is discussed in more detail in the next section.
As indicated in the preceding section, the ultimate goal of irrigation management should be to minimize the amount of water extracted from a good-quality water supply and to maximize the utilization of the extracted portion during irrigation use, so that as much of it as possible is consumed in transpiration (hence producing biomass) and as little as possible is wasted and discharged as drainage. Towards this goal, to the extent that the drainage water from a field or project still has value for transpirational use by a crop of higher salt tolerance, it should be used again for irrigation before ultimate disposal.
It is the intent of this section to describe an integrated strategy of management that will simultaneously facilitate the successful use of saline waters for irrigation, minimize the harmful off-site effects of drainage discharge on the pollution of water resources and maximize the beneficial use of the total water supply available in typical irrigated lands and projects. This strategy illustrates how the information and principles given in the preceding sections of these guidelines can be integrated towards the goals of sustaining irrigated agriculture and protecting soil and water resources.
To the extent practical, the water diverted and applied for irrigation should be minimized using the principles and methods previously discussed. The unavoidable, but still usable, drainage water that results should be intercepted, isolated from good-quality water supplies and used within dedicated parts of the project as a substitute for part of the freshwater given to the crops. The "dual-rotation, cyclic" management strategy of Rhoades (1984a, b, c) can be used to enhance the feasibility of reusing such saline drainage waters for irrigation. In this system, salt-sensitive crops (such as lettuce, alfalfa, etc.) in the rotation are irrigated with "low-salinity" water (usually the developed water supply of the irrigation project), while salt-tolerant crops (such as cotton, sugarbeets, wheat, etc.) are irrigated with saline drainage water or with the shallow groundwater created by over-irrigation in the project. For the salt-tolerant crops, the switch to saline water is usually made after seedling establishment, the preplant and initial irrigations being made with low-salinity irrigation water. The secondary drainage resulting from such reuse should likewise be isolated and used successively for crops (including halophytes and tolerant trees) of increasingly greater salt tolerance. The ultimately unusable drainage water should be disposed of to an appropriate outlet or treatment facility.
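The following minimal sketch encodes that water-source decision. The crop lists mirror the examples above; the growth-stage names and function interface are illustrative assumptions, not part of the published strategy.

```python
# Water-source rule of the "dual-rotation, cyclic" strategy: sensitive crops
# receive low-salinity water throughout; tolerant crops receive it only for
# preplant/seedling irrigations and then switch to saline drainage water.
SENSITIVE = {"lettuce", "alfalfa"}
TOLERANT = {"cotton", "sugarbeets", "wheat"}

def water_source(crop: str, stage: str) -> str:
    """Return which supply to irrigate with at a given growth stage."""
    if crop in SENSITIVE:
        return "low-salinity canal water"
    if crop in TOLERANT:
        if stage in ("preplant", "germination", "seedling"):
            return "low-salinity canal water"   # protect stand establishment
        return "saline drainage water"          # post-establishment irrigations
    raise ValueError(f"no tolerance classification for {crop!r}")

print(water_source("cotton", "seedling"))   # low-salinity canal water
print(water_source("cotton", "boll fill"))  # saline drainage water
```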
The feasibility of this "dual-rotation, cyclic" strategy is supported by the following:
· The maximum soil salinity that would result in the rootzone from continuous use of the saline water does not develop when that water is used for only a fraction of the time.
· Salt build-up from irrigating the salt-tolerant crops with saline water is alleviated later, when a salt-sensitive crop is irrigated with the low-salinity water supply or during off-season periods of high rainfall.
· Preplant irrigation and the careful irrigations made during germination and seedling establishment use the low-salinity water supply, leaching the salts accumulated from the saline irrigations out of the seed area and the shallow soil depths.
Data obtained in modelling studies and in field experiments support the credibility and feasibility of this "cyclic" reuse strategy (Rhoades 1977; 1989; Rhoades et al. 1989a, b, and c; Minhas et al. 1989; 1990a and b).
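To make the first point above concrete, the toy mass-balance below (all parameters hypothetical, and the steady-state approximation deliberately crude) shows cyclic use holding rootzone salinity well below the level reached under continuous saline irrigation. It is only an illustration, not a reproduction of the cited modelling studies.

```python
# Toy rootzone salt balance: each irrigation nudges rootzone ECe toward the
# steady-state value implied by that irrigation's water quality.
def simulate(ec_iw_sequence, lf=0.15, ec0=1.0, mixing=0.3):
    """Exponential approach of rootzone ECe toward ec_iw/lf, the classical
    steady-state concentration-factor approximation."""
    ec, trace = ec0, []
    for ec_iw in ec_iw_sequence:
        target = ec_iw / lf           # steady state under continuous use
        ec += mixing * (target - ec)  # partial approach per irrigation
        trace.append(round(ec, 2))
    return trace

saline, fresh = 3.5, 0.9   # EC of drainage and canal water, dS/m (illustrative)
print("continuous saline:", simulate([saline] * 8))
print("cyclic use:       ", simulate([fresh, saline, saline, fresh] * 2))
```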
Results of an experiment testing the feasibility of the cyclic, "dual-rotation" reuse strategy are reviewed here to clarify and illustrate the concept and to demonstrate its credibility. The strategy was tested in a 20 ha field experiment on a commercial farm in the Imperial Valley of California (Rhoades et al. 1989a, b, c). Two cropping patterns were tested. One was a two-year, successive-crop rotation of wheat, sugarbeets and cantaloupe melons. In this rotation, Colorado River water (900 mg/l TDS) was used for the preplant and early vegetative growth stage irrigations of wheat and sugarbeets and for all irrigations of the melons; the remaining irrigations were with drainage water of 3500 mg/l TDS (Alamo River water). The other cropping pattern was a four-year block rotation consisting of two consecutive years of cotton (a salt-tolerant crop), followed by wheat (a crop of intermediate salt tolerance) and then by two years of continuous alfalfa (a relatively salt-sensitive crop). Drainage water was used for the irrigation of cotton after seedling establishment; beginning with the wheat crop, only Colorado River water was used. From Watsuit calculations it was hypothesized that the crops irrigated with the drainage water would yield fully when established with Colorado River water; other calculations indicated that sufficient desalination of the soil would occur during the Colorado River water irrigations to achieve a good plant stand and to keep the soil from becoming excessively saline over the long run.
The yields of the crops grown in the successive and block rotations are given in Tables 40 and 41, respectively. No significant losses in the yields of the wheat and sugarbeet crops occurred in either cycle of the successive-crop rotation from substituting drainage water (even in the greater amount, 65-75 percent; treatment cA) for Colorado River water after seedling establishment. The mean yield of cantaloupe seed obtained in the cA plots was about 10 percent lower than the control, but the difference was not statistically significant. The yields of the fresh-market melons (numbers of cartons of cantaloupes obtained by commercial harvest operations) in 1985 were higher in the Ca and cA treatments than in the C treatment, though not significantly different (see Table 40). Hence, no significant yield loss was observed from growing cantaloupes with Colorado River water on land previously salinized by the irrigation of wheat and sugarbeets with drainage water.
TABLE 40 Yields of crops in successive rotation (after Rhoades et al. 1989a)

| Treatment¹ | wheat/1982² | sugarbeets/1983³ | cantaloupes/1983⁴ | wheat/1984² | sugarbeets/1985³ | cantaloupes/1985⁵ |
|---|---|---|---|---|---|---|
| C | 3.60 (0.06)⁶ | 4.3 (0.1) | 392 (12) | 3.51 (0.09) | 4.1 (0.1) | 115 (5) |
| Ca | 3.60 (0.08) | 4.3 (0.2) | 384 (10) | 3.46 (0.10) | 4.1 (0.1) | 142 (8) |
| cA | 3.71 (0.06) | 4.1 (0.1) | 355 (14) | 3.55 (0.09) | 3.9 (0.1) | 139 (12) |

¹ C = Colorado River water used solely for irrigation; Alamo River water used in relatively smaller (Ca) and larger (cA) amounts, after seedling establishment with Colorado River water, for wheat and sugarbeets; cantaloupes irrigated only with Colorado River water.
² Tons of grain per acre.
³ Tons of sugar per acre.
⁴ Lbs of seed per acre.
⁵ Commercial yield in number of cartons per plot; plot size = 750 × 38 feet = 0.6543 acres.
⁶ Value within ( ) is standard error of mean; six replicates.
TABLE 41 Yields of crops in block rotation (after Rhoades et al. 1989a)

| Treatment¹ | cotton/1982² | cotton/1983² | wheat/1984³ | alfalfa/1985⁴ |
|---|---|---|---|---|
| C | 2.62 (0.07)⁵ | 2.06 (0.10) | 3.43 (0.06) | 7.8 (0.4) |
| cA | 2.65 (0.06) | 2.00 (0.09) | 3.43 (0.06) | 7.0 (0.5) |
| A | 2.76 (0.04) | 1.32 (0.05) | 3.41 (0.05) | 7.4 (0.3) |

¹ C = Colorado River water used solely for irrigation; A = Alamo River water used solely for irrigation; cA = Alamo River water used for irrigation after seedling establishment with Colorado River water for cotton; wheat and alfalfa irrigated only with Colorado River water.
² Commercial yield of lint, bales per acre.
³ Tons of grain per acre.
⁴ Tons of dry hay per acre.
⁵ Value within ( ) is standard error of mean; six replicates.
In the block rotation, there was no loss in lint yield in the first cotton crop (1982) from the use of saline drainage water for irrigation, even when it was used for all irrigations, including the preplant and seedling establishment periods (treatment A). Nor was there a significant loss in lint yield in the second cotton crop (1983) when drainage water was used for the irrigations following seedling establishment with Colorado River water (the recommended strategy, treatment cA). But there was a significant and substantial loss of lint yield, as expected, where the drainage water was used solely for irrigation (the "extreme" control, treatment A). This loss was caused primarily by a loss of stand in this second year due to excess salinity in the seedbed during the establishment period. No loss in yield of the wheat grain or alfalfa hay crops occurred in the block rotation on the lands previously used to grow cotton with drainage water, once Colorado River water was used for irrigation. The qualities of all of these crops were never inferior, and often were superior, when grown using the drainage water for irrigation or on land where it had previously been used. These quality data are given elsewhere (Rhoades et al. 1989a, b).
The average amounts of water applied to each crop and over the entire four-year period are given in Tables 42 and 43 for the successive and block rotations, respectively. These data include all water applied, including that used for preplant irrigations and land preparation purposes. These data along with those in Tables 40 and 41 show that substantial amounts of drainage water were substituted for Colorado River water in the irrigation of these crops without yield loss.
The estimated amounts of water consumed by the crops through evapotranspiration and lost as deep percolation are given in Table 44 by individual crop and by succession of crops for both rotations. It was assumed that consumptive use was the same in all treatments, since no substantial losses of yield resulted in any treatment. These data show that the saline drainage water was successfully used for irrigation without resorting to high leaching. Data on levels of soil salinity and sodicity in the seedbeds and rootzones are given in Rhoades et al. (1989b). Their levels were kept within acceptable limits for seedling establishment and the subsequent growth of the individual crops grown in the rotation when the recommended strategy was employed. These results along with the high crop yields and qualities obtained in this test under actual farming conditions support the credibility of the recommended cyclic, dual-rotation (crop and water) strategy to facilitate the use of saline waters for irrigation.
In this cyclic strategy, steady-state salinity conditions in the soil profile are never reached, since the irrigation water quality changes with crop type in the rotation and with time in the irrigation season. Consequently, a flexible cropping pattern which includes salt-sensitive crops can be achieved. The intermittent leaching which occurs using this strategy is more effective in leaching salts than is continuous leaching (i.e. imposing a leaching fraction with each irrigation) for the reasons given earlier. Another advantage of the strategy is that a facility for blending waters of different qualities is not required.
In order to plan and implement a successful practice involving the use of the cyclic, dual-rotation strategy for irrigating with saline waters, various other considerations must be addressed. The intention here is not to provide a step-by-step process that must be followed nor a rigid set of criteria to address these considerations, since most management decisions are subjective and case specific, but to discuss some of the factors that should be considered and to provide some rough guidelines for selecting appropriate management practices.
Perhaps the most important management decision to make before implementing a reuse practice is crop selection. The tolerances of crops to salinity and to specific elements are given in Tables 13-21; other criteria that should be considered in selecting crops for a reuse practice are listed in Table 45. In most cases it is recommended that crops of high salt tolerance be selected when saline drainage water is to be used for irrigation. However, crops of intermediate tolerance (e.g. alfalfa, melons, tomatoes and wheat) may also be used in some cases, especially if crop quality is sufficiently benefited. For example, drainage water (EC 4-8 dS/m) significantly increased the protein content of wheat and alfalfa and the total digestible nutrients in alfalfa (Rhoades et al. 1989a), increased the soluble solids in melons and tomatoes (Grattan et al. 1987), improved the colour and netting of cantaloupe (Rhoades et al. 1989a) and improved peelability in processing tomato (Grattan and Rhoades 1990). While improved crop quality should not be the major factor in adopting a reuse practice, it may be an important factor in crop selection. Use of saline water to irrigate crops of intermediate salt tolerance is feasible, of course, only after seedlings have been established with good-quality water.
TABLE 42 Amounts of Colorado and Alamo river waters used for irrigation in the successive rotation (mm)¹ (after Rhoades et al. 1989b)

¹ Includes preplant water applications.
² C = Colorado River water used solely for irrigation; Alamo River water used for irrigation in relatively smaller (Ca) and larger (cA) amounts after seedling establishment with Colorado River water.
³ Number within ( ) is standard error of mean.
TABLE 43 Amounts of Colorado and Alamo river waters used for irrigation in the block rotation (mm)¹ (after Rhoades et al. 1989b)

| Treatment² | Crop/year | Colorado River | Alamo River | Total | % Alamo |
|---|---|---|---|---|---|
| C | 1982 cotton | 1306 (19)³ | 0 | 1306 (19) | 0 |
| C | 1983 cotton | 1177 (6) | 0 | 1177 (6) | 0 |
| C | 1984 wheat | 823 (8) | 0 | 823 (8) | 0 |
| C | alfalfa | 2048 (6) | 0 | 2048 (6) | 0 |
| C | complete rotation | 5372 (8) | 0 | 5372 (8) | 0 |
| cA | 1982 cotton | 515 (12) | 774 (30) | 1289 (42) | 60 |
| cA | 1983 cotton | 617 (4) | 545 (3) | 1162 (6) | 47 |
| cA | 1984 wheat | 798 (2) | 0 | 798 (2) | 0 |
| cA | alfalfa | 2058 (7) | 0 | 2058 (7) | 0 |
| cA | complete rotation | 3995 (18) | 1132 (34) | 5327 (50) | 25 |
| A | 1982 cotton | 0 | 1187 (25) | 1187 (25) | 100 |
| A | 1983 cotton | 0 | 1149 (7) | 1149 (7) | 100 |
| A | 1984 wheat | 795 (5) | 0 | 795 (5) | 0 |
| A | alfalfa | 2029 (16) | 0 | 2029 (16) | 0 |
| A | complete rotation | 2824 (17) | 2336 (30) | 5160 (42) | 45 |

¹ Includes preplant water applications.
² C = Colorado River water used solely for irrigation; A = Alamo River water used solely for irrigation; cA = Alamo River water used for irrigation after seedling establishment with Colorado River water.
³ Number within ( ) is standard error of mean.
TABLE 44 Estimated evapotranspiration and deep percolation (inches) (after Rhoades et al. 1989b)

| Crop | Vet¹ | Viw² | Vdw³ | LF⁴ | Vet⁵ | Viw⁵ | Vdw⁵ | LF⁵ |
|---|---|---|---|---|---|---|---|---|
| Successive crop rotation | | | | | | | | |
| 1982 wheat | 25.8 | 21.9 | -3.9 | -0.18 | 25.8 | 21.9 | -3.9 | -0.18 |
| 1983 s. beet | 40.5 | 49.1 | 8.6 | 0.18 | 66.3 | 71.0 | 4.7 | 0.07 |
| 1983 melons | 16.8 | 24.7 | 7.9 | 0.32 | 83.1 | 95.7 | 12.6 | 0.13 |
| 1984 wheat | 27.1 | 32.8 | 5.7 | 0.17 | 110.2 | 128.5 | 18.3 | 0.14 |
| 1985 s. beet | 42.3 | 53.7 | 11.4 | 0.21 | 152.5 | 182.2 | 29.7 | 0.16 |
| 1985 melons | 16.8 | 13.6 | -3.2 | -0.24 | 169.3 | 195.8 | 26.5 | 0.14 |
| Block rotation | | | | | | | | |
| 1982 cotton | 38.9 | 50.7 | 11.8 | 0.23 | 38.9 | 50.7 | 11.9 | 0.23 |
| 1983 cotton | 40.7 | 45.7 | 5.0 | 0.11 | 79.6 | 96.5 | 16.9 | 0.18 |
| 1984 wheat | 27.1 | 31.4 | 4.3 | 0.14 | 106.7 | 127.9 | 21.3 | 0.17 |
| 1985 alfalfa | 81.2 | 81.0 | -0.2 | -0.00 | 187.8 | 208.9 | 21.1 | 0.10 |

¹ Evapotranspiration estimated from pan evaporation and crop factors at Brawley, California.
² Total amount of water applied for irrigation.
³ Estimate of deep percolation drainage water, i.e. Viw - Vet.
⁴ Estimate of leaching fraction, i.e. Vdw/Viw.
⁵ Accumulated over entire experimental period.
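The relations in footnotes 3 and 4 of Table 44 can be expressed directly in code; the short sketch below reproduces the 1983 sugarbeet row of the table as a check.

```python
# Footnote relations of Table 44: deep percolation is applied water minus
# evapotranspiration, and the leaching fraction is deep percolation over
# applied water.
def deep_percolation(v_iw, v_et):
    return v_iw - v_et                            # Vdw = Viw - Vet

def leaching_fraction(v_iw, v_et):
    return deep_percolation(v_iw, v_et) / v_iw    # LF = Vdw / Viw

v_iw, v_et = 49.1, 40.5   # inches, 1983 sugarbeet row of Table 44
print(round(deep_percolation(v_iw, v_et), 1))         # 8.6
print(round(leaching_fraction(v_iw, v_et), 2))        # 0.18
```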
TABLE 45 Criteria to be considered for selecting crops for a reuse practice (after Grattan and Rhoades 1990)

| | Selection criteria | Desirable | Undesirable |
|---|---|---|---|
| 1. | Economic value/marketability | high marketability | low, unmarketable |
| 2. | Crop salt tolerance | tolerant | sensitive |
| 3. | Crop boron/chloride tolerance | tolerant | sensitive |
| 4. | Crop potential to accumulate toxic constituent | toxic element excluder | toxic element accumulator |
| 5. | Crop quality | unaffected or improved by saline water | adversely affected by saline water |
| 6. | Crop rotation consideration | compatible | incompatible |
| 7. | Management/environmental requirements | easy to manage; able to grow under diverse conditions | requires intensive management; can only be grown under very specific conditions |
Economics is also an important selection consideration, since it would be senseless to grow a high-yielding crop without a marketable product and the potential for a positive cash flow. In the San Joaquin Valley of California, there is a negative correlation between crop tolerance to salinity and economic value (Grattan and Rhoades 1990); unfortunately, few crops are both tolerant to salinity and of high economic value. Asparagus is one such crop, but its harvest is labour-intensive and costly.
The cyclic, "dual-rotation" reuse strategy described above presupposes the availability of two water sources: the saline water to be utilized and a water of low salinity. Such reuse requires that the saline water be readily accessible for irrigation. Possible sources are the drainage waters being discharged in pipes or canals from the irrigation project, or the water present in the underlying shallow groundwater system. Rainfall may also serve as the source of good-quality water, if it occurs at the times required to meet crop needs and to leach excessive accumulations of soluble salts from the rootzone.
There are many situations in which the use of saline water for irrigation under the recommended strategy could be practical. One is where high-quality water is available during the early growing season but is either too costly or too limited in supply to meet the entire season's requirements; this situation is common in parts of India and Pakistan, for example. Where high-quality water costs are prohibitive, crops of moderate to high salt tolerance could be irrigated with saline drainage or groundwater, especially at later growth stages, with economic advantage, even if this practice resulted in some reduction in yield relative to that obtainable with a full supply of fresh water. Use of saline water for irrigation reduces the amount of high-quality water required to grow crops and hence expands the water-resource base for crop production.
Another situation conducive to such reuse is one where drainage water disposal, or a means of lowering an excessively shallow water table, is impractical owing to physical, environmental, social or political factors. Reuse of the drainage water for irrigation in this situation decreases the volume of drainage water requiring disposal or treatment, and hence the associated costs; a reduction in drainage volume also reduces the salt loading of the receiving water. Many growers in the San Joaquin Valley of California are presently reusing drainage water, at least as a temporary solution, in order to reduce drainage volume and to meet recently imposed discharge restrictions intended to protect the quality and ecology of receiving water systems.
A difficulty in adopting the cyclic, "dual-rotation" strategy may arise on small farms where the drainage water produced on-site is too little, or does not coincide with peak crop-water demand. In the San Joaquin Valley of California, farms are often sufficiently large, but peak drain-water flow occurs from January to June, when most crops require high-quality water. Sole use of drainage water later in the season may not be feasible if the flow rate needed for irrigation exceeds the flow rate from the drains. To avoid this limitation, surface storage reservoirs can be constructed to store the drainage water until it is required. Another option is to plug the subsurface drains and allow the soil itself to act as the reservoir; this option does not take land out of production for water storage. Regardless of where the drainage water is stored, however, the drainage-water collection and irrigation system should be designed and operated with "reuse" in mind in order to implement this strategy efficiently.
One method of collecting sufficient quantities of drainage water is to install a network of interceptor drains in areas with shallow water tables; a submersible pump placed in collector sumps can then provide access to the drainage water. The area that can be irrigated with such drainage water will vary, of course, with the capacity of the drainage system; to surface irrigate effectively, at least 10 litres/min/ha is required. Another approach is to install a network of shallow wells in strategic shallow water table areas, connected to a common manifold to facilitate collection and distribution. Consultation with irrigation and drainage engineers is advised before installing any drainage water collection system for irrigation use.
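Given the guideline just stated, a quick sizing check follows; the pump capacities used are hypothetical examples.

```python
# Back-of-the-envelope check of how much area a given drain or sump flow can
# serve, using the rule of thumb above of roughly 10 litres/min/ha for
# effective surface irrigation.
MIN_FLOW_L_PER_MIN_PER_HA = 10.0

def irrigable_area_ha(drain_flow_l_per_min):
    return drain_flow_l_per_min / MIN_FLOW_L_PER_MIN_PER_HA

for flow in (250, 600, 1500):   # hypothetical sump capacities, litres/min
    print(flow, "l/min can surface-irrigate about",
          irrigable_area_ha(flow), "ha")
```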
The long-term feasibility of using drainage water for irrigation in order to reduce drainage volume would likely be increased if implemented on a project or regional scale such as shown in Figure 17, rather than on a farm scale. Regional management permits reuse in dedicated areas so as to avoid the successive increase in concentration of the drainage water that would occur if the reuse process were to operate on the same water supply and same land area (i.e. in a closed loop). With regional management, certain areas in the region can be dedicated to reuse while other areas such as upslope areas, are irrigated solely with high quality water as usual. The second-generation drainage water from the primary reuse area is discharged to other dedicated reuse areas where even more salt-tolerant crops are grown, or to regional evaporation ponds or to treatment plants. Ideally, regional coordination and cost-sharing among growers should be undertaken in such a regional reuse system.
A novel means of "treating" saline waste waters before their ultimate disposal is to use them to irrigate specific crops that can accumulate large quantities of undesirable constituents (e.g. Se, Mo, NO3, B) in their tissues, thereby helping to reduce the adverse ecological effects of disposal. The feasibility of biofiltration, the term used to describe this process, has been demonstrated by Cervinka et al. (1987) and Wu et al. (1987), who found that mustard, some grasses and certain native plant species found in California are effective in accumulating substantial amounts of Se in their shoots. This alternative "reuse" practice is most attractive where: (i) drainage disposal problems exist related to a potentially toxic trace constituent; (ii) a bioaccumulator with economic value exists; and (iii) other treatment processes are either unavailable or too expensive.
An alternative reuse strategy that is often advocated is to blend water supplies before or during irrigation (Shalhevet 1984; Meiri et al. 1986; Rains 1987; Rolston et al. 1988). Blending may be appropriate provided the drainage or shallow groundwater is not too saline per se for the crop to be grown. However, in many cases this approach is inappropriate for the reasons given in chapter 5.
If the blending strategy is adopted, there must be a controlled means of mixing the water supplies. Shalhevet (1984) and Meiri et al. (1986) described two blending processes: (i) network dilution, in which the water supplies are blended in the irrigation conveyance system, and (ii) soil dilution, in which the soil acts as the medium for mixing waters of different qualities. A network blending system must be designed and installed if that approach is adopted; the theory and design of dilution control systems and their use in irrigation networks have been developed by Sinai et al. (1985; 1989).
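For network dilution, the required blend ratio follows from simple volume-weighted mixing, assuming EC mixes roughly linearly by volume (an approximation); the source and target ECs below are illustrative.

```python
# Fraction of saline water that can be blended into the supply while keeping
# the mixture at or below a target EC, under linear volume-weighted mixing.
def saline_fraction(ec_fresh, ec_saline, ec_target):
    """Volume fraction of saline water giving a blend of ec_target (dS/m)."""
    if not ec_fresh <= ec_target <= ec_saline:
        raise ValueError("target EC must lie between the two source ECs")
    return (ec_target - ec_fresh) / (ec_saline - ec_fresh)

f = saline_fraction(ec_fresh=0.9, ec_saline=4.6, ec_target=2.0)
print(f"{f:.0%} saline water in the blend")   # about 30%
```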
The cyclic strategy is preferred over the blending strategy in that (i) more salt-sensitive crops can be included in the rotation, (ii) a blending facility is not required, and (iii) there is less danger of losing "usable" water for the crop. However, the cyclic strategy requires larger quantities of drainage water during the irrigations in which it is used (since the water is not blended), and thus a storage system may be needed to supply sufficient water for an effective irrigation. In summary, the "cyclic" strategy has more potential and flexibility than the "blending" strategy, although the latter is easier to implement in some cases.
FIGURE 17 Regional drainage water reuse plan
Another concern, besides excessive salinity build-up, regarding the long-term feasibility of using saline water for irrigation is that of soil permeability and tilth. As discussed in Chapter 4, the likelihood of these problems increases as SAR increases and as electrical conductivity decreases. Therefore, adverse effects are most likely to occur during periods of rainfall, or of irrigation with low-salinity water, on soils previously irrigated with sodic, saline water. Such problems occurred at an experimental "reuse" site in California following pre-season rains and pre-irrigation with 0.3 dS/m canal water where sodic, saline water [9000 mg/l TDS; SAR = 30 (mmolc/l)½] had been used for irrigation for four consecutive years (Rolston et al. 1988); the consequence was impermeable, crusted soils and poor stand establishment. Whether or not such a problem will occur depends upon whether the EC of the high-quality water is less than the threshold value corresponding to the SAR of the saline water; some combinations of the two waters are not permissible. The methods given in Chapter 4 may be used to assess whether such a problem is likely to occur, and it can often be controlled by the use of amendments and appropriate tillage practices, as discussed.
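A hedged sketch of that screening check follows. The threshold line used here is a hypothetical placeholder standing in for the actual EC-SAR stability relations of Chapter 4; only the structure of the check is intended.

```python
# Permeability screen: given the SAR of the previously applied saline water,
# check whether the EC of the follow-on low-salinity water falls below the
# stability threshold for that SAR.
def threshold_ec(sar):
    """Hypothetical threshold (dS/m): higher SAR demands higher EC for
    stable soil structure. Placeholder, not the Chapter 4 relation."""
    return 0.2 + 0.15 * sar

def permeability_risk(sar_saline_water, ec_fresh_water):
    return ec_fresh_water < threshold_ec(sar_saline_water)

# The California case cited above: SAR ~30 water followed by 0.3 dS/m canal water
print(permeability_risk(30, 0.3))   # True -> crusting/permeability problems likely
```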
Soil salinity under the cyclic strategy will fluctuate more, both spatially and temporally, than in soils irrigated with conventional water supplies, making plant response more difficult to predict. Hence, long-term effects on soil salination should be monitored using the techniques described earlier, and management adjusted to keep average rootzone salinity within acceptable limits for the crop being grown, considering its stage of growth.
Many saline waters contain certain elements, such as boron and chloride, that can accumulate in plants, especially woody perennials, to levels that cause foliar injury and a subsequent reduction in yield. In such cases, toxicity may produce more detrimental long-term effects than salinity itself. Because boron is adsorbed by the soil, it takes longer to build up to toxic levels in the soil solution, and more leaching is required to remove its excessive accumulations, than is the case for salts. Thus the long-term accumulation of potential toxicants in the soil must be considered, since toxic effects may not become evident for years and may be difficult to eliminate. Water containing excessive concentrations of B or Cl should not be used to irrigate perennial crops, since (i) these crops are generally more sensitive to specific-ion effects, (ii) they represent a long-term investment, and (iii) they have a long period of time over which to accumulate toxic levels. The same concern applies to growing perennial crops in the presence of a shallow groundwater containing solutes potentially toxic to the plant.
Another consideration regarding the use of saline water for irrigation is the potential of the plant to accumulate certain elements (such as Se, Mo and heavy metals) that are toxic to consumers of the crops (humans and animals). For example, in the San Joaquin Valley of California, drainage water in several locations contains unusually high levels of Se (≥ 50 µg Se/l). Although Se is essential to humans and animals in small amounts, excessive amounts can cause Se toxicosis. In the San Joaquin Valley, melons and processing tomatoes irrigated with drainage water containing 250 to 350 µg Se/l accumulated elevated levels of Se in the fruit (250 to 750 µg/kg, dry wt.) that, while not an immediate health hazard, might become so to one whose diet was mostly restricted to such food (Grattan et al. 1987; Fan and Jackson 1987). Many forages and native plant species have the potential to accumulate excessive amounts of Se (Wu et al. 1987). Since grazing animals consume larger quantities of plant mass than do humans, they have a greater potential for being "poisoned" in this manner.
Since plants vary in their ability to absorb and translocate toxic elements, crops that accumulate large quantities of toxic elements in their edible parts should also be avoided when saline waters containing such elements are used for irrigation. Fleming (1962) found that Se concentrations were higher in Cruciferae (cabbage, cress, radish, rape and turnips), Liliaceae (onion) and Leguminosae (clover and peas) than in Compositae (artichoke and lettuce), Gramineae (barley, oats, rye grass and wheat) and Umbelliferae (parsnip and carrots) when grown on seleniferous soils in Ireland. It is also important to understand how the toxic constituent is partitioned within the plant: in most annual fruit and vegetable crops, selenium accumulates more in the leaves than in the fruit (Mikkelson et al. 1988), but exceptions exist. If the saline water in question contains high levels of a potentially toxic element, the user should obtain expert advice.