William R. Johnson, Consulting Engineer, Modesto, California, USA
Kenneth K. Tanji, University of California, Davis, California, USA; and Robert T. Burns, Westlands Water District, Fresno, California, USA
Disposal to natural hydrological systems
Land application and retirement
Deep well injection
Disposal to natural hydrological systems
There are a limited number of options available when deciding where and how to dispose of agricultural drainage water within the natural hydrological system. The common options are to return the water to the land as part of the irrigation water supply, to rivers and lakes, or to salt sinks such as the ocean. The options available to any single project may be limited by water quality concerns. Drainage water quality may vary within a catchment. Frequently, in developing countries, agricultural drains are used for the disposal of domestic and industrial wastewater, or of polluted water from other non-agricultural sources. This may adversely affect the quality of agricultural drainage water and limit its potential for reuse.
Downstream beneficial uses of any surface water body to which drainage water is added must be protected. For example, it may not be appropriate to discharge saline drainage water into a river or lake when that surface water body is being used for domestic or agricultural water supplies.
In many cases, it may be possible and fully acceptable to discharge drainage water into a large freshwater body. However, it will be necessary to determine the assimilative capacity of the receiving water and identify the constituents in the drainage water to determine the 'safe level' or discharge requirements for the drainage water. The discharge requirements should specify the maximum allowable concentration of each constituent of concern and the volume of drainage water discharge that will be acceptable. In some cases, there will be a significant difference between the quality of the drainage water and that of the receiving water, while in other cases, there will be little difference between the two. The dilution capacity of the receiving water will vary from place to place and from time to time depending on numerous local conditions and the upstream uses of the receiving water. The discharge of drainage water of a higher quality than the receiving water is generally acceptable. Pollutants in the drainage water may end up in the channel bed material. Here, they may create subsequent water quality problems, if and when the channels need to be dredged.
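The dilution reasoning above can be sketched as a simple flow-weighted mixing balance. The function and figures below are purely illustrative (the flows and TDS values are hypothetical, not from the text):

```python
def mixed_concentration(q_drain, c_drain, q_river, c_river):
    """Flow-weighted concentration after complete mixing of a
    drainage discharge with the receiving water.

    q_* are flows (e.g. m3/s); c_* are concentrations (e.g. mg/litre).
    """
    return (q_drain * c_drain + q_river * c_river) / (q_drain + q_river)

# Hypothetical case: 2 m3/s of drainage at 3 000 mg/litre TDS entering
# a river flowing at 40 m3/s with 400 mg/litre TDS.
c_mix = mixed_concentration(2.0, 3000.0, 40.0, 400.0)
print(round(c_mix, 1))  # -> 523.8 mg/litre in the mixed flow
```

Comparing the mixed concentration of each constituent of concern against its 'safe level' is, in essence, how discharge requirements are set.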
Special attention should also be given to the pollution hazards posed by regional seepage flows. This seepage may pollute the receiving surface water with solutes picked up during movement through various soil or rock formations. This may reduce the dilution capacity of surface water. Shallow seepage may cause excessive nutrient loading, while deeper seepage may convey toxic geochemicals.
Mitigation to facilitate the disposal of drainage to surface waters should preferably start at the field level. Measures such as the construction of retention ponds and the establishment of riparian border strips can be taken to reduce the direct inflow of polluted surface drainage water into the receiving system. In the receiving system, chemical concentrations may be further reduced by dilution and mixing. Less polluted or even fresh water from upstream storage can sometimes be used for dilution and flushing, so as to facilitate safe drainage discharge into streams and lakes.
The disposal of drainage water from acid sulphate soils is unique. The pH of this drainage water can be as low as three or four. Generally, this type of drainage water must be highly diluted in the main drainage system. Where dilution is not possible, special arrangements should be made to flush and transport this acid drainage water from the system.
In most cases, lakes must be given special consideration, as there may not be an adequate outlet or flow volume from the lake, and contaminants may accumulate. This could cause substantial long-term problems to any beneficial uses of the lake water, such as aquatic habitat.
Generally, rivers continually cleanse themselves and can tolerate somewhat higher pollutant constituent loads than lakes. However, it is still essential that the quality of both the drainage water and the river water be evaluated to determine the safe level of constituents that can be placed in the river without affecting downstream beneficial uses.
Estuaries, bays and oceans
Estuaries, bays and oceans have somewhat different water quality requirements. Considerable exchange of water takes place in the estuarine, bay and ocean environments due to the changing tides and ocean currents. In addition, the introduction of saline drainage water into a salt-water environment generally reduces the impact of the discharge of the drainage water on the receiving water. However, each receiving water needs to be analysed and compared with the drainage effluent in order to protect beneficial uses of the receiving water. Nutrients in drainage waters may cause problems in estuaries. If possible, the oceans and seas should be the ultimate natural disposal location for saline drainage water.
In irrigated areas, there is often some low-quality land available near to where agricultural drainage problems occur. Such land can be used for the disposal of drainage water in evaporation ponds or constructed wetlands, or for re-use in saline agriculture-forestry. However, if the drainage water contains potentially toxic constituents, such as Se, then disposal of drainage water in evaporation ponds or in wetlands will be environmentally hazardous. In California's Central Valley, drainage water containing Se in concentrations greater than 2 µg/litre is not recommended for use as water for ponds or wetlands (California Regional Water Quality Control Board, 1995). Some proportion of the water must be flushed through such wetlands to prevent toxic ion concentrations from accumulating to dangerous levels.
Land retirement is sometimes portrayed as a 'solution' to an agricultural drainage problem: salt is allowed to accumulate on the soil surface through evaporation from a shallow water table. The principal difficulty with land retirement as a 'solution' is that it takes a considerable amount of land out of production. This is obviously a perplexing issue when farmers lose half of their land (the salt sink area) in order to continue production on the remainder. This approach to managing the combined salt balance and waterlogging problem has recently been termed 'dry drainage' (Gowing and Wyseure, 1992). The objective is to have the non-cropped area large enough, and evaporation from this area fast enough, to achieve the necessary salt balance and a stable water table over the entire area. The theory is that if inflow balances outflow, then the water table will be stable. The cropped area must be irrigated continuously to avoid salt accumulation in the cultivated portion of the fields, since the water table in the irrigated lands will be higher than in the adjacent retired lands. Dry drainage is not recommended.
Considerable research, study and economic evaluation must be undertaken before this technique is selected as a 'solution' to an agricultural drainage problem. Gowing and Wyseure (1992) point out three important questions that must be answered:
i. What is the limiting cropping intensity?
ii. What is the limiting water table depth?
iii. What is the long-term impact of salt accumulation in the drainage salt sink area?
'Dry drainage' adds to the evaporative water loss of the area.
Constructed evaporation ponds
Pond water chemistry and mineralogy
Pond biology and toxicity
Biological, chemical and physical treatment options
Disposal of runoff and drainage waters into natural depressions has been practised for centuries. The impounded waters are dissipated by evaporation, seepage and transpiration losses. The use of constructed disposal basins for saline agricultural drainage waters is also common worldwide where there are constraints on discharging into natural salt sinks such as the oceans and inland closed basins. Examples of such practices include the salinity control and reclamation projects (SCARP) along the Indus River in Pakistan (Trewhella and Badruddin, 1991); the irrigation projects along the Syr Darya and Amu Darya Rivers in Kazakhstan and Uzbekistan (Micklin, 1991); and those along the Murray River in Australia (Evans, 1989). Constructed basins managed for the evaporation of saline drainage waters are a comparatively new practice (Tanji et al., 1993).
In the Murray-Darling basin in Australia, some of the constructed evaporation basins are intended to hold saline water only temporarily. The stored waters are then released during high river flows. Closed basins are expected to have operating lives of 50-150 years. The surface areas of these basins range from a few hectares to about 2 000 ha. The waters impounded in the Australian basins are typically dominated by NaCl type salts. Data on trace elements have not been reported. Lateral and vertical seepage losses from these basins are substantial, 20-50% in most basins (Evans, 1989).
In the San Joaquin Valley of California, evaporation ponds are used in areas where there are no opportunities for discharging saline subsurface drainage water. The ponds are constructed by excavating soils from the interior of the basins to build up embankments. Drainage water is discharged into the ponds by pumping. One of the first large-scale evaporation facilities in the United States was established in 1975 in the Tulare Lake bed, a hydrologically closed basin in the southern San Joaquin Valley (Summers Engineering, 1995). From the mid-1970s to 1985, 28 basins were installed in this valley. These basins occupy a total surface area of about 2 800 ha and vary in area from 2.5 to 730 ha and in depth from 0.5 to 2 m. Many ponds consist of one to three cells, with a few ponds having from six to eleven cells. In multi-cell systems, water is routed by gravity from cell to cell to optimize evaporation rates in the initial cells and precipitate most of the salts in the terminal cells (Ford, 1988).
The evaporation basins in the San Joaquin Valley receive about 3.9 million m³/year of subsurface drainage water from about 22 700 ha of subsurface drained fields (Ford, 1988). Some ponds or pond cells are allowed to dry, while others are maintained at a minimal water level. The ponds are typically located on clay loam to clay soils. Emergent vegetation is controlled but the submerged rooted vegetation and phytoplankton are not. Rainfall at the evaporation pond facilities is low, from less than 100 to 250 mm/year. The grass ET rate in this region is about 1 500 mm/year. The hydrology of ponds is comparatively simple. The inputs into the ponds include drainage water from croplands, rainfall and pumped drainage water from perimeter drains installed to intercept pond seepage. The outputs include water evaporation, unrecovered pond seepage and ET from aquatic vegetation (Tanji and Grismer, 1989).
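The pond hydrology described above amounts to a simple water balance over the input and output terms listed. A minimal sketch, in which the variable names and the monthly volumes are hypothetical:

```python
def pond_storage_change(drain_inflow, rainfall, intercepted_seepage,
                        evaporation, seepage_loss, vegetation_et):
    """Net change in pond storage over a time step (all terms in m3),
    using the input and output terms listed in the text."""
    inputs = drain_inflow + rainfall + intercepted_seepage
    outputs = evaporation + seepage_loss + vegetation_et
    return inputs - outputs

# Hypothetical monthly volumes (m3) for a single pond cell:
delta = pond_storage_change(30000, 1500, 800, 25000, 4000, 2000)
print(delta)  # -> 1300; a positive value means the water level rises
```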
Estimates of pond seepage rates have been made by point measurements as well as by mass balance of pond inflows, changes in water surface elevation and evaporation rates (Tanji and Grismer, 1989; Grismer et al., 1993). Mass-balance based estimates for several different ponds have yielded similar results despite differences in the soil texture of the pond bed material. Although initial seepage rates upon filling a newly constructed pond may be as great as 10 mm/day, seepage rates decrease dramatically in a few months to a few millimetres per day. This decrease in seepage is attributed to the plugging of conducting pores in the bed material by microbial slimes and colloidal soil materials.
Seepage losses are generally dependent on local soil, geohydrological and topographical conditions, and excessive seepage could lead to environmental problems. For example, in Pakistan, seepage from evaporation ponds caused considerable waterlogging and salinization in surrounding farmland. Therefore, pond site selection is important, and seepage losses should be controlled. A lining and leachate collection system may be required to ensure that there will be no contamination of usable groundwater. The type of lining, clay or synthetic, should be determined by material availability, local requirements or environmental regulations.
Evaporation from free water surfaces is influenced by many variables, such as air temperature, wind speed, humidity, net solar radiation and water salinity. Evaporation rates from ponds have been measured with floating evaporation pans together with diurnal monitoring of pond waters and nearby terrestrial climatic data (Tanji et al., 1992). Results from floating evaporation pans containing water of the same salinity as the pond show that evaporation rates from water surfaces decline with increasing water salinity due to a reduction in water surface vapour pressure. For example, the pan evaporation rate for an 8-day period in September 1989 was 63.5 mm for a water electrical conductivity (EC) of 14 dS/m, 54.5 mm for an EC of 30 dS/m and 52.4 mm for an EC of 47 dS/m. Evaporation rates have also been estimated from reference evapotranspiration (ETo) data as E = Y·ETo, where Y = 1.3234 - 0.0066·EC (dS/m) for water EC up to 60 dS/m. The measured correction factor, Y, was 1.2 for a water EC of about 20 dS/m, 1.07 for an EC of 47 dS/m and 0.92 for an EC of 59 dS/m. Where pond waters are evapoconcentrated to the point where evaporite minerals begin to form, a thin crust of evaporites may form on the water surface during cooler air and water temperatures (night-time), effectively limiting water evaporation. During higher temperatures (daytime), that thin crust may redissolve. With further evapoconcentration, permanent salt deposits form.
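The ETo-based estimate can be written as a short function. The regression coefficients are those given above; the example ETo and EC values are illustrative:

```python
def pond_evaporation_mm(eto_mm, ec_ds_m):
    """Estimate free-water-surface evaporation from reference ET
    using the salinity correction Y = 1.3234 - 0.0066*EC, which the
    text gives as fitted for water EC up to 60 dS/m."""
    if not 0 <= ec_ds_m <= 60:
        raise ValueError("correction fitted only for 0-60 dS/m")
    y = 1.3234 - 0.0066 * ec_ds_m
    return y * eto_mm

# At EC = 20 dS/m the factor is 1.1914, close to the measured 1.2.
print(round(pond_evaporation_mm(100.0, 20.0), 2))  # -> 119.14
```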
The waters disposed of in evaporation basins or ponds have dissolved mineral salt concentrations ranging from 2 500 to 65 000 mg/litre (µg/g) (Ford, 1988). The estimated annual salt loading into the San Joaquin Valley ponds is about 880 000 t. This is equivalent to about 25% of the annual salt accumulation in the 0.9 million ha of irrigated lands on the west side of the San Joaquin Valley (Tanji and Grismer, 1989). The waters impounded in the ponds are Na-SO4 or mixed Na-SO4/Na-Cl types. The principal sources of salts are the applied water and the chemical weathering of the marine sedimentary rocks in the Pacific Coast Range and the alluvium formed in the valley floor (Tanji et al., 1986).
Impounded drainage waters are evapoconcentrated up to about 388 000 mg/litre, more than ten times the 35 000 mg/litre salinity of seawater. As the water evapoconcentrates, a sequence of different evaporite minerals forms as their solubility products are exceeded. The suite of evaporites deposited is mainly regulated by the initial chemical composition of the disposed water and the evapoconcentration factor (Smith et al., 1995). A brine chemistry model has been validated to predict the sequence of evaporite formation with desiccation (Smith et al., 1995). The most common minerals to form in copious quantities during the early stages of evapoconcentration are calcite (CaCO3) and gypsum (CaSO4·2H2O). With further evapoconcentration, glauberite [Na2Ca(SO4)2], bloedite [Na2Mg(SO4)2·4H2O] and mirabilite (Na2SO4·10H2O) are typically formed. The last evaporites to be deposited are halite (NaCl) and thenardite (Na2SO4).
Associated with these saline drainage waters are several trace elements of concern, including As, B, Mo, Se, U and V (Westcot et al., 1989; Tanji and Grismer, 1989). These trace elements occur naturally in the Cretaceous geologic formations of the San Joaquin Valley. Of principal concern is Se. Bioaccumulated in the aquatic food chain, it caused reduced reproduction, deformities and death of waterbirds at Kesterson Reservoir in the San Joaquin Valley in the 1980s (Ohlendorf et al., 1993). Selenium toxicity to waterbirds is also occurring in a number of agricultural evaporation basins elsewhere in the San Joaquin Valley.
Annual sampling surveys of ponds and pond cells in the San Joaquin Valley revealed considerable variability in salinity and trace element contents (Ford, 1988). The geometric means of constituents of concern are 31 000 mg/litre TDS, including 25 mg/litre B, 101 µg/litre (ppb) As, 16 µg/litre Se, 2 817 µg/litre Mo, 308 µg/litre U and 22 µg/litre V. The trace element concentrations in influent pond waters are strongly influenced by physiographic location (Westcot et al., 1989 and 1993). For example, B and Se are at elevated concentrations in subsurface drainwaters from alluvial fans, and As in waters from lake bed soils. With respect to pond bottom sediments, the geometric means of constituents of concern are 107 mg/kg B, 9 mg/kg As, 0.6 mg/kg Se, 6 mg/kg Mo, 9 mg/kg U and 57 mg/kg V.
Paired samples of hypersaline water and the associated evaporite specimens were collected from seven evaporite-forming ponds and from naturally occurring hypersaline water bodies, and 55 pairs of samples were analysed for trace elements (Tanji et al., 1992). Of the evaporite samples analysed, none of the salt deposits contained Se, As, B or Mo exceeding California's hazardous solid waste criteria. By contrast, the concentrations of these trace elements in the hypersaline waters approached or exceeded the local hazardous liquid waste criteria. Apparently, these trace elements are somehow excluded or dissipated during the crystallization of salts and accumulate in the liquid phase.
As the evaporation pond facilities mature, salt deposits accumulate at rates of 0.9-15 cm/year (Tanji and Grismer, 1989). There is growing concern that dried salt beds may contribute towards salt dust storms, as in the Aral Sea area in Kazakhstan and Uzbekistan, and at Lake Owens in California. Salt dust storms from the desiccated Aral Sea have damaged downwind vegetation and affected the health of humans and animals (Micklin, 1991). There is a need to develop management strategies for deposited salts subject to aeolian erosion. In the San Joaquin Valley, however, salt dust storms do not constitute a problem at present.
Selenium was found to be toxic to fish and waterbirds at Kesterson Reservoir in 1980 and 1981, respectively. The influent Se concentration of drainage water from croplands was about 300 µg/litre, and the Se bioaccumulated in the aquatic food chain. Algae and rooted plants bioconcentrated Se by about 560-fold and 600-fold (69 and 73 mg/kg), respectively. Zooplankton and aquatic invertebrates (insects) feeding on algae and rooted plants biomagnified Se by about 1.2-fold and 1.7-fold (85 and 122 mg/kg), respectively. Fish feeding on zooplankton and aquatic insects biomagnified Se by a further 1.5- to 2.2-fold (188 mg/kg). The net result was that the Se bioaccumulation factor from water to fish was 1 540-fold in the Kesterson pond.
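The stepwise factors above compound into the overall water-to-fish factor. The sketch below uses assumed values throughout: the pond water concentration and the midpoint step factors (1.7-fold and 1.6-fold) are illustrative choices, so the chain reproduces the reported 1 540-fold figure only approximately:

```python
# Illustrative chain of the Kesterson bioaccumulation steps.
# water_se and the step factors are assumptions, not study values.
water_se = 0.122                 # mg Se/litre in pond water (assumed)
algae = water_se * 560           # bioconcentration into algae
invertebrates = algae * 1.7      # biomagnification into invertebrates
fish = invertebrates * 1.6       # biomagnification into fish
overall = fish / water_se
print(round(overall))  # -> 1523, the same order as the reported 1 540-fold
```

The point of the sketch is that modest per-step biomagnification factors, multiplied together on top of a large initial bioconcentration step, yield a very large overall factor.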
A similar degree of bioaccumulation of Se took place in waterbirds, but was dependent on whether their food source was benthic or herbaceous. Due to the elevated salinities and harsh environmental conditions prevalent in many pond facilities, comparatively few species of plants, invertebrates and fish are able to tolerate the high water salinity, widely fluctuating water levels, high water temperatures and low dissolved oxygen (Tanji et al., 1993). However, widgeon grass (Ruppia maritima), waterboatmen (Trichocorixa reticulata) and brine shrimp (Artemia salina) can become quite abundant. These pond food items are eaten by waterfowl and shorebirds.
Selenium becomes toxic to birds when Se substitutes for sulphur in the essential amino acids methionine and cysteine. The hazards of Se to waterfowl may be best evaluated by measuring the Se content in bird eggs and livers (Ohlendorf et al., 1993). In normal non-contaminated aquatic habitats, the Se concentrations in avian eggs rarely exceed 3 µg Se/g. The likelihood of reproductive impairment of birds increases substantially when sets of eggs contain more than 20 µg Se/g.
Arsenic is another constituent of concern. However, despite the presence of elevated concentrations of As in ponds located in the Tulare Lake bed, bird eggs do not accumulate As (Ohlendorf et al., 1993). Although B has bioaccumulated in bird eggs up to a maximum of 18 µg/g, B alone does not appear to cause toxicity in bird embryos. Evaporation ponds also contain elevated levels of Mo, but Mo concentrations up to 16 µg/g do not have adverse effects on birds.
Therefore, Se is the element of greatest concern to waterbirds attracted to evaporation basins in the San Joaquin Valley.
The design and management of evaporation pond facilities to enhance desiccation of impounded waters and reduce hazards to wildlife are important considerations. This section addresses the potential biological, chemical and physical treatment options for removing contaminant hazards in evaporation pond waters. Numerous drainwater treatment and disposal options were investigated immediately following the discovery of Se toxicosis of waterbirds at Kesterson Reservoir (Lee, 1993). Among those investigated were: desalination by reverse osmosis, microbially-mediated anaerobic reduction of selenate and selenite to elemental Se, microalgal-bacterial removal of Se, biomethylation and volatilization of Se, adsorption of Se onto iron filings, chemical reduction of Se, deep well injection, and drainage water re-use. The removal of trace elements from influent and impounded pond waters is inherently difficult due to the saline matrix and the extent of removal required to protect wildlife. The various biological methods of removing Se from waters are capable of reducing Se to about 20-50 µg/litre. With additional chemical and physical treatment processes in tandem, reduction to about 10 µg/litre is possible. These levels of treatment are inadequate to protect wildlife from hazardous bioaccumulation of Se. The recommended criterion for Se in waters is 1 µg/litre for aquatic life, and this criterion appears to be unattainable using currently known treatment technologies. Some dilution would also be required to achieve acceptable concentrations of Se.
The impoundment of drainage waters in evaporation ponds can have an effect similar to lagoon treatment of municipal wastewaters. Selenium, the principal constituent of concern in evaporation basins, is subject to a number of sink/dissipation mechanisms within the pond environment. Selenite, but not selenate, is strongly adsorbed by soil materials with exposed hydroxyl groups such as oxides of iron, aluminium and other metals. Inorganic Se may be reduced and immobilized into elemental Se and selenides. Indigenous bacteria, fungi and microalgae can methylate Se to dimethylselenide (DMSe) that may eventually volatilize into the atmosphere. In fact, biomethylation and volatilization of Se is the principal pathway by which Se enters into the atmosphere in the natural biogeochemical cycle of Se.
Other processes that remove Se from the water column are the uptake of Se by aquatic plants and the subsequent deposition of organic Se into bottom mud. However, the Se removed from the water column into immobilized forms may be regenerated back into the water column by oxidation. Moreover, Se bioaccumulated by aquatic biota may become part of the aquatic food chain and pose a hazard to animals at the uppermost level of the food chain. The only mechanisms by which the Se inventory may be dissipated from ponds are by inadvertent seepage losses or by volatilization into the atmosphere. The former is not desirable because of the potential contamination of shallow groundwaters. As for the latter, emission of DMSe does not appear to have adverse impacts.
Concept and technology
Disposal by injection is a process in which liquids are pumped into a well for placement into porous rock or sand formations below the ground surface. The well is generally called a 'deep well' because the proposed injection occurs beneath the lowermost underground source of usable water. Most formations selected for disposal reservoirs in California are old marine sediments containing concentrated saltwater. Any porous and permeable rock formation such as sandstone can act as a disposal reservoir for the injection liquid.
In California, deep well injection technology has been used for more than 60 years for the subsurface disposal of oil field brines. During 1994, about 74.3 million m³ of brine waters were disposed of in California oil fields by deep well injection, as reported by the California Division of Oil and Gas. Oil field practices across the United States have established deep well injection as a viable method for the disposal of these types of industrial wastes. Injection well disposal of agricultural drainage water is a viable alternative where receiving formation conditions are adequate and the costs of injection are not prohibitive.
Prior to formal injection testing, the formation aquifer into which drainage water is to be injected must be sampled and tested to document baseline water quality and to assess chemical compatibility with the drainage water. Drainage water mixed with formation water should be analysed to ensure that the two fluids will not produce precipitates that would clog the injection well. Water from all zones sampled should be analysed for calcium hardness, total alkalinity, TDS, temperature and pH. The results from these analyses will serve as baseline data for later use if monitoring of confining layer containment is necessary.
A suitable formation is usually a sand or sandstone which has sufficient porosity and thickness to receive the injection fluid. Significant space can be created over a large volume of receiving formation, when the formation is compressible. The disposal formation should be extensive enough to offer a sizeable reservoir and should be deep enough to allow adequate injection pressures in the injection wells.
The pressure build-up required for injection occurs at the injection well head and at varying distances from the well over a period of time. The injection fluids begin to displace natural fluids and a resulting pressure rise becomes evident at the well head. This well head pressure rise can be modelled using the Bernard formula for pressure build-up (URS, 1987). The Bernard formula relies on certain site specific data such as porosity, permeability, formation thickness and other data which can only be determined from core samples and well head pressure testing.
A continuously monitored leak detection system for the injection system must be incorporated into a Class I well design. A Class I well is designed to protect the entire area above the uppermost confining layer from leaks of the injected drainage water. The monitoring system would automatically shut off the injection pumps and instantly reduce the pressure during a sudden leak. An example of a leak detection system can be seen in Figure 7. The annular space between the injection tubing and the middle casing is filled with a non-corrosive saline fluid. The saline fluid exerts a specified annular fluid pressure along the column. This annular fluid pressure will remain constant if the tubing and cement plugs are not leaking. Pressure in the annular space can now be monitored during testing of the injection system by an electronic pressure sensor.
FIGURE 7 Injection well leak detection system
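The shut-off logic described for the leak detection system can be sketched as a simple control check. All function names, units and the tolerance value below are hypothetical illustrations of the principle, not part of any actual well controller:

```python
def annulus_pressure_ok(reading_kpa, setpoint_kpa, tolerance_kpa=50.0):
    """The annular fluid pressure stays constant while the tubing and
    cement plugs are intact, so a deviation beyond the tolerance is
    treated as evidence of a leak. Units and tolerance are assumed."""
    return abs(reading_kpa - setpoint_kpa) <= tolerance_kpa

def control_step(reading_kpa, setpoint_kpa):
    """Shut off the injection pumps as soon as a leak is indicated."""
    if annulus_pressure_ok(reading_kpa, setpoint_kpa):
        return "inject"
    return "shutdown"

print(control_step(5010.0, 5000.0))  # -> inject
print(control_step(4700.0, 5000.0))  # -> shutdown
```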
The construction of deep injection wells will generally require some form of permit. In the United States, a federal permit is required from the USEPA pursuant to the Safe Drinking Water Act regulations (40 Code of Federal Regulations (CFR), Section 144 et seq.). The EPA underground injection control programme requirements state that an injection well must inject below the lowermost formation that contains, within 400 m of the well bore, an underground source of drinking water, and must be within a geological formation capable of receiving the liquid. The EPA has designated underground sources of drinking water as those with a TDS level of 10 000 mg/litre or less. A Class I well under these requirements covers the injection of hazardous and non-hazardous wastewaters that are not considered part of oil field operations. The permit also requires extensive geological, construction, testing procedure, monitoring, well abandonment, and pressure build-up data. The information and findings from all relevant data are open to public scrutiny and hearings.
Location and formation
The most important consideration in the siting of an injection well is to locate a suitable geological formation for injection of the drainage water. The formation overlying the injection formation must be highly impermeable so that it will act as a hydraulically confining barrier which will prevent upward migration of the injection fluid. Both formations should be thick enough to ensure the desired injection rate. The confining formation should be located as far away as possible from any known faults.
Proximity of source
The next most important consideration in the location of an injection well is distance from the drainage source. Long distance from the injection well site to the source results in added costs for construction, pumping and future maintenance. However, this may be required if no suitable injection formation is available in proximity to the drainage water source.
Formation permeability and porosity
The receiving formation for injection must have a high permeability, measured in millidarcies (md), to provide for an adequate injection rate. One millidarcy is about 1 mm/day for water. Core samples of the injection zone and confining layer may be analysed in the laboratory by testing the air permeabilities. The liquid permeabilities can be conservatively estimated to be 50% less than the laboratory measured air permeabilities. Porosity or the percentage of voids in a soil or rock sample should be estimated by using an appropriate method such as sonic log data from the open hole logs (Schlumberger, 1989).
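The two rules of thumb in this paragraph, taking liquid permeability as 50% of the laboratory air permeability and 1 md as roughly 1 mm/day for water, can be combined in a short sketch. The 40 md core value is hypothetical:

```python
def liquid_permeability_md(air_permeability_md):
    """Conservative liquid permeability estimate: 50% of the
    laboratory-measured air permeability, per the text."""
    return 0.5 * air_permeability_md

def water_flux_mm_per_day(permeability_md):
    """Rough water flux using the text's approximation that
    1 millidarcy corresponds to about 1 mm/day for water."""
    return permeability_md * 1.0

# A hypothetical core measuring 40 md to air would be taken as
# about 20 md for liquid, i.e. roughly 20 mm/day.
print(water_flux_mm_per_day(liquid_permeability_md(40.0)))  # -> 20.0
```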
Potential formation plugging problems
The plugging of injection wells by micro-organisms is a common problem, also encountered in wells that produce oil. Injecting agricultural drainage water containing nitrate into a formation that contains organic matter and ferrous iron produces excellent growth conditions for nitrate-reducing bacteria. If the drainage water is not treated before injection, the formation will probably clog as the pores of the sandstone become filled by accumulating biomass.
There are two common chemical treatments used by the oil industry to improve the injection capability of an injection well. The first method is to add chlorine continuously, sufficient to produce a chlorine residual of 0.2-1.0 mg/litre. The best time to add chlorine to drainage water is just before injection. This treatment may solve only a small portion of the injection problem: if the sandstone formation has a biological nitrate demand, it may also have a chemical chlorine demand and will strip chlorine from the chlorinated drainage water. A rapid and complete removal of chlorine from the drainage water suggests that a very reactive reducing agent is present in the formation. The second method for treating an injection well is to add a buffered solution of hydrofluoric acid to the well. Laboratory tests have shown that permeability can be increased in this way, but not in all cases.
Westlands Water District's prototype deep well injection project in the San Joaquin Valley of California was to dispose of up to 4 000 m³/day of agricultural drainage water from the San Luis Drain. Drainage water was to be injected into two geologic shale and sand formations 1 554 m (Zilch-Temblor) and 2 164 m (Martinez) beneath the ground surface. The goals of the prototype project were to assess the technical, economic and administrative feasibility of deep well injection as a means of managing agricultural drainage water.
FIGURE 8 Well casing and cementing
Drilling and completion
The drilling of the well resulted in a total depth of 2 469 m. Figure 8 shows the final configuration of the district's well casing and cementing. The completion operations were accomplished by perforating the Martinez formation casing with 13 perforations per metre from depths of 2 245 to 2 344 m, and from 2 411 to 2 414 m. The total perforation length was 102 m. The Zilch-Temblor formation was not tested due to the EPA's decision not to allow the district to inject into this formation. Subsequently, computer modelling analyses demonstrated to the EPA that the Zilch-Temblor was also a safe and viable zone for injection.
Following the recovery of some natural formation fluid samples, an injection test was conducted. Injection fluid, consisting of filtered Westlands Water District irrigation water, was treated with 2% potassium chloride and a chlorine biocide. All fluid was filtered to 0.5 micron using a Pall filter. The water was injected through a 4.75 cm tubing at a rate of 12 litres/s (4.4-4.5 barrels/min). A total of 175 000 litres were injected before the well was shut in for a 48-hour pressure fall-off test. At the end of the test, a temporary bridge plug was set at 2 228 m.
Results and economics
Calculations from the 48-hour pressure fall-off test using the final surface pressure revealed a permeability of 12 md, or about 12 mm/day. This value was too low to achieve the desired injection rate of 44 litres/s. Based on 12 md, the maximum injection rate would be approximately 20% of the proposed rate, or 8.8 litres/s, at a pumping pressure of 6.21 MPa. The cost of injecting drainage water was estimated at over $US 810 per 1 000 m³. An acceptable minimum permeability for recharge on this project would be from 50 to 60 md.
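The 20% figure above follows from proportional scaling of injection rate with permeability. A sketch, under the assumption (not stated in the text) that the 60 md upper bound is the design value behind the proposed 44 litres/s rate:

```python
# Proportional scaling of injection rate with permeability.
# Taking 60 md as the design permeability is an assumption made
# here only to reconcile the 20% and 8.8 litres/s figures.
proposed_rate = 44.0   # litres/s
measured_md = 12.0
design_md = 60.0
achievable = proposed_rate * measured_md / design_md
print(round(achievable, 1))  # -> 8.8 litres/s, about 20% of proposed
```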
The outlined procedures for plugging and abandoning an injection well can be found in Section 146.10 of 40 CFR (Plugging and Abandonment Class I-III Wells). This regulation states that 'the well shall be plugged with cement in a manner which will not allow the movement of fluids either into or between underground sources of drinking water'. In the United States, all abandonment procedures must be witnessed by an authorized representative of the Environmental Protection Agency (40 CFR, Section 146.10). The district's prototype deep well was abandoned in July 1993. Four cement plugs were used in closing the well. The cement plugs were placed at the top of the lower and uppermost injection zones, at the base of the freshwater sands, and 1.5 m below the normal ground surface.